healthdatascience / deep-learning-har
Convolutional and LSTM networks to classify human activity
Hey,
what exactly do you mean by the number of steps in the time series? Where does this number come from?
Best Regards,
edu
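For context: in the UCI HAR dataset used by this repo, each example is a fixed-length window of 128 sensor readings (2.56 s at 50 Hz, with 50% overlap between windows), and the "number of steps" is that window length. A rough sketch of how such windows are cut from a raw signal (the array and sizes here are illustrative, not the repo's actual preprocessing):

```python
import numpy as np

signal = np.arange(1000, dtype=float)  # one raw sensor channel (illustrative)
seq_len, hop = 128, 64                 # window length ("steps"), 50% overlap

# Slice the signal into overlapping fixed-length windows; each row becomes
# one example fed to the network.
windows = np.stack([signal[i:i + seq_len]
                    for i in range(0, len(signal) - seq_len + 1, hop)])

print(windows.shape)  # (14, 128): 14 windows of 128 steps each
```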
Hi @bhimmetoglu ,
Your work is very interesting. I really like how you managed to crank the test accuracy up to 94.8% with the LSTM alone; that is remarkable. I have also done work on LSTMs for HAR. I think you forgot to cite my work, which is under the MIT License, Copyright (c) 2016 Guillaume Chevalier: https://github.com/guillaume-chevalier/LSTM-Human-Activity-Recognition/blob/master/LICENSE
I used two stacked LSTM cells just like you did, on the same HAR dataset, and with TensorFlow, too. Overall, the part of your project where the LSTMs are used seems entirely derived from mine:
The structure of the code in your project follows the same progression as my original code. I also see that you starred my project, so you were aware of it. Also, some lines of code, as well as the unique comments on the logic of those lines, are nearly identical to mine, for example here:
lstm_in = tf.transpose(inputs_, [1,0,2]) # reshape into (seq_len, N, channels)
lstm_in = tf.reshape(lstm_in, [-1, n_channels]) # Now (seq_len*N, n_channels)
Compared to my code:
# input shape: (batch_size, n_steps, n_input)
_X = tf.transpose(_X, [1, 0, 2]) # permute n_steps and batch_size
# Reshape to prepare input to hidden activation
_X = tf.reshape(_X, [-1, n_input])
# new shape: (n_steps*batch_size, n_input)
To cite my work, I ask that you at least point to the URL of the homepage of the GitHub repository, as stated in the README.md of my project, for example:
Guillaume Chevalier, LSTMs for Human Activity Recognition, 2016
https://github.com/guillaume-chevalier/LSTM-Human-Activity-Recognition
For now, please act quickly: your work is already being copied and cited by others without any citation of mine.
Citing someone is simple. Sorry for the trouble, and thanks in advance.
The official code in HAR-CNN uses the "train" folder to generate the test data:
X_train, labels_train, list_ch_train = read_data(data_path="./data/", split="train") # train
X_test, labels_test, list_ch_test = read_data(data_path="./data/", split="train") # test
When I change them to:
X_train, labels_train, list_ch_train = read_data(data_path="./data/", split="train") # train
X_test, labels_test, list_ch_test = read_data(data_path="./data/", split="test") # test
the test accuracy decreases to 0.877917, not 0.98.
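That drop is expected: evaluating on the same "train" split the model was fit on measures memorization, not generalization. A small self-contained illustration with a 1-nearest-neighbor classifier on random data (none of this is the repo's code; it only shows why accuracy on the training split is inflated):

```python
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 9))     # 9 channels, like the HAR inputs
y_train = rng.integers(0, 2, size=200)  # random labels: nothing to learn
X_test = rng.normal(size=(200, 9))
y_test = rng.integers(0, 2, size=200)

def predict_1nn(X_ref, y_ref, X):
    # label of the nearest reference point (squared Euclidean distance)
    d = ((X[:, None, :] - X_ref[None, :, :]) ** 2).sum(axis=-1)
    return y_ref[d.argmin(axis=1)]

acc_train = (predict_1nn(X_train, y_train, X_train) == y_train).mean()
acc_test = (predict_1nn(X_train, y_train, X_test) == y_test).mean()
print(acc_train, acc_test)  # 1.0 on the training split, near chance on held-out data
```

The lower number on the real "test" split is the honest estimate of model performance.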
Not really an issue (sorry), but I'd really like to know what tool you used to make the network diagrams. Thanks.
Hi,
I would like to ask about your LSTM: what is your initial input length?
And for every iteration within the LSTM in your picture, stated as 128 steps,
what is the 27 in each loop for?
And you stacked 2 LSTM cells, if I'm not mistaken. So how do you go from an input length of 9 to an LSTM with an input of 27?
Sorry, I'm kind of new to LSTMs and stacked LSTMs. Would you mind explaining how it works?
Would you mind explaining the part below?
lstm_in = tf.transpose(inputs_, [1,0,2]) # reshape into (seq_len, N, channels)
lstm_in = tf.reshape(lstm_in, [-1, n_channels]) # Now (seq_len*N, n_channels)
lstm_in = tf.layers.dense(lstm_in, lstm_size, activation=None) # project n_channels -> lstm_size
lstm_in = tf.split(lstm_in, seq_len, 0) # list of seq_len tensors, each (N, lstm_size)
lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size) # one LSTM cell
drop = tf.contrib.rnn.DropoutWrapper(lstm, output_keep_prob=keep_prob_) # dropout on cell outputs
cell = tf.contrib.rnn.MultiRNNCell([drop] * lstm_layers) # stack lstm_layers cells
initial_state = cell.zero_state(batch_size, tf.float32) # zeroed initial hidden state
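If it helps, here is a NumPy-only walkthrough of the shapes those lines produce, assuming seq_len = 128, n_channels = 9 and lstm_size = 27 (my reading of the repo's figure): the dense layer is what takes the 9 input channels to the 27-dimensional vectors the LSTM consumes at each of the 128 steps.

```python
import numpy as np

N, seq_len, n_channels, lstm_size = 4, 128, 9, 27  # assumed hyperparameters

x = np.zeros((N, seq_len, n_channels))   # inputs_: (batch, steps, channels)
x = np.transpose(x, (1, 0, 2))           # (seq_len, N, n_channels)
x = np.reshape(x, (-1, n_channels))      # (seq_len*N, n_channels)

W = np.zeros((n_channels, lstm_size))    # stand-in for the dense layer's weights
x = x @ W                                # (seq_len*N, lstm_size): 9 -> 27 per step
steps = np.split(x, seq_len, axis=0)     # seq_len arrays, each (N, lstm_size)

print(len(steps), steps[0].shape)        # 128 (4, 27)
```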
Hello, first of all, thank you for open-sourcing your model. Could you share some of your data with me? I don't need all of it, just enough to get the model running. Thanks.
xgboost at 96 is better than the CNN at 93; can you tell me why?
thank you
Hi,
Thanks for sharing the code, but when running HAR-CNN_LSTM.ipynb I got the following error, directly related to:
with graph.as_default():
outputs, final_state = tf.contrib.rnn.static_rnn(cell, lstm_in, dtype=tf.float32,
initial_state = initial_state)
The error message is as follows. Could you take a look and see how to fix it? Thanks.
ValueError: Attempt to reuse RNNCell <tensorflow.contrib.rnn.python.ops.core_rnn_cell_impl.BasicLSTMCell object at 0x2b53b31b2710> with a different variable scope than its first use. First use of cell was with scope 'rnn/multi_rnn_cell/cell_0/basic_lstm_cell', this attempt is with scope 'rnn/multi_rnn_cell/cell_1/basic_lstm_cell'. Please create a new instance of the cell if you would like it to use a different set of weights. If before you were using: MultiRNNCell([BasicLSTMCell(...)] * num_layers), change to: MultiRNNCell([BasicLSTMCell(...) for _ in range(num_layers)]). If before you were using the same cell instance as both the forward and reverse cell of a bidirectional RNN, simply create two instances (one for forward, one for reverse). In May 2017, we will start transitioning this cell's behavior to use existing stored weights, if any, when it is called with scope=None (which can lead to silent model degradation, so this error will remain until then.)
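The fix is the one the message itself suggests: `[drop] * lstm_layers` builds a list of `lstm_layers` references to one cell object, and newer TensorFlow 1.x versions refuse to reuse a single cell across variable scopes. The underlying Python semantics can be shown without TensorFlow:

```python
class Cell:  # stand-in for a TF RNN cell
    pass

lstm_layers = 2
shared = [Cell()] * lstm_layers               # N references to ONE object
fresh = [Cell() for _ in range(lstm_layers)]  # N distinct objects

print(shared[0] is shared[1], fresh[0] is fresh[1])  # True False
```

So in the notebook, replacing `MultiRNNCell([drop] * lstm_layers)` with a list comprehension that builds a fresh `BasicLSTMCell` (and its `DropoutWrapper`) per layer, as the error text recommends, should resolve the ValueError.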