LSTM-Human-Activity-Recognition and TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs
The second project, "TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs," is a natural companion to the first, "LSTM-Human-Activity-Recognition": it provides an iPython notebook and an Android app demonstrating how to deploy an LSTM model, similar to the one built in the first project, on an Android device.
About LSTM-Human-Activity-Recognition
guillaume-chevalier/LSTM-Human-Activity-Recognition
Human Activity Recognition example using TensorFlow on smartphone sensors dataset and an LSTM RNN. Classifying the type of movement amongst six activity categories - Guillaume Chevalier
# Technical Summary

Employs a many-to-one LSTM architecture that processes 128-sample time windows of 9-channel inertial sensor data (3-axis accelerometer and gyroscope readings) without extensive feature engineering, relying instead on the recurrent network to learn temporal patterns across sequential measurements automatically. Minimal preprocessing is applied beyond gravity filtering, in contrast with traditional signal-processing-heavy approaches that require manual feature extraction. Built with TensorFlow, the project includes Jupyter notebook implementations demonstrating end-to-end data loading, model training, and evaluation metrics on the UCI HAR Dataset.
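The many-to-one setup described above can be sketched in a few lines of modern TensorFlow. This is an illustrative reconstruction, not the repository's exact code: the layer width (32 units) and optimizer settings are assumptions, while the input shape (128 timesteps × 9 channels) and the 6 output classes come from the summary.

```python
import numpy as np
import tensorflow as tf

# Illustrative many-to-one LSTM classifier for the UCI HAR setup:
# 128-sample windows, 9 inertial channels, 6 activity classes.
# Layer sizes and hyperparameters are assumptions for the sketch.
N_TIMESTEPS = 128   # samples per sliding window
N_CHANNELS = 9      # accelerometer + gyroscope channels
N_CLASSES = 6       # walking, upstairs, downstairs, sitting, standing, laying

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(N_TIMESTEPS, N_CHANNELS)),
    tf.keras.layers.LSTM(32),                       # many-to-one: last hidden state only
    tf.keras.layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy batch: 4 windows of raw (gravity-filtered) sensor readings
x = np.random.randn(4, N_TIMESTEPS, N_CHANNELS).astype("float32")
probs = model(x).numpy()
print(probs.shape)  # (4, 6)
```

Because the LSTM returns only its final hidden state, no per-timestep feature engineering is needed; the recurrence itself summarizes the window before the dense softmax layer classifies it.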
About TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs
curiousily/TensorFlow-on-Android-for-Human-Activity-Recognition-with-LSTMs
iPython notebook and Android app that show how to build an LSTM model in TensorFlow and deploy it on Android
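The Android deployment step hinges on exporting the trained model into a file the app can bundle. The original project predates TensorFlow Lite and shipped a frozen TF 1.x graph; the sketch below shows the present-day equivalent of that export step, with an assumed model shape mirroring the HAR setup (128 timesteps × 9 channels, 6 classes).

```python
import tensorflow as tf

# Hypothetical export of a trained HAR LSTM for Android via TensorFlow Lite.
# The architecture here is a stand-in; in practice you would convert the
# model you actually trained in the notebook.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 9)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(6, activation="softmax"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Allow TF-select ops in case the LSTM does not map onto fused TFLite kernels.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_bytes = converter.convert()

with open("har_lstm.tflite", "wb") as f:
    f.write(tflite_bytes)  # ship in the Android app's assets/ directory
```

On the Android side, the app would load this file with the TFLite Interpreter, feed it a 128×9 window of live sensor readings, and read back the 6-way softmax, which matches the notebook-to-app workflow the repository describes.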