Statistics on lstm

Number of watchers on GitHub: 535
Number of open issues: 10
Average time to close an issue: about 2 months
Main language: Lua
Average time to merge a PR: about 1 hour
Open pull requests: 1+
Closed pull requests: 5+
Last commit: almost 4 years ago
Repo created: about 4 years ago
Repo last updated: about 1 year ago
Size: 2.21 MB
Organization / Author: wojzaremba
Contributors: 3
Long Short Term Memory Units

This is a self-contained package for training a word-level language model on the Penn Tree Bank dataset. It reaches a perplexity of 115 with the small model in about an hour, and 81 with the big model in about a day; an ensemble of 38 big models reaches a perplexity of 69. The code is derived from https://github.com/wojciechz/learning_to_execute (same author, different company).
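For context on those figures: perplexity is the exponential of the mean per-word negative log-likelihood (in nats), so it maps directly onto the training loss the model minimizes. A minimal sketch of that relationship (the function name is illustrative, not part of this package):

```python
import math

def perplexity(total_nll, num_words):
    """Perplexity = exp(mean per-word negative log-likelihood, natural log)."""
    return math.exp(total_nll / num_words)

# A model averaging ln(115) ~= 4.745 nats per word has perplexity 115.
```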

More information: http://arxiv.org/pdf/1409.2329v4.pdf

lstm open issues
  • almost 3 years QUESTION: How to run on a different treebank corpus
  • about 3 years question about the g_cloneManyTimes() function
  • over 3 years attempt to call field 'LogSoftMax_updateOutput' (a nil value)
  • over 3 years why split gates along dimension 2?
  • over 3 years Is cutorch.synchronize() necessary?
  • over 3 years questions about g_cloneManyTimes
  • almost 4 years replicate(x_inp, batch_size)
  • about 4 years 1 to params.seq_length -1 ?
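Several of the issues above concern how the gates are computed, e.g. why the fused pre-activation is split along dimension 2. In the common fused-LSTM layout, a single matrix multiply produces all four gate pre-activations side by side in the feature dimension (Torch's dimension 2, since dimension 1 is the batch), which are then sliced apart. An illustrative NumPy re-expression of that layout, not the package's Torch code (Torch's 1-indexed dimension 2 is NumPy's axis 1):

```python
import numpy as np

def lstm_cell(x, prev_c, prev_h, W_x, W_h, b):
    """One LSTM step with a fused gate pre-activation.

    Shapes: x (batch, input_size), prev_c/prev_h (batch, rnn_size),
    W_x (input_size, 4*rnn_size), W_h (rnn_size, 4*rnn_size), b (4*rnn_size,).
    """
    # One fused pre-activation of shape (batch, 4*rnn_size): the batch sits on
    # the first dimension, and the four gates sit side by side on the second
    # (feature) dimension -- hence the split along that dimension.
    pre = x @ W_x + prev_h @ W_h + b
    i, g, f, o = np.split(pre, 4, axis=1)
    in_gate = 1.0 / (1.0 + np.exp(-i))       # input gate
    in_transform = np.tanh(g)                # candidate cell update
    forget_gate = 1.0 / (1.0 + np.exp(-f))   # forget gate
    out_gate = 1.0 / (1.0 + np.exp(-o))      # output gate
    next_c = forget_gate * prev_c + in_gate * in_transform
    next_h = out_gate * np.tanh(next_c)
    return next_c, next_h
```

Fusing the four gates into one multiply is a performance choice: one large GEMM is faster than four small ones, at the cost of needing the split afterwards.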
lstm open pull requests
  • Optimized hyperparameters on "small" model, achieves 87 perplexity in 1 hour.
lstm questions on Stack Overflow
  • Training, testing, and validation sets for bidirectional LSTM (BLSTM)
  • How to stack LSTM layers to classify speech files
  • Stock market to LSTM
  • How to train the forget gate of an LSTM to keep the value in the cell, while the target of the block is zero?
  • LSTM Training: to backpropagate or not
  • Building Speech Dataset for LSTM binary classification
  • LSTM NN: forward propagation
  • How do I use 4096D vector as input to LSTM?
  • Classification with CNN/LSTM/RNN
  • Translating a TensorFlow LSTM into synapticjs
  • TensorFlow LSTM Generative Model
  • Request for example: Caffe RNN/LSTM regression for Python
  • LSTM network learning
  • Analysing the result of LSTM Theano Sentiment Analysis
  • plot in python for theano LSTM
  • Multivariate time-series RNN using Tensorflow. Is this possible with an LSTM cell or similar?
  • How to perform multi-label learning with LSTM using theano?
  • Keras LSTM predicting only 1 category, in multi-category classification - how to fix?
  • LSTM Model in Torch not Learning
  • How to use LSTM for sequence labelling in python?
  • LSTM with rnn cuda()?
  • LSTM Classifying all the words as the same class
  • Simple LSTM failing due undefined input dimension
  • Simplest Lstm training with Keras io
  • What is a Recurrent Neural Network, what is a Long Short Term Memory (LSTM) network, and is it always better?
  • Training LSTM Model in Torch and Data Structure
  • Training LSTM in Torch
  • PyBrain LSTM Example resulting in ValueError: Attempted relative import in non-package
  • Large Difference in Output: nnGraph based LSTM vs Sequencer LSTM (Torch)
  • Pybrain - lstm sequence bad predictions
lstm list of languages used
  • Lua