
[Practical Resources] A Comprehensive Collection of RNN and LSTM Resources

Practical Tutorials · aiwanyule · 2016-05-29


Types of RNN

1) Plain Tanh Recurrent Neural Networks

2) Gated Recurrent Neural Networks (GRU)

3) Long Short-Term Memory (LSTM)
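
To make the differences concrete, here is a minimal NumPy sketch (my own, not taken from any resource below) of a single-step update for each of the three cell types; the parameter names (`W_*`, `U_*`, `b_*`) are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 1) Plain tanh RNN: one squashed linear map of input and previous state.
def rnn_step(x, h, W_xh, W_hh, b):
    return np.tanh(W_xh @ x + W_hh @ h + b)

# 2) GRU: update gate z and reset gate r interpolate between the old
#    state and a candidate state (Cho et al., 2014 formulation).
def gru_step(x, h, P):
    z = sigmoid(P["W_z"] @ x + P["U_z"] @ h + P["b_z"])   # update gate
    r = sigmoid(P["W_r"] @ x + P["U_r"] @ h + P["b_r"])   # reset gate
    h_cand = np.tanh(P["W_h"] @ x + P["U_h"] @ (r * h) + P["b_h"])
    return (1.0 - z) * h + z * h_cand

# 3) LSTM: forget/input/output gates plus a separate cell state c.
def lstm_step(x, h, c, P):
    f = sigmoid(P["W_f"] @ x + P["U_f"] @ h + P["b_f"])   # forget gate
    i = sigmoid(P["W_i"] @ x + P["U_i"] @ h + P["b_i"])   # input gate
    o = sigmoid(P["W_o"] @ x + P["U_o"] @ h + P["b_o"])   # output gate
    c_cand = np.tanh(P["W_c"] @ x + P["U_c"] @ h + P["b_c"])
    c_new = f * c + i * c_cand                             # new cell state
    return o * np.tanh(c_new), c_new
```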

Tutorials

A Beginner’s Guide to Recurrent Networks and LSTMs

http://deeplearning4j.org/lstm.html

A Deep Dive into Recurrent Neural Nets

http://nikhilbuduma.com/2015/01/11/a-deep-dive-into-recurrent-neural-networks/

Last time, we talked about the traditional feed-forward neural net and concepts that form the basis of deep learning. These ideas are extremely powerful! We saw how feed-forward convolutional neural networks have set records on many difficult tasks including handwritten digit recognition and object classification. And even today, feed-forward neural networks consistently outperform virtually all other approaches to solving classification tasks.

Long Short-Term Memory: Tutorial on LSTM Recurrent Networks

http://people.idsia.ch/~juergen/lstm/index.htm

LSTM implementation explained

http://apaszke.github.io/lstm-explained.html

Recurrent Neural Networks Tutorial

Understanding LSTM Networks

Recurrent Neural Networks in DL4J

http://deeplearning4j.org/usingrnns.html

Learning RNN Hierarchies


Train RNN

A Simple Way to Initialize Recurrent Networks of Rectified Linear Units (the "IRNN" identity-initialization idea; see the sketch after this list)

Sequence Level Training with Recurrent Neural Networks (ICLR 2016)

Training Recurrent Neural Networks (PhD thesis)

Deep learning for control using augmented Hessian-free optimization


Hierarchical Conflict Propagation: Sequence Learning in a Recurrent Deep Neural Network

Recurrent Batch Normalization

Optimizing Performance of Recurrent Neural Networks on GPUs
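
The IRNN paper flagged above (Le, Jaitly & Hinton) boils down to a few lines: initialize the recurrent matrix to the identity, zero the bias, and use ReLU instead of tanh. A minimal sketch under those assumptions (sizes and the input-weight scale are my own illustrative choices):

```python
import numpy as np

hidden, inputs = 128, 64

# Identity recurrent matrix + zero bias: the recurrent step starts out
# as a near-identity map, so gradients can flow across many time steps
# early in training.
W_hh = np.eye(hidden)
W_xh = np.random.randn(hidden, inputs) * 0.01  # small random input weights
b_h = np.zeros(hidden)

def irnn_step(x, h):
    return np.maximum(0.0, W_xh @ x + W_hh @ h + b_h)  # ReLU, not tanh
```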

Learn To Execute Programs

Learning to Execute

Neural Programmer-Interpreters (Google DeepMind. ICLR 2016 Best Paper)

(Demo animations from the Neural Programmer-Interpreters page: add.gif, cars.gif)

A Programmer-Interpreter Neural Network Architecture for Prefrontal Cognitive Control

Convolutional RNN: an Enhanced Model for Extracting Features from Sequential Data

Attention Models

Recurrent Models of Visual Attention (Google DeepMind. NIPS2014)

Recurrent Model of Visual Attention (Google DeepMind)

Show, Attend and Tell: Neural Image Caption Generation with Visual Attention

A Neural Attention Model for Abstractive Sentence Summarization (EMNLP 2015. Facebook AI Research)

Effective Approaches to Attention-based Neural Machine Translation (EMNLP 2015)

Generating Images from Captions with Attention

Attention and Memory in Deep Learning and NLP

Survey on the attention based RNN model and its applications in computer vision
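
Nearly all of the attention papers above share one soft-attention core: score each encoder state against a query, softmax the scores, and take the weighted average. A minimal sketch (scoring functions vary by paper; dot-product scoring is an assumption here):

```python
import numpy as np

def soft_attention(query, keys, values):
    # keys: (T, d), values: (T, d_v), query: (d,)
    scores = keys @ query                    # one score per time step
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ values                  # context vector: weighted sum
```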

Papers

Generating Sequences With Recurrent Neural Networks

Unsupervised Learning of Video Representations using LSTMs (ICML 2015)

LSTM: A Search Space Odyssey

Inferring Algorithmic Patterns with Stack-Augmented Recurrent Nets

A Critical Review of Recurrent Neural Networks for Sequence Learning

Scheduled Sampling for Sequence Prediction with Recurrent Neural Networks (winner of the MSCOCO image captioning challenge, 2015; see the sketch after this list)

Visualizing and Understanding Recurrent Networks (ICLR 2016. Andrej Karpathy, Justin Johnson, Fei-Fei Li)

Grid Long Short-Term Memory

Depth-Gated LSTM

Deep Knowledge Tracing

Top-down Tree Long Short-Term Memory Networks

Alternative structures for character-level RNNs (INRIA & Facebook AI Research. ICLR 2016)

Pixel Recurrent Neural Networks (Google DeepMind)

Long Short-Term Memory-Networks for Machine Reading

Lipreading with Long Short-Term Memory

Associative Long Short-Term Memory

Representation of linguistic form and function in recurrent neural networks

Architectural Complexity Measures of Recurrent Neural Networks

Easy-First Dependency Parsing with Hierarchical Tree LSTMs

Training Input-Output Recurrent Neural Networks through Spectral Methods
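
As flagged above, Scheduled Sampling (Bengio et al., 2015) tackles exposure bias with a training-time coin flip: feed the gold previous token with probability epsilon, otherwise the model's own previous prediction, and anneal epsilon toward zero. A sketch of the core step (linear decay is one of the paper's schedules; the rate `k` is an illustrative assumption):

```python
import random

def previous_token(gold_prev, model_prev, epsilon):
    # With probability epsilon feed the ground-truth previous token,
    # otherwise feed the model's own previous prediction.
    return gold_prev if random.random() < epsilon else model_prev

def epsilon_at(step, k=1e-5):
    # Linear decay from 1.0 (pure teacher forcing) toward 0.0.
    return max(0.0, 1.0 - k * step)
```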

Projects

NeuralTalk (Deprecated): a Python+numpy project for learning Multimodal Recurrent Neural Networks that describe images with sentences

NeuralTalk2: Efficient Image Captioning code in Torch, runs on GPU

char-rnn in Blocks

Project: pycaffe-recurrent

Using neural networks for password cracking

torch-rnn: Efficient, reusable RNNs and LSTMs for torch

Deploying a model trained with GPU in Torch into JavaScript, for everyone to use

LSTM implementation on Caffe

JNN: Java Neural Network Library

LSTM-Autoencoder: Seq2Seq LSTM Autoencoder

RNN Language Model Variations

keras-extra: Extra Layers for Keras to connect CNN with RNN

Blogs

Survey on Attention-based Models Applied in NLP

http://yanran.li/peppypapers/2015/10/07/survey-attention-model-1.html

Attention-based models were first proposed in the field of computer vision around mid-2014, and then spread into natural language processing. In this post, I will mainly focus on a list of attention-based models applied in natural language processing.

Survey on Advanced Attention-based Models

http://yanran.li/peppypapers/2015/10/07/survey-attention-model-2.html

In the previous post, I briefly introduced a list of papers applying attention-based models in natural language processing. Though slightly different, they are all soft alignment models. However, there actually exist two classes of alignment models: the soft one and the hard one. In fact, soft and hard alignment models appeared concurrently in computer vision around late 2014[^1]. Due to differences between CV and NLP (more precisely, image vs. language), hard alignment models are more difficult to transfer into NLP. In this post, I aim to introduce some advanced attention-based models, especially hard ones, which are not yet popular but will be.
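
For contrast with the soft-attention sketch earlier: a hard-alignment step samples a single position instead of taking a weighted average, which is why it is non-differentiable and typically trained with REINFORCE-style gradient estimators rather than plain backpropagation. A minimal sketch (dot-product scoring again assumed):

```python
import numpy as np

def hard_attention(query, keys, values, rng=np.random.default_rng(0)):
    scores = keys @ query
    probs = np.exp(scores - scores.max())   # softmax over positions
    probs /= probs.sum()
    idx = rng.choice(len(probs), p=probs)   # sample ONE position
    return values[idx]                      # non-differentiable selection
```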

Online Representation Learning in Recurrent Neural Language Models

http://www.marekrei.com/blog/online-representation-learning-in-recurrent-neural-language-models/

Fun with Recurrent Neural Nets: One More Dive into CNTK and TensorFlow

http://esciencegroup.com/2016/03/04/fun-with-recurrent-neural-nets-one-more-dive-into-cntk-and-tensorflow/

In a previous article I set about comparing Microsoft's Computational Network Toolkit for deep neural nets to Google's TensorFlow. I concluded that piece with a deep dive into how recurrent neura…

Materials to understand LSTM

https://medium.com/@shiyan/materials-to-understand-lstm-34387d6454c1

Understanding LSTM and its diagrams (★★★★★)

Persistent RNNs: 30 times faster RNN layers at small mini-batch sizes (Greg Diamos, Baidu Silicon Valley AI Lab)

http://svail.github.io/persistent_rnns/

All of Recurrent Neural Networks

https://medium.com/@jianqiangma/all-about-recurrent-neural-networks-9e5ae2936f6e

Rolling and Unrolling RNNs

https://shapeofdata.wordpress.com/2016/04/27/rolling-and-unrolling-rnns/

A while back, I discussed Recurrent Neural Networks (RNNs), a type of artificial neural network in which some of the connections between neurons point “backwards”. When a sequence of in…
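
The "unrolling" the post describes is just replaying one shared step function across the sequence, turning the recurrent cycle into a feed-forward chain; a minimal sketch (works with any of the step functions sketched earlier):

```python
def unroll(step, x_seq, h0):
    # Apply the same cell (shared weights) once per time step and keep
    # every hidden state, as backpropagation-through-time requires.
    h, states = h0, []
    for x in x_seq:
        h = step(x, h)
        states.append(h)
    return states
```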

Sequence prediction using recurrent neural networks (LSTM) with TensorFlow: LSTM regression using TensorFlow

Resources

Awesome Recurrent Neural Networks – A curated list of resources dedicated to RNN

Jürgen Schmidhuber’s page on Recurrent Neural Networks

http://people.idsia.ch/~juergen/rnn.html

Reading and Questions

Are there any recurrent convolutional neural network implementations out there?


