
Page 1:

Introduction to Deep Learning

College of IT, Kangwon National University

Changki Lee (이창기)

Much of the content and pictures come from http://web.stanford.edu/class/cs224n

Page 2:

Object Recognition

https://www.youtube.com/watch?v=n5uP_LP9SmM

Page 3:

Semantic Segmentation

https://youtu.be/ZJMtDRbqH40

Page 4:

Semantic Segmentation

VGGNet + Deconvolution network

Page 5:

Image Completion

https://vimeo.com/38359771

Page 6:

Neural Art

• Artistic style transfer using CNN

Page 7:

Handwriting by Machine

Demo: recurrent neural network handwriting generation (enter an input text, choose a handwriting style)

http://www.cs.toronto.edu/~graves/handwriting.html

Generated with an LSTM RNN

Page 8:

Music Composition

https://highnoongmt.wordpress.com/2015/05/22/lisls-stis-recurrent-neural-networks-for-folk-music-generation/

Page 9:

Image Caption Generation

Page 10:

Visual Question Answering

Facebook: Visual Q&A

Page 11:

Game Playing

Page 12:

Word Analogy

King – Man + Woman ≈ Queen

http://deeplearner.fz-qqq.net/
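The analogy is literally vector arithmetic followed by a nearest-neighbor search. A minimal numpy sketch, with toy 4-dimensional vectors invented purely for illustration (real embeddings such as word2vec or GloVe use hundreds of dimensions):

```python
import numpy as np

# Toy 4-d word vectors, invented for illustration only.
vecs = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "man":   np.array([0.9, 0.1, 0.1, 0.0]),
    "woman": np.array([0.1, 0.1, 0.9, 0.0]),
    "queen": np.array([0.1, 0.8, 0.9, 0.0]),
}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# king - man + woman should land nearest to queen
target = vecs["king"] - vecs["man"] + vecs["woman"]
best = max((w for w in vecs if w not in ("king", "man", "woman")),
           key=lambda w: cosine(target, vecs[w]))
print(best)  # queen
```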

Page 13:

Neural Machine Translation

http://104.131.78.120/

Page 14:

Neural Conversation Model

Page 15:

Abstractive Text Summarization

Example summary: “A stray cat was captured on camera keeping watch beside a friend killed on the road.” (original Korean: 로드킬로 숨진 친구의 곁을 지키는 길고양이의 모습이 포착되었다.)

RNN_search+input_feeding+CopyNet

Page 16:

Learning to Execute

LSTM RNN

Page 17:

Learning Approximate Solutions

• Travelling Salesman Problem: NP-hard

• A Pointer Network can learn approximate solutions: O(n²)
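For a feel of the mechanism, here is a minimal numpy sketch of one pointer-attention step: score every input position against the current decoder state, then softmax over the positions. W1, W2, and v are random stand-ins for what a trained Pointer Network would learn; this illustrates the scoring rule only, not a working TSP solver:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 8                    # 5 cities, hidden size 8 (toy sizes)
enc = rng.normal(size=(n, d))  # encoder state for each input city
dec = rng.normal(size=d)       # decoder state at the current step

# Hypothetical parameters; a trained Pointer Network learns these.
W1 = rng.normal(size=(d, d))
W2 = rng.normal(size=(d, d))
v = rng.normal(size=d)

# Pointer attention: score every input position, then softmax.
u = np.tanh(enc @ W1 + dec @ W2) @ v    # (n,) scores
p = np.exp(u - u.max()); p /= p.sum()   # distribution over cities
print("next city:", int(p.argmax()))
```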

Page 18:

Learning to Learn with RNN

• Deep learning: hand-designed features → learned features

• This technique: hand-designed update rules → learned update rules

SGD: $\theta_{t+1} = \theta_t - \eta \, \nabla_\theta J(\theta_t)$

Momentum: $v_{t+1} = \gamma v_t + \eta \, \nabla_\theta J(\theta_t),\;\; \theta_{t+1} = \theta_t - v_{t+1}$

Adagrad: $G_t = G_{t-1} + g_t^2,\;\; \theta_{t+1} = \theta_t - \dfrac{\eta}{\sqrt{G_t + \epsilon}}\, g_t$

Adadelta or RMSprop: $E[g^2]_t = \rho \, E[g^2]_{t-1} + (1-\rho)\, g_t^2,\;\; \theta_{t+1} = \theta_t - \dfrac{\eta}{\sqrt{E[g^2]_t + \epsilon}}\, g_t$

Adam: $m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t,\;\; v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2,\;\; \theta_{t+1} = \theta_t - \dfrac{\eta\, \hat m_t}{\sqrt{\hat v_t} + \epsilon}$, with $\hat m_t = m_t/(1-\beta_1^t)$ and $\hat v_t = v_t/(1-\beta_2^t)$
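For reference, the hand-designed rules above fit in a few lines of numpy; these are exactly the rules that the learning-to-learn approach replaces with a learned RNN update. The default hyperparameter values here are common choices, not values from the slides:

```python
import numpy as np

def sgd(theta, g, lr=0.1):
    return theta - lr * g

def momentum(theta, g, v, lr=0.1, gamma=0.9):
    v = gamma * v + lr * g
    return theta - v, v

def adagrad(theta, g, G, lr=0.1, eps=1e-8):
    G = G + g * g                        # accumulate squared gradients
    return theta - lr * g / np.sqrt(G + eps), G

def rmsprop(theta, g, Eg2, lr=0.01, rho=0.9, eps=1e-8):
    Eg2 = rho * Eg2 + (1 - rho) * g * g  # decaying average of g^2
    return theta - lr * g / np.sqrt(Eg2 + eps), Eg2

def adam(theta, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)            # bias correction
    v_hat = v / (1 - b2 ** t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```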

Page 19:

One Shot Learning

• Learning from a few examples

• Matching Nets use attention and memory

a(x1, x2) is an attention kernel
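A minimal sketch of that kernel, assuming a(x1, x2) is a softmax over cosine similarities between the query embedding and each support embedding (one common choice in Matching Networks); the embeddings below are random stand-ins for learned ones:

```python
import numpy as np

rng = np.random.default_rng(0)
support_x = rng.normal(size=(5, 16))    # 5 support embeddings (stand-ins)
support_y = np.eye(3)[[0, 1, 2, 0, 1]]  # their one-hot labels, 3 classes
query = rng.normal(size=16)             # embedded test example

def unit(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Attention kernel: softmax over cosine similarities to each support item.
sims = unit(support_x) @ unit(query)
a = np.exp(sims) / np.exp(sims).sum()

y_hat = a @ support_y                   # attention-weighted label vote
print("predicted class:", int(y_hat.argmax()))
```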

Page 20:

What’s Deep Learning?

• Deep learning is a subfield of machine learning

• Most machine learning methods work well because of human-designed representations and input features

– For example: features for finding named entities like locations or organization names (Finkel et al., 2010)

• Machine learning becomes just optimizing weights to best make a final prediction

Page 21:

Machine Learning vs. Deep Learning

Page 22:

What’s Deep Learning?

• Representation learning attempts to automatically learn good features or representations

• Deep learning algorithms attempt to learn (multiple levels of) representation and an output from “raw” inputs x (e.g., sound, characters, or words)

Page 23:

Reasons for Exploring Deep Learning

• Manually designed features are often over-specified and incomplete, and they take a long time to design and validate

• Learned features are easy to adapt and fast to learn

• Deep learning provides a very flexible, (almost?) universal, learnable framework for representing world, visual and linguistic information

• Deep learning can learn unsupervised (from raw text) and supervised (with specific labels like positive/negative)

Page 24:

Reasons for Exploring Deep Learning

• In ~2010 deep learning techniques started outperforming other machine learning techniques. Why this decade?

• Large amounts of training data favor deep learning

• Faster machines and multicore CPUs/GPUs favor Deep Learning

• New models, algorithms, ideas

– Better, more flexible learning of intermediate representations

– Effective end-to-end joint system learning

– Effective learning methods for using contexts and transferring between tasks

• Improved performance (first in speech and vision, then NLP)

Page 25:

Why Deep Neural Networks?: Integrated Learning

• Conventional machine learning methodology

– Handcrafting features is time-consuming

• Deep Neural Network: feature extractor + classifier

<Based on the 겨울학교14 (Winter School ’14) Deep Learning slides>

Page 26:

Why Deep Neural Networks?: Unsupervised Feature Learning

• Machine learning needs a large amount of training data

– Labeled training data is usually scarce

• Building training data costs time and money

– Large raw corpora (unlabeled data) are plentiful

• Semi-supervised, unsupervised, …

• Deep Neural Network

– Learns features from large raw corpora through pre-training (see the sketch below)

– Restricted Boltzmann Machines (RBM)

– Stacked Autoencoder, Stacked Denoising Autoencoder

– Word Embedding (for NLP)
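As a concrete instance of the last point, one way to pre-train word embeddings from a raw, unlabeled corpus is gensim's Word2Vec. The two-sentence "corpus" below is a toy stand-in for a large raw corpus, and the hyperparameters are illustrative, not values from the slides:

```python
from gensim.models import Word2Vec

# Toy stand-in for a large raw (unlabeled) corpus: tokenized sentences.
raw_corpus = [
    ["deep", "learning", "learns", "features", "from", "raw", "text"],
    ["word", "embeddings", "are", "learned", "without", "labels"],
]

# sg=1 selects the skip-gram training objective.
model = Word2Vec(sentences=raw_corpus, vector_size=50, window=3,
                 min_count=1, sg=1)
print(model.wv["learning"].shape)  # (50,) learned feature vector
```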

Page 27:

Deep Learning for Speech

• The first breakthrough results of “deep learning” on large datasets happened in speech recognition

• Context-Dependent Pre-trained Deep Neural Networks for Large Vocabulary Speech Recognition, Dahl et al. (2010)

Page 28:

Deep Learning for Computer Vision

• Most deep learning groups have focused on computer vision (at least till 2 years ago)

• The breakthrough DL paper:

– ImageNet Classification with Deep Convolutional Neural Networks by Krizhevsky, Sutskever, & Hinton, 2012, U. Toronto. 37% error reduction.

Page 29:

Deep NLP = Deep Learning + NLP

• Combine the ideas and goals of NLP with representation learning and deep learning methods to solve them

• Several big improvements in recent years in NLP with different

– Levels: speech, words, syntax, semantics

– Tools: parts-of-speech, entities, parsing

– Applications: machine translation, sentiment analysis, dialogue agents, question answering

Page 30:

Word meaning as a neural word vector – visualization

Page 31:

Word similarities

• Nearest words to frog:

– 1. frogs

– 2. toad

– 3. litoria

– 4. leptodactylidae

– 5. rana

– 6. lizard

– 7. eleutherodactylus

http://nlp.stanford.edu/projects/glove/
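Nearest-word lists like this come from cosine similarity in the embedding space. A minimal sketch, assuming vecs is a dict mapping each word to a 1-D numpy vector (e.g., parsed from the GloVe text files at the URL above):

```python
import numpy as np

def nearest_words(query, vecs, k=7):
    """Return the k words whose vectors are most cosine-similar to `query`.

    `vecs` is assumed to map word -> 1-D numpy array; `query` must be a
    key in `vecs`.
    """
    q = vecs[query]
    q = q / np.linalg.norm(q)
    scored = [(w, (v / np.linalg.norm(v)) @ q)
              for w, v in vecs.items() if w != query]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:k]

# nearest_words("frog", vecs) -> [("frogs", ...), ("toad", ...), ...]
```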

Page 32:

Representations of NLP Levels: Morphology

• Traditional: words are made of morphemes

– prefix + stem + suffix

– un + interest + ed

• DL:

– every morpheme is a vector

– a neural network combines two vectors into one vector (see the sketch below)

– Luong et al. 2013
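A minimal numpy sketch of that composition step: two morpheme vectors are concatenated and squashed into one parent vector. W and b are random stand-ins for parameters that a model like Luong et al.'s would learn:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                                # toy embedding size
W = rng.normal(size=(d, 2 * d))      # stand-in for a learned weight matrix
b = np.zeros(d)

def compose(left, right):
    """Combine two morpheme/word vectors into one parent vector."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

un, interest, ed = (rng.normal(size=d) for _ in range(3))
uninterest = compose(un, interest)      # un + interest
uninterested = compose(uninterest, ed)  # (un + interest) + ed
print(uninterested.shape)               # (8,)
```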

Page 33:

NLP Tools: Parsing for sentence structure

• Neural networks can accurately determine the structure of sentences, supporting interpretation

Page 34:

Representations of NLP Levels: Semantics

• Traditional: lambda calculus

– carefully engineered functions

– take as inputs specific other functions

– no notion of similarity or fuzziness of language

• DL:

– every word, every phrase, and every logical expression is a vector

– a neural network combines two vectors into one vector

– Bowman et al. 2014

Page 35:

NLP Applications: Sentiment Analysis

• Traditional: Curated sentiment dictionaries combined with either bag-of-words representations (ignoring word order) or hand-designed negation features (ain’t gonna capture everything)

• The same deep learning model that was used for morphology, syntax, and logical semantics can be used: a Recursive NN!

Page 36:

Question Answering

• Traditional: A lot of feature engineering to capture world and other knowledge, e.g., regular expressions, Berant et al. (2014)

• DL: Again, a deep learning architecture can be used!

• Facts are stored in vectors

Page 37:

Dialogue agents / Response Generation

• A simple, successful example is the auto-replies available in the Google Inbox app

• An application of the powerful, general technique of Neural Language Models, which are an instance of Recurrent Neural Networks
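A minimal numpy sketch of one step of such a recurrent neural language model: consume a word, update the hidden state, and output a distribution over the next word. All weights are random stand-ins for learned parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 10, 16                        # toy vocabulary and hidden sizes
Wxh = rng.normal(size=(H, V)) * 0.1  # stand-ins for learned parameters
Whh = rng.normal(size=(H, H)) * 0.1
Why = rng.normal(size=(V, H)) * 0.1

def step(word_id, h):
    """One RNN LM step: read a word, return next-word distribution."""
    x = np.zeros(V); x[word_id] = 1.0     # one-hot input word
    h = np.tanh(Wxh @ x + Whh @ h)        # update hidden state
    logits = Why @ h
    p = np.exp(logits - logits.max()); p /= p.sum()
    return p, h

h = np.zeros(H)
for w in [3, 1, 4]:                  # a toy word-id sequence
    p_next, h = step(w, h)
print("most likely next word id:", int(p_next.argmax()))
```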

Page 38:

Machine Translation

• Many levels of translation have been tried in the past.

• Traditional MT systems are very large, complex systems

Page 39:

Neural Machine Translation

• The source sentence is mapped to a vector, then the output sentence is generated

– [Sutskever et al. 2014, Bahdanau et al. 2014, Luong and Manning 2016]
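A minimal numpy sketch of that encode-then-generate idea (in the style of Sutskever et al. 2014, without attention): an encoder RNN squeezes the source sentence into its final hidden state, which seeds a decoder RNN for greedy generation. All weights and word ids are toy stand-ins for a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 12, 16                            # toy vocab and hidden sizes
Wxe = rng.normal(size=(H, V)) * 0.1      # stand-ins for learned weights
Whe = rng.normal(size=(H, H)) * 0.1
Wxd = rng.normal(size=(H, V)) * 0.1
Whd = rng.normal(size=(H, H)) * 0.1
Wo  = rng.normal(size=(V, H)) * 0.1

def onehot(i):
    x = np.zeros(V); x[i] = 1.0; return x

# Encoder: the whole source sentence is squeezed into one vector h.
h = np.zeros(H)
for w in [5, 2, 7]:                      # toy source word ids
    h = np.tanh(Wxe @ onehot(w) + Whe @ h)

# Decoder: start from h and greedily emit words (id 2 plays the role
# of a start symbol here, by fiat).
y = 2
for _ in range(4):
    h = np.tanh(Wxd @ onehot(y) + Whd @ h)
    y = int((Wo @ h).argmax())
    print("emit word id:", y)
```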

Page 40:

Conclusion: Representation for all levels? Vectors

• We will study in the next lecture how we can learn vector representations for words and what they actually represent