Deep Learning Basic Study
This is the repository for the Deep Learning Study of D.COM, the computer engineering club at Kyung Hee University.
This study is recommended for those who want to review machine learning concepts and for those who have just learned Python.
- The course material is written to be accessible even to someone who has just started Python.
- Tae Hwan Jung (@graykode) leads this study, using PyTorch as the main deep learning framework, but beginner-friendly implementations in TensorFlow, PyTorch, and Keras are provided.
- We cover basic mathematical theory and basic deep learning models such as DNN, CNN, RNN, and LSTM. In this first study, every piece of code is implemented in fewer than 30 lines.
- We use Google Colaboratory GPUs for compute, so you can run everything easily from the Colab links. (Thanks, Google!)
- The linked lecture materials are written in Korean; only the contents are written in English.
Contribution Guide
If you find an English link, or a helpful link in any language, please contribute it to this README in Markdown like this:
Linear Regression([Eng](your contribution link), Kor)
Curriculum
Please see the Contents below.
- Week 1
  - Basic Probability Review
  - Supervised Learning vs. Unsupervised Learning
  - Linear Regression, Logistic Regression
  - Manual gradient descent implementation in pure Python
- Week 2
  - How to use Google Colaboratory
  - Linear Regression and Logistic Regression review; converting the manual implementation to an automatic one using PyTorch
- Week 3
  - Classification with a DNN (Deep Neural Network) in PyTorch
  - Applying the Regularization (Dropout) concept to a DNN
  - Optimization functions in PyTorch: mini-batch, SGD, Adagrad, RMSProp, AdaDelta, and Adam optimizers
- Week 4
  - Basic Convolutional Neural Network
  - Loading a dataset and using a data loader with torchvision
  - Applying the Machine Learning Diagnostic concept (train set, cross-validation set, test set) to a DNN
  - Implementing MNIST classification using a CNN
- Week 5
  - Basic RNN (Recurrent Neural Network) and LSTM in PyTorch
  - Teacher Forcing vs. no Teacher Forcing
  - Practice: predict the next word using an RNN or LSTM
- Week 6 - Hackathon
  - Topic 1: classify cat and dog images (Dataset)
  - Topic 2: classify Korean Naver movie reviews as positive or negative (Dataset)
Contents
0. Review of Basic Mathematical Theory with Pure Python
- Supervised Learning vs. Unsupervised Learning: in this study, we deal only with supervised models.
- Basic Probability Review
  - Bayes' Theorem (Eng, Kor), Bayesian Inference in a Generative Model (Eng, Kor)
  - Generative Model vs. Discriminative Model (Eng, Kor)
  - Maximum Likelihood vs. Maximum A Posteriori (Eng, Kor)
  - Maximizing Likelihood is Minimizing Cross-Entropy (Eng, Kor)
- Linear Regression (Eng, Kor)
  - Univariate Linear Regression (Eng, Kor) vs. Multivariate Linear Regression (Eng, Kor)
  - Loss function and activation function in Linear Regression
    - activation function: identity map (Eng, Kor)
    - loss function: MSE (Eng, Kor)
  - Gradient Descent in Linear Regression
  - Problem: XOR
- Logistic Regression
  - How is it different from Linear Regression? (Eng, Kor)
  - Loss function and activation function in Logistic Regression
    - activation function: sigmoid vs. tanh vs. ReLU vs. softmax
    - loss function: Maximizing Likelihood is Minimizing Cross-Entropy (Eng, Kor)
  - Gradient Descent in Logistic Regression
  - Difference between binary classification and multi-class classification (sigmoid vs. softmax) (Eng, Kor1, Kor2)
  - Difference between multi-class classification and multi-label classification (Eng, Kor)
- Optimization
  - Regularization
    - What is Overfitting? (Eng, Kor)
    - Regularization: weight decay
      - weight decay in Linear Regression (Eng, Kor)
      - weight decay in Logistic Regression (Eng, Kor)
    - Regularization: dropout (Eng, Kor)
- Machine Learning Diagnostic
  - Train Set, Cross-Validation Set, Test Set (Eng, Kor)
  - Bias vs. Variance (Eng, Kor)
  - Learning Curves (Eng, Kor)
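The gradient descent that section 0 builds toward is written in pure Python, before any framework. A minimal sketch for univariate linear regression; the toy data, learning rate, and iteration count here are illustrative, not the course's actual exercise:

```python
# Univariate linear regression y = w*x + b fit by gradient descent,
# using only pure Python (no framework). Data is made up for illustration.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]  # underlying relation: y = 2x + 1

w, b, lr = 0.0, 0.0, 0.01
n = len(xs)
for _ in range(5000):
    # Gradients of the MSE loss (1/n) * sum((w*x + b - y)^2)
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * dw
    b -= lr * db

print(round(w, 2), round(b, 2))  # converges toward w ≈ 2, b ≈ 1
```

The same two update rules reappear in every later model; only the loss and the number of parameters change.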
1. Deep Learning Framework Basics
- Abstracting a model with a PyTorch class: 1.Pytorch-Basic.py
- How to use Google Colaboratory
- Converting manual gradient descent to automatic gradient descent
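The manual-to-automatic conversion can be sketched as follows, assuming PyTorch is installed; the data and hyperparameters are illustrative:

```python
import torch

# Univariate linear regression again, but the gradients now come from
# autograd instead of hand-derived formulas. Data is illustrative.
x = torch.tensor([1.0, 2.0, 3.0, 4.0])
y = torch.tensor([3.0, 5.0, 7.0, 9.0])  # underlying relation: y = 2x + 1

w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lr = 0.01
for _ in range(5000):
    loss = ((w * x + b - y) ** 2).mean()  # MSE loss
    loss.backward()                       # autograd fills w.grad and b.grad
    with torch.no_grad():
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()
# w and b converge toward 2 and 1, matching the pure-Python version
```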
2. DNN (Deep Neural Network)
- Mathematical Back-Propagation in a Deep Neural Network (Eng, Kor1, Kor2)
- Basic classification using a Deep Neural Network
  - Classification: Linear Regression in a Deep Neural Network
  - Classification: Logistic Regression in a Deep Neural Network
- Dropout in a Deep Neural Network: 2.DNN-Dropout.py
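A minimal sketch of dropout in a DNN, in the `nn.Module` subclass style the course files use; the layer sizes and dropout rate are illustrative, not those of `2.DNN-Dropout.py`:

```python
import torch
import torch.nn as nn

# A small fully-connected classifier with dropout. Dimensions are
# illustrative (784 inputs / 10 classes, as for flattened MNIST).
class DNN(nn.Module):
    def __init__(self, in_dim=784, hidden=256, classes=10, p=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Dropout(p),  # randomly zeroes activations during training
            nn.Linear(hidden, classes),
        )

    def forward(self, x):
        return self.net(x)

model = DNN()
model.eval()  # dropout is active only in training mode
out = model(torch.randn(2, 784))
print(out.shape)  # torch.Size([2, 10])
```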
3. DataLoader, basic Dataset, and image handling
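The Dataset/DataLoader pattern this section covers can be sketched as below; a toy in-memory dataset stands in for `torchvision.datasets.MNIST` so the example runs without a download:

```python
import torch
from torch.utils.data import Dataset, DataLoader

# Toy in-memory dataset standing in for an image dataset such as
# torchvision.datasets.MNIST; the DataLoader usage is identical.
class ToyImages(Dataset):
    def __init__(self, n=100):
        self.images = torch.randn(n, 1, 28, 28)   # fake 28x28 grayscale
        self.labels = torch.randint(0, 10, (n,))

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        return self.images[idx], self.labels[idx]

loader = DataLoader(ToyImages(), batch_size=32, shuffle=True)
for imgs, labels in loader:
    print(imgs.shape, labels.shape)  # batches of (32, 1, 28, 28), last one smaller
    break
```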
4. CNN (Convolutional Neural Network)
- Awesome lecture
- Structure of a CNN
  - 4.CNN-Introduce.py
  - Convolutional Layer
    - Role of the filter (= kernel) vs. receptive fields
    - Role of padding
    - Weight sharing in the Convolutional Layer
    - Role of channels, and why multiple channels are used
  - Weight sharing in a CNN
  - Pooling Layer
    - Max Pooling
    - Average Pooling
- Feed-forward in a Convolutional Neural Network
- Mathematical Back-Propagation in a Convolutional Neural Network
- Practice: MNIST classification
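The conv -> pool -> fully-connected structure described above, sized for 28x28 MNIST images, can be sketched as follows; the channel counts and kernel size are illustrative:

```python
import torch
import torch.nn as nn

# Minimal CNN for 28x28 grayscale inputs: one convolutional layer with
# padding (so the spatial size is preserved), max pooling, then a
# fully-connected output layer over 10 classes.
class CNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 28x28 -> 14x14
        )
        self.fc = nn.Linear(16 * 14 * 14, 10)

    def forward(self, x):
        x = self.conv(x)
        return self.fc(x.flatten(1))  # flatten feature maps per sample

out = CNN()(torch.randn(4, 1, 28, 28))
print(out.shape)  # torch.Size([4, 10])
```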
5. RNN (Recurrent Neural Network)
6. LSTM (Long Short-Term Memory)
- Structure of an LSTM
  - 6.LSTM-Introduce.py
  - Hidden State, Cell State
  - Differences between an RNN and an LSTM
  - Output Layer
  - Weight sharing in an RNN
- Feed-forward in an LSTM (Eng, Kor)
- Mathematical Back-Propagation in an LSTM (Eng, Kor)
- Bi-directional LSTM (BiLSTM) (Eng, Kor)
- Practice: autocomplete with an LSTM
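In the spirit of the LSTM autocomplete practice, a character-level sketch: embed a token sequence, run it through an LSTM, and predict the next token from the final hidden state. The vocabulary size and dimensions are illustrative, not those of the course's exercise:

```python
import torch
import torch.nn as nn

# Next-token prediction with an LSTM: the final hidden state summarizes
# the input sequence and is mapped to logits over the vocabulary.
class AutoComplete(nn.Module):
    def __init__(self, vocab=26, embed=8, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab, embed)
        self.lstm = nn.LSTM(embed, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, tokens):       # tokens: (batch, seq_len) of token ids
        x = self.embed(tokens)       # -> (batch, seq_len, embed)
        _, (h, _) = self.lstm(x)     # h: (num_layers, batch, hidden)
        return self.out(h[-1])       # logits over the next token

logits = AutoComplete()(torch.randint(0, 26, (3, 4)))
print(logits.shape)  # torch.Size([3, 26])
```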
7. Application Level
- Vision: cat or dog image classification
- Natural Language Processing: positive or negative classification of Naver movie reviews
Reference
- Andrew Ng - Machine Learning Lecture
- Korean Andrew Ng notebook: WikiBook
Author
License