DeepLearning Basic Study

This is the repository for the Deep Learning Study run by D.COM, the Computer Engineering club at Kyung Hee University.

This study is recommended for those who want to review machine learning concepts and for those who have just learned Python.

  • The course material is written to be accessible even to people who have only just started Python.
  • Tae Hwan Jung (@graykode) leads this study using PyTorch as the deep learning framework, but implementations in TensorFlow, PyTorch, and Keras are provided for beginners.
  • In the first study we cover basic mathematical theory and basic deep learning models such as DNN, CNN, RNN, and LSTM. All of the code is implemented in fewer than 30 lines.
  • We use Google Colaboratory GPUs for compute, so you can run everything easily from the Colab links. (Thanks, Google!)
  • The lecture materials linked on each page are in Korean for now; only the Contents are written in English.

Contribution Guide

If you find an English link, or a helpful link in any language, please contribute it to the README in Markdown, like this:

Linear Regression([Eng](your contribution link), Kor)

Curriculum

Please see the Contents below.

  • Week 1
    • Basic Probability Review
    • Supervised Learning vs. Unsupervised Learning
    • Linear Regression and Logistic Regression with manual gradient descent, implemented in pure Python (see the sketch after this list)
  • Week 2
    • How to use Google Colaboratory
    • Linear Regression and Logistic Regression review; converting the manual implementation to automatic differentiation with PyTorch
  • Week 3
    • Classification with a DNN (Deep Neural Network) in PyTorch
    • Applying regularization (dropout) to a DNN
    • Optimizers in PyTorch: mini-batch SGD, Adagrad, RMSProp, AdaDelta, Adam
  • Week 4
    • Basic Convolutional Neural Network
    • Loading datasets and using DataLoader with torchvision
    • Applying machine learning diagnostics (train set, cross-validation set, test set) to a DNN
    • Implementing MNIST classification with a CNN
  • Week 5
    • Basic RNN (Recurrent Neural Network) and LSTM in PyTorch
    • Teacher Forcing vs. No Teacher Forcing
    • Practice: predicting the next word with an RNN or LSTM
  • Week 6 - Hackathon
    • Topic 1: Cat vs. dog image classification, Dataset
    • Topic 2: Positive/negative classification of Korean Naver movie reviews, Dataset
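
The Week 1 exercise fits in a few lines of pure Python. Below is a minimal sketch of linear regression fit by manual gradient descent; the toy dataset, variable names, and hyperparameters are illustrative and not the study's actual code.

```python
# Minimal sketch of Week 1: linear regression fit by manual gradient descent
# in pure Python. Data and variable names are illustrative, not the repo's code.

# Toy dataset roughly following y = 2x + 1.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.1, 4.9, 7.2, 8.8]

w, b = 0.0, 0.0          # parameters to learn
lr = 0.01                # learning rate
n = len(xs)

for epoch in range(1000):
    # Mean squared error: L = (1/N) * sum((w*x + b - y)^2)
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * dw         # gradient descent update
    b -= lr * db

print(w, b)              # should approach w ~ 2, b ~ 1
```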

Contents

0. Review of Basic Mathematical Theory with Pure Python

  • Supervised Learning vs. Unsupervised Learning: in this study, we deal only with supervised models.

  • Basic Probability Review

    • Bayes' Theorem(Eng, Kor), Bayesian inference in Generative Model(Eng, Kor)
    • Generative Model vs. Discriminative Model(Eng, Kor)
    • Maximum Likelihood vs. Maximum A Posteriori(Eng, Kor)
    • Maximizing Likelihood is Minimizing Cross-Entropy(Eng, Kor) (see the numeric check after this section)
  • Linear Regression(Eng, Kor)

  • Logistic Regression

  • Optimizing

  • Regularization

    • What is Overfitting?(Eng, Kor)
    • Regularization : weight decay
      • weight decay : Linear Regression(Eng, Kor)
      • weight decay : Logistic Regression(Eng, Kor)
    • Regularization : dropout(Eng, Kor)
  • Machine Learning Diagnostic

    • Train Set, Cross Validation Set, Test Set(Eng, Kor)
    • Bias vs. Variance(Eng, Kor)
    • Learning Curves(Eng, Kor)
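
As a quick sanity check for the "maximizing likelihood is minimizing cross-entropy" item above, the following pure-Python sketch computes both quantities for a handful of binary labels and predicted probabilities; the numbers are made up for illustration and show only that the two expressions coincide.

```python
# Tiny numerical check that the negative log-likelihood of binary labels under
# a Bernoulli model equals the binary cross-entropy. Values are illustrative.
import math

y_true = [1, 0, 1, 1]          # observed labels
y_prob = [0.9, 0.2, 0.7, 0.6]  # model's predicted P(y = 1)

# Negative log-likelihood of the data under a Bernoulli model
nll = -sum(math.log(p) if y == 1 else math.log(1 - p)
           for y, p in zip(y_true, y_prob))

# Binary cross-entropy, summed over examples
bce = -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
           for y, p in zip(y_true, y_prob))

print(nll, bce)   # identical: minimizing cross-entropy maximizes likelihood
```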

1. Deep Learning Framework Basics
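
A minimal sketch of the framework-basics step from the curriculum: the same toy linear regression as above, but with PyTorch autograd computing the gradients instead of the hand-derived formulas. Data and hyperparameters are illustrative.

```python
# Same linear regression as the pure-Python version, but letting PyTorch
# autograd compute the gradients. Illustrative only.
import torch

xs = torch.tensor([[1.0], [2.0], [3.0], [4.0]])
ys = torch.tensor([[3.1], [4.9], [7.2], [8.8]])

w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
optimizer = torch.optim.SGD([w, b], lr=0.01)

for epoch in range(1000):
    optimizer.zero_grad()
    pred = xs * w + b
    loss = torch.mean((pred - ys) ** 2)   # MSE, same loss as the manual version
    loss.backward()                       # autograd replaces the hand-derived gradients
    optimizer.step()

print(w.item(), b.item())                 # should approach w ~ 2, b ~ 1
```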

2. DNN (Deep Neural Network)

  • Mathematical Backpropagation in a Deep Neural Network(Eng, Kor1, Kor2)
  • Basic Classification using a Deep Neural Network
  • Dropout in a Deep Neural Network: 2.DNN-Dropout.py
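
As a rough illustration of the dropout item above (the actual 2.DNN-Dropout.py in this repository may differ), here is a minimal PyTorch sketch of a DNN classifier with a dropout layer, showing that dropout is active in train() mode and disabled in eval() mode.

```python
# Minimal sketch of a DNN with dropout; layer sizes are illustrative.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),   # input layer for flattened 28x28 images
    nn.ReLU(),
    nn.Dropout(p=0.5),     # randomly zero 50% of activations during training
    nn.Linear(256, 10),    # 10 output classes
)

x = torch.randn(32, 784)   # a fake mini-batch

model.train()              # dropout is active in training mode
train_logits = model(x)

model.eval()               # dropout is disabled at evaluation time
eval_logits = model(x)

print(train_logits.shape, eval_logits.shape)   # torch.Size([32, 10]) each
```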

3. DataLoader, Basic Dataset, and Image Handling
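
A minimal sketch of this section's idea: loading MNIST through torchvision and iterating over it with a DataLoader. The root path, batch size, and normalization constants are illustrative choices, not necessarily what the study's code uses.

```python
# Load MNIST with torchvision and wrap it in a DataLoader.
import torch
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.ToTensor(),                      # PIL image -> float tensor in [0, 1]
    transforms.Normalize((0.1307,), (0.3081,)), # commonly used MNIST mean/std
])

train_set = datasets.MNIST(root="./data", train=True,
                           download=True, transform=transform)

train_loader = torch.utils.data.DataLoader(train_set, batch_size=64,
                                           shuffle=True)

images, labels = next(iter(train_loader))
print(images.shape, labels.shape)   # torch.Size([64, 1, 28, 28]) torch.Size([64])
```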

4. CNN (Convolutional Neural Network)

  • awesome lecture
  • Structure of CNN
    • 4.CNN-Introduce.py
    • Convolutional Layer
      • Role of the filter (kernel) vs. the receptive field
      • Role of Padding
      • Weight sharing in Convolutional Layer
    • Role of channels and the reason for using multiple channels
    • Weight sharing in CNN
    • Pooling Layer
      • Max Pooling
      • Average Pooling
  • Feedforward in a Convolutional Neural Network
  • Mathematical Backpropagation in a Convolutional Neural Network
  • Practice: MNIST Classification
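
For the MNIST practice above, here is a minimal sketch of a small CNN and a single training step on a dummy batch; the layer sizes and optimizer settings are illustrative and may differ from the study's actual model.

```python
# A small CNN for 28x28 grayscale images plus one training step on fake data.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 28x28 -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # 14x14 -> 14x14
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, 10)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = SmallCNN()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

images = torch.randn(8, 1, 28, 28)         # dummy MNIST-shaped batch
labels = torch.randint(0, 10, (8,))

optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(loss.item())
```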

5. RNN (Recurrent Neural Network)
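
A minimal sketch of what a vanilla RNN forward pass looks like in PyTorch, with illustrative tensor sizes.

```python
# Run a batch of sequences through nn.RNN and inspect the output shapes.
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=10, hidden_size=20, batch_first=True)

x = torch.randn(4, 7, 10)        # (batch, sequence length, input features)
h0 = torch.zeros(1, 4, 20)       # (num layers, batch, hidden size)

output, hn = rnn(x, h0)
print(output.shape, hn.shape)    # torch.Size([4, 7, 20]) torch.Size([1, 4, 20])
```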

6. LSTM (Long Short-Term Memory)

  • Structure of LSTM
    • 6.LSTM-Introduce.py
    • Hidden State, Cell State
    • Differences between RNN and LSTM
    • Output Layer
    • Weight sharing in RNN
  • FeedForward in LSTM(Eng, Kor)
  • Mathematical Back Propagation in LSTM(Eng, Kor)
  • Bi-directional LSTM(BiLSTM)(Eng, Kor)
  • Practice: auto-complete with an LSTM (LSTM-AutoComplete)
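
A minimal sketch of the auto-complete practice above: a character-level LSTM that predicts the last character of short words from the first three. The word list, vocabulary handling, and hyperparameters are illustrative and are not the repository's actual LSTM-AutoComplete code.

```python
# Character-level LSTM: read the first three characters of a word, predict the fourth.
import torch
import torch.nn as nn

words = ["love", "live", "like", "line"]
chars = sorted(set("".join(words)))
idx = {c: i for i, c in enumerate(chars)}

def encode(word):
    # one-hot encode the first three characters; the last character is the target
    x = torch.zeros(3, len(chars))
    for t, c in enumerate(word[:3]):
        x[t, idx[c]] = 1.0
    return x, idx[word[3]]

inputs = torch.stack([encode(w)[0] for w in words])       # (batch, 3, vocab)
targets = torch.tensor([encode(w)[1] for w in words])

lstm = nn.LSTM(input_size=len(chars), hidden_size=16, batch_first=True)
fc = nn.Linear(16, len(chars))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(list(lstm.parameters()) + list(fc.parameters()), lr=0.01)

for epoch in range(200):
    optimizer.zero_grad()
    out, _ = lstm(inputs)            # out: (batch, 3, hidden)
    logits = fc(out[:, -1])          # last time step predicts the next character
    loss = criterion(logits, targets)
    loss.backward()
    optimizer.step()

pred = logits.argmax(dim=1)
print([chars[i] for i in pred])      # should converge to the true last characters
```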

7. Application Level

  • Vision: Cat or Dog Image Classification.
  • Natural Language Processing: Positive or Negative Classification with Naver Movie Reviews.

Reference

  • Andrew Ng - Machine Learning Lecture
  • Korean Andrew Ng Notebook: WikiBook

Author

License