Machine Learning
All the course materials are available online: http://speech.ee.ntu.edu.tw/~tlkagk/courses_ML20.html
Machine learning is essentially about automatically finding a function: build a model f(input) = output
- Speech Recognition
- Image Recognition
- Playing Go
- Dialogue System
Discriminative
- Multi-class Classification
- Binary Classification
- Generation (generative models)
- learn to produce something beyond a class label, i.e. generative modeling
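To make the "find a function" view concrete, here is a minimal sketch of what the tasks above ask for, written as Python type signatures. The function names and input types are illustrative placeholders, not anything from the course; the actual function would be learned from data.

```python
from typing import List

# Each task is "find a function f(input) = output".
# These stubs are hypothetical placeholders for learned models.

def speech_recognition(audio_waveform: List[float]) -> str:
    """f(audio) = transcript text."""
    raise NotImplementedError  # replaced by a learned model

def image_recognition(image_pixels: List[List[float]]) -> str:
    """f(image) = class label, e.g. 'cat'."""
    raise NotImplementedError

def playing_go(board: List[List[int]]) -> str:
    """f(board position) = next move."""
    raise NotImplementedError

def dialogue_system(user_utterance: str) -> str:
    """f(what the user said) = system response."""
    raise NotImplementedError
```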
Categories
Supervised Learning
- Loss function
- Regression
- Classification
- CNN
- GAN
- Unsupervised Learning (auto-encoder); see the auto-encoder sketch after this list
- Anomaly Detection
- Transfer Learning
- Meta Learning
- Life-long learning
- Reinforcement Learning
- Explainable AI
- Adversarial Attack
- RNN
- Seq2seq
- CNN
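As a concrete example of the unsupervised-learning entry above, here is a minimal PyTorch auto-encoder sketch. The 784/32 dimensions and the dummy batch are illustrative assumptions, not code from the course.

```python
import torch
import torch.nn as nn

# Minimal auto-encoder: learn to reconstruct the input without any labels.
# The 784 -> 32 -> 784 sizes are illustrative assumptions.
class AutoEncoder(nn.Module):
    def __init__(self, input_dim: int = 784, code_dim: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, code_dim), nn.ReLU())
        self.decoder = nn.Linear(code_dim, input_dim)

    def forward(self, x):
        code = self.encoder(x)      # compress the input to a low-dimensional code
        return self.decoder(code)   # reconstruct the input from the code

model = AutoEncoder()
x = torch.rand(16, 784)                         # dummy batch of 16 flattened inputs
loss = nn.functional.mse_loss(model(x), x)      # reconstruction error, no labels needed
loss.backward()
```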
Reinforcement Learning
- AlphaGo
- first move -> many moves ... -> win! (Reward)
How to find the function
- Define the function space (hypothesis set)
- Linear
- Network Architecture
- How to search for the best function
- Gradient Descent
- implement it yourself for a linear model (see the sketch after this list)
- PyTorch (use the library implementations for RNN, CNN)
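As an "implement it yourself" illustration, here is a minimal sketch of gradient descent for a one-variable linear model y = w·x + b. The toy data, learning rate, and step count are made-up assumptions for the example.

```python
# Gradient descent for a linear model y = w * x + b, implemented by hand.
# Toy data, learning rate, and iteration count are illustrative assumptions.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]        # generated from y = 2x + 1

w, b = 0.0, 0.0
lr = 0.01
for step in range(1000):
    # Gradients of the squared-error loss L = sum((w*x + b - y)^2)
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys))
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys))
    # Update parameters in the negative-gradient direction
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should approach w ≈ 2, b ≈ 1
```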
Linear Regression
- Regularization
- the bias term b is not included in the regularization term, because the intercept does not affect how smooth the function is
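Written out, the L2-regularized linear-regression loss takes the standard form below, where ŷⁿ denotes the target value of the n-th training example and λ is the regularization weight; note that b appears only in the prediction term, not in the regularization term.

```latex
L = \sum_{n} \left( \hat{y}^{n} - \left( b + \sum_{i} w_i x_i^{n} \right) \right)^{2} + \lambda \sum_{i} w_i^{2}
```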
Bias vs. variance: a simple model has large bias and small variance (the hypothesis space is small, so variance is low, but bias is high, leading to underfitting); a complex model has small bias and large variance (the hypothesis space is large enough to contain the true model, so bias is low, but variance is high, leading to overfitting).
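This trade-off is captured by the standard bias-variance decomposition of the expected squared error of an estimator \hat{f} of the true function f, where σ² is the irreducible noise (a standard identity, not derived in these notes):

```latex
\mathbb{E}\left[\left(y - \hat{f}(x)\right)^{2}\right]
  = \underbrace{\left(f(x) - \mathbb{E}[\hat{f}(x)]\right)^{2}}_{\text{bias}^{2}}
  + \underbrace{\mathbb{E}\left[\left(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\right)^{2}\right]}_{\text{variance}}
  + \sigma^{2}
```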
Recipe of Deep Learning
- Step 1: define a set of functions
- Step 2: goodness of function
- Step 3: pick the best function
- Neural network
- Good results on training data?
- if no, go back to step 3:
- new activation function (to fix the vanishing gradient problem): sigmoid causes this problem, since a large input change produces only a small output change (see the PyTorch sketch after this outline)
- ReLU
- fast to compute
- biological reason
- infinite sigmoid with different biases
- mitigates the vanishing gradient problem
- Maxout
- if yes
- Good Results on Test Data?
- if no
- overfitting
- early stopping
- regularization
- dropout (see the sketch after this outline)
- if yes
- production
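To tie the recipe together, here is a minimal PyTorch sketch; the layer sizes, dropout rate, and hyper-parameters are illustrative assumptions, not code from the course. The model defines the set of functions (step 1), the loss measures goodness of function (step 2), and gradient descent picks the best function (step 3); ReLU stands in for the "new activation function" fix for poor training results, while dropout and weight decay (L2 regularization) address overfitting on test data.

```python
import torch
import torch.nn as nn

# Step 1: define a set of functions (the network architecture).
# ReLU instead of sigmoid mitigates vanishing gradients; Dropout fights overfitting.
# All sizes and hyper-parameters below are illustrative assumptions.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(256, 10),
)

# Step 2: goodness of function -- a loss that measures how bad each function is.
criterion = nn.CrossEntropyLoss()

# Step 3: pick the best function -- gradient descent, here with weight decay as
# L2 regularization; early stopping would monitor a validation loss and stop
# training when it no longer improves.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)

x = torch.rand(32, 784)                  # dummy batch of 32 inputs
y = torch.randint(0, 10, (32,))          # dummy class labels

model.train()                            # enables dropout during training
loss = criterion(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()

model.eval()                             # disables dropout at test time
```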
Document Info
- Author: Huang Henry
- Link: https://huanghenry.github.io/2020/06/14/ml_01/
- License: free to repost, non-commercial, no derivatives, attribution required (Creative Commons 3.0)