- Classification
  - SVM
    - Multi-class classification via binary classifiers (see the OAO/OAA sketch after this list)
      - one-against-one (OAO)
      - one-against-all (OAA)
  - Linear discriminant analysis
  - Decision trees
    - Types
      - Bagging
      - Regression tree
      - Random forest
      - Rotation forest
      - Classification tree
  - Gaussian classifier
  - Logistic regression
  - KNN
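
A minimal sketch of the two multi-class strategies listed above, one-against-one and one-against-all, assuming scikit-learn is available; the toy blobs and the choice of LinearSVC as the binary learner are only illustrative.

```python
# Hedged sketch: one-against-one (OAO) vs. one-against-all (OAA) multi-class SVMs.
# The data below are invented; LinearSVC is just a convenient binary learner.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier

rng = np.random.RandomState(0)
X = rng.randn(150, 2) + np.repeat(np.array([[0, 0], [3, 3], [0, 3]]), 50, axis=0)
y = np.repeat([0, 1, 2], 50)

# OAO trains K*(K-1)/2 binary SVMs, one per pair of classes, and votes.
oao = OneVsOneClassifier(LinearSVC()).fit(X, y)
# OAA trains K binary SVMs, each separating one class from all the others.
oaa = OneVsRestClassifier(LinearSVC()).fit(X, y)

print(oao.predict(X[:5]), oaa.predict(X[:5]))
```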
- Rule-Based
  - Expert System
- Background
  - Feature
    - SIFT
    - HOG
- Math
  - Probability density value
    - p
    - t
  - Population, samples, and sampling
    - Confidence interval
    - Variance
  - Probability
    - Bayesian decision rule (see the sketch after this list)
      - Prior probability
      - Likelihood function
      - Expectation-maximization (EM) algorithm
      - Gaussian Mixture Model
      - Probability density function
      - Cumulative proportion
      - Euclidean distance
      - Lagrange
      - Partial differentiation
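
A minimal worked example of the Bayesian decision rule listed above: compare prior × likelihood under each class and pick the larger posterior. The priors and Gaussian parameters are invented for illustration.

```python
# Hedged sketch of the Bayesian decision rule with Gaussian likelihoods.
import numpy as np

def gaussian_pdf(x, mean, var):
    # Univariate Gaussian probability density function.
    return np.exp(-(x - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

priors = {"A": 0.7, "B": 0.3}                 # prior probabilities P(class)
params = {"A": (0.0, 1.0), "B": (2.0, 1.5)}   # (mean, variance) per class

x = 1.2
# posterior ∝ prior × likelihood; decide for the class with the larger product.
scores = {c: priors[c] * gaussian_pdf(x, *params[c]) for c in priors}
print(scores, "->", max(scores, key=scores.get))
```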
  - Correlation coefficients
    - Pearson's correlation coefficient
    - Covariance
- Regression
  - Linear regression (see the least-squares sketch after this list)
  - Logistic regression
  - Multiple regression
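
A minimal least-squares sketch covering the linear/multiple regression items above; the data and true coefficients are invented.

```python
# Hedged sketch: ordinary least-squares linear regression via np.linalg.lstsq.
import numpy as np

rng = np.random.RandomState(0)
X = rng.rand(100, 2)                       # two predictors (multiple regression)
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.5 + 0.1 * rng.randn(100)

Xb = np.hstack([np.ones((100, 1)), X])     # prepend a column of ones for the intercept
coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)
print("intercept and weights:", coef)      # should be close to [0.5, 3.0, -2.0]
```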
- Clustering
  - k-means (see the sketch after this list)
  - Fuzzy c-means
  - Gaussian Mixture Model (GMM)
  - Expectation-Maximization (EM) algorithm
  - GMM-EM
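
A minimal k-means sketch (Lloyd's algorithm) for the clustering item above; the 2-D data and k are invented.

```python
# Hedged sketch of k-means: assign points to the nearest centroid,
# recompute centroids, and repeat until they stop moving.
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    rng = np.random.RandomState(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # Euclidean distance from every point to every centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 4])
labels, centroids = kmeans(X, k=2)
print(centroids)
```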
- Dimensionality reduction
  - Principal Component Analysis (PCA) (see the sketch after this list)
  - Discriminant Analysis Feature Extraction (DAFE, based on linear discriminant analysis)
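
A minimal PCA sketch via eigendecomposition of the covariance matrix, including the cumulative proportion of explained variance mentioned earlier; the data are invented.

```python
# Hedged sketch of PCA: center, eigendecompose the covariance, project.
import numpy as np

rng = np.random.RandomState(0)
X = rng.randn(200, 3) @ np.array([[2.0, 0.3, 0.1], [0.0, 1.0, 0.2], [0.0, 0.0, 0.1]])

Xc = X - X.mean(axis=0)                        # center the data
cov = np.cov(Xc, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)           # eigenvalues in ascending order
order = eigval.argsort()[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

cumulative = np.cumsum(eigval) / eigval.sum()  # cumulative proportion of variance
n_comp = int(np.searchsorted(cumulative, 0.95) + 1)
Z = Xc @ eigvec[:, :n_comp]                    # reduced representation
print(cumulative, Z.shape)
```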
- Matrix factorization
  - Alternating Least Squares (ALS)
  - ALS with weighted-λ-regularization (ALS-WR)
- Scatter measures (see the sketch after this list)
  - Within-class scatter
  - Between-class scatter
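
A minimal sketch of the within-class and between-class scatter matrices listed above, as they are typically used in LDA/DAFE; the two classes are invented.

```python
# Hedged sketch: within-class and between-class scatter matrices.
import numpy as np

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(40, 2), rng.randn(40, 2) + [3, 1]])
y = np.repeat([0, 1], 40)

mean_all = X.mean(axis=0)
S_w = np.zeros((2, 2))   # within-class scatter
S_b = np.zeros((2, 2))   # between-class scatter
for c in np.unique(y):
    Xc = X[y == c]
    mean_c = Xc.mean(axis=0)
    S_w += (Xc - mean_c).T @ (Xc - mean_c)
    diff = (mean_c - mean_all).reshape(-1, 1)
    S_b += len(Xc) * diff @ diff.T

print("within-class scatter:\n", S_w, "\nbetween-class scatter:\n", S_b)
```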
- Model selection / evaluation
  - Cross-validation (CV)
    - Resubstitution
    - Holdout CV
    - Leave-one-out CV
    - K-fold CV (see the sketch after this list)
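
A minimal K-fold cross-validation sketch using plain NumPy index splits; the model-fitting step is left as a placeholder comment.

```python
# Hedged sketch of K-fold CV indexing.
import numpy as np

def k_fold_indices(n_samples, k, seed=0):
    # Shuffle the sample indices and split them into k roughly equal folds.
    idx = np.random.RandomState(seed).permutation(n_samples)
    return np.array_split(idx, k)

X = np.arange(20).reshape(10, 2)
y = np.arange(10)
folds = k_fold_indices(len(X), k=5)

for i, test_idx in enumerate(folds):
    train_idx = np.hstack([f for j, f in enumerate(folds) if j != i])
    X_train, X_test = X[train_idx], X[test_idx]
    y_train, y_test = y[train_idx], y[test_idx]
    print(f"fold {i}: train={len(train_idx)} test={len(test_idx)}")
    # fit the model on (X_train, y_train) and evaluate on (X_test, y_test) here
```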
  - Validation metrics
    - Classification metrics
      - Binary confusion matrix and the metrics derived from it (see the sketch after this list)
      - ROC curve
      - AUC
      - Multi-class: multi-class confusion matrix and the corresponding metrics
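
A minimal sketch of the binary confusion matrix and the metrics derived from it; the labels are invented.

```python
# Hedged sketch: TP/TN/FP/FN counts and the usual derived metrics.
import numpy as np

y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0, 1, 0])

tp = np.sum((y_true == 1) & (y_pred == 1))
tn = np.sum((y_true == 0) & (y_pred == 0))
fp = np.sum((y_true == 0) & (y_pred == 1))
fn = np.sum((y_true == 1) & (y_pred == 0))

accuracy  = (tp + tn) / len(y_true)
precision = tp / (tp + fp)
recall    = tp / (tp + fn)            # a.k.a. sensitivity / true positive rate
f1        = 2 * precision * recall / (precision + recall)
print(f"acc={accuracy:.2f} precision={precision:.2f} recall={recall:.2f} f1={f1:.2f}")
```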
    - Regression metrics (see the sketch after this list)
      - Mean Squared Error (MSE)
      - Mean Absolute Error (MAE)
      - Mean Squared Logarithmic Error (MSLE)
      - AP / mAP
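
Minimal definitions of the regression metrics listed above, with invented values.

```python
# Hedged sketch of MSE, MAE and MSLE.
import numpy as np

def mse(y_true, y_pred):   # Mean Squared Error
    return np.mean((y_true - y_pred) ** 2)

def mae(y_true, y_pred):   # Mean Absolute Error
    return np.mean(np.abs(y_true - y_pred))

def msle(y_true, y_pred):  # Mean Squared Logarithmic Error (non-negative targets)
    return np.mean((np.log1p(y_true) - np.log1p(y_pred)) ** 2)

y_true = np.array([3.0, 5.0, 2.5, 7.0])
y_pred = np.array([2.5, 5.0, 4.0, 8.0])
print(mse(y_true, y_pred), mae(y_true, y_pred), msle(y_true, y_pred))
```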
- Transfer learning
  - Instance-based transfer learning
  - Feature-representation transfer learning
  - Parameter-transfer learning
  - Relational-knowledge transfer learning
- Ensemble learning
  - Bagging (see the sketch after this list)
    - Random Forest
  - Boosting
  - Stacking
  - AdaBoost
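
A minimal bagging sketch, bootstrap samples plus a majority vote, assuming scikit-learn's DecisionTreeClassifier as the base learner; the data are invented.

```python
# Hedged sketch of bagging: train copies of a base learner on bootstrap samples
# and combine their predictions by majority vote.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(50, 2), rng.randn(50, 2) + 2])
y = np.repeat([0, 1], 50)

models = []
for _ in range(10):
    idx = rng.randint(0, len(X), len(X))      # bootstrap sample (with replacement)
    models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

votes = np.stack([m.predict(X) for m in models])     # (n_models, n_samples)
majority = (votes.mean(axis=0) >= 0.5).astype(int)   # majority vote
print("training accuracy of the ensemble:", (majority == y).mean())
```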
- NN
  - Perceptron
  - Multilayer perceptron (MLP)
- DL
  - Restricted Boltzmann Machine (RBM)
  - Deep Belief Network (DBN)
  - Generative Adversarial Network (GAN)
  - Deep Neural Network (DNN)
  - Recurrent Neural Network (RNN)
- Convolutional Neural Network (CNN)
  - Layers (see the output-shape sketch after this list)
    - Convolution
      - pad
      - kernel_size
      - num_output
      - stride
    - Max pooling
    - Flatten
    - Fully connected
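
A minimal sketch of how pad, kernel_size, stride and num_output determine the convolution and pooling output shapes; the 28×28 input and num_output=32 are invented.

```python
# Hedged sketch of convolution/pooling output-size arithmetic.
def conv_output_size(in_size, kernel_size, stride, pad):
    # Standard formula: floor((W + 2P - K) / S) + 1
    return (in_size + 2 * pad - kernel_size) // stride + 1

h = w = 28                       # e.g. a 28x28 input image
h = w = conv_output_size(h, kernel_size=5, stride=1, pad=2)   # conv: 28 -> 28
print("after conv:", h, "x", w, "x num_output channels")
h = w = conv_output_size(h, kernel_size=2, stride=2, pad=0)   # max pooling: 28 -> 14
print("after pool:", h, "x", w)
flat = h * w * 32                # Flatten before the fully connected layer
print("flattened length (assuming num_output=32):", flat)
```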
  - Forward propagation
    - Math: differentiation
  - Training
    - Learning rate
    - Stochastic Gradient Descent (SGD) (see the momentum sketch after this list)
    - Momentum
    - Adam
    - RMSProp
    - AdaGrad
    - Batch gradient descent
    - Cost function
    - + Regularization
      - L0
      - L1
      - L2
    - Optimization
    - Dropout
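
A minimal sketch of mini-batch SGD with momentum on a simple MSE cost; the learning rate, momentum value and data are invented.

```python
# Hedged sketch: mini-batch stochastic gradient descent with momentum.
import numpy as np

rng = np.random.RandomState(0)
X = rng.randn(256, 3)
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.randn(256)

w = np.zeros(3)
velocity = np.zeros(3)
lr, momentum, batch_size = 0.1, 0.9, 32

for epoch in range(50):
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        b = idx[start:start + batch_size]
        grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)   # gradient of the MSE cost
        velocity = momentum * velocity - lr * grad        # momentum update
        w = w + velocity
print("learned weights:", w)   # should approach [1.0, -2.0, 0.5]
```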
  - Cross-validation (CV)
    - Resubstitution
    - Holdout CV
    - Leave-one-out CV
    - K-fold CV
  - Design
    - 1×1 convolution
    - Autoencoder
- Reinforcement learning (RL)
  - Q-learning (see the sketch after this list)
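
A minimal tabular Q-learning sketch on an invented 5-state chain environment; the learning rate, discount factor and exploration rate are invented.

```python
# Hedged sketch of tabular Q-learning: states 0..4, actions {0: left, 1: right},
# reward 1 only for reaching state 4 (the terminal state).
import numpy as np

n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.5, 0.9, 0.1
rng = np.random.RandomState(0)

def step(s, a):
    s_next = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
    reward = 1.0 if s_next == n_states - 1 else 0.0
    return s_next, reward, s_next == n_states - 1

for episode in range(300):
    s = rng.randint(n_states - 1)          # random non-terminal start state
    for _ in range(50):                    # cap the episode length
        # epsilon-greedy action selection
        a = rng.randint(n_actions) if rng.rand() < epsilon else int(Q[s].argmax())
        s_next, r, done = step(s, a)
        # Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next
        if done:
            break
print(np.round(Q, 3))   # the "move right" column should dominate
```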