## Appendix. Mathematical review
- Linear Algebra
- Optimization Theory
- Probability Theory
- Information Theory
## Artificial Neurons and Neural Networks
- Artificial Neurons
- Artificial Neuron
- Components
- Compact Representation
- Signal-Flow graph representation
- Activation functions(see the code sketch after this list)
- Threshold function(McCulloch and Pitts)
- Sigmoid function(Logistic function)
- Rectified Linear Unit(ReLU)
- Softplus function
- Hyperbolic tangent function
- Leaky ReLU
- Maxout function
- ELU function
- Properties of activation functions
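The activation functions listed above are one-liners in NumPy. A minimal sketch, assuming vector inputs and common default slope parameters for the leaky ReLU and ELU (these values are illustrative, not fixed by the notes):

```python
import numpy as np

def threshold(v):                # McCulloch-Pitts: 1 if v >= 0, else 0
    return (v >= 0).astype(float)

def sigmoid(v):                  # logistic function, squashes to (0, 1)
    return 1.0 / (1.0 + np.exp(-v))

def relu(v):                     # rectified linear unit: max(0, v)
    return np.maximum(0.0, v)

def softplus(v):                 # smooth approximation of ReLU: log(1 + e^v)
    return np.log1p(np.exp(v))

def tanh(v):                     # hyperbolic tangent, squashes to (-1, 1)
    return np.tanh(v)

def leaky_relu(v, alpha=0.01):   # small slope alpha for v < 0
    return np.where(v >= 0, v, alpha * v)

def elu(v, alpha=1.0):           # exponential linear unit
    return np.where(v >= 0, v, alpha * (np.exp(v) - 1.0))

def maxout(vs):                  # elementwise max over k pre-activations
    return np.max(np.stack(vs), axis=0)

v = np.linspace(-3, 3, 7)
print(relu(v), sigmoid(v), maxout([v, -v]))   # maxout([v, -v]) == |v|
```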
- Stochastic artificial neuron
- Probability of state transition
- Logistic Probability Model
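For the stochastic neuron, the logistic probability model above is commonly written as P(state = 1) = sigma(v/T), with induced local field v and pseudo-temperature T; a minimal sketch under that assumption:

```python
import numpy as np

def fire_probability(v, T=1.0):
    """Logistic probability model: P(state = 1) = sigma(v / T)."""
    return 1.0 / (1.0 + np.exp(-v / T))

def sample_state(v, T=1.0, rng=None):
    # Fire (state 1) with probability sigma(v/T); higher T means more randomness.
    rng = rng or np.random.default_rng()
    return int(rng.random() < fire_probability(v, T))
```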
- Neural Network Architectures
- Single-Layer Network
- Multilayer Feedforward Neural Network(FNN)
- Convolutional Neural Network(CNN)
- Recurrent Neural Network(RNN)
- Combined with a Pre-Processor
## Rosenblatt's Perceptron
- Rosenblatt's Perceptron Model
- Overview
- Activation function : Sign function(threshold function)
- Network Architecture
- Assumption
- Training Problem Definition
- Training Algorithm for Perceptron(see the code sketch below)
- Geometric Interpretation
- Perceptron Convergence Theorem
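A minimal sketch of the perceptron training algorithm, assuming labels in {-1, +1}, the sign activation, a bias absorbed into the weight vector, and a fixed learning rate (the toy data and rate are illustrative):

```python
import numpy as np

def train_perceptron(X, y, eta=1.0, max_epochs=100):
    """Rosenblatt's rule: on a mistake, w <- w + eta * y_i * x_i."""
    X = np.hstack([X, np.ones((X.shape[0], 1))])  # fold bias into w
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) <= 0:   # misclassified (or on the boundary)
                w += eta * yi * xi        # move the hyperplane toward xi
                mistakes += 1
        if mistakes == 0:                 # converged: training data separated
            break
    return w

# Tiny linearly separable example (AND-like labels in {-1, +1}).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., -1., -1., 1.])
w = train_perceptron(X, y)
print(np.sign(np.hstack([X, np.ones((4, 1))]) @ w))  # matches y
```

By the convergence theorem, this loop terminates in finitely many updates whenever the data are linearly separable, which is the assumption the section states.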
## Regression
- Regressive and approximated models
- General regressive model
- Linear Regression
- Linearly approximated model
- Hypothesis
- Linear regression problem
- Learning algorithm : A numerical approach
- Learning algorithm : Least squares(One-shot learning approach; see the sketch below)
- Recursive least squares
- Regularized least squares
- Comparisons
- Linear regression with basis functions
- Proper step size
- Good training samples
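A minimal sketch contrasting the two learning approaches named above: the one-shot least-squares solution and the numerical gradient-descent loop. Step size and iteration count are illustrative choices; too large a step diverges, which is the "proper step size" point:

```python
import numpy as np

def least_squares(Phi, y):
    """One-shot LS: w = (Phi^T Phi)^{-1} Phi^T y, via the pseudo-inverse."""
    return np.linalg.pinv(Phi) @ y

def gradient_descent(Phi, y, eta=0.01, n_iters=5000):
    """Numerical approach: iterate w <- w - eta * Phi^T (Phi w - y)."""
    w = np.zeros(Phi.shape[1])
    for _ in range(n_iters):
        w -= eta * Phi.T @ (Phi @ w - y)
    return w

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
Phi = np.hstack([np.ones_like(X), X])          # basis functions: [1, x]
y = 2.0 + 3.0 * X[:, 0] + 0.1 * rng.standard_normal(50)
print(least_squares(Phi, y))                   # close to [2, 3]
print(gradient_descent(Phi, y))                # agrees, given a proper step size
```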
- Bayesian Regression
- Overview
- Maximum A Posteriori(MAP) estimation
- Maximum Likelihood(ML) estimation
- Bayesian linear regression with ML estimation
- Bayesian linear regression with MAP estimation
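For reference, under the usual assumptions (Gaussian likelihood with noise precision β, zero-mean Gaussian prior with precision α), MAP estimation for the linear model coincides with regularized least squares, with λ = α/β:

```math
\hat{\mathbf{w}}_{\mathrm{ML}} = (\Phi^\top \Phi)^{-1}\Phi^\top \mathbf{y},
\qquad
\hat{\mathbf{w}}_{\mathrm{MAP}} = (\Phi^\top \Phi + \lambda \mathbf{I})^{-1}\Phi^\top \mathbf{y},
\qquad \lambda = \alpha/\beta
```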
- Logistic and Softmax regression
- Logistic regression : hypothesis
- Logistic regression : learning based on gradient ascent algorithm(see the sketch below)
- Logistic regression : learning via Iterative Reweighted Least Squares(IRLS) based on Newton-Raphson method
- Logistic regression : Binary classification
- Softmax regression : Overview
- Softmax regression : Hypothesis
- Softmax regression : Derivative of softmax function
- Softmax regression : learning based on gradient ascent algorithm
- Softmax regression : learning via Iterative Reweighted Least Squares(IRLS) based on Newton-Raphson method
- Softmax regression : Multi-Class classification via softmax regression
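A minimal sketch of the gradient-ascent learner for logistic regression, assuming labels in {0, 1}; the IRLS variant replaces the fixed step with a Newton-Raphson step using the Hessian, but the gradient is the same. Learning rate and iteration budget are illustrative:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def fit_logistic(X, y, eta=0.1, n_iters=2000):
    """Ascend the log-likelihood: gradient = X^T (y - sigma(X w))."""
    X = np.hstack([X, np.ones((X.shape[0], 1))])   # bias term
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        w += eta * X.T @ (y - sigmoid(X @ w))
    return w

def predict(X, w):
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    return (sigmoid(X @ w) >= 0.5).astype(int)     # decision threshold 0.5

X = np.array([[0.], [1.], [2.], [3.]])
y = np.array([0, 0, 1, 1])
w = fit_logistic(X, y)
print(predict(X, w))   # [0 0 1 1]
```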
- k-Nearest Neighbors(k-NN) Regression
- k-NN regression
- k-NN classification
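A minimal sketch of k-NN prediction under the usual Euclidean-distance convention; the same neighbor set is averaged for regression and majority-voted for classification (k and the toy data are illustrative):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3, classify=False):
    """Average (regression) or majority-vote (classification) over the
    k nearest training points in Euclidean distance."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(dists)[:k]]
    if classify:
        values, counts = np.unique(nearest, return_counts=True)
        return values[np.argmax(counts)]      # majority vote
    return nearest.mean()                     # local average

X_train = np.array([[0.], [1.], [2.], [3.], [4.]])
y_train = np.array([0., 1., 2., 3., 4.])
print(knn_predict(X_train, y_train, np.array([1.6]), k=3))   # 2.0

labels = np.array([0, 0, 0, 1, 1])
print(knn_predict(X_train, labels, np.array([3.2]), k=3, classify=True))  # 1
```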
## Statistical learning
- Wiener filter(Optimal linear MMSE filter)
- Overview
- Optimal linear filtering problem
- Wiener filter(Limiting form of the LS solution)
- Steepest Gradient Descent Method and Least Mean Square Algorithm
- Gradient descent algorithm
- Two approaches for gradient descent
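A minimal sketch of the LMS update, the stochastic (per-sample) counterpart of the steepest-descent approach above, on an illustrative system-identification task; filter length, step size, and the unknown system are assumptions for the demo:

```python
import numpy as np

def lms(x, d, n_taps=4, mu=0.05):
    """LMS: w <- w + mu * e[n] * x_vec[n], the stochastic-gradient
    approximation to steepest descent on the mean-square error."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        x_vec = x[n - n_taps + 1:n + 1][::-1]   # [x[n], x[n-1], ..., x[n-M+1]]
        e = d[n] - w @ x_vec                    # a-priori estimation error
        w += mu * e * x_vec                     # stochastic-gradient step
    return w

rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
h = np.array([0.8, -0.4, 0.2, 0.1])             # unknown system to identify
d = np.convolve(x, h)[:len(x)]                  # desired signal = h * x
print(lms(x, d))                                # approaches h
```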
- Minimum Mean Square Error(MMSE) Estimator
- Review
## Classification
- Definition of classification problem
- Linear Models for Classification
- Linear discriminant for two classes
- Linear discriminant for multiple classes
- Linear models for classification
- Linear model for classification : Least squares for classification
- Linear model for classification : Fisher's linear discriminant
- Linear model for classification : Perceptron
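A minimal sketch of Fisher's linear discriminant for two classes: project onto w proportional to S_W^{-1}(m2 - m1), with S_W the within-class scatter matrix; thresholding at the midpoint of the projected means is an illustrative convention, not the only choice:

```python
import numpy as np

def fisher_discriminant(X1, X2):
    """Return the Fisher direction w = S_W^{-1} (m2 - m1) and a threshold."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter: sum of centered outer products over both classes.
    S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    w = np.linalg.solve(S_W, m2 - m1)
    threshold = w @ (m1 + m2) / 2.0     # midpoint of the projected means
    return w, threshold

rng = np.random.default_rng(2)
X1 = rng.standard_normal((100, 2))                        # class 1 near origin
X2 = rng.standard_normal((100, 2)) + np.array([3., 3.])   # class 2 shifted
w, t = fisher_discriminant(X1, X2)
print(((X2 @ w) > t).mean())   # fraction of class 2 on the class-2 side
```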
- Probabilistic Approaches for Classification
- Statistics vs Bayesian Classification
- Probabilities in classification
- A simple binary classification
- Receiver Operating Characteristic(ROC)
- Bayesian classification : Minimum Bayes Risk Classifier for two classes
- Minimum Error Probability Classifier for two classes
- Bayesian classification : Minimum Bayes Risk Classifier for multiple classes
- Minimum Error Probability Classifier for multiple classes
- Naive Bayes classifier
- Assumptions of Naive Bayes classifier
- Bayes Gaussian Classifier
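A minimal sketch of a Gaussian naive Bayes classifier: features are assumed conditionally independent given the class, each modeled by a univariate Gaussian, and prediction picks the class with maximum log posterior (the toy data and the small variance floor are illustrative):

```python
import numpy as np

def fit_gaussian_nb(X, y):
    """Per class: prior, feature-wise means and variances (independence assumption)."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (len(Xc) / len(X), Xc.mean(axis=0), Xc.var(axis=0) + 1e-9)
    return params

def predict_gaussian_nb(params, x):
    """argmax_c  log P(c) + sum_j log N(x_j | mu_cj, var_cj)."""
    def log_post(c):
        prior, mu, var = params[c]
        return np.log(prior) - 0.5 * np.sum(np.log(2 * np.pi * var)
                                            + (x - mu) ** 2 / var)
    return max(params, key=log_post)

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
params = fit_gaussian_nb(X, y)
print(predict_gaussian_nb(params, np.array([2.8, 3.1])))   # class 1
```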
- Generative and discriminative approach
- Probabilistic Generative Models for two-class classification
- Probabilistic Generative Models for two-class classification with continuous features
- Probabilistic Generative Models for multi-class classification
- Probabilistic Generative Models for multi-class classification with continuous features
- Probabilistic discriminative models
## Practical Issues in Machine Learning
- Bias-Variance tradeoff
- Bias-Variance decomposition of the MSE
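For reference, the decomposition this section develops: with target y = f(x) + ε, where the noise ε has zero mean and variance σ², and with expectations taken over random training sets, the expected squared error at a point x splits as

```math
\mathbb{E}\left[(y - \hat{f}(x))^2\right]
= \underbrace{\left(\mathbb{E}[\hat{f}(x)] - f(x)\right)^2}_{\text{bias}^2}
+ \underbrace{\mathbb{E}\left[\left(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\right)^2\right]}_{\text{variance}}
+ \sigma^2
```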
- Generalization
- Overview
- Training and test data sets
- Overfitting
- How to avoid overfitting?
- More training data
- Reducing the number of features(e.g., by PCA)
- Regularization
- Dropout
- Early-stopping
- Proper model selection
- Model selection
- Model selection with validation
- Model selection with validation set
- Multifold(K-fold) Cross-Validation
- Bootstrap Model Selection
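A minimal sketch of K-fold cross-validation for model selection: partition the data into K folds, train on K-1 folds, validate on the held-out fold, and average. Here `fit` and `error` are hypothetical stand-ins for any model's training and loss routines, and the polynomial example is illustrative:

```python
import numpy as np

def k_fold_cv(X, y, fit, error, K=5, seed=0):
    """Average held-out error over K folds; lower is better for model selection."""
    idx = np.random.default_rng(seed).permutation(len(X))
    folds = np.array_split(idx, K)
    scores = []
    for k in range(K):
        val = folds[k]
        train = np.concatenate([folds[j] for j in range(K) if j != k])
        model = fit(X[train], y[train])               # train on K-1 folds
        scores.append(error(model, X[val], y[val]))   # validate on held-out fold
    return float(np.mean(scores))

# Example: score a degree-3 polynomial (least-squares fit) by CV error.
fit = lambda X, y: np.polyfit(X[:, 0], y, deg=3)
error = lambda w, X, y: np.mean((np.polyval(w, X[:, 0]) - y) ** 2)
rng = np.random.default_rng(4)
X = rng.uniform(-1, 1, (60, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(60)
print(k_fold_cv(X, y, fit, error))
```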
- Model selection with criteria
- Akaike Information Criterion(AIC)
- Minimum Description Length(MDL) Criterion
- Curse of dimensionality
## Multilayer perceptron
## Support Vector Machine
## Restricted Boltzmann Machines
## Unsupervised learning