A repository of from-scratch implementations of classical ML algorithms and other ML-related topics.
- Linear Regression (Normal Equation, Gradient Descent)
- Logistic Regression (Sigmoid Activation, Cross-Entropy Loss)
- k-Nearest Neighbours (k-NN) (Distance Computation: Euclidean, Manhattan)
- Naïve Bayes Classifier (Gaussian, Multinomial)
- Decision Trees (ID3, Gini Index, Entropy)
- Random Forest (Bootstrap Aggregation, Majority Voting)
- Support Vector Machine (SVM) (Hinge Loss, Gradient Descent)
- k-Means Clustering (Centroid Updates, Convergence Conditions)
- Gaussian Mixture Model (GMM) & Expectation-Maximization (EM)
- Principal Component Analysis (PCA) (Eigenvalue Decomposition, SVD)
- t-SNE (Dimensionality Reduction & Visualisation)
- Gradient Descent Variants
  - Stochastic Gradient Descent (SGD)
  - Mini-Batch Gradient Descent
  - Momentum-based Gradient Descent
  - Adam Optimiser
- Backpropagation (for a Simple Neural Network)
- PageRank Algorithm (Power Iteration Method)
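Gradient descent recurs throughout the list above. As a minimal sketch of the idea (illustrative variable names, not code taken from the notebooks), here is plain batch gradient descent fitting a linear regression on the mean-squared-error loss:

```python
import numpy as np

def linear_regression_gd(X, y, lr=0.05, epochs=5000):
    """Fit weights w and bias b by batch gradient descent on MSE loss."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        err = X @ w + b - y          # residuals of the current fit
        w -= lr * (X.T @ err) / n    # gradient of 0.5 * mean squared error w.r.t. w
        b -= lr * err.mean()         # gradient w.r.t. b
    return w, b

# Tiny demo: data generated from y = 2x + 1
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
w, b = linear_regression_gd(X, y)
```

The SGD and mini-batch variants listed above differ only in how much of `X` is used per update (one sample, or a small random batch, instead of the full dataset).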
Each algorithm is implemented from scratch in Jupyter notebooks with detailed explanations (where possible). Some visualisations are also included to help build intuition. You can run the notebooks directly in your browser using Jupyter Notebook or JupyterLab.
To install Jupyter Notebook, use pip:

    pip install notebook

To start the notebook server, run:

    jupyter notebook

Alternatively, install JupyterLab (optional), which provides an improved interface over Jupyter Notebook:

    pip install jupyterlab

To start JupyterLab, run:

    jupyter lab