MLNotebookArchives

This repository contains a collection of Jupyter notebooks, each demonstrating the implementation and intuition behind a distinct machine learning model, from the basics to the more sophisticated. [In Progress]


Content

  • Data folder: A curated collection of datasets, serving as the foundation and fuel for the models crafted in these notebooks.

  • Kaggle folder:

Notebook Structure

Basics

  1. numpyBasics.ipynb
  2. mnist.ipynb

Supervised Learning

  1. singleFeatureLinearRegression.ipynb

    • Description:
      • A beginner's journey into Linear Regression with a single input feature.
      • An implementation using the scikit-learn library.
      • A second implementation from scratch, without any libraries (a minimal gradient-descent sketch appears after this list).
      • Plots of cost functions for learning rate selection.
      • Bonus: A sketched line of best fit! (With and without Z-Score Normalization)
    • Dataset: height_weight.csv
  2. multipleFeatureLinearRegression.ipynb

    • Description:
      • Stepping up to multi-feature Linear Regression.
      • Implementation using scikit-learn.
      • Another version without libraries.
      • Includes cost function plots to guide the choice of learning rate.
      • New: Added plots comparing the predictions from the scikit-learn version and the from-scratch version, along with plots comparing each method's predictions against the actual target values.
    • Dataset: multipleFeatures.csv - Because one feature is too mainstream!
  3. normalEquationLinearRegression.ipynb

    • Description:
      • Ditching iterations for equations, this notebook presents Linear Regression via the Normal Equation (a closed-form sketch appears after this list).
      • Employs the same dataset as the singleFeatureLinearRegression notebook.
      • There are no plots here, but the computed weights and bias closely match those from the gradient descent approach in singleFeatureLinearRegression.ipynb.
    • Dataset: height_weight.csv
  4. polynomialRegression.ipynb

    • Description: Straight lines are too mainstream! This notebook explores Polynomial Regression using scikit-learn (see the pipeline sketch after this list).
    • Dataset: polyRegression.csv
  5. binaryLogisticRegression.ipynb

    • Description:
      • Flirting with classification, this notebook delves into Binary Logistic Regression using the logistic loss and the sigmoid function (both sketched after this list).
      • Includes a visualization of the sigmoid function and classification boundary, helping to understand how well the model separates classes. Classifying has never been so visually appealing!
    • Dataset: logisticRegression.csv
  6. regularizedLinearRegression.ipynb

    • Description:
      • Regularization to the rescue! This notebook presents Regularized Linear Regression.
      • Both library-based and bare-bones implementations are available.
      • Cost function plots to select the best learning rate and lambda values.
      • New: Added a plot showing the custom model's predictions against the original data, and another comparing the scikit-learn version with the from-scratch version.
    • Dataset: multipleFeatures.csv
  7. regularizedLogisticRegression.ipynb

    • Description:
      • The sequel to regularized linear regression, featuring Regularized Logistic Regression.
      • Implementations both with and without libraries.
      • Cost function plots to guide the choice of learning rate and lambda values.
    • Dataset: logisticRegressionCircular.txt
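
A minimal sketch of the from-scratch gradient-descent approach used in the two linear regression notebooks above; the function names and the z-score helper are illustrative, not the notebooks' actual code:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, iters=1000):
    """Vectorized batch gradient descent for linear regression.
    X: (m, n) feature matrix, y: (m,) targets."""
    m, n = X.shape
    w, b = np.zeros(n), 0.0
    costs = []
    for _ in range(iters):
        err = X @ w + b - y                  # prediction error
        w -= alpha * (X.T @ err) / m         # dJ/dw
        b -= alpha * err.sum() / m           # dJ/db
        costs.append((err @ err) / (2 * m))  # squared-error cost J
    return w, b, costs

def zscore(X):
    """Z-score normalization, which speeds up convergence."""
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    return (X - mu) / sigma, mu, sigma
```

Plotting the returned costs against the iteration count is how the notebooks judge whether the learning rate is too large (cost diverges) or too small (cost decreases slowly).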
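
The Normal Equation notebook replaces that iterative loop with the closed-form least-squares solution. A sketch, with a column of ones prepended so the bias comes out of the same solve:

```python
import numpy as np

def normal_equation(X, y):
    """Closed-form least squares: theta = (X^T X)^(-1) X^T y."""
    Xb = np.c_[np.ones(len(X)), X]           # prepend a bias column
    theta = np.linalg.pinv(Xb.T @ Xb) @ Xb.T @ y
    return theta[0], theta[1:]               # bias, weights
```

np.linalg.pinv is used instead of a plain inverse so the solve still succeeds when X^T X is singular (e.g. with redundant features).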
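
For the polynomial regression notebook, scikit-learn reduces the whole model to a two-step pipeline. The degree and the toy data below are placeholders, not the notebook's actual values:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Toy data standing in for polyRegression.csv (made-up values).
X = np.linspace(-3, 3, 50).reshape(-1, 1)
y = 0.5 * X.ravel() ** 3 - X.ravel() + np.random.randn(50)

# PolynomialFeatures expands x into [1, x, x^2, x^3]; LinearRegression
# then fits ordinary least squares in that expanded feature space.
model = make_pipeline(PolynomialFeatures(degree=3), LinearRegression())
model.fit(X, y)
```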
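
The logistic regression notebooks above all build on the same pieces: a sigmoid, the logistic loss, and (for the regularized variants) an L2 penalty scaled by lambda. A minimal sketch of the regularized cost, assuming the usual convention of leaving the bias unpenalized:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def regularized_logistic_cost(w, b, X, y, lam):
    """Logistic loss plus an L2 penalty of (lam / (2m)) * ||w||^2."""
    m = len(y)
    p = sigmoid(X @ w + b)
    eps = 1e-15  # guard against log(0)
    loss = -(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)).mean()
    return loss + (lam / (2 * m)) * (w @ w)
```

Sweeping lam and plotting this cost is the mechanism behind the lambda-selection plots mentioned above.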

Neural Networks

  1. NeuralNetworkLinearRegression.ipynb

    • Description:
      • From regressions to neurons, this notebook implements a single-layer neural network for Linear Regression (a minimal sketch appears after this list).
      • A regression line is sketched to highlight the model's predictions.
    • Dataset: linearRegressionSimple.csv
  2. NeuralNetworkLogisticRegression.ipynb

    • Description:
      • Continuing the neural exploration, this notebook applies a single-layer neural network for Logistic Regression.
      • A plot of the fitted sigmoid function after tuning the weights and bias with scikit-learn.
      • Notably, it focuses on tumor classification - the model might not cure cancer, but it sure can classify it!
    • Dataset: logisticRegressionTumorSingle.csv - On a mission to classify tumors!
  3. NeuralNetworkLayers_binaryClassification.ipynb

    • Description: Unleashing the power of deep learning, this notebook explores a multi-layer neural network for binary classification, featuring:
      • Feature engineering using polynomial features up to degree 6.
      • Visualizations of activation layers and 3D surface plots.
      • Building the neural network from scratch, with no high-level libraries involved (a bare-bones forward-pass sketch appears after this list).
      • New: A 3D Decision Probability Plot, using Plotly - view or interact with it via nbviewer
    • Dataset: logisticRegressionCircular.txt
  4. multiClass_softmax.ipynb

    • Description: Multiclass classification by employing softmax activation in neural networks. Highlights include:
      • Comparing models built with and without from_logits=True, emphasizing the impact of this parameter on neural network performance (see the sketch after this list).
      • Featuring two different neural network architectures - a simple one with fewer units for a quick yet effective classification, and a moderately large one with more units to generate a more complex decision boundary.
      • Vivid visualizations of decision boundaries via meshgrid and ravel.
      • The data exploration journey begins with a custom-generated dataset using 'make_classification' from sklearn.datasets.
    • Dataset: multi_class_generated.csv - Multi-class problems have never been more fun to solve!
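
A minimal sketch of the single-layer regression network from the first notebook, written here with TensorFlow/Keras (an assumption about the stack; the from_logits discussion above suggests Keras is in play). One Dense unit with a linear activation is exactly linear regression:

```python
import numpy as np
import tensorflow as tf

# One Dense unit computes w.x + b - plain linear regression,
# here trained by gradient descent on a mean squared error loss.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(1, activation="linear"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(0.1), loss="mse")

# Toy data standing in for linearRegressionSimple.csv (made-up values).
X = np.linspace(0, 10, 100).reshape(-1, 1)
y = 3 * X.ravel() + 2 + np.random.randn(100)
model.fit(X, y, epochs=50, verbose=0)
```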
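
The from-scratch multi-layer network in NeuralNetworkLayers_binaryClassification.ipynb reduces to repeated applications of a dense layer. A bare-bones forward pass (names are illustrative, not the notebook's own):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dense(a_in, W, b, g):
    """One fully connected layer: a_out = g(a_in @ W + b)."""
    return g(a_in @ W + b)

def forward(X, layers):
    """Run X through a list of (W, b, activation) triples."""
    a = X
    for W, b, g in layers:
        a = dense(a, W, b, g)
    return a  # with a final sigmoid, this is P(class 1)
```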
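
For the softmax notebook, the numerically stable pattern is to leave the output layer linear and let the loss apply softmax internally via from_logits=True. A sketch; the layer sizes and class count here are placeholders, not the notebook's actual architecture:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(25, activation="relu"),
    tf.keras.layers.Dense(4, activation="linear"),  # raw logits, no softmax
])
model.compile(
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer="adam",
)
# Since the network outputs logits, apply softmax at inference time:
# probs = tf.nn.softmax(model(X_new))
```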

Miscellaneous

  1. dTrees.ipynb

    • Description:
  2. kmeans.ipynb

    • Description:
  3. anomalyDetection.ipynb

    • Description:
  4. pca.ipynb

    • Description: