
Welcome to the final Retrospective of the ML challenge, covering Weeks 4 to 10. As a quick introduction, for those who’d like to follow the full 10-week journey, here are the links to all previous posts:

· Original post about the challenge: #10WeeksOfMachineLearningFun

· Links to previous retrospectives: #Week1 #Week2 #Week3

Weeks 4–10 have now been completed, and so has the challenge!

I’m very pleased to have come this far and excited to tell you about all the things I’ve learned, but first things first: a quick explanation of why I ended up summarising the remaining weeks together, and so long after completing them:

- The approach I selected for Logistic regression in #Week3 (approximating the logistic regression function using a Single Layer Perceptron Neural Network, or SLPNN) took longer to unravel, from both a maths and a coding perspective, so much so that it was practically impossible to provide updates on a weekly basis
- Also, I probably digressed a bit during that period to understand some of the maths, which was good learning overall, e.g. cost functions and their derivatives, and most importantly when to use one over another and why :) (more on that below)
- Derivative of the Cost function: given my approach in #Week3, I had to make sure that the back-propagation chain-rule maths for the partial derivative of the Cost function with respect to the weights tied perfectly with the analytical calculation of the same partial derivative. To do so, I had to put pen to paper over and over again until it finally made sense. You can definitely trust the maths once you’ve verified it in two different ways!
- In fact, I have created a handwritten single page cheat-sheet that shows all these, which I’m planning to publish separately so stay tuned.
- Finally, a fair amount of the time initially planned for the Challenge during weeks 4–10 went to real-life priorities, both professional and personal. Which is exactly what happens with work, projects, life, etc. You just have to deal with the priorities, get back to what you were doing and finish the job! So here I am!
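The idea of verifying the back-propagation derivative against the analytical one can also be checked numerically. Below is a minimal sketch of my own (not the original #Week3 code, and all names are illustrative): a single-layer perceptron with a sigmoid activation for logistic regression, whose chain-rule gradient of the binary cross-entropy cost is compared against a finite-difference estimate.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(w, X, y):
    # Binary cross-entropy averaged over the samples
    y_hat = sigmoid(X @ w)
    return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

def analytic_grad(w, X, y):
    # Chain rule for sigmoid + cross-entropy collapses to: dC/dw = X^T (y_hat - y) / m
    y_hat = sigmoid(X @ w)
    return X.T @ (y_hat - y) / len(y)

def numeric_grad(w, X, y, eps=1e-6):
    # Central finite differences, perturbing one weight at a time
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (cost(w + e, X, y) - cost(w - e, X, y)) / (2 * eps)
    return g

# Tiny synthetic dataset: a bias column plus two features
rng = np.random.default_rng(0)
X = np.c_[np.ones(20), rng.normal(size=(20, 2))]
y = (X[:, 1] + X[:, 2] > 0).astype(float)
w = rng.normal(size=3)

# If the two derivations agree, this difference is tiny
diff = np.max(np.abs(analytic_grad(w, X, y) - numeric_grad(w, X, y)))
```

If `diff` comes out tiny (on the order of numerical round-off), the back-propagation maths and the analytical derivative agree, which is exactly the "verify in two different ways" check described above.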

perceptron neural-networks data-science machine-learning logistic-regression
