Differentiable programming makes the Machine Learning world go round. PyTorch, TensorFlow, and every other modern Machine Learning framework rely on it when it comes to the actual learning. Reason enough for anybody even remotely interested in AI to get acquainted with differentiable programming. We invite you to follow us through a two-part explanation of the conceptual framework that powers a large part of today’s AI industry. No Ph.D. required!
Differentiable programming is vital to Machine Learning because it is the basis of gradient descent, the family of optimization algorithms that does the grunt work of training Machine Learning models. The adjective differentiable comes from the noun differentiation, the mathematical process of computing derivatives. Yes, the good old derivatives from your high-school calculus class. Indeed, when it comes to AI, you get amazingly far with high-school math. For example, as we will see below, the gradient in the name gradient descent is essentially a collection of derivatives.
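To make that last point concrete, here is a minimal sketch (our own illustrative example, not code from any particular framework) of gradient descent on a toy two-variable function. The `loss` and `gradient` functions and the chosen learning rate are purely hypothetical; the point is that the gradient is nothing more than the collection of partial derivatives, and that repeatedly stepping against it drives the loss down.

```python
def loss(x, y):
    # Toy "loss" function: f(x, y) = (x - 3)^2 + (y + 1)^2, minimized at (3, -1).
    return (x - 3) ** 2 + (y + 1) ** 2

def gradient(x, y):
    # The gradient collects the partial derivatives with respect to each input:
    # df/dx = 2 * (x - 3), df/dy = 2 * (y + 1)  -- plain high-school power rule.
    return 2 * (x - 3), 2 * (y + 1)

# Gradient descent: start somewhere, then repeatedly take a small step
# in the direction opposite to the gradient to reduce the loss.
x, y = 0.0, 0.0          # arbitrary starting point
learning_rate = 0.1      # step size (a hyperparameter chosen for this toy example)

for step in range(100):
    dx, dy = gradient(x, y)
    x -= learning_rate * dx
    y -= learning_rate * dy

print(x, y)  # converges toward (3, -1), the minimum of the toy loss
```

In a real framework the derivatives are not written out by hand like this; they are produced automatically by differentiating the code itself, which is exactly what the rest of this explanation builds up to.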

#backpropagation #automatic-differentiation #gradient-descent #machine-learning #ai

High-School Math for Machines: Differentiable Code