News

Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. To find a local minimum of a function using gradient descent, we take steps ...
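A minimal sketch of the update rule that snippet describes: start from a point and repeatedly step against the gradient, scaled by a learning rate. The function f, the starting point, and the learning rate below are illustrative assumptions, not taken from the article.

def f(x):
    return (x - 3.0) ** 2           # simple convex function, minimum at x = 3

def grad_f(x):
    return 2.0 * (x - 3.0)          # derivative of f

x = 0.0                             # starting point (illustrative)
learning_rate = 0.1                 # step size (illustrative)

for _ in range(100):
    x = x - learning_rate * grad_f(x)   # step in the negative gradient direction

print(x)                            # converges towards 3.0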
The gradient descent method is a conventional method for optimizing a function. Since the gradient of a function points in the direction of steepest ascent, this method moves along the negative of the gradient, ...
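A quick numerical check of that claim, under assumed choices of function and test point (neither comes from the article): because the gradient points in the direction of steepest ascent, a small step along it raises the value of f, while a step along the negative gradient lowers it.

import numpy as np

def f(p):
    x, y = p
    return x ** 2 + 3.0 * y ** 2          # a simple quadratic bowl

def grad_f(p):
    x, y = p
    return np.array([2.0 * x, 6.0 * y])   # gradient of f

p = np.array([1.0, 1.0])                  # arbitrary test point
step = 0.01
g = grad_f(p)

print(f(p))                # 4.0 at the test point
print(f(p + step * g))     # larger: moving with the gradient goes uphill
print(f(p - step * g))     # smaller: moving against the gradient goes downhill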
The article aimed to demonstrate how to compile a neural network by defining a loss function and an optimizer. It also discussed what gradient descent is and how it is used. Finally, we ...
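The snippet does not name the framework, so the sketch below assumes the Keras API, where compiling attaches a loss function and an optimizer to the model; the layer sizes, learning rate, and dummy data are placeholder choices, not the article's code.

import numpy as np
import tensorflow as tf

# A small model; the architecture here is only for illustration.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Compiling: choose the loss function and a gradient-descent-based optimizer.
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
              loss="mean_squared_error")

# Dummy data, just to show that fit() runs gradient descent on the chosen loss.
X = np.random.rand(32, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")
model.fit(X, y, epochs=5, verbose=0)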
Gradient Descent from Scratch in Python. Deep Learning with Yacine. No libraries, no shortcuts, just pure math and code made simple. Posted: June 4, 2025 | Last updated: June 4, 2025.
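In that no-libraries spirit, here is a from-scratch sketch that fits y = w*x + b to a few points with plain Python; the data and hyperparameters are made up for illustration, and this is not the code from the post itself.

data = [(1.0, 3.1), (2.0, 4.9), (3.0, 7.2), (4.0, 8.8)]   # roughly y = 2x + 1

w, b = 0.0, 0.0
learning_rate = 0.01

for _ in range(2000):
    n = len(data)
    # Gradients of the mean squared error with respect to w and b.
    dw = sum(2.0 * (w * x + b - y) * x for x, y in data) / n
    db = sum(2.0 * (w * x + b - y) for x, y in data) / n
    # Gradient descent step: move against the gradient.
    w -= learning_rate * dw
    b -= learning_rate * db

print(w, b)   # approaches the least-squares fit for this data, roughly 1.9 and 1.2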