Calculus plays an integral (pun intended) role in machine learning. In Latin, 'calculus' means a small pebble. The word became associated with computation because the Romans did arithmetic with piles of stones. Today, calculus has evolved to become a branch which deals with much more than basic arithmetic. Simply put, it is a branch of mathematics that looks at functions as being made up of infinitesimally small pieces. Differential calculus cuts mathematical functions up into these small pieces, whereas integral calculus joins the small pieces back together. By analyzing how the small pieces are arranged, we can understand how functions vary. In this way, calculus is described as the study of continuous change.

Calculus for machine learning is often overlooked because deep learning libraries such as PyTorch conceal the underlying calculus formulae that make things work. But if you want to make sense of what goes on under the hood, or understand research papers discussing the latest advances in machine learning, you'll need a solid grasp of the basics of calculus.

So how is calculus used in machine learning? If you're already thinking about gradient descent, you're 'rolling' in the right direction! Consider a linear regression problem of fitting a straight line to a dataset. The straight line can be represented as:

$y = mx + b$

where $y$ is the predicted value, $m$ is the slope, $x$ is the input and $b$ is the y-intercept.

To compute the gradient, we need to differentiate the error function. Taking the mean squared error over $N$ data points $(x_i, y_i)$ as our error function:

$E = \frac{1}{N}\sum_{i=1}^{N}\left(y_i - (mx_i + b)\right)^2$

Since this function is defined by two variables ($m$ and $b$), this is a multivariate calculus problem that requires computing the partial derivatives with respect to each of the two variables. These derivatives work out to be:

$m' = \frac{\partial E}{\partial m} = \frac{2}{N}\sum_{i=1}^{N} -x_i\left(y_i - (mx_i + b)\right)$

$b' = \frac{\partial E}{\partial b} = \frac{2}{N}\sum_{i=1}^{N} -\left(y_i - (mx_i + b)\right)$

Now, we update the current values of $m$ and $b$ using the following equations:

$m = m - (\alpha * m')$

$b = b - (\alpha * b')$

where $\alpha$ is the learning rate, which describes how quickly the values change. This process is repeated until the error function reaches a small value. The values of $m$ and $b$ that we are left with are the optimum values for the best-fitting line.

Calculus is not limited to gradient computation, either. I'm sure you have all heard about backpropagation or generative adversarial networks; all of these have their foundations in calculus. In machine learning, more often than not we deal with functions that contain hundreds of variables and degrees of freedom.
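As a minimal sketch, the gradient-descent fitting procedure described above can be written in a few lines of plain Python. The function name, the learning rate, the step count, and the synthetic data points are illustrative choices of mine, not something from the post itself:

```python
# Gradient descent for fitting y = m*x + b by minimizing the
# mean squared error E = (1/N) * sum((y_i - (m*x_i + b))**2).
def fit_line(xs, ys, alpha=0.05, steps=5000):
    m, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Partial derivatives of E with respect to m and b
        m_grad = (2.0 / n) * sum(-x * (y - (m * x + b)) for x, y in zip(xs, ys))
        b_grad = (2.0 / n) * sum(-(y - (m * x + b)) for x, y in zip(xs, ys))
        # Update rule: step against the gradient, scaled by the learning rate
        m -= alpha * m_grad
        b -= alpha * b_grad
    return m, b

# Points that lie exactly on the line y = 2x + 1
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
m, b = fit_line(xs, ys)  # m ≈ 2, b ≈ 1
```

Because the data here are noiseless, the recovered slope and intercept converge to the true values; with real data, the loop would instead settle on the least-squares fit.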