Derivatives, mostly in the form of gradients and Hessians, are ubiquitous in machine learning. Automatic differentiation (AD), also called algorithmic differentiation or simply "autodiff", is a family of techniques similar to but more general than backpropagation for efficiently and accurately evaluating derivatives of numeric functions expressed as computer programs. AD is a small but established field with applications in areas including computational fluid dynamics, atmospheric sciences, and engineering design optimization. Until very recently, the fields of machine learning and AD have largely been unaware of each other and, in some cases, have independently discovered each other's results. Despite its relevance, general-purpose AD has been missing from the toolbox of the machine learning community, a situation slowly changing with its ongoing adoption in mainstream machine learning frameworks. We survey the intersection of AD and machine learning, cover applications where AD has direct relevance, and address the main implementation techniques. By precisely defining the main differentiation techniques and their interrelationships, we aim to bring clarity to the usage of the terms "autodiff", "automatic differentiation", and "symbolic differentiation" as these are encountered more and more in machine learning settings.
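As a concrete illustration of the idea (not taken from the survey itself), forward-mode AD can be sketched in a few lines of Python using dual numbers: each value carries its derivative alongside it, and arithmetic operations propagate both via the chain rule, giving exact derivatives rather than finite-difference approximations. The `Dual` class and `derivative` helper below are hypothetical names introduced for this sketch.

```python
import math

# A minimal sketch of forward-mode AD via dual numbers (illustrative,
# not the survey's reference implementation). Each Dual holds a primal
# value and a tangent; operator overloads apply the chain rule exactly.

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value = value   # primal value f(x)
        self.deriv = deriv   # tangent value f'(x)

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def sin(x):
    # Chain rule: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

def derivative(f, x):
    # Seed the input with tangent 1.0 and read the output tangent.
    return f(Dual(x, 1.0)).deriv

# Example: d/dx [x * sin(x) + x] at x = 2.0
f = lambda x: x * sin(x) + x
print(derivative(f, 2.0))  # sin(2) + 2*cos(2) + 1 ≈ 1.0768
```

Note that, unlike symbolic differentiation, no closed-form expression for the derivative is ever built; the derivative is computed numerically, yet exactly, alongside the function evaluation itself.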