**What kind of derivative is used in neural networks?**

In another question here I learned that there are multiple types of derivatives. I was under the impression that the derivative always tells you exactly how much the output (y) will change for a given change in the input (x) of a function, but apparently this is not true.

As I try to understand the nuances of this, it would help to know: when we find the partial derivative of the cost function w.r.t. a certain weight in a neural network, is this a *derivative* (apparently not exact) or a *discrete derivative* (apparently exact)?

Or… does it not really matter, because all we're trying to figure out is whether the weight *raises* or *lowers* the output?
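To make the distinction concrete, here is a minimal sketch (a toy one-weight "network" with a squared-error cost, names invented for illustration) comparing the analytic partial derivative with a finite-difference, or "discrete", approximation. The analytic derivative gives the *instantaneous* rate of change; for any finite step in the weight it is only approximate, and the approximation gets better as the step shrinks:

```python
def cost(w, x=2.0, y=3.0):
    # squared error of a one-weight model: prediction = w * x
    return (w * x - y) ** 2

def analytic_grad(w, x=2.0, y=3.0):
    # d/dw (w*x - y)^2 = 2 * (w*x - y) * x
    return 2 * (w * x - y) * x

w = 1.0
exact = analytic_grad(w)          # -4.0 at w=1, x=2, y=3
for h in (1.0, 0.1, 0.001):
    # finite-difference ("discrete") slope over a step of size h
    approx = (cost(w + h) - cost(w)) / h
    print(f"h={h}: discrete={approx:.4f}, analytic={exact}, gap={abs(approx - exact):.4f}")
```

As `h` shrinks, the discrete slope converges to the analytic value, which is exactly why the derivative is "not exact" for a finite weight update yet still reliably tells you the *direction* (sign) in which the cost moves.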

Submitted July 16, 2017 at 11:07AM by ClearlyCoder

via reddit http://ift.tt/2tg65zZ
