Deep Learning

For binary classification, we can use the cross-entropy loss with a sigmoid output:   
<math>g(z)=\frac{1}{1+e^{-z}}</math>   
<math>\min_{W} -\sum_{i=1}^{N}\left[y_i\log\left(g(f_W(x_i))\right) + (1-y_i)\log\left(1-g(f_W(x_i))\right)\right]</math>
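The loss above can be sketched numerically. This is a minimal NumPy illustration (the function names and example values are hypothetical, not from the article): <code>logits</code> plays the role of <math>f_W(x_i)</math>, and the summed Bernoulli cross-entropy matches the formula term by term.

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

def binary_cross_entropy(logits, y):
    # -sum_i [ y_i log g(f_W(x_i)) + (1 - y_i) log(1 - g(f_W(x_i))) ]
    p = sigmoid(logits)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Three hypothetical examples: logits f_W(x_i) and binary labels y_i.
logits = np.array([2.0, -1.0, 0.0])
y = np.array([1.0, 0.0, 1.0])
loss = binary_cross_entropy(logits, y)
```

In practice one minimizes this loss over <math>W</math> by gradient descent; numerically stable implementations work directly on the logits rather than on <math>g(f_W(x))</math> to avoid <code>log(0)</code>.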


==Misc==