Deep Learning Is a Non-Convex Problem: Why Is a Neural Network's Loss Function Non-Convex?


Ian Goodfellow once answered this question on Quora; the original text follows:

There are various ways to test for convexity.

One is to just plot a cross-section of the function and look at it. If it has a non-convex shape, you don’t need to write a proof; you have disproven convexity by counter-example.
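
As a concrete illustration of this cross-section test (a hypothetical sketch, not part of the original answer), one can evaluate the loss of a small PyTorch net along a straight line between two parameter settings and look for non-convex bumps; the net, data, and interpolation range here are all illustrative:

```python
import torch
import matplotlib.pyplot as plt

torch.manual_seed(0)

# Toy data and a small two-layer net (all choices here are illustrative).
x = torch.randn(64, 2)
y = torch.randn(64, 1)
net = torch.nn.Sequential(torch.nn.Linear(2, 8), torch.nn.Tanh(), torch.nn.Linear(8, 1))
mse = torch.nn.MSELoss()

# Two parameter settings theta0, theta1; evaluate the loss along the line
# theta(alpha) = (1 - alpha) * theta0 + alpha * theta1.
theta0 = [p.detach().clone() for p in net.parameters()]
theta1 = [torch.randn_like(p) for p in net.parameters()]

alphas = torch.linspace(-0.5, 1.5, 101)
losses = []
with torch.no_grad():
    for a in alphas:
        for p, t0, t1 in zip(net.parameters(), theta0, theta1):
            p.copy_((1 - a) * t0 + a * t1)
        losses.append(mse(net(x), y).item())

plt.plot(alphas, losses)
plt.xlabel("alpha")
plt.ylabel("loss")
plt.show()  # any bump or plateau that curves downward disproves convexity
```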

If you want to do this with algebra, one way is just to take the second derivatives of a function. If the second derivative of a function in 1-D space is ever negative, the function isn’t convex.
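
For example (an illustrative check, using sin as the 1-D function):

```python
import sympy as sp

x = sp.symbols("x")
f = sp.sin(x)                  # a 1-D function to test, chosen for illustration
f2 = sp.diff(f, x, 2)          # second derivative: -sin(x)
print(f2)                      # -sin(x), which is negative on (0, pi)
print(f2.subs(x, sp.pi / 2))   # -1 < 0, so sin is not convex
```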

For neural nets, you have millions of parameters, so you need a test that works in high-dimensional space. In high-dimensional space, it turns out we can take the second derivative along one specific direction in space. For a unit vector d giving the direction and a Hessian matrix H of second derivatives, this is given by $d^\top H d$.

For most neural nets and most loss functions, it's very easy to find a point in parameter space and a direction where $d^\top H d$ is negative.
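
A runnable PyTorch sketch of this counter-example search, assuming a toy one-hidden-unit tanh regression (the data, the weight packing, and loss_fn are all illustrative, not from Goodfellow's answer), computes $d^\top H d$ via a Hessian-vector product and stops once it finds negative curvature:

```python
import torch

torch.manual_seed(0)

# Toy regression problem and a tiny tanh net: y_hat = w[1] * tanh(w[0] * x).
x = torch.randn(32, 1)
y = torch.sin(3 * x)

def loss_fn(w):
    return ((w[1] * torch.tanh(w[0] * x) - y) ** 2).mean()

# Sample random points w and unit directions d until d^T H d < 0.
for trial in range(100):
    w = torch.randn(2, requires_grad=True)
    d = torch.randn(2)
    d = d / d.norm()
    # Gradient with create_graph=True so it can be differentiated again.
    g = torch.autograd.grad(loss_fn(w), w, create_graph=True)[0]
    # Hessian-vector product H d via double backward, then d^T (H d).
    Hd = torch.autograd.grad(g @ d, w)[0]
    curvature = (d @ Hd).item()
    if curvature < 0:
        print(f"trial {trial}: d^T H d = {curvature:.4f} < 0, so the loss is not convex")
        break
```

The double-backward trick avoids ever forming the full Hessian, which is what makes the same test usable on nets with millions of parameters.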
