

[Deep learning study notes] Annotating yusugomori's LR code --- LogisticRegression.cpp

Published: 2023/12/9

This is the model implementation code. The key parts are the train and predict functions, and both are straightforward.
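The weight update inside train follows from gradient ascent on the log-likelihood of the softmax model. The original post only states the rule in a comment; a brief sketch of the standard derivation (with learning rate η):

```latex
% softmax output for class i, given input x
p_i = \frac{e^{w_i \cdot x + b_i}}{\sum_k e^{w_k \cdot x + b_k}}

% log-likelihood of a one-hot target y
\mathcal{L} = \sum_i y_i \log p_i

% its gradient with respect to w_{ij} simplifies to
\frac{\partial \mathcal{L}}{\partial w_{ij}} = (y_i - p_i)\, x_j

% hence the gradient-ascent update used in train (scaled by 1/N)
w_{ij} \leftarrow w_{ij} + \frac{\eta}{N}\,(y_i - p_i)\, x_j
```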


#include <iostream>
#include <string>
#include <math.h>
#include "LogisticRegression.h"
using namespace std;

LogisticRegression::LogisticRegression(
        int size,   // N
        int in,     // n_in
        int out     // n_out
) {
    N = size;
    n_in = in;
    n_out = out;

    // initialize W, b
    // W[n_out][n_in], b[n_out]
    W = new double*[n_out];
    for(int i=0; i<n_out; i++) W[i] = new double[n_in];
    b = new double[n_out];

    for(int i=0; i<n_out; i++) {
        for(int j=0; j<n_in; j++) {
            W[i][j] = 0;
        }
        b[i] = 0;
    }
}

LogisticRegression::~LogisticRegression() {
    for(int i=0; i<n_out; i++) delete[] W[i];
    delete[] W;
    delete[] b;
}

void LogisticRegression::train(
        int *x,     // the input from input nodes in the training set
        int *y,     // the target output (one-hot) in the training set
        double lr   // the learning rate
) {
    // the probability P(y|x)
    double *p_y_given_x = new double[n_out];
    // a temporary variable; it does not strictly need to be an array
    double *dy = new double[n_out];

    // step 1: compute the softmax output for the given input
    for(int i=0; i<n_out; i++) {
        p_y_given_x[i] = 0;
        for(int j=0; j<n_in; j++) {
            // the network weights
            p_y_given_x[i] += W[i][j] * x[j];
        }
        // the bias
        p_y_given_x[i] += b[i];
    }
    // the softmax value
    softmax(p_y_given_x);

    // step 2: update the network weights
    // w_new = w_old + learningRate * gradient (derivative)
    //       = w_old + learningRate * x * (1{y_i=y} - p_yi_given_x)
    //       = w_old + learningRate * x * (y - p_y_given_x)
    for(int i=0; i<n_out; i++) {
        dy[i] = y[i] - p_y_given_x[i];
        for(int j=0; j<n_in; j++) {
            W[i][j] += lr * dy[i] * x[j] / N;
        }
        b[i] += lr * dy[i] / N;
    }

    delete[] p_y_given_x;
    delete[] dy;
}

void LogisticRegression::softmax(double *x) {
    // step 1: find the max element of the x vector
    // (initialize from x[0] rather than 0.0 so all-negative inputs are handled)
    double max = x[0];
    double sum = 0.0;
    for(int i=1; i<n_out; i++) if(max < x[i]) max = x[i];

    // step 2: exponentiate and normalize.
    // Subtracting the max ('x[i]-max') does not change the softmax result,
    // and is not needed in traditional LR -- it is here to keep exp() from
    // overflowing, a standard numerical-stability trick.
    for(int i=0; i<n_out; i++) {
        x[i] = exp(x[i] - max);
        sum += x[i];
    }
    for(int i=0; i<n_out; i++) x[i] /= sum;
}

void LogisticRegression::predict(
        int *x,     // the input from input nodes in the testing set
        double *y   // the computed softmax probabilities
) {
    // compute the softmax output of the current network
    for(int i=0; i<n_out; i++) {
        y[i] = 0;
        for(int j=0; j<n_in; j++) {
            y[i] += W[i][j] * x[j];
        }
        y[i] += b[i];
    }
    softmax(y);
}
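To see the train/predict pair in action without the original LogisticRegression.h header, here is a minimal self-contained sketch of the same softmax-regression update using std::vector instead of raw pointers. The class name SoftmaxRegression, the run_demo driver, and the toy dataset are my own illustrative assumptions (the data is modeled on the kind of separable one-hot example yusugomori's demos use); the per-sample 1/N scaling from the original is folded into the learning rate here.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Numerically stable softmax: subtract the max before exponentiating.
std::vector<double> softmax(std::vector<double> v) {
    double mx = v[0];
    for (double t : v) if (t > mx) mx = t;
    double sum = 0.0;
    for (double &t : v) { t = std::exp(t - mx); sum += t; }
    for (double &t : v) t /= sum;
    return v;
}

// Hypothetical condensed version of the class in the post.
struct SoftmaxRegression {
    int n_in, n_out;
    std::vector<std::vector<double>> W;  // W[n_out][n_in]
    std::vector<double> b;               // b[n_out]

    SoftmaxRegression(int in, int out)
        : n_in(in), n_out(out),
          W(out, std::vector<double>(in, 0.0)), b(out, 0.0) {}

    // Forward pass: linear scores followed by softmax.
    std::vector<double> predict(const std::vector<int> &x) const {
        std::vector<double> a(n_out, 0.0);
        for (int i = 0; i < n_out; ++i) {
            for (int j = 0; j < n_in; ++j) a[i] += W[i][j] * x[j];
            a[i] += b[i];
        }
        return softmax(a);
    }

    // One gradient step: w += lr * (y - p) * x, as in the post's train().
    void train(const std::vector<int> &x, const std::vector<int> &y, double lr) {
        std::vector<double> p = predict(x);
        for (int i = 0; i < n_out; ++i) {
            double dy = y[i] - p[i];
            for (int j = 0; j < n_in; ++j) W[i][j] += lr * dy * x[j];
            b[i] += lr * dy;
        }
    }
};

int argmax(const std::vector<double> &v) {
    int best = 0;
    for (int i = 1; i < (int)v.size(); ++i) if (v[i] > v[best]) best = i;
    return best;
}

// Train on a tiny separable dataset; returns how many of the six
// training samples are classified correctly afterwards.
int run_demo() {
    std::vector<std::vector<int>> X = {
        {1,1,1,0,0,0}, {1,0,1,0,0,0}, {1,1,1,0,0,0},
        {0,0,0,1,1,0}, {0,0,1,1,1,0}, {0,0,0,1,1,1}};
    std::vector<std::vector<int>> Y = {
        {1,0}, {1,0}, {1,0}, {0,1}, {0,1}, {0,1}};

    SoftmaxRegression clf(6, 2);
    for (int epoch = 0; epoch < 500; ++epoch)
        for (size_t k = 0; k < X.size(); ++k)
            clf.train(X[k], Y[k], 0.1);

    int correct = 0;
    for (size_t k = 0; k < X.size(); ++k) {
        int label = (Y[k][0] == 1) ? 0 : 1;
        if (argmax(clf.predict(X[k])) == label) ++correct;
    }
    return correct;
}
```

After a few hundred epochs the zero-initialized weights separate the two classes, so run_demo() should report all six samples correct.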



Reposted from: https://www.cnblogs.com/dyllove98/p/3194108.html

