[Deep Learning Study Notes] Annotating yusugomori's LR Code --- LogisticRegression.cpp
This is the model implementation code. The key parts are the train function and the predict function, and both are straightforward.
```cpp
#include <iostream>
#include <string>
#include <math.h>
#include "LogisticRegression.h"
using namespace std;

LogisticRegression::LogisticRegression(
    int size,   // N
    int in,     // n_in
    int out     // n_out
) {
    N = size;
    n_in = in;
    n_out = out;

    // initialize W, b
    // W[n_out][n_in], b[n_out]
    W = new double*[n_out];
    for(int i=0; i<n_out; i++) W[i] = new double[n_in];
    b = new double[n_out];

    for(int i=0; i<n_out; i++) {
        for(int j=0; j<n_in; j++) {
            W[i][j] = 0;
        }
        b[i] = 0;
    }
}

LogisticRegression::~LogisticRegression() {
    for(int i=0; i<n_out; i++) delete[] W[i];
    delete[] W;
    delete[] b;
}

void LogisticRegression::train(
    int *x,     // the input from the input nodes of one training sample
    int *y,     // the one-hot label for the output nodes
    double lr   // the learning rate
) {
    // the probability P(y|x)
    double *p_y_given_x = new double[n_out];
    // temporary variable (does not strictly need to be an array)
    double *dy = new double[n_out];

    // step 1: compute the softmax output for this input
    for(int i=0; i<n_out; i++) {
        p_y_given_x[i] = 0;
        for(int j=0; j<n_in; j++) {
            // the network weights
            p_y_given_x[i] += W[i][j] * x[j];
        }
        // the bias
        p_y_given_x[i] += b[i];
    }
    softmax(p_y_given_x);

    // step 2: update the network weights
    // w_new = w_old + learningRate * gradient (derivative)
    //       = w_old + learningRate * x * (1{y_i = y} - p_yi_given_x)
    //       = w_old + learningRate * x * (y - p_y_given_x)
    for(int i=0; i<n_out; i++) {
        dy[i] = y[i] - p_y_given_x[i];
        for(int j=0; j<n_in; j++) {
            W[i][j] += lr * dy[i] * x[j] / N;
        }
        b[i] += lr * dy[i] / N;
    }

    delete[] p_y_given_x;
    delete[] dy;
}

void LogisticRegression::softmax(double *x) {
    // step 1: find the max of the x vector
    double max = x[0];
    for(int i=1; i<n_out; i++) if(max < x[i]) max = x[i];

    // step 2: shift, exponentiate, and normalize.
    // Subtracting the max is not part of the LR math itself (softmax is
    // invariant to a constant shift of its inputs); it is the standard
    // numerical-stability trick that keeps exp() from overflowing for
    // large activations.
    double sum = 0.0;
    for(int i=0; i<n_out; i++) {
        x[i] = exp(x[i] - max);
        sum += x[i];
    }
    for(int i=0; i<n_out; i++) x[i] /= sum;
}

void LogisticRegression::predict(
    int *x,     // the input from the input nodes of one test sample
    double *y   // the computed softmax probabilities
) {
    // softmax output of the current network for this input
    for(int i=0; i<n_out; i++) {
        y[i] = 0;
        for(int j=0; j<n_in; j++) {
            y[i] += W[i][j] * x[j];
        }
        y[i] += b[i];
    }
    softmax(y);
}
```
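The weight update in `train` is one step of gradient ascent on the log-likelihood of the softmax model, averaged over the `N` training samples. For an input $\mathbf{x}$ with one-hot label vector $\mathbf{y}$ (true class $c$), the derivation the code comment sketches is:

$$
p_i = \frac{e^{\mathbf{w}_i^\top \mathbf{x} + b_i}}{\sum_{k=1}^{n_{\text{out}}} e^{\mathbf{w}_k^\top \mathbf{x} + b_k}},
\qquad
\frac{\partial \log p_c}{\partial w_{ij}} = \bigl(\mathbf{1}\{i = c\} - p_i\bigr)\, x_j = (y_i - p_i)\, x_j ,
$$

so the update $w_{ij} \leftarrow w_{ij} + \eta\,(y_i - p_i)\,x_j / N$ is exactly what the inner loop computes, with `dy[i]` holding $y_i - p_i$; the bias gradient is the same expression with $x_j$ replaced by 1.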
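For reference, here is a minimal test harness in the same style. It is only a sketch, under the assumption that `LogisticRegression.h` declares the members used above (`N`, `n_in`, `n_out`, `W`, `b`) and these three methods; the toy data and hyperparameters are illustrative, not from the original post.

```cpp
#include <cstdio>
#include "LogisticRegression.h"

int main() {
    double learning_rate = 0.1;
    int n_epochs = 500;

    // 6 training samples, 6 binary input features, 2 classes (one-hot labels)
    int train_X[6][6] = {
        {1, 1, 1, 0, 0, 0},
        {1, 0, 1, 0, 0, 0},
        {1, 1, 1, 0, 0, 0},
        {0, 0, 1, 1, 1, 0},
        {0, 0, 1, 1, 0, 0},
        {0, 0, 1, 1, 1, 0}
    };
    int train_Y[6][2] = {
        {1, 0}, {1, 0}, {1, 0},
        {0, 1}, {0, 1}, {0, 1}
    };

    // N = 6 samples, n_in = 6 features, n_out = 2 classes
    LogisticRegression classifier(6, 6, 2);

    // one gradient step per sample, repeated for n_epochs
    for(int epoch = 0; epoch < n_epochs; epoch++) {
        for(int i = 0; i < 6; i++) {
            classifier.train(train_X[i], train_Y[i], learning_rate);
        }
    }

    // print the predicted class probabilities for a held-out sample
    int test_X[6] = {1, 0, 1, 0, 0, 0};
    double test_Y[2];
    classifier.predict(test_X, test_Y);
    printf("P(class 0) = %f, P(class 1) = %f\n", test_Y[0], test_Y[1]);

    return 0;
}
```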
Reposted from: https://www.cnblogs.com/dyllove98/p/3194108.html