
Andrew Ng Machine Learning Ex3 Assignment


Task: use logistic regression and a neural network to recognize handwritten digits.

Setting up the parameters
Each image is 20x20 pixels and is unrolled into a row vector of 400 features.
Dataset labels: 1-10, where the digit 0 is labeled as 10.
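A minimal sketch for checking the data dimensions and the 0-to-10 label mapping before running the exercise script (it assumes ex3data1.mat from the course materials is on the path):

load('ex3data1.mat');   % provides X (5000x400) and y (5000x1)
size(X)                 % each row is one 20x20 image unrolled into 1x400
unique(y)'              % labels run from 1 to 10; label 10 stands for digit "0"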

%% Setup the parameters you will use for this part of the exercise
input_layer_size = 400;   % 20x20 Input Images of Digits
num_labels = 10;          % 10 labels, from 1 to 10
                          % (note that we have mapped "0" to label 10)

Part 1: Loading and Visualizing Data

Load the data.
Plot the data: randomly select 100 images and display them.

%% =========== Part 1: Loading and Visualizing Data =============
%  We start the exercise by first loading and visualizing the dataset.
%  You will be working with a dataset that contains handwritten digits.

% Load Training Data
fprintf('Loading and Visualizing Data ...\n')

load('ex3data1.mat');   % training data stored in arrays X, y
m = size(X, 1);         % number of examples, m = 5000

% Randomly select 100 data points to display
rand_indices = randperm(m);        % random permutation of 1..m
sel = X(rand_indices(1:100), :);   % pick 100 examples to display: 100x400

displayData(sel);

fprintf('Program paused. Press enter to continue.\n');
pause;

The function randperm returns a random permutation:

randperm - Random permutation

    This MATLAB function returns a row vector containing a random
    permutation of the integers from 1 to n inclusive.

    p = randperm(n)
    p = randperm(n,k)
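For example (the concrete outputs in the comments are only one possible result, since the permutation is random):

p = randperm(5)       % e.g. p = [3 5 1 2 4], a random ordering of 1..5
q = randperm(10, 3)   % e.g. q = [7 2 9], 3 unique values drawn from 1..10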

The plotting function displayData:

function [h, display_array] = displayData(X, example_width)
%DISPLAYDATA Display 2D data in a nice grid
%   [h, display_array] = DISPLAYDATA(X, example_width) displays 2D data
%   stored in X in a nice grid. It returns the figure handle h and the
%   displayed array if requested.

% Set example_width automatically if not passed in
if ~exist('example_width', 'var') || isempty(example_width)
    example_width = round(sqrt(size(X, 2)));   % sqrt of the feature count: 400 -> width 20
end

% Gray Image
colormap(gray);

% Compute rows, cols
[m n] = size(X);                        % 100x400
example_height = (n / example_width);   % image height = 400 / width = 20

% Compute number of items to display
display_rows = floor(sqrt(m));          % 10 rows
display_cols = ceil(m / display_rows);  % 10 columns

% Between images padding
pad = 1;

% Setup blank display
display_array = - ones(pad + display_rows * (example_height + pad), ...
                       pad + display_cols * (example_width + pad));

% Copy each example into a patch on the display array
curr_ex = 1;
for j = 1:display_rows
    for i = 1:display_cols
        if curr_ex > m,   % stop once all (100) examples have been placed
            break;
        end
        % Copy the patch
        % Get the max value of the patch
        max_val = max(abs(X(curr_ex, :)));   % max pixel value of this example (one image)
        display_array(pad + (j - 1) * (example_height + pad) + (1:example_height), ...
                      pad + (i - 1) * (example_width + pad) + (1:example_width)) = ...
                      reshape(X(curr_ex, :), example_height, example_width) / max_val;
        curr_ex = curr_ex + 1;   % move on to the next example
    end
    if curr_ex > m,
        break;
    end
end

% Display Image
h = imagesc(display_array, [-1 1]);

% Do not show axis
axis image off

drawnow;

end

Part 2a: Vectorize Logistic Regression

Vectorized logistic regression.

%% ============ Part 2a: Vectorize Logistic Regression ============
%  In this part of the exercise, you will reuse your logistic regression
%  code from the last exercise. Your task here is to make sure that your
%  regularized logistic regression implementation is vectorized. After
%  that, you will implement one-vs-all classification for the handwritten
%  digit dataset.

% Test case for lrCostFunction
fprintf('\nTesting lrCostFunction() with regularization');

theta_t = [-2; -1; 1; 2];
X_t = [ones(5,1) reshape(1:15,5,3)/10];
y_t = ([1;0;1;0;1] >= 0.5);
lambda_t = 3;
[J grad] = lrCostFunction(theta_t, X_t, y_t, lambda_t);

fprintf('\nCost: %f\n', J);
fprintf('Expected cost: 2.534819\n');
fprintf('Gradients:\n');
fprintf(' %f \n', grad);
fprintf('Expected gradients:\n');
fprintf(' 0.146561\n -0.548558\n 0.724722\n 1.398003\n');

fprintf('Program paused. Press enter to continue.\n');
pause;

The logistic regression code:
lrCostFunction.m

The key to vectorizing the gradient is to compute the hypothesis for all examples at once with sigmoid(X * theta), which yields one prediction per row of X.

The regularized cost function and gradient used below are:
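Written out, these are the standard regularized logistic-regression formulas, matching the implementation in lrCostFunction.m below (note that $\theta_0$ is not regularized):

$$
J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Big[\, y^{(i)}\log h_\theta(x^{(i)}) + \big(1-y^{(i)}\big)\log\big(1-h_\theta(x^{(i)})\big) \Big] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2,
\qquad h_\theta(x) = \frac{1}{1+e^{-\theta^T x}}
$$

and, in vectorized form, with the first entry of $\theta$ zeroed out in the regularization term:

$$
\nabla J(\theta) = \frac{1}{m}\, X^T\big(\sigma(X\theta) - y\big) + \frac{\lambda}{m}\begin{bmatrix}0 \\ \theta_{2:n+1}\end{bmatrix}
$$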

function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for logistic regression with
%regularization
%   J = LRCOSTFUNCTION(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. to the parameters.

% Initialize some useful values
m = length(y);   % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Hint: The computation of the cost function and gradients can be
%       efficiently vectorized. For example, consider the computation
%
%           sigmoid(X * theta)
%
%       Each row of the resulting matrix will contain the value of the
%       prediction for that example. You can make use of this to vectorize
%       the cost function and gradient computations.
%
% Hint: When computing the gradient of the regularized cost function,
%       there're many possible vectorized solutions, but one solution
%       looks like:
%           grad = (unregularized gradient for logistic regression)
%           temp = theta;
%           temp(1) = 0;   % because we don't add anything for j = 0
%           grad = grad + YOUR_CODE_HERE (using the temp variable)
%

theta_1 = [0; theta(2:end)];   % do not regularize theta(1)
J = -1/m * sum(y .* log(sigmoid(X*theta)) + (1-y) .* log(1 - sigmoid(X*theta))) ...
    + lambda/(2*m) * sum(theta_1' * theta_1);

grad = 1/m * X' * (sigmoid(X*theta) - y) + lambda/m * theta_1;

% =============================================================

grad = grad(:);

end

Below is the function that trains the multi-class (one-vs-all) classifiers:

function [all_theta] = oneVsAll(X, y, num_labels, lambda)
%ONEVSALL trains multiple logistic regression classifiers and returns all
%the classifiers in a matrix all_theta, where the i-th row of all_theta
%corresponds to the classifier for label i
%   [all_theta] = ONEVSALL(X, y, num_labels, lambda) trains num_labels
%   logistic regression classifiers and returns each of these classifiers
%   in a matrix all_theta, where the i-th row of all_theta corresponds
%   to the classifier for label i

% Some useful variables
m = size(X, 1);   % number of examples (rows)
n = size(X, 2);   % number of features (columns)

% You need to return the following variables correctly
all_theta = zeros(num_labels, n + 1);   % size: num_labels x (n + 1)

% Add ones to the X data matrix
X = [ones(m, 1) X];   % add a column of ones (bias term)

% ====================== YOUR CODE HERE ======================
% Instructions: You should complete the following code to train num_labels
%               logistic regression classifiers with regularization
%               parameter lambda.
%
% Hint: theta(:) will return a column vector.
%
% Hint: You can use y == c to obtain a vector of 1's and 0's that tell you
%       whether the ground truth is true/false for this class.
%
% Note: For this assignment, we recommend using fmincg to optimize the cost
%       function. It is okay to use a for-loop (for c = 1:num_labels) to
%       loop over the different classes.
%
%       fmincg works similarly to fminunc, but is more efficient when we
%       are dealing with a large number of parameters.
%
% Example Code for fmincg:
%
%     % Set Initial theta
%     initial_theta = zeros(n + 1, 1);
%
%     % Set options for fminunc
%     options = optimset('GradObj', 'on', 'MaxIter', 50);
%
%     % Run fmincg to obtain the optimal theta
%     % This function will return theta and the cost
%     [theta] = ...
%         fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
%                 initial_theta, options);
%

for i = 1:num_labels   % loop over the labels
    initial_theta = zeros(n+1, 1);
    options = optimset('GradObj', 'on', 'MaxIter', 50);
    [theta] = ...
        fmincg(@(t) lrCostFunction(t, X, (y == i), lambda), ...
               initial_theta, options);
    all_theta(i,:) = theta';   % store this classifier's parameters as row i of all_theta
end

% =========================================================================

end
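A minimal usage sketch (the value lambda = 0.1 is only an illustrative choice of regularization strength, not prescribed by the text above):

lambda = 0.1;                                       % illustrative regularization strength
[all_theta] = oneVsAll(X, y, num_labels, lambda);   % trains 10 classifiers; all_theta is 10x401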

Understanding

This function returns a matrix $\Theta \in \mathbb{R}^{K\times(N+1)}$, where K is the number of classifiers, i.e. num_labels. In this exercise there are 10 classifiers, each one recognizing one of the 10 digits. Each row holds the parameters of the corresponding classifier.

The function fmincg minimizes the cost function to obtain the parameter vector $\theta$ of each classifier, and these vectors are then collected into $\Theta$.

In this exercise all_theta has size 10x401: 400 is the number of pixels per image, i.e. 400 features fed into each logistic regression model (plus the bias term). Each classifier outputs one score, and the class with the largest score is taken as the prediction. For example, if the 10 classifiers output [0.8147, 0.9058, 0.1270, 0.9134, 0.9649, 0.0975, 0.2785, 0.5469, 0.9575, 0.7601], the maximum is 0.9649 at index 5 (1-based), so the predicted label is 5, i.e. the digit 5.
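A small sketch of this argmax step (the scores vector is the hypothetical example above, not real classifier output):

scores = [0.8147 0.9058 0.1270 0.9134 0.9649 0.0975 0.2785 0.5469 0.9575 0.7601];
[val, idx] = max(scores);   % val = 0.9649, idx = 5 -> predicted label 5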

The prediction function:

function p = predictOneVsAll(all_theta, X)
%PREDICT Predict the label for a trained one-vs-all classifier. The labels
%are in the range 1..K, where K = size(all_theta, 1).
%  p = PREDICTONEVSALL(all_theta, X) will return a vector of predictions
%  for each example in the matrix X. Note that X contains the examples in
%  rows. all_theta is a matrix where the i-th row is a trained logistic
%  regression theta vector for the i-th class. You should set p to a vector
%  of values from 1..K (e.g., p = [1; 3; 1; 2] predicts classes 1, 3, 1, 2
%  for 4 examples)

m = size(X, 1);                    % number of test examples, 5000 here
num_labels = size(all_theta, 1);   % number of labels, 10 here

% You need to return the following variables correctly
p = zeros(size(X, 1), 1);          % 5000x1 column vector

% Add ones to the X data matrix
X = [ones(m, 1) X];

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters (one-vs-all).
%               You should set p to a vector of predictions (from 1 to
%               num_labels).
%
% Hint: This code can be done all vectorized using the max function.
%       In particular, the max function can also return the index of the
%       max element, for more information see 'help max'. If your examples
%       are in rows, then, you can use max(A, [], 2) to obtain the max
%       for each row.
%

[q, p] = max((X * all_theta'), [], 2);   % p holds, per row, the index of the highest-scoring classifier

% =========================================================================

end
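A minimal sketch of calling the prediction function on the training set and reporting accuracy (the fraction of predictions matching y), assuming all_theta was produced by oneVsAll above:

pred = predictOneVsAll(all_theta, X);   % 5000x1 vector of predicted labels
fprintf('Training Set Accuracy: %f\n', mean(double(pred == y)) * 100);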

Score so far.
