Posted by Halcom on 2017-9-28 22:31:42

Logistic Regression

See also Softmax regression.

Taking digit images as the example: X is a 500x400 matrix (500 images, each 20x20 pixels unrolled into 400 features); y is a 500x1 vector of labels 1:10. The main script is:

clc, clear, close all
load('Xy.mat')
% X is 500x400: 500 images, each 20x20
% y is 500x1, labels 1:10
num_labels = 10;   % label 10 stands for the digit 0
lambda = 0.1;      % regularization parameter, suppresses overfitting
[all_theta] = oneVsAll(X, y, num_labels, lambda);

% Prediction
% pred = predictOneVsAll(all_theta, X);
m = size(X, 1);                    % m = 500
num_labels = size(all_theta, 1);   % num_labels = 10
pred = zeros(size(X, 1), 1);
% Add ones (a bias column) to the X data matrix
X = [ones(m, 1) X];
[~, pred] = max(X * all_theta', [], 2);   % pick the best-scoring class per row
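For readers without MATLAB, the prediction step above can be sketched in NumPy (a hypothetical translation, not part of the original post; shapes follow the post: all_theta is num_labels x (n+1), X is m x n before the bias column is added):

```python
import numpy as np

def predict_one_vs_all(all_theta, X):
    """For each row of X, return the label of the highest-scoring classifier.
    all_theta: (num_labels, n+1); X: (m, n) without the bias column."""
    m = X.shape[0]
    X = np.hstack([np.ones((m, 1)), X])   # add the bias column, as in the MATLAB code
    scores = X @ all_theta.T              # (m, num_labels)
    return np.argmax(scores, axis=1) + 1  # labels run 1..num_labels, as in the post

# Tiny usage example with made-up weights: class 1 fires on positive x, class 2 on negative
all_theta = np.array([[0.0, 1.0], [0.0, -1.0]])
X = np.array([[2.0], [-3.0]])
print(predict_one_vs_all(all_theta, X))   # [1 2]
```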
Optimizing theta for each class (oneVsAll):
function [all_theta] = oneVsAll(X, y, num_labels, lambda)
% One-vs-all classification:
% when training digit 1, samples labeled 1 form the positive class (1) and all others the negative class (0)
% when training digit 2, samples labeled 2 form the positive class and all others the negative class
% when training digit 3, samples labeled 3 form the positive class and all others the negative class
% ...
% when training digit 10 (10 stands for the digit 0), samples labeled 10 form the positive class and all others the negative class

m = size(X, 1);   % m = 500, number of samples
n = size(X, 2);   % n = 400, number of features

% Initialize all_theta, 10x401
all_theta = zeros(num_labels, n + 1);

% Add a column of ones (the bias term) to X
X = [ones(m, 1) X];

% Set Initial theta
initial_theta = zeros(n + 1, 1);
% Set options for fmincg
options = optimset('GradObj', 'on', 'MaxIter', 50);

% Run fmincg once per class to obtain the optimal theta;
% it returns both theta and the final cost
for iter = 1:num_labels
    % theta that minimizes J for class iter
    all_theta(iter, :) = fmincg(@(t) lrCostFunction(t, X, (y == iter), lambda), initial_theta, options);
end
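The same one-vs-all training loop can be sketched in Python, with scipy.optimize.minimize standing in for fmincg (an assumption on my part: fmincg is a conjugate-gradient routine shipped with the original exercise, not a standard library function):

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost(theta, X, y, lam):
    """Regularized logistic-regression cost; theta[0] (bias) is not penalized."""
    m = y.size
    h = sigmoid(X @ theta)
    reg = (lam / (2 * m)) * np.sum(theta[1:] ** 2)
    return (-(y @ np.log(h)) - ((1 - y) @ np.log(1 - h))) / m + reg

def one_vs_all(X, y, num_labels, lam):
    m, n = X.shape
    X = np.hstack([np.ones((m, 1)), X])     # bias column, as in the MATLAB code
    all_theta = np.zeros((num_labels, n + 1))
    for c in range(1, num_labels + 1):
        # binary relabeling: class c vs. the rest, exactly like (y == iter) above
        res = minimize(lr_cost, np.zeros(n + 1),
                       args=(X, (y == c).astype(float), lam),
                       method='CG', options={'maxiter': 50})
        all_theta[c - 1] = res.x
    return all_theta

# Smoke test: two classes separable on the sign of a single feature
all_theta = one_vs_all(np.array([[1.0], [2.0], [-1.0], [-2.0]]),
                       np.array([1, 1, 2, 2]), 2, 0.1)
```

The conjugate-gradient method mirrors fmincg's algorithm; any gradient-based minimizer would do here.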

The regularized logistic regression cost function is J(theta) = (1/m)*(-y'*log(h) - (1-y)'*log(1-h)) + lambda/(2*m)*sum(theta(2:end).^2), where h = sigmoid(X*theta):
function [J, grad] = lrCostFunction(theta, X, y, lambda)
% Compute cost and gradient for logistic regression with regularization

% Initialize some useful values
m = length(y); % number of training examples

% Initialize return values
J = 0;
grad = zeros(size(theta));

h = sigmoid(X*theta);   % sigmoid(z) = 1./(1 + exp(-z))
A = theta(2:end);       % all components except the bias theta(1)
J = ((-y)'*log(h) - (1-y)'*log(1-h))/m + lambda/(2*m)*sum(A.^2);

% Gradient: regularize every component except the bias theta(1)
grad = (X'*(h - y))/m + lambda*(1/m)*theta;
grad(1) = grad(1) - lambda*(1/m)*theta(1);
% ========================================
grad = grad(:);

end
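A NumPy rendering of lrCostFunction (a sketch; the post never defines sigmoid or fmincg, so the standard logistic function is assumed), with a finite-difference check that the analytic gradient matches the cost:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost_function(theta, X, y, lam):
    """Regularized cost J and gradient; theta[0] (the bias) is not regularized."""
    m = y.size
    h = sigmoid(X @ theta)
    reg = (lam / (2 * m)) * np.sum(theta[1:] ** 2)
    J = (-(y @ np.log(h)) - ((1 - y) @ np.log(1 - h))) / m + reg
    grad = (X.T @ (h - y)) / m + (lam / m) * theta
    grad[0] -= (lam / m) * theta[0]   # undo regularization on the bias, as in grad(1) above
    return J, grad

# Finite-difference check on small random data
rng = np.random.default_rng(0)
X = np.hstack([np.ones((5, 1)), rng.normal(size=(5, 3))])
y = np.array([1., 0., 1., 0., 1.])
theta = rng.normal(size=4)
J, grad = lr_cost_function(theta, X, y, 0.1)
eps = 1e-6
num_grad = np.array([(lr_cost_function(theta + eps * e, X, y, 0.1)[0]
                      - lr_cost_function(theta - eps * e, X, y, 0.1)[0]) / (2 * eps)
                     for e in np.eye(4)])
print(np.max(np.abs(grad - num_grad)))   # near zero: analytic and numeric gradients agree
```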
References:
[1] Softmax regression




