• [MATLAB] Stanford linear regression and logistic regression exercises


    1. Define a cost function (costFunction) that measures the error.

    2. Fit the parameters theta by minimizing costFunction: run gradient descent for n iterations, updating theta on each iteration so that costFunction decreases.

    3. With the fitted theta in hand, make predictions.

    I. Linear Regression

    computeCost:

    for i=1:m
        h = X(i,:) * theta;       % hypothesis for example i
        J = J + (h - y(i))^2;     % accumulate the squared error
    end
    J = J / (2*m);                % mean squared error cost
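The same cost computation can be sketched in plain Python for comparison (a hypothetical translation, not the assignment code; `X` is a list of rows that already includes the column of ones):

```python
def compute_cost(X, y, theta):
    """Mean squared error cost: J = 1/(2m) * sum_i (x_i . theta - y_i)^2."""
    m = len(y)
    J = 0.0
    for i in range(m):
        # hypothesis for example i: dot product of the row with theta
        h = sum(x_ij * t_j for x_ij, t_j in zip(X[i], theta))
        J += (h - y[i]) ** 2
    return J / (2 * m)

# Toy data: first column is the intercept term
X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
y = [1.0, 2.0, 3.0]
```

With theta = [0, 1] this toy data is fit exactly, so the cost is 0.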

    Gradient descent, fitting the parameters theta:

    for iter = 1:num_iters
    
        grad = zeros(size(theta,1),1);   % renamed from 'sum' to avoid shadowing the built-in sum()
        for j = 1:size(theta,1)
            for i = 1:m
                h = X(i,:) * theta;
                grad(j) = grad(j) + (h - y(i))*X(i,j);
            end
            % theta(j) = theta(j) - alpha * grad(j) / m;
            % wrong here! theta must be updated simultaneously, after the j loop
        end
        
        theta = theta - grad .* alpha ./ m;   % simultaneous update of all theta(j)
    
        % Save the cost J in every iteration    
        J_history(iter) = computeCostMulti(X, y, theta);
    
    end
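The loop above, with its accumulate-then-update structure, can be sketched in Python as well (hypothetical names; the same batch gradient descent with a simultaneous update):

```python
def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent; every theta[j] is updated simultaneously per iteration."""
    m = len(y)
    n = len(theta)
    for _ in range(num_iters):
        grad = [0.0] * n                  # accumulate the full gradient first...
        for i in range(m):
            h = sum(X[i][j] * theta[j] for j in range(n))
            for j in range(n):
                grad[j] += (h - y[i]) * X[i][j]
        # ...then update every component at once (the simultaneous update)
        theta = [theta[j] - alpha * grad[j] / m for j in range(n)]
    return theta

# Toy data for y = 2x; first column is the intercept term
X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
y = [2.0, 4.0, 6.0]
theta = gradient_descent(X, y, [0.0, 0.0], 0.1, 2000)
```

On this toy data theta converges to roughly [0, 2], i.e. the line y = 2x.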

    II. Logistic Regression

    costFunctionReg:

    function [J, grad] = costFunctionReg(theta, X, y, lambda)
    %COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
    %   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
    %   theta as the parameter for regularized logistic regression and the
    %   gradient of the cost w.r.t. to the parameters. 
    
    % Initialize some useful values
    m = length(y); % number of training examples
    
    % You need to return the following variables correctly 
    J = 0;
    grad = zeros(size(theta));
    
    
    for i=1:m
        % h_fun(X(i,:), theta) is the sigmoid hypothesis: sigmoid(X(i,:) * theta)
        J = J - y(i)*log(h_fun(X(i,:), theta)) - (1-y(i))*log(1-h_fun(X(i,:),theta));
    end
    J = J / m;
    reg = 0;
    for j=2:length(theta)   % length(), not size(); theta(1), the intercept, is not regularized
        reg = reg + theta(j)^2;
    end
    reg = reg * lambda /(2*m);
    J = J + reg;
    
    for i=1:m
        grad(1) = grad(1) + (h_fun(X(i,:),theta) - y(i))*X(i,1);
    end
    grad(1) = grad(1) / m;
    for j=2:length(theta)
        for i=1:m
            grad(j) = grad(j) + (h_fun(X(i,:),theta) - y(i)) * X(i,j);
        end
        grad(j) = grad(j) / m;
        grad(j) = grad(j) + lambda*theta(j)/m;   % add the regularization term once, outside the data loop
    end
    
    
    end
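Assuming h_fun is the sigmoid hypothesis sigmoid(X(i,:) * theta), the same regularized cost and gradient can be sketched in Python (hypothetical names; theta[0], the intercept, is not penalized):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_function_reg(theta, X, y, lam):
    """Regularized logistic cost J and gradient; the intercept theta[0] is not regularized."""
    m = len(y)
    n = len(theta)
    # hypothesis for every example
    h = [sigmoid(sum(X[i][j] * theta[j] for j in range(n))) for i in range(m)]
    # cross-entropy cost plus L2 penalty on theta[1:]
    J = sum(-y[i] * math.log(h[i]) - (1 - y[i]) * math.log(1 - h[i]) for i in range(m)) / m
    J += lam / (2 * m) * sum(t ** 2 for t in theta[1:])
    # gradient: data term for every component, penalty only for j >= 1
    grad = [sum((h[i] - y[i]) * X[i][j] for i in range(m)) / m for j in range(n)]
    for j in range(1, n):
        grad[j] += lam * theta[j] / m
    return J, grad

J0, g0 = cost_function_reg([0.0, 0.0], [[1.0, 1.0], [1.0, -1.0]], [1.0, 0.0], 1.0)
```

A quick sanity check: at theta = 0 the hypothesis is 0.5 for every example, so the cost is log(2) regardless of the data, and the penalty term vanishes.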

    Parameter fitting

    % Initialize fitting parameters
    initial_theta = zeros(size(X, 2), 1);
    
    % Set regularization parameter lambda to 0 here (the exercise suggests varying it, e.g. lambda = 1)
    lambda = 0;
    
    % Set Options
    options = optimset('GradObj', 'on', 'MaxIter', 400);
    
    % Optimize
    [theta, J, exit_flag] = ...
        fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);
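Since the options set 'GradObj' to 'on', fminunc trusts the analytic gradient returned by costFunctionReg, so it is worth sanity-checking that gradient against a central-difference approximation first. A small Python sketch of such a check (a hypothetical helper, not part of the original post):

```python
def numerical_grad(f, theta, eps=1e-5):
    """Central-difference approximation of the gradient of f at theta."""
    grad = []
    for j in range(len(theta)):
        plus = theta[:]
        minus = theta[:]
        plus[j] += eps       # nudge one component up...
        minus[j] -= eps      # ...and down, keeping the others fixed
        grad.append((f(plus) - f(minus)) / (2 * eps))
    return grad

# Example: f(theta) = theta0^2 + 3*theta1, whose gradient at [2, 0] is [4, 3]
f = lambda t: t[0] ** 2 + 3 * t[1]
g = numerical_grad(f, [2.0, 0.0])
```

If the analytic and numerical gradients disagree beyond roughly eps^2, the gradient code is suspect.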
  • Original post: https://www.cnblogs.com/549294286/p/3033800.html