• Andrew Ng - AI - Machine Learning Assignment Solutions - Week 3


     =================================================  sigmoid.m  =========================================================================

    function g = sigmoid(z)
    %SIGMOID Compute sigmoid function
    % g = SIGMOID(z) computes the sigmoid of z.

    % You need to return the following variables correctly
    g = zeros(size(z));

    % ====================== YOUR CODE HERE ======================
    % Instructions: Compute the sigmoid of each value of z (z can be a matrix,
    % vector or scalar).


    g = 1 ./ (1 + exp(-z));   % element-wise, so z can be a scalar, vector, or matrix


    % =============================================================

    end
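
     A quick sanity check (a minimal sketch, run at the Octave/MATLAB prompt; the values in the comments are what the formula implies):

    sigmoid(0)              % ans = 0.5000
    sigmoid([-10 0 10])     % approaches 0 for large negative z, 1 for large positive z
    sigmoid([1 2; 3 4])     % works element-wise on matrices as well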

     ===================================================== predict.m =====================================================================

    function p = predict(theta, X)
    %PREDICT Predict whether the label is 0 or 1 using learned logistic
    %regression parameters theta
    % p = PREDICT(theta, X) computes the predictions for X using a
    % threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

    m = size(X, 1); % Number of training examples

    % You need to return the following variables correctly
    p = zeros(m, 1);

    % ====================== YOUR CODE HERE ======================
    % Instructions: Complete the following code to make predictions using
    % your learned logistic regression parameters.
    % You should set p to a vector of 0's and 1's
    %

    p = round(sigmoid(X * theta));   % round maps sigmoid >= 0.5 to 1 and sigmoid < 0.5 to 0, i.e. a threshold at 0.5


    % =========================================================================


    end
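
     A usage sketch (assuming X, y, and theta already hold the exercise's training data and the optimized parameters in the workspace):

    % Minimal sketch: report training-set accuracy by comparing predictions with labels.
    p = predict(theta, X);
    fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);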

     =================================================  costFunction.m  =========================================================================

    function [J, grad] = costFunction(theta, X, y)
    %COSTFUNCTION Compute cost and gradient for logistic regression
    % J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
    % parameter for logistic regression and the gradient of the cost
    % w.r.t. to the parameters.

    % Initialize some useful values
    m = length(y); % number of training examples

    % You need to return the following variables correctly
    J = 0;
    grad = zeros(size(theta));

    % ====================== YOUR CODE HERE ======================
    % Instructions: Compute the cost of a particular choice of theta.
    % You should set J to the cost.
    % Compute the partial derivatives and set grad to the partial
    % derivatives of the cost w.r.t. each parameter in theta
    %
    % Note: grad should have the same dimensions as theta
    %

    J = (-y' * log(sigmoid(X * theta)) - (1 - y)' * log(1 - sigmoid(X * theta))) / m;

    grad = X' * (sigmoid(X * theta) - y) / m;


    % =============================================================

    end
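
     The exercise minimizes this cost with the built-in fminunc rather than hand-written gradient descent. A minimal sketch, assuming X already includes the intercept column of ones and y is the 0/1 label vector:

    % Tell fminunc that costFunction also returns the gradient, and cap the iterations.
    initial_theta = zeros(size(X, 2), 1);
    options = optimset('GradObj', 'on', 'MaxIter', 400);
    [theta, cost] = fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);
    fprintf('Cost at theta found by fminunc: %f\n', cost);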

  • Original article: https://www.cnblogs.com/duenboa/p/10390783.html