Andrew Ng Coursera ML Lecture 5 Summary + Assignment Answers


Preface

Learn in order to apply, and apply in order to learn: these notes consolidate what was covered and review the newly introduced concepts.

Contents

  • Preface
  • Contents
  • Main Text
    • Introducing the Model
    • Decision Boundary
    • Cost Function
    • Multiclass Classification
  • Assignment Answers

Main Text

This lecture covers logistic regression for classification.

Introducing the Model

Motivating problem: when an email arrives, how can the computer automatically decide whether to file it as spam, saving us the time of reading it?
Example: classifying tumors as benign or malignant from tumor size; if the computed value is >= 0.5, the tumor is predicted to be malignant.
Because we want the output to lie between 0 and 1, we choose the sigmoid function as the mapping; its S-shaped curve squashes every real input into the interval (0, 1).
The model's output is interpreted as a probability: how likely the predicted result is.
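
In the course's notation the hypothesis composes a linear model with the sigmoid, and its output is read as the probability P(y = 1 | x; θ):

h_\theta(x) = g(\theta^T x), \qquad g(z) = \frac{1}{1 + e^{-z}}, \qquad 0 < g(z) < 1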

Decision Boundary

A closer look at the logistic regression model: which raw inputs x map to the prediction y = 1 is determined by the intermediate value z = θᵀx.

The decision boundary, i.e. the separating hyperplane, is the dividing surface between the positive and negative classes in feature space.
The boundary is not necessarily a straight line; for nonlinear problems it can be a curve.
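
Concretely, y = 1 is predicted exactly when h_\theta(x) \ge 0.5, i.e. when \theta^T x \ge 0, so the decision boundary is the level set \theta^T x = 0. With polynomial features the same rule produces curved boundaries; for example

h_\theta(x) = g(\theta_0 + \theta_1 x_1 + \theta_2 x_2 + \theta_3 x_1^2 + \theta_4 x_2^2), \qquad \theta = (-1,\, 0,\, 0,\, 1,\, 1)^T

predicts y = 1 outside the unit circle x_1^2 + x_2^2 = 1, which is its decision boundary.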

Cost Function

To choose suitable parameters we need a suitable cost function, and it must be convex so that gradient descent can reach the global minimum; plugging the sigmoid into the squared-error cost of linear regression would make the cost non-convex.

An intuitive picture of the per-example cost, case y = 1: the cost -log(h_θ(x)) is zero when the prediction is 1 and grows without bound as the prediction approaches 0.
The symmetric picture for case y = 0: the cost is -log(1 - h_θ(x)).
Combining the two cases into a single expression gives the final form of the cost function, written out below.
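
Written out, the combined cost over m training examples is the cross-entropy form that costFunction.m below implements:

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + \left(1 - y^{(i)}\right) \log\left(1 - h_\theta(x^{(i)})\right) \right]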

Gradient descent is the optimization algorithm this program uses to minimize the cost function; its update rule is given below.
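
Each step updates every parameter simultaneously, with learning rate α; the partial derivatives are exactly the grad vector that costFunction.m returns:

\theta_j := \theta_j - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}

In the assignment itself the minimization is delegated to fminunc, which consumes this same gradient rather than running a hand-written descent loop.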

Multiclass Classification

In a multiclass problem the decision boundaries partition the feature space into one region per class.
Multiclass classification is implemented with n single-class (one-vs-all) binary classifiers, as in the sketch below.
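
A minimal one-vs-all prediction sketch in Octave/MATLAB (the variable names all_theta and probs are hypothetical, not part of the assignment files): each classifier scores every example, and the predicted class is the one whose classifier is most confident.

% One-vs-all prediction (sketch; all_theta is a hypothetical name)
% all_theta : K x (n+1) matrix, row k holds the fitted parameters of
%             the binary classifier for class k
% X         : m x (n+1) design matrix whose first column is all ones
probs  = sigmoid(X * all_theta');   % m x K, probs(i,k) = confidence example i is class k
[~, p] = max(probs, [], 2);         % predicted label = index of the most confident classifier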

Assignment Answers

ex2.m

%% Machine Learning Online Class - Exercise 2: Logistic Regression
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the logistic
%  regression exercise. You will need to complete the following functions
%  in this exercise:
%
%     sigmoid.m
%     costFunction.m
%     predict.m
%     costFunctionReg.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% Load Data
%  The first two columns contain the exam scores and the third column
%  contains the label.

data = load('ex2data1.txt');
X = data(:, [1, 2]); y = data(:, 3);

%% ==================== Part 1: Plotting ====================
%  We start the exercise by first plotting the data to understand
%  the problem we are working with.

fprintf(['Plotting data with + indicating (y = 1) examples and o ' ...
         'indicating (y = 0) examples.\n']);

plotData(X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============ Part 2: Compute Cost and Gradient ============
%  In this part of the exercise, you will implement the cost and gradient
%  for logistic regression. You need to complete the code in
%  costFunction.m

%  Setup the data matrix appropriately, and add ones for the intercept term
[m, n] = size(X);

% Add intercept term to x and X_test
X = [ones(m, 1) X];

% Initialize fitting parameters
initial_theta = zeros(n + 1, 1);

% Compute and display initial cost and gradient
[cost, grad] = costFunction(initial_theta, X, y);

fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Expected cost (approx): 0.693\n');
fprintf('Gradient at initial theta (zeros): \n');
fprintf(' %f \n', grad);
fprintf('Expected gradients (approx):\n -0.1000\n -12.0092\n -11.2628\n');

% Compute and display cost and gradient with non-zero theta
test_theta = [-24; 0.2; 0.2];
[cost, grad] = costFunction(test_theta, X, y);

fprintf('\nCost at test theta: %f\n', cost);
fprintf('Expected cost (approx): 0.218\n');
fprintf('Gradient at test theta: \n');
fprintf(' %f \n', grad);
fprintf('Expected gradients (approx):\n 0.043\n 2.566\n 2.647\n');

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============= Part 3: Optimizing using fminunc =============
%  In this exercise, you will use a built-in function (fminunc) to find the
%  optimal parameters theta.

% Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Run fminunc to obtain the optimal theta
% This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

% Print theta to screen
fprintf('Cost at theta found by fminunc: %f\n', cost);
fprintf('Expected cost (approx): 0.203\n');
fprintf('theta: \n');
fprintf(' %f \n', theta);
fprintf('Expected theta (approx):\n');
fprintf(' -25.161\n 0.206\n 0.201\n');

% Plot Boundary
plotDecisionBoundary(theta, X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============== Part 4: Predict and Accuracies ==============
%  After learning the parameters, you'd like to use it to predict the outcomes
%  on unseen data. In this part, you will use the logistic regression model
%  to predict the probability that a student with score 45 on exam 1 and
%  score 85 on exam 2 will be admitted.
%
%  Furthermore, you will compute the training and test set accuracies of
%  our model.
%
%  Your task is to complete the code in predict.m

%  Predict probability for a student with score 45 on exam 1
%  and score 85 on exam 2
prob = sigmoid([1 45 85] * theta);
fprintf(['For a student with scores 45 and 85, we predict an admission ' ...
         'probability of %f\n'], prob);
fprintf('Expected value: 0.775 +/- 0.002\n\n');

% Compute accuracy on our training set
p = predict(theta, X);

fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);
fprintf('Expected accuracy (approx): 89.0\n');
fprintf('\n');

sigmoid.m

function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1 ./ (1 + exp(-z));

% =============================================================

end
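
A quick sanity check at the Octave/MATLAB prompt (the values follow directly from the definition):

sigmoid(0)              % exactly 0.5
sigmoid(100)            % approaches 1 for large positive inputs
sigmoid(-100)           % approaches 0 for large negative inputs
sigmoid([0 100 -100])   % element-wise, so vectors and matrices work too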

costFunction.m

function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%

% Accumulate the cross-entropy cost over all m examples
error = 0;
for i = 1:m
    error = error - y(i)*log(sigmoid(X(i,:)*theta)) - (1-y(i))*log(1-sigmoid(X(i,:)*theta));
end
J = error/m;

% Gradient: for each parameter j, average (h - y) * x_j over the examples
for j = 1:length(theta)
    factor = 0;
    for i = 1:m
        factor = factor + (sigmoid(X(i,:)*theta) - y(i))*X(i,j);
    end
    grad(j) = factor/m;
end

% =============================================================

end
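
For reference, the same cost and gradient can be computed without explicit loops; a minimal vectorized sketch, equivalent to the loop implementation above:

% Vectorized equivalent of the loops in costFunction.m (a sketch)
h    = sigmoid(X * theta);                           % m x 1 vector of predictions
J    = (-y' * log(h) - (1 - y)' * log(1 - h)) / m;   % cross-entropy cost
grad = (X' * (h - y)) / m;                           % (n+1) x 1 gradient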

predict.m

function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1); % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters.
%               You should set p to a vector of 0's and 1's
%

% Threshold at 0.5, matching the spec above (the original used a strict >)
p = sigmoid(X*theta) >= 0.5;

% =============================================================

end

Summary

The above is the complete summary of Andrew Ng's Coursera ML lecture 5, together with the assignment answers; I hope it helps with the problems you encounter.
