Pattern Recognition and Machine Learning: Labs


CENTRAL SOUTH UNIVERSITY

Machine Learning Course Design Report

Topic: Labs 3, 4, 5, and 8

Student name:

Class and student ID:

Advisor:

Date: 2014.11.29

Lab 3: Multi-class Classification and Neural Networks

1.1 Dataset

Problem:

These matrices can be read directly into your program by using the load command.

Answer:

octave-3.2.4.exe:7> load('ex3data1')   % or: load ex3data1.mat
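As a quick sanity check (assuming the data file defines the variables X and y, as in the original exercise), the dimensions of the loaded matrices can be inspected:

size(X)   % expected: 5000 400  (5000 examples, each a 20x20 pixel image unrolled into 400 features)
size(y)   % expected: 5000 1    (labels 1..10, where label 10 stands for the digit 0)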

1.2 Visualizing the data

Problem:

You are encouraged to examine the code to see how it works.

Solution:

octave-3.2.4.exe:7> run ex3
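The visualization step in ex3.m essentially selects 100 random training examples and displays them as a grid of 20x20 images; a sketch of how the supplied helper displayData.m is invoked (assuming X is already loaded):

m = size(X, 1);
rand_indices = randperm(m);            % shuffle the example indices
sel = X(rand_indices(1:100), :);       % keep 100 random rows of X
displayData(sel);                      % render them as a 10x10 grid of digit images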

1.3 Vectorizing Logistic Regression

Problem:

In this section, you will implement a vectorized version of logistic regression that does not employ any for loops.

Your job is to write the unregularized cost function in the file lrCostFunction.m

Your implementation should use the strategy we presented above to calculate θᵀx⁽ⁱ⁾. You should also use a vectorized approach for the rest of the cost function. A fully vectorized version of lrCostFunction.m should not contain any loops.

Solution:

Open lrCostFunction.m and add the following code where it says "YOUR CODE HERE":

% YOUR CODE HERE

% hypothesis h_theta(x) for all m examples at once
sigm = sigmoid(X * theta);
% regularized cross-entropy cost; theta(1), the bias term, is not regularized
J = sum(-y .* log(sigm) - (1 - y) .* log(1 - sigm)) / m + lambda * sum(theta(2:end).^2) / (2 * m);
% vectorized gradient, then add regularization to every component except theta(1)
grad = X' * (sigm - y) / m;
grad0 = grad(1);
grad = grad + (lambda / m) * theta;
grad(1) = grad0;

Principle:
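Written out, the code above implements the standard regularized logistic regression cost and its vectorized gradient (standard formulation, not quoted from the report):

J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\left[-y^{(i)}\log h_\theta(x^{(i)}) - (1-y^{(i)})\log\left(1-h_\theta(x^{(i)})\right)\right] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2

where h_\theta(x) = \sigma(\theta^T x) and \sigma(X\theta) computes h_\theta(x^{(i)}) for all examples at once. The gradient in vectorized form is

\nabla_\theta J = \frac{1}{m} X^T\left(\sigma(X\theta) - y\right) + \frac{\lambda}{m}\,\theta,

with the \theta_0 component excluded from regularization, which is why the code saves and restores grad(1).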

1.3.2 Vectorizing the gradient

Problem:

You should now implement Equation 1 to compute the correct vectorized gradient. Once you are done, complete the function lrCostFunction.m by implementing the gradient.

Solution:

Open oneVsAll.m and add the following code below "YOUR CODE HERE". (The vectorized gradient itself is the grad computation already shown in lrCostFunction.m above; the code below uses that function to train the one-vs-all classifiers.)

% YOUR CODE HERE

for k = 1:num_labels
  initial_theta = zeros(n + 1, 1);                      % start each binary classifier from zeros
  options = optimset('GradObj', 'on', 'MaxIter', 50);   % fmincg uses the gradient from lrCostFunction
  % train classifier k on the binary problem "class k vs. the rest"
  [theta] = fmincg(@(t)(lrCostFunction(t, X, (y == k), lambda)), initial_theta, options);
  all_theta(k, :) = theta';                             % store the learned parameters as row k
end
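For reference, ex3.m invokes this function roughly as follows (num_labels = 10 and lambda = 0.1 are the values used by the exercise script):

num_labels = 10;                                   % one classifier per digit class (10 stands for the digit 0)
lambda = 0.1;                                      % regularization strength used by ex3.m
all_theta = oneVsAll(X, y, num_labels, lambda);    % trains one logistic regression classifier per class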

1.3.3 Vectorizing regularized logistic regression

Problem:

Now modify your code in lrCostFunction to account for regularization.

Once again, you should not put any loops into your code.

You should now complete the code in predictOneVsAll.m to use the one-vs-all classifier to make predictions. Once you are done, ex3.m will call your predictOneVsAll function using the learned value of Θ. You should see that the training set accuracy is about 94.9%.

Solution: open predictOneVsAll.m and add the following code where it says "YOUR CODE HERE":

% YOUR CODE HERE

% for each example, pick the class whose classifier outputs the highest probability
[c, i] = max(sigmoid(X * all_theta'), [], 2);
p = i;

The final result is as follows:
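The accuracy reported by ex3.m is computed roughly as below (a sketch assuming the usual ex3.m variables all_theta, X and y are in the workspace); it should print a training set accuracy of about 94.9%, as stated in the exercise:

pred = predictOneVsAll(all_theta, X);                                    % predicted class for each example
fprintf('Training Set Accuracy: %f\n', mean(double(pred == y)) * 100);   % expected: about 94.9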

2 Neural Networks

Task:

In this part of the exercise, you will implement a neural network to recognize handwritten digits using the same training set as before.

The provided script, ex3_nn.m, will help you step through this exercise.

Question:

You have been provided with a set of network parameters (Θ(1), Θ(2)) already trained by us. These are stored in ex3weights.mat and will be loaded by ex3_nn.m into Theta1 and Theta2. The parameters have dimensions that are sized for a neural network with 25 units in the second layer and 10 output units.

Answer:

load('ex3weights.mat');   % defines Theta1 (25 x 401) and Theta2 (10 x 26)
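The remainder of this part of the exercise asks for the feedforward pass in predict.m. A minimal sketch is shown below, assuming X from ex3data1.mat is in the workspace and Theta1 / Theta2 have the dimensions loaded above:

m = size(X, 1);
a1 = [ones(m, 1) X];                          % input layer with bias unit (m x 401)
a2 = [ones(m, 1) sigmoid(a1 * Theta1')];      % hidden layer activations with bias (m x 26)
a3 = sigmoid(a2 * Theta2');                   % output layer activations (m x 10)
[maxval, p] = max(a3, [], 2);                 % predicted label = index of the largest output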
