Description

Perceptrons and Neural Networks
In this question, you will implement the Perceptron algorithm and compare it with your own implementation of Naive Bayes. If you are unsure whether your implementation is correct, you may compare it with the WEKA/scikit-learn implementations of Naive Bayes. As in Homework 2, the classification task is spam/ham (use the same dataset made available as part of Homework 2).
80 points Implement the perceptron algorithm (use the perceptron training rule, not the gradient descent rule). Your task here is to experiment with different values of the number of iterations and the learning rate. Report the accuracy for 20 suitable combinations of the number of iterations and the learning rate. Repeat your experiment after filtering out the stop words. Compare the accuracy of your perceptron implementation with that of Naive Bayes (implemented in Homework 2).
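As a reminder of what the perceptron training rule (as opposed to gradient descent) looks like, here is a minimal sketch. The function names and the toy OR dataset are illustrative assumptions only; in the homework the features would come from the spam/ham corpus:

```python
import numpy as np

def perceptron_train(X, y, lr=0.1, n_iters=20):
    """Perceptron training rule: w <- w + lr * (y - y_hat) * x,
    applied per example (not gradient descent on a summed loss)."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend a bias feature
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iters):
        for xi, yi in zip(Xb, y):
            y_hat = 1 if xi @ w > 0 else 0  # hard-threshold unit
            w += lr * (yi - y_hat) * xi     # updates only on mistakes
    return w

def perceptron_predict(X, w):
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return (Xb @ w > 0).astype(int)

# Toy linearly separable dataset (logical OR), for illustration only.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])
w = perceptron_train(X, y, lr=0.1, n_iters=20)
print(perceptron_predict(X, w))  # matches y on this separable set
```

Varying `lr` and `n_iters` in such a loop is exactly the experiment the question asks you to run on the spam/ham data.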
20 points Consider the data set given below. Assume that the coordinates of the points are (1,1), (1,-1), (-1,1) and (-1,-1).
Construct a neural network that will have zero training error on this dataset. Write down and explain the solution (no programming is necessary for this part).
(Hint: think XOR. You will need exactly one hidden layer with two hidden nodes.)
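One way to sanity-check a candidate solution is to evaluate it on the four points by hand or in code. The sketch below is illustrative only (the question asks for a written answer, and these hand-chosen thresholds are one of many valid choices); it labels a point 1 exactly when the two coordinates differ:

```python
def step(z):
    # hard-threshold activation: 1 if z > 0, else 0
    return 1 if z > 0 else 0

def xor_net(x1, x2):
    # Inputs are +/-1. Two hidden units feed one output unit.
    h1 = step(x1 + x2 + 1)      # "OR"-style unit: fires unless both inputs are -1
    h2 = step(1 - x1 - x2)      # "NAND"-style unit: fires unless both inputs are +1
    return step(h1 + h2 - 1.5)  # output unit: AND of the two hidden units

for x1, x2 in [(1, 1), (1, -1), (-1, 1), (-1, -1)]:
    print((x1, x2), xor_net(x1, x2))  # 0, 1, 1, 0 respectively
```

Because the network classifies all four points correctly, its training error on this dataset is zero.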
What to Turn in
Your code
README file for compiling and executing your code.
A detailed write-up that contains:
- The accuracy on the test set for different values of the number of iterations and the learning rate, with and without stop-word filtering.
- A comparison of the accuracy across the different models, along with your observations.
- Your written answer to the XOR question.
