Machine Learning

Homework 4

Due: October 12, 2007 (midnight)

No late homework will be accepted.

Total points: 55

  1. (3 points) Run the NaiveBayes classifier on the weather.arff dataset. Use the default parameter settings, and select "Use training set" as the test option. Include in your submission the printed results from WEKA. (A minimal sketch of this run through the WEKA Java API appears after the question list.)
  2. (1 point) What type of distribution does WEKA's NaiveBayes classifier assume for continuous attributes?
  3. (10 points) Redo questions 3 and 4 from HW3, but substitute NaiveBayes for ConjunctiveRule.
  4. (15 points) Redo questions 6 and 7 from HW3, but substitute NaiveBayes for J48.
  5. (8 points) Consider a learning algorithm whose hypothesis space is the set of conjunctions of literals and that outputs the hypothesis with the least (but possibly non-zero) error on the training examples.
    1. What is the size |H| of this hypothesis space for the contact-lenses data set from WEKA? Justify your answer.
    2. Give an upper bound on the number of training examples sufficient to assure, with 95% confidence, that the learned hypothesis will have true error of at most 5%. Show all your work. (The general form of the relevant finite-hypothesis-space bound is recalled after the question list.)
  6. (8 points) Exercise 7.2 from Mitchell's book. Show all work and justify all answers. In your answer to part (b), give your own proof of the VC dimension.
  7. (10 points) Exercise 7.8 from Mitchell's book.
  8. Email me (holder@eecs.wsu.edu) a zip file containing the following:
    1. Text file containing the raw output of the NaiveBayes run on the weather dataset.
    2. Text file containing the raw output of the first experiment above (the analogue of HW3 question 3h).
    3. Raw threshold curve data for NaiveBayes and MultilayerPerceptron on the labor dataset (the two files you saved in HW3, step 6e). A sketch for extracting the ROC plot points from these files appears after the question list.
    4. Nicely-formatted report (MSWord, PDF or PostScript) containing:
      • Answer to question 2.
      • Table summarizing results of experiment in question 3.
      • Nicely-formatted plot of the two ROC curves.
      • Discussion comparing the performance of the two classifiers based on the ROC curves.
      • Answers to questions 5, 6 and 7.
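
Note on question 1: the run can be done entirely in the WEKA Explorer (choose NaiveBayes, keep the defaults, and select "Use training set" under Test options), or from the command line with java weka.classifiers.bayes.NaiveBayes -t weather.arff -T weather.arff, where passing the training file as the test file evaluates on the training data. The sketch below does the same thing through the WEKA Java API; the class name and file path are my own choices, and exact package details may vary slightly across WEKA versions.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import weka.classifiers.Evaluation;
    import weka.classifiers.bayes.NaiveBayes;
    import weka.core.Instances;

    public class RunNaiveBayesOnWeather {               // class name is my own
        public static void main(String[] args) throws Exception {
            // Load weather.arff (adjust the path to your WEKA data directory).
            Instances data = new Instances(new BufferedReader(new FileReader("weather.arff")));
            data.setClassIndex(data.numAttributes() - 1);   // the class attribute ("play") is last

            // Build NaiveBayes with its default parameter settings.
            NaiveBayes nb = new NaiveBayes();
            nb.buildClassifier(data);

            // "Use training set" test option: evaluate on the same instances used for training.
            Evaluation eval = new Evaluation(data);
            eval.evaluateModel(nb, data);

            System.out.println(nb);                       // the learned model
            System.out.println(eval.toSummaryString());   // accuracy and error statistics
        }
    }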
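
Note on question 5b: because the learner may return a hypothesis with non-zero training error, the bound for consistent learners does not apply directly, and the Hoeffding-based bound for finite hypothesis spaces (Mitchell, Chapter 7) is the natural starting point. As a reminder of its general form only,

    m \ge \frac{1}{2\epsilon^{2}} \left( \ln|H| + \ln\frac{1}{\delta} \right),

where \epsilon bounds the gap between true error and sample error and \delta is one minus the confidence. Mapping the question's 5% and 95% onto these symbols, plugging in your |H| from part (a), and justifying each step is the work you are asked to show.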
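
Note on the ROC plot: WEKA's saved threshold-curve files are themselves ARFF files whose attributes include a false positive rate and a true positive rate column; plotting the latter against the former gives the ROC curve. The sketch below dumps those two columns as tab-separated (x, y) pairs for whatever plotting tool you prefer. The file name and class name are my own, and the attribute names I assume ("False Positive Rate", "True Positive Rate") may differ slightly between WEKA versions, so check the header of your saved files.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import weka.core.Attribute;
    import weka.core.Instances;

    public class DumpRocPoints {                        // class name is my own
        public static void main(String[] args) throws Exception {
            // Load one of the saved threshold-curve files (file name is an assumption).
            Instances curve = new Instances(new BufferedReader(new FileReader("nb_labor_threshold.arff")));

            // Look up the two rate columns by name; check your file's header if these come back null.
            Attribute fpr = curve.attribute("False Positive Rate");
            Attribute tpr = curve.attribute("True Positive Rate");

            // One ROC point per threshold setting: x = false positive rate, y = true positive rate.
            for (int i = 0; i < curve.numInstances(); i++) {
                System.out.println(curve.instance(i).value(fpr) + "\t" + curve.instance(i).value(tpr));
            }
        }
    }

Run it once for each of the two saved files and overlay the resulting curves in a single plot for the report.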