My homework solutions for the Machine Learning course at KU (University of Copenhagen), Fall 2018.

The code and reports are provided as-is, for educational purposes, with no guarantee of correctness.

Below is the feedback I received on the assignments, which I did not have time to incorporate.

Assignment 1:

- Question 1: 4/5. The sample space is not formally defined.
- Question 2: 21/25. In 2.3, it is not explained what it means for the right-hand side not to be an integer.
- Question 3: 5/5.
- Question 4: 30/30.
- Question 5: 1/5. You need to explain how you would organize the majority vote in this question.
- Question 6: 30/30. OK.

Assignment 2:

- Question 1: 10
- Question 2: 10
- Question 3: 8. The calculations in 3.2 and 3.3 are wrong, and 3.4 is missing.
- Question 4: 12. The calculation in 4.2 is not very accurate.
- Question 5: 50

Assignment 3:

- Question 1: 4. When $n \to \infty$, the series $\sum_{i = 1}^{n} X_{i}$ will not converge to $\mu = 0.5$ (see the note after this list).
- Question 2: 30. In 2.2, there are $M$ hypotheses. In 2.3c, for the case of $L(\hat{h}^{*}, S^{2}_{val})$, since there is only one hypothesis and the sample size is $\frac{n}{2}$, the bound follows by applying Theorem 3.1.
- Question 3: 25
- Question 4: 35
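
A note on the first comment (my addition, not part of the grader's feedback): assuming the $X_i$ are i.i.d. with values in $[0, 1]$ and mean $\mu = 0.5$ (e.g., fair coin flips), the law of large numbers applies to the empirical average, not to the unnormalized sum:

$$\frac{1}{n}\sum_{i=1}^{n} X_{i} \to \mu = 0.5 \quad \text{(almost surely)}, \qquad \text{whereas} \qquad \sum_{i=1}^{n} X_{i} \approx \mu n \to \infty \quad \text{as } n \to \infty.$$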

Assignment 4:

See the annotations on the PDF.

Assignment 5:

Question 1: 37/40 points.
- 1.2) Your bound is correct and well reasoned, but we asked you for a bound that holds with probability at least $1 - \delta$. (-3 pts)

Question 2: 24/30 points.
- Statement 2: Your argumentation is good, but you can reason about this more formally by upper bounding the (uniform) prior distribution of a hypothesis class with infinite VC-dimension by $2^{-n}$ and solving the Occam's razor bound (see the sketch below).
- Statement 3: Theorem 3.18 does not state any requirements on the distribution. In the proof we use a uniform distribution because it is mathematically convenient, but the general notion holds for other distributions as well. Splitting a dataset into two evenly sized parts is not a requirement on the distribution either. (-6 pts)
- Statement 4: The example with the 1-NN classifier is good, but could use more explanation.

Question 3: OK. 30/30 points.

Total: 91/100
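
A sketch of the argument the grader hints at for Statement 2 (my addition; the exact form of the Occam's razor bound below is assumed from standard course notes, not quoted from the assignment): with a prior $\pi$ over the hypothesis class $\mathcal{H}$, the bound states that with probability at least $1 - \delta$, simultaneously for all $h \in \mathcal{H}$,

$$L(h) \le \hat{L}(h, S) + \sqrt{\frac{\ln\frac{1}{\pi(h)} + \ln\frac{1}{\delta}}{2n}}.$$

A class with infinite VC-dimension shatters every set of $n$ points, so it contains at least $2^{n}$ hypotheses that behave differently on the sample; a uniform prior can therefore give each of them mass at most $\pi(h) \le 2^{-n}$, and the complexity term is at least

$$\sqrt{\frac{n\ln 2 + \ln\frac{1}{\delta}}{2n}} \ge \sqrt{\frac{\ln 2}{2}} \approx 0.59,$$

which does not shrink as $n$ grows, so the bound stays vacuous no matter how much data we have.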

Assignment 6:

None provided.

Assignment 7:

None provided.
