INSTRUCTIONS TO CANDIDATES
ANSWER ALL QUESTIONS

MLE, MAP, Concentration (Pengtao)

1. MLE of Uniform Distributions [5 pts]

Given a set of i.i.d. samples X1, . . . , Xn ∼ Uniform(0, θ), find the maximum likelihood estimator of θ.

(a) Write down the likelihood function (3 pts)

(b) Find the maximum likelihood estimator (2 pts)
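A quick numerical illustration (a sketch, not a substitute for the required derivation): the likelihood (1/θ)^n is decreasing in θ, so it is maximized by the smallest θ consistent with all samples, namely max(X1, . . . , Xn). The values of θ_true and n below are arbitrary choices for the demonstration.

```python
import numpy as np

# Numerical illustration: for X1..Xn ~ Uniform(0, theta), the
# likelihood (1/theta)^n is maximized at the smallest theta that
# covers every sample, i.e. the sample maximum.
rng = np.random.default_rng(0)
theta_true = 4.0                       # assumed value for the demo
x = rng.uniform(0.0, theta_true, size=1000)

theta_hat = x.max()                    # MLE candidate
print(theta_hat)                       # close to, but never above, theta_true
```

Note the estimator is always ≤ θ, which is why it is slightly biased downward for finite n.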

 

2. Concentration [5 pts]

The instructors would like to know what percentage of the students like the Introduction to Machine Learning course. Let this unknown quantity, hopefully very close to 1, be denoted by µ. To estimate µ, the instructors created an anonymous survey which contains this question:

 

"Do you like the Intro to ML course? Yes or No"

 

Each student can only answer this question once, and we assume that the distribution of the answers is i.i.d.

 

(a) What is the MLE of µ? (1 pt)

(b) Let the above estimator be denoted by µ̂. How many students should the instructors ask if they want the estimated value µ̂ to be close enough to the unknown µ that

P(|µ̂ − µ| > 0.1) < 0.05? (4 pts)
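One standard route to part (b) is Hoeffding's inequality, P(|µ̂ − µ| > ε) ≤ 2·exp(−2nε²); setting the right-hand side ≤ 0.05 and solving for n gives the required sample size. A small check of that arithmetic (assuming the Hoeffding route — other concentration bounds, e.g. Chebyshev, give different constants):

```python
import math

# Hoeffding: P(|mu_hat - mu| > eps) <= 2 * exp(-2 * n * eps**2) <= delta
# Solving for n:  n >= ln(2 / delta) / (2 * eps**2)
eps, delta = 0.1, 0.05
n = math.ceil(math.log(2 / delta) / (2 * eps**2))
print(n)  # 185
```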

3. MAP of Multinomial Distribution [10 pts]

You have just got a loaded 6-sided dice from your statistician friend. Unfortunately, he does not remember its exact probability distribution p1, p2, . . . , p6. He remembers, however, that he generated the vector (p1, p2, . . . , p6) from the following Dirichlet distribution:

 

P(p1, p2, . . . , p6) = [ Γ(Σ_{i=1}^6 ui) / Π_{i=1}^6 Γ(ui) ] · Π_{i=1}^6 pi^(ui − 1) · δ(Σ_{i=1}^6 pi − 1),

where he chose ui = i for all i = 1, . . . , 6. Here Γ denotes the gamma function, and δ is the Dirac delta. To estimate the probabilities p1, p2, . . . , p6, you roll the dice 1000 times and observe that side i occurred ni times (Σ_{i=1}^6 ni = 1000).

 

(a) Prove that the Dirichlet distribution is a conjugate prior for the multinomial distribution.

(b) What is the posterior distribution of the side probabilities, P(p1, p2, . . . , p6|n1, n2, . . . , n6)?
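As a numerical companion to parts (a) and (b): with a Dirichlet(u) prior and multinomial counts ni, conjugacy says the posterior is Dirichlet(ui + ni), whose mode gives the MAP estimate. The counts below are a hypothetical stand-in for the 1000 observed rolls, not data from the problem.

```python
import numpy as np

# Conjugacy sketch: Dirichlet(u) prior + multinomial counts n_i
# => posterior is Dirichlet(u_i + n_i).
u = np.arange(1, 7)                                 # u_i = i, as in the problem
counts = np.array([100, 150, 150, 200, 200, 200])   # hypothetical counts, sum to 1000

post = u + counts                        # posterior Dirichlet parameters
map_p = (post - 1) / (post.sum() - 6)    # Dirichlet mode (valid since all params > 1)
print(map_p, map_p.sum())                # a probability vector summing to 1
```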

 

Linear Regression (Dani)

1. Optimal MSE rule [10 pts]

Suppose we knew the joint distribution PXY . The optimal rule f ∗ : X → Y which minimizes the MSE (Mean Square Error) is given as:

f∗ = arg min_f E[(f(X) − Y)^2]

Show that f ∗(X) = E[Y |X].
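A simulation sanity check (not the proof being asked for): with Y = X + noise, E[Y|X] = X, and any competing rule should incur a strictly larger empirical MSE. The competing rule f(X) = X/2 below is an arbitrary choice for the comparison.

```python
import numpy as np

# Empirical check that the conditional mean minimizes MSE.
rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
y = x + rng.normal(size=100_000)   # so E[Y|X] = X

mse_cond_mean = np.mean((x - y) ** 2)        # f(X) = E[Y|X] = X
mse_other = np.mean((0.5 * x - y) ** 2)      # an arbitrary competing rule
print(mse_cond_mean, mse_other)              # the first should be smaller
```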

2. Ridge Regression [10 pts]

In class, we discussed ℓ2-penalized linear regression:

β̂ = arg min_β Σ_{i=1}^n (Yi − Xi β)^2 + λ ‖β‖_2^2,

where Xi = [Xi^(1) . . . Xi^(p)].

a) Show that a closed-form expression for the ridge estimator is β̂ = (AᵀA + λI)^(−1) AᵀY, where A = [X1; . . . ; Xn] and Y = [Y1; . . . ; Yn].

b) An advantage of ridge regression is that a unique solution always exists since (ATA+λI) is invertible. To be invertible, a matrix needs to be full rank. Argue that (ATA + λI) is full rank by characterizing its p eigenvalues in terms of the singular values of A and λ.
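Part (b) can be checked numerically: if σi are the singular values of A, the eigenvalues of AᵀA + λI are σi² + λ, all at least λ > 0, so the matrix is full rank and invertible. A sketch with a hypothetical matrix A:

```python
import numpy as np

# Eigenvalues of A^T A + lam*I should equal sigma_i^2 + lam,
# where sigma_i are the singular values of A.
rng = np.random.default_rng(2)
A = rng.normal(size=(50, 5))   # hypothetical design matrix, n=50, p=5
lam = 0.1

sigma = np.linalg.svd(A, compute_uv=False)
eig = np.linalg.eigvalsh(A.T @ A + lam * np.eye(5))
print(np.sort(eig))
print(np.sort(sigma**2 + lam))  # should match the line above
```

Since every eigenvalue is ≥ λ > 0, the determinant is nonzero regardless of the rank of A itself, which is the crux of the argument.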

 

 

 

Logistic Regression (Prashant)

1. Overfitting and Regularized Logistic Regression [10 pts]

a) Plot the sigmoid function 1/(1 + e^(−wX)) vs. X ∈ R for increasing weight w ∈ {1, 5, 100}. A qualitative sketch is enough. Use these plots to argue why a solution with large weights can cause logistic regression to overfit.
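A numerical stand-in for the qualitative sketch: as w grows, the sigmoid approaches a hard step at X = 0, so the model assigns near-certain probabilities everywhere and becomes extremely sensitive to small perturbations of X — the behavior the overfitting argument rests on.

```python
import numpy as np

# Evaluate the sigmoid at a few points for increasing weights.
x = np.linspace(-1, 1, 5)          # [-1, -0.5, 0, 0.5, 1]
for w in (1, 5, 100):
    s = 1 / (1 + np.exp(-w * x))
    print(w, np.round(s, 3))       # w=100 looks like a step function
```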

 

 

 

b) To prevent overfitting, we want the weights to be small. To achieve this, instead of maximum conditional likelihood estimation, M(C)LE, for logistic regression:

max_{w0,...,wd} Π_{i=1}^n P(Yi | Xi, w0, . . . , wd),

we can consider maximum conditional a posteriori, M(C)AP, estimation:

max_{w0,...,wd} Π_{i=1}^n P(Yi | Xi, w0, . . . , wd) · P(w0, . . . , wd),

where P(w0, . . . , wd) is a prior on the weights.

Assuming a standard Gaussian prior N (0, I) for the weight vector, derive the gradient ascent update rules for the weights.
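A sketch of the resulting update, assuming the standard derivation: the N(0, I) prior contributes a −w term to the conditional log-likelihood gradient, giving w ← w + η(Σi Xi(Yi − σ(w·Xi)) − w). The synthetic data and learning rate below are purely illustrative.

```python
import numpy as np

# Gradient ascent on the M(C)AP objective with a N(0, I) prior.
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 3))
w_true = np.array([2.0, -1.0, 0.5])                    # hypothetical true weights
y = (rng.random(200) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

w = np.zeros(3)
eta = 0.01
for _ in range(500):
    p = 1 / (1 + np.exp(-X @ w))       # sigma(w . x_i) for every sample
    w += eta * (X.T @ (y - p) - w)     # likelihood term minus prior pull toward 0
print(w)                               # shrunk toward 0 relative to the MLE
```

The −w term is what distinguishes this from plain M(C)LE gradient ascent: it is equivalent to ℓ2 weight decay.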

 

 
