
# Let p(x|ωi) ∼ N(µi, σ²) for a two-category one-dimensional problem with P(ω1) = P(ω2) = 1/2

INSTRUCTIONS TO CANDIDATES

Problem 1. Let p(x|ωi) ∼ N(µi, σ²) for a two-category one-dimensional problem with P(ω1) = P(ω2) = 1/2.

• Show that the minimum probability of error is given by

$$P_e = \frac{1}{\sqrt{2\pi}} \int_a^{\infty} e^{-u^2/2}\,du,$$

where $a = |\mu_1 - \mu_2|/(2\sigma)$.

• Use the inequality

$$P_e = \frac{1}{\sqrt{2\pi}} \int_a^{\infty} e^{-t^2/2}\,dt \;\le\; \frac{1}{\sqrt{2\pi}\,a}\, e^{-a^2/2}$$

to show that Pe goes to zero as |µ1 − µ2|/σ goes to infinity.
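A quick numerical check of the tail expression and the bound, using standard-library Python only (the choice σ = 1 and the sample of mean gaps are mine, for illustration):

```python
import math

def gaussian_tail(a):
    """P_e = (1/sqrt(2*pi)) * integral_a^inf e^{-u^2/2} du = 0.5 * erfc(a/sqrt(2))."""
    return 0.5 * math.erfc(a / math.sqrt(2.0))

def tail_bound(a):
    """Upper bound (1/(sqrt(2*pi)*a)) * e^{-a^2/2}, valid for a > 0."""
    return math.exp(-a * a / 2.0) / (math.sqrt(2.0 * math.pi) * a)

sigma = 1.0
for mu_diff in [1.0, 2.0, 4.0, 8.0]:
    a = mu_diff / (2.0 * sigma)   # a = |mu1 - mu2| / (2*sigma)
    pe, bound = gaussian_tail(a), tail_bound(a)
    assert pe <= bound            # the stated inequality holds
    print(f"|mu1-mu2| = {mu_diff}: Pe = {pe:.2e} <= bound = {bound:.2e}")
```

Since the bound itself goes to zero as `a` grows, so does Pe, which is the point of the second part.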

Problem 2. Consider a two-category classification problem in two dimensions with p(x|ω1) ∼ N(0, I), p(x|ω2) ∼ N((1, 1)ᵗ, I), and P(ω1) = P(ω2) = 1/2.

• Calculate the Bayes decision boundary.
• Calculate the Bhattacharyya error bound.
• Repeat the above for the same prior probabilities but with

$$p(\mathbf{x}|\omega_1) \sim N\!\left(\mathbf{0},\, \begin{pmatrix} 2 & 0.5 \\ 0.5 & 2 \end{pmatrix}\right), \qquad p(\mathbf{x}|\omega_2) \sim N\!\left(\begin{pmatrix} 1 \\ 1 \end{pmatrix},\, \begin{pmatrix} 5 & 4 \\ 4 & 5 \end{pmatrix}\right).$$
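A numerical sketch of the Bhattacharyya bound $P(\text{error}) \le \sqrt{P(\omega_1)P(\omega_2)}\,e^{-k(1/2)}$ for both cases, assuming the parameters as reconstructed above (µ2 = (1, 1)ᵗ and the two covariance matrices); pure standard-library Python with hand-rolled 2×2 linear algebra:

```python
import math

def det2(S):
    return S[0][0] * S[1][1] - S[0][1] * S[1][0]

def inv2(S):
    d = det2(S)
    return [[S[1][1] / d, -S[0][1] / d], [-S[1][0] / d, S[0][0] / d]]

def bhattacharyya_bound(mu1, mu2, S1, S2, p1=0.5, p2=0.5):
    """P(error) <= sqrt(p1*p2) * exp(-k(1/2)) for two 2-D Gaussians."""
    Savg = [[(S1[i][j] + S2[i][j]) / 2.0 for j in range(2)] for i in range(2)]
    d = [mu2[0] - mu1[0], mu2[1] - mu1[1]]
    Si = inv2(Savg)
    quad = sum(d[i] * Si[i][j] * d[j] for i in range(2) for j in range(2))
    # k(1/2) = (1/8) d^T Savg^{-1} d + (1/2) ln( det(Savg) / sqrt(det S1 * det S2) )
    k = quad / 8.0 + 0.5 * math.log(det2(Savg) / math.sqrt(det2(S1) * det2(S2)))
    return math.sqrt(p1 * p2) * math.exp(-k)

I2 = [[1.0, 0.0], [0.0, 1.0]]
b1 = bhattacharyya_bound([0, 0], [1, 1], I2, I2)   # identity covariances
b2 = bhattacharyya_bound([0, 0], [1, 1],
                         [[2, 0.5], [0.5, 2]], [[5, 4], [4, 5]])
print(f"Bhattacharyya bound, case 1: {b1:.4f}; case 2: {b2:.4f}")
```

In the identity-covariance case the Mahalanobis term is $(1/8)\cdot 2 = 1/4$ and the log-determinant term vanishes, so the bound is $0.5\,e^{-1/4}$ exactly.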

Problem 3. Suppose that we have three categories in two dimensions with the following underlying distributions:

• p(x|ω1) ∼ N(0, I)
• p(x|ω2) ∼ N((1, 1)ᵗ, I)
• p(x|ω3) ∼ ½ N((0.5, 0.5)ᵗ, I) + ½ N((−0.5, 0.5)ᵗ, I)

with P(ωi) = 1/3, i = 1, 2, 3.

• By explicit calculation of posterior probabilities, classify the point x = (0.3, 0.3)ᵗ for minimum probability of error.
• Suppose that for a particular test point the first feature is missing. That is, classify x = (∗, 0.3)ᵗ.
• Suppose that for a particular test point the second feature is missing. That is, classify x = (0.3, ∗)ᵗ.
• Repeat all of the above for x = (0.2, …)ᵗ.
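A sketch of the posterior calculation for the first two parts, assuming the mixture means (0.5, 0.5)ᵗ and (−0.5, 0.5)ᵗ as reconstructed from the fragments in the statement. Because every covariance is I, each 2-D density factors into 1-D Gaussians, and marginalizing out a missing feature just drops the corresponding factor:

```python
import math

def g(x, mu):
    """1-D unit-variance Gaussian density."""
    return math.exp(-(x - mu) ** 2 / 2.0) / math.sqrt(2.0 * math.pi)

def densities(x1, x2):
    """Class-conditional densities at (x1, x2); identity covariances factorize."""
    p1 = g(x1, 0.0) * g(x2, 0.0)
    p2 = g(x1, 1.0) * g(x2, 1.0)
    p3 = 0.5 * g(x1, 0.5) * g(x2, 0.5) + 0.5 * g(x1, -0.5) * g(x2, 0.5)
    return [p1, p2, p3]

def classify(ps):
    # equal priors 1/3, so the largest likelihood gives the largest posterior
    return 1 + ps.index(max(ps))

full = classify(densities(0.3, 0.3))

# first feature missing: marginalize x1 out; both omega_3 components share x2-mean 0.5
marg = [g(0.3, 0.0), g(0.3, 1.0), g(0.3, 0.5)]
missing_x1 = classify(marg)
print(f"x = (0.3, 0.3) -> omega_{full};  x = (*, 0.3) -> omega_{missing_x1}")
```

Note how the answer changes once x1 is unobserved: ω3's second-coordinate mean (0.5) sits closest to the remaining evidence x2 = 0.3.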

Problem 4. Consider two normal distributions with arbitrary but equal covariance matrices. Prove that the Fisher linear discriminant, for a suitable choice of threshold, can be derived from the negative of the log-likelihood ratio.
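A small numeric sanity check of this equivalence, with a shared covariance and means of my own choosing (not taken from the problem): for equal covariances the log-likelihood ratio is linear in x, and its gradient equals the Fisher direction Σ⁻¹(µ1 − µ2).

```python
def inv2(S):
    d = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    return [[S[1][1] / d, -S[0][1] / d], [-S[1][0] / d, S[0][0] / d]]

def matvec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1], M[1][0] * v[0] + M[1][1] * v[1]]

def quad(Si, v):
    return sum(v[i] * Si[i][j] * v[j] for i in range(2) for j in range(2))

# arbitrary example parameters (illustrative only)
Sigma = [[2.0, 0.6], [0.6, 1.0]]
mu1, mu2 = [1.0, 2.0], [-1.0, 0.5]
diff = [mu1[0] - mu2[0], mu1[1] - mu2[1]]
Si = inv2(Sigma)

def llr(x):
    """ln p(x|w1) - ln p(x|w2); the normalizing constants cancel."""
    d1 = [x[0] - mu1[0], x[1] - mu1[1]]
    d2 = [x[0] - mu2[0], x[1] - mu2[1]]
    return -0.5 * quad(Si, d1) + 0.5 * quad(Si, d2)

# the LLR is exactly linear in x, so unit differences recover its gradient exactly
g0 = llr([1.0, 0.0]) - llr([0.0, 0.0])
g1 = llr([0.0, 1.0]) - llr([0.0, 0.0])

w_fisher = matvec(Si, diff)   # Fisher direction Sigma^{-1}(mu1 - mu2)
assert abs(g0 - w_fisher[0]) < 1e-12 and abs(g1 - w_fisher[1]) < 1e-12
print("LLR gradient matches Fisher direction:", w_fisher)
```

Thresholding the LLR at a constant is therefore the same decision rule as projecting onto the Fisher direction and thresholding, which is what the problem asks you to prove in general.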

Problem 5. It is easy to see that the nearest-neighbor error rate P can equal the Bayes rate P∗ if P∗ = 0 (the best possibility) or if P∗ = (c − 1)/c (the worst possibility, c being the number of categories). One might ask whether or not there are problems for which P = P∗ when P∗ is between these extremes.

• Show that the Bayes rate for the one-dimensional case where P(ωi) = 1/c and p(x|ωi) = … is P∗ = r.
• Show that for this case the nearest-neighbor rate is P = P∗.

Problem 6. Prove that the computational complexity of the basic nearest-neighbor editing algorithm for n points in d dimensions is $O(d^3 n^{\lfloor d/2 \rfloor} \ln n)$.

Nearest-Neighbor Editing Algorithm

1: begin initialize j ← 0, D ← data set, n ← #prototypes
2:   construct the full Voronoi diagram of D
3:   do j ← j + 1 (for each prototype x′j)
4:     find the Voronoi neighbors of x′j
5:     if any neighbor is not from the same class as x′j, then mark x′j
6:   until j = n
7:   discard all points that are not marked
8:   construct the Voronoi diagram of the remaining (marked) prototypes
9: end
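The marking step needs each prototype's Voronoi neighbors. Building the full Voronoi diagram is involved, so here is a brute-force sketch that substitutes Gabriel neighbors (pairs whose diametral circle contains no third point), which in general position form a subset of the Voronoi-neighbor pairs; the data set and labels are invented for illustration:

```python
from itertools import combinations

def gabriel_neighbors(pts, i, j):
    """True if no other point lies strictly inside the circle whose diameter
    is the segment pts[i]-pts[j] (the Gabriel-graph condition)."""
    (xi, yi), (xj, yj) = pts[i], pts[j]
    mx, my = (xi + xj) / 2.0, (yi + yj) / 2.0
    r2 = ((xi - xj) ** 2 + (yi - yj) ** 2) / 4.0
    return all((x - mx) ** 2 + (y - my) ** 2 >= r2
               for k, (x, y) in enumerate(pts) if k not in (i, j))

def edit_prototypes(pts, labels):
    """Keep only prototypes with a (Gabriel-approximated) Voronoi neighbor
    of a different class, mirroring steps 3-7 of the algorithm above."""
    marked = set()
    for i, j in combinations(range(len(pts)), 2):
        if labels[i] != labels[j] and gabriel_neighbors(pts, i, j):
            marked.update((i, j))
    return [(pts[k], labels[k]) for k in sorted(marked)]

pts = [(0, 0), (1, 0), (2, 0), (3, 0)]
labels = ['A', 'A', 'B', 'B']
kept = edit_prototypes(pts, labels)
print(kept)   # only the two prototypes adjacent to the class boundary survive
```

The interior prototypes of each class are discarded because all of their (approximate) Voronoi neighbors share their label, which is exactly the intuition behind the editing algorithm: only points near the decision boundary matter.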

Problem 7. Consider classifiers based on samples drawn with priors P(ω1) = P(ω2) = 0.5 from the distributions

$$p(x|\omega_1) = \begin{cases} 2x, & 0 \le x \le 1 \\ 0, & \text{elsewhere} \end{cases} \qquad p(x|\omega_2) = \begin{cases} 2 - 2x, & 0 \le x \le 1 \\ 0, & \text{elsewhere} \end{cases}$$

• What is the Bayes decision rule and the Bayes classification error?
• Suppose we randomly select a single point from ω1 and a single point from ω2, and create a nearest-neighbor classifier from these two points. Suppose too that we select a test point from one of the categories (ω1 for definiteness). Integrate to find the expected error rate P1(e).
• Repeat with two training samples from each category and a single test point in order to find P2(e).
• Generalize to show that in general

$$P_n(e) = \frac{1}{3} + \frac{1}{(n+1)(n+3)} + \frac{1}{2(n+2)(n+3)}.$$

Confirm that this formula makes sense in the n = 1 case.

• Compare $\lim_{n\to\infty} P_n(e)$ with the Bayes error.
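For these densities p(x) ≡ 1 on [0, 1] and P(ω1|x) = x, so the asymptotic nearest-neighbor rate is $\int_0^1 2x(1-x)\,dx = 1/3$, versus the Bayes error 1/4. A Monte Carlo sketch of the finite-n case (sample sizes, trial count, and seed are my own choices):

```python
import random

# Bayes rule: decide w1 iff 2x > 2 - 2x, i.e. x > 1/2.
# Bayes error = 2 * integral_0^{1/2} (1/2)(2x) dx = 1/4.
bayes_error = 0.25

def sample(cls):
    """Inverse-CDF sampling: F1(x) = x^2, F2(x) = 1 - (1 - x)^2."""
    u = random.random()
    return u ** 0.5 if cls == 1 else 1.0 - (1.0 - u) ** 0.5

def nn_error_estimate(n, trials=10000, rng_seed=0):
    """Monte Carlo estimate of P_n(e): n training samples per class, a test
    point from a random class, classified by its single nearest neighbor."""
    random.seed(rng_seed)
    errors = 0
    for _ in range(trials):
        train = [(sample(1), 1) for _ in range(n)] + [(sample(2), 2) for _ in range(n)]
        true_cls = random.choice([1, 2])
        x = sample(true_cls)
        pred = min(train, key=lambda t: abs(t[0] - x))[1]
        errors += (pred != true_cls)
    return errors / trials

est = nn_error_estimate(n=50)
print(f"estimated nearest-neighbor error at n = 50: {est:.3f}; Bayes error = {bayes_error}")
```

For moderately large n the estimate should sit near 1/3, comfortably between the Bayes error 1/4 and the bound 2P∗(1 − P∗) = 3/8.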
