Data Science 101: Machine Learning, Part 4

MACHINE LEARNING SERIES – PART 4

The “How Machine Learning Works” lecture series continues, building on the Bayesian classifier developed in Part 3. This installment develops an expectation-maximization (EM) algorithm that locally maximizes the likelihood function. It then walks through an improved Python implementation from the last lecture, in which the size of the training set was reduced dramatically and the convergence of EM was measured. The results show that using the test data set to improve the model substantially increases the algorithm’s accuracy. This lecture is presented by BloomReach engineer Srinath Sridha.
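To give a feel for the kind of algorithm discussed in the lecture, here is a minimal sketch of EM for a two-component 1-D Gaussian mixture. This is an illustrative example only, not the implementation from the lecture; the function name `em_gmm_1d` and the initialization scheme are assumptions for the sketch.

```python
import math

def em_gmm_1d(data, iters=50):
    """Minimal EM for a two-component 1-D Gaussian mixture.

    Illustrative sketch (not the lecture's code). Returns
    (weights, means, variances) after `iters` EM updates.
    """
    # Crude initialization: split the sorted data in half.
    xs = sorted(data)
    half = len(xs) // 2
    mu = [sum(xs[:half]) / half, sum(xs[half:]) / (len(xs) - half)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]

    def pdf(x, m, v):
        # Gaussian density N(x; m, v).
        return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

    for _ in range(iters):
        # E-step: responsibilities r[k] = P(component k | x) for each point.
        resp = []
        for x in data:
            p = [w[k] * pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, and variances from the
        # responsibility-weighted data. Each update locally increases
        # the likelihood of the data under the mixture model.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return w, mu, var
```

Alternating these E- and M-steps never decreases the likelihood, which is why EM converges to a local maximum rather than a guaranteed global one.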

