Intuitively Understand Maximum Likelihood


We All Routinely Use Maximum Likelihood Estimation

Imagine a scenario where you have the most delicious chocolate ice cream in the fridge, to be served over dinner. Imagine also that you are taking care of four children (Alpha, Beta, Gamma, and Delta) whom you have specifically instructed not to eat from the ice cream in the fridge during the day. You go out to get groceries. On your return, you discover that one of the four children has taken a bite from the ice cream. Your task is now to determine which of the four children ate the ice cream.

How would you go about identifying the child who ate the ice cream? Let’s also assume that none of the four children is cooperating. You now have to work like a detective gathering clues (i.e. collecting data). Before you collect any data, you come up with a model, which is:

[child’s name] ate from the ice cream.

This model has one free parameter, [child’s name]. If you don’t collect any further clues (data), your best estimate is that the free parameter in the model can take any of the four child names with equal probability (i.e. 0.25). However, you do end up collecting several clues. Let’s call those clues x1, x2, …, xn, meaning you have collected a total of n clues. Once you consider all the clues you have collected, the probability of each child having eaten the ice cream will change.
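This starting point can be sketched in a few lines of code. The model below is just the one-parameter model from the story, with a uniform probability over the four possible parameter values before any clues arrive:

```python
# The model has one free parameter: which child ate the ice cream.
children = ["Alpha", "Beta", "Gamma", "Delta"]

# Before any clues are collected, every value of the parameter is
# equally probable.
prior = {child: 1 / len(children) for child in children}

print(prior)  # each child gets probability 0.25
```

Collecting clues is what moves you away from this uniform starting point.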

Making use of the clues (the data you collected) to identify the child is done through maximum likelihood. This is what you would do when attempting to identify the correct child. Pay attention now!

In your head, you will say:

Given all the data I have now observed, what is the likelihood that Alpha ate the ice cream?

Then, you will repeat the same question for Beta:

Given all the data I have now observed, what is the likelihood that Beta ate the ice cream?

And then, you will repeat it for Gamma and Delta.

How would you conclude and reach an answer? Well, you just pick the child for whom the likelihood is the highest.

In other words, you will pick the answer that gives you the maximum likelihood.

That’s it. You have just gone through a maximum likelihood estimation procedure. To recap: you chose a model that had a parameter. The final answer required choosing a specific value for that parameter. You observed some data. You then computed the likelihood of that data for each possible value of the parameter. Finally, you chose the parameter value associated with the highest likelihood. In other words, you chose the answer that resulted in the maximum possible likelihood.
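The whole procedure can be written out as a short sketch. The clue names and the per-child clue probabilities below are made up purely for illustration; in a real problem they would come from your model of how each child behaves. If the clues are assumed independent given the child, the likelihood for a child is just the product of the individual clue probabilities:

```python
# P(clue | child): how likely each child would be to leave each
# observed clue behind. These numbers are invented for illustration.
clue_probs = {
    "Alpha": {"chocolate_on_shirt": 0.1, "spoon_in_sink": 0.3, "home_all_day": 0.2},
    "Beta":  {"chocolate_on_shirt": 0.7, "spoon_in_sink": 0.6, "home_all_day": 0.8},
    "Gamma": {"chocolate_on_shirt": 0.2, "spoon_in_sink": 0.4, "home_all_day": 0.5},
    "Delta": {"chocolate_on_shirt": 0.3, "spoon_in_sink": 0.2, "home_all_day": 0.1},
}

# The clues x1, ..., xn you collected during your investigation.
observed_clues = ["chocolate_on_shirt", "spoon_in_sink", "home_all_day"]

def likelihood(child):
    # Likelihood of all observed clues for this child, assuming the
    # clues are independent given the child (a simplifying assumption).
    result = 1.0
    for clue in observed_clues:
        result *= clue_probs[child][clue]
    return result

# Maximum likelihood estimate: the parameter value (child) whose
# likelihood is the highest.
mle_child = max(clue_probs, key=likelihood)
print(mle_child)  # Beta: 0.7 * 0.6 * 0.8 = 0.336 beats every other child
```

Asking "given all the data, what is the likelihood that Beta ate the ice cream?" corresponds to one call to `likelihood("Beta")`; the final `max` over children is the "pick the answer with the maximum likelihood" step.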
