Introduction to Probability and Probability Distributions
Introduction to Probability
Bayes' Theorem - Spam Example
Using Bayes' theorem lets us discard all the unnecessary information and use only the information that matters.
In this example, we look at emails that contain the word "lottery" and calculate the probability that they are spam.
A prior is the original probability, calculated before we know anything else: a basic piece of information we already have (e.g. the overall probability that an email is spam, or that it is not spam).
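As a concrete sketch, the calculation can be done directly from email counts. The numbers below are made up for illustration; only the counting argument itself follows the course:

```python
# Hypothetical counts, assumed for illustration only.
total_emails = 100
spam_emails = 20                  # so the prior P(spam) = 20/100
spam_with_lottery = 14            # spam emails containing "lottery"
ham_with_lottery = 3              # non-spam emails containing "lottery"

# Bayes' theorem: P(spam | lottery)
#   = P(lottery | spam) * P(spam) / P(lottery)
p_spam = spam_emails / total_emails
p_lottery_given_spam = spam_with_lottery / spam_emails
p_lottery = (spam_with_lottery + ham_with_lottery) / total_emails

p_spam_given_lottery = p_lottery_given_spam * p_spam / p_lottery
print(p_spam_given_lottery)  # 14/17 ≈ 0.82
```

Note that the answer reduces to 14 / (14 + 3): among all emails containing "lottery", the fraction that are spam.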
Bayes' Theorem - Prior and Posterior
As explained previously, the prior is the probability computed from minimal information.
A posterior is a conditional probability given the observed events; since we have more information, we can calculate a much more refined probability.
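In symbols, Bayes' theorem connects the two (written here for the spam example; P(spam) is the prior and P(spam | lottery) is the posterior):

```latex
P(\text{spam} \mid \text{lottery})
  = \frac{P(\text{lottery} \mid \text{spam}) \, P(\text{spam})}{P(\text{lottery})}
```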
Bayes' Theorem - The Naive Bayes Model
The naive Bayes model is a supervised learning model.
We can condition on more than one event when calculating a conditional probability (e.g. several words appearing in the same email).
The naive assumption that gives naive Bayes its name is that these events are independent (not related) given the class, so the calculation reduces to a product of the individual conditional probabilities, as in the sketch below.
Assuming that the features are independent makes the model easy to implement, and in practice it often provides strong results.
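A minimal sketch of that product calculation for two words; the words and counts are assumptions for illustration, not values from the course:

```python
# Hypothetical training counts, assumed for illustration.
n_spam, n_ham, n_total = 20, 80, 100

# Per-word conditional probabilities estimated from made-up counts.
p_word_given_spam = {"lottery": 14 / 20, "win": 10 / 20}
p_word_given_ham = {"lottery": 3 / 80, "win": 8 / 80}

def spam_posterior(words):
    """Naive Bayes: start from the priors and multiply in each word's
    conditional probability, treating the words as independent given
    the class."""
    p_spam, p_ham = n_spam / n_total, n_ham / n_total
    for w in words:
        p_spam *= p_word_given_spam[w]
        p_ham *= p_word_given_ham[w]
    # Normalize so the two class probabilities sum to 1.
    return p_spam / (p_spam + p_ham)

print(spam_posterior(["lottery", "win"]))  # ≈ 0.96
```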
Probability in Machine Learning
Machine learning makes heavy use of probabilities; for instance, a model's output probability is calculated from the given input features.
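As a concrete example, scikit-learn's naive Bayes classifier exposes these per-class output probabilities via predict_proba; a minimal sketch on toy word-count data (the feature counts and labels below are made up):

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB

# Toy features: counts of the words "lottery" and "win" per email.
X = np.array([[3, 2], [2, 1], [0, 0], [0, 1]])
y = np.array([1, 1, 0, 0])  # 1 = spam, 0 = not spam

model = MultinomialNB()
model.fit(X, y)

# Output probabilities [P(not spam), P(spam)] for a new email.
print(model.predict_proba(np.array([[2, 0]])))
```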
All the information provided is based on the Probability & Statistics for Machine Learning & Data Science course by DeepLearning.AI on Coursera.