
Probability & Statistics for Machine Learning & Data Science (4)

by Fresh Red 2024. 9. 6.

▤ Table of Contents

Introduction to Probability and Probability Distributions


Introduction to Probability

Bayes Theorem - Spam example


Using Bayes' theorem lets us discard all of the irrelevant information and use only the information that matters.

In this example, we are concerned with emails that contain the word "lottery", and we calculate the probability that such an email is spam.

A prior is the original probability we assign before observing any evidence; it is a basic piece of information we start with (e.g. the probability of spam, the probability of not spam, etc.).
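
Concretely, for the spam example the theorem reads as follows (a sketch written in the example's own event names, where the denominator expands by the law of total probability):

```latex
P(\text{spam} \mid \text{lottery})
  = \frac{P(\text{lottery} \mid \text{spam})\, P(\text{spam})}{P(\text{lottery})},
\quad \text{where} \quad
P(\text{lottery}) = P(\text{lottery} \mid \text{spam})\, P(\text{spam})
                  + P(\text{lottery} \mid \text{not spam})\, P(\text{not spam})
```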

Bayes Theorem - Prior and Posterior

[Slides: some examples to see the prior and posterior]

As explained previously, the prior is a probability computed from minimal information.

A posterior is a conditional probability given the observed events; since we have more information, we can calculate a much more refined probability.
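
A minimal numeric sketch of the prior-to-posterior update (all counts below are made up for illustration; they are not from the course):

```python
# Hypothetical email counts (illustrative only, not from the course)
total_emails = 100
spam_emails = 20               # emails labeled spam
spam_with_lottery = 14         # spam emails containing "lottery"
ham_emails = total_emails - spam_emails
ham_with_lottery = 8           # non-spam emails containing "lottery"

# Prior: what we believe before seeing any evidence
p_spam = spam_emails / total_emails                     # 0.2
p_ham = ham_emails / total_emails                       # 0.8

# Likelihood of the evidence ("lottery") under each class
p_lottery_given_spam = spam_with_lottery / spam_emails  # 0.7
p_lottery_given_ham = ham_with_lottery / ham_emails     # 0.1

# Posterior: the prior updated with the evidence, via Bayes' theorem
p_lottery = p_lottery_given_spam * p_spam + p_lottery_given_ham * p_ham
p_spam_given_lottery = p_lottery_given_spam * p_spam / p_lottery
print(p_spam_given_lottery)    # 0.14 / 0.22 ≈ 0.64
```

Seeing the word "lottery" moves the spam probability from the 0.2 prior up to a posterior of about 0.64.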

Bayes Theorem - The Naive Bayes Model

The Naive Bayes model is a supervised learning model.

We can condition on more than one event when calculating a conditional probability.

[Slides: posterior with more than one event]

The "naive" assumption is what gives Naive Bayes its name: we assume the events (features) are independent of one another, which lets us compute the joint conditional probability as a product of the individual conditional probabilities, as in the sketch below.

Assuming that each feature is independent makes the model easy to implement, and in practice it still often provides great results.
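
A minimal sketch of that product rule for two words, with hypothetical per-word probabilities (the word list and all numbers are made up for illustration):

```python
# Hypothetical class priors and per-word likelihoods (illustrative only)
p_spam, p_ham = 0.2, 0.8
p_word_given_spam = {"lottery": 0.7, "winning": 0.5}
p_word_given_ham = {"lottery": 0.1, "winning": 0.2}

def naive_bayes_posterior(words):
    """P(spam | words) under the naive assumption that words are
    independent given the class, so the joint likelihood is a product
    of the individual per-word conditional probabilities."""
    score_spam, score_ham = p_spam, p_ham
    for w in words:
        score_spam *= p_word_given_spam[w]
        score_ham *= p_word_given_ham[w]
    return score_spam / (score_spam + score_ham)  # normalize over classes

print(naive_bayes_posterior(["lottery", "winning"]))  # ≈ 0.81
```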


Probability in Machine Learning

Machine learning makes heavy use of probabilities; for instance, the output probability of a classifier is calculated from the given features.
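
As a small sketch of this idea (scikit-learn and the toy data below are my own assumptions, not named in the course):

```python
from sklearn.naive_bayes import BernoulliNB

# Toy features per email: [contains "lottery", contains "winning"] (made up)
X = [[1, 1], [1, 0], [0, 1], [0, 0], [1, 1], [0, 0]]
y = [1, 1, 0, 0, 1, 0]  # 1 = spam, 0 = not spam

model = BernoulliNB().fit(X, y)
# The trained model outputs a probability for each class given the features
print(model.predict_proba([[1, 0]]))  # [[P(not spam), P(spam)]]
```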


All the information provided is based on the course Probability & Statistics for Machine Learning & Data Science | Coursera from DeepLearning.AI.
