Probability Theory (PT), episode 1: Samples, Events & Probability Measure | by Arthur Meltonyan | Sep, 2022

September 9, 2022


The author expects you to have a solid grounding in linear algebra, analytic geometry, and calculus.

Forewarned is forearmed.

The purpose of probability theory is to build a mathematical framework for describing random experiments. Yes, that means a fair number of formulas to lay the mathematical groundwork.

A random experiment is a procedure whose set of possible outcomes can be described before it is performed and which can be repeated under the same conditions.

A sample is an outcome of a random experiment. A sample space is the collection of all possible samples:
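In symbols (the notation used below is supplied here for concreteness: ω for a sample, Ω for the sample space, \mathcal{F} for the event space, A and B for events, and P for the probability measure):

\Omega = \{\omega_1, \omega_2, \ldots\}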

An event is a subset of a sample space. An event space is a collection of events, which complies with the sigma-algebra axioms:

I. an event space contains the entire sample space:
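\Omega \in \mathcal{F}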

II. an event space is closed under complementation:
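A \in \mathcal{F} \implies A^{c} = \Omega \setminus A \in \mathcal{F}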

III. an event space is closed under countable unions:
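A_1, A_2, \ldots \in \mathcal{F} \implies \bigcup_{i=1}^{\infty} A_i \in \mathcal{F}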

Mutually exclusive events are those which can’t happen at the same time:
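A_i \cap A_j = \emptyset \quad \text{for } i \neq j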

Collectively exhaustive events are those the union of which is the sample space:
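\bigcup_{i} A_i = \Omega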

A partition of a sample space is a collection of events that are simultaneously collectively exhaustive and mutually exclusive:
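\bigcup_{i} A_i = \Omega \quad \text{and} \quad A_i \cap A_j = \emptyset \ \text{for } i \neq j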

A probability measure is a function from an event space into the real numbers, which complies with Kolmogorov’s axioms below:
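P : \mathcal{F} \to \mathbb{R}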

I. the probability measure of the entire sample space is equal to 1:
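P(\Omega) = 1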

II. the probability measure of an event is greater than or equal to 0:
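P(A) \geq 0 \quad \text{for every } A \in \mathcal{F}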

III. the probability measure of a union of mutually exclusive events is equal to the sum of the probabilities of those events:
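P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i) \quad \text{if } A_i \cap A_j = \emptyset \ \text{for } i \neq j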

There are some properties you need to keep in mind as a result of the axioms above:

I. the sum of the probability of an event and the probability of its complement is equal to 1:
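P(A) + P(A^{c}) = 1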

II. the probability of a null event is equal to 0:
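P(\emptyset) = 0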

III. the probability of an event is greater than or equal to the probability of its subevent and less than or equal to the probability of its superevent:
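B \subseteq A \subseteq C \implies P(B) \leq P(A) \leq P(C)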

IV. the probability of an event is greater than or equal to 0 and less than or equal to 1:
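0 \leq P(A) \leq 1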

V. the probability of a union of two events is equal to the sum of the probability of the first event and the probability of the second, subtracting the probability of their intersection:
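P(A \cup B) = P(A) + P(B) - P(A \cap B)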

VI. the probability of a difference between two events is equal to the difference between the probability of the first event and the probability of their intersection:
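P(A \setminus B) = P(A) - P(A \cap B)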

A probability space is a triplet, consisting of a sample space, an event space, and a probability measure:
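(\Omega, \mathcal{F}, P)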

There are two types of probabilities:

I. A classical probability of an event (a discrete case) is the number of samples in that event divided by the total number of samples:
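P(A) = \frac{|A|}{|\Omega|}

where |\cdot| counts the samples in a set.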

II. A geometric probability of an event (a continuous case) is the area of that event divided by the total area of the sample space:
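P(A) = \frac{\mathrm{area}(A)}{\mathrm{area}(\Omega)}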

Let’s put that knowledge to use and apply the framework to some real-life examples.

A discrete example. We roll two dice. The question is: what are the probabilities of a) both dice being equal and even; b) both being equal and odd; c) the union of those events; d) their intersection?

In this case, rolling the dice is a random experiment, which means we should define a probability space:
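One natural choice (assuming two fair six-sided dice) is

\Omega = \{1, 2, \ldots, 6\}^{2}, \qquad \mathcal{F} = 2^{\Omega}, \qquad P(A) = \frac{|A|}{36}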

so that we can calculate the probabilities:
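Here is a minimal enumeration sketch in Python (reading event a as "both dice show the same even number" and event b as "both show the same odd number"; that reading, and the helper prob, are assumptions made here):

from fractions import Fraction
from itertools import product

# Sample space: all 36 ordered pairs of faces of two fair dice.
omega = list(product(range(1, 7), repeat=2))

# Event a: both dice equal and even; event b: both equal and odd (assumed reading).
a = {(i, j) for i, j in omega if i == j and i % 2 == 0}
b = {(i, j) for i, j in omega if i == j and i % 2 == 1}

def prob(event):
    # Classical probability: favourable samples divided by all samples.
    return Fraction(len(event), len(omega))

print(prob(a), prob(b), prob(a | b), prob(a & b))  # 1/12 1/12 1/6 0

Since a and b are mutually exclusive, the probability of their union is the sum 1/12 + 1/12 = 1/6, and the probability of their intersection is 0.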

A continuous example. We toss darts. The question is: what are the probabilities of the dart landing in a) the red region (0 < r ≤ 2); b) the blue region (2 < r ≤ 4)?

In this case, tossing darts is a random experiment, which means we should define a probability space:
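Assuming the board is the entire disk of radius 4 and the dart lands uniformly at random on it, one natural choice is

\Omega = \{(x, y) : x^{2} + y^{2} \leq 4^{2}\}, \qquad P(A) = \frac{\mathrm{area}(A)}{\mathrm{area}(\Omega)}

with \mathcal{F} the measurable subsets of \Omega.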

so that we can calculate the probabilities:
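P(\text{red}) = \frac{\pi \cdot 2^{2}}{\pi \cdot 4^{2}} = \frac{1}{4}, \qquad P(\text{blue}) = \frac{\pi \cdot 4^{2} - \pi \cdot 2^{2}}{\pi \cdot 4^{2}} = \frac{3}{4}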



