BAYES THEOREM AND BAYESIAN NETWORK

Rashandeep singh
Jul 28, 2021


BAYES THEOREM

Bayes' theorem determines the probability of an event from uncertain knowledge. In probability theory, it relates the conditional probabilities of two random events.

Bayes' theorem states that:

$$P(H_i \mid E) = \frac{P(E \mid H_i)\,P(H_i)}{\sum_{k=1}^{K} P(E \mid H_k)\,P(H_k)}$$

Where P(Hi | E) = the probability that hypothesis Hi is true, given evidence E.

P(E | Hi) = the probability of observing evidence E given that hypothesis Hi is true.

P(Hi) = the probability that hypothesis Hi is true in the absence of any specific evidence. These are called prior probabilities, or priors.

K = the number of possible hypotheses.

Equivalently, for a single hypothesis H and evidence E,

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$

The above equation is the basis of most modern AI systems for probabilistic inference.

Prior Probability — the initial probability assigned to a hypothesis before any additional information is obtained.

Posterior Probability — the revised probability of a hypothesis after the additional information (evidence) has been taken into account.
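To make the formula concrete, here is a minimal Python sketch of the update from priors to posteriors; the hypothesis names and probability values are invented purely for illustration and are not from the article.

```python
# Minimal sketch of Bayes' theorem over a set of competing hypotheses.
# The hypotheses H1..H3 and all numbers below are illustrative only.

priors = {"H1": 0.6, "H2": 0.3, "H3": 0.1}        # P(Hi), before seeing evidence
likelihoods = {"H1": 0.2, "H2": 0.7, "H3": 0.9}   # P(E | Hi)

# Denominator: sum over k of P(E | Hk) * P(Hk)
evidence = sum(likelihoods[h] * priors[h] for h in priors)

# Posterior: P(Hi | E) = P(E | Hi) * P(Hi) / evidence
posteriors = {h: likelihoods[h] * priors[h] / evidence for h in priors}

print(posteriors)   # {'H1': 0.2857..., 'H2': 0.5, 'H3': 0.2142...}
```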

BAYESIAN NETWORK

A Bayesian network defines the probabilistic dependencies and independencies among the variables in the network. It was introduced by Pearl in 1988. We preserve the probabilistic formalism but rely on the modularity of the world we are trying to model: to describe the real world, it is not necessary to use a huge joint probability table listing the probabilities of all conceivable combinations of events.

Most events are conditionally independent of most other events, so their interactions need not be considered. Instead, we use a more local representation that describes clusters of events that interact.

In this network, we construct a directed acyclic graph (DAG) that represents the causal relationships among the variables. Nodes correspond to random variables, which can be continuous or discrete, and every node has an associated conditional probability table (CPT). To become useful as a basis for problem-solving, the DAG is converted into an undirected graph in which the arcs can transmit probabilities in either direction, depending on where the evidence is coming from.

The only constraint on the arcs allowed in a Bayesian network is that there must be no directed cycles: you cannot return to a node simply by following directed arcs.
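As a small sketch of that constraint (the node names and parent lists here are placeholders, not part of the article's example), a network can be stored as a mapping from each node to its parents, and the no-directed-cycle rule checked by following parent arcs:

```python
# Each node maps to the list of its parents (the directed arcs of the graph).
# Node names are placeholders for illustration.
parents = {
    "A": [],
    "B": ["A"],
    "C": ["A", "B"],
}

def has_directed_cycle(parents):
    """Return True if some node can be reached again by following parent arcs."""
    def reaches(start, node, visited):
        for p in parents[node]:
            if p == start or (p not in visited and reaches(start, p, visited | {p})):
                return True
        return False
    return any(reaches(n, n, set()) for n in parents)

print(has_directed_cycle(parents))   # False: a valid DAG, so a legal BN structure
parents["A"] = ["C"]                 # adding an arc C -> A creates a directed cycle
print(has_directed_cycle(parents))   # True: this structure is not allowed
```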

For example:

Fig: Example of Bayesian Network

Waking up late and being late for the meeting are not independent, but they become conditionally independent once we know whether you are late for work; this is a conditional independence relationship.
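A quick numeric check of that pattern is sketched below; the chain Late wakeup → Late for work → Late for meeting mirrors the example above, but every probability value is an assumption made up for the demonstration.

```python
# Illustrative chain: LateWakeup -> LateForWork -> LateForMeeting.
# Every number below is made up; only the independence pattern matters.

p_wakeup = 0.3                            # P(LateWakeup = True)
p_work = {True: 0.8, False: 0.1}          # P(LateForWork = True | LateWakeup)
p_meeting = {True: 0.9, False: 0.05}      # P(LateForMeeting = True | LateForWork)

def joint(wakeup, work, meeting):
    """P(LateWakeup=wakeup, LateForWork=work, LateForMeeting=meeting)."""
    pw = p_wakeup if wakeup else 1 - p_wakeup
    pk = p_work[wakeup] if work else 1 - p_work[wakeup]
    pm = p_meeting[work] if meeting else 1 - p_meeting[work]
    return pw * pk * pm

def p_meeting_given(work, wakeup=None):
    """P(LateForMeeting = True | LateForWork [, LateWakeup])."""
    wakeups = (True, False) if wakeup is None else (wakeup,)
    num = sum(joint(w, work, True) for w in wakeups)
    den = sum(joint(w, work, m) for w in wakeups for m in (True, False))
    return num / den

print(p_meeting_given(work=True, wakeup=True))   # 0.9
print(p_meeting_given(work=True))                # 0.9 as well: once LateForWork is
                                                 # known, the late wakeup adds nothing
```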

The idea of causal networks has proved very useful in systems such as:

1. CASNET (diagnosis system)

2. INTERNIST/CADUCEUS (diagnosis system)

Example of a Bayesian network — the burglary and earthquake problem

Assume your house has an alarm system against burglary. You also live in a seismically active area, and the alarm can occasionally be set off by an earthquake. You have two neighbors, John and Mary, who do not know each other; if they hear the alarm, they will call you. The Bayesian network for this is:

Variables:

Burglary (B)

Earthquake (E)

Alarm system (A)

John calls (J)

Mary calls (M)

Fig: Bayesian network for the burglary and earthquake problem

· Alarm has two parents: Burglary and Earthquake.

· Earthquake is an ancestor of both John calls and Mary calls.

· John calls is a child of Alarm and a descendant of Burglary and Earthquake.

· The Markov blanket of a node consists of the node's parents, its children, and its children's parents.

· Burglary and Earthquake are root nodes, John calls and Mary calls are leaf nodes, and Alarm is an intermediate node.

Now if we want to find the probability of a particular joint assignment of these variables, the network structure lets us factor it into each node's probability given its parents; for example,

$$P(J, M, A, \neg B, \neg E) = P(J \mid A)\,P(M \mid A)\,P(A \mid \neg B, \neg E)\,P(\neg B)\,P(\neg E)$$
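A sketch of that calculation is below. The CPT values from the figure are not reproduced in the text, so the numbers here are the commonly quoted textbook values for this example and should be read as assumptions rather than the article's own figures.

```python
# Burglary / earthquake network with illustrative CPT values
# (assumed here, since the figure's actual numbers are not in the text).

p_b = 0.001                               # P(Burglary = True)
p_e = 0.002                               # P(Earthquake = True)
p_a = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(Alarm | Burglary, Earthquake)
p_j = {True: 0.90, False: 0.05}           # P(JohnCalls | Alarm)
p_m = {True: 0.70, False: 0.01}           # P(MaryCalls | Alarm)

def joint(b, e, a, j, m):
    """P(B=b, E=e, A=a, J=j, M=m) = product of each node given its parents."""
    pb = p_b if b else 1 - p_b
    pe = p_e if e else 1 - p_e
    pa = p_a[(b, e)] if a else 1 - p_a[(b, e)]
    pj = p_j[a] if j else 1 - p_j[a]
    pm = p_m[a] if m else 1 - p_m[a]
    return pb * pe * pa * pj * pm

# P(J, M, A, not B, not E) = P(J|A) P(M|A) P(A|~B,~E) P(~B) P(~E)
print(joint(b=False, e=False, a=True, j=True, m=True))   # about 0.00063
```

Queries such as P(B | J, M) can then be answered by summing this joint over the unobserved variables and normalizing.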
