
QUANTIFYING UNCERTAINTY

Presentation for:
Lecturer, School of Computer, University of Information Technology & Sciences

Presented by: Imtiaj Uddin Ahamed, Batch: 3rd, Dept: CSE, ID: 08530101

What is uncertainty?
Uncertainty is defined as the lack of exact knowledge that would enable us to reach a perfectly reliable conclusion.
There is uncertainty in the facts we know:
What's the temperature? Imprecise measures
Is Sheikh Hasina a good president? Imprecise definitions

Why we need uncertainty management



Information can be incomplete, inconsistent, uncertain, or all three; in other words, information is often unsuitable for solving a problem. Classical logic permits only exact reasoning: it assumes that perfect knowledge always exists and that the law of the excluded middle can always be applied:

IF A is true THEN A is not false
IF A is false THEN A is not true

Sources of Uncertainty

Uncertain data: missing, unreliable, ambiguous, imprecisely represented, inconsistent, subjective, derived from defaults, noisy
Uncertain knowledge: multiple causes lead to multiple effects; incomplete knowledge of causality in the domain; probabilistic/stochastic effects
Uncertain inference process: derived result is formally correct, but wrong in the real world; new conclusions are not well-founded (e.g., inductive reasoning); incomplete, default reasoning methods

Reasoning Under Uncertainty


So how do we reason under uncertainty and with inexact knowledge?
heuristics
empirical associations
probabilities

Basic probability theory


The concept of probability has a long history that goes back thousands of years when words like probably, likely, maybe, perhaps and possibly were introduced into spoken languages. However, the mathematical theory of probability was formulated only in the 17th century. The probability of an event is the proportion of cases in which the event occurs. Probability can also be defined as a scientific measure of chance.

Prior probability: the probability of an event in the absence of any other information.
P(Dice = 2) = 1/6
Random variable: Dice, with domain {1, 2, 3, 4, 5, 6}
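As a minimal sketch (not part of the original slides), a prior distribution over the die's domain can be written as a table of equal probabilities:

```python
from fractions import Fraction

# A fair die: every outcome in the domain is equally likely,
# so the prior probability of each outcome is 1/6.
domain = [1, 2, 3, 4, 5, 6]
prior = {outcome: Fraction(1, len(domain)) for outcome in domain}

print(prior[2])                  # P(Dice = 2) = 1/6
assert sum(prior.values()) == 1  # a probability distribution sums to 1
```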

If s is the number of times success can occur, and f is the number of times failure can occur, then

P(success) = p = s / (s + f)
P(failure) = q = f / (s + f)

and p + q = 1.

If we throw a coin, the probability of getting a head equals the probability of getting a tail. In a single throw, s = f = 1, and therefore the probability of getting a head (or a tail) is 0.5.
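The success/failure ratios above can be sketched directly (an illustration, not from the slides):

```python
from fractions import Fraction

def p_success(s, f):
    # P(success) = s / (s + f)
    return Fraction(s, s + f)

def p_failure(s, f):
    # P(failure) = f / (s + f)
    return Fraction(f, s + f)

# Coin toss: one way to get a head (success), one way to get a tail (failure).
p = p_success(1, 1)
q = p_failure(1, 1)
assert p + q == 1   # p + q = 1 always holds
print(float(p))     # 0.5
```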

Conditional probability
Let A be an event in the world and B be another event. Events A and B are not mutually exclusive, but occur conditionally on the occurrence of each other. The probability that event A will occur given that event B has occurred is called the conditional probability, denoted p(A|B):

P(A | B) = P(A ∩ B) / P(B)

Hence the joint probability:

P(A ∩ B) = P(A | B) · P(B)
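A small sketch of the definition, using made-up numbers purely for illustration:

```python
def conditional(p_a_and_b, p_b):
    # P(A | B) = P(A ∩ B) / P(B); undefined when P(B) = 0
    if p_b == 0:
        raise ValueError("P(B) must be positive")
    return p_a_and_b / p_b

# Hypothetical values: P(A ∩ B) = 0.12, P(B) = 0.4, so P(A | B) ≈ 0.3
print(conditional(0.12, 0.4))
```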

Bayesian Approaches
Derive the probability of an event given another event. Often useful for diagnosis:
If X are (observed) effects and Y are (hidden) causes, we may have a model for how causes lead to effects, P(X | Y).

Bayesian reasoning has gained importance recently due to advances in efficiency:
more computational power available
better methods

Bayesian rule

p(A|B) = p(B|A) · p(A) / p(B)

where: p(A|B) is the conditional probability that event A occurs given that event B has occurred; p(B|A) is the conditional probability of event B occurring given that event A has occurred; p(A) is the probability of event A occurring; p(B) is the probability of event B occurring.
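The rule is one line of arithmetic. The following sketch (not from the slides) checks it against a fair-die example where the answer can be counted by hand: with A = "roll is even" and B = "roll > 3", we have p(B|A) = 2/3, p(A) = 1/2, p(B) = 1/2, and counting outcomes gives p(A|B) = 2/3.

```python
def bayes(p_b_given_a, p_a, p_b):
    # Bayes' rule: p(A|B) = p(B|A) * p(A) / p(B)
    return p_b_given_a * p_a / p_b

# A = {2, 4, 6}, B = {4, 5, 6}; A ∩ B = {4, 6}, so p(A|B) = 2/3.
print(bayes(2/3, 1/2, 1/2))   # ≈ 0.6667
```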

Bayesian reasoning
Suppose all rules in the knowledge base are represented in the following form:

IF E is true THEN H is true {with probability p}

This rule implies that if event E occurs, then the probability that event H will occur is p. In expert systems, H usually represents a hypothesis and E denotes evidence to support this hypothesis.

Bayes Example: Diagnosing Meningitis


Suppose we know that:
a stiff neck is a symptom in 50% of meningitis cases
meningitis (m) occurs in 1 of 50,000 patients
a stiff neck (s) occurs in 1 of 20 patients

Then:

P(s|m) = 0.5, P(m) = 1/50000, P(s) = 1/20

P(m|s) = (P(s|m) · P(m)) / P(s) = (0.5 × 1/50000) / (1/20) = 0.0002

So we expect one in 5000 patients with a stiff neck to have meningitis.
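The meningitis calculation above, sketched in a few lines:

```python
p_s_given_m = 0.5    # P(stiff neck | meningitis)
p_m = 1 / 50000      # P(meningitis)
p_s = 1 / 20         # P(stiff neck)

# Bayes' rule: P(m|s) = P(s|m) * P(m) / P(s)
p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)   # ≈ 0.0002, i.e. one in 5000
```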

Advantages and Problems Of Bayesian Reasoning


Advantages:
sound theoretical foundation
well-defined semantics for decision making

Problems:
requires large amounts of probability data and sufficient sample sizes
subjective evidence may not be reliable
the assumption that pieces of evidence are independent is often not valid
the relationship between hypothesis and evidence is reduced to a single number
explanations for the user are difficult
high computational overhead

Uncertainty: Conclusions
In AI we must often represent and reason about uncertain information. This is no different from what people do all the time! There are multiple approaches to handling uncertainty. Probabilistic methods are the most rigorous but often hard to apply; Bayesian reasoning and Dempster-Shafer theory extend them to handle problems of independence and ignorance of data. Empirically, it is often the case that the main need is some way of expressing "maybe": any system that provides at least a three-valued logic tends to yield the same decisions.
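As a minimal sketch of the "three-valued logic" point (an illustration, not from the slides), Kleene's strong three-valued connectives can be expressed by ordering the truth values FALSE < MAYBE < TRUE:

```python
from enum import Enum

class Tri(Enum):
    # Kleene's three-valued logic: FALSE < MAYBE < TRUE
    FALSE = 0
    MAYBE = 1
    TRUE = 2

def tri_and(a, b):
    # conjunction takes the weaker (smaller) truth value
    return Tri(min(a.value, b.value))

def tri_or(a, b):
    # disjunction takes the stronger (larger) truth value
    return Tri(max(a.value, b.value))

print(tri_and(Tri.TRUE, Tri.MAYBE))   # Tri.MAYBE
print(tri_or(Tri.FALSE, Tri.MAYBE))   # Tri.MAYBE
```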

END.
