Presentation for:
Lecturer, School of Computer, University of Information Technology & Sciences
Presented by: Imtiaj Uddin Ahamed (Batch: 3rd, Dept.: CSE, ID: 08530101)
What is uncertainty?
Uncertainty is defined as the lack of exact knowledge that would enable us to reach a perfectly reliable conclusion.
There is uncertainty in the facts we know:
What's the temperature? (imprecise measures)
Is Sheikh Hasina a good president? (imprecise definitions)
Information can be incomplete, inconsistent, uncertain, or all three. In other words, information is often unsuitable for solving a problem. Classical logic permits only exact reasoning. It assumes that perfect knowledge always exists and the law of the excluded middle can always be applied:
IF A is true THEN A is not false
IF A is false THEN A is not true
Sources of Uncertainty
Uncertain data:
- missing, unreliable, ambiguous, imprecisely represented, inconsistent, subjective, derived from defaults, noisy
Uncertain knowledge:
- multiple causes lead to multiple effects
- incomplete knowledge of causality in the domain
- probabilistic/stochastic effects
Uncertain inference process:
- derived result is formally correct, but wrong in the real world
- new conclusions are not well-founded (e.g., inductive reasoning)
- incomplete, default reasoning methods
Prior probability: the probability of an event in the absence of any other information.
P(Dice = 2) = 1/6
Random variable: Dice, domain = {1, 2, 3, 4, 5, 6}
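The prior above can be computed by counting equally likely outcomes. A minimal Python sketch (the variable names are mine, not from the slides):

```python
from fractions import Fraction

# Random variable Dice with domain {1, ..., 6}, all outcomes equally likely
domain = [1, 2, 3, 4, 5, 6]

# Prior probability P(Dice = 2): favourable outcomes over total outcomes
p_dice_2 = Fraction(sum(1 for x in domain if x == 2), len(domain))
print(p_dice_2)  # 1/6
```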
If s is the number of times success can occur, and f is the number of times failure can occur, then
P(success) = p = s / (s + f)
P(failure) = q = f / (s + f)
and p + q = 1
If we throw a coin, the probability of getting a head will be equal to the probability of getting a tail. In a single throw, s = f = 1, and therefore the probability of getting a head (or a tail) is 0.5.
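The success/failure formulas translate directly into code. A small sketch using exact fractions (the helper name `p_success` is mine):

```python
from fractions import Fraction

def p_success(s, f):
    # p = s / (s + f), where s counts success outcomes and f failure outcomes
    return Fraction(s, s + f)

# Coin throw: s = f = 1, so P(head) = P(tail) = 1/2 and p + q = 1
p = p_success(1, 1)
q = 1 - p
print(p, q, p + q)  # 1/2 1/2 1
```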
Conditional probability
Let A be an event in the world and B be another event. Events A and B are not mutually exclusive, but occur conditionally on the occurrence of the other. The probability that event A will occur if event B occurs is called the conditional probability, denoted mathematically as p(A | B):
P(A | B) = P(A ∩ B) / P(B)
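The definition P(A | B) = P(A ∩ B) / P(B) can be sanity-checked by enumerating a small sample space; the two-dice events below are illustrative choices of mine, not from the slides:

```python
from fractions import Fraction

# Sample space: all ordered outcomes of two fair dice
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]

A = {o for o in outcomes if o[0] == 2}    # event A: first die shows 2
B = {o for o in outcomes if sum(o) == 7}  # event B: the sum is 7

p_B = Fraction(len(B), len(outcomes))            # P(B) = 6/36
p_A_and_B = Fraction(len(A & B), len(outcomes))  # P(A ∩ B) = 1/36
p_A_given_B = p_A_and_B / p_B                    # P(A | B) = P(A ∩ B) / P(B)
print(p_A_given_B)  # 1/6
```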
Bayesian Approaches
Derive the probability of an event given another event. Often useful for diagnosis:
If X are (observed) effects and Y are (hidden) causes, we may have a model for how causes lead to effects, P(X | Y); Bayes' rule then lets us derive the diagnostic direction, P(Y | X).
Bayesian rule
p(A | B) = p(B | A) × p(A) / p(B)
where:
p(A | B) is the conditional probability that event A occurs given that event B has occurred;
p(B | A) is the conditional probability that event B occurs given that event A has occurred;
p(A) is the probability of event A occurring;
p(B) is the probability of event B occurring.
Bayesian reasoning
Suppose all rules in the knowledge base are represented in the following form:
IF E is true THEN H is true {with probability p}
This rule implies that if event E occurs, then the probability that event H will occur is p. In expert systems, H usually represents a hypothesis and E denotes evidence to support this hypothesis.
Worked example: suppose a doctor knows that meningitis causes a stiff neck in 50% of cases, P(s | m) = 0.5; that the prior probability of a patient having meningitis is P(m) = 1/50000; and that the prior probability of a patient having a stiff neck is P(s) = 1/20. By Bayes' rule:
P(m | s) = P(s | m) × P(m) / P(s) = 0.5 × (1/50000) / (1/20) = 0.0002
So we expect that one in 5000 patients with a stiff neck has meningitis.
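Assuming the classic figures P(s | m) = 0.5, P(m) = 1/50000, P(s) = 1/20 (consistent with the one-in-5000 result above), the computation is a one-liner; the function name `bayes` is mine:

```python
def bayes(p_e_given_h, p_h, p_e):
    # P(H | E) = P(E | H) * P(H) / P(E)
    return p_e_given_h * p_h / p_e

# Meningitis example: P(s|m) = 0.5, P(m) = 1/50000, P(s) = 1/20
p_m_given_s = bayes(0.5, 1 / 50000, 1 / 20)
print(p_m_given_s)  # ~0.0002, i.e. about one patient in 5000
```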
problems
- requires large amounts of probability data and sufficient sample sizes
- subjective evidence may not be reliable
- the assumption that pieces of evidence are independent is often not valid
- the relationship between hypothesis and evidence is reduced to a single number
- explanations for the user are difficult
- high computational overhead
Uncertainty: Conclusions
In AI we must often represent and reason about uncertain information. This is no different from what people do all the time! There are multiple approaches to handling uncertainty. Probabilistic methods are the most rigorous but often hard to apply; Bayesian reasoning and Dempster-Shafer theory extend them to handle problems of independence and ignorance of the data. Empirically, it is often the case that the main need is some way of expressing "maybe": any system that provides at least a three-valued logic tends to yield the same decisions.
END.