
Logic review

A summary of chapters 7, 8, 9 of
Artificial Intelligence: A Modern Approach
by Russell and Norvig

Outline

Knowledge-based agents
Logic in general - models and entailment
Propositional (Boolean) logic
Equivalence, validity, satisfiability
Inference rules
forward chaining
backward chaining
resolution

Knowledge bases

Knowledge base = set of sentences in a formal language


Declarative approach to building an agent (or other system):
Tell it what it needs to know
Then it can Ask itself what to do - answers should follow
from the KB
Agents can be viewed at the knowledge level
i.e., what they know, regardless of how implemented

Or at the implementation level


i.e., data structures in KB and algorithms that manipulate them

A simple knowledge-based agent

The agent must be able to:

Represent states, actions, etc.


Incorporate new percepts
Update internal representations of the world
Deduce hidden properties of the world
Deduce appropriate actions

Logic in general
Logics are formal languages for representing
information such that conclusions can be drawn
Syntax defines the sentences in the language
Semantics define the "meaning" of sentences;
i.e., define truth of a sentence in a world

E.g., the language of arithmetic

x+2 ≥ y is a sentence; x2+y > {} is not a sentence
x+2 ≥ y is true iff the number x+2 is no less than the number y
x+2 ≥ y is true in a world where x = 7, y = 1
x+2 ≥ y is false in a world where x = 0, y = 6

Entailment
Entailment means that one thing follows from
another:
KB ⊨ α
Knowledge base KB entails sentence α if and
only if α is true in all worlds where KB is true
E.g., the KB containing "the Giants won" and "the
Reds won" entails "Either the Giants won or the
Reds won"
E.g., x+y = 4 entails 4 = x+y
Entailment is a relationship between sentences
(i.e., syntax) that is based on semantics

Models
Logicians typically think in terms of models, which are formally
structured worlds with respect to which truth can be evaluated
We say m is a model of a sentence α if α is true in m
M(α) is the set of all models of α
Then KB ⊨ α iff M(KB) ⊆ M(α)

E.g. KB = Giants won and Reds won,
α = Giants won

Inference
KB ⊢i α = sentence α can be derived from KB by
procedure i
Soundness: i is sound if whenever KB ⊢i α, it is also
true that KB ⊨ α
Completeness: i is complete if whenever KB ⊨ α, it is
also true that KB ⊢i α
Preview: we will define a logic (first-order logic)
which is expressive enough to say almost anything
of interest, and for which there exists a sound and
complete inference procedure.
That is, the procedure will answer any question
whose answer follows from what is known by the
KB.

Propositional logic: Syntax


Propositional logic is the simplest logic - it illustrates the
basic ideas
The proposition symbols P1, P2, etc. are sentences
If S is a sentence, ¬S is a sentence (negation)
If S1 and S2 are sentences, S1 ∧ S2 is a sentence
(conjunction)
If S1 and S2 are sentences, S1 ∨ S2 is a sentence (disjunction)
If S1 and S2 are sentences, S1 ⇒ S2 is a sentence
(implication)
If S1 and S2 are sentences, S1 ⇔ S2 is a sentence
(biconditional)

Propositional logic: Semantics


Each model specifies true/false for each proposition symbol
E.g. P1,2 = false, P2,2 = true, P3,1 = false

With these symbols, 8 possible models, can be enumerated automatically.

Rules for evaluating truth with respect to a model m:
¬S is true iff S is false
S1 ∧ S2 is true iff S1 is true and S2 is true
S1 ∨ S2 is true iff S1 is true or S2 is true
S1 ⇒ S2 is true iff S1 is false or S2 is true
i.e., S1 ⇒ S2 is false iff S1 is true and S2 is false
S1 ⇔ S2 is true iff S1 ⇒ S2 is true and S2 ⇒ S1 is true

Simple recursive process evaluates an arbitrary sentence, e.g.,
¬P1,2 ∧ (P2,2 ∨ P3,1) = true ∧ (true ∨ false) = true ∧ true = true
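
To make the recursive evaluation concrete, here is a minimal Python sketch; the
nested-tuple sentence representation is an assumption of the sketch, not
anything prescribed by the slides:

# Minimal sketch of recursive truth evaluation for propositional sentences.
# Sentences are nested tuples: ('not', s), ('and', s1, s2), ('or', s1, s2),
# ('implies', s1, s2), ('iff', s1, s2), or a proposition symbol string.
def pl_true(sentence, model):
    """Evaluate a propositional sentence in a model (dict: symbol -> bool)."""
    if isinstance(sentence, str):           # proposition symbol
        return model[sentence]
    op, *args = sentence
    if op == 'not':
        return not pl_true(args[0], model)
    if op == 'and':
        return pl_true(args[0], model) and pl_true(args[1], model)
    if op == 'or':
        return pl_true(args[0], model) or pl_true(args[1], model)
    if op == 'implies':
        return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == 'iff':
        return pl_true(args[0], model) == pl_true(args[1], model)
    raise ValueError("unknown operator: " + op)

# The slide's example: not P1,2 and (P2,2 or P3,1) in the model shown above
model = {'P12': False, 'P22': True, 'P31': False}
print(pl_true(('and', ('not', 'P12'), ('or', 'P22', 'P31')), model))   # True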

Truth tables for connectives

Inference by enumeration
Depth-first enumeration of all models is sound and complete

For n symbols, time complexity is O(2^n), space complexity is O(n)
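
A minimal Python sketch of entailment by enumeration; representing sentences as
Python predicates over a model is an assumption made here to keep the example
short:

from itertools import product

def tt_entails(kb, alpha, symbols):
    """Return True if kb entails alpha, by enumerating all 2^n models."""
    for values in product([False, True], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):
            return False        # a model of KB in which alpha is false
    return True

# KB = "Giants won or Reds won" and "not (Reds won)"; query: "Giants won"
kb = lambda m: (m['Giants'] or m['Reds']) and not m['Reds']
alpha = lambda m: m['Giants']
print(tt_entails(kb, alpha, ['Giants', 'Reds']))   # True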

Logical equivalence
Two sentences are logically equivalent iff true in the
same models: α ≡ β iff α ⊨ β and β ⊨ α

Validity and satisfiability


A sentence is valid if it is true in all models,
e.g., True, A ∨ ¬A, A ⇒ A, (A ∧ (A ⇒ B)) ⇒ B

Validity is connected to inference via the Deduction Theorem:
KB ⊨ α if and only if (KB ⇒ α) is valid

A sentence is satisfiable if it is true in some model
e.g., A ∨ B, C

A sentence is unsatisfiable if it is true in no models
e.g., A ∧ ¬A

Satisfiability is connected to inference via the following:
KB ⊨ α if and only if (KB ∧ ¬α) is unsatisfiable

Proof methods
Proof methods divide into (roughly) two kinds:
Application of inference rules
Legitimate (sound) generation of new sentences from old
Proof = a sequence of inference rule applications
Can use inference rules as operators in a standard search
algorithm
Typically require transformation of sentences into a normal
form

Model checking
truth table enumeration (always exponential in n)
improved backtracking, e.g., Davis-Putnam-Logemann-Loveland (DPLL)
heuristic search in model space (sound but incomplete)
e.g., min-conflicts-like hill-climbing algorithms
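
As a rough illustration of the DPLL idea (simplification, unit propagation,
then branching), a minimal Python sketch; the integer-literal clause encoding
is an assumption, and pure-literal elimination is omitted:

# Clauses are lists of literals; a literal is an int, negative when negated,
# e.g. [1, -2] means (P1 or not P2). Not the textbook's exact pseudocode.
def dpll(clauses, assignment=None):
    """Return a satisfying assignment (dict symbol -> bool) or None."""
    if assignment is None:
        assignment = {}
    # Simplify: drop satisfied clauses, remove literals made false.
    simplified = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue                        # clause already satisfied
        rest = [l for l in clause if abs(l) not in assignment]
        if not rest:
            return None                     # empty clause: contradiction
        simplified.append(rest)
    if not simplified:
        return assignment                   # every clause satisfied
    # Unit clause heuristic: forced literals are assigned first.
    for clause in simplified:
        if len(clause) == 1:
            l = clause[0]
            return dpll(simplified, {**assignment, abs(l): l > 0})
    # Otherwise branch on some unassigned symbol.
    symbol = abs(simplified[0][0])
    return (dpll(simplified, {**assignment, symbol: True})
            or dpll(simplified, {**assignment, symbol: False}))

# (A or B) and (not A or B) and (not B or C)
print(dpll([[1, 2], [-1, 2], [-2, 3]]))   # {1: True, 2: True, 3: True}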

Resolution
Conjunctive Normal Form (CNF)
conjunction of disjunctions of literals (clauses)
E.g., (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D)

Resolution inference rule (for CNF):

l1 ∨ ... ∨ lk,    m1 ∨ ... ∨ mn
---------------------------------------------------------------------
l1 ∨ ... ∨ li-1 ∨ li+1 ∨ ... ∨ lk ∨ m1 ∨ ... ∨ mj-1 ∨ mj+1 ∨ ... ∨ mn
where li and mj are complementary literals.

E.g., P1,3 ∨ P2,2,    ¬P2,2
      ---------------------
      P1,3

Resolution is sound and complete for propositional logic

Resolution
Soundness of resolution inference rule:

¬(l1 ∨ ... ∨ li-1 ∨ li+1 ∨ ... ∨ lk) ⇒ li
¬mj ⇒ (m1 ∨ ... ∨ mj-1 ∨ mj+1 ∨ ... ∨ mn)
¬(l1 ∨ ... ∨ li-1 ∨ li+1 ∨ ... ∨ lk) ⇒ (m1 ∨ ... ∨ mj-1 ∨ mj+1 ∨ ... ∨ mn)
(since li and mj are complementary, li is true exactly when ¬mj is true,
so the two implications chain together)

Conversion to CNF
B1,1 ⇔ (P1,2 ∨ P2,1)

1. Eliminate ⇔, replacing α ⇔ β with (α ⇒ β) ∧ (β ⇒ α).
(B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)

2. Eliminate ⇒, replacing α ⇒ β with ¬α ∨ β.
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬(P1,2 ∨ P2,1) ∨ B1,1)

3. Move ¬ inwards using de Morgan's rules and
double-negation:
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∧ ¬P2,1) ∨ B1,1)

4. Apply distributivity law (∨ over ∧) and flatten:
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)

Resolution algorithm
Proof by contradiction, i.e., show KB ∧ ¬α
unsatisfiable
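
A minimal Python sketch of this refutation loop, assuming clauses are sets of
string literals with '-' marking negation (the helper names are illustrative,
not from the book):

from itertools import combinations

def negate(literal):
    return literal[1:] if literal.startswith('-') else '-' + literal

def resolve(ci, cj):
    """All resolvents of two clauses, one per pair of complementary literals."""
    resolvents = []
    for lit in ci:
        if negate(lit) in cj:
            resolvents.append((ci - {lit}) | (cj - {negate(lit)}))
    return resolvents

def pl_resolution(kb_clauses, neg_alpha_clauses):
    """Return True if KB entails alpha, by deriving the empty clause."""
    clauses = set(map(frozenset, kb_clauses + neg_alpha_clauses))
    while True:
        new = set()
        for ci, cj in combinations(clauses, 2):
            for resolvent in resolve(ci, cj):
                if not resolvent:
                    return True     # empty clause: KB and not-alpha unsatisfiable
                new.add(frozenset(resolvent))
        if new <= clauses:
            return False            # nothing new can be derived: not entailed
        clauses |= new

# KB: B11 <=> (P12 or P21) in CNF, plus not B11; query alpha = not P12
kb = [{'-B11', 'P12', 'P21'}, {'-P12', 'B11'}, {'-P21', 'B11'}, {'-B11'}]
print(pl_resolution(kb, [{'P12'}]))   # True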

Resolution example
KB = (B1,1 ⇔ (P1,2 ∨ P2,1)) ∧ ¬B1,1,   α = ¬P1,2

Forward and backward chaining


Horn Form (restricted)
KB = conjunction of Horn clauses
Horn clause =
proposition symbol; or
(conjunction of symbols) ⇒ symbol

E.g., C ∧ (B ⇒ A) ∧ (C ∧ D ⇒ B)

Modus Ponens (for Horn Form): complete for Horn KBs

α1, ... , αn,    α1 ∧ ... ∧ αn ⇒ β
----------------------------------
β

Can be used with forward chaining or backward chaining.
These algorithms are very natural and run in linear time

Forward chaining
Idea: fire any rule whose premises are satisfied in
the KB,
add its conclusion to the KB, until query is found

Forward chaining algorithm

Forward chaining is sound and complete for Horn KB
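
A minimal Python sketch of forward chaining over propositional Horn clauses;
the (premises, conclusion) rule representation is an assumption, and this is
not the book's exact PL-FC-Entails pseudocode:

def forward_chain(rules, facts, query):
    """Return True if query can be derived from facts and Horn rules."""
    known = set(facts)
    count = {i: len(premises) for i, (premises, _) in enumerate(rules)}
    agenda = list(known)
    while agenda:
        p = agenda.pop()
        if p == query:
            return True
        for i, (premises, conclusion) in enumerate(rules):
            if p in premises:
                count[i] -= 1
                if count[i] == 0 and conclusion not in known:
                    known.add(conclusion)       # fire the rule
                    agenda.append(conclusion)
    return query in known

# Horn KB: P=>Q, L and M=>P, B and L=>M, A and P=>L, A and B=>L; facts A, B
rules = [({'P'}, 'Q'), ({'L', 'M'}, 'P'), ({'B', 'L'}, 'M'),
         ({'A', 'P'}, 'L'), ({'A', 'B'}, 'L')]
print(forward_chain(rules, {'A', 'B'}, 'Q'))   # True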

Forward chaining example


Proof of completeness

FC derives every atomic sentence that is entailed by KB
1. FC reaches a fixed point where no new atomic
sentences are derived
2. Consider the final state as a model m, assigning
true/false to symbols
3. Every clause in the original KB is true in m
(if a clause a1 ∧ ... ∧ ak ⇒ b were false in m, every ai would be true
and b false, so the rule could still fire and FC would not yet have
reached a fixed point)
4. Hence m is a model of KB
5. If KB ⊨ q, q is true in every model of KB,
including m

Backward chaining
Idea: work backwards from the query q:
to prove q by BC,
check if q is known already, or
prove by BC all premises of some rule concluding q

Avoid loops: check if new subgoal is already on the
goal stack
Avoid repeated work: check if new subgoal
1. has already been proved true, or
2. has already failed
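
A matching backward-chaining sketch in Python, using the goal-stack loop check
described above; caching of previously proved or failed subgoals is left out
for brevity, and the rule representation mirrors the forward-chaining sketch:

def backward_chain(rules, facts, goal, stack=frozenset()):
    """Return True if goal can be proved from facts and Horn rules."""
    if goal in facts:
        return True
    if goal in stack:
        return False                  # subgoal already on the goal stack
    for premises, conclusion in rules:
        if conclusion == goal:
            # Prove, by BC, all premises of some rule concluding the goal.
            if all(backward_chain(rules, facts, p, stack | {goal})
                   for p in premises):
                return True
    return False

rules = [({'P'}, 'Q'), ({'L', 'M'}, 'P'), ({'B', 'L'}, 'M'),
         ({'A', 'P'}, 'L'), ({'A', 'B'}, 'L')]
print(backward_chain(rules, {'A', 'B'}, 'Q'))   # True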

Backward chaining example


Forward vs. backward chaining


FC is data-driven, automatic, unconscious
processing,
e.g., object recognition, routine decisions

May do lots of work that is irrelevant to the goal


BC is goal-driven, appropriate for problem-solving,
e.g., Where are my keys? How do I get into a PhD program?

Complexity of BC can be much less than linear in size of KB

Expressiveness limitation of propositional logic

KB contains "physics" sentences for every single
square
For every time t and every location [x,y],
L_{x,y}^t ∧ FacingRight^t ∧ Forward^t ⇒ L_{x+1,y}^{t+1}
Rapid proliferation of clauses

Summary of propositional logic


Logical agents apply inference to a knowledge base to derive
new information and make decisions
Basic concepts of logic:

syntax: formal structure of sentences


semantics: truth of sentences wrt models
entailment: necessary truth of one sentence given another
inference: deriving sentences from other sentences
soundness: derivations produce only entailed sentences
completeness: derivations can produce all entailed sentences

Resolution is complete for propositional logic


Forward, backward chaining are linear-time, complete for Horn
clauses
Propositional logic lacks expressive power

First-Order Logic

Outline
Why FOL?
Syntax and semantics of FOL
Using FOL
Knowledge engineering in FOL

Pros and cons of propositional logic


Propositional logic is declarative
Propositional logic allows
partial/disjunctive/negated information
(unlike most data structures and databases)

Propositional logic is compositional:

meaning of B1,1 ∧ P1,2 is derived from meaning of B1,1 and of
P1,2

Meaning in propositional logic is context-independent


(unlike natural language, where meaning depends on
context)

Propositional logic has very limited expressive power
(unlike natural language)
E.g., cannot say "pits cause breezes in adjacent squares"
except by writing one sentence for each square

First-order logic
Whereas propositional logic assumes the world contains facts,
first-order logic (like natural language) assumes the world
contains
Objects: people, houses, numbers, colors, baseball games, wars,

Relations: red, round, prime, brother of, bigger than, part of,
comes between,
Functions: father of, best friend, one more than, plus,

Syntax of FOL: Basic elements


Constants    KingJohn, 2, NUS,...
Predicates   Brother, >,...
Functions    Sqrt, LeftLegOf,...
Variables    x, y, a, b,...
Connectives  ¬, ∧, ∨, ⇒, ⇔
Equality     =
Quantifiers  ∀, ∃

Atomic sentences
Atomic sentence = predicate (term1,...,termn)
or term1 = term2
Term = function(term1,...,termn)
or constant or variable

E.g.,
Brother(KingJohn,RichardTheLionheart)
Married(FatherOf(Richard), MotherOf(John))

Complex sentences
Complex sentences are made from atomic sentences using
connectives

¬S, S1 ∧ S2, S1 ∨ S2, S1 ⇒ S2, S1 ⇔ S2

E.g. Sibling(KingJohn,Richard) ⇒ Sibling(Richard,KingJohn)
>(1,2) ∨ ≤(1,2)
>(1,2) ∧ ¬>(1,2)

Truth in first-order logic


Sentences are true with respect to a model and an
interpretation
Model contains objects (domain elements) and relations among
them
Interpretation specifies referents for
constant symbols → objects
predicate symbols → relations
function symbols → functional relations

An atomic sentence predicate(term1,...,termn) is true


iff the objects referred to by term1,...,termn
are in the relation referred to by predicate

Models for FOL: Example

Universal quantification
∀<variables> <sentence>
Everyone at NUS is smart:
∀x At(x,NUS) ⇒ Smart(x)
∀x P is true in a model m iff P is true with x being
each possible object in the model
Roughly speaking, equivalent to the conjunction of
instantiations of P
(At(KingJohn,NUS) ⇒ Smart(KingJohn))
∧ (At(Richard,NUS) ⇒ Smart(Richard))
∧ (At(NUS,NUS) ⇒ Smart(NUS))
∧ ...

Existential quantification
∃<variables> <sentence>
Someone at NUS is smart:
∃x At(x,NUS) ∧ Smart(x)
∃x P is true in a model m iff P is true with x being
some possible object in the model
Roughly speaking, equivalent to the disjunction of
instantiations of P
(At(KingJohn,NUS) ∧ Smart(KingJohn))
∨ (At(Richard,NUS) ∧ Smart(Richard))
∨ (At(NUS,NUS) ∧ Smart(NUS))
∨ ...
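
For a finite model these readings can be checked directly: ∀ behaves like a
conjunction over the domain and ∃ like a disjunction. A small Python sketch,
where the domain and the extensions of At and Smart are made-up stand-ins:

domain = ['KingJohn', 'Richard', 'NUS']
at_nus = {'KingJohn', 'Richard'}      # objects for which At(x, NUS) holds
smart = {'KingJohn', 'Richard'}       # objects for which Smart(x) holds

# forall x  At(x,NUS) => Smart(x)   -- conjunction of instantiations
print(all((x not in at_nus) or (x in smart) for x in domain))   # True

# exists x  At(x,NUS) and Smart(x)  -- disjunction of instantiations
print(any((x in at_nus) and (x in smart) for x in domain))      # True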

Properties of quantifiers
∀x ∀y is the same as ∀y ∀x
∃x ∃y is the same as ∃y ∃x
∃x ∀y is not the same as ∀y ∃x
∃x ∀y Loves(x,y)

"There is a person who loves everyone in the world"

∀y ∃x Loves(x,y)

"Everyone in the world is loved by at least one person"

Quantifier duality: each can be expressed using the other

∀x Likes(x,IceCream)    ¬∃x ¬Likes(x,IceCream)
∃x Likes(x,Broccoli)    ¬∀x ¬Likes(x,Broccoli)

Equality
term1 = term2 is true under a given
interpretation if and only if term1 and term2
refer to the same object
E.g., definition of Sibling in terms of Parent:
∀x,y Sibling(x,y) ⇔ [¬(x = y) ∧ ∃m,f ¬(m = f) ∧
Parent(m,x) ∧ Parent(f,x) ∧ Parent(m,y) ∧
Parent(f,y)]

Using FOL
The kinship domain:

Brothers are siblings


∀x,y Brother(x,y) ⇒ Sibling(x,y)

One's mother is one's female parent

∀m,c Mother(c) = m ⇔ (Female(m) ∧ Parent(m,c))

Sibling is symmetric
∀x,y Sibling(x,y) ⇔ Sibling(y,x)

Using FOL
The set domain:
∀s Set(s) ⇔ (s = {}) ∨ (∃x,s2 Set(s2) ∧ s = {x|s2})
¬∃x,s {x|s} = {}
∀x,s x ∈ s ⇔ s = {x|s}
∀x,s x ∈ s ⇔ [∃y,s2 (s = {y|s2} ∧ (x = y ∨ x ∈ s2))]
∀s1,s2 s1 ⊆ s2 ⇔ (∀x x ∈ s1 ⇒ x ∈ s2)
∀s1,s2 (s1 = s2) ⇔ (s1 ⊆ s2 ∧ s2 ⊆ s1)
∀x,s1,s2 x ∈ (s1 ∩ s2) ⇔ (x ∈ s1 ∧ x ∈ s2)
∀x,s1,s2 x ∈ (s1 ∪ s2) ⇔ (x ∈ s1 ∨ x ∈ s2)

Knowledge engineering in FOL


1. Identify the task
2. Assemble the relevant knowledge
3. Decide on a vocabulary of predicates,
functions, and constants
4. Encode general knowledge about the
domain
5. Encode a description of the specific
problem instance
6. Pose queries to the inference procedure
and get answers
7. Debug the knowledge base

The electronic circuits domain


One-bit full adder

The electronic circuits domain


1. Identify the task
Does the circuit actually add properly? (circuit
verification)

2. Assemble the relevant knowledge


Composed of wires and gates; Types of gates
(AND, OR, XOR, NOT)
Irrelevant: size, shape, color, cost of gates

3. Decide on a vocabulary
Alternatives:
Type(X1) = XOR
Type(X1, XOR)
XOR(X1)

The electronic circuits domain


4. Encode general knowledge of the domain

∀t1,t2 Connected(t1, t2) ⇒ Signal(t1) = Signal(t2)
∀t Signal(t) = 1 ∨ Signal(t) = 0
1 ≠ 0
∀t1,t2 Connected(t1, t2) ⇒ Connected(t2, t1)
∀g Type(g) = OR ⇒ (Signal(Out(1,g)) = 1 ⇔ ∃n Signal(In(n,g)) = 1)
∀g Type(g) = AND ⇒ (Signal(Out(1,g)) = 0 ⇔ ∃n Signal(In(n,g)) = 0)
∀g Type(g) = XOR ⇒ (Signal(Out(1,g)) = 1 ⇔ Signal(In(1,g)) ≠ Signal(In(2,g)))
∀g Type(g) = NOT ⇒ (Signal(Out(1,g)) ≠ Signal(In(1,g)))

The electronic circuits domain


5. Encode the specific problem instance
Type(X1) = XOR
Type(A1) = AND
Type(O1) = OR

Type(X2) = XOR
Type(A2) = AND

Connected(Out(1,X1),In(1,X2)) Connected(In(1,C1),In(1,X1))
Connected(Out(1,X1),In(2,A2)) Connected(In(1,C1),In(1,A1))
Connected(Out(1,A2),In(1,O1)) Connected(In(2,C1),In(2,X1))
Connected(Out(1,A1),In(2,O1)) Connected(In(2,C1),In(2,A1))
Connected(Out(1,X2),Out(1,C1))
Connected(In(3,C1),In(2,X2))
Connected(Out(1,O1),Out(2,C1))
Connected(In(3,C1),In(1,A2))

The electronic circuits domain


6. Pose queries to the inference procedure

What are the possible sets of values of all the terminals for the adder
circuit?

∃i1,i2,i3,o1,o2 Signal(In(1,C1)) = i1 ∧ Signal(In(2,C1)) = i2
∧ Signal(In(3,C1)) = i3 ∧ Signal(Out(1,C1)) = o1
∧ Signal(Out(2,C1)) = o2

7. Debug the knowledge base

May have omitted assertions like 1 ≠ 0

Inference in first-order logic

Outline
Reducing first-order inference to propositional inference
Unification
Generalized Modus Ponens
Forward chaining
Backward chaining
Resolution

Universal instantiation (UI)


Every instantiation of a universally quantified sentence is
entailed by it:
∀v α
---------------
Subst({v/g}, α)

for any variable v and ground term g

E.g., ∀x King(x) ∧ Greedy(x) ⇒ Evil(x) yields:
King(John) ∧ Greedy(John) ⇒ Evil(John)
King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
King(Father(John)) ∧ Greedy(Father(John)) ⇒ Evil(Father(John))

Existential instantiation (EI)


For any sentence , variable v, and constant
symbol k that does not appear elsewhere in
the knowledge base:
∃v α
---------------
Subst({v/k}, α)

E.g., ∃x Crown(x) ∧ OnHead(x,John) yields:

Crown(C1) ∧ OnHead(C1,John)
provided C1 is a new constant symbol, called
a Skolem constant

Reduction to propositional inference


Suppose the KB contains just the following:
∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
King(John)
Greedy(John)
Brother(Richard,John)

Instantiating the universal sentence in all possible ways, we have:


King(John) ∧ Greedy(John) ⇒ Evil(John)
King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
King(John)
Greedy(John)
Brother(Richard,John)

The new KB is propositionalized: proposition symbols are


King(John), Greedy(John), Evil(John), King(Richard), etc.

Reduction contd.
Every FOL KB can be propositionalized so as to
preserve entailment
(A ground sentence is entailed by new KB iff entailed
by original KB)
Idea: propositionalize KB and query, apply
resolution, return result

Reduction contd.
Theorem: Herbrand (1930). If a sentence is entailed by an FOL
KB, it is entailed by a finite subset of the propositionalized KB
Idea: For n = 0 to ∞ do
create a propositional KB by instantiating with depth-n terms
see if α is entailed by this KB

Problem: works if α is entailed, loops if α is not entailed


Theorem: Turing (1936), Church (1936) Entailment for FOL is
semidecidable (algorithms exist that say yes to every entailed
sentence, but no algorithm exists that also says no to every
nonentailed sentence.)

Problems with propositionalization


Propositionalization seems to generate lots of irrelevant
sentences.
E.g., from:

∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
King(John)
∀y Greedy(y)
Brother(Richard,John)

it seems obvious that Evil(John), but propositionalization
produces lots of facts such as Greedy(Richard) that are
irrelevant
With p k-ary predicates and n constants, there are p·n^k
instantiations.
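
A small Python sketch of this blow-up: enumerating every ground atom for a
handful of predicates over a set of constants (the particular symbols are
illustrative):

from itertools import product

constants = ['John', 'Richard', 'FatherOfJohn']
predicates = {'King': 1, 'Greedy': 1, 'Knows': 2}   # name -> arity

ground_atoms = [name + '(' + ','.join(args) + ')'
                for name, arity in predicates.items()
                for args in product(constants, repeat=arity)]

# p predicates of arity k over n constants give on the order of p * n**k atoms
print(len(ground_atoms))    # 3 + 3 + 9 = 15
print(ground_atoms[:3])     # ['King(John)', 'King(Richard)', 'King(FatherOfJohn)']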

Unification
We can get the inference immediately if we can find a
substitution θ such that King(x) and Greedy(x) match
King(John) and Greedy(y)
θ = {x/John, y/John} works

Unify(α,β) = θ if αθ = βθ

p                q                      θ
Knows(John,x)    Knows(John,Jane)       {x/Jane}
Knows(John,x)    Knows(y,OJ)            {x/OJ, y/John}
Knows(John,x)    Knows(y,Mother(y))     {y/John, x/Mother(John)}
Knows(John,x)    Knows(x,OJ)            fail

Standardizing apart eliminates overlap of variables, e.g.,
Knows(z17,OJ)

Unification
To unify Knows(John,x) and Knows(y,z),
θ = {y/John, x/z} or θ = {y/John, x/John, z/John}

The first unifier is more general than the


second.
There is a single most general unifier (MGU)
that is unique up to renaming of variables.
MGU = { y/John, x/z }

The unification algorithm

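
A minimal Python sketch of a unification procedure, under these representation
assumptions: compound terms are tuples such as ('Knows', 'John', 'x'),
variables are lowercase strings, constants are capitalized. It includes an
occur check but is not the book's exact pseudocode:

def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def substitute(t, theta):
    if is_variable(t):
        return substitute(theta[t], theta) if t in theta else t
    if isinstance(t, tuple):
        return tuple(substitute(a, theta) for a in t)
    return t

def occurs_in(var, t, theta):
    t = substitute(t, theta)
    if t == var:
        return True
    return isinstance(t, tuple) and any(occurs_in(var, a, theta) for a in t)

def unify(x, y, theta=None):
    """Return a most general unifier (a dict) of x and y, or None on failure."""
    if theta is None:
        theta = {}
    x, y = substitute(x, theta), substitute(y, theta)
    if x == y:
        return theta
    if is_variable(x):
        return None if occurs_in(x, y, theta) else {**theta, x: y}
    if is_variable(y):
        return None if occurs_in(y, x, theta) else {**theta, y: x}
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            theta = unify(xi, yi, theta)
            if theta is None:
                return None
        return theta
    return None

print(unify(('Knows', 'John', 'x'), ('Knows', 'y', ('Mother', 'y'))))
# {'y': 'John', 'x': ('Mother', 'John')}
print(unify(('Knows', 'John', 'x'), ('Knows', 'x', 'OJ')))   # None (fail)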

Generalized Modus Ponens (GMP)


p1', p2', ... , pn', (p1 ∧ p2 ∧ ... ∧ pn ⇒ q)
---------------------------------------------
qθ
where pi'θ = piθ for all i

p1' is King(John)          p1 is King(x)
p2' is Greedy(y)           p2 is Greedy(x)
θ is {x/John, y/John}      q is Evil(x)
qθ is Evil(John)

GMP used with KB of definite clauses (exactly one positive
literal)
All variables assumed universally quantified

Soundness of GMP

Need to show that
p1', ..., pn', (p1 ∧ ... ∧ pn ⇒ q) ⊨ qθ
provided that pi'θ = piθ for all i

Lemma: For any sentence p, we have p ⊨ pθ by UI

1. (p1 ∧ ... ∧ pn ⇒ q) ⊨ (p1 ∧ ... ∧ pn ⇒ q)θ = (p1θ ∧ ... ∧ pnθ ⇒ qθ)
2. p1', ..., pn' ⊨ p1' ∧ ... ∧ pn' ⊨ p1'θ ∧ ... ∧ pn'θ
3. From 1 and 2, qθ follows by ordinary Modus Ponens

Example knowledge base


The law says that it is a crime for an American to sell
weapons to hostile nations. The country Nono, an
enemy of America, has some missiles, and all of its
missiles were sold to it by Colonel West, who is
American.
Prove that Col. West is a criminal

Example knowledge base contd.


... it is a crime for an American to sell weapons to hostile nations:
American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) ⇒ Criminal(x)

Nono has some missiles, i.e., ∃x Owns(Nono,x) ∧ Missile(x):

Owns(Nono,M1) and Missile(M1)

all of its missiles were sold to it by Colonel West

Missile(x) ∧ Owns(Nono,x) ⇒ Sells(West,x,Nono)

Missiles are weapons:
Missile(x) ⇒ Weapon(x)

An enemy of America counts as "hostile":

Enemy(x,America) ⇒ Hostile(x)

West, who is American


American(West)

The country Nono, an enemy of America


Enemy(Nono,America)

Forward chaining algorithm

Forward chaining proof


Properties of forward chaining


Sound and complete for first-order definite clauses
Datalog = first-order definite clauses + no functions
FC terminates for Datalog in finite number of
iterations
May not terminate in general if α is not entailed
This is unavoidable: entailment with definite clauses
is semidecidable

Efficiency of forward chaining


Incremental forward chaining: no need to match a rule
on iteration k if a premise wasn't added on iteration
k-1
match each rule whose premise contains a newly added
positive literal

Matching itself can be expensive:


Database indexing allows O(1) retrieval of known facts
e.g., query Missile(x) retrieves Missile(M1)

Forward chaining is widely used in deductive databases

Hard matching example


Diff(wa,nt) ∧ Diff(wa,sa) ∧ Diff(nt,q) ∧
Diff(nt,sa) ∧ Diff(q,nsw) ∧ Diff(q,sa) ∧
Diff(nsw,v) ∧ Diff(nsw,sa) ∧ Diff(v,sa) ⇒
Colorable()

Diff(Red,Blue)    Diff(Red,Green)
Diff(Green,Red)   Diff(Green,Blue)
Diff(Blue,Red)    Diff(Blue,Green)

Colorable() is inferred iff the CSP has a solution
CSPs include 3SAT as a special case, hence
matching is NP-hard

Backward chaining algorithm

SUBST(COMPOSE(θ1, θ2), p) = SUBST(θ2, SUBST(θ1, p))
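
A small Python sketch of this composition property; the substitute and compose
helpers are assumptions of the sketch, reusing the tuple-based terms from the
unification sketch above:

def substitute(t, theta):
    """Apply a substitution to a term (compound terms are tuples)."""
    if isinstance(t, tuple):
        return tuple(substitute(a, theta) for a in t)
    return theta.get(t, t)

def compose(theta1, theta2):
    """COMPOSE(theta1, theta2): applying it equals applying theta1 then theta2."""
    composed = {v: substitute(t, theta2) for v, t in theta1.items()}
    composed.update({v: t for v, t in theta2.items() if v not in theta1})
    return composed

p = ('Knows', 'x', 'y')
theta1, theta2 = {'x': 'John'}, {'y': ('Mother', 'John')}
print(substitute(p, compose(theta1, theta2)))      # ('Knows', 'John', ('Mother', 'John'))
print(substitute(substitute(p, theta1), theta2))   # same result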

Backward chaining example


Properties of backward chaining


Depth-first recursive proof search: space is
linear in size of proof
Incomplete due to infinite loops
fix by checking current goal against every goal
on stack

Inefficient due to repeated subgoals (both


success and failure)
fix using caching of previous results (extra
space)

Widely used for logic programming

Logic programming: Prolog

Algorithm = Logic + Control

Basis: backward chaining with Horn clauses + bells & whistles


Widely used in Europe, Japan (basis of 5th Generation project)
Compilation techniques: 60 million LIPS

Program = set of clauses = head :- literal1, ... , literaln.


criminal(X) :- american(X), weapon(Y), sells(X,Y,Z), hostile(Z).

Depth-first, left-to-right backward chaining


Built-in predicates for arithmetic etc., e.g., X is Y*Z+3
Built-in predicates that have side effects (e.g., input and output
predicates, assert/retract predicates)
Closed-world assumption ("negation as failure")
e.g., given alive(X) :- not dead(X).
alive(joe) succeeds if dead(joe) fails

Prolog
Appending two lists to produce a third:
append([],Y,Y).
append([X|L],Y,[X|Z]) :- append(L,Y,Z).
query:
append(A,B,[1,2]) ?

answers:
A=[]    B=[1,2]
A=[1]   B=[2]
A=[1,2] B=[]

Resolution: brief summary


Full first-order version:
l1 ∨ ... ∨ lk,    m1 ∨ ... ∨ mn
-----------------------------------------------------------------------
(l1 ∨ ... ∨ li-1 ∨ li+1 ∨ ... ∨ lk ∨ m1 ∨ ... ∨ mj-1 ∨ mj+1 ∨ ... ∨ mn)θ
where Unify(li, ¬mj) = θ.
The two clauses are assumed to be standardized apart so that
they share no variables.
For example,
¬Rich(x) ∨ Unhappy(x)
Rich(Ken)
---------------------
Unhappy(Ken)
with θ = {x/Ken}
Apply resolution steps to CNF(KB ∧ ¬α); complete for FOL

Conversion to CNF
Everyone who loves all animals is loved by
someone:

∀x [∀y Animal(y) ⇒ Loves(x,y)] ⇒ [∃y Loves(y,x)]

1. Eliminate biconditionals and implications

∀x [¬∀y ¬Animal(y) ∨ Loves(x,y)] ∨ [∃y Loves(y,x)]

2. Move ¬ inwards: ¬∀x p ≡ ∃x ¬p,  ¬∃x p ≡ ∀x ¬p

∀x [∃y ¬(¬Animal(y) ∨ Loves(x,y))] ∨ [∃y Loves(y,x)]
∀x [∃y ¬¬Animal(y) ∧ ¬Loves(x,y)] ∨ [∃y Loves(y,x)]
∀x [∃y Animal(y) ∧ ¬Loves(x,y)] ∨ [∃y Loves(y,x)]

Conversion to CNF contd.


1. Standardize variables: each quantifier should use a different
one

∀x [∃y Animal(y) ∧ ¬Loves(x,y)] ∨ [∃z Loves(z,x)]

2. Skolemize: a more general form of existential instantiation.
Each existential variable is replaced by a Skolem function of the
enclosing universally quantified variables:

∀x [Animal(F(x)) ∧ ¬Loves(x,F(x))] ∨ Loves(G(x),x)

3. Drop universal quantifiers:

[Animal(F(x)) ∧ ¬Loves(x,F(x))] ∨ Loves(G(x),x)

4. Distribute ∨ over ∧:

[Animal(F(x)) ∨ Loves(G(x),x)] ∧ [¬Loves(x,F(x)) ∨ Loves(G(x),x)]

Resolution proof: definite clauses
