Artificial Intelligence
Lecture 5
Informed Search
Contents
5.1 Introduction to Informed Search Methods
5.2 Best-First Search
5.2.1 Greedy best-first search
5.2.2 A* search
5.2.3 Best-First Search Analysis
5.2.4 Heuristic Functions
Evaluation function f(n):
f(n) = g(n)  (uniform-cost search)
f(n) = h(n)  (greedy best-first search)
Heuristic function h(n): the estimated cost of the cheapest path from node n to a goal node.
This is why the algorithm is called greedy: at each step it tries to get as close to the goal as it can.
Greedy best-first search is not optimal.
Time and space complexity: O(b^m), where b is the branching factor and m is the maximum depth of the search space.
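As a minimal sketch of greedy best-first search, assuming a toy graph encoded as an adjacency dict and a heuristic table (these data structures are illustrative, not from the slides):

```python
import heapq

def greedy_best_first(graph, h, start, goal):
    """Greedy best-first search: always expand the node with the lowest h(n).

    graph: dict mapping node -> list of (neighbor, step_cost) pairs
    h: dict mapping node -> heuristic estimate of distance to the goal
    Returns a path from start to goal (not necessarily optimal), or None.
    """
    frontier = [(h[start], start, [start])]   # priority queue ordered by h(n)
    visited = set()
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, _cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(frontier, (h[neighbor], neighbor, path + [neighbor]))
    return None
```

Note that step costs are ignored when ordering the frontier, which is exactly why the result can be suboptimal.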
Summary:
f(n) = h(n): estimated cost of the cheapest path from node n to a goal node
f(n) = g(n): cost to get to node n
A* Search
The best-known form of best-first search.
It combines greedy and uniform-cost search to find the (estimated) cheapest path through the current node.
Evaluation function:
f(n) = g(n) + h(n)
f(n) is the estimated cost of the cheapest solution that passes through node n.
A* Search
Its goal is to find a minimum total cost path.
A* expands the node with the lowest f(n) value first.
This yields a search algorithm that is both complete and optimal, provided that h(n) satisfies certain conditions.
A heuristic h(n) is admissible if for every node n, h(n) <= h*(n), where h*(n) is the true cost to reach the goal state from n.
A* is optimal if h(n) is an admissible heuristic, i.e., h(n) never overestimates the cost to reach the goal.
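The f(n) = g(n) + h(n) scheme can be sketched as follows, assuming the same toy graph/heuristic-dict representation (illustrative, not from the slides):

```python
import heapq

def a_star(graph, h, start, goal):
    """A* search: expand the node with the lowest f(n) = g(n) + h(n).

    graph: dict mapping node -> list of (neighbor, step_cost) pairs
    h: dict of admissible heuristic estimates (never overestimate)
    Returns (cost, path) of an optimal path, or None if no path exists.
    """
    frontier = [(h[start], 0, start, [start])]  # entries are (f, g, node, path)
    best_g = {start: 0}                          # cheapest g found so far per node
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path
        if g > best_g.get(node, float('inf')):
            continue                             # stale queue entry, skip it
        for neighbor, cost in graph.get(node, []):
            g2 = g + cost
            if g2 < best_g.get(neighbor, float('inf')):
                best_g[neighbor] = g2
                heapq.heappush(frontier,
                               (g2 + h[neighbor], g2, neighbor, path + [neighbor]))
    return None
```

With an admissible h, the first time the goal is popped its g-value is the optimal cost.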
A* Search
(Worked example, spanning several slides: A* search on the Romania map starting from Arad, with f(n) = g(n) + hSLD(n), where hSLD is the straight-line distance to the goal; each frontier node is annotated with its g(n) + hSLD(n) sum, e.g. 220 + 193.)
A* Search
Properties of A*
Complete? Yes (when the branching factor is finite and every operator has a fixed positive cost).
Time? Exponential in the worst case.
Drawback: it keeps all generated nodes in memory, so space is usually the limiting factor.
Heuristic Functions
The effect of heuristic accuracy on performance
One way to determine the quality of a heuristic is the effective branching factor, b*:
N + 1 = 1 + b* + (b*)^2 + (b*)^3 + ... + (b*)^d
N: total number of nodes generated by A*
d: solution depth
b*: the branching factor that a uniform tree of depth d would have in order to contain N + 1 nodes
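The defining equation has no closed form for b*, but it can be solved numerically. A minimal sketch using bisection (the function name and tolerance are my own choices):

```python
def effective_branching_factor(N, d, tol=1e-6):
    """Solve N + 1 = 1 + b* + (b*)**2 + ... + (b*)**d for b* by bisection.

    N: total nodes generated by A*; d: solution depth.
    Assumes N + 1 > d + 1, i.e. b* > 1 (otherwise the answer clamps to ~1).
    """
    def total(b):
        # geometric series 1 + b + b**2 + ... + b**d
        return sum(b ** i for i in range(d + 1))

    lo, hi = 1.0, float(N + 1)        # b* must lie between 1 and N + 1
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if total(mid) < N + 1:
            lo = mid                  # tree too small: b* is larger
        else:
            hi = mid                  # tree too big: b* is smaller
    return (lo + hi) / 2
```

For example, with N = 6 and d = 2 the equation 1 + b + b^2 = 7 gives b* = 2.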
Heuristic Functions
Dominance
(Figure: an 8-puzzle start state and goal state.)
h1(s) = 7 (number of misplaced tiles)
h2(s) = 2 + 3 + 3 + 2 + 4 + 2 + 0 + 2 = 18 (sum of the tiles' Manhattan distances)
h2 dominates h1 if h2(s) >= h1(s) for all s.
If h2 dominates h1, then A* is more efficient with h2 (i.e., it expands fewer states).
Heuristic Functions
Given two admissible heuristics h1(n) and h2(n), which is better?
If h2(n) >= h1(n) for all n, then h2 is said to dominate h1, and h2 is better for search.
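The two 8-puzzle heuristics can be sketched as below, assuming a state is a tuple of 9 entries read row by row with 0 for the blank (a representation chosen here for illustration):

```python
def h1(state, goal):
    """Misplaced-tiles heuristic: count tiles (not the blank) out of place."""
    return sum(1 for s, g in zip(state, goal) if s != 0 and s != g)

def h2(state, goal):
    """Manhattan-distance heuristic: sum of each tile's row + column distance
    from its goal position on the 3x3 board."""
    total = 0
    for i, tile in enumerate(state):
        if tile == 0:
            continue                      # the blank does not count
        j = goal.index(tile)
        total += abs(i // 3 - j // 3) + abs(i % 3 - j % 3)
    return total
```

Both are admissible, and h2(s) >= h1(s) for every state s, since any misplaced tile is at Manhattan distance at least 1.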
Memory-bounded Search
Limitations of A*
It can use a lot of memory.
In principle, space is O(number of states).
For really big search spaces, A* will run out of memory.
Memory-bounded Search
Search algorithms that try to conserve memory
Most are modifications of A*
Iterative-deepening A* (IDA*)
Recursive best-first search (RBFS)
Simplified memory-bounded A* (SMA*)
Memory-bounded Search
Iterative-deepening A* (IDA*)
The main difference between IDA* and iterative deepening search (IDS):
The cutoff used is the f-cost (g + h) rather than the depth.
At each iteration, the cutoff value is the smallest f-cost of any node that exceeded the cutoff on the previous iteration.
It avoids the substantial overhead associated with keeping a sorted queue of nodes.
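A minimal IDA* sketch on the same toy graph representation used earlier (names and structure are my own, not the slides'):

```python
import math

def ida_star(graph, h, start, goal):
    """Iterative-deepening A*: depth-first search under an f-cost cutoff.

    Each iteration raises the cutoff to the smallest f-value that exceeded
    it in the previous iteration, so memory stays linear in path length.
    Returns (cost, path) or None if no path exists.
    """
    def dfs(node, g, bound, path):
        f = g + h[node]
        if f > bound:
            return f, None                  # report the smallest overshoot
        if node == goal:
            return f, path
        minimum = math.inf
        for neighbor, cost in graph.get(node, []):
            if neighbor in path:
                continue                    # avoid cycles on the current path
            t, found = dfs(neighbor, g + cost, bound, path + [neighbor])
            if found is not None:
                return t, found
            minimum = min(minimum, t)
        return minimum, None

    bound = h[start]
    while True:
        t, found = dfs(start, 0, bound, [start])
        if found is not None:
            return t, found
        if t == math.inf:
            return None                     # frontier exhausted: no path
        bound = t                           # smallest f that exceeded the cutoff
```

Only the current path is stored; the price is re-expanding nodes on every iteration.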
Memory-bounded Search
Recursive Best-First Search (RBFS)
Mimics the operation of best-first search, but uses only linear space.
It is similar to a recursive depth-first search with record keeping to prevent following the current path indefinitely.
It records the f-cost of the best alternative path available from any ancestor of the current node.
If the current f-cost is greater, it backtracks to the alternative path; as the recursion unwinds, it stores the best f-cost of any child node.
This allows an abandoned subtree to be revisited if the current f-cost becomes greater than its backed-up value.
Issue: possible repeated re-generation of subtrees.
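One way to sketch RBFS in Python, again on the toy graph-as-dict representation (a simplification of the textbook pseudocode, not the slides' exact algorithm):

```python
import math

def rbfs(graph, h, start, goal):
    """Recursive best-first search: best-first behavior in linear space.

    Each recursive call carries f_limit, the f-cost of the best alternative
    path available from any ancestor; when every child exceeds it, the call
    unwinds and reports the best child f-value as the subtree's backed-up f.
    """
    def search(node, path, g, f, f_limit):
        if node == goal:
            return path, g
        successors = []
        for nb, cost in graph.get(node, []):
            if nb in path:
                continue                     # avoid cycles on the current path
            g2 = g + cost
            # a child's f never drops below its parent's backed-up f
            successors.append([max(g2 + h[nb], f), g2, nb])
        if not successors:
            return None, math.inf
        while True:
            successors.sort()                # best (lowest-f) successor first
            best = successors[0]
            if best[0] > f_limit:
                return None, best[0]         # unwind; report backed-up f
            alternative = successors[1][0] if len(successors) > 1 else math.inf
            result, best[0] = search(best[2], path + [best[2]], best[1],
                                     best[0], min(f_limit, alternative))
            if result is not None:
                return result, best[0]

    result, _ = search(start, [start], 0, h[start], math.inf)
    return result
```

The `best[0] = ...` assignment is the record keeping: a failed subtree keeps its backed-up f-value, so it can be re-entered later if everything else looks worse.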
Memory-bounded Search
Recursive Best-First Search (RBFS)
(Figures: an RBFS trace showing the f-limit passed to recursive calls and the backed-up f-values updated as the recursion unwinds.)
Simplified Memory-bounded A* (SMA*)
Uses all available memory for the search:
it drops nodes from the queue when it runs out of space (those with the highest f-value).
Basic idea:
Do A* until memory runs out, i.e., expand the best (lowest f-value) leaf until memory is full.
Local Search and Optimization
Example applications:
Job-shop scheduling
Telecommunications network design
Vehicle routing
Portfolio management
(Figure: a one-dimensional state-space landscape; elevation corresponds to the objective function value.)
Hill-Climbing Search
Hill climbing on a surface of states:
It is a simple loop that continually moves in the direction of increasing value, that is, uphill.
Hill-Climbing Search
Properties
Terminates when a peak is reached, where no neighbor has a higher value.
It does not maintain a search tree; the current-node data structure records only the state and its objective function value.
It does not look ahead beyond the immediate neighbors of the current state.
Hill-Climbing Search
Algorithm
Note: if a heuristic cost estimate h is used, find the neighbor with the lowest h
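The loop described above can be sketched as steepest-ascent hill climbing; the function signature (`neighbors`/`value` callbacks) is an assumption chosen for generality:

```python
def hill_climbing(initial, neighbors, value):
    """Steepest-ascent hill climbing.

    Repeatedly moves to the highest-valued neighbor; stops when no neighbor
    is better than the current state (a peak, which may be only local).
    neighbors(state) -> list of successor states
    value(state) -> objective function to maximize
    """
    current = initial
    while True:
        candidates = neighbors(current)
        if not candidates:
            return current
        best = max(candidates, key=value)
        if value(best) <= value(current):
            return current            # peak reached: no uphill move exists
        current = best
```

To minimize a heuristic cost h instead, pass `value=lambda s: -h(s)`, which matches the note above about picking the neighbor with the lowest h.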
Hill-Climbing Search
8-queens problem
Hill-Climbing Search
8-queens problem
Successor function:
- moves a single queen to another square in the same column, returning all states generated this way;
- each state therefore has 8 x 7 = 56 successors.
Hill-Climbing Search
8-queens problem
Heuristic function, h(n):
- the number of pairs of queens that are attacking each other (directly or indirectly);
- for the board shown in the figure, h = 17.
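This heuristic can be sketched directly, assuming a state is a tuple where `state[i]` gives the row of the queen in column i (one queen per column, so only row and diagonal attacks are possible):

```python
def attacking_pairs(state):
    """h(n) for n-queens: number of pairs of queens attacking each other,
    directly or indirectly (blocking queens are ignored)."""
    n = len(state)
    pairs = 0
    for i in range(n):
        for j in range(i + 1, n):
            same_row = state[i] == state[j]
            same_diag = abs(state[i] - state[j]) == j - i
            if same_row or same_diag:
                pairs += 1
    return pairs
```

A solution is any state with h = 0; hill climbing tries to drive this count down one move at a time.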
Hill-Climbing Search
8-queens problem
A local minimum with h = 1, reached after only five steps; very nearly a solution.
Hill-Climbing Search
Also called greedy local search:
it tries to grab a good neighbor state without thinking ahead about where to go next.
Problem: depending on the initial state, it can get stuck in local maxima.
Hill-climbing Search
Problems with Hill Climbing
Local maxima (foothills): a local maximum is a peak, i.e., higher than each of its neighboring states, but lower than the global maximum.
(8-queens example: a state where every move of a single queen makes the situation worse.)
Hill-climbing Search
Problems with Hill Climbing
Plateaus: all neighbors look the same (the space has a broad flat region that gives the search algorithm no direction).
Ridges: the search must pass through a sequence of local maxima.
Hill-climbing Search
Overcoming Local Optimum and Plateau
Simulated annealing
Genetic algorithms
Etc.
Genetic Algorithms
Combine two parent states, rather than modifying a single state, to generate successor states.
(e.g., an 8-queens state can be encoded as a digit string such as 16257483, where the i-th digit gives the position of the queen in column i.)
Genetic Algorithms
Genetic operators: Crossover
(Figure: two parent strings exchange substrings around a crossover point to produce offspring.)
Genetic operators: Mutation
(Figure: a randomly chosen position in an offspring string is changed to a new random value.)
Genetic Algorithms
function GENETIC_ALGORITHM(population, FITNESS-FN) returns an individual
  inputs: population, a set of individuals
          FITNESS-FN, a function which determines the quality of the individual
  repeat
    new_population ← empty set
    loop for i from 1 to SIZE(population) do
      x ← RANDOM_SELECTION(population, FITNESS-FN)
      y ← RANDOM_SELECTION(population, FITNESS-FN)
      child ← REPRODUCE(x, y)
      if (small random probability) then child ← MUTATE(child)
      add child to new_population
    population ← new_population
  until some individual is fit enough or enough time has elapsed
  return the best individual in population, according to FITNESS-FN
Genetic Algorithms
function REPRODUCE(x, y) returns an individual
  inputs: x, y, parent individuals
  n ← LENGTH(x)
  c ← random number from 1 to n
  return APPEND(SUBSTRING(x, 1, c), SUBSTRING(y, c+1, n))
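The pseudocode above can be sketched in Python as follows. This is a minimal illustration: it uses a fixed generation budget instead of an open-ended "fit enough" test, fitness-proportional selection via `random.choices`, and caller-supplied fitness and mutation functions:

```python
import random

def reproduce(x, y):
    """Single-point crossover: prefix of x joined with the suffix of y."""
    c = random.randint(1, len(x) - 1)
    return x[:c] + y[c:]

def genetic_algorithm(population, fitness_fn, mutate, generations=100):
    """Minimal genetic algorithm.

    population: list of individuals (e.g. tuples)
    fitness_fn: higher is better; must be positive, since selection
                weights are proportional to fitness
    mutate: function applied to a child with small probability
    """
    for _ in range(generations):
        weights = [fitness_fn(ind) for ind in population]
        new_population = []
        for _ in range(len(population)):
            x, y = random.choices(population, weights=weights, k=2)
            child = reproduce(x, y)
            if random.random() < 0.1:      # small mutation probability
                child = mutate(child)
            new_population.append(child)
        population = new_population
    return max(population, key=fitness_fn)
```

For 8-queens, individuals would be the digit-string states above, with fitness given by the number of nonattacking pairs of queens.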
(Figure: the fitness function for the 8-queens GA is the number of nonattacking pairs of queens.)
The End