HW2
Link to the textbook, sect. 3.5.4:
https://online.vitalsource.com/reader/books/9780134671932/epubcfi/6/64[%3Bvnd.vst.idref%3DP700101715800000000000000000F32C]!/4/2[P700101715800000000000000000F32C]/22[P700101715800000000000000000F3C5]/2/2[P700101715800000000000000000F3C6]/3:28[iss%2Cibl]
Submit the assignment electronically using Canvas. The written part has to be uploaded as a
.pdf file. Please do not submit Word files! Please also avoid submitting handwritten answers,
unless your handwriting is very easy to read and the image is clear.
1. The functions g and h each play a different role in A*. What are those roles? What happens
when you emphasize or de-emphasize one of them by using different weights in f(n)? Consider
the case in which f(n) = (1 - w)g(n) + wh(n), with 0 <= w <= 1. This weighted A* search is
described briefly in the textbook (4th edition, sect. 3.5.4). Be specific and analyze what happens
for different values of w.
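To make the weighted evaluation function concrete, here is a minimal sketch of best-first search with f(n) = (1 - w)g(n) + wh(n). It is an illustration, not part of the assignment: the interface (`neighbors`, `h`) and the tie-breaking are assumptions.

```python
import heapq

def weighted_astar(start, goal, neighbors, h, w=0.5):
    """Best-first search ordered by f(n) = (1 - w)*g(n) + w*h(n).

    w = 0   -> uniform-cost search (f depends only on g)
    w = 0.5 -> ordinary A* (f proportional to g + h)
    w = 1   -> greedy best-first search (f depends only on h)

    neighbors(n) yields (successor, step_cost) pairs; h(n) is the
    heuristic estimate. All names here are illustrative placeholders.
    """
    frontier = [(w * h(start), 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for succ, cost in neighbors(node):
            g2 = g + cost
            if g2 < best_g.get(succ, float("inf")):
                best_g[succ] = g2
                f2 = (1 - w) * g2 + w * h(succ)
                heapq.heappush(frontier, (f2, g2, succ, path + [succ]))
    return None, float("inf")
```

Experimenting with different values of w on a small graph is a good way to see how the expansion order changes as the search shifts from uniform-cost toward greedy behavior.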
2. You want to reduce the memory used by A*. You come up with the following idea: you keep
in the queue only the N best nodes (i.e., the nodes with the lowest costs), for some positive
value of N. When the queue is full and a new node has to be stored, the worst node is deleted
from the queue and removed from consideration.
If an admissible heuristic is used, will the modified algorithm find the optimal solution? Explain
why (or why not).
Are there any additional constraints?
If a perfect heuristic is used, i.e., h(n) = h*(n) for all n, where h*(n) is the cost of the optimal
path from n to the goal, will the modified algorithm find the optimal solution? Explain.
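The proposed modification can be sketched as follows. This is only an illustration of the idea in the question (names and the eviction step are assumptions), useful for trying out small examples before arguing about optimality.

```python
import heapq

def bounded_astar(start, goal, neighbors, h, n_max):
    """A*-style search whose frontier holds at most n_max nodes.

    When the frontier overflows, the entries with the WORST f-values
    are discarded permanently, as the question proposes. This is a
    sketch, not a reference implementation.
    """
    frontier = [(h(start), 0, start, [start])]
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for succ, cost in neighbors(node):
            g2 = g + cost
            heapq.heappush(frontier, (g2 + h(succ), g2, succ, path + [succ]))
        if len(frontier) > n_max:
            # keep only the n_max best entries; the rest are gone for good
            frontier = heapq.nsmallest(n_max, frontier)
            heapq.heapify(frontier)
    return None, float("inf")
```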
3. Answer the following questions on Uniform Cost search briefly but precisely:
Is it possible for Uniform Cost to expand more nodes than Breadth-First search? Feel free to
use an example to support your answer.
Does Uniform Cost search expand more nodes than A*? Why (or why not)?
4. Does the fact that A* is "optimally efficient" mean that A* will never expand more nodes than
any other algorithm?
5. Inspired by the iterative deepening algorithm, you decide to design an "iterative broadening
algorithm". The idea is to start with a breadth limit of 2 and do a depth-first search that expands
at most 2 children at each node. If you fail to find a solution, you restart the search from the
beginning, increasing the breadth limit by 1. Repeat this process until you find a solution.
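The procedure described above can be sketched as follows. This is an illustration only; the `goal_test` and `children` interface, the cycle check, and the `max_breadth` safeguard (to avoid looping forever when no solution exists) are assumptions, not part of the assignment.

```python
def iterative_broadening(root, goal_test, children, max_breadth=None):
    """Depth-first search with a breadth cutoff b that starts at 2 and
    grows by 1 after each failed pass. children(n) returns an ordered
    list of successors; all names are illustrative placeholders.
    """
    def dfs(node, b, visited):
        if goal_test(node):
            return [node]
        for succ in children(node)[:b]:   # expand at most b children
            if succ in visited:
                continue
            sub = dfs(succ, b, visited | {succ})
            if sub is not None:
                return [node] + sub
        return None

    b = 2
    while max_breadth is None or b <= max_breadth:
        path = dfs(root, b, {root})
        if path is not None:
            return path
        b += 1   # restart from scratch with a wider cutoff
    return None
```

Tracing this sketch on a small tree, where the goal is the third child of the root, shows that the first pass (b = 2) fails and the second pass (b = 3) succeeds.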
What advantages, if any, do you see in this algorithm? What shortcomings?
For what type of search spaces do you think this algorithm will be useful?