
6. JUNCTION TREE ALGORITHM
Note that even small graphs can have large joint distribution tables, and running variable elimination separately for each variable is inefficient compared to running the sum-product algorithm. Hence, we want to extend the sum-product algorithm to loopy Markov random fields.

To do so, we must first construct a junction tree in jt_construction.py, in the following steps:
a. Build the junction tree graph structure by forming cliques and creating edges between cliques: _get_jt_clique_and_edges() [3 points]
b. Assign factors in the original graph to the cliques: _get_clique_factors() [1 point]
Note that evidence variables will also be provided, and the graph structure must be updated to account for them. A minimal sketch of the two construction steps above is given below.
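For illustration only, here is a minimal sketch of both construction steps, assuming the input is an undirected networkx.Graph and that factors are stored in a dict keyed by their variable scope. The function names are hypothetical stand-ins for _get_jt_clique_and_edges() and _get_clique_factors() in jt_construction.py, whose actual signatures may differ:

    import itertools
    import networkx as nx

    def get_jt_cliques_and_edges(graph):
        # Triangulate the (possibly loopy) graph, then enumerate the
        # maximal cliques of the resulting chordal graph.
        chordal_graph, _ = nx.complete_to_chordal_graph(graph)
        cliques = [frozenset(c) for c in nx.find_cliques(chordal_graph)]

        # Build a clique graph whose edge weights are separator sizes.
        clique_graph = nx.Graph()
        clique_graph.add_nodes_from(range(len(cliques)))
        for i, j in itertools.combinations(range(len(cliques)), 2):
            separator = cliques[i] & cliques[j]
            if separator:
                clique_graph.add_edge(i, j, weight=len(separator))

        # A maximum-weight spanning tree of the clique graph satisfies the
        # running-intersection property, so its edges form a junction tree.
        jt_edges = list(nx.maximum_spanning_tree(clique_graph).edges())
        return cliques, jt_edges

    def get_clique_factors(cliques, factors):
        # Assign each factor to exactly one clique that covers its scope.
        # `factors` is assumed to map a scope tuple to its table.
        clique_factors = {i: [] for i in range(len(cliques))}
        for scope, table in factors.items():
            for i, clique in enumerate(cliques):
                if set(scope) <= clique:
                    clique_factors[i].append((scope, table))
                    break
        return clique_factors

The maximum-weight spanning tree step is one standard way to obtain a valid junction tree from the clique graph; the assignment's intended construction may differ in detail.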
Once the junction tree has been constructed, in main.py, we will perform:
a. Inference of clique potentials using the sum-product algorithm on the constructed junction tree: _get_clique_potentials() [2 points]
b. Compute the node marginal probabilities, e.g. p(X2 | XE), p(X3 | XE), from the clique potentials using brute-force marginalization: _get_node_marginal_probabilities() [2 points], to retrieve marginal probabilities for all query nodes in the graph. Note that step (b) carries 1 point for more efficient solutions. Hint: cliques have varying sizes (see the sketch below).
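To illustrate the hint: a node marginal only requires one clique that contains the query node, and choosing the smallest such clique keeps the brute-force sum over the remaining variables cheap. A minimal sketch, assuming the calibrated clique potentials are numpy arrays whose axes follow the sorted clique variables (a hypothetical layout; the real one is defined by the assignment code):

    import numpy as np

    def get_node_marginal(node, cliques, clique_potentials):
        # Choose the smallest clique containing the node; summing out a
        # small table is cheaper than summing out a large one.
        idx = min(
            (i for i, c in enumerate(cliques) if node in c),
            key=lambda i: len(cliques[i]),
        )
        variables = sorted(cliques[idx])  # axis k corresponds to variables[k]
        table = clique_potentials[idx]
        # Brute-force marginalization: sum over every axis except `node`'s.
        other_axes = tuple(k for k, v in enumerate(variables) if v != node)
        marginal = table.sum(axis=other_axes)
        return marginal / marginal.sum()  # normalize into p(node | evidence)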

11. PARAMETER LEARNING UNDER COMPLETE OBSERVATIONS
In part 2, we assume that observations are generated using a linear-Gaussian model, i.e.

    x_t = A x_{t-1} + w_t,   w_t ~ N(0, Σ).
To derive the MLE estimate of A, we can equivalently maximize the log-likelihood; the resulting estimator is sketched below.
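Concretely, the log-likelihood is quadratic in A, and setting its gradient with respect to A to zero gives A_hat = (sum_t x_t x_{t-1}^T)(sum_t x_{t-1} x_{t-1}^T)^{-1}, which holds regardless of the noise covariance Σ. A minimal sketch, assuming the fully observed states x_1, ..., x_T are stacked row-wise in X (the function name and data layout are hypothetical):

    import numpy as np

    def mle_transition_matrix(X):
        # X: (T, d) array of fully observed states x_1, ..., x_T.
        prev, curr = X[:-1], X[1:]
        # Solve curr ≈ prev @ A.T in the least-squares sense; this is the
        # MLE of A under Gaussian noise of any fixed covariance.
        A_transposed, *_ = np.linalg.lstsq(prev, curr, rcond=None)
        return A_transposed.T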
