Suppose we have a hidden Markov model with a sequence of states Y1,...,Yn and observations X1,...,Xn. An important quantity for signal processing and inference is the probability of some state Yi given all the observations X1,...,Xn. For HMMs, this quantity can be computed by marginalizing out the remaining variables; in the signal processing community, the resulting procedure is known as the Forward-Backward algorithm. There is a generalization of the Forward-Backward algorithm that computes the analogous quantity for a wider class of probabilistic models, in which the interactions are described by a general tree graph; it is known as the Sum-Product algorithm, Message Passing, or Belief Propagation. Beyond estimating the probability of a variable conditioned on the evidence, versions of this algorithm are used for decoding error-correcting codes, computing free energies in Ising-like models, and providing heuristics for NP-hard problems such as Min-Cut. In this talk I will describe the algorithm, some of its applications, some known results, and associated open problems.
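To make the Forward-Backward computation concrete, here is a minimal sketch for a discrete HMM. The transition matrix A, emission matrix B, initial distribution pi, and the toy observation sequence are all invented for illustration, not taken from the talk.

```python
import numpy as np

def forward_backward(obs, A, B, pi):
    """Return P(Y_i | X_1..X_n) for every i via the Forward-Backward algorithm."""
    n, S = len(obs), len(pi)
    alpha = np.zeros((n, S))  # alpha[i, s] = P(X_1..X_i, Y_i = s)
    beta = np.zeros((n, S))   # beta[i, s]  = P(X_{i+1}..X_n | Y_i = s)
    alpha[0] = pi * B[:, obs[0]]
    for i in range(1, n):               # forward pass
        alpha[i] = (alpha[i - 1] @ A) * B[:, obs[i]]
    beta[-1] = 1.0
    for i in range(n - 2, -1, -1):      # backward pass
        beta[i] = A @ (B[:, obs[i + 1]] * beta[i + 1])
    post = alpha * beta                 # proportional to P(Y_i | X_1..X_n)
    return post / post.sum(axis=1, keepdims=True)

# Toy 2-state example (all numbers hypothetical):
A = np.array([[0.7, 0.3], [0.4, 0.6]])   # state transition probabilities
B = np.array([[0.9, 0.1], [0.2, 0.8]])   # emission probabilities
pi = np.array([0.5, 0.5])                # initial state distribution
gamma = forward_backward([0, 0, 1], A, B, pi)
```

Each row of `gamma` is the posterior distribution over states at that time step; marginalizing this way costs O(n S^2) rather than the O(S^n) of brute-force summation, which is the point of the algorithm.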