Concepts and Algorithms for Computing Maximum Entropy Distributions for Knowledge Bases with Relational Probabilistic Conditionals
Many practical problems involve incomplete and uncertain knowledge about domains in which relations among different objects play an important role. Relational probabilistic conditionals provide an adequate way to express such uncertain, rule-like knowledge of the form "If A holds, then B holds with probability p". Recently, the aggregating semantics for such conditionals has been proposed, which, combined with the principle of maximum entropy (ME), allows probabilistic reasoning in a relational domain. However, so far there have been no specialized algorithms for performing ME reasoning under aggregating semantics in practice.
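To give a rough impression of the kind of optimization problem involved, the following minimal sketch computes a maximum entropy distribution over the four worlds of two propositional atoms A and B, subject to a single conditional constraint P(A and B) = p * P(A). The atoms, the value p = 0.8, and the use of scipy are illustrative assumptions; this is not the aggregating-semantics machinery developed in the book.

import numpy as np
from scipy.optimize import minimize

# Illustrative assumption: one conditional "if A then B with probability p".
p = 0.8

# The four worlds are the truth assignments to (A, B).
worlds = [(a, b) for a in (0, 1) for b in (0, 1)]

def neg_entropy(q):
    q = np.clip(q, 1e-12, 1.0)           # avoid log(0)
    return float(np.sum(q * np.log(q)))  # minimizing this maximizes entropy

constraints = [
    # normalization: probabilities sum to 1
    {"type": "eq", "fun": lambda q: np.sum(q) - 1.0},
    # conditional constraint: P(A and B) - p * P(A) = 0
    {"type": "eq", "fun": lambda q: sum(q[i] * (a * b - p * a)
                                        for i, (a, b) in enumerate(worlds))},
]

q0 = np.full(len(worlds), 1.0 / len(worlds))  # start from the uniform distribution
result = minimize(neg_entropy, q0, bounds=[(0.0, 1.0)] * len(worlds),
                  constraints=constraints)

for (a, b), prob in zip(worlds, result.x):
    print(f"A={a} B={b}: {prob:.4f}")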
The main topic of this publication is the development, implementation, evaluation, and improvement of the first algorithms tailor-made for solving the ME optimization problem under aggregating semantics. We demonstrate how the equivalence of worlds can be exploited to compute the ME distribution more efficiently. We further introduce an algorithm which operates on weighted conditional impacts (WCI) instead of worlds, and we present a novel algorithm which computes the WCI of a conditional by combinatorial means. These algorithms allow us to process larger examples which previously could not be computed at all, and they can also be beneficial for other relational ME semantics.
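As a purely illustrative sketch of why grouping equivalent worlds pays off (it is not the algorithm developed in the book), the following code groups the worlds of a small relational example with one conditional, say (flies(X) | bird(X)), by the number of verifying and falsifying ground instances; worlds in the same group contribute identically to the optimization, so only the group sizes need to be stored. The domain, the predicates, and the grouping criterion are assumptions made for illustration.

from collections import Counter
from itertools import product

# Illustrative assumptions: a three-element domain and one relational
# conditional (flies(X) | bird(X)); each constant is described by the pair
# (bird, flies) of truth values.
domain = ["a", "b", "c"]

def conditional_impact(world):
    # Count ground instances that verify (bird and flies) or
    # falsify (bird and not flies) the conditional.
    verified = sum(1 for c in domain if world[c] == (1, 1))
    falsified = sum(1 for c in domain if world[c] == (1, 0))
    return (verified, falsified)

# Group all 4^|domain| worlds by their conditional impact and record the
# group sizes; these multiplicities act as weights in the optimization.
weighted_impacts = Counter()
for assignment in product([(0, 0), (0, 1), (1, 0), (1, 1)], repeat=len(domain)):
    world = dict(zip(domain, assignment))
    weighted_impacts[conditional_impact(world)] += 1

for (v, f), weight in sorted(weighted_impacts.items()):
    print(f"verified={v}, falsified={f}: {weight} worlds")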