Abstract
Default rules like “If A, then normally B” or probabilistic rules like “If A, then B with probability x” are powerful constructs for knowledge representation. Such rules can be formalized as conditionals, denoted by \((B|A)\) or \((B|A)[x]\), and a conditional knowledge base consists of a set of conditionals. Different semantic models have been proposed for conditional knowledge bases, and the most important reasoning problems are to determine whether a knowledge base is consistent and to determine what a knowledge base entails. We present an overview of systems and implementations our group has been working on for solving reasoning problems in various semantics that have been developed for conditional knowledge bases. These semantics include quantitative, semi-quantitative, and qualitative conditional logics, based both on propositional logic and on first-order logic.
1 Introduction
When studying concepts and methods for nonmonotonic reasoning, actually implemented and operational systems realizing the developed approaches can be very helpful. Besides providing a proof-of-concept, such systems may also yield the basis for practical applications. In recent years, our group at the University of Hagen has been involved in the development of several software systems implementing reasoning tasks for conditional logics. The types of conditional logics covered by these systems comprise pure qualitative logics providing default rules like “If A, then normally B” and also quantitative probabilistic logics with rules like “If A, then B with probability x”, based either on an underlying propositional language or on a first-order language. The purpose of this paper is to provide a brief overview of some of these systems and to illustrate the reasoning tasks they address.
In Sect. 2, after sketching syntax and models of several propositional conditional logics, systems dealing with these logics are presented, both for qualitative logics and for probabilistic logics. Along the same dimensions, Sect. 3 deals with first-order conditionals. In Sect. 4, we conclude and point out future work.
2 Propositional Conditional Logics
2.1 Unquantified and Quantified Conditionals
We start with a propositional language \(\mathcal L\), generated by a finite set \(\varSigma \) of atoms \(a,b,c, \ldots \). The formulas of \(\mathcal L\) will be denoted by uppercase Roman letters \(A,B,C, \ldots \). For conciseness of notation, we may omit the logical and-connective, writing AB instead of \(A \wedge B\), and overlining formulas will indicate negation, i.e. \(\overline{A}\) means \(\lnot A\). Let \(\varOmega \) denote the set of possible worlds over \(\mathcal L\); \(\varOmega \) will be taken here simply as the set of all propositional interpretations over \(\mathcal L\) and can be identified with the set of all complete conjunctions over \(\varSigma \). For \(\omega \in \varOmega \), \(\omega \,\models \, A\) means that the propositional formula \(A \in \mathcal L\) holds in the possible world \(\omega \).
By introducing a new binary operator |, we obtain the set

\( (\mathcal L\mid \mathcal L) = \{ (B|A) \mid A, B \in \mathcal L\} \)

of unquantified conditionals over \(\mathcal L\). A conditional \((B | A)\) formalizes “if A then (normally) B” and establishes a plausible, probable, possible etc. connection between the antecedent A and the consequent B. By attaching a probability value to an unquantified conditional, we obtain the set

\( {(\mathcal L\mid \mathcal L)}^{ prob } = \{ (B|A)[x] \mid A, B \in \mathcal L,\ x \in [0,1]\} \)
of all probabilistic conditionals (or probabilistic rules) over \(\mathcal L\). A knowledge base \(\mathcal R\) is a set of conditionals from \((\mathcal L\mid \mathcal L)\) or from \({(\mathcal L\mid \mathcal L)}^{ prob }\), respectively.
Example 1
(Qualitative conditional knowledge base). Suppose we have the propositional atoms f - flying, b - birds, p - penguins, w - winged animals, k - kiwis. Let the set \({\mathcal R_{ bird }}\) consist of the following five conditionals:

\( (f|b) \): birds normally fly

\( (b|p) \): penguins normally are birds

\( (\overline{f}|p) \): penguins normally do not fly

\( (w|b) \): birds normally have wings

\( (b|k) \): kiwis normally are birds
Example 2
(Probabilistic conditional knowledge base). We use the well-known Léa Sombé example (see e.g. [47]) and consider the three propositional variables \(s\) - being a student, \(y\) - being young, and \(u\) - being unmarried. Students and unmarried people are mostly young. This commonsense knowledge an agent may have can be expressed by the probabilistic knowledge base \({\mathcal R_{ syu }}\) containing the two conditionals:

\( (y|s)[0.9] \): students are young with probability 0.9

\( (y|u)[0.7] \): unmarried people are young with probability 0.7
2.2 Models of Propositional Conditional Knowledge Bases
In order to give appropriate semantics to conditionals, they are usually considered within richer structures such as epistemic states. Besides certain (logical) knowledge, epistemic states also allow the representation of the preferences, beliefs, and assumptions of an intelligent agent. Basically, an epistemic state allows one to compare formulas or worlds with respect to plausibility, possibility, necessity, probability, etc.
In a quantitative framework with probabilistic conditionals, obvious representations of epistemic states are provided by probability distributions \(P: \varOmega \rightarrow [0,1]\) with \(\sum _{\omega \in \varOmega } P(\omega ) = 1\). The probability of a formula \(A \in \mathcal L\) is given by \(P(A) = \sum _{\omega \models A} P(\omega )\), and the probability of a conditional \((B|A) \in (\mathcal L\mid \mathcal L)\) with \(P(A) > 0\) is defined as \(P(B|A) = \displaystyle \frac{P(AB)}{P(A)}\), the corresponding conditional probability. Thus, the satisfaction relation \( \, {\models }^{{ prob }} \, \) between probability distributions over \(\varOmega \) and conditionals from \({(\mathcal L\mid \mathcal L)}^{ prob }\) is defined by:

\( P \,{\models }^{{ prob }}\, (B|A)[x] \quad \text{iff} \quad P(A) > 0 \text{ and } P(B|A) = x \)    (1)
As usual, this relation is extended to a set \(\mathcal R\) of conditionals by defining \( P \, {\models }^{{ prob }} \, \mathcal R\) iff \( P \, {\models }^{{ prob }} \, (B|A)[x] \) for all \((B|A)[x] \in \mathcal R\); for all satisfaction relations considered in the rest of this paper, we will tacitly assume the corresponding extension to sets of conditionals.
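The probabilistic satisfaction relation is straightforward to operationalize. The following Python sketch (all helper names are our own, purely illustrative and not part of any system discussed later) represents worlds as tuples of truth values and a distribution as a mapping from worlds to probabilities:

```python
from itertools import product

# Worlds over atoms s, y, u (as in Example 2) as tuples of truth values;
# a distribution P maps each world to its probability.
atoms = ("s", "y", "u")
worlds = list(product([True, False], repeat=len(atoms)))

def prob(P, formula):
    """P(A) = sum of P(omega) over all worlds omega satisfying A."""
    return sum(p for w, p in P.items() if formula(dict(zip(atoms, w))))

def cond_prob(P, conseq, antec):
    """P(B|A) = P(AB) / P(A), defined only for P(A) > 0."""
    pa = prob(P, antec)
    if pa == 0:
        raise ValueError("P(A) = 0: conditional probability undefined")
    return prob(P, lambda v: antec(v) and conseq(v)) / pa

def satisfies_prob(P, conseq, antec, x, eps=1e-9):
    """P |=^prob (B|A)[x] iff P(A) > 0 and P(B|A) = x."""
    return prob(P, antec) > 0 and abs(cond_prob(P, conseq, antec) - x) < eps

# Under the uniform distribution, P(y|s) = P(sy)/P(s) = 0.25/0.5 = 0.5:
P = {w: 1 / len(worlds) for w in worlds}
print(cond_prob(P, lambda v: v["y"], lambda v: v["s"]))  # 0.5
```

Checking a knowledge base then amounts to calling `satisfies_prob` once per conditional.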
Example 3
For the propositional language used in Example 2, let \(P^*\) be the probability distribution given by:
It is easy to check that \( P^* \,{\models }^{{ prob }}\, {\mathcal R_{ syu }}\); for instance, summing up the probabilities \(P^*(\omega )\) of the worlds satisfying \(sy\) and of those satisfying \(s\) yields the conditional probability \(P^*(y|s) = P^*(sy)/P^*(s)\), which is exactly the probability demanded by the first conditional, and thus \(P^*\) satisfies it.
Various types of models have been proposed to interpret qualitative conditionals \((B|A)\) adequately within a logical system (cf. e.g. [39]). One of the most prominent approaches is the system-of-spheres model of Lewis [38] which makes use of a notion of similarity between possible worlds. Other, more fine-grained semantics for conditionals use numbers to compare different degrees of “plausibility” between the verification and the falsification of a conditional. In these qualitative frameworks, a conditional (B|A) is accepted (or verified), if its confirmation, AB, is more plausible, possible etc. than its refutation, \(A\overline{B}\); a suitable degree of acceptance is calculated from the degrees associated with AB and \(A\overline{B}\). Here, two of the most popular approaches to represent epistemic states are ordinal conditional functions, OCFs, (also called ranking functions) [49, 50], and possibility distributions [11, 14], assigning degrees of plausibility, or of possibility, respectively, to formulas and possible worlds.
In the following, we will focus on OCFs [49]. An OCF \(\kappa \) is a function \( \kappa : \varOmega \rightarrow \mathbb {N}\cup \{\infty \} \) with \(\kappa ^{-1}(0) \ne \emptyset \). The smaller \(\kappa (\omega )\), the less surprising or the more plausible the world \(\omega \). For formulas \(A \in \mathcal L\), \(\kappa (A)\) is given by:

\( \kappa (A) = \min \{ \kappa (\omega ) \mid \omega \models A \} \), where \(\min \emptyset = \infty \).
The satisfaction relation between OCFs and qualitative conditionals from \({(\mathcal L\mid \mathcal L)}\), denoted by \(\models ^{ ocf }\), is defined by:

\( \kappa \,\models ^{ ocf }\, (B|A) \quad \text{iff} \quad \kappa (AB) < \kappa (A\overline{B}) \)
Thus, a conditional \((B|A)\) is accepted by the ordinal conditional function \(\kappa \) iff its confirmation AB is less surprising than its refutation \(A\overline{B}\).
Example 4
For the propositional language used in Example 1, let \(\kappa \) be the OCF given in Fig. 1. For the conditional \((\overline{f}|p) \in {\mathcal R_{ bird }}\), we have \(\kappa (p\overline{f}) = 1 < 2 = \kappa (pf)\) and thus \( \kappa \,\models ^{ ocf }(\overline{f}|p) \). Similarly, it is easy to check that \(\kappa \) also accepts the other conditionals in \({\mathcal R_{ bird }}\), implying \(\kappa \, \models ^{ ocf }{\mathcal R_{ bird }}\).
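The OCF acceptance condition is equally easy to operationalize. The following Python sketch checks \(\kappa \,\models ^{ ocf }(B|A)\) via \(\kappa (AB) < \kappa (A\overline{B})\); the concrete ranking used here is a hand-crafted illustration (it is not necessarily the \(\kappa \) of Fig. 1, and all names are our own):

```python
from itertools import product

# Worlds over the atoms of Example 1; kappa assigns a degree of
# surprise to each world.  The ranking below is a hypothetical,
# hand-crafted illustration.
atoms = ("f", "b", "p", "w", "k")
worlds = [dict(zip(atoms, vals)) for vals in product([True, False], repeat=len(atoms))]

def kappa(w):
    rank = 0
    if w["b"] and not w["f"] and not w["p"]:
        rank += 1      # non-flying birds that are not penguins are surprising
    if w["p"]:
        rank += 1      # penguins are exceptional birds
        if w["f"]:
            rank += 1  # flying penguins are even more surprising
        if not w["b"]:
            rank += 2  # non-bird penguins are most surprising
    if w["b"] and not w["w"]:
        rank += 1      # wingless birds are surprising
    if w["k"] and not w["b"]:
        rank += 1      # non-bird kiwis are surprising
    return rank

def kappa_formula(formula):
    """kappa(A) = min over worlds satisfying A; infinity if A is unsatisfiable."""
    return min((kappa(w) for w in worlds if formula(w)), default=float("inf"))

def accepts(conseq, antec):
    """kappa |=^ocf (B|A) iff kappa(AB) < kappa(A not-B)."""
    return (kappa_formula(lambda w: antec(w) and conseq(w))
            < kappa_formula(lambda w: antec(w) and not conseq(w)))

print(accepts(lambda w: not w["f"], lambda w: w["p"]))  # True: penguins do not fly
```

Note that \(\kappa ^{-1}(0) \ne \emptyset \) holds here, since e.g. the world in which every atom is false has rank 0.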
2.3 Systems for Reasoning with Propositional Conditional Knowledge Bases
Reasoning with respect to a conditional knowledge base \(\mathcal R\) means to determine what \(\mathcal R\) entails. While in classical logic, entailment is defined with respect to all models, for probabilistic conditional knowledge bases this approach is very restrictive since it may yield only uninformative answers. Therefore, entailment may be defined with respect to a set of some best or preferred models.
In probabilistic conditional logic, the principle of maximum entropy (ME principle) has been advocated [28, 30, 40, 41]. While in general, each model of a probabilistic conditional knowledge base \(\mathcal {R}\) determines a particular way of extending and completing the probabilistic knowledge expressed in \(\mathcal {R}\) to a full probability distribution, the ME principle selects the distribution that accepts \(\mathcal {R}\) and that is as unbiased as possible. Formally, given a knowledge base \(\mathcal {R} = \{(B_1|A_1)[x_1], \ldots , (B_n|A_n)[x_n]\}\), \( ME (\mathcal {R})\) is the unique probability distribution that satisfies all constraints specified by \(\mathcal {R}\) and has the highest entropy \( \mathcal {H}(P) = - \sum _{\omega \in \varOmega } P(\omega ) \log P(\omega ) \) among all models P of \(\mathcal {R}\):

\( ME (\mathcal {R}) = \arg \max _{P \,{\models }^{{ prob }}\, \mathcal {R}} \mathcal {H}(P) \)    (2)
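As an illustration of the ME principle (independent of any of the systems discussed below), the following Python sketch approximates the maximum entropy distribution for a toy knowledge base with the single constraint \(P(a) = 0.3\) over two atoms; the grid search is purely didactic, real systems solve the optimization problem numerically:

```python
from math import log

# Toy knowledge base over atoms a, b with the single probabilistic
# constraint P(a) = 0.3; the four worlds are ab, a(not b), (not a)b,
# (not a)(not b), with probabilities (p_ab, p_anb, p_nab, p_nanb).
def entropy(P):
    return -sum(p * log(p) for p in P if p > 0)

step = 0.01
best = None
for i in range(31):                      # P(ab) ranges over 0.00 .. 0.30
    p_ab = i * step
    p_anb = 0.3 - p_ab                   # forces P(a) = P(ab) + P(a not-b) = 0.3
    for j in range(71):                  # P((not a)b) ranges over 0.00 .. 0.70
        p_nab = j * step
        p_nanb = max(0.7 - p_nab, 0.0)   # remaining mass, so P sums to 1
        P = (p_ab, p_anb, p_nab, p_nanb)
        if best is None or entropy(P) > entropy(best):
            best = P

# The ME distribution spreads mass uniformly within each constraint:
print([round(p, 2) for p in best])  # [0.15, 0.15, 0.35, 0.35]
```

The result illustrates the "as unbiased as possible" reading of maximum entropy: within the constraint \(P(a) = 0.3\), and within the unconstrained remainder, probability mass is distributed uniformly.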
Reasoning in probabilistic conditional logic by employing the principle of maximum entropy [28, 40] requires solving the numerical optimization problem given in Eq. (2). MEcore [19] is a software system implementing maximum entropy reasoning. Unlike the expert system shell SPIRIT [48], MEcore does not employ junction-tree modelling but a straightforward representation of the complete probability distribution; its focus is on flexibly supporting different basic knowledge and belief management functions like revising or updating probabilistic beliefs, or hypothetical reasoning in what-if mode. In addition, there is a component checking the consistency of a knowledge base \(\mathcal R\), i.e., checking whether the set of models of \(\mathcal R\) is non-empty. A query asking for the probability of \((B|A)\) in the context of \(\mathcal R\) is answered with respect to the uniquely defined maximum entropy model \( ME (\mathcal {R})\), i.e., \((B|A)[x]\) is ME-entailed from \(\mathcal R\) iff \( ME (\mathcal {R})(B|A) = x\). The distribution \(P^*\) given in Example 3 is in fact the ME distribution computed by MEcore for \({\mathcal R_{ syu }}\), i.e., we have \(P^*= ME ({\mathcal R_{ syu }})\). MEcore can be controlled by a text command interface or by script files containing command sequences. It features an expressive command language which allows one, e.g., to manipulate knowledge bases and to automate sequences of updates and revisions. Besides this, a Java software interface allows MEcore to be integrated into other programs. In [3, 33], the functionalities of MEcore are illustrated in applications of ME modelling and reasoning in the medical domain.
The methodological theory of conditionals developed by Kern-Isberner [29, 30] makes it possible to describe the aim of knowledge discovery in a very general sense: to reveal structures of knowledge which can be seen as structural relationships represented by conditionals. In this setting, knowledge discovery is understood as a process which is inverse to inductive knowledge representation. By applying this theory, an algorithm that computes sets of propositional probabilistic conditionals from distributions was developed and implemented in the system CondorCKD [22, 23, 34] using the functional programming language Haskell.
For propositional qualitative conditional logic using OCFs, p-entailment [25] is an inference relation defined with respect to all OCF models of a knowledge base \(\mathcal R\): If A, B are formulas, then A p-entails B in the context of \(\mathcal R\) iff \(\kappa \,\models ^{ ocf }\, (B|A)\) for all \(\kappa \) such that \(\kappa \,\models ^{ ocf }\,\mathcal R\). System P [1] provides a kind of gold standard for plausible, nonmonotonic inferences, and in [13] it is shown that, given a knowledge base \(\mathcal R\), system P inference is the same as p-entailment.
There are also inference relations which are defined with respect to specific OCFs obtained inductively from a knowledge base \(\mathcal R\). System Z [42] is based upon the ranking function which is the unique minimal OCF that accepts \(\mathcal R\); this ranking function is obtained from an ordered partition \((\mathcal R_0,\ldots ,\mathcal R_m)\) of \(\mathcal R\) defined by the notion of tolerance [42]. Other OCFs accepting \(\mathcal R\) that have favourable inference properties are c-representations [30, 31]. A c-representation of \(\mathcal R\) is a ranking function \(\kappa \) constructed from integer impacts \(\eta _{i}\in \mathbb {N}_0\) assigned to each conditional \((B_i|A_i) \in \mathcal R\) such that \(\kappa \) accepts \(\mathcal R\) and is given by [31]:

\( \kappa (\omega ) = \sum _{1 \le i \le n,\; \omega \,\models \, A_i \overline{B_i}} \eta _i \)    (3)
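The construction of a c-representation from impacts can be sketched directly from this definition. In the following Python fragment, the conditionals and the impact values \(\eta _i\) are hand-picked for a small penguin example (they are illustrative choices of ours, not output of any of the systems described here):

```python
from itertools import product

# Conditionals (B_i|A_i) given as pairs (antecedent, consequent) of
# predicates on worlds; the impacts eta_i are hand-picked values.
atoms = ("f", "b", "p")
worlds = [dict(zip(atoms, v)) for v in product([True, False], repeat=len(atoms))]

conditionals = [
    (lambda w: w["b"], lambda w: w["f"]),      # (f|b): birds fly
    (lambda w: w["p"], lambda w: not w["f"]),  # (not-f|p): penguins don't fly
    (lambda w: w["p"], lambda w: w["b"]),      # (b|p): penguins are birds
]
etas = [1, 2, 2]  # impacts eta_i

def kappa(w):
    """Eq.-(3)-style ranking: sum the impacts of all conditionals falsified in w."""
    return sum(eta for (A, B), eta in zip(conditionals, etas)
               if A(w) and not B(w))

def kappa_formula(formula):
    return min((kappa(w) for w in worlds if formula(w)), default=float("inf"))

def accepts(antec, conseq):
    return (kappa_formula(lambda w: antec(w) and conseq(w))
            < kappa_formula(lambda w: antec(w) and not conseq(w)))

# The induced ranking accepts every conditional it was built from:
print(all(accepts(A, B) for A, B in conditionals))  # True
```

With these impacts, for instance, a flying penguin falsifies the second conditional and is therefore ranked strictly less plausible than a non-flying penguin bird.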
Condor@AsmL [6] is a software system that implements automated reasoning with qualitative default rules employing c-representations. Based on a characterization theorem for c-representations and c-revisions and an approach to compute c-representations and c-revisions using the tolerance-induced partition of \(\mathcal R\) [31], inference is done with respect to the OCF thus obtained from \(\mathcal R\). Condor@AsmL provides functionalities for advanced knowledge management tasks like belief revision and update or diagnosis and hypothetical what-if-analysis for qualitative conditionals. Condor@AsmL implements the abstract Condor specification given in [4] and was developed in AsmL [26], allowing for a high-level implementation that minimizes the gap between the mathematical specification of the underlying concepts and the executable code and supports the formal verification of the implemented system [5].
While Condor@AsmL computes a c-representation for any \(\mathcal R\) that is consistent, this c-representation may not be minimal. Unlike in system Z where there is a unique minimal OCF, there may be more than one minimal c-representation. In [7], the set of all c-representations for \(\mathcal R\) is specified as the set of all solutions of a constraint satisfaction problem \( CR (\mathcal R)\), and a high-level declarative approach using constraint logic programming (CLP) techniques for solving the constraint satisfaction problem \( CR (\mathcal R)\) is presented. In particular, the approach developed in [7] supports the generation of all minimal solutions; these minimal solutions are of special interest as they provide a preferred basis for model-based inference from \(\mathcal R\). Moreover, different notions of minimality are investigated and the flexibility of the approach is demonstrated by showing how alternative minimality concepts can be taken into account by slight modifications of the CLP implementation. In [2], a skeptical inference relation taking all c-representations of \(\mathcal R\) into account is introduced, and it is demonstrated that it can be implemented as a constraint satisfaction problem that extends \( CR (\mathcal R)\).
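For small knowledge bases, the constraint satisfaction view can be illustrated by brute force: enumerate impact vectors, keep those whose induced ranking accepts every conditional, and select the minimal ones. The following Python sketch uses the sum of impacts as the minimality criterion (just one of the notions of minimality discussed in [7]); all names and the bound on the search space are our own:

```python
from itertools import product

# Brute-force sketch of the constraint satisfaction view of
# c-representations: enumerate small integer impact vectors, keep
# those whose induced ranking accepts every conditional in R, and
# collect the sum-minimal ones.  Conditionals are (antecedent,
# consequent) pairs over the penguin atoms f, b, p.
atoms = ("f", "b", "p")
worlds = [dict(zip(atoms, v)) for v in product([True, False], repeat=len(atoms))]
R = [
    (lambda w: w["b"], lambda w: w["f"]),      # (f|b)
    (lambda w: w["p"], lambda w: not w["f"]),  # (not-f|p)
    (lambda w: w["p"], lambda w: w["b"]),      # (b|p)
]

def accepts_all(etas):
    """Does the ranking induced by the impact vector accept all of R?"""
    def kappa(w):
        return sum(e for (A, B), e in zip(R, etas) if A(w) and not B(w))
    def k_min(f):
        return min((kappa(w) for w in worlds if f(w)), default=float("inf"))
    return all(k_min(lambda w: A(w) and B(w)) < k_min(lambda w: A(w) and not B(w))
               for A, B in R)

solutions = [e for e in product(range(4), repeat=len(R)) if accepts_all(e)]
best_sum = min(sum(e) for e in solutions)
minimal = [e for e in solutions if sum(e) == best_sum]
print(minimal)  # [(1, 2, 2)]
```

Dedicated CLP solvers as used in [7] of course scale far beyond such an enumeration, but the sketch shows why more than one minimal solution can exist in general: minimality is a property of the solution set, not a by-product of the construction.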
3 First-Order Conditional Logics
As an illustration for first-order probabilistic conditionals, consider the following example, adapted from [12], modelling the relationships among elephants in a zoo and their keepers. Elephants usually like their keepers, except for keeper Fred. But elephant Clyde gets along with everyone, and therefore he also likes Fred. The knowledge base \(\mathcal {R} _{ ek }\) consists of the following conditionals:

\( \textit{ek}_{1} = ( likes (E,K) \mid elephant (E) \wedge keeper (K))[0.9] \)

\( \textit{ek}_{2} = ( likes (E, fred ) \mid elephant (E))[0.05] \)

\( \textit{ek}_{3} = ( likes ( clyde , fred ) \mid \top )[0.85] \)
Conditional \(\textit{ek}_{1}\) models statistical knowledge about the general relationship between elephants and their keepers (“elephants like their keeper with probability 0.9”), whereas conditional \(\textit{ek}_{2}\) represents knowledge about the exceptional keeper Fred and his relationship to elephants in general (“elephants like keeper Fred only with probability 0.05”). Conditional \(\textit{ek}_{3}\) models subjective belief about the relationship between the elephant Clyde and keeper Fred (“elephant Clyde likes keeper Fred with probability 0.85”). From a common-sense point of view, the knowledge base \(\mathcal {R} _{ ek }\) makes perfect sense: conditional \(\textit{ek}_{2}\) is an exception of \(\textit{ek}_{1}\), and \(\textit{ek}_{3}\) is an exception of \(\textit{ek}_{2}\).
However, assigning a formal semantics to \(\mathcal {R} _{ ek }\) is not straightforward. For instance, for transforming the propositional approach employed in Eq. (1) to the relational case with free variables as in \(\mathcal {R} _{ ek }\), the exact role of the variables has to be specified. While there are various approaches dealing with a combination of probabilities with a first-order language (e.g. [24, 27, 35, 36]) here we focus on two semantics for probabilistic relational conditionals, the aggregating semantics [36] proposed by Kern-Isberner and the grounding semantics employed in the logic FO-PCL [21].
While the two approaches are related in the sense that they refer to a (finite) set of constants when interpreting the variables in the conditionals, there is also a major difference. FO-PCL requires all groundings of a conditional to have the same probability x given in the conditional, and in general, FO-PCL needs to restrict the possible instantiations for the variables occurring in a conditional by providing constraint formulas like \(U \ne V\) or \(U \ne a\) in order to avoid inconsistencies. Thus, while the aggregating semantics uses probabilistic conditionals \((B|A)[x]\) with relational formulas A, B, these conditionals are extended by a constraint formula C to \(\langle (B|A)[x],C\rangle \) in FO-PCL. The models of a knowledge base \(\mathcal R\) consisting of such first-order probabilistic conditionals are again probability distributions over the possible worlds, where a possible world is a subset of the Herbrand base induced by the predicates and constants used for \(\mathcal R\).
The satisfaction relation \(\models _{\otimes }\) for FO-PCL is defined by

\( P \models _{\otimes } \langle (B | A) [x], C\rangle \quad \text{iff} \quad P(\theta (B) \mid \theta (A)) = x \ \text{for all } \theta \in \varTheta ^{ adm } (\langle (B | A) [x], C\rangle ) \)    (4)
where \(\varTheta ^{ adm } (\langle (B | A) [x], C\rangle )\) is the set of all admissible ground substitutions \(\theta \) for the given conditional, i.e. where \(\theta (C)\) evaluates to true. Thus, a probability distribution P \(\otimes \)-satisfies a conditional \(\langle (B | A) [x], C\rangle \) if it satisfies each admissible individual instantiation of it. In contrast, the satisfaction relation \(\models _{\odot }\) for aggregating semantics [36] is less strict with respect to probabilities of ground instances, since it is capable of balancing the probabilities of ground instances in order to ensure the probability x given by a conditional; \(\models _{\odot }\) is defined by

\( P \models _{\odot } (B | A) [x] \quad \text{iff} \quad \dfrac{\sum _{\theta \in \varTheta ((B | A) [x])} P(\theta (A B))}{\sum _{\theta \in \varTheta ((B | A) [x])} P(\theta (A))} = x \)    (5)
where \(\varTheta ((B | A) [x])\) is the set of all ground substitutions of \((B | A) [x]\).
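The balancing effect of the aggregating semantics can be seen in a small computation. The following Python sketch (domain and all names hypothetical) sums the probabilities of ground instances of the antecedent and of the antecedent together with the consequent across all ground substitutions, as in the definition of \(\models _{\odot }\):

```python
from itertools import chain, combinations

# Tiny hypothetical domain: two elephants, one keeper.  A world is the
# set of ground likes-atoms that hold in it; P is a distribution over
# all such worlds (uniform here, purely for illustration).
elephants = ["clyde", "dumbo"]
ground_atoms = [("likes", e, "fred") for e in elephants]

def powerset(xs):
    return chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))

worlds = [frozenset(w) for w in powerset(ground_atoms)]
P = {w: 1 / len(worlds) for w in worlds}

def aggregated_prob(conseq_atom_of, antec_holds, substitutions, P):
    """sum_theta P(theta(AB)) / sum_theta P(theta(A)), as in Eq. (5)."""
    num = den = 0.0
    for theta in substitutions:       # ground substitutions for the variables
        for w, p in P.items():
            if antec_holds(theta, w):
                den += p
                if conseq_atom_of(theta) in w:
                    num += p
    return num / den

# For (likes(E, fred) | elephant(E)), the antecedent holds for every
# elephant, so the aggregated probability averages over the instances:
prob = aggregated_prob(lambda e: ("likes", e, "fred"),
                       lambda e, w: True,
                       elephants, P)
print(round(prob, 2))  # 0.5
```

A distribution can thus \(\odot \)-satisfy a conditional even when individual ground instances have different conditional probabilities, as long as they balance out to x.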
The principle of maximum entropy used in the propositional setting (Equation (2)) has been extended to first-order knowledge bases for aggregating semantics and for FO-PCL [21, 36] by defining

\( ME _{\bullet }(\mathcal {R}) = \arg \max _{P \,\models _{\bullet }\, \mathcal {R}} \mathcal {H}(P) \)    (6)
where \(\bullet \in \{\otimes , \odot \}\). Since for FO-PCL grounding and for aggregating semantics the set of models is convex, the optimization problem in (6) yields a unique solution for every consistent \(\mathcal {R}\). Thus, analogously to the propositional case, reasoning can be done with respect to the maximum entropy model \( ME _{\bullet }(\mathcal {R})\).
Software components for these inference tasks have been implemented in KReator (Footnote 1) [20], an integrated development environment for representing, reasoning, and learning with relational probabilistic knowledge. In particular, KReator provides specific plugins for an optimized computation of the ME model under aggregating semantics (cf. [16–18]) that exploits the conditional structure of \(\mathcal R\) and its induced equivalence classes [30, 37]. The KReator plugin for FO-PCL semantics employs a simplification of the ME model computation by transforming \(\mathcal R\) into an equivalent knowledge base \(\mathcal R'\) that is parametrically uniform [8–10, 21]. Furthermore, algorithms for solving various reasoning problems for probabilistic conditional logics that also take inconsistent information into account have been implemented in the Log4KR library (Footnote 2) [43–46].
In [37], ranking functions for qualitative first-order conditionals are introduced, and in [32], a system Z-like approach for first-order default reasoning is developed. Unlike propositional system Z, the first-order approach of [32] may yield more than one minimal solution; an implementation of the approach in [32] using Log4KR is given in [15].
4 Conclusions and Future Work
Conditionals play a major role in logic-based knowledge representation and reasoning. In this paper, we gave a brief survey on different versions of conditional logics and illustrated corresponding reasoning tasks addressed by software systems that have been implemented within our research projects in recent years. Our current work includes the further exploitation of conditional structures for relational probabilistic inference under maximum entropy, and the investigation of the precise properties of inference with c-representations using OCFs in the propositional case and with the system Z-like approach in the relational case.
Notes
- 1. KReator can be found at http://kreator-ide.sourceforge.net/.
- 2.
References
Adams, E.W.: The Logic of Conditionals: An Application of Probability to Deductive Logic. Synthese Library. Springer Science+Business Media, Dordrecht (1975)
Beierle, C., Eichhorn, C., Kern-Isberner, G.: Skeptical inference based on c-representations and its characterization as a constraint satisfaction problem. In: Gyssens, M., Simari, G. (eds.) FoIKS 2016. LNCS, vol. 9161, pp. 65–82. Springer, Switzerland (2016)
Beierle, C., Finthammer, M., Potyka, N., Varghese, J., Kern-Isberner, G.: A case study on the application of probabilistic conditional modelling and reasoning to clinical patient data in neurosurgery. In: van der Gaag, L.C. (ed.) ECSQARU 2013. LNCS, vol. 7958, pp. 49–60. Springer, Heidelberg (2013)
Beierle, C., Kern-Isberner, G.: Modelling conditional knowledge discovery and belief revision by abstract state machines. In: Börger, E., Gargantini, A., Riccobene, E. (eds.) ASM 2003. LNCS, vol. 2589, pp. 186–203. Springer, Heidelberg (2003)
Beierle, C., Kern-Isberner, G.: A verified AsmL implementation of belief revision. In: Börger, E., Butler, M., Bowen, J.P., Boca, P. (eds.) ABZ 2008. LNCS, vol. 5238, pp. 98–111. Springer, Heidelberg (2008)
Beierle, C., Kern-Isberner, G., Koch, N.: A high-level implementation of a system for automated reasoning with default rules (system description). In: Armando, A., Baumgartner, P., Dowek, G. (eds.) IJCAR 2008. LNCS (LNAI), vol. 5195, pp. 147–153. Springer, Heidelberg (2008)
Beierle, C., Kern-Isberner, G., Södler, K.: A declarative approach for computing ordinal conditional functions using constraint logic programming. In: Tompits, H., Abreu, S., Oetsch, J., Pührer, J., Seipel, D., Umeda, M., Wolf, A. (eds.) INAP/WLP 2011. LNCS, vol. 7773, pp. 168–185. Springer, Heidelberg (2013)
Beierle, C., Krämer, A.: Achieving parametric uniformity for knowledge bases in a relational probabilistic conditional logic with maximum entropy semantics. Ann. Math. Artif. Intell. 73(1–2), 5–45 (2015)
Beierle, C., Kuche, S., Finthammer, M., Kern-Isberner, G.: A software system for the computation, visualization, and comparison of conditional structures for relational probabilistic knowledge bases. In: Proceedings of the Twenty-Eighth International Florida Artificial Intelligence Research Society Conference (FLAIRS 2015), pp. 558–563. AAAI Press, Menlo Park (2015)
Beierle, C., Finthammer, M., Kern-Isberner, G.: Relational probabilistic conditionals and their instantiations under maximum entropy semantics for first-order knowledge bases. Entropy 17(2), 852–865 (2015)
Benferhat, S., Dubois, D., Prade, H.: Representing default rules in possibilistic logic. In: Proceedings of the 3rd International Conference on Principles of Knowledge Representation and Reasoning (KR 1992), pp. 673–684 (1992)
Delgrande, J.: On first-order conditional logics. Artif. Intell. 105, 105–137 (1998)
Dubois, D., Prade, H.: Conditional objects as nonmonotonic consequence relations: main results. In: Principles of Knowledge Representation and Reasoning: Proceedings of the Fourth International Conference (KR 1994), pp. 170–177. Morgan Kaufmann Publishers, San Francisco (1994)
Dubois, D., Prade, H.: Possibility Theory and Its Applications: Where Do We Stand? In: Kacprzyk, J., Pedrycz, W. (eds.) Springer Handbook of Computational Intelligence, pp. 31–60. Springer, Heidelberg (2015)
Falke, T.: Computation of ranking functions for knowledge bases with relational conditionals. M.Sc. Thesis, Dept. of Computer Science, University of Hagen, Germany (2015)
Finthammer, M., Beierle, C.: Using equivalences of worlds for aggregation semantics of relational conditionals. In: Glimm, B., Krüger, A. (eds.) KI 2012. LNCS, vol. 7526, pp. 49–60. Springer, Heidelberg (2012)
Finthammer, M., Beierle, C.: A two-level approach to maximum entropy model computation for relational probabilistic logic based on weighted conditional impacts. In: Straccia, U., Calì, A. (eds.) SUM 2014. LNCS, vol. 8720, pp. 162–175. Springer, Heidelberg (2014)
Finthammer, M., Beierle, C.: Towards a more efficient computation of weighted conditional impacts for relational probabilistic knowledge bases under maximum entropy semantics. In: Hölldobler, S., Krötzsch, M., Peñaloza, R., Rudolph, S. (eds.) KI 2015. LNCS, vol. 9324, pp. 72–86. Springer, Heidelberg (2015). doi:10.1007/978-3-319-24489-1_6
Finthammer, M., Beierle, C., Berger, B., Kern-Isberner, G.: Probabilistic reasoning at optimum entropy with the MEcore system. In: Lane, H.C., Guesgen, H.W. (eds.) Proceedings 22nd International FLAIRS Conference, FLAIRS 2009, pp. 535–540. AAAI Press, Menlo Park (2009)
Finthammer, M., Thimm, M.: An integrated development environment for probabilistic relational reasoning. Logic J. IGPL 20(5), 831–871 (2012)
Fisseler, J.: First-order probabilistic conditional logic and maximum entropy. Logic J. IGPL 20(5), 796–830 (2012)
Fisseler, J., Kern-Isberner, G., Beierle, C.: Learning uncertain rules with CondorCKD. In: Proceedings 20th International FLAIRS Conference, FLAIRS 2007. AAAI Press, Menlo Park (2007)
Fisseler, J., Kern-Isberner, G., Beierle, C., Koch, A., Müller, C.: Algebraic knowledge discovery using Haskell. In: Hanus, M. (ed.) PADL 2007. LNCS, vol. 4354, pp. 80–93. Springer, Heidelberg (2007). http://dx.doi.org/10.1007/978-3-540-69611-7_5
Getoor, L., Taskar, B. (eds.): Introduction to Statistical Relational Learning. MIT Press, Cambridge (2007)
Goldszmidt, M., Pearl, J.: Qualitative probabilities for default reasoning, belief revision, and causal modeling. Artif. Intell. 84(1–2), 57–112 (1996)
Gurevich, Y., Rossman, B., Schulte, W.: Semantic essence of AsmL. Theoret. Comput. Sci. 343(3), 370–412 (2005)
Halpern, J.Y.: Reasoning About Uncertainty. MIT Press, Cambridge (2005)
Kern-Isberner, G.: Characterizing the principle of minimum cross-entropy within a conditional-logical framework. Artif. Intell. 98, 169–208 (1998)
Kern-Isberner, G.: Solving the inverse representation problem. In: Proceedings 14th European Conference on Artificial Intelligence. ECAI 2000, pp. 581–585. IOS Press, Berlin (2000)
Kern-Isberner, G.: Conditionals in Nonmonotonic Reasoning and Belief Revision. LNCS (LNAI), vol. 2087. Springer, Heidelberg (2001)
Kern-Isberner, G.: A thorough axiomatization of a principle of conditional preservation in belief revision. Annals Math. Artif. Intell. 40(1–2), 127–164 (2004)
Kern-Isberner, G., Beierle, C.: A system Z-like approach for first-order default reasoning. In: Eiter, T., Strass, H., Truszczyński, M., Woltran, S. (eds.) Advances in Knowledge Representation. LNCS, vol. 9060, pp. 81–95. Springer, Heidelberg (2015)
Kern-Isberner, G., Beierle, C., Finthammer, M., Thimm, M.: Comparing and evaluating approaches to probabilistic reasoning: theory, implementation, and applications. Trans. Large-Scale Data Knowl.-Centered Syst. 6, 31–75 (2012)
Kern-Isberner, G., Fisseler, J.: Knowledge discovery by reversing inductive knowledge representation. In: Proceedings of the Ninth International Conference on the Principles of Knowledge Representation and Reasoning, KR-2004, pp. 34–44. AAAI Press (2004)
Kern-Isberner, G., Lukasiewicz, T.: Combining probabilistic logic programming with the power of maximum entropy. Artif. Intell. 157(1–2), 139–202 (2004). Special Issue on Nonmonotonic Reasoning
Kern-Isberner, G., Thimm, M.: Novel semantical approaches to relational probabilistic conditionals. In: Lin, F., Sattler, U., Truszczynski, M. (eds.) Proceedings Twelfth International Conference on the Principles of Knowledge Representation and Reasoning, KR 2010, pp. 382–391. AAAI Press (2010)
Kern-Isberner, G., Thimm, M.: A ranking semantics for first-order conditionals. In: De Raedt, L., Bessiere, C., Dubois, D., Doherty, P., Frasconi, P., Heintz, F., Lucas, P. (eds.) Proceedings 20th European Conference on Artificial Intelligence, ECAI-2012, pp. 456–461. No. 242 in Frontiers in Artificial Intelligence and Applications. IOS Press (2012)
Lewis, D.: Counterfactuals. Harvard University Press, Cambridge (1973)
Nute, D.: Topics in Conditional Logic. D. Reidel Publishing Company, Dordrecht (1980)
Paris, J.: The Uncertain Reasoner’s Companion - A Mathematical Perspective. Cambridge University Press, Cambridge (1994)
Paris, J., Vencovska, A.: In defence of the maximum entropy inference process. Int. J. Approximate Reasoning 17(1), 77–103 (1997)
Pearl, J.: System Z: A natural ordering of defaults with tractable applications to nonmonotonic reasoning. In: Proceedings of the 3rd Conference on Theoretical Aspects of Reasoning About Knowledge (TARK 1990), pp. 121–135. Morgan Kaufmann Publ. Inc., San Francisco (1990)
Potyka, N.: Linear programs for measuring inconsistency in probabilistic logics. In: Proceedings KR 2014, pp. 568–578. AAAI Press (2014)
Potyka, N.: Solving Reasoning Problems for Probabilistic Conditional Logics with Consistent and Inconsistent Information. Ph.D. thesis, Fernuniversität Hagen, Germany (2015)
Potyka, N., Thimm, M.: Consolidation of probabilistic knowledge bases by inconsistency minimization. In: Proceedings ECAI 2014, pp. 729–734. IOS Press (2014)
Potyka, N., Thimm, M.: Probabilistic reasoning with inconsistent beliefs using inconsistency measures. In: Proceedings of the International Joint Conference on Artificial Intelligence 2015 (IJCAI 2015), pp. 3156–3163 (2015)
Rödder, W., Kern-Isberner, G.: Léa Sombé und entropie-optimale Informationsverarbeitung mit der Expertensystem-Shell SPIRIT. OR Spektrum 19(3), 41–46 (1997)
Rödder, W., Reucher, E., Kulmann, F.: Features of the expert-system-shell SPIRIT. Logic J. IGPL 14(3), 483–500 (2006)
Spohn, W.: Ordinal conditional functions: a dynamic theory of epistemic states. In: Harper, W., Skyrms, B. (eds.) Causation in Decision, Belief Change, and Statistics, II, pp. 105–134. Kluwer Academic Publishers (1988)
Spohn, W.: The Laws of Belief: Ranking Theory and Its Philosophical Applications. Oxford University Press, Oxford (2012)
Acknowledgments
A large part of the work reported here was done in cooperation and in joint projects with Gabriele Kern-Isberner and her research group at TU Dortmund University, Germany. I am also very grateful to all members of the project teams involved, in particular to Marc Finthammer, Jens Fisseler, Nico Potyka, Matthias Thimm, and numerous students for their contributions.
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this paper
Beierle, C. (2016). Systems and Implementations for Solving Reasoning Problems in Conditional Logics. In: Gyssens, M., Simari, G. (eds.) Foundations of Information and Knowledge Systems. FoIKS 2016. Lecture Notes in Computer Science, vol. 9616. Springer, Cham. https://doi.org/10.1007/978-3-319-30024-5_5. Print ISBN: 978-3-319-30023-8. Online ISBN: 978-3-319-30024-5.