Rule Based Reasoning, Programming and Applications
This publication contains the peer-reviewed proceedings of the fifth international symposium on rules, RuleML 2011 – Europe, held in Barcelona, Spain, in July 2011. It was the first of two RuleML events held in 2011, with the second, RuleML 2011 – America, taking place in Fort Lauderdale, Florida, USA, in November 2011. The 18 full papers, 8 short papers, 3 invited papers and 2 keynote abstracts presented at the event were carefully selected from 58 submissions. The papers are thematically organised into the following areas: rule-based distributed/multi-agent systems; rules, agents and norms; rule-based event processing and reaction rules; fuzzy rules and uncertainty; rules and the semantic web; rule learning and extraction; rules and reasoning; and rule-based applications.
Contents
Rule-Based Distributed/Multi-Agent Systems:
-
Rule-Based Distributed and Agent Systems
This paper outlines the roles played by rules and rule-based systems in distributed and multi-agent systems. These roles include an overview of traditional and emerging application areas, as well as internal agent architectures and frameworks for implementing these architectures.
-
Extending a Multi-agent Reasoning Interoperability Framework with Services for the Semantic Web Logic and Proof Layers
The ultimate vision of the Semantic Web (SW) is to provide an interoperable and information-rich web environment, whereby users can safely delegate complex actions to intelligent agents. Much research has been done on agent interoperability, and many proposals and standards for ontology-based metadata and rule-based reasoning are already widely used. Nevertheless, the proof layer of the SW has so far been neglected, even though it is essential for SW agents and human users to understand how results were obtained in order to increase the reliability of the exchanged information. This paper focuses on the implementation of third-party SW reasoning and proof services wrapped as agents in a multi-agent framework. This approach allows agents to exchange and justify arguments without needing to conform to a common rule paradigm. Through an external reasoning and proof service, the receiving agent can understand the semantics of the received rule set and check the validity of the reasoning results.
-
Cross-Community Interoperation between the EMERALD and Rule Responder Multi-Agent Systems
The vision of the Semantic Web allows users to delegate complex actions to intelligent agents, which then act on behalf of the user in various real-life applications. In this paper, we focus on two Semantic Web-enabled multi-agent systems, EMERALD and Rule Responder, which are based on Semantic Web and multi-agent standards such as RDF, OWL, RuleML and FIPA and can be adopted to support a community of users. We show that these multi-agent systems can be interoperated to automate collaboration between communities using a declarative, knowledge-based approach. Furthermore, we present multi-step interaction scenarios between agents to demonstrate the utility of interoperation between the above systems and to illustrate a general approach to cross-community collaboration.
Rules, Agents and Norms:
-
Rules, Agents and Norms: Guidelines for Rule-Based Normative Multi-Agent Systems
This survey paper focuses on some requirements for developing normative multi-agent systems (NMAS). In particular, the guidelines proposed by Boella et al. for NMAS will be discussed. Finally, two specific issues regarding the role of norms in rule-based NMASs are addressed: the concepts of compliance and norm change.
-
A Dynamic Metalogic Argumentation Framework Implementation
One of the main challenges facing the AI community is to represent something close to human reasoning as a computational formalisation of argumentation. In this paper, we present a complete implementation, with accompanying software, for defeasible adversarial argumentation. Our work is based on the meta-logical framework of defeasible adversarial argumentation games in [9]. Our software consists of a meta-interpreter, a declarative implementation of the argumentation game model, and a graphical interface developed in Java that displays the results of game runs and the construction of argumentation derivation trees.
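As a toy sketch of the game idea (not the paper's meta-interpreter), the following evaluates a made-up attack graph in the style of grounded semantics: an argument is justified iff every one of its attackers is not.

```python
# Made-up attack graph: b attacks a, c attacks b, c is unattacked.
attackers = {"a": ["b"], "b": ["c"], "c": []}

def wins(arg):
    """An argument is justified iff none of its attackers is justified."""
    return all(not wins(att) for att in attackers[arg])

# c is unattacked, so c defeats b, which in turn reinstates a.
results = {x: wins(x) for x in attackers}
# -> {"a": True, "b": False, "c": True}
```

The recursion terminates because the illustrative graph is acyclic; a full argumentation engine must also handle cycles, which is where game-based and meta-logical treatments earn their keep.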
-
Rule-Based Event Processing and Reaction Rules:
-
On Complex Event Processing for Real-Time Situational Awareness
This paper provides an overview of existing research results and open research questions on the application of complex event processing for real-time situation awareness. The paper considers two different perspectives: better detection of emerging complex situations and prediction of future situations. To illustrate these perspectives, two application areas are considered: activity recognition from video content and social media observation, respectively.
-
A Declarative Framework for Matching Iterative and Aggregative Patterns against Event Streams
Pattern matching over event streams for complex event processing is becoming important in many areas, such as financial services, mobile devices, sensor-based applications, clickstream analysis and real-time processing in Web 2.0 and 3.0 applications. However, several issues must be addressed to enable effective pattern matching in modern applications: the language for describing patterns needs well-defined semantics, it needs to be rich enough to represent important classes of complex patterns, such as iterative and aggregative patterns, and the language's execution model needs to be efficient, since event processing is a real-time process. In this paper, we present an event processing framework comprising an expressive language with precise semantics and a corresponding execution model, expressive enough to represent iterative and aggregative patterns. As our approach is based on logic, we analyse the deductive features of such an event processing framework. Finally, we provide an open-source implementation and show experimental results for the execution system.
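A minimal sketch of what an aggregative pattern over a stream means (this is an illustration, not the paper's language): raise a complex event whenever the running sum of values inside a sliding window crosses a threshold.

```python
from collections import deque

def match_aggregate(events, window=3, threshold=100):
    """Yield (index, total) each time the windowed sum exceeds threshold."""
    buf = deque(maxlen=window)          # sliding window over the stream
    for i, volume in enumerate(events):
        buf.append(volume)
        total = sum(buf)                # the aggregative part of the pattern
        if total > threshold:
            yield (i, total)

# Volumes 40+50+30 = 120 > 100, so the pattern fires at index 2.
hits = list(match_aggregate([40, 50, 30, 10], window=3, threshold=100))
# -> [(2, 120)]
```

A declarative, logic-based framework expresses the same pattern as rules with formal semantics rather than imperative buffering, which is what makes the deductive analysis in the paper possible.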
-
Entity-Based State Management for Complex Event Processing Applications
Complex Event Processing (CEP) using Event-Condition-Action (ECA) rules has proven particularly suitable for detecting noteworthy business situations of defined length and structure. In contrast, challenges arise when the state of complex and durable entities (e.g. counters, servers, task queues) has to be derived from a low-level continuous update stream. This paper presents a new approach to state management in CEP applications. We propose a business entity provider that encapsulates arbitrary state-computation logic and manages state in the form of typed, application-wide data structures. Using a plug-in-based component model, the business entity provider can be integrated into the application according to the specific requirements of the business scenario. We present an ECA rule model that enables event pattern detection and access to well-integrated business entities, and demonstrate our approach in a real-world scenario in the workload automation domain.
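The following is a hypothetical sketch of the entity-based idea, with made-up names rather than the paper's API: a provider object encapsulates durable state (a per-server task counter) that an ECA rule both updates and queries.

```python
class CounterProvider:
    """Encapsulates durable entity state derived from low-level updates."""
    def __init__(self):
        self.counts = {}   # entity id -> current counter value

    def update(self, server, delta):
        self.counts[server] = self.counts.get(server, 0) + delta

    def get(self, server):
        return self.counts.get(server, 0)

def eca_rule(event, provider, alerts):
    """Event -> Condition -> Action: alert when a server queue exceeds 2."""
    provider.update(event["server"], event["delta"])   # Event: apply update
    if provider.get(event["server"]) > 2:              # Condition: query state
        alerts.append(event["server"])                 # Action: raise alert

provider, alerts = CounterProvider(), []
for ev in [{"server": "a", "delta": 1}, {"server": "a", "delta": 1},
           {"server": "a", "delta": 1}, {"server": "b", "delta": 1}]:
    eca_rule(ev, provider, alerts)
# After three updates to "a", its counter is 3 and one alert is raised.
```

The point of the plug-in model in the paper is that the state-computation logic (here, a trivial counter) can be swapped per business scenario without changing the rule side.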
Fuzzy Rules and Uncertainty:
-
Declarative Traces into Fuzzy Computed Answers
Fuzzy logic programming is a growing declarative paradigm that aims to integrate fuzzy logic into logic programming. In this setting, the Multi-Adjoint Logic Programming approach (MALP) is a very flexible fuzzy language, for which we have developed the FLOPER tool (Fuzzy LOgic Programming Environment for Research). Currently, this platform safely compiles, executes and debugs fuzzy programs (to standard Prolog code), and is ready to be extended in the near future with powerful transformation and optimisation techniques recently designed in our research group. In this paper, we focus on an outstanding property of the system: its ability to easily collect declarative traces at runtime without modifying the underlying procedural principles. The trick lies in using a lattice of truth degrees richer than {true, false}, so that a fuzzy computed answer carries not only its truth degree but also the sequence of program rules used in reaching the solution, together with the set of evaluated fuzzy connectives and the primitive (arithmetic) operators they invoke, enabling direct visualisation of the computation and a detailed account of its computational cost.
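A minimal sketch of the lattice trick, with made-up rule labels: evaluate over elements that pair a truth degree with the trace of rules and connectives used, so that each fuzzy computed answer explains itself.

```python
def conj(a, b):
    """Fuzzy conjunction (min) on (degree, trace) pairs; traces accumulate."""
    (da, ta), (db, tb) = a, b
    return (min(da, db), ta + tb + ["&min"])

# Combining answers produced by hypothetical program rules R1 and R2:
ans = conj((0.8, ["R1"]), (0.5, ["R2"]))
# -> (0.5, ["R1", "R2", "&min"])
```

Because only the lattice changes, the operational machinery of the fuzzy language is untouched; the trace rides along inside the computed answer exactly as the abstract describes.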
-
A Flexible XPath-Based Query Language Implemented with Fuzzy Logic Programming
In this paper, we present an extension of the XPath query language to handle flexible queries. To provide ranked answers, our approach proposes fuzzy variants of the and, or and avg operators for XPath conditions. Our proposal is implemented in a fuzzy logic language to exploit the clear synergy between the target and source fuzzy languages.
Rules and the Semantic Web:
-
A RIF-Style Semantics for RuleML-Integrated Positional-Slotted, Object-Applicative Rules
In F-logic and RIF, objects (frames) are defined entirely separately from function and predicate applications; in POSL and RuleML, these basic concepts are integrated by permitting applications with optional object identifiers and orthogonal positional or slotted arguments. On this basis, psoa (positional-slotted, object-applicative) terms are newly formalised, reducing the number of RIF term kinds by generalising positional and slotted (named-argument) terms as well as frame terms and class membership. Just as multi-slot frames correspond to (web-)distributed slot descriptions of the same object identifier (IRI), multi-tuple psoa terms correspond to distributed positional descriptions (e.g. shelves). The syntax and semantics of these integrated terms and the rules over them are defined as PSOA RuleML in the style of RIF-BLD. The semantics provides a new first-order model-theoretic foundation, merging the frame partitioning of F-logic and RIF with the integrated psoa terms of POSL and RuleML.
-
COROR: A COmposable Rule-Entailment Owl Reasoner for Resource-Constrained Devices
OWL (Web Ontology Language) reasoning has been extensively researched since the language was standardised by the W3C. Research in the OWL reasoning community has generally focused on faster, larger-scale and more expressive reasoners, and only a few studies have addressed OWL reasoning for resource-constrained devices. However, the ever-increasing application of Semantic Web technologies in pervasive computing, and the desire to push intelligence to the edge of the network, emphasise the need for resource-constrained reasoning. This paper presents COROR, a COmposable Rule-entailment Owl Reasoner for resource-constrained devices. What differentiates this work from related work is the use of two novel reasoner composition algorithms that dynamically dimension the rule-based reasoner at runtime according to the characteristics of a particular semantic application. The reasoner is implemented and evaluated on a resource-constrained sensor platform. Experiments show that the composed reasoner outperforms the original non-composed reasoner while retaining the same reasoning capabilities.
-
Rule-Based Trust Assessment on the Semantic Web
The Semantic Web is a decentralised forum where anyone can publish structured data or extend and re-use existing data. This inherent openness raises questions about the trustworthiness of the data. Data is usually considered trustworthy based on several factors, including its source, the user's prior knowledge, the source's reputation and the user's past experience. In the Semantic Web, however, further factors must be considered for inferred data, as rules are important for checking data integrity, expressing tacit knowledge and defining policies. Building on existing trust measures, we identify two trust axes – data and rules – and two trust categories – content-based and metadata-based – to help with trust assignment for Semantic Web data. We propose a metamodelling framework that uses trust ontologies to assign trust values to data, sources and rules on the web, provenance ontologies to capture how data was generated, and declarative rules to combine these values into different trust assessment models. These trust assessment models can be used to transfer trust from known to unknown data. This paper describes how AIR, a web rule language, can be used to implement our framework and to declaratively describe assessment models using different types of trust and provenance ontologies.
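A toy sketch of trust combination along the two axes the paper identifies (this is an illustration, not the AIR language): a derived fact is only as trusted as the least trusted data item or rule in its provenance chain.

```python
def chain_trust(trusts):
    """Pessimistic (min) combination of trust values along a provenance chain."""
    return min(trusts)

# Data trusted at 0.8, derived via a rule trusted at 0.6:
t = chain_trust([0.8, 0.6])   # -> 0.6
```

Min-combination is only one possible assessment model; the framework's point is precisely that such combination policies are declarative rules that can be swapped.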
-
SOWL: A Framework for Handling Spatio-temporal Information in OWL 2.0
We propose SOWL, an ontology for representing and reasoning over spatio-temporal information in OWL. Building on established Semantic Web standards (OWL 2.0, SWRL), SOWL represents both static and dynamic information using a 4D-fluents (or, equivalently, N-ary relations) approach, and integrates both RCC-8 topological relations and cone-shaped directional relations. A distinguishing feature of SOWL is that, in addition to quantitative information (precisely defined temporal and spatial information), it can represent qualitative temporal and spatial information, i.e. information whose exact temporal or spatial extent is unknown, such as a 'left-of' spatial relation or a 'before' temporal relation. The SOWL reasoner can infer new relations and check their consistency, while maintaining soundness, completeness and tractability for the supported sets of relations.
Rule Learning and Extraction:
-
Conditional Learning of Rules and Plans by Knowledge Exchange in Logical Agents
This paper discusses issues related to logical agents, and in particular to learning rule sets from other agents. In principle, this approach extends the cultural transmission of competence that characterises human societies to agent societies. However, new knowledge cannot be blindly accepted and incorporated, but should be evaluated (and thus possibly discarded) according to its usefulness. We propose a methodology and formalisation.
-
A Framework for the Automatic Extraction of Rules from Online Text
The majority of knowledge on the web is encoded in unstructured text and is not linked to formalised knowledge such as ontologies or rules. A potential solution to this problem is to acquire this knowledge through natural language processing or text mining methods. Prior research has focused on automatically extracting RDF- or OWL-based ontologies from text, but the type of knowledge acquired is generally limited to simple terminological hierarchies. In this paper, we present a generic framework for retrieving more complex relations from text and encoding this knowledge as rules. Our approach starts from existing domain knowledge in the form of OWL ontologies and Semantic Web Rule Language (SWRL) rules, applies natural language processing and text matching techniques to identify classes and properties, and then encodes the derived knowledge as new rules. We have evaluated this framework by applying it to web-based text on car rental requirements, and show that it can automatically and accurately generate rules for car rental company requirements that are not in the knowledge base. Our framework thus rapidly acquires complex knowledge from free text sources. We are extending it to handle richer domains such as medicine.
-
Classification Rule Mining for a Stream of Perennial Objects
We study classification for slow streams of complex objects such as customers or students. The learning task has to take into account that the labels of objects are affected by the data input from the stream of neighbouring fast transactions, for example, customer purchases or student exams, and furthermore that these labels may change over time. This task involves merging streams and exploiting the association between the target labels and the attribute values in the fast stream. We propose a method for discovering classification rules for the union of such streams and use it to enhance the decision tree classifier. We show that the new approach has competitive predictive power while building much smaller decision trees than the original classifier.
-
A Case for Learning Simpler Rule Sets with Multiobjective Evolutionary Algorithms
Fuzzy rules are easily understood by people because they are described in structured, natural language. In a wide range of decision support applications in business, the interpretability of rule-based systems is a feature and advantage over alternative approaches that may be perceived as ‘black box’. The motivation of this paper is to examine the relationship between rule simplicity (a key element of interpretability) and out-of-sample performance. Prediction has been described as both an art and a science, to emphasise the intuitive and experiential aspects of the process. We computationally explore the widely-valued ‘rule of thumb’ of forecasting, expressed in Occam’s principle that ‘simpler explanations are more likely to be correct’.
Rules and Reasoning:
-
Algorithms for Rule Inference in Modularized Rule Bases
This paper considers an extended knowledge representation of rules. It is called an extended tabular tree (XTT2) and provides a network of decision units that group rules working in the same context. These units are linked to an inference network and a number of inference options are considered. The original contribution of this paper is the proposal and formulation of several different inference algorithms that operate on the same rule base. Such an approach allows for more flexible rule design and deployment, as the same knowledge base can be used in different ways depending on the application.
-
Modularity in the Rule Interchange Format
The adoption of standards by the knowledge representation and logic programming communities is essential for their visibility and influence. The Rule Interchange Format is a fundamental initiative in this direction and should be supported by users, developers and theorists. It is therefore essential for the community to discuss the recommendations published by the W3C RIF Working Group. In particular, this paper presents the multi-document semantics of the Rule Interchange Format (RIF), analyses it, and identifies some flaws. A more general approach is proposed as an alternative multi-document semantics. As an important secondary result, some related problems in the semantics of RIF-FLD are also discussed and possible solutions proposed.
-
Overview of Knowledge Formalization with XTT2 Rules
This paper describes a new formalised knowledge representation for rule-based systems, called XTT2. This hybrid representation combines decision diagrams with extended decision tables: each decision table contains a set of rules of similar structure operating within a common context. The result is a hierarchical knowledge representation in which the lower level consists of the table-based knowledge components, for which specifications are provided, and the higher level is a decision diagram defining the overall structure of the knowledge base. The model has a concise formalisation that opens up possibilities for rigorous design and verification. The focus of this paper is on presenting the formal aspects of the approach, starting from the initial logical specification.
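A hypothetical sketch of an extended decision table in the XTT2 spirit (attribute names and rows are made up): the rows of one table share a structure, i.e. the common context, and pair attribute conditions with a decision.

```python
# Each row: (condition over attribute values, decision).
table = [
    (lambda a: a["temp"] > 30 and a["humidity"] > 0.7, "cooling_on"),
    (lambda a: a["temp"] > 30,                         "fan_on"),
    (lambda a: True,                                   "idle"),  # default row
]

def decide(attrs, rows):
    """Fire the first row whose condition matches the attribute state."""
    for cond, decision in rows:
        if cond(attrs):
            return decision

out = decide({"temp": 35, "humidity": 0.5}, table)   # -> "fan_on"
```

In XTT2 proper, such tables are nodes in a decision diagram, so the output of one table selects which table fires next; the sketch shows only a single table.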
-
HalVA – Rule Analysis Framework for XTT2 Rules
Quality and reliability issues are important in the development and exploration of rule-based systems. This paper considers XTT2, a formalised knowledge representation for rules based on the expressive attributive logic ALSV(FD), and proposes HalVA, a custom runtime and verification framework for it. HalVA allows certain formal properties of rules to be verified, such as determinacy, inclusiveness and completeness.
Rule-Based Applications:
-
Rewriting Queries for Web Searches That Use Local Expressions
Users often enter local expressions to limit a web search to a geographical location. However, current search engines' ability to handle expressions such as "close to" is limited. This paper presents an approach that uses topological background knowledge to rewrite queries containing local expressions into a format suitable for standard search engines. To formalise local expressions, the Region Connection Calculus (RCC) is extended with additional relations, related to the existing ones by composition rules. The approach is applied to a web search for communities 'close to' a reference location in a certain region of Switzerland. The results show that query rewriting significantly improves search recall. With approximately 30,000 role assertions, the time required to rewrite a query is in the range of a few seconds. Ways of addressing performance degradation when operating with a larger knowledge base are discussed.
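A toy sketch of the rewriting idea (place names and the adjacency facts are illustrative assumptions, not the paper's knowledge base): expand "close to <place>" into an OR over places topologically adjacent to it, producing a query a standard engine can answer.

```python
# EC-style (externally connected) adjacency facts, made up for illustration.
adjacent = {"Bern": {"Köniz", "Muri"}}

def rewrite(keywords, place):
    """Replace the local expression with the place plus its neighbours."""
    terms = [place] + sorted(adjacent.get(place, set()))
    return f"{keywords} (" + " OR ".join(f'"{t}"' for t in terms) + ")"

q = rewrite("hotel", "Bern")
# -> 'hotel ("Bern" OR "Köniz" OR "Muri")'
```

In the paper, such neighbourhoods are not listed explicitly but derived by RCC composition rules over the topological knowledge base, which is what makes the approach scale beyond hand-maintained lists.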
-
Implementing General Purpose Applications with the Rule-Based Approach
The use of rule-based systems (RBS) to implement general-purpose applications allows their formal properties to be verified. To adapt the RBS approach to such applications, an RBS architecture and a suitable knowledge representation must be designed. This paper proposes an example architecture (Four Layer Architecture, FLA) and knowledge representation (Extended Tabular Trees, XTT2). It also describes a prototype RBS and an example application that validates the approach.
-
Rule-Based Complex Event Processing for Food Safety and Public Health
The challenge for public health officials is to detect emerging foodborne outbreaks from a large set of simple, isolated and domain-specific events. These events can be drawn from a number of different information systems, including surveillance and inspection reporting systems from healthcare providers, real-time complaint hotlines from consumers and inspection reporting systems from regulatory bodies. In this paper, foodborne outbreaks are formulated as complex events and an event-driven rule-based engine is applied to the emerging event detection problem. The evidence set is defined as a set of simple events linked symptomatically, spatially and temporally. A weighted metric is used to calculate the strength of the evidence set as a basis for response by public health authorities.
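An illustrative sketch of the weighted metric over an evidence set (event types and weights are made up, not the paper's values): simple events that are linked symptomatically, spatially and temporally form a cluster whose strength is a weighted sum.

```python
# Hypothetical per-source weights: clinic reports count more than hotline calls.
WEIGHTS = {"consumer_complaint": 1.0, "clinic_report": 2.0}

def evidence_strength(events):
    """Weighted sum over the evidence set; unknown event types count 0.5."""
    return sum(WEIGHTS.get(e, 0.5) for e in events)

cluster = ["consumer_complaint", "consumer_complaint", "clinic_report"]
s = evidence_strength(cluster)   # -> 4.0
```

A public health authority would then respond once the strength of a symptomatically, spatially and temporally linked cluster crosses an agreed threshold.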