
Event based heterogeneous sensors fusion for public place ... - ISIF

1.2 Paper outline

The next section introduces the CEP principles and the surveillance system architecture built upon them. Section 3 describes the use of description logic to model situations, from the description of event rules to the definition of the context (e.g. the smart sensors involved, the operation theatre, the system effectors). Section 4 shows, through an example security scenario, how CEP works in practice. We conclude with a short discussion and directions for future work.

2 Event based approach

2.1 Complex Event Processing principles

CEP is a reasoning technology based on asynchronous and reactive principles. Reasoning is done with a reactive rule-based inference engine designed to deal with a continuous stream of real-time event objects. The principle consists in a time-, content- and context-based selection of a subset (or filtered set) of event objects, described by their appearance pattern. According to the result of this assessment, a new (complex) event object or one or more actions can be triggered.

As explained in [7], an event object is a record of an activity. It has at least these three features:
1. a significance, giving its semantics;
2. a form, giving information related to the activity that will be processed by a computer (at least a timestamp and a unique identifier);
3. a relation with other event objects, i.e. the set of those it is linked with (often a causal relationship).

For example, an activity could be the entrance of an identified person into a restricted area. The form of the corresponding event object could then be composed of the unique ID of the identified person and the semantic description or the geographical coordinates of the area (in which the person was observed by a sensor). The time of the event object is set to the date-time at which the smart sensor (see footnote 2) detected the situation. The event could also be linked with a previous event corresponding to the identification of the person.
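The three features above can be sketched as a small data structure. The following is an illustrative Python sketch under our own naming assumptions, not the system's actual data model.

```python
# Illustrative sketch of an event object with the three minimal features
# (significance, form, relation); all names here are assumptions, not
# the paper's actual implementation.
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)
class EventObject:
    significance: str        # 1. semantics, e.g. "restricted-area-entry"
    timestamp: datetime      # 2. form: at least a timestamp ...
    payload: dict            #    ... plus activity data for the computer
    event_id: str = field(default_factory=lambda: uuid.uuid4().hex)  # unique id
    causes: tuple = ()       # 3. relation to other events (often causal)


# Running example: an identified person enters a restricted area.
identification = EventObject(
    significance="person-identified",
    timestamp=datetime(2024, 1, 1, 8, 59, 30, tzinfo=timezone.utc),
    payload={"person_id": "P-1234"},
)
entry = EventObject(
    significance="restricted-area-entry",
    timestamp=datetime(2024, 1, 1, 9, 0, 0, tzinfo=timezone.utc),
    payload={"person_id": "P-1234", "area": "server-room"},
    causes=(identification.event_id,),   # linked to the identification event
)
```

Note that the entry event carries no duration, in line with the point-in-time semantics of event objects; only its causal link relates it to the earlier identification.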
Notice that an event object has no duration, even if the activity it records is defined over a time interval. This implies that one needs to be aware not only of the beginning of an activity, but also of its end (if one is interested in that point). The first step in modeling a situation therefore consists of decomposing it into its activities (i.e. its event types), including those derived by sensors and those produced by a Complex Event Processor.

Footnote 2: The smart sensor is a combination of the electronic device that captures a real-life situation, the device that encodes the data in a digital format and the algorithms that process this digital data. As such, the event object's time is set by the first device in the sensor processing chain.

In a second step, one applies (logical, temporal and/or relational) connectors to the event types to model the initial situation. Via this kind of event type connection we construct (event) patterns. Patterns help to organize and simplify the situation modeling, but they are not enough to describe a to-be-recognized situation. In addition, the content and/or the context of the event objects have to be assessed in order to verify that they match the targeted situation model.

Let's illustrate this with our example. The assessment should verify that the identified person is allowed to access the restricted area at the given time. This is a context assessment, because it changes according to the current time, but it is also a content assessment, because the person's ID is an attribute of the event object. We refer to this type of content- and context-based assessment as "CEP constraints".

Finally, after the event pattern is matched and the constraints are satisfied, one or more actions can be triggered, such as the creation and sending of a new event, the synthesis of the detected situation (or of part of a bigger situation) or the creation of a command to control an effector (e.g. a siren alarm, a screen, or the computation of a schedule with resource constraints). The new event generated by a Complex Event Processor is called a complex event (see footnote 3). Such a complex event can be matched by another event pattern, can satisfy another set of constraints and can contribute to the generation of other complex events. The triggering of an action and the generation of a complex event can be executed in parallel, but in any case they are the endpoint of the process described above. We refer to the association of an event pattern, a constraint and an action as a "reactive rule".

Footnote 3: A complex event is opposed to a simple or primitive event, which qualifies all events generated by sensors, even if they represent a complex situation. The term "composite" can be used instead of "complex" as well.

2.2 Reactive rule declaration and Event Pattern Description Language

For the description of a reactive rule, we use the following syntax:

  ON <EventPattern> SUCH-AS <Constraint> DO <Action>+

The event pattern is described via an "Event Pattern Description Language" (EPDL) we have defined. Figure 1 shows its grammar:

  FullEventPattern : EventPattern :t | EventPattern #n | EventPattern #n:t | EventPattern.
  EventPattern     : NaryConnector{EventPattern, EventPattern+} |
                     EventPattern BinConnector EventPattern |
                     UnaryConnector EventPattern |
                     EventObject | EventObject[Filter].
  Filter           : EventObjectSlot Cmp value | EventObjectSlot Cmp value, Filter |
                     EventObjectSlot ∈ Range | EventObjectSlot ∈ Range, Filter.
  BinConnector     : >> | →.
  NaryConnector    : ∧ | ∨.
  UnaryConnector   : ¬ | ∀.
  Cmp              : < | ≤ | = | ≥ | > | ≠.
  Range            : [ub, lb] | ]ub, lb] | [ub, lb[ | ]ub, lb[.

  Figure 1: The light EPDL grammar

Our EPDL consists of several types of operators: logical connectors, temporal connectors and constraints on the appearance of event patterns. The logical connectors are conjunction (∧), disjunction (∨) and absence (¬). They can be applied on or between the event objects of an event pattern. The order of appearance of the event objects in a conjunction or disjunction is not significant. Because an event pattern describes the appearance of an event object, we talk about absence instead of negation of event objects (an event object is not true or false; it either appears or it does not).

The temporal connectors are only precedence (>>) and cause (→), the latter seen as a special case of precedence. Both can be applied on event patterns. We do not need the 13 relations defined in Allen's interval algebra [8] because, firstly, an event object has no duration and, secondly, the event pattern already defines the way event objects appear. The precedence connector adds to the conjunction the order of appearance (or sequence) of the event objects. The cause connector refines precedence by adding a dependence between two event objects: an event object A is a cause of another event object B if A had to happen in order for B to happen.

The appearance constraints are the "for all" (∀), the number of occurrences of an event pattern (#n) and the time window, relative to a clock, in which the event pattern must be matched (:t). The "for all" constraint specifies that the pattern matcher must react to each occurrence of the event pattern on which the connector is applied.
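As a concrete illustration, a reactive rule for the running restricted-area example, roughly "ON (identification >> entry):t SUCH-AS person-not-authorized DO emit complex event", could be sketched as follows. The dictionaries, the AUTHORIZED context table and all names are assumptions for illustration, not the engine's actual API.

```python
# Toy sketch of one reactive rule (ON pattern SUCH-AS constraint DO action)
# for the running example; all names are illustrative assumptions.
from datetime import datetime, timedelta, timezone

AUTHORIZED = {"P-1234": {"server-room"}}          # assumed context store

def matches(events, window=timedelta(seconds=10)):
    """ON: identification precedes (>>) entry, within a :t time window."""
    for ident in (e for e in events if e["type"] == "person-identified"):
        for entry in (e for e in events if e["type"] == "restricted-area-entry"):
            in_order = ident["time"] < entry["time"]              # precedence (>>)
            in_window = entry["time"] - ident["time"] <= window   # :t constraint
            same_person = ident["person"] == entry["person"]      # content filter
            if in_order and in_window and same_person:
                yield ident, entry

def rule(events):
    """SUCH-AS: person not authorized for the area. DO: emit a complex event."""
    for ident, entry in matches(events):
        if entry["area"] not in AUTHORIZED.get(entry["person"], set()):
            yield {"type": "unauthorized-entry", "person": entry["person"],
                   "area": entry["area"], "causes": (ident, entry)}

t0 = datetime(2024, 1, 1, 9, 0, 0, tzinfo=timezone.utc)
stream = [
    {"type": "person-identified", "person": "P-9999", "time": t0},
    {"type": "restricted-area-entry", "person": "P-9999",
     "area": "server-room", "time": t0 + timedelta(seconds=5)},
]
complex_events = list(rule(stream))   # one unauthorized-entry complex event
```

The emitted complex event keeps causal links to the two simple events that produced it, so it can itself be matched by higher-level patterns.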
The two other constraints are final appearance constraints, meaning that they are applied on a global event pattern and not on some of its parts; they thereby define a full event pattern. Some constraints on the content of an event object are basic enough to be part of an event pattern description. The function of those constraints is often to filter event objects: they compare attributes of an event object with constant values, or test the membership of an event object attribute in an interval. This simplifies the reactive rule declaration.

2.3 CEP engine architecture

The CEP engine architecture, as used in the public place surveillance context, consists of the following components (shown in Figure 2):
• Listeners: handle communication protocols (like TCP/IP and SOAP) and provide interfaces to external components (like sensors or middleware).
• Event Distributor: filters sensor and CEP event objects based on the CEP patterns of which they are a member, and places the event objects into so-called "buckets", which hold event objects ordered per CEP pattern.
• Event Processor: the actual core of the CEP engine, containing pluggable modules for CEP pattern matching, constraint rules, contextual information retrieval, action handling, etc. The Event Processor uses the internal middleware (code-named "Walker") as a controller or event flow execution handler.

Figure 2: CEP engine architecture

The flow of event objects and their processing happens as follows:
• Simple event objects provided by sensors are fed to the CEP engine via the Listeners, using a particular communication protocol.
• The Incoming Event Buffer function of the Event Distributor component provides a time-ordering mechanism and delivers a time-ordered list of event objects.
• This list is picked up by the Event Bucket Loader function of the Event Distributor component, which creates an event bucket for each event pattern and puts the events that could match a pattern into the related buckets.
• The content of the buckets is delivered to the internal middleware ("Walker") of the CEP Event Processor, which invokes and controls the CEP processor functions: the Rule Handler, the Action Handler and the Semantic Information Store (code-named "ContextStore") Handler.
• The results of this processing are either Actions or complex event objects.
• The complex event objects are sent back to the Listeners for further (higher-level) CEP processing. The Actions provide functions (such as those mentioned in 2.1) to invoke external system functions.
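The buffering and bucket-loading steps of this flow can be sketched as follows. This is a minimal toy model of the Event Distributor described above; the class and its methods are our own assumptions, not the engine's real design.

```python
# Toy sketch of the Event Distributor flow: time-order incoming events,
# then load per-pattern buckets. Code-level details are assumptions.
import heapq
from collections import defaultdict

class EventDistributor:
    """Time-orders incoming events, then loads per-pattern buckets."""
    def __init__(self, pattern_types):
        self._buffer = []                  # Incoming Event Buffer (a min-heap)
        self._patterns = pattern_types     # pattern -> accepted event types

    def receive(self, event):              # fed by a Listener
        heapq.heappush(self._buffer, (event["time"], event["type"], event))

    def load_buckets(self):                # Event Bucket Loader
        buckets = defaultdict(list)        # one bucket per event pattern
        while self._buffer:
            _, _, event = heapq.heappop(self._buffer)   # pop in time order
            for pattern, accepted in self._patterns.items():
                if event["type"] in accepted:           # could match pattern
                    buckets[pattern].append(event)
        return buckets

patterns = {"unauthorized-entry": {"person-identified", "restricted-area-entry"}}
dist = EventDistributor(patterns)
dist.receive({"type": "restricted-area-entry", "time": 5})   # arrives out of order
dist.receive({"type": "person-identified", "time": 1})
buckets = dist.load_buckets()   # events per bucket come out time-ordered
```

The heap makes the time-ordering explicit even when sensors deliver events out of order; each bucket then holds only the events relevant to one pattern, ready for the Event Processor.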
