
aggregations is captured by the intermediate values of the connections of the OR neuron in the output layer.

We use the OR/AND neuron as the fundamental part of the fuzzy neural network model. However, we add connections from the OR and AND neurons in the first layer of the OR/AND neurons to the OR neurons in the second layer of the OR/AND neurons. As a result, we combine several OR/AND neurons to form a united model. In this model, each neuron in the first layer of each OR/AND neuron is connected to all OR neurons of the second layer.
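For reference, the sketch below shows one common way such an OR/AND neuron can be realized with fuzzy logic operators. The choice of min/max as the t-norm/s-norm pair, and all variable names, are our own illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Minimal sketch of an OR/AND neuron built from fuzzy OR and AND neurons.
# Assumptions (not fixed by the text): inputs and weights lie in [0, 1],
# the t-norm is min and the s-norm is max; other t-/s-norm pairs work too.

def or_neuron(x, w):
    # OR neuron: s-norm (max) over the t-norms (min) of inputs and weights.
    return np.max(np.minimum(x, w))

def and_neuron(x, w):
    # AND neuron: t-norm (min) over the s-norms (max) of inputs and weights.
    return np.min(np.maximum(x, w))

def or_and_neuron(x, w_and, w_or, v):
    # First layer: one AND neuron and one OR neuron over the same inputs.
    z = np.array([and_neuron(x, w_and), or_neuron(x, w_or)])
    # Second (output) layer: an OR neuron whose connections v blend the two
    # intermediate values, placing the unit between pure AND and pure OR behaviour.
    return or_neuron(z, v)

x = np.array([0.8, 0.3, 0.6])  # fuzzy input values
print(or_and_neuron(x,
                    w_and=np.array([0.2, 0.9, 0.4]),
                    w_or=np.array([0.7, 0.1, 0.5]),
                    v=np.array([0.6, 0.8])))
```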

A single OR/AND neuron represents one output of a modeled system. The final output of the model is determined by comparing the outputs of all OR/AND neurons. In the case of classification, this comparison identifies the "winning" class. The FNN structure dealing with three outputs (classes) is shown in Figure 2.

There are two ways to implement the class selection function. One uses the Maximum Output, and the other uses a Threshold Value. For the Maximum Output, we compare the output values of all OR neurons. The one with the maximum value is selected, and the output of the model is generated according to the index of this output. For the Threshold Value, a threshold value t is defined. The output value of each OR neuron is compared with t. If there is one and only one output with a value greater than or equal to t, the output of the model is the class represented by this OR neuron. Otherwise, the output of the model is not determined.
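A minimal sketch of these two selection rules, assuming the per-class OR neuron outputs are collected in a list (names and values are illustrative only):

```python
import numpy as np

# Sketch of the two class-selection rules described above.

def select_max_output(outputs):
    # Maximum Output: the class whose OR neuron produces the largest value wins.
    return int(np.argmax(outputs))

def select_threshold(outputs, t):
    # Threshold Value: exactly one output must reach the threshold t;
    # otherwise the model's decision is not determined (None).
    above = [i for i, o in enumerate(outputs) if o >= t]
    return above[0] if len(above) == 1 else None

outputs = [0.42, 0.81, 0.35]               # outputs O1..O3 of the OR neurons
print(select_max_output(outputs))           # -> 1
print(select_threshold(outputs, t=0.7))     # -> 1
print(select_threshold(outputs, t=0.3))     # -> None (more than one output above t)
```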

[Figure: per-class outputs O1 through On of the OR/AND neurons feed a class selection block that produces the final output (class #).]
Figure 2. General structure of FNN

The most widely used algorithm to construct an FNN is back propagation [3]. The objective used during that process is focused on minimizing the sum of squared errors between the predicted and original outputs. The main advantage of this approach is its simplicity.
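For reference, this sum-of-squared-errors objective can be written as follows (the notation is ours, not taken from the paper), where y_k is the original output and \hat{y}_k is the network's prediction for the k-th training sample:

```latex
E = \sum_{k=1}^{N} \left( y_k - \hat{y}_k \right)^2
```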

An interesting and important approach to constructing FNNs has emerged with the introduction of evolutionary computing [4, 5]. Different evolutionary algorithms are applied for optimization-based learning of FNNs [6, 7]. They allow us to focus not only on parametric but also on structural optimization of the FNNs [8].

III. MULTI-OBJECTIVE OPTIMIZATION

A multi-objective optimization problem is an optimization problem involving several criteria or model design objectives. Many real-world problems involve simultaneous optimization of several incommensurable and often competing requirements. While in single-objective optimization the optimal solution is usually clearly defined, this does not hold for multi-objective optimization problems. If the objectives are opposing, the problem is to find the best possible design or model that satisfies multiple objectives. In this case, instead of a single solution, a set of alternative solutions is obtained.

Given a set of solutions to the problem, their partial ordering can be determined using the principle of dominance: a solution is clearly better than (dominating) another solution if it is better than or equal to it in all objectives, and strictly better in at least one objective. Using this principle, the set of best solutions is found by removing all solutions that are dominated by at least one other solution. This set of mutually indifferent solutions is referred to as a Pareto set.
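The dominance test and Pareto-set extraction can be sketched as follows, assuming (purely for illustration) that every objective is to be minimized:

```python
# Sketch of the dominance principle and Pareto-set extraction described above;
# each candidate is a tuple of objective values, all of which are minimized.

def dominates(a, b):
    # a dominates b if it is no worse in every objective and strictly
    # better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_set(solutions):
    # Keep only the solutions not dominated by any other solution.
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other is not s)]

candidates = [(0.2, 0.9), (0.4, 0.4), (0.3, 0.8), (0.5, 0.5)]
print(pareto_set(candidates))   # -> [(0.2, 0.9), (0.4, 0.4), (0.3, 0.8)]
```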

Generally speaking, solving multi-objective problems is difficult. In an attempt to solve these problems in an acceptable timeframe, specific multi-objective evolutionary algorithms [9, 10, 11] have been developed.

IV. FNN CONSTRUCTION

A. Construction of Models with Skewed Data

The evolutionary-based optimization offers great flexibility in exploiting different objective functions. This allows for building FNNs that satisfy very complex objective functions addressing different features of the network, as well as dealing with imbalanced data sets. The traditional techniques for constructing networks lead to the development of models that are "good" for the biggest classes but "ignore" the smallest classes.

B. FNN and MOO: Concept

In order to construct models based on data sets with an uneven distribution of classes, as well as to extract rules identifying relationships among the data attributes representing these classes, a novel approach for model development is proposed.

In a nutshell, the main idea of the approach is to replace construction of a single FNN with development of a number of FNNs, where each of them takes care of a single class. This process leads to finding the best model for each class. These single-class models are used to extract if-then rules describing the relationships among data attributes for each class. These rules are then combined and pruned to create a model that gives a good classification rate and "treats all classes in a uniform way".

The development of FNNs for different classes is done simultaneously using an evolutionary-based technique, a Pareto multi-objective evolutionary strategy. This multi-objective optimization targets all classes at the same time. This leads to the development of a number of different models. A subset of these models constitutes a Pareto surface. Each model that belongs to this subset is non-dominated: none of the models from the whole set has better performance across all classes.
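One way to obtain such per-class objectives is sketched below under our own assumptions: the per-class error is taken as one minus the recall on that class, and predict is a hypothetical stand-in for evaluating a candidate FNN. Scoring each candidate once per class lets the dominance test from Section III compare models across all classes; an evolutionary strategy would evaluate every candidate this way and retain the non-dominated ones as the Pareto surface of models.

```python
import numpy as np

# Sketch of per-class objectives for the multi-objective search: each
# candidate FNN is scored with one error value per class (1 - recall),
# and all of these objectives are minimized simultaneously.
# predict() is a hypothetical stand-in for a candidate FNN, not the paper's API.

def per_class_errors(predict, X, y, n_classes):
    y_pred = np.array([predict(x) for x in X])
    errors = []
    for c in range(n_classes):
        mask = (y == c)
        recall = np.mean(y_pred[mask] == c) if mask.any() else 0.0
        errors.append(1.0 - recall)        # one objective per class, minimized
    return tuple(errors)

X = np.array([[0.1], [0.9], [0.4], [0.8]])
y = np.array([0, 1, 0, 1])
dummy = lambda x: int(x[0] > 0.5)          # toy stand-in for an FNN classifier
print(per_class_errors(dummy, X, y, n_classes=2))   # -> (0.0, 0.0)
```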

An additional advantage of applying an evolutionary-based optimization method is flexibility in the selection of objectives that control the construction process of FNNs. In such a case, the proposed approach allows for

