BY
SHIKHA SHARMA
Athabasca, Alberta
October, 2010
DEDICATION
I would like to dedicate this essay to my parents, and my siblings Rinki and Anshul,
who have always been a source of encouragement and motivation to me. Without
their continued love and support, this would not have been possible.
ABSTRACT
feature of daily life; it is a central topic in many domains such as economics and artificial intelligence. Several models have been proposed to deal with uncertainty: fuzzy logic, rough set theory, multi-valued logic, and Bayesian networks.
Technology applications such as semantic web services and data mining are examined. These applications are used in day-to-day life, where modeling and reasoning with uncertainty is required. For an intelligent system to deal with this uncertainty, there has to
this goal. The essence of designing an intelligent system lies in its ability to
the hybrid intelligent systems. The design and architecture play a central role in the success of an intelligent system. At the design level, dealing with uncertainty at the object, environment, and goal levels helps to deal with uncertainty at the architecture level. Therefore, having the right design and architecture for an intelligent system defines the success of a system; one based upon the hybridization of neural networks and fuzzy logic is useful in
implemented to handle uncertainty can handle real world situations accurately and
ACKNOWLEDGEMENTS
Very special thanks to my Mom, Dad, Rinki and Anshul for providing me with the
support during this journey. I would also like to thank my wonderful friends for their continued support and encouragement.
TABLE OF CONTENTS
INTRODUCTION
1.1 Background
2.3.2 Advantages
3.2.1 Background
ARCHITECTURE
4.3.1 Architecture for Intelligent System
REFERENCES
LIST OF TABLES
LIST OF FIGURES
Figure 9: Relation between soft computing and other fields [73]
Figure 16: Maternal ECG Cancellation in Abdominal Signal using ANFIS [87]
CHAPTER 1
INTRODUCTION
1.1 Background
One of the prominent questions in the field of artificial intelligence is how to deal with uncertainty. A definition of uncertainty by Doug Hubbard is: the lack of certainty, a state of having limited
When dealing with real-world problems, we can rarely avoid uncertainty. As Klir observed, at the cognitive level it emerges from the vagueness and ambiguity inherent in natural language. At the social level, uncertainty even has strategic uses, and it is often
created and maintained by people for different purposes (privacy, secrecy, propriety)
alternatives.
which provides the user with a facility of posing and obtaining answers to questions
precise in nature and is not a complete set of accurate facts and rules. Hence,
There has been enormous effort undertaken to deal with uncertainty, and a lot of literature has been generated on how to handle it. The most popular approach to dealing with uncertainty is the theory of probabilistic logic;
Pearl's Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference [25] provides a framework for this reasoning. Other
approaches were conditional planning and decision theory. Over time, a number of uncertainty models have been introduced based upon predicate logic and probability-based methods. Some of these models are:
Fuzzy Logic
Multi-valued Logic
Bayesian Networks
Rough Sets
applications such as semantic web services and data mining. These applications
are used in day-to-day life, hence making it critical to have excellent measures in each domain and map them to the uncertainty model that best complements them.
and discuss design and architecture for intelligent systems. This will provide users with much-needed tools for developing intelligent systems that can handle
Many theories have been developed to deal with knowledge uncertainty, but neither a structured framework has been established nor have standard guidelines been provided for information systems. There has been extensive research done in the field,
identifying issues of uncertainty as well as uncertainty models of information
systems, but only limited interaction exists between these two areas.
are: Fuzzy Logic, Multi-valued Logic, Rough Sets, and Bayesian Networks;
uncertainty models to each type of application. Semantic web services and Data
mining will be two domains of interest for the purpose of this essay.
design and architecture of the main components to be included in the framework will
be discussed. One application of an intelligent system in the real world will be explored.
CHAPTER 2
under conditions infused with uncertainty [4]. In an ideal world, agents would know
all the facts about the environment in which they operate. Unfortunately, reality is far from ideal: agents do not have access to the whole truth, thereby making it impossible to derive conclusions that are fully accurate. Hence these agents should
There are different methodologies to deal with uncertainty; a few of them are described below:
uncertainty as long as it is a simple case where there are not too many variables involved, i.e., the ability to get its hands on the required information and deal
Three main reasons why first-order logic fails to deal with uncertainty are [12]:
1. Laziness: it takes a lot of work to compile the complete set of rules for
2. Theoretical Ignorance: having incomplete knowledge of the complete
3. Practical Ignorance: each case is unique; therefore all the generic rules
a goal and will execute the plan that guarantees result (i.e., a goal is
achieved). This method is based upon degrees of belief. In a world full of uncertainty, agents must act according to their degrees of belief.
These theories were competent in their own ways to deal with uncertainty; but
as the complexity grew, so did the demand for sophisticated models. These models needed to support modes of reasoning which are approximate rather than exact, and most human reasoning falls into this category [15]. There were many different approaches introduced; we will take a look
at four different models: Fuzzy Logic, Multi-valued Logic, Bayesian Networks, and
Rough Sets.
2.2 Fuzzy Logic
One of the main problems in dealing with uncertainty in information systems is the fuzziness associated with the knowledge base of an intelligent system; this led to the development of fuzzy logic. Reference [6] defines fuzzy logic as a form of multi-valued logic derived from fuzzy set theory to deal with reasoning that is approximate rather than precise. Fuzzy logic, contrary to its name, is not fuzzy but rather precise. Fuzzy logic variables may have truth values that range between 0 and 1, corresponding to the degree of truth [6].
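As a concrete illustration of graded truth values, a membership function maps an element to a degree in [0, 1]. The following sketch is illustrative only: the "warm" predicate and its breakpoints are assumptions, not taken from the essay.

```python
def triangular(x, a, b, c):
    """Triangular membership function: degree 0 at a and c, rising to 1 at b."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)   # rising edge
    return (c - x) / (c - b)       # falling edge

# Degree to which a temperature counts as "warm" (hypothetical breakpoints):
warm = lambda t: triangular(t, 15.0, 22.0, 30.0)
```

A temperature of 22 is fully warm (degree 1), while 18.5 is warm only to degree 0.5; there is no yes/no cut-off.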
For a long time, probability theory enjoyed a monopoly; but this traditional approach to dealing with uncertainty failed to come to terms with the fact that uncertainty is possibilistic in nature rather than probabilistic. As Asli and Burhan [5] claimed, in the realm of uncertainty and
imprecision, fuzzy logic has much to offer. Fuzzy Logic is based upon both
predicate logic and probability theory providing the answer to the posed question
with an assessment of how reliable the answer is. This assessment of reliability is
also called a certainty factor. Fuzzy Logic has two main components:
2. Inferential Component: infer answers to posed questions and provide fuzzy
The main difference between fuzzy logic and the traditional approach is that the
objects of interest are allowed to be much more general and much more complex
than the objects in traditional logical systems and probability theory. Fuzzy Logic
further addressed issues that were hard to deal with using conventional techniques.
Here are a few important issues [1] that can be handled through fuzzy logic:
a. If A is M then B is N
b. If A is M then B is N with CF = c, where CF is the certainty factor.
2. Partial match between the user's supplied fact and the rule in the knowledge base: this is the case where the fact may not be identical to the antecedent of a rule.
implicit fuzzy quantifiers. For example, consider the following proposition with
2.2.1 Characteristics of Fuzzy Logic
The unique property of fuzzy sets is that membership in a fuzzy set is not a
2. Any logic system can be fuzzified: i.e., conversion of any system to a fuzzy
1. Truth values can range over the fuzzy subsets of a finite or infinite truth-
values set, usually assumed in the range of [0, 1]. This can be regarded as
predicates are only crisp (e.g., larger than), fuzzy logic lets predicates be
3. Allows typical quantifiers (all and some) and fuzzy quantifiers (e.g., most, few): fuzzy logic allows quantifiers that are used in day-to-day life, thereby making
4. Ability to represent non-fuzzy and fuzzy predicate modifiers: in contrast to classical systems, where negation (not) is the main predicate modifier, fuzzy
X is F,
X is F is λ,
If X is F then Y is G,
If X is F then Y is G is λ,
where λ is a linguistic truth value or fuzzy probability.
All facts or propositions in the knowledge base are stored in canonical form; this
The key problem in application of fuzzy logic is the construction of the membership
function of a fuzzy set. Three principal approaches are used to address this
concern:
a system.
membership functions.
fuzzy if-then rules. These rules are created by extracting knowledge from human experts. Given the difficulty of this manual approach, this challenge has led to building automated algorithms for modeling
systems using fuzzy theories via machine learning and data mining techniques.
2.2.5 Advantages
Ability to handle real-world situations, since it goes beyond the restriction of yes/no and can handle any situation through truth values ranging from 0 to 1.
2.2.6 Disadvantages
2.2.7 Applications
Data Mining
E-services
Quality Support
2.2.8 Future Work
- Zdzislaw Pawlak
Rough set theory was introduced by Zdzislaw I. Pawlak in the early 1980s; this theory is based upon the formal approximation of a crisp set by a pair of sets which provide the lower and the upper approximation of the original set [10]. Traditionally, rough sets were used to deal with decision problems; since then the theory has become an area of interest among researchers from different disciplines, most of which are related to Artificial Intelligence. Recently, rough set theory has been extended to deal with uncertainty and for uncertainty management in relational databases [11].
Rough set theory is based on the fundamental principle of associating some
information with every object in the universe. The underlying principle is the indiscernibility relation: such a relation exists between two objects when all their attribute values are identical with respect to the considered attributes (the objects are then indiscernible).
Usually, we hit a grey zone with borderline objects which are hard to place in either of these sets. As Pawlak said in [16], a knowledge base has a granular structure, providing only approximate information about its elements. Rough set theory brings forth the approach of approximation: the indiscernibility relation is used to divide the universe into equivalence classes. The pair of precise sets consists of the lower approximation and the upper approximation of the vague concept. The notion of
Lower Approximation Region: results that are certain and surely belong to the set.
Upper Approximation Region: results that are likely but still uncertain, and possibly belong to the set.
Boundary Region: the difference between the upper approximation and the lower approximation.
1. Indiscernibility Relation
Name     Education     Job Prospects
Mike     Elementary    No
Looking at the above table, we can see that for each Candidate we
A = Name
B = Education
C = Job Prospects
Each person can be discerned (distinguished) from each other based
on all three attributes. But if we were to take a look at the attribute Education,
Fa: U → Va
defined as:
2. Approximation
be able to identify attributes given the set. Using this process we define lower
From Table 1, we infer that candidate with Job Prospects are {Shelly,
Prospects, we can easily deduce that if a candidate has a good education, then
Let X = a subset of U
B = a subset of attributes A
If BNB(X) = ∅, then the set X is called a crisp set, where we have an exact match; and if BNB(X) ≠ ∅, then X is a rough set, whose vagueness can be expressed numerically.
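The boundary-region idea can be made concrete in code. The following sketch is illustrative: the candidate data and attribute values are hypothetical, loosely echoing Table 1, and the function names are my own.

```python
from collections import defaultdict

def partition(universe, attrs, row):
    """Equivalence classes of the indiscernibility relation: two objects are
    indiscernible when they agree on every attribute in attrs."""
    classes = defaultdict(set)
    for obj in universe:
        classes[tuple(row[obj][a] for a in attrs)].add(obj)
    return list(classes.values())

def approximations(universe, attrs, row, X):
    """Lower approximation: classes fully inside X (certain members).
    Upper approximation: classes intersecting X (possible members).
    Boundary: upper minus lower, the uncertain grey zone."""
    lower, upper = set(), set()
    for eq in partition(universe, attrs, row):
        if eq <= X:
            lower |= eq
        if eq & X:
            upper |= eq
    return lower, upper, upper - lower

def rough_membership(x, attrs, row, universe, X):
    """Degree of certainty that x belongs to X: |[x] ∩ X| / |[x]|."""
    eq = next(c for c in partition(universe, attrs, row) if x in c)
    return len(eq & X) / len(eq)

# Hypothetical candidates; X is the set with good job prospects:
row = {"Shelly": {"Education": "Higher"},
       "Anita": {"Education": "Higher"},
       "Mike": {"Education": "Elementary"}}
X = {"Shelly"}
lower, upper, boundary = approximations(row.keys(), ["Education"], row, X)
```

Here Shelly and Anita are indiscernible on Education, so neither is certainly in X: the lower approximation is empty and both fall in the boundary region, with rough membership 0.5.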
3. Rough Membership
This can be interpreted as a degree of certainty to which x belongs to X.
Pawlak said that the above function confirms that vagueness is related to sets,
4. Dependency of Attributes
inferred from another; that is, A ⇒ B if the value of B can be inferred uniquely
I(A) ⊆ I(B)
k = |POSA(B)| / |U|
POSA(B) represents the set of all elements of U that can be uniquely classified into the classes of U/B by means of the attributes in A.
5. Reduction of Attributes
make any difference to the objects in the universe. Hence we can reduce the attributes and obtain the minimal set of attributes which delivers the same quality of classification.
2.3.2 Advantages
2.3.3. Disadvantages
2.4 Multi-Valued Logic
Uncertainty means that the atoms may be assigned logical values other than
the conventional ones - true and false, in the semantics of the program. The
use of multi-valued logics to express uncertainty in logic programs may be
suitable.
-Daniel Stamate
Multi-valued logic is a logical calculus in which there are more than two truth values. Traditionally, there were only two possible values for any proposition. An obvious extension to classical two-valued logic is an n-valued logic for n > 2 [10]. This extension leads to a new set of truth values, which may be finite or infinite but has the same structure in place. As Dubois and Prade said in [20], multi-valued logic is constructed on truth
function calculi: the degree of truth of a formula can be calculated from the degrees of truth of its subformulas. Multi-valued logic has been applied in the field of uncertainty, where degrees of truth were viewed as certainty factors.
Multi-valued logic has been used in a wide array of logic systems, such as memory, multi-level data communication coding, and various digital processors [28]. Its roots originate from Lukasiewicz and Post in the twenties. In this logic,
the calculus and its semantics) and set is considered to be fuzzy if it is the
There are many instances in the real world where we get different views from different stakeholders, for example during the software lifecycle. Different stakeholders are interested in different aspects, which usually results in information that might not be consistent with each other's views and opinions, and might even be incomplete in nature. Inconsistent viewpoints might
typical two truth values; rather, it can represent different types of contradictions and different levels of uncertainty. Belnap said in [27] that paraconsistent logic (multi-valued logic) has been driven by the need for automated reasoning systems that are not reduced to giving spurious answers if their database becomes inconsistent. The usual choice of values in multi-valued logic depends upon the nature of the problem or
Lattices are used to represent the information (truth values) of the system. In
multi-valued logic, we can calculate the product of lattices as the merging point for
different views for dealing with inconsistent data. The product of two lattices results in a lattice whose elements are pairs (a, b), composed of an element a from the first lattice and an element b from the second lattice. These products sustain all the information from each view, but the complexity increases. To deal with this, we can use the technique of abstraction. Abstraction
results in discarding some information and only retaining information that is relatively
important. With multi-valued logic, the more values we have, the more detailed the information we hold about the system, and the more complex it becomes. Hence, depending upon the problem at hand, we make a tradeoff between complete and abstract data.
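The product-of-lattices merge described above can be sketched in a few lines. This is a minimal illustration, assuming numeric truth values ordered componentwise; the four-valued example (pairs drawn from two two-valued lattices) is illustrative.

```python
from itertools import product

def lattice_product(l1, l2):
    """Elements of the product lattice are pairs (a, b), one component per view."""
    return list(product(l1, l2))

def leq(p, q):
    """Componentwise order on pairs of numeric truth values."""
    return p[0] <= q[0] and p[1] <= q[1]

def meet(p, q):
    """Greatest lower bound, taken componentwise."""
    return (min(p[0], q[0]), min(p[1], q[1]))

def join(p, q):
    """Least upper bound, taken componentwise."""
    return (max(p[0], q[0]), max(p[1], q[1]))

# Two two-valued lattices {0, 1} give a four-valued product, which can encode,
# e.g., (view 1 says true, view 2 says true) for two inconsistent viewpoints:
four = lattice_product([0, 1], [0, 1])   # (0,0), (0,1), (1,0), (1,1)
```

The pairs (0, 1) and (1, 0) are incomparable under `leq`, which is exactly how the product lattice records that the two views disagree.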
helpful, handsome. Fuzzy logic falls short in representing these descriptions through the use of fuzzy sets, and that is where multi-valued logic is used. As an example
given in [33]:
If X is A then Y is B
If X is A′ then Y is B′
logic these predicates are expressed as multi-sets. Multi-valued logic can be viewed
as an extension of fuzzy logic, and some of its features and principles can be extended. A truth degree is associated with each multi-set, which usually tells how true the predicate is; for example, an object may satisfy the predicate smart with the degree extremely. The main difference
between multi-valued logic and fuzzy logic is that in fuzzy logic the membership degree belongs to the interval [0, 1], whereas in multi-valued logic an ordered list of symbolic truth degrees is used, M = {τ0, …, τi, …, τM−1}, with the total order relation
τi ≤ τj if and only if i ≤ j
The truth degrees can be proposed by an expert using multi-valued logic, as long as the order relation is respected. The Lukasiewicz conjunction and disjunction are then defined on the degree indices as:
TL(τi, τj) = τmax(0, i + j − M + 1)
SL(τi, τj) = τmin(M − 1, i + j)
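These Lukasiewicz-style operators act on the indices of the symbolic degrees τ0 … τM−1, so they can be sketched directly on integers. The choice M = 7 below is an illustrative assumption, not from the essay.

```python
def t_norm(i, j, M):
    """Lukasiewicz conjunction on symbolic degree indices 0..M-1."""
    return max(0, i + j - M + 1)

def t_conorm(i, j, M):
    """Lukasiewicz disjunction on symbolic degree indices 0..M-1."""
    return min(M - 1, i + j)

# With M = 7 degrees (tau_0 = "not at all" .. tau_6 = "completely", illustrative):
M = 7
conj = t_norm(4, 4, M)     # combining two "rather true" degrees weakens them
disj = t_conorm(4, 4, M)   # disjunction saturates at the top degree
```

Note that `t_norm(i, M - 1, M)` returns `i`: the top degree τM−1 is the neutral element of the conjunction, matching classical "true".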
Using Generalized Modus Ponens, we can fire a rule whose observation is defined by the same multi-set as the premise, but with a modified truth-degree associated with it. The above relations are expressed through modifications of the truth degree.
2.4.1 Approximate Reasoning with Linguistic Modifiers
Modus Ponens with free rules [32]. The primary difference between typical GMP
rules and the new rules is that in GMP the observation and the premise correspond to the same multi-set, whereas in the new rules they are represented by different multi-sets. In multi-set theory, these modifiers result in the same multi-set, but the truth
degree is modified, whereas in fuzzy logic, these modifiers result in a whole new
fuzzy set which is different from the original set. An example of an approximate
If X is A then Y is B
X is m(A)
then Y is m(B)
provided m′ = m, thereby giving the ability to infer this conclusion. A general principle is that,
a modification applied on the rule premise will be applied to rule conclusion as well.
For example:
If X is vA then Y is vB
X is m(vA)
then Y is m(vB)
It is recommended to use modifiers which modify the truth-degree and not the actual
multi-set.
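The principle that a modification applied to the premise carries over to the conclusion can be sketched with a toy numeric model. Everything here is an illustrative assumption (the predicate names, the degree values, and the interpretation of the modifier as an index shift); it is not the exact calculus of [32].

```python
def gmp_with_modifier(premise, conclusion, observation, M):
    """Symbolic GMP sketch: each fact is (multi_set_name, degree_index).
    If the observation names the same multi-set as the rule premise, the rule
    fires, and the degree shift between observation and premise is applied to
    the conclusion degree as well (clamped to the valid range 0..M-1)."""
    (p_set, p_deg) = premise
    (c_set, c_deg) = conclusion
    (o_set, o_deg) = observation
    if o_set != p_set:
        return None  # different multi-set: the rule does not fire
    shift = o_deg - p_deg
    return (c_set, max(0, min(M - 1, c_deg + shift)))

# Hypothetical rule: "if X is tall (degree 4) then Y is heavy (degree 3)";
# observing "X is tall (degree 6)" raises the conclusion degree by the same shift:
result = gmp_with_modifier(("tall", 4), ("heavy", 3), ("tall", 6), M=7)
```

The conclusion keeps the rule's multi-set ("heavy") and only its truth degree moves, which is exactly the recommendation above: modify the truth degree, not the multi-set.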
Sarif and Barr define an n-variable multi-valued logic function in mathematical terms as the function f(x) with radix r; f(x): R^n → R, where R = {0, 1, …, r−1} is the set of r logic
Deterministic algorithm
This is based on a direct cover approach and requires high computational time
o Choose a minterm
o Repeat steps 1-3, until all minterms are explored.
The steps to choose a minterm and an implicant that covers it are critical. There are many different implementations of how to choose minterms and
This is based on exploring a large solution space and coming to near-optimal solutions. It is based on the concepts of chromosomes and genes, where a solution (chromosome) contains several genes. These genes consist of five attributes which represent the
Second and third attributes: window boundaries of the product term for the first variable X1
Fourth and fifth attributes: window boundaries of the product term for the second variable X2
for the length to be just right. If it is too short, the algorithm will not be able to reach the best solution, and if it is too long, it will take too much time. There are two proposed
2.4.3 Future Work
rules to more complex strong rules, such as a set with multiple premises.
-Eugene Charniak
directed acyclic graph (DAG) [22]. A Bayesian network can be used to represent a domain and calculate the probabilities of different problems that can occur. A Bayesian network is
Nodes represent random variables (RVs), which can either have discrete values (such as true/false) or continuous values (such as 1.0, 1.9). Directed arcs between
values of some of the nodes that have been observed. When new information
probabilities, due to which they might change. When a Bayesian network is referred to as a belief network, belief refers to the conditional probability given the evidence.
As the number of random variables (n) grows, it becomes hard to depict all probabilities; for example, if we have n = 5 binary variables, the full joint distribution requires 31 probabilities, whereas if n = 10, it requires 1023. Bayesian networks overcome this complexity
Figure 1: D-connecting Paths [23]
if every interior node n in the path has the property that either [23]:
To summarize, two nodes are d-connected if there exists a causal path between them or there exists evidence that renders the two nodes correlated with each other.
Another problem that comes with classical probabilistic theory is the problem of specifying the full joint distribution; with Bayesian networks, we do not run into this problem. Classically, each and every node in the network must be specified (all possible combinations of its values): p(r1, r2, …, rn) for all values of r1, r2, …, rn. This provides all the information associated with the distribution. Also, the sum of all the joint probabilities should equal 1.
It is important to understand how to number the random variables 1, 2, …, n. There are various techniques, but for our interest we will look at topological sort, where each
Recent work in this field has led to the invention of many new algorithms which are both sophisticated and efficient in nature for computing and inferring
between the variables and proceed by local computations, which makes the execution
times relatively short. We mentioned earlier that Bayesian networks have the
feature of Independence Assumption; hence new algorithms make full use of this
feature offered by Bayesian networks. Using this, the numbers specified at the local level are used for ensuring that the global distribution is consistent as well.
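The saving from local (conditional) specification can be illustrated with a hypothetical three-node network A → B, A → C, where B and C are independent given A. All probability numbers below are made up for the example.

```python
def joint(a, b, c, p_a, p_b_given_a, p_c_given_a):
    """Joint probability for the network A -> B, A -> C, using the factorization
    P(A, B, C) = P(A) * P(B | A) * P(C | A)."""
    pa = p_a if a else 1 - p_a
    pb = p_b_given_a[a] if b else 1 - p_b_given_a[a]
    pc = p_c_given_a[a] if c else 1 - p_c_given_a[a]
    return pa * pb * pc

# Illustrative local tables: one prior and two conditionals.
p_a = 0.3
p_b_given_a = {True: 0.9, False: 0.2}
p_c_given_a = {True: 0.7, False: 0.1}

# The full joint over 3 binary variables needs 2^3 - 1 = 7 free numbers;
# the factored form needs only 1 + 2 + 2 = 5, and the gap widens as n grows.
total = sum(joint(a, b, c, p_a, p_b_given_a, p_c_given_a)
            for a in (True, False) for b in (True, False) for c in (True, False))
```

The check that `total` equals 1 is exactly the global-consistency property mentioned above: locally specified tables automatically yield a valid joint distribution.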
2.5.3 Constraints
deterministic polynomial time) and usually takes exponential time to solve. There are many factors that are taken into consideration during the
evaluation of the network, such as the type of network, the type of algorithm used,
Exact Solution
To find an exact solution is usually NP-hard, with the exception of singly connected networks, where there is at most one undirected path between any two nodes, as shown in Figure 2. These are usually
We will not look at the algorithm in this paper, but it can be found in [25]. The
main difference between singly connected networks (Figure 2) and multiply connected ones is that in a singly connected network a change in one node will only affect its neighboring nodes; for example, in Figure 2, a change in d cannot affect any other nodes except those reached through b itself.
However, in multiply connected networks, there can be more than one path between any two nodes. Hence, when a change is introduced, for example in d, it will not only affect c, but also affect a through b.
Hence a will receive double effects (through b and c). This ripple effect is what makes exact inference hard; such networks can be converted to singly connected ones using techniques such as clustering. This conversion works fine when dealing with networks consisting of fewer nodes, but gets complicated when the nodes created through clustering take on large numbers of values. The trade-off is to go from an exact solution to an approximate solution.
Approximate Solution
in Bayesian networks, and each technique may or may not fit well depending upon the nature of the problem. These techniques are based on the following principles:
2.5.5 Applications
Bayesian networks have been applied in different domains. The most frequent are:
Diagnosis problems
Speech recognition
Data mining
Determination of errors
2.5.6 Advantages
approach.
Used for complex simulations, since it does not rely on the traditional approach of
assumption).
2.5.7 Disadvantages
CHAPTER 3
This chapter discusses two of the main applications where modeling and reasoning with uncertainty is paramount; these applications are Data Mining and E-services. For each, the chapter examines where uncertainty comes into play and recommends a model to deal with this uncertainty.
The fruits of knowledge growing on the tree of data are not easy to pick.
3.1.1 Background
[47]. It is a step in the knowledge discovery in databases process that consists of applying data analysis and discovery algorithms to extract knowledge from large data repositories. Computers have enabled humans to gather
more data than we can digest; it is only natural to turn to computational techniques
to help us unearth meaningful patterns and structures from the massive volumes of data. This is the problem that the digital information era made a fact of life for all of us: data overload [42].
1. Data Cleaning
2. Data Integration
3. Data Selection
4. Data Transformation
5. Data Mining
6. Pattern Evaluation
7. Knowledge Presentation
Across a wide variety of fields, data are being collected and accumulated at a dramatic pace. In fields such as marketing, the classical approach to data analysis relied fundamentally on one or more analysts becoming intimately familiar with the data and serving as an interface
between the data and the users and products [42]. Databases are increasing in size in two ways: a growing number of records, and an increasing number of fields or attributes associated with each
record. To replace this manual and traditional approach, which is slow and expensive, and to deal with huge databases, demand for data mining has grown
proportionally to handle and utilize data efficiently. The unifying goal is extracting
high level knowledge from low-level data in the context of large data sets [42].
future trends and be able to make knowledge-driven decisions. Data is stored in data warehouses, which support data integration and on-line analytical processing (OLAP), that is, analysis techniques
the ability to view information from different angles [45]. A data mart is a subset of the data warehouse which collects information about selected subjects, and thus its scope is department-wide.
3. Automated capability: ability to automatically discover hidden patterns or
4. Embedded learning capability: ability to learn from the past and to apply
Data mining has since evolved into an independent field of research in which
intelligent data analysis methods attempt to unearth the buried treasures from the
mountains of raw data [48]. The data mining component of knowledge discovery relies on techniques from machine learning, pattern recognition, and statistics to find patterns. Data mining has functionalities such as outlier analysis, association analysis, cluster analysis, and evolution analysis. The main tasks involved include the definition of a classification scheme, the classification of database values into the categories defined, and the
Figure 4: Data Mining [80]
A cluster consists of a group of objects that are more similar to each other than to objects in other clusters. Clusters are simply subsets of the data set. In fact, cluster analysis has the virtue of strengthening the exposure of patterns and behavior as more and more data becomes available [50]. The aim of cluster analysis is the classification of objects according to similarities among them, and the organization of objects into groups [47]. Once the clustering task is executed, the resulting categories can be either fuzzy or crisp (hard) in nature. The hard clustering method is based upon classical set theory, where an object either belongs or does not belong to a cluster [47].
On the other hand, the fuzzy clustering method is based upon the concept that an object can belong to several clusters simultaneously, with a degree of belief
associated with each object in the cluster. That is, during the clustering algorithm,
there could be some values that belong to the borderline, thereby not fully classifying
into one specific category, or they might belong to more than one category. In the real world, fuzzy clustering occurs more often than hard clustering, since objects on the borderline are not forcefully classified into one cluster. This is due to the fact that most real-world
Another issue that exists in data mining is when data values are given equal
treatment during the classification process which is carried out in a crisp manner.
During this classification, some values belong more to the category than others. For example, employee X has been working with a company for 15 years, and another employee Y has been working with the same company for 20 years. Theoretically, during classification they both belong to the same category; yet Y has more seniority than X, and during a crisp classification process we lose this important information.
posed by the collection of natural data which is often vague and uncertain [51].
Traditionally, to deal with uncertainty in data mining, several approaches have been proposed, such as fuzzy decision trees and fuzzy c-means. The underlying principle of these approaches is to associate a degree of belief with each value during the classification process, so that a data value can be classified into more than one category.
Fuzzy logic is a good model to deal with uncertainty in data mining. Fuzzy set theory is based upon membership functions, and users can use the given data to construct them. Integration of fuzzy logic with data mining techniques has become one of the key approaches to handling the vague and uncertain nature of the collection of natural data [52]. Rules can be designed to model the to-be-controlled
system given the input and output variables. Here are the basic steps of the
to initial categories for the data set. Many clustering algorithms are available
3. Evaluation of the clustering scheme: a clustering method is used to find a
the clusters. The membership value is in the range zero to one and indicates
5. Fuzzy classification: data values (Ai) are classified into categories according
identifier. This represents the confidence level with which tk.Ai belongs to the
category li.
classification beliefs and storing them in a cube, where the cells store the degree of belief for the classification of attribute values. This cube is also referred to as the CVS.
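A minimal sketch of such a belief cube, assuming a nested-mapping representation; the attribute and category names are hypothetical (echoing the seniority example above), and the structure is one plausible realization, not a prescribed one.

```python
# The "cube": tuple id -> attribute -> category -> degree of belief.
cube = {}

def store_belief(tuple_id, attribute, category, degree):
    """Record the degree of belief that a tuple's attribute value
    belongs to a given category."""
    cube.setdefault(tuple_id, {}).setdefault(attribute, {})[category] = degree

# Employee X (15 years of service) belongs mostly, but not fully, to "senior":
store_belief("X", "years_of_service", "senior", 0.75)
store_belief("X", "years_of_service", "mid", 0.25)
```

Unlike a crisp assignment, the cube keeps both degrees, so the seniority difference between borderline employees is not lost during classification.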
our data set, based on which sound decisions can be made. The fuzzy logic concept is used for quality measurement of our dataset with regard to each category.
9. Prediction and determination of new samples to determine which cluster they will belong to. This is usually done by calculating the average index of each
Repeat:
    For i = 1 to n: update the membership values μj(xi) by applying (3)
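The iterative update loop above (repeat: update memberships, then recompute centres) can be written, under simplifying assumptions that are mine rather than the essay's (one-dimensional data, fuzzifier m = 2, evenly spaced initial centres), as a small fuzzy c-means routine:

```python
def fuzzy_c_means(xs, c, m=2.0, iters=50):
    """Simplified 1-D fuzzy c-means sketch (requires c >= 2)."""
    lo, hi = min(xs), max(xs)
    # illustrative deterministic initialization: centres spread over the range
    centres = [lo + (hi - lo) * j / (c - 1) for j in range(c)]
    u = [[0.0] * c for _ in xs]
    for _ in range(iters):
        # membership update: u[i][j] from relative distances to each centre
        for i, x in enumerate(xs):
            for j in range(c):
                d_j = abs(x - centres[j]) or 1e-12
                u[i][j] = 1.0 / sum((d_j / (abs(x - ck) or 1e-12)) ** (2.0 / (m - 1.0))
                                    for ck in centres)
        # centre update: membership-weighted means of the data
        for j in range(c):
            den = sum(u[i][j] ** m for i in range(len(xs)))
            centres[j] = sum((u[i][j] ** m) * x for i, x in enumerate(xs)) / den
    return centres, u

# Two well-separated hypothetical clusters around 1.0 and 5.0:
centres, memberships = fuzzy_c_means([1.0, 1.1, 0.9, 5.0, 5.1, 4.9], c=2)
```

Each row of `memberships` sums to 1, so borderline points split their belief across clusters instead of being forced into one, which is the fuzzy-clustering behavior described above.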
quantity included in fuzzy sets. Using these measures, we can check which set best fits by checking the degree associated with the sets; this allows us to
3.1.5 Applications
There are many applications of fuzzy logic for dealing with uncertainty in
3.2 Semantic Web Services and Uncertainty
- J. Rowley
3.2.1 Background
these services over the Web [53]. Such services are referred to as E-services which
The Web has altered how businesses do their operations. The introduction of e-services has streamlined communications and provides a fast and efficient method of transacting with one another: electronic business with all other companies in the marketplace instead of traditional
Service offers are described in such a way that they allow automated discovery to
take place and offer request matching on functional and non-functional service
capabilities [54].
E-services are available for different purposes, such as banking, shopping, health care, and learning, and have high potential benefits in the areas of Enterprise
Application Integration and Business-to-Business Integration. The concept of e-
services plays a vital role in knowledge management applications through the ability
of exchanging functionality and information over the Internet. It is important to note that web services operate at a purely syntactic level [65], as shown in Figure
6 [67].
web services. Semantic Web Services are pieces of software advertised with a
formal description of what they do; composing services means to link them together
invocation, and interoperation are the core pillars of the deployment of semantic web
services [64]. SWS takes web services to the next level by adding the dimension of
semantically enhanced information processing in conjunction with logical inference
composition and execution of services in the web [65]. As Polleres said in [65],
SWS provides a seamless integration of applications and data on the web. Figure
6 [67] illustrates both web services and semantic web services, and Figure 7 [66]
Different semantic web service frameworks, such as the OWL Service Ontology (OWL-S), the Web Service Modeling Ontology (WSMO), and the Semantic Web Services
Framework (SWSF), are used to semantically describe the necessary aspects of
of a goal (the client's purpose in using web services) to the web service's capabilities are
common elements
to each matched web services capabilities. This will tell us which result is closer to
semantically and syntactically, it is tough for computers to use this language
These are machine processable but do not allow for uncertainty due to
The real power behind human reasoning, however, is the ability to do so in the face of uncertainty.
semantics provide the benefit of utilizing a common language which allows for
abduction, induction, and deduction. This will provide an inference mechanism that is
Uncertainty exists in almost every life situation, and semantic web services
are no different. As the authors of [63] note, one important issue with semantic web services is the fact that they are embedded in background ontologies which constrain the behavior of the involved entities. The Semantic Web provides a vision
query, the request is not one hundred percent crisp. Semantic description contains
to have the ability to deal with uncertainty. In these cases, we cannot assume exact
matches of inputs provided by the users, as we might not be able to comprehend them.
Since both web content and users' queries are vague or uncertain in nature, we need
Current semantic web services frameworks use first order logic and rely on subsumption checking for the matching process between goals and web services capabilities. As the authors of [71] said, over time, many people have responded to the need for increased rigor in knowledge representation by turning to first order logic as a semantic criterion. This is distressing since it is already clear that first order logic
intelligent agent using knowledge to interact with the world. In the real world,
concepts are not always subsumed by each other, and cannot always be classified
with a semantic web ontology, which is based on the concept of crisp logic. Semantic web frameworks such as OWL are not equipped to deal with this uncertainty. They assume that the knowledge base is crisp in nature, thereby entirely eliminating the
concept of uncertainty.
For the most part, classical theories were used in semantic web services for
knowledge base was assumed to be complete and precise. Hence there was a
need to extend non-classical theories to deal with uncertainty (both qualitative and
quantitative).
In recent years, probabilistic and possibilistic logics have been extended into semantic web services to deal with uncertainty. The underlying principle behind these approaches was to annotate the ontologies with some kind of uncertainty information about their axioms and use this information to perform uncertainty reasoning [68]. The main issue with this approach was that these uncertainties were
asserted by humans who are not good at either predicting or perceiving concepts
The foundational problem with Semantic web ontology is that it is built upon
There are various models recommended to deal with this situation and handle
3.2.4 Fuzzy Logic Uncertainty Model
To deal with an incomplete knowledge base, the combination of fuzzy logic with
combining fuzzy logic with probabilistic logic to complement each other and provide the best of both worlds. Fuzzy set theory classifies objects into fuzzy sets (sets with
fuzzy boundaries) along with the degree of membership associated with each object
in the set. Figure 8 [72] illustrates Web Services Framework using Fuzzy Set Logic.
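As a minimal illustration (the triangular membership function and the numbers below are assumptions for this essay, not taken from [72]), a degree of membership can be computed from a simple fuzzy set:

```python
def triangular(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], rising to 1 at x = b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# hypothetical fuzzy set "close match" over a 0-to-1 similarity score
degree = triangular(0.8, a=0.5, b=1.0, c=1.5)  # membership degree of score 0.8
```

An object is no longer simply in or out of the set; it belongs to the fuzzy set to the degree returned here.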
The main steps involved in integrating fuzzy logic with Semantic web services are [72]:
1. Scope and rules specification: Domain experts specify both the scope and
rules; these rules are matched in the rules matching phase with the web
service description
2. Fuzzy set generation: a fuzzy set is then generated based on the scope
the history of how often it is used. This is then stored in a local database and
4. Define fuzzy rules: Two fuzzy sets are defined; one is a fuzzy set of weights,
and the second is a fuzzy set of distance, which will be used in the matching
distance algorithm during the matching process. These fuzzy sets are used
in conjunction.
5. Model for fuzzy matching: all services that have been matched are stored in a database with associated weights, distances and matched values. Results are sorted in indexed order based upon weights. The fuzzy matching algorithm
Algorithm 1: FuzzyMatching
1.  for i ← 1 to n do
2.      initiate new thread
3.      member ← S[i]
4.      weight ← W[i]
5.      if weight is High then
6.          distance ← Approximate
7.      else if weight is Medium then
8.          distance ← Close
9.      else if weight is Short then
10.         distance ← Exact
11.     end if
12.     service ← Fetch Web service
13.     result ← call ApproximateMatchingAlgorithm(service, member, distance)
14.     if result > 0 then
15.         Store service in database
16.     end if
17.     Sort stored services
18.     for each stored service
19.         initiate new thread
20.         O[1..n] ← service.outputParameters
21.         service ← Fetch Web service
22.         I[1..n] ← service.inputParameters
23.         temp ← false
24.         for i ← 1 to n do
25.             if O[i] = I[i] then
26.                 temp ← true
27.             else
28.                 temp ← false
29.                 break loop
30.             end if
31.         end for
32.         if temp = true then
33.             link services and store in database
34.         end if
35.     end for
36. end for
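Algorithm 1 can be sketched sequentially in Python (the per-service threads are dropped for brevity, ApproximateMatchingAlgorithm from [72] is stubbed as a caller-supplied match_fn, and the dictionary-based service records are an assumption):

```python
# Fuzzy set of weights maps to a fuzzy set of distances used during matching.
DISTANCE = {"High": "Approximate", "Medium": "Close", "Short": "Exact"}

def fuzzy_matching(services, members, weights, match_fn):
    matched = []
    for service, member, weight in zip(services, members, weights):
        distance = DISTANCE.get(weight, "Exact")    # weight -> matching distance
        result = match_fn(service, member, distance)
        if result > 0:
            matched.append((result, service))       # store matched service
    matched.sort(key=lambda t: t[0], reverse=True)  # sort by match value
    links = []                                      # chain outputs into inputs
    for _, a in matched:
        for _, b in matched:
            if a is not b and a["outputs"] and a["outputs"] == b["inputs"]:
                links.append((a["name"], b["name"]))
    return matched, links
```

Services whose output parameters equal another service's input parameters are linked, as in steps 18-35 of the algorithm.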
specified by the service provider and rules specified in the first step are
operations.
pool of all web services. The final web service is selected by the domain expert
Tables 2 and 3 below show how fuzzy logic is integrated with the Semantic web to deal with uncertainty.
Table 2: Bootstrapping Phase (building the KR phase). Columns: Capabilities; Implicit Semantics; Formal Semantics; Possible use of Powerful (soft) Semantics.

Capability: Building ontologies either automatically or semi-automatically
Implicit Semantics: Analyzing word co-occurrence patterns in text to learn taxonomies/ontologies
Powerful (soft) Semantics: Using fuzzy or probabilistic clustering to learn taxonomic structures or ontologies

Capability: Annotation of unstructured content with respect to these ontologies (resulting in semantic metadata)
Implicit Semantics: Analyzing word occurrence patterns or hyperlink structures to associate concept names from an ontology with both resources and links between them
Powerful (soft) Semantics: Using fuzzy or probabilistic clustering to learn taxonomic structures or ontologies, OR using fuzzy ontologies

Capability: Entity Disambiguation
Implicit Semantics: Using clustering techniques or Support Vector Machines (SVM) for Entity Disambiguation
Formal Semantics: Using an ontology for Entity Disambiguation
Powerful (soft) Semantics: KR mechanisms to represent ontologies that may be used for Disambiguation

Capability: Semantic Integration of different schemas and ontologies
Implicit Semantics: Analyzing the extension of the ontologies to integrate them
Formal Semantics: Schema based integration techniques

Capability: Semantic Metadata Enrichment (further enriching the existing metadata)
Implicit Semantics: Analyzing annotated resources in conjunction with an ontology to enhance semantic metadata
Powerful (soft) Semantics: This enrichment could possibly mean annotating with fuzzy ontologies
Table 3: Utilization Phase. Columns: Capabilities; Implicit Semantics; Formal Semantics; Possible use of Powerful (soft) Semantics.

Capability: Complex queries
Formal Semantics: Query processing
Powerful (soft) Semantics: Hypothesis validation queries

Capability: Question Answering
Implicit Semantics: Word frequency and other CL techniques to analyze both the question and answer sources
Formal Semantics: Using formal ontologies for QA
Powerful (soft) Semantics: Providing confidence levels in answers based on fuzzy concepts or probabilistic

Capability: Concept-based search
Implicit Semantics: Analyzing occurrence of words that are associated with a concept, in resources
Formal Semantics: Using hypernymy, partonomy and hyponymy to improve search

Capability: Connection and pattern explorer
Implicit Semantics: Analyzing semi-structured data stores to extract patterns
Formal Semantics: Using ontologies to extract patterns that are meaningful

Capability: Context-aware retriever
Implicit Semantics: Word frequency and other CL techniques to analyze resources that match the search phrase
Formal Semantics: Using formal ontologies to enhance retrieval
Powerful (soft) Semantics: Using Fuzzy KR mechanisms to represent context

Capability: Dynamic user interfaces
Formal Semantics: Using ontologies to dynamically reconfigure user interfaces

Capability: Interest-based content delivery
Implicit Semantics: Analyzing content to identify the concept of content so as to match with interest profiles
Formal Semantics: User profile will have an ontology associated with it which contains concepts of interest

Capability: Navigational and Research
Implicit Semantics: Navigation searches will need to analyze unstructured content
Formal Semantics: Discovery style queries
Powerful (soft) Semantics: Fuzzy matches for research search results
CHAPTER 4
Intelligent systems have to deal with knowledge uncertainty in practically every real
world situation as much of the knowledge base is based on human knowledge which
models such as fuzzy logic, rough set theory and so forth and mapped best fitted
uncertainty model to data mining and semantic web services application. For
intelligent systems to deal with this uncertainty there has to be a proper design and
architecture in place. The focus of this chapter is to discuss design and architecture
the solution of NP-complete problems, for which an exact solution cannot be derived
in polynomial time [74]. Soft computing techniques can work around knowledge
of finding exact solutions cannot be applied anymore in today's world, which is highly unpredictable. Hence the need for soft computing came about to deal with
uncertainty.
The guiding principle of soft computing, as Zadeh said in [75], is to exploit the tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness and low solution cost. The main constituents of soft
computing are Neural Network (NN), Fuzzy Logic, Evolutionary Algorithm, and
above to create Intelligent Systems which can solve the problem at hand. Zadeh
distinct methodology for addressing problems in its domain; these methodologies are
methodologies provides soft computing with the cutting edge which is missing in
other techniques.
natural evolution and Darwin's theory of survival of the fittest. Natural
new solutions are created; and a suitable solution is selected depending upon
These steps undergo a series of iterations before the final solution is chosen.
from other methodologies as it aims for an optimized solution rather than just
a good solution and also makes use of historic data to gain better
of human knowledge.
2. Earlier soft computing would aim for good solutions versus optimal solutions;
4. Ability to handle real world applications by dealing with uncertainty rather than
ignoring it.
5. Support various applications for which mathematical models are not available
or inflexible.
The next sections deal with the design and architecture of intelligent systems under uncertainty.
The essence of designing an intelligent system lies in its ability to effectively control
replica of a human expert making a similar decision if they were placed in the situation in which the intelligent system is operating. In a closed environment, where all elements are accurately defined, and with minimal scope for change or introduction of new or unknown elements, intelligent systems can very precisely perform the actions for which they are designed. The design of such intelligent systems will be focused on
and rules. The variable which makes it harder to achieve this goal is uncertainty; it
ignoring this variable; it should be well considered during the early stages of design
4.2.1 Main Aspects of Design
1. Uncertainty in Objects
itself; these sensors help objects maintain their integrity within parameters defined during the design stage. If at any time, any measurement goes out of
measurements, then objects would filter the data to ensure they can ignore noise in the data. In situations where the knowledge base is not fully equipped with rules and facts to help objects, different soft computing techniques are used
An example would be the use of rough set theory, which can help
identify the uncertain situation. Through the use of approximation and rough
best of the ability; the success of handling uncertainty through rough set
facts in the knowledge base. Change is another factor which has to be dealt with, since the environment can change at any time. There could be many other
them as well. An interaction of the main object with other objects in the
system is dependent upon the nature of other objects which could be very
uncertain in nature. Due to these reasons, this level is a little hard to deal
store, the sales of the store can go down steeply; for an intelligent system, it becomes critical to understand if this was just a one-time thing due to bad
Because there are many different factors at this level, the best way to
well defined, then multi-valued logic can hold precisely almost all the
systems were less complicated, the scope of the problem would seldom change; expansion of horizon. But recently, as systems have become more complex and evolved to the next level, change is the only thing that is constant.
various variables. At this level, it is critical for the system to act intelligently
of its current situation and predict future evolution when the modifications are
Rough set theory can be used at this level to help with decision making and self-learning. For optimization, we can use various hybrid soft-computing
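As a small side illustration (not from the essay), the lower and upper approximations of rough set theory separate what is certainly true from what is only possibly true, which is what lets a system act sensibly under such uncertainty:

```python
def approximations(universe_partition, target):
    """universe_partition: equivalence classes of the universe; target: the
    concept to approximate. Returns (lower, upper) approximation sets."""
    lower, upper = set(), set()
    for block in universe_partition:
        if block <= target:
            lower |= block      # block lies entirely inside: certainly in target
        if block & target:
            upper |= block      # block overlaps: possibly in target
    return lower, upper

partition = [{1, 2}, {3, 4}, {5}]
lower, upper = approximations(partition, {1, 2, 3})
# the boundary region upper - lower holds the genuinely uncertain elements
```

Decisions can then be split into "act" (lower approximation), "never act" (outside the upper approximation), and "gather more facts" (boundary region).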
Traditional design frameworks are pretty effective and efficient in handling many real world applications; their main shortcoming is that they aim for a solution instead of an optimized solution. Similar to the field of agriculture, where hybrid seeds are created
been developed for optimization. Fuzzy logic and hybrid frameworks are two
1. Fuzzy Logic
class of objects into smaller granules in such a manner that the objects within granules are similar in nature, and objects amongst different granules are distinct in nature.
which objects belong to which granule. In addition to the typical black and white zones, there is a grey zone, where granules cannot be easily discerned from each other; the boundary line that divides one granule from another is fuzzy
than numbers which help bridge the gap between machine language and
human knowledge. These words act like labels of fuzzy granules; the ability
uncertainty thereby making systems more robust and flexible in dealing with
using words as labels for granules, can be practically used in every field
speed, due to the large search space. Current state of Artificial Neural
about various aspects of the network and the problem domain [73]. With the
networks which offer two main features: evolution and learning. These
based on Darwin's theory of survival of the fittest. The selection process is
such that the desirable behaviors and features are passed on to the next
1. Evolution introduced at weight training level
global optimal solution rather than local optimum solutions. Here is the
IV. Genetic operators are then applied to each child individual created
in step (III) to further reproduce the next generation.
VI. End
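This weight-evolution loop can be sketched as a tiny genetic algorithm; the population size, the Gaussian mutation operator and the toy fitness function below are illustrative assumptions, not from [73]:

```python
import random

def evolve_weights(fitness, n_weights=2, pop_size=20, generations=50, seed=0):
    """Evolve weight vectors: evaluate fitness, select the fittest half,
    and apply a genetic operator (Gaussian mutation) to breed children."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(n_weights)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)           # evaluate and rank
        parents = pop[: pop_size // 2]                # select the fittest
        children = [[w + rng.gauss(0, 0.1) for w in p] for p in parents]
        pop = parents + children                      # next generation
    return max(pop, key=fitness)

# toy fitness: the weights should approach the target vector [0.5, -0.25]
best = evolve_weights(lambda w: -((w[0] - 0.5) ** 2 + (w[1] + 0.25) ** 2))
```

In a real system the fitness would decode each individual into network weights and measure the network's error, which is why this approach can escape local optima that gradient descent gets stuck in.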
coding can be used in these cases to improve scalability such as,
this algorithm is similar to the algorithm at the weight level, except this
IV. Genetic operators are then applied to each child individual created
in step (III) to further reproduce the next generation.
VI. End
operating. The same learning rules are applied to the entire network and the architecture is set up in such a manner that for every learning rule
rate. The algorithm for the evolutionary search for learning rules, as in [73]:
this algorithm is very similar to the previous two algorithms, with the exception
II. Fitness of each network is evaluated depending upon the problem
in hand.
IV. Genetic operators are then applied to each child individual created
in step (III) to further reproduce the next generation.
VI. End
architecture at the highest level. Through this, we minimize the search space.
available at our disposal. This can lead to an exhaustive list of possible solutions, and the best one has to be chosen depending upon the nature of the problem at hand.
evaluated depending upon different criteria, such as speed vs. accuracy. The best solution chosen for the problem could be the one which uses the least amount of computational resources, or the one that provides more accuracy irrespective of the computational speed.
success, and success is the achievement of behavioral goals [33]. The success of
which can be used to implement intelligent systems that can deal with uncertainty.
These architectures at a higher level identify the main modules that are required
systems that can deal with uncertainty. In this section, few of these architectures
have been explored with the focus on how each of them deals with uncertainty.
Basic Architecture
The basic architecture is a simple architecture which receives an input X; this input represents the problem at hand. This could be the application being worked upon,
such as, data mining, or e-services. Function 1 can be implemented through any
soft-computing techniques such as fuzzy logic, or rough set theory whichever fits the
best depending upon the nature of the problem. Once it receives the input, it
Figure 10: Basic Architecture for Intelligent Systems
calculate the degree of belief (grade of membership) of each data value to the clusters.
The output is the data clustered together along with the value of degree of belief.
set of applications. This is usually applied in cases where one soft computing technique can solve the problem at hand. Intelligent systems based on this type of
architecture can be easily implemented, but may not be very efficient in solving
complex situations.
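A sketch of this basic architecture, with Function 1 implemented as a fuzzy membership computation against fixed, assumed cluster centres (fuzzy c-means style memberships; a real system would also learn the centres):

```python
def degrees_of_belief(x, centres, m=2.0):
    """Membership of data value x in each cluster (fuzzy c-means style).
    m is the fuzzifier; memberships over all clusters sum to 1."""
    dists = [abs(x - c) for c in centres]
    if 0.0 in dists:                 # x sits exactly on a centre
        return [1.0 if d == 0.0 else 0.0 for d in dists]
    memberships = []
    for d_i in dists:
        s = sum((d_i / d_j) ** (2 / (m - 1)) for d_j in dists)
        memberships.append(1 / s)
    return memberships

# input X = 2.0; output: degrees of belief against two assumed centres
beliefs = degrees_of_belief(2.0, centres=[0.0, 10.0])
```

The output pairs each data value with its degree of belief per cluster, which is exactly what the basic architecture's Function 1 emits.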
Hybrid intelligent systems are becoming very popular due to their ability to be
different techniques offers the best of both worlds; they utilize the best of AI
techniques to implement intelligent systems that are more efficient and effective.
There are three general approaches to the architectures of the intelligent system
[79]:
1. Sequential Type
exists at the level of the user initiating the query. Originally when the user input
processes the data, and creates fuzzy sets. Function 2 can be implemented
2. Parallel Type
Figure 12; there could be a few variations to this. These two functions
problem, and then Function 3 will choose the better solution and give that as a final output. If this was the setup of the problem, then uncertainty needs to be handled at the front level only when input X is received. Functions 1 & 2 could be implemented using Fuzzy logic and Rough set theory. Function 3 will choose the better of the two solutions, which can be performed using neural networks.
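A toy sketch of the parallel type under these assumptions (each function returns a candidate solution with a confidence score, and Function 3 simply keeps the more confident one; the scores and selection rule are illustrative):

```python
def parallel_architecture(x, function1, function2, choose):
    """Run two soft-computing functions on the same input in parallel
    (sequentially here) and let a third function pick the better result."""
    candidate1 = function1(x)
    candidate2 = function2(x)
    return choose(candidate1, candidate2)

result = parallel_architecture(
    x=4,
    function1=lambda x: ("fuzzy", x * 2, 0.7),     # (method, answer, confidence)
    function2=lambda x: ("rough", x * 2 + 1, 0.9),
    choose=lambda a, b: a if a[2] >= b[2] else b,  # Function 3: higher confidence
)
```

In the text's version, Function 3 would itself be a trained neural network rather than a simple comparison.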
performing different functions (narrow & broad) and then Function 3 will aggregate their inputs to produce the final output. In this case, uncertainty has to be dealt with at two levels; the front level when input X is received and
two solutions, if there was still some uncertainty, it could be dealt with by Function
3. Feedback Type
This is a type where Function 1 performs the main function required, and Function 2 is there to fine-tune the parameters of Function 1 (Figure 13), so that
subsumption.
Uncertainty can be dealt with at the level of receiving an input. In this
Evolutionary algorithm.
Different types of architectures for hybrid systems are based upon a mix and match of different soft-computing techniques. This mix and match offers the best of AI.
This is a specialized type of architecture for hybrid systems, where one function is
evolutionary algorithm and the other one can be chosen from a pool of available soft-computing techniques.
intelligence techniques such as fuzzy logic and multi-valued logic. Abraham and Grosan in [73] have discussed several architectures for evolutionary intelligent systems. For
intelligent paradigm in return can help optimize the evolutionary algorithm. Hence both help each other to obtain the level of optimization. Figure 14 from [73] shows the
Figure 14: Evolutionary Intelligent System Architecture [73]
computing techniques, hybrid systems have been developed which are fully capable
of handling real world applications. The next section provides us with an example of a real world application making use of an Intelligent System. This will help us understand
intelligent systems. In this section, we will take a look at a real world application of an intelligent system called the Adaptive Neuro Fuzzy Inference System (ANFIS) [86]. As the name suggests, this Intelligent System is based upon a combination of neural networks and fuzzy logic. Neural networks enable recognition of patterns and adaptation to the changing environment the agent operates in. Similarly, fuzzy logic provides the capability of inference and decision making through a knowledge base. ANFIS plays a critical role in the field of signal processing; we will
see how this intelligent system can help with noise cancellation in signal processing.
The application of noise cancellation can be applied to many real world applications
purposes we will take a look at one specific application in the field of medicine where it
desired signal [86]. The ultimate goal is to cancel or reduce the noise from the
signal so it does not distort the signals which can cause misinterpretation. The
by identifying the non-linear model between a measurable noise source and the
of noise in the system and then subtracting this from the signal. The effectiveness of
level. It is a critical step in translating signals properly to what they truly represent;
passage dynamics that transforms the noise source into the noise estimate in a detected signal [86]. The ANFIS architecture is composed of a neural network and fuzzy logic; we
Neural Network
Neural networks have already been mentioned in Section 4.2.2. ANFIS is based
Back Propagation
This learning algorithm is based upon the Widrow-Hoff learning rule, which is used to train multi-layer feed-forward networks. Training of networks involves the usage of input vectors and their corresponding output vectors until the network is trained to approximate a function, and is able to provide an association between input and output vectors as expected. Through this training, the network learns to associate input with output.
Back propagation refers to the manner in which the gradient is computed for non-linear multi-layer networks. When a back-propagation network is properly trained, it is able to associate, infer and make precise decisions when presented with an unknown input. Usually, through similarity in inputs, it will lean towards the correct output. This is based on two phases of data flow. The first phase is where the input is propagated from the input layer to the output layer, producing the output. The second phase is where the error signal is propagated from the output layer to the previous
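The two-phase flow can be sketched for a single sigmoid neuron (the one-neuron network, squared-error loss and learning rate are assumptions for illustration):

```python
import math

def train_step(w, b, x, target, lr=0.5):
    # phase 1: input propagated forward to produce the output
    y = 1 / (1 + math.exp(-(w * x + b)))
    # phase 2: error signal propagated backward to adjust the parameters
    error = y - target
    grad = error * y * (1 - y)        # gradient of 0.5*error^2 w.r.t. pre-activation
    return w - lr * grad * x, b - lr * grad

w, b = 0.0, 0.0
for _ in range(200):
    w, b = train_step(w, b, x=1.0, target=1.0)
# after training, the neuron's output for x=1.0 approaches the target
```

In a multi-layer network the same error signal is further propagated through each hidden layer via the chain rule, which is what "back propagation" names.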
Fuzzy Logic
Fuzzy logic has already been discussed in greater detail in Section 2.2. ANFIS is
given input to an output using fuzzy logic [86]. Figure 15 [89] illustrates the functional block of a fuzzy inference system. This system takes an input which is a crisp set, and returns the crisp output through a weighted average.
Figure 15: Basic Configuration of a Fuzzy Logic System [89]
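A sketch of that crisp-in, crisp-out pipeline, assuming a Sugeno-style rule base with weighted-average defuzzification (the membership functions and rule outputs below are illustrative, not from [89]):

```python
def fis(x, rules):
    """rules: list of (membership_fn, crisp_output) pairs. Each rule fires to
    the degree its membership function grants the crisp input; the crisp
    output is the firing-strength-weighted average of the rule outputs."""
    weights = [mf(x) for mf, _ in rules]
    total = sum(weights)
    if total == 0:
        raise ValueError("no rule fires for this input")
    return sum(w * out for w, (_, out) in zip(weights, rules)) / total

low = lambda x: max(0.0, 1 - x / 5)            # membership in "x is low"
high = lambda x: max(0.0, min(1.0, x / 5))     # membership in "x is high"
crisp_out = fis(2.5, [(low, 10.0), (high, 20.0)])  # both rules fire equally
```

Fuzzification, rule firing and defuzzification here correspond to the blocks of Figure 15.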
problems; one such problem is to suppress maternal ECG from fetal ECG.
by the mother for the safety of both the mother and the baby. Many health problems of a newborn baby can be reduced by monitoring the fetus's heart rate, since heart rate is an important indicator of health [90]. ECG, which stands for electrocardiogram, can be recorded and processed to derive this heart rate. Maternal ECG represents the mother's ECG and Fetal ECG represents the fetus's ECG. While trying to get
measurements for fetal ECG, there is interference from maternal ECG. Hence it is
crucial to suppress maternal ECG from Fetal ECG while measuring abdominal signal
ANFIS comes into play to deal with maternal ECG; we will look at the details of how maternal ECG is handled through ANFIS as discussed in [86]. Fetal ECG x(k) is recorded through abdominal signal y(k) via a sensor in the abdominal region. During the process of recording y(k), this signal gets mixed (noisy) with the mother's heartbeat n(k), which acts as noise. n(k) can be easily measured in this case through the thoracic signal obtained via a sensor placed at the thoracic region. Noise does not appear directly in y(k), but only appears in bits and pieces, which distorts the signal
Function B represents the path that the noise signal n(k) takes; if the path were known, then we would get the original signal through y(k) - d(k). Since it is an unknown factor and time variant due to the changing environment, it is not simple enough. d(k) is the distorted noise signal, which is an added component within y(k).
minimize error. The ANFIS approach to noise cancellation works only when
[86]:
2. Zero mean value for x(k)
In our case of suppressing maternal ECG from fetal ECG; information signal
x(k) is of sinusoidal form and noise is a random signal. ANFIS performs calculation
Figure 16: Maternal ECG Cancellation in Abdominal Signal using ANFIS [87]
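The subtract-the-estimate structure of this figure can be sketched with a one-tap LMS filter standing in for ANFIS (the synthetic signals, the constant gain 0.8 playing the role of passage dynamics B, and the linear filter choice are all simplifying assumptions for illustration):

```python
import math

def cancel_noise(measured, noise_ref, mu=0.02, epochs=500):
    """Adapt a one-tap filter w so that w * n(k) tracks the distorted noise
    d(k) inside the measured signal, then subtract the estimate."""
    w = 0.0
    for _ in range(epochs):
        for m, n in zip(measured, noise_ref):
            error = m - w * n        # residual = recovered signal estimate
            w += mu * error * n      # LMS update drives w toward the true gain
    return [m - w * n for m, n in zip(measured, noise_ref)]

# fetal-like signal x(k) corrupted by 0.8 * n(k) from the measurable noise source
x = [math.sin(0.3 * k) for k in range(50)]
n = [math.cos(1.1 * k) for k in range(50)]
measured = [xi + 0.8 * ni for xi, ni in zip(x, n)]
recovered = cancel_noise(measured, n)
```

ANFIS replaces the single tap with fuzzy if-then rules tuned by the neural network, which is what lets it capture a non-linear, time-varying passage B.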
In the real world, the issue of accurately predicting Fetal ECG without the
measurement. There will be a lot of interference from Maternal ECG which could not
were able to get a good estimation of fetal ECG. Even though this value contains some error, in comparison it outperforms and gives a better prediction of fetal ECG
An intelligent system implemented through a neural network alone would be a little complex to train, and the measurement of error in the estimated signal would be higher. On the contrary, if fuzzy logic alone were used, then it would be hard to create all the if-then rules, since the environment is complex. Since ANFIS is dependent upon both neural networks and fuzzy logic, it gets the best of both worlds. The measurement of error obtained through ANFIS is not zero and does represent high
ANFIS is just one of many examples of intelligent systems that are being used in real world applications to solve complex problems that involve uncertainty. ANFIS
A lot of work has been conducted in this field; R. Swarnalath and D.V. Prasad incorporated ANFIS with wavelets for maternal ECG cancellation as in [87]; more
beginning to end; it preprocesses the input, so the input is accepted by the intelligent system, transforms this input through various uncertainty models to effectively
handle uncertainty that exists in data and then finally produces the output. An
intelligent system that is implemented to handle uncertainty can handle real world
ignored.
CHAPTER 5
5.1 Conclusion
Artificial intelligence is an ever-growing field with a lot of scope for research and advancement. It has recently gained a lot of popularity through its ability to handle real world situations; since then, many new theories and methodologies have been
uncertainty is a part of our day to day lives. For the invention of more robust intelligent systems that can think and act like a human being, we have to apply
from probabilistic and possibilistic theory to a combination of these two. This essay looked at four main uncertainty models: Fuzzy logic, Rough set theory, Multi-valued logic, and Bayesian networks. These models share one common goal: to handle
approach to handling uncertainty varies amongst these models, and some are more effective in certain domains depending upon the nature of the domain in question. Hence
we cannot claim that one model is better than another because the solution to be
Hybridization of soft computing techniques provides a cutting edge to the
hybrid intelligent systems. These systems can handle complex systems efficiently
be solved. More and more research is being conducted in hybridization and a lot of work has been done to start using this in our day to day lives.
Data mining and semantic web services are two different applications with the need to handle uncertainty that exists at different levels. Different models have been identified and used for this purpose; this essay recommends fuzzy logic for handling the different uncertainties that exist in these applications. Fuzzy logic proves to be a good model to deal with uncertainty in data mining. This
algorithm based on membership functions and degrees of belief, and can handle what data mining needs. Its ability to transform crisp sets into fuzzy sets, along with the value of degree of belief, signals which objects belong more to the set in comparison to other objects in the same set; its ability to search for hidden patterns through huge amounts of data is also valuable. These features make fuzzy logic suitable for data mining.
Similar to data mining, fuzzy logic is recommended for the semantic web services domain. Uncertainty exists in semantic web services at different levels, from the user initiating a query to finding results that match the query. Fuzzy logic generates a fuzzy set to understand the user query, and when the system retrieves the results against the user's query, it creates two fuzzy sets which contain weight and distance information to display the results in order from most relevant to least relevant. The concept of fuzzy sets sets fuzzy logic apart from other soft-computing techniques.
The design and architecture play a central role in the success of an intelligent system. More and more algorithms have been developed recently to achieve
which is applied in the field of AI; this definitely helps to get rid of features that are not very viable, thereby reducing the search space. At the design level, dealing with uncertainty at the object, environment and goal levels helps to deal with uncertainty at an architecture level. Therefore, having the right design and architecture for intelligent
neural network and fuzzy logic useful in suppressing maternal ECG from fetal ECG.
As more and more work is conducted in the field of Artificial Intelligence and
uncertainty, new architectures are evolving which can handle any complex problem with efficiency and accuracy. That day is not far away when Artificial
Knowledge uncertainty in intelligent systems has come a long way from the initial state where intelligent systems were used for basic computation, to today's era, where intelligent systems have practically evolved to handle complicated real life situations. The success of these intelligent systems depends upon their abilities to handle uncertainty. Future research should be conducted to create more hybrid models which are generated through a mix and match of available models. To handle
compacting the environment into one which is smaller yet a true representation of the world it represents.
REFERENCES
[1] L.A. Zadeh, The Role of Fuzzy Logic in the Management of Uncertainty in
Expert Systems, Fuzzy Sets and Systems, Volume 11, Issues 1-3, pp. 199-227, 1983.
[5] A. Celikyilmaz and I.B. Turksen, Modeling Uncertainty with Fuzzy Logic, p.
400. Heidelberg, Germany: Springer, 2009.
[7] Y.Y. Yao, A Comparative Study of Fuzzy Sets and Rough Sets, Information
Sciences, Volume 109, Issues 1-4, pp. 227-242, 1998.
[9] S. Greco, B. Matarazzo and R. Slowinski, Rough Sets theory for Multicriteria
Decision Analysis, European Journal of Operational Research, Volume 129,
Issue 1, pp. 1-47, 2001.
[11] M. Bit, T. Beaubouef, Rough Set Uncertainty for Robotic Systems, Journal of
Computer Sciences in Colleges, Volume 23, Issue 6, pp. 126-132, 2008.
[12] Stuart Russell, and Peter Norvig, Artificial Intelligence: A Modern Approach,
Second Edition, p. 986. Upper Saddle River, N.J.: Prentice Hall, 2002.
[15] L.A. Zadeh, "Knowledge Representation in Fuzzy Logic," IEEE Transactions
on Knowledge and Data Engineering, Volume 1, Issue 1, pp. 89-100, March 1989.
[17] Pawlak, Z., and Skowron, A., Rough membership functions, Advances in the
Dempster Shafer Theory of Evidence, pp. 251-271, New York, NY: John Wiley
and Sons Inc., 1994.
[20] D. Dubois, and H. Prade, Possibility Theory, Probability Theory, and Multiple
Valued Logics, Journal of Mathematics and Artificial Intelligence, Volume 32,
Issues 1-4, pp. 35-66, August 2001.
[21] B.G. Buchanan and R.O. Duda, Principles of Rule-Based Expert Systems,
Advances in Computers, Volume 22, pp. 164-218, 1984.
[28] B. Sarif and M. Abd-El-Barr, Synthesis of MVL Functions Part I: The
Genetic Algorithm Approach, Proceedings of the International Conference on
Microelectronics, pp. 154-157, 2006.
[31] Dueck, G. W. and Miller, D. M., "A Direct Cover MVL Minimization: Using the
Truncated Sum," Proceedings of the 17th International Symposium on Multi-Valued Logic, pp. 221-227, May 1987.
[32] A. Borgi, K. Ghedira, and S.B.H. Kacem, Generalized Modus Ponens Based
on Linguistic Modifiers in a Symbolic Multi-valued Framework, Multi Valued
Logic, 38th International Symposium, pp. 150-155, 2008.
[33] J.S. Albus, Outline for a Theory of Intelligence, Proceedings of the 1991
IEEE International Conference on Systems, Man, and Cybernetics, Volume
21, Issue 3, pp. 473-509, 1991.
[34] C.J. Butz and J. Liu, A Query Processing Algorithm for Hierarchical Markov
Networks, 2003 IEEE/WIC International Conference on Web Intelligence
(WI03), pp. 588-592, 2003.
[36] L.R. Rabiner, B.H. Juang, An Introduction to Hidden Markov Models, IEEE
ASSP Magazine, Volume 3, Issue 1, pp. 4-16, 1986.
[37] Ralph L. Wojtowicz, Non-Classical Markov Logic and Network Analysis, 12th
International Conference on Information Fusion, pp. 938-947, 2009.
[40] D. Hand, H. Mannila, and P. Smyth, Principles of Data Mining, p. 546.
Cambridge, England: MIT Press, 2001.
[45] J. Han, and M. Kamber, Data Mining: Concepts and Techniques, Second
Edition, p. 386. San Francisco, CA: Morgan Kaufmann Publishers, 2006.
[52] S. Mitra, S.K. Pal, and P. Mitra, Data Mining in Soft Computing Framework:
A Survey, IEEE Transactions on Neural Networks, Volume 13, Issue 1,
pp. 3-14, 2002.
[53] D. Berardi, D. Calvanese, G. Giacomo, M. Lenzerini, and M. Mecella, A
Foundational Vision of E-Services, Web Services, E-Business, and the
Semantic Web, Volume 3095, pp. 28-40, 2004.
[58] G. Yee, and L. Korba, Negotiated Security Policies for E-Services and Web
Services, Proceedings of the 2005 IEEE International Conference on Web
Services, pp. 1-8, 2005.
[61] D. Parry, Tell Me the Important Stuff: Fuzzy Ontologies and Personal
Assessments for Interaction with the Semantic Web, Proceedings of the
2008 IEEE World Conference on Computational Intelligence, pp. 1295-1300,
2008.
[62] E. Sirin, and B. Parsia, Planning for Semantic Web Services, International
Workshop Semantic Web Services at ISWC, pp. 1-15, 2004.
[64] H. Haas and A. Brown, Web Services Glossary, W3C, 2004,
http://www.w3.org/TR/wsgloss/, retrieved 16 September 2010.
[72] K. Shehzad, and M. Javed, Multithreaded Fuzzy Logic based Web Services
Mining Framework, European Journal of Scientific Research, Volume 41,
Issue 4, pp. 632-644, 2010.
[74] L.A. Zadeh, Soft Computing and Fuzzy Logic, IEEE Software, Volume 11,
Issue 6, pp. 48-56, 1994.
[76] A. Korvin, H. Lin, and P. Simeonov, Knowledge Processing with Interval and
Soft Computing, p. 233. London, England: Springer, 2008.
[77] J. Jang, C. Sun, and E. Mizutani, Neuro-Fuzzy and Soft Computing: A
Computational Approach to Learning and Machine Intelligence, p. 614. Upper
Saddle River, N.J.: Prentice Hall, 1996.
[78] L.A. Zadeh, The Roles of Soft Computing and Fuzzy Logic in the
Conception, Design and Deployment of Intelligent System, Proceedings of
IEEE Asia Pacific Conference on Circuits and Systems, pp. 3-4, 1996.
[79] V. Vasilyev, and B. Ilyasov, Design of Intelligent Control Systems with Use of
Soft Computing: Conceptions and Methods, Proceedings of the 15th IEEE
International Symposium on Intelligent Control, pp. 103-108, 2000.
[82] S. Kok, and P. Domingos, Learning Markov Logic Network Structure via
Hypergraph Lifting, ACM Proceedings of the 26th Annual International
Conference on Machine Learning, pp. 505-512, 2009.
[83] J.S. Albus, A Reference Model Architecture for Intelligent System Design,
Proceedings of the 1996 IEEE International Conference on Systems, Volume
1, Issue 1, pp. 15-30, 1996.
[88] B.B. Jovanovic, I.S. Reljin, and B.D. Reljin, Modified ANFIS Architecture
Improving Efficiency of ANFIS Technique, Proceedings of the Symposium on
Neural Network Applications in Electrical Engineering, pp. 215-220, 2004.
[89] G. Luiz, C. Abreu, and J. Ribeiro, On-line Control of a Flexible Beam Using
Adaptive Fuzzy Controller and Piezoelectric Actuators, SBA Control and
Automation, Volume 15, Issue 14, pp. 377-383, 2003.
[90] G. Clifford, Fetal and Maternal ECG, Biomedical Signal and Image
Processing, Volume 2, Issue 1, pp. 1-10, 2007.