1st International Symposium on Advances in Artificial Intelligence and Applications
Neural Network Combining Classifier Based on Dempster-Shafer Theory
Benmokhtar Rachid, Huet Benoit, pages 3–10.
In this paper, we propose an improved version of the RBF network based on Evidence Theory (NN-ET), with one input layer, two hidden layers and one output layer, to improve classifier combination and recognition reliability, in particular for automatic semantic-based video content indexing and retrieval. Many combination schemes have been proposed in the literature, differing in the type of information provided by each classifier as well as in their training and adaptation abilities. Experiments are conducted in the framework of the TrecVid 2005 feature extraction task, which consists in ordering shots with respect to their relevance to a given class. Finally, we show the efficiency of the NN-ET combination method.
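The entry above builds on Dempster-Shafer evidence combination. As an illustrative sketch (not the authors' NN-ET network), Dempster's rule for fusing two classifiers' basic mass assignments over a frame of class labels can be written as:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments.

    m1, m2: dicts mapping frozensets of class labels to masses summing to 1.
    Returns the normalized combined mass assignment.
    """
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:  # compatible evidence reinforces the intersection
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:      # contradictory evidence contributes to the conflict mass
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two classifiers voting on classes A and B (masses are hypothetical):
m1 = {frozenset("A"): 0.6, frozenset("AB"): 0.4}
m2 = {frozenset("A"): 0.5, frozenset("B"): 0.3, frozenset("AB"): 0.2}
fused = dempster_combine(m1, m2)
```

The normalization by one minus the conflict mass is what distinguishes Dempster's rule from a plain product of beliefs.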
V-Detector algorithm with tree-based structures
Chmielewski Andrzej, Wierzchoń Sławomir T., pages 11–16.
V-Detector is a real-valued negative selection algorithm designed to detect anomalies in datasets. Many previous experiments focused on analysing the usability of this algorithm for detecting intruders in computer networks. Intrusion Detection Systems (IDS) should be efficient and reliable due to the large number of network connections and their diversity. Additionally, every connection is described as a record containing tens of numerical and symbolic attributes. Choosing proper structures to store samples of normal connections and detectors can significantly speed up the processes of learning and classification of observations; a further reduction of processor utilization can also be gained. In this paper we examine the usefulness of some tree-based structures for representing data in an IDS.
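The negative-selection principle behind V-Detector can be sketched as follows: generate random detectors and keep only those that do not cover any "self" (normal) sample. This is only an illustration of the idea; the paper's variable-radius detectors and tree-based storage are not reproduced here, and the sample data are hypothetical.

```python
import random

def euclid(a, b):
    """Euclidean distance between two equal-length point vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def generate_detectors(self_samples, n_detectors, self_radius, dim=2, seed=0):
    """Negative selection: keep random points lying outside every self region."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        candidate = [rng.random() for _ in range(dim)]
        if all(euclid(candidate, s) > self_radius for s in self_samples):
            detectors.append(candidate)
    return detectors

def is_anomalous(x, detectors, self_radius):
    """A sample covered by any detector is flagged as non-self (an anomaly)."""
    return any(euclid(x, d) <= self_radius for d in detectors)

# Hypothetical "normal traffic" samples clustered in one corner of the unit square:
normal = [[0.10, 0.10], [0.15, 0.12], [0.08, 0.20]]
detectors = generate_detectors(normal, n_detectors=200, self_radius=0.15)
```

By construction no detector overlaps a self sample, so normal observations are never flagged; coverage of the non-self space grows with the number of detectors.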
Local search and nature based metaheuristics: a case of flowshop scheduling problem
Duda Jerzy, pages 17–24.
In recent years some new nature-based metaheuristics for solving the flowshop scheduling problem have appeared in the literature. Most of them use local search procedures in order to provide good approximations of the optimal solutions. Their authors show that metaheuristics hybridised with local search perform better than without it. In this paper the author tries to answer an even more important question: do such hybrid metaheuristics perform better than their local search algorithms run independently?
BRILI, an English-Spanish Cross-Lingual Question Answering System
Ferrández Sergio, Ferrández Antonio, Roger Sandra, López-Moreno Pilar, Peral Jesús, pages 25–31.
This paper describes the BRILI Cross-Lingual Question Answering (CL-QA) system. The BRILI system is capable of answering English questions from Spanish documents. It participated in the 2006 edition of the Cross-Language Evaluation Forum (CLEF). Some new characteristics of BRILI are described, especially the strategy used in the question processing module, in which the Inter Lingual Index (ILI) module of EuroWordNet (EWN) is used with the aim of reducing the negative effect of question translation on the overall accuracy. The experimental evaluation on the CLEF 2004 set of questions reveals a precision of 34%, an improvement of around 20% in precision compared with a monolingual QA system using machine translation. These experiments show that our system obtains better results than machine translation and other current bilingual QA systems.
Creation and Application of Maps of Concepts for DL Ontologies
Goczyła Krzysztof, Waloszek Wojciech, Zawadzka Teresa, Zawadzki Michał, pages 33–39.
In previous work a novel knowledge representation method, called Knowledge Cartography, was introduced. The method allows for describing, in the form of a map of concepts, the interrelationships among concepts distinguished in a terminology, and for gradual (with the growth of our knowledge) assignment of individual objects to those concepts. The effectiveness of the process of building a map of concepts is a key factor influencing the usability of the method. This paper presents a new map-creating algorithm, TreeFusion, which exploits binary decision diagrams originally developed to support VLSI design. The paper also presents some current applications of Knowledge Cartography.
Implementation and Tests of Clustering over Vertically Distributed Data without Sharing their Value
Gorawski Marcin, Słabiński Łukasz, pages 41–50.
This paper describes an experimental implementation of a secure distributed k-Clustering algorithm which applies a homomorphic encryption scheme to preserve data privacy. We show in detail the technical aspects of the homomorphic encryption implementation and of its use in a distributed application. The implementation enables us to evaluate experimentally the efficiency of the clustering algorithm and to indicate its limitations and potential bottlenecks. Finally, we provide some conclusions and propose further work in this area.
Text Normalization as a Special Case of Machine Translation
Graliński Filip, Jassem Krzysztof, Wagner Agnieszka, Wypych Mikołaj, pages 51–56.
The paper presents some problems of text normalization for an inflected language like Polish. It is shown that text normalization, being usually one of the first steps of text-to-speech synthesis, is a complex process referring to several levels of analysis: morphological, syntactic and semantic. It is claimed here that text normalization may be treated in a similar way to Machine Translation: the tools and algorithms developed for Machine Translation may be used for text normalization, with the "spoken language" being treated as the target language.
A Data Clustering Algorithm Based On Single Hidden Markov Model
Hassan Md. Rafiul, Nath Baikunth, Kirley Michael, pages 57–66.
The ability to cluster data into different groups based on a particular similarity measure has wide appeal in many domains, including data mining, image classification, speech recognition, fraud detection and network traffic anomaly detection. Typically, a clustering algorithm partitions a dataset into a fixed number of clusters supplied by the user. In this paper, we propose a single Hidden Markov Model (HMM) based clustering method, which identifies a suitable number of clusters in a given dataset without using prior knowledge about the number of clusters. Initially, the dataset is partitioned into windows of fixed size based on the HMM log-likelihood values. This provides a framework for identifying the most appropriate number of clusters (windows of varying sizes). After determining the number of clusters, the data values are labeled and allocated to clusters. The algorithm is tested using a number of benchmark datasets. For both small and large datasets (the KDD 1999 Intrusion Detection dataset), the proposed algorithm performed significantly better than other commonly used clustering algorithms.
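The clustering above hinges on HMM log-likelihood values. A minimal sketch of the forward algorithm for a discrete-output HMM follows; the paper's windowing and cluster-identification machinery is omitted, the model parameters are hypothetical, and scaling would be needed for long sequences to avoid underflow.

```python
import math

def forward_likelihood(obs, pi, A, B):
    """P(obs | HMM) computed with the forward algorithm.

    pi: initial state distribution, A: state transition matrix,
    B: per-state emission distributions over discrete symbols.
    """
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]   # initialization
    for t in range(1, len(obs)):                       # induction step
        alpha = [B[j][obs[t]] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
    return sum(alpha)                                  # termination

# Hypothetical 2-state model over the symbol alphabet {0, 1}:
pi = [0.8, 0.2]
A = [[0.9, 0.1], [0.3, 0.7]]
B = [[0.9, 0.1], [0.2, 0.8]]
loglik = math.log(forward_likelihood([0, 0, 1], pi, A, B))
```

In a windowed scheme like the one described, the log-likelihood of each window under the shared model serves as the similarity signal driving the partitioning.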
Fault Modeling for Nonlinear Systems Using ANFIS
Khosravi Abbas, Lu Jie, pages 75–83.
In this paper we develop a new method to model faults occurring in different parts of nonlinear systems. Using an Adaptive Neuro-Fuzzy Inference System (ANFIS), we build a model of the fault-free plant which is used in the fault modeling procedure. The fault model is again an ANFIS system, and its parameters are adjusted indirectly using the difference between the actual output and the output of the plant model. Simulation results on a nonlinear system are shown in this paper, and they clearly demonstrate the capability of the proposed method for fault modeling.
Image Classification with Customized Associative Classifiers
Kobyliński Łukasz, Walczak Krzysztof, pages 85–91.
In recent years the concept of using association rules for classification has emerged. This approach has often proved more efficient and accurate than traditional techniques. In this paper we extend existing associative classifier building algorithms and apply them to the problem of image classification. We describe a set of photographs with features calculated on the basis of their color and texture characteristics, and experiment with different types of rules which use information about the existence of a particular feature in an image, its occurrence count and its spatial proximity to accurately classify the images. We suggest using association rules more closely tied to the nature of image data, and compare the classification results with those of simple rules that take into consideration only the existence of a particular feature in an image.
On evolutionary approach for determining defuzzyfication operator
Kosiński Witold, Markowska-Kaczmar Urszula, pages 93–101.
Ordered fuzzy numbers, which make it possible to deal with fuzzy inputs quantitatively, exactly in the same way as with real numbers, are defined together with four algebraic operations. For defuzzification operators, which play the main role in fuzzy controllers and fuzzy inference systems, an approximation formula is given. In order to determine it from a given training set, a dedicated evolutionary algorithm is presented. The form of the genotype, composed of three types of chromosomes, is given together with the fitness function. Genetic operators are also proposed.
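For context, the operator family being approximated generalizes classical defuzzifiers. The sketch below is the standard centre-of-gravity defuzzifier for a sampled membership function, not the paper's learned operator for ordered fuzzy numbers:

```python
def centroid_defuzzify(xs, mu):
    """Centre-of-gravity defuzzification of a sampled membership function.

    xs: sample points of the universe of discourse,
    mu: membership grades at those points.
    """
    total = sum(mu)
    if total == 0:
        raise ValueError("empty fuzzy set: no support to defuzzify")
    return sum(x * m for x, m in zip(xs, mu)) / total

# A symmetric triangular fuzzy number peaking at 1.0, sampled at three points:
crisp = centroid_defuzzify([0.0, 1.0, 2.0], [0.5, 1.0, 0.5])
```

A symmetric membership function defuzzifies to its peak; skewing the grades pulls the crisp value toward the heavier side, which is the behaviour an evolved operator must reproduce on the training set.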
Linear Ordering of Observation Space for Pattern Recognition
Kulikowski Juliusz Lech, pages 103–109.
The problem of linear ordering of an observation space, as a novel approach to pattern recognition based on non-parametric, combinatorial statistical tests, is presented. The problem consists in ordering the elements of a discrete multi-dimensional observation space along a curve such that elements belonging to the same similarity class are as close to each other as possible, the similarity classes are mutually separated, and the length of the curve is minimized. Such a problem is, in general, NP-hard, and it is shown how an approximate solution can be reached. Its effective solution leads to the construction of a pattern recognition algorithm based on combinatorial (serial) statistical tests.
EMOT—Evolutionary Approach to 3D Computer Animation
Kwasnicka Halina, Wozniak Piotr, pages 111–121.
Key-framing and Inverse Kinematics are popular animation methods, but new approaches are still being developed. We propose a new evolutionary method of animation creation, the EMOT system (Evolutionary MOTion). It allows for automating the motion of animated characters. It uses a new evolutionary approach, Gene Expression Programming (GEP). Characters are controlled by computer programs; the animator only has to provide the way motion is evaluated. GEP works with a randomly generated initial population and uses directed but random selection. Experiments show that the proposed method can develop robust controllers.
Documents clustering method based on Ants Algorithms
Machnik Łukasz, pages 123–130.
Ant algorithms, and particularly the Ant Colony Optimization metaheuristic, are universal, flexible and scalable because they are based on multi-agent cooperation. The increasing demand for effective methods of managing large document collections is a sufficient stimulus for research on new applications of ant-based systems in text document processing. In this publication the author presents an implementation of this technique in the document clustering area: a new document clustering method. The aim of this paper is to present the details of the ACO document clustering method and detailed results of experiments.
A Data Quality Assessment Algorithm with Applications in Predictive Toxicology
Malazizi Ladan, Neagu Daniel, Chaudhry Qasim, pages 131–140.
The lack of quality of information integrated from heterogeneous sources is an important issue in many scientific domains. In toxicology the importance is even greater, since the data are used in Quantitative Structure-Activity Relationship (QSAR) modeling for prediction of the chemical toxicity of new compounds. Much work has been done on QSARs, but little attention has been paid to the quality of the data used. The underlying problem is the absence of a quality criteria framework in this domain. This paper reviews some of the existing data quality assessment methods in various domains and their relevance and possible application to predictive toxicology, highlights a number of data quality deficiencies found in experimental work on internal data, and proposes some quality metrics concluded from the results.
GA-Based Rule Extraction from Neural Networks for Approximation
Mularczyk Krystyna, Markowska-Kaczmar Urszula, pages 141–148.
The paper presents a method of rule extraction from neural networks solving the approximation problem. It is based on two hierarchical evolutionary algorithms with multiobjective Pareto optimisation. The lower-level algorithm searches for rules that are optimised by the upper-level algorithm. The conclusion of a rule takes the form of a tree whose inner nodes contain functions and operators, and whose leaves contain identifiers of attributes and numeric constants. Details of rule encoding, genetic operators and the fitness function are described. Some preliminary results of experimental studies are presented as well.
Universal Method for Solving Timetabling Problems Based on Evolutionary Approach
Norberciak Maciej, pages 149–157.
Timetabling problems are often hard and time-consuming to solve. Most methods of solving them concern only one problem instance or class. This paper describes a universal method for solving large, highly constrained timetabling problems from different domains. The solution is based on an evolutionary algorithm framework and employs tabu search to speed up the solution-finding process. Hyper-heuristics are used to establish the operating parameters of the algorithm. The method has been used to solve three different timetabling problems, with promising results in some preliminary experiments.
General Structure of T-Lymphocyte Applied to Immune-Based Event Detection in Financial Time Series
Pelech-Pilichowski Tomasz, Duda Jan T., pages 159–167.
The paper focuses on the construction of a T-lymphocyte applied to immune-inspired event detection in financial time series. The goal is to recognize symptoms of abrupt changes in the long-term mean value of many processed series. The task of the T-lymphocyte is to distinguish between "healthy" and "ill" states by examining individual series, with algorithms based on weak and rigorous statistical tests (the detailed operation of detection is shown). The general structure of the T-lymphocyte algorithm is illustrated, and a comparison of the numbers of detected symptoms is presented.
Multiclassifier Approach to Tagging of Polish
Piasecki Maciej, Wardyński Adam, pages 169–178.
The large tagset, the limited size of corpora and the free word order are the main reasons for the low accuracy achieved in tagging Polish with the commonly used techniques based on stochastic modelling. The proposed architecture of the Polish tagger TaKIPI makes it possible to use different types of classifiers in tagging, but initially only C4.5 decision trees were applied. The goal of this paper is to explore the range of promising rule-based classifiers and to investigate their impact on tagging accuracy. Moreover, some simple techniques of combining the classifiers are also tested. The performed experiments showed that even a simple combination of different classifiers can increase the accuracy of the tagger by almost one percent.
Credibility Coefficients Based on Decision Rules
Podraza Roman, Walkiewicz Mariusz, Dominik Andrzej, pages 179–187.
Credibility coefficients are heuristic measures evaluating the similarity of objects with respect to other data in information systems (or decision tables). By applying knowledge discovery methods it is possible to reveal rules from an information system. However, the knowledge obtained from the data can be imprecise due to improper data. It is assumed that the majority of the data is correct and only a minor part may be improper. Credibility coefficients of objects are meant to indicate to which group a particular object probably belongs. The main focus of the paper is an algorithm for calculating credibility coefficients, based on decision rules generated using rough set theory. Implementation and applications of credibility coefficients are presented in the paper, together with a discussion of some practical results of identifying improper data with credibility coefficients.
Applications of Graph Neural Networks to Large-Scale Recommender Systems, Some Results
Pucci Augusto, Gori Marco, Scarselli Franco, Hagenbuchner Markus, Tsoi Ah Chung, pages 189–195.
Recent machine learning algorithms exploit relational information within a graph-based data model. This paper considers one of the most promising supervised machine learning methods for this class of problems, graph neural networks (GNN). An application of the GNN approach to a recommender system problem revealed certain limitations of the GNN. A recommender system makes personalized product suggestions by extracting knowledge from previous user interactions. This is an especially interesting scenario because of the scale-free nature of the graphs modelling user preference dependencies. We focused on the MovieLens dataset, which contains data collected from a popular recommender system on movies and has been widely used as a benchmark for evaluating recently proposed approaches. This paper performs a deep analysis of the dataset to help discover some intriguing properties, and discusses the problems and limitations encountered by GNN when facing this particular practical problem.
Neural modeling of steam turbines
Sobańska Paulina, Szczepaniak Piotr, pages 197–205.
This study shows that neural networks are useful for creating precise models of industrial objects such as turbosets. The complex dependencies of two cooperating turbines were successfully captured by the neural network, which is enormously beneficial given the lack of a precise mathematical model. The estimated error of the neural model varies within the range of a few percent, which is a satisfactory result.
Differential Evolution: Competitive Setting of Control Parameters
Tvrdík Josef, pages 207–213.
This paper focuses on the adaptation of control parameters in differential evolution. A competition of different control parameter settings is proposed in order to ensure the self-adaptation of parameters within the search process. Several variants of such an algorithm were tested on six functions at four levels of search-space dimension. This competitive differential evolution proved to be more reliable and less time-consuming than standard differential evolution, and also outperformed the other algorithms tested.
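The competitive idea can be sketched as DE/rand/1/bin in which each trial vector draws its (F, CR) pair with probability proportional to that pair's past successes. This is a hedged illustration under simplified assumptions; the paper's exact competition scheme, reset rules and test suite are not reproduced.

```python
import random

def competitive_de(f, bounds, settings, pop_size=20, gens=100, seed=1):
    """Minimize f with DE/rand/1/bin and competing (F, CR) settings."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    wins = [1] * len(settings)  # each setting starts with one virtual success
    for _ in range(gens):
        for i in range(pop_size):
            # competition: more successful settings are sampled more often
            k = rng.choices(range(len(settings)), weights=wins)[0]
            F, CR = settings[k]
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if (rng.random() < CR or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            f_trial = f(trial)
            if f_trial <= fit[i]:        # greedy selection; credit the setting
                pop[i], fit[i] = trial, f_trial
                wins[k] += 1
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# Minimize the 2-D sphere function with three competing (F, CR) settings:
sphere = lambda x: sum(v * v for v in x)
best_x, best_f = competitive_de(sphere, [(-5.0, 5.0)] * 2,
                                [(0.5, 0.9), (0.8, 0.5), (0.9, 0.1)])
```

Crediting only successful replacements makes the sampling weights self-adapt: a setting that stops producing improvements gradually loses its share of trials.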
Generating New Styles of Chinese Stroke Based on Statistic Model
Xu Miao, Dong Jun, pages 215–222.
Chinese calligraphy is one of the most important arts in Chinese history. It is a form of entertainment as well as an embodiment of imagery thinking. In this paper, a statistical-model-based approach to generating new styles of Chinese character strokes is proposed. First, original calligraphy samples are aligned into a common coordinate frame, and a training set consisting of landmarks is generated semi-automatically. After that, the most significant features of the training set are extracted, and a statistical model is built in order to generate strokes with new styles. Then, Bezier curves are used to fit the discrete contour data. Some results are demonstrated finally.
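The final fitting step uses Bezier curves. A minimal cubic Bezier evaluator in Bernstein form is shown below; the control points are hypothetical, and the paper's landmark model and least-squares fitting procedure are not reproduced.

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Point on a cubic Bezier curve at parameter t in [0, 1] (Bernstein form)."""
    u = 1.0 - t
    return tuple(u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

# Sample a hypothetical stroke contour segment as a polyline:
control = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]
polyline = [cubic_bezier(*control, t=i / 10) for i in range(11)]
```

The curve interpolates the first and last control points and is pulled toward the inner two, which is what makes a chain of such segments a compact representation of a smooth stroke contour.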
An Approximation Branch-and-Bound Algorithm for Fuzzy Bilevel Decision Making Problems
Zhang Guangquan, Lu Jie, Dillon Tharam, pages 223–231.
Organizational decision making often involves two decision levels. When the leader at the upper level attempts to optimize his/her objective, the follower at the lower level tries to find an optimized strategy for each possible decision made by the leader. Furthermore, such bilevel decision making may involve uncertain parameters which appear in the objective functions or constraints of either the leader or the follower. Following our previous work on fuzzy bilevel decision making, this study proposes a solution concept and related theorems for general-fuzzy-number-based fuzzy-parameter bilevel programming problems. It then develops an approximation branch-and-bound algorithm to solve the proposed problem.
Workshop on Agent Based Computing III
JABAT—an implementation of the A-Team Concept
Barbucha Dariusz, Czarnowski Ireneusz, Jędrzejowicz Piotr, Ratajczak-Ropel Ewa, Wierzbowska Izabela, pages 235–241.
The paper proposes a JADE-based A-Team environment (in short: JABAT) as middleware supporting the construction of dedicated A-Team architectures that can be used for solving a variety of computationally hard optimisation problems. The paper includes a general overview of the functionality and structure of the proposed environment and a description of how this functionality can be extended to solving new problems.
Spatial Telemetric Data Warehouse and Software Agents as Environment to Distributed Execute SQL Queries
Gorawski Marcin, Płuciennik Ewa, pages 243–252.
The article presents an environment for distributed SQL query execution in a telemetric data warehouse. A parallel spatial data warehouse stores information read remotely from meters grouped in nodes according to their geographical positions. This data warehouse constitutes a distributed data structure. A software agent environment enables querying this structure as if it were local, so the distribution is transparent to the user. Query evaluation consists of analysis, decomposition, local execution and result merging. There is no need to transfer data between nodes. The authors present operators for query modification and sub-result merging, as well as test outcomes of SQL query execution in different agent environment configurations.
Simple Reputation Metrics for Mobile Agent in Open Environment
Klopotek Mieczyslaw A., Wolski Michal, pages 253–261.
Trust metrics are useful as a mechanism for building reputation in virtual communities. In this paper we describe and compare four reputation metrics used for mobile agents. The comparison is based on a simulation of an open environment characterized by an unknown network topology. In our experiments, when agents begin to inspect the network they do not have any information about the nodes. We present our simulation environment (MARS) and a "path mechanism" that is very helpful in recognizing evil nodes. In the last part of this paper we check which of the well-known algorithms can better recognize the reputation of a node.
An Agent-based Approach to Group Decision Simulation using Argumentation
Marreiros Goreti, Novais Paulo, Machado José, Ramos Carlos, Neves José, pages 263–270.
Group decision making simulation allows for the creation of virtual group decision scenarios. The use of a group decision simulator enhances user competences in this area, allows testing different argumentation strategies, and validates useful real-world "what if" scenarios. In this paper, a multi-agent model to simulate group decision making tasks is proposed. Agents are designed with emotional properties, reason with incomplete information, and use persuasive argumentation to convince the other group members of the best alternative choice.
Introducing fault tolerance and responsiveness in web applications using SREFTIA
Niazi Muaz, Ahmed H. Farooq, Ali Arshad, pages 271–278.
We present a framework for adding fault tolerance and responsiveness to an existing web-based application using transportable information agents. The framework is presented step by step and uses several agents to introduce fault tolerance into a system. We also show simulated results for an actual application where we introduce fault tolerance and responsiveness, and demonstrate a reduction in downtime as well as an improvement in responsiveness.
6th International Interdisciplinary Conference on Electronic Commerce
Towards an Open eBusiness Collaboration Community
Campelo F. Eulálio G., Stucky Wolffried, pages 281–288.
Internet access is no longer an issue for European organizations. However, the integration of these organizations into regional business networks is still a challenge for most enterprises, especially in the SME sector. The current eMarketplace model, which was one of the faces of the eBusiness revolution at the beginning of the 21st century, did not fulfill the expectation of including organizations of all sizes and sectors in a unified electronic market. This paper deals with this issue and suggests a new eBusiness model to foster broader open participation of companies in the B2B world, allowing them to overcome some of the current obstacles (e.g. high investment costs, partner collaboration mechanisms, standards, interoperability) to taking part in a virtual collaborative network.
An uncertain problem of caching dynamic content for a web e-commerce
Gil Waldemar, Orski Donat, pages 289–296.
Improving the performance of Web-based systems is a crucial requirement, especially in the e-commerce area. Many sites use dynamic web pages to deliver customized content to their users, which is one of the most important motivations for improving the performance of dynamic-content web systems. In many papers, web log mining is used to develop web caching and prefetching strategies. In this paper, we present the benefit of this technique for boosting the performance of serving dynamic content. Generated documents are placed in a dedicated part of the cache to avoid the drawback of incorrect replacement of requested documents. We exploit a method for variable-size content objects that uses an n-gram-based prediction of future web requests to improve the basic buffering algorithm. The adaptation is reasonable where, because of uncertainty, the algorithm parameters could not be tuned in an optimal way. Experimental results indicate improved web-access performance for dynamic content with this method.
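The n-gram request prediction mentioned above can be sketched as an order-1 (bigram) model trained on a web access log. This is an illustration only; the paper's variable-size-object cache integration is omitted, and the log entries below are hypothetical.

```python
from collections import Counter, defaultdict

class BigramPredictor:
    """Predicts the next requested document from the current one."""

    def __init__(self):
        self.counts = defaultdict(Counter)

    def train(self, access_log):
        """Count document-to-document transitions in the request sequence."""
        for prev, nxt in zip(access_log, access_log[1:]):
            self.counts[prev][nxt] += 1

    def predict(self, current):
        """Most frequent successor of `current`, or None if never observed."""
        if current not in self.counts:
            return None
        return self.counts[current].most_common(1)[0][0]

# Hypothetical request log; the predicted page could be prefetched into the cache:
predictor = BigramPredictor()
predictor.train(["home", "cart", "home", "cart", "home", "news"])
```

Higher-order n-grams condition on longer request histories and predict more precisely at the cost of sparser statistics; the order-1 case keeps the sketch short.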
Relationship between IT Security and users’ needs
Kapczyński Adrian, pages 297–302.
This paper deals with the mapping of IT security onto users’ needs. The scope of the IT security domain was narrowed to authentication methods based on physiological or behavioral characteristics of human beings. The preferences of a chosen group of people were diagnosed, and based on that an AI-enhanced tool was built to help select biometric methods according to specified criteria.
No Trust, No Transaction – The Implications For The Internet Suppliers
Kossecki Paweł, Świerczyńska-Kaczor Urszula, pages 303–309.
Building customers’ trust is essential for an Internet supplier in acquiring consumers’ loyalty, increasing their satisfaction, and encouraging customers to move their spending from the traditional market to the e-market. The authors identify the key factors which affect building customers’ trust. They analyse two groups of factors: those strictly connected with the process of making transactions (e.g. communication before purchase, security of payment, delivery costs, the means of dealing with claims and returns) and those not related directly (e.g. law regulations, protection of consumer rights and privacy, technical infrastructure, transfer of external trust).
Relationship Between IT Technology and the User Needs
Michalski Andrzej M., pages 311–316.
Nearly all solutions in current Information Technology (IT) are technology-dependent, and suitable technology is, as a rule, developed to fulfil the specific needs of the user. The paper describes the sequence of repeated cycles observable in the IT development process. A short case study based on SBC (Single Board Computer) technology is included.
Long-term Preservation of Digital Information
Styblińska Maria, pages 317–324.
Digital materials represent our heritage and future intellectual capital. The fast pace of change in the technological landscape makes ensuring long-term access to this material a challenge. The paper discusses some specific aspects of the preservation of assets in the digital age.
2nd Workshop on Large Scale Computing on Grids
Efficient Matchmaking in an Agent-based Grid Resource Brokering System
Dominiak Mateusz, Kuranowski Wojciech, Gawinecki Maciej, Ganzha Maria, Paprzycki Marcin, pages 327–335.
The centralized yellow-pages-based approach is one of the well-known methods of facilitating resource discovery in distributed systems. Due to its conceptual simplicity, it became our choice in two recent research projects: an agent-based model e-commerce system and an agent system for resource brokering in a grid. The latter project is used as a background to study efficient ways of implementing an agent-based yellow page service. Results of actual experiments provide us with guidelines on how to maximize its throughput.
The Software Agents in a Database Grid
Gorawski Marcin, Bańkowski Sławomir, Gorawski Michał, pages 337–344.
The software agent methodology presented in the following paper is based on a grid structure. A single grid node can be configured to access a remote folder or to remotely access other agents. Tasks are distributed and executed in parallel on many nodes, which significantly increases system performance. Built-in functions measuring local node performance allow precise monitoring of whole-system efficiency. The system is adapted to a multi-user mode, and a good balancing mechanism provides equal use of all system nodes. This is essential for full usage of system capability in an unknown environment. The balancing objective is to omit slower nodes and to send more tasks to faster ones.
Selection of Indexing Structures in Grid Data Warehouses
Gorawski Marcin, Gorawski Michał, Bańkowski Sławomir, pages 345–354.
Data warehouse systems service ever larger sets of data. Effective data indexing alone is not sufficient, because a single system node is unable to store so much quickly flowing data. An increasingly common solution is the distribution of an indexing structure among several system nodes. The indexing structures presented in the following article are based on grid technology. They assure good performance on every node, and their diversity allows adjusting to any data type. The presented system offers effective data distribution and index management on many system nodes connected via a network.
Parallel Program Execution Anomalies
Kwiatkowski Jan, Pawlik Marcin, Konieczny Dariusz, pages 355–362.
The parallel runtime depends not only on the utilized algorithm but also on the properties of the hardware and software environment. Execution time anomalies resulting from unexpected behavior of the hardware and software environment, or from an inexact performance model, are well known to every scientist analyzing the performance of a parallel program. Abnormal behavior can usually be observed only for specific input data, execution parameters or hardware configurations. In the paper, different sources of anomalies are discussed and the way performance analysis can be carried out in their presence is presented.
Performance Prediction Methods (179). Szymanska-Kwiecień Agnieszka, Kwiatkowski Jan, Pawlik Marcin, Konieczny Dariusz, pages 363 – 370. Abstract: The increase in computer network speed, paired with the ubiquity of inexpensive yet fast and generously equipped hardware, offers many organizations an affordable way to increase the available processing power. Clusters, hyperclusters, and even grids, not so long ago seen only in huge data centers, can now be found helping many small organizations meet their computational needs. Finding an effective application performance prediction method constitutes a particularly important but still open research problem. The paper presents a short overview of techniques used to predict application execution time. The whole spectrum of prediction methods is analyzed, and selected tools dedicated to grid environments are presented.
Workshop on Real-Time Safety-Critical Software
RT-UML in modeling of multimedia applications (187). Cichoń Paweł, pages 373 – 382. Abstract: RT-UML enables a structural and behavioral description of multimedia applications, including their time characteristics; however, it offers no mechanism for expressing the presentation (arrangement) of multimedia objects or the time constraints between events. This illustrates RT-UML's lack of expressive capability, especially in the context of multimedia application modeling. The extension of RT-UML presented in the paper shows how multimedia object arrangement and event synchronization can be expressed: a new type of diagram (presentation diagrams) is added to UML, and a synchronized-event-set class is placed in the RT-UML time model. Moreover, the extensions add a graphical presentation of synchronized events to UML sequence and activity diagrams, which enables expression of the time constraints of multimedia objects and their activities (synchronization of the time events that occur during the presentation of multimedia objects). The originality of this approach lies in the extension of the RT-UML syntax and in the presentation of an original method of modeling multimedia applications, which can make the production process more formalized and thus more precise.
Automated Code Generation for Safety-Related Applications: A Case Study (170). Gluch David, Kornecki Andrew J., pages 383 – 391. Abstract: This paper addresses issues relating to the suitability of using automated code generation (ACG) technologies for the development of real-time, safety-critical systems. This research explored the characteristics of model-based software development methodologies and the automated code generation tools that support them. Specifically, data related to the engineering challenges, skills, and effort associated with ACG practices and technologies were collected as part of a case study. Characteristics such as the generated code's organization, size, readability, traceability to the model, real-time constructs, and exception handling were identified. In addition, the case study involved software engineering practices that incorporate integrated analysis and design iterations throughout a model-based development process. The research investigated both the static and dynamic characteristics of the selected techniques and tools, identified characteristics of ACG tools with potential impact on safety, and considered the semantic consistency between representations.
Code Generation for CSM/ECSM Models in COSMA Environment (182). Grabski Waldemar, Nowacki Michał, pages 393 – 400. Abstract: The COSMA software environment, developed in the Institute of Computer Science, WUT, was designed primarily for model checking of reactive systems specified in terms of Concurrent State Machines (CSM). However, COSMA also supports Extended CSM (ECSM). The extensions allow the use of complex data types and pieces of C/C++ code attributed to CSM states and/or transitions. Because of these extensions, ECSM models cannot be verified by model checking, but they can be used as an intermediate step in code generation. The underlying CSM then represent the flow of control within cooperating components and the communication among them, while the extensions specify the data structures and the details of their processing. The paper discusses code generation from ECSM diagrams. The approach is illustrated with an example.
FPGA as a part of MS Windows control environment (177). Kołek Krzysztof, Turnau Andrzej, pages 401 – 406. Abstract: The attention is focused on the Windows operating system (OS) used as a control and measurement environment. Windows OS, due to its extensions, may be used as a real-time OS (RTOS). Benefits and drawbacks of typical software extensions are compared. As far as hardware solutions are concerned, field programmable gate array (FPGA) technology is proposed to ensure fast, time-critical operations. FPGA-based parallel execution and hardware implementation of data processing algorithms significantly outperform classical microprocessor operating modes. The suitability of the RTOS for a particular application and FPGA hardware maintenance is studied.
EMLAN: a framework for model checking of reactive systems software (180). Krystosik Artur, pages 407 – 413. Abstract: The Embedded System Modelling Language (EMLAN) is a high-level, C-like language for modelling and model checking of embedded, reactive systems. The language addresses a number of topics such as partitioning of the system, concurrency, interrupts, synchronization mechanisms, time, data transformations, and hardware interactions. Model checking of an EMLAN specification is based on translation into DT-CSM (Discrete Time Concurrent State Machines), generation of a reachability graph, and checking requirements expressed as CTL temporal formulas. The paper presents the general concepts of verifying reactive systems using EMLAN.
Hardware and software architectures for reconfigurable time-critical control tasks (195). Piłat Adam, Grega Wojciech, pages 415 – 424. Abstract: The most popular configuration for controlled laboratory test rigs is a personal computer (PC) equipped with an I/O board. Dedicated software components allow a wide range of user-defined tasks to be conducted. The functionality of the typical configuration can be customized by PC hardware components and their programmable reconfiguration. The next step in automatic control system design is the embedded solution. Usually, the design process of an embedded control system is supported by high-level software. Dedicated programming tools support the multitasking property of the microcontroller by selecting different sampling frequencies for algorithm blocks. In this case a multi-layer, multitasking control strategy can be realized on the chip. By programming the serial communication and user-defined applications, the process data can be uploaded to the host system. The proposed solutions implement a rapid prototyping approach. The available toolkits and device drivers integrate the system-level design environment and the real-time application software, transferring the functionality of MATLAB/Simulink programs to PC or microcontroller application environments.
Task-Oriented Real-Time Execution without Asynchronous Interrupts combined with Runtime State Restoration (63). Skambraks Martin, Halang Wolfgang, pages 425 – 431. Abstract: The architectural concept of a programmable electronic system is presented, which is particularly suited for highly safety-critical applications. Its most essential characteristics are task-oriented real-time execution without the need for asynchronous interrupts and the ability to restore state at runtime. The concept of task execution without asynchronous interrupts combines the advantages of the classic synchronous and asynchronous approaches to real-time programming. It lowers the complexity of the hardware architecture and conforms particularly well with the safety standard IEC 61508. A special form of forward recovery prevents redundancy attrition due to transient faults.
Numerical Assessment of Software Development Tools in Real-Time Safety-Critical Systems using Bayesian Belief Networks (194). Zalewski Janusz, Kornecki Andrew J., Pfister Henry L., pages 433 – 442. Abstract: This paper explores the applicability of Bayesian belief networks (BBNs) to the assessment of software development tools for real-time safety-critical applications. Bayesian inference rules are applied to experimental results from a previous study, covering such properties of safety-critical software as efficiency, usability, functionality, and traceability, and a numerical assessment of a tool is derived to assist in the tool qualification process.
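As a rough illustration of the Bayesian inference step the paper relies on, the sketch below updates a prior belief that a tool is suitable using evidence about properties such as efficiency or usability. The conditional-independence (naive Bayes) simplification is ours; a real BBN encodes the dependencies among the properties explicitly:

```python
def posterior(prior, likelihoods):
    """Bayes update for a binary hypothesis H ('tool is suitable').

    prior: P(H).
    likelihoods: list of (P(e|H), P(e|not H)) pairs, one per observed
    property (e.g. good efficiency, good usability, good traceability).
    Evidence is treated as conditionally independent given H.
    """
    p_h, p_not = prior, 1.0 - prior
    for p_e_h, p_e_not in likelihoods:
        p_h *= p_e_h        # accumulate evidence under H
        p_not *= p_e_not    # ... and under not-H
    return p_h / (p_h + p_not)  # normalize to get P(H | evidence)
```

For example, a neutral prior of 0.5 combined with two favorable observations, (0.9, 0.3) and (0.8, 0.4), yields a posterior of 6/7, roughly 0.857.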
1st International Workshop on Secure Information Systems
Public Key Management Scheme with Certificate Management Node for Wireless Ad Hoc Networks (61). Funabiki Shunsuke, Isohara Takamasa, Takemori Keisuke, Sasase Iwao, pages 445 – 451. Abstract: An authentication technique is an important issue for a wireless ad hoc network composed of unknown nodes. The self-organized public key management scheme, in which each node creates and manages certificates by itself, has been proposed for networks with no Internet connection to a certificate authority. However, collecting all certificates in the network takes a long time and generates heavy traffic. In this paper, we propose a public key management scheme with a repository management node that collects the certificates of each node in its power range. The proposed scheme consists of certificate management node (CMN) selection and clustering techniques. It reduces the time needed to collect certificates when normal nodes access the CMN. Furthermore, our scheme reduces memory and traffic loads because normal nodes neither hold all certificates nor exchange repositories with each other. By computer simulation, we evaluate the average traffic and the amount of memory consumption and show that the proposed scheme reduces both compared with conventional schemes.
Parity-based Detection of Multiple Errors in S-boxes (97). Idzikowska Ewa, Bucholc Krzysztof, pages 453 – 457. Abstract: In this paper we present low-cost, concurrent checking methods for multiple error detection in the S-boxes of symmetric block ciphers. These are redundancy-based fault detection schemes. We describe studies of parity-based concurrent error detection in S-boxes. The probability of multiple error detection is analyzed for random data. In this work, 48-input, 32-output substitution blocks are taken into consideration.
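The parity-based detection idea can be illustrated on a toy 4-bit S-box (the paper's blocks are 48-input, 32-output): a parity bit predicted from the fault-free table is compared against the parity recomputed from the actual output, so any odd number of flipped output bits is caught, while errors of even multiplicity escape, which is why the multiple-error detection probability needs analysis. The S-box values and names below are illustrative only:

```python
# Hypothetical 4-bit S-box, used purely for illustration.
SBOX = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
        0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]

def parity(v: int) -> int:
    """XOR of all bits of v (even parity)."""
    p = 0
    while v:
        p ^= v & 1
        v >>= 1
    return p

# Predicted output parity, computed once from the fault-free table.
PREDICTED = [parity(y) for y in SBOX]

def checked_lookup(x: int):
    """S-box lookup with concurrent parity check.

    Returns (value, ok); ok is False when the recomputed parity
    disagrees with the prediction, i.e. an odd number of output
    bits were flipped by a fault.
    """
    y = SBOX[x]
    ok = parity(y) == PREDICTED[x]
    return y, ok
```

Flipping a single output bit makes the recomputed parity disagree with the predicted one; flipping two bits leaves the parity unchanged and goes undetected, matching the scheme's known limitation.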
Personal Identification Techniques Based on Operational Habit of Cellular Phone (67). Isohara Takamasa, Takemori Keisuke, Sasase Iwao, pages 459 – 465. Abstract: Biometric authentication that identifies the legitimate user of a cellular phone has been adopted. However, it is not used very actively, because the control procedure is complicated and users are reluctant to provide their biological information to the communication terminal. In this paper, we propose personal identification techniques based on the operational habits of cellular phone use, which authenticate the legitimate user by collecting typing histories of the cellular phone in a background process. The typing histories recorded in a ring buffer are divided into two profiles: a ``short term profile'', containing the most recent typing history, and a ``long term profile'', containing history older than the short-term records. We also implement two authentication algorithms: the first retrieves a remarkable key of the short-term profile from the long-term profile; the second computes weight values for the keys of the short-term profile by using the long-term profile. We evaluate the FAR (False Acceptance Rate) and FRR (False Rejection Rate) and show that our method can be applied to personal identification on the cellular phone.
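The short-term/long-term profile split can be sketched as follows. The ring buffer holds the typing history; the scoring rule (average long-term relative frequency of the keys seen in the short-term window) is our own simplification, not the paper's exact weighting algorithm:

```python
from collections import Counter, deque

class TypingProfile:
    """Ring buffer of keystrokes split into short- and long-term profiles."""

    def __init__(self, capacity=100, short_len=10):
        self.history = deque(maxlen=capacity)  # the ring buffer
        self.short_len = short_len

    def record(self, key):
        """Append one keystroke; the oldest entry is dropped when full."""
        self.history.append(key)

    def score(self):
        """How familiar is the recent typing, judged by older habits?

        Returns the average long-term relative frequency of the keys in
        the short-term window; 1.0 means every recent key is the user's
        single habitual key, 0.0 means none were ever seen before.
        """
        keys = list(self.history)
        short = keys[-self.short_len:]          # short term profile
        long_term = keys[:-self.short_len]      # long term profile
        if not long_term or not short:
            return 0.0
        freq = Counter(long_term)
        total = len(long_term)
        return sum(freq[k] / total for k in short) / len(short)
```

A user who keeps typing their habitual keys scores near 1.0, while an impostor whose recent keys never appear in the long-term history scores near 0.0.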
TLS handshake method based on SIP (89). Kaji Tadashi, Hoshino Kazuyoshi, Fujishiro Takahiro, Takata Osamu, Yato Akifumi, Takeuchi Keisuke, Tezuka Satoru, pages 467 – 475. Abstract: Because of the many security problems on the Internet, various security measures are required to protect communications. TLS is widely used to protect application-layer protocols. However, TLS requires a handshake process to establish a TLS session, and the handshake is costly because of the PKI-based authentication and key calculation needed for each session. This paper proposes an efficient TLS handshake method based on SIP and TLS session resumption. In this method, a SIP server performs the PKI-based authentication and key calculation for all TLS sessions on behalf of the TLS server and TLS client.
Modern access control based on eye movement analysis and keystroke dynamics (126). Kapczyński Adrian, Kasprowski Paweł, Kuźniacki Piotr, pages 477 – 483. Abstract: The paper presents key aspects of modern access control based on behavioral patterns of human beings. Two biometric techniques, based on eye movement and keystroke dynamics, are presented. The quantitative results obtained during examination of systems based on these behavioral methods are convincing enough to warrant further research.
The OCTAVE methodology as a risk analysis tool for business resources (160). Pyka Marek, Januszkiewicz Paulina, pages 485 – 497. Abstract: In this article the authors discuss a methodology that improves the information management and protection decision-making process. The authors describe OCTAVE (the Operationally Critical Threat, Asset, and Vulnerability Evaluation) using real-life examples and references to the Polish legal regulations. The purpose of the article is to present a methodology that is successfully employed in Western European countries and to show that it can also be conducted efficiently in Poland, fitting well into the policies of many companies.
A Workflow Instance-based Model-checking Approach to Analysing Organisational Controls in a Loan Origination Process (165). Schaad Andreas, Sohr Karsten, pages 499 – 517. Abstract: Demonstrating the safety of a system (i.e. avoiding the undesired propagation of access rights or indirect access through some other granted resource) is one of the goals of access control research. However, the flexibility required from enterprise resource planning (ERP) systems may require the implementation of seemingly contradictory requirements (e.g. tight access control but at the same time support for discretionary delegation of workflow tasks and rights). To aid in the analysis of safety problems in workflow-based ERP systems, this paper presents a model-checking-based approach for automated analysis of delegation and revocation functionalities. This is done in the context of a real-world banking workflow. We expand on some of our earlier work, which was restricted to single workflow models and not arbitrary workflow instances of one or several models. This initial restriction to model checking at the workflow model level meant that we were not able to analyse delegation and revocation of tasks and access rights in the detail required by some of our earlier conceptual models [11, 12]. We derived information about the workflow from BPEL specifications and ERP business object repositories. This was captured in an SMV specification together with a definition of possible delegation and revocation scenarios. Required separation-of-duty properties were translated into a set of LTL-based constraints.
Secure Distributed Processing of Context-Aware Authorization (145). Shim Young-Chul, pages 519 – 526. Abstract: In this paper we present a framework for context-aware authorization in ubiquitous computing environments. We present an architecture consisting of an authorization infrastructure and a context infrastructure. The context infrastructure provides context information, and the authorization infrastructure makes decisions to grant access rights based on context-aware authorization policies and context information. This paper also describes how multiple nodes in distributed environments cooperate to evaluate and detect the context constraints and events included in authorization policies while the integrity and privacy of context information are guaranteed.
Implementation of Temporally Aware Behaviour-Based Security in Smart Cards (54). Sirett William G., Markantonakis Konstantinos, Mayes Keith, pages 527 – 533. Abstract: Behaviour-based security is a group of techniques used to monitor the activity of a system to identify abnormal behaviour and possibly an attack in progress. Smart cards present a constrained environment for behaviour-based security, as there is no on-card source of time. A smart-card-applicable timestamping scheme providing secure time is identified and used in a Java Card behaviour-based, temporally aware security countermeasure; its functionality, implementation, and operation are fully detailed in this work.
First International Multiconference on Computer Science and Information Technology
Timed Concurrent State Machines (171). Daszczuk Wiktor B., pages 537 – 547. Abstract: Timed Concurrent State Machines are an application of Alur's timed automata concept to the coincidence-based (rather than interleaving) CSM modeling technique. TCSM support the idea of testing automata, allowing time properties to be specified more easily than with temporal formulas. Calculation of the global state space in the real-time domain is also defined for TCSM, allowing a system to be stored in a ready-for-verification form and multiplied by various testing automata.
Unified Automatic Testing of a GUI Applications' Family on an Example of RSS Aggregators (70). Derezińska Anna, Małek Tomasz, pages 549 – 559. Abstract: This paper addresses the problem of unifying automatic test scripts for GUI applications. We particularly focus on testing a family of applications related by common functionality. The goal was to make the testing process as flexible and user-independent as possible. The capabilities and limitations of the solution are discussed. The strategy is shown on an example applications' family: RSS aggregators. Experiments with functional, performance, and regression testing of RSS aggregators are presented.
From Simulation to Execution: on a Certain Programming Paradigm for Reactive Systems (94). Dobosiewicz Wlodek, Gburzynski Pawel, pages 561 – 568. Abstract: We present a programming environment which, over the years of its evolution, has allowed us to unify simulation, specification, and execution for a class of highly practical systems, including (low-level) communication protocols and embedded software. The programming paradigm inherent in our scheme is a combination of FSMs (finite state machines) and coroutines with multiple entry points. It brings about a unique flavor of multitasking suitable for expressing both event-driven simulation models and executable, multithreaded, embedded systems. Its characteristic feature in the latter domain is its extremely small footprint, which makes it possible to program tiny devices in a highly structured, reusable, self-documenting, and reliable manner.
Reducing the Overhead Cost in Fixed & Low Mobility AODV Based MANETs (110). Kawish Babar, Aslam Baber, Khan Shoaib, pages 569 – 578. Abstract: The primary goal of an ad hoc network routing protocol is to establish correct and efficient routes between two nodes so that messages may be delivered in time. It is observed that in high-mobility network models AODV performs better than other proactive and reactive protocols (DSDV and DSR). However, for low-mobility or static network models the routing efficiency of AODV is only marginally better than the others. The major reason for this decline in performance is the shorter route cache in the AODV protocol. Frequently discarding valid routes and rediscovering them causes unnecessary routing delay and increases control overhead. This paper analyzes bandwidth utilization and routing delay when MANETs are subjected to prolonged static environments and suggests providing a dynamic route cache in the AODV routing protocol. Keywords: MANETs, AODV, ART, TTL, Route Cache
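The suggested dynamic route cache can be sketched as a TTL-governed table whose entries have their lifetime extended each time a route proves valid, so that in static environments valid routes stop being discarded and rediscovered prematurely. The parameters and the extension rule below are illustrative assumptions, not the paper's exact active-route-timeout (ART) mechanism:

```python
import time

class RouteCache:
    """TTL-governed route cache with dynamic lifetime extension."""

    def __init__(self, base_ttl=3.0, extension=3.0):
        self.base_ttl = base_ttl
        self.extension = extension
        self.routes = {}  # dest -> (next_hop, expiry time)

    def add(self, dest, next_hop, now=None):
        now = time.monotonic() if now is None else now
        self.routes[dest] = (next_hop, now + self.base_ttl)

    def lookup(self, dest, now=None):
        now = time.monotonic() if now is None else now
        entry = self.routes.get(dest)
        if entry is None or entry[1] < now:
            # Expired or unknown: drop it; the caller would start a
            # fresh (and costly) route discovery here.
            self.routes.pop(dest, None)
            return None
        next_hop, expiry = entry
        # Route proved valid: extend its lifetime, so static topologies
        # keep their routes alive instead of rediscovering them.
        self.routes[dest] = (next_hop, expiry + self.extension)
        return next_hop
```

The `now` parameter exists only to make the timing deterministic in examples; in real use the monotonic clock is consulted directly.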
Parallel Performance of a 3D Elliptic Solver (216). Lirkov Ivan, Vutov Yavor, pages 579 – 590. Abstract: It has been shown that block-circulant preconditioners applied to a conjugate gradient method used to solve structured sparse linear systems arising from 2D or 3D elliptic problems have good numerical properties and a potential for high parallel efficiency. In this paper, the convergence rate and the parallel performance of a circulant block-factorization-based preconditioner applied to a 3D problem are analyzed. A portable parallel code is developed based on the Message Passing Interface (MPI) standard. The numerical tests performed on parallel computer systems clearly demonstrate the high efficiency of the developed algorithm.
Comparative analysis of PCG solvers for voxel FEM systems (217). Margenov Svetozar, Vutov Yavor, pages 591 – 598. Abstract: The presented comparative analysis concerns two iterative solvers for large-scale linear systems related to µFEM simulation of human bones. The benchmark problems represent the strongly heterogeneous structure of real bone specimens. The voxel data are obtained by high-resolution computer tomography. Non-conforming Rannacher-Turek finite elements are used for discretization of the considered elliptic problem. It is well known that the preconditioned conjugate gradient method is the best tool for efficient solution of large-scale symmetric systems with sparse positive definite matrices. Here, the performance of two preconditioners is studied, namely the modified incomplete Cholesky factorization MIC(0) and algebraic multigrid. The comparative analysis is mostly based on the computing times of the sequential codes. The number of iterations for both preconditioners is also discussed. Numerical tests of a novel parallel MIC(0) code are given at the end. The obtained parallel speed-ups and efficiencies illustrate well the scope of efficient applications to real-life large-scale problems.
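The preconditioned conjugate gradient method that both of the preceding papers build on can be sketched in a few lines. For brevity the example uses a simple Jacobi (diagonal) preconditioner as a stand-in for MIC(0), algebraic multigrid, or the block-circulant preconditioners discussed above:

```python
def pcg(matvec, b, precond, tol=1e-10, max_iter=100):
    """Preconditioned conjugate gradient for A x = b, A symmetric
    positive definite.

    matvec:  callable p -> A @ p
    precond: callable r -> M^{-1} r  (the preconditioner apply)
    Returns (x, iterations used).
    """
    n = len(b)
    x = [0.0] * n
    r = list(b)                 # residual, since x0 = 0
    z = precond(r)
    p = list(z)                 # first search direction
    rz = sum(ri * zi for ri, zi in zip(r, z))
    it = 0
    for it in range(max_iter):
        Ap = matvec(p)
        alpha = rz / sum(pi * qi for pi, qi in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * qi for ri, qi in zip(r, Ap)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = precond(r)
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        beta = rz_new / rz
        rz = rz_new
        p = [zi + beta * pi for zi, pi in zip(z, p)]
    return x, it + 1
```

The better the preconditioner approximates the inverse of A, the fewer iterations PCG needs, which is exactly the trade-off (iteration count versus per-iteration cost) compared in the analysis above.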
Experimental dependability evaluation of memory manager in the real-time operating system (193). Pisarczyk Pawel, pages 599 – 605. Abstract: The paper presents results of an experimental dependability evaluation of the Phoenix-RTOS operating system. Experiments are conducted using a self-developed testing environment and a kernel fault injector. Dependability evaluation is the last stage of the system development process. The results will be used in future research to propose a dependable memory manager.