Proceedings of the International Multiconference on Computer Science and Information Technology

Volume 2

October 15–17, 2007. Wisła, Poland

ISSN 1896-7094

Multiconference Proceedings (PDF, 33.238 MB)

Workshop on Agent Based Computing IV

  • A Practical Approach for Researching Trading Agents' Behavior in a CDA Environment
    140 Software Simulation, Agent-Based Computational Economics, Estimation of Bidding Strategy Galina Ilieva, pages 3 – 12. During the last decade, there has been a huge increase in the volume of Internet business transactions, which has in turn led to the emergence of a new field of scientific research: agent-based computational economics. In order to be able to conduct successful transactions, agents must have appropriate strategies to participate in trading negotiations. This paper investigates the process of multi-agent negotiations in the continuous double auction (CDA) and describes one possible approach to solving the problem of resource allocation in a distributed computational environment. Our goal is to present a new software tool for e-auction simulation and to investigate bidding agents' strategies. Via a series of experiments we demonstrate that this new software application adequately models the process of making transactions in a CDA. We propose a new method for quick and reliable strategy comparisons. Using this novel method, we can choose a bidding strategy that ensures the most effective resource allocation.
  • A Common Base for Building Secure Mobile Agent Middleware Systems
    158 Mobile Agents, Security, Mobile Agent Middleware Guido van 't Noordende, Benno Overeinder, Reinier Timmer, Frances Brazier, Andrew Tanenbaum, pages 13 – 25. The Agent Operating System (AOS) provides the basic functionality needed for secure and reliable mobile agent platforms: support for secure communication, secure agent storage and migration, and minimal primitives for agent life-cycle management. Designed as a layer between local operating systems and higher-level agent platform middleware, it supports interoperability between agent platforms and between different implementations of AOS itself. AOS has been tested for interoperability, both with regard to different higher-layer middleware platforms and between two implementations of AOS, in C++ and Java.
  • Abstract software migration architecture towards agent middleware interoperability
    155 Mobile Agents, Interoperability, IEEE-FIPA, migration, abstract implementation, JADE, AgentScape Jordi Cucurull, Benno J. Overeinder, Michel A. Oey, Joan Borrell, Frances M.T. Brazier, pages 27 – 37. Agent mobility is the ability of an agent to migrate from one location to another. So far, there have been several difficulties with agent migration due to the lack of interoperability among agent middleware distributed over the net. In this paper, an abstract software migration architecture is presented, which is the first step towards full agent middleware interoperability. With this architecture, the process of migrating an agent is uniformly defined for multiple middleware, leaving agent execution environment standards as future research. To validate the suggested abstract migration, the architecture has been successfully implemented over two different agent middleware: JADE and AgentScape.
  • Overview of an approach for a Web based Mobile Agent architecture
    38 mobile agents, web, infrastructure Marius Feldmann, pages 39 – 45. The paper describes an approach for establishing a mobile agent architecture in the Web. This architecture makes use of currently much-discussed interaction principles and the document-centric constitution of the Web. It builds an abstraction layer useful for mobile agent execution and allows the realization of various feasible Web applications following the mobile agent principle.
  • Utilization of Software Agents and Web Services as Transducers for Legacy Software; Case Study Based on an SMTP Server
    175 Web Services, Agent-based Computing, Legacy Software, Simple Mail Transfer Protocol, Efficiency Michal Oglodek, Maciej Gawinecki, Marcin Paprzycki, pages 47 – 57. There exist a number of ways in which legacy software can be “wrapped” to become interoperable. Two of the currently more popular ones are the utilization of Web Services and of software agents. The aim of this paper is to experimentally compare the efficiency of JADE-implemented agents with that of Web Services, when used as transducers for an SMTP server.
  • An Architecture for Resource Bounded Agents
    157 intelligent agent architecture, active logic, inductive logic programming, learning Slawomir Nowaczyk, Jacek Malec, pages 59 – 69. We study agents situated in partially observable environments, who do not have sufficient resources to create conformant (complete) plans. Instead, they create plans which are conditional and partial, execute or simulate them, and learn from experience to evaluate their quality. Our agents employ an incomplete symbolic deduction system based on Active Logic and Situation Calculus for reasoning about actions and their consequences. An Inductive Logic Programming algorithm generalises observations and deduced knowledge so that the agents can choose the best plan for execution. We describe an architecture which allows ideas and solutions from several subfields of Artificial Intelligence to be joined together in a controlled and manageable way. In our opinion, no situated agent can achieve true rationality without using at least logical reasoning and learning. In practice, it is clear that pure logic is not able to cope with all the requirements put on reasoning, thus more domain-specific solutions, like planners, are also necessary. Finally, any realistic agent needs a reactive module to meet demands of dynamic environments. Our architecture is designed in such a way that those three elements interact in order to complement each other's weaknesses and reinforce each other's strengths.
  • Employment of mobile multiagent systems for attendance of transaction of distributed database
    168 Zofia Kruczkiewicz, pages 71 – 81. The work aims at limiting the network traffic generated during the realization of the two-phase-commit protocol of distributed transactions in distributed database systems. This paper presents ways of decreasing the number and the size of messages sent during distributed transactions as the means of limiting the network traffic. A multiagent system is employed for handling the confirmation of two-phase distributed transactions. The paper presents how network traffic is limited in multiagent systems by using mobile agents as resource managers of databases, static agents as coordinators of transactions, and static agents as managers of transactions. These agents communicate via two-phase-commit protocols. The paper presents performance experiments enabling the comparison of two two-phase-commit protocols: the protocol elaborated in the paper and the protocol described in the literature.

2nd International Symposium Advances in Artificial Intelligence and Applications

  • Using Regular Expressions in Translation Memories
    109 translation memory, computer aided translation (CAT), machine translation (MT) Jacek Gintrowicz, Krzysztof Jassem, pages 87 – 92. The paper presents the idea of using regular expressions in translation systems based on translation memories. Regular expressions are used in the search for a match in the instance sentence, in the search for an appropriate example in the translation memory, and in the transfer of the instance sentence into its equivalent. The application of transfer rules to translation memories supports the thesis, put forward in the paper, that Machine Translation and Computer-Aided Translation are converging in the same direction.
  • A Rule-Based Characterization of Clusters of Genes
    61 Decision rules, attributes, Gene Ontology, DNA microarrays Aleksandra Gruca, pages 93 – 102. This paper presents a method of using attribute analysis and rough set theory for describing and analyzing the Gene Ontology (GO) composition of clusters of genes obtained in DNA microarray experiments. GO terms are understood as attributes of genes, and gene clusters are characterized by decision rules related to attributes. A modification of the known algorithm for computing the decision rules for an information system is proposed, which makes it suitable for the large-size problems encountered in the analysis of DNA microarray data. Additionally, methods are developed for the assessment of the statistical significance of the computed rules. The presented approach is used for DNA microarray data obtained in experiments measuring the transcriptome response of cells to ionizing radiation. The results of the computations are compared with those obtained with the use of GO browsers.
  • Open-ended Evolution in Flocking Behaviour Simulation
    172 flocking behaviour, evolutionary algorithm Urszula Markowska-Kaczmar, Halina Kwasnicka, Marcin Mikosik, pages 103 – 120. In this research we tried to apply open-ended evolution to breed controllers for artificial organisms able to manifest flocking behaviours. In comparison to previous work in this area, the artificial world created in our system is characterized by a significant diversity of organisms. Animals were equipped with double and dynamic sight, which was suggested by other authors in their works. In the experiments, many different behaviours were observed which were very similar to those in nature, for instance: the escape of herbivores from predators, herbivores routing towards plants, or the pursuit of herbivores by predators. Another interesting behaviour was the grouping of predators around plants, where the probability of meeting a herbivore is greater than in other places. The most advanced behaviour was the creation of flocks, which was the goal of the experiments. The observed motion of animals looked natural. However, full success required the application of steered evolution.
  • A Hybrid Clustering Method for General Purpose and Pattern Recognition
    169 clustering, k-means, single link, hybrid algorithm, shape identification Jan W. Owsinski, Mariusz T. Mejza, pages 121 – 126. The present short paper outlines a simple hybrid technique of clustering, based on the joint use of the k-means and nearest neighbour algorithms. The technique starts with the k-means algorithm, performed as the first stage for an adequately high number of centroids, and continues with the nearest neighbour algorithm, executed for the clusters obtained in the first stage as the set of initial objects to be merged. It is shown through examples how the technique thus defined performs in terms of identification of relatively complex shapes.
  • Application of Genetic Algorithm in Median Filtering
    37 genetic algorithm, median filtering, optimization Sandra Sovilj-Nikic, Ivan Sovilj-Nikic, pages 127 – 139. Images are often corrupted by impulse noise due to errors generated in noisy sensors or communication channels. Two types of impulse noise can be defined: 1) fixed-valued and 2) random-valued. In many applications it is very important to remove noise from the images before some subsequent processing such as edge detection, object recognition and image segmentation. In this paper an adaptive filtering approach using a genetic algorithm is proposed. In simulations over various images, the proposed partition-based median (PBM) filter, using a genetic algorithm in training, has demonstrated better results in noise suppression than competitive filters based on median filtering, in terms of SNR (dB) as well as perceived image quality.
  • Linguistic Knowledge Representation for Stochastic Systems
    146 Fuzzy modelling, Linguistic variable, Stochastic systems Anna Walaszek-Babiszewska, pages 141 – 150. The paper deals with an idea of linguistic knowledge representation and linguistic inference. A relational linguistic fuzzy model with rule weights is utilised. The probability of linguistic values of antecedent and consequent variables, calculated according to Zadeh’s definition, is proposed to formulate a linguistic fuzzy model of a stochastic system. The linguistic inference procedures and an exemplary MISO model are presented.
  • Implementation and evaluation of the fuzzy system for the assessment of cadastre operators' work
    156 fuzzy system, operators' work assessment, self-organizing map Bogdan Trawiński, Tomasz Karczyński, pages 151 – 160. A Mamdani fuzzy system for the multi-criteria assessment of cadastral system operators' work is presented in the paper. The system comprises five input criteria: productivity, complexity, quality, availability and report production, and was implemented in Java. Moreover, two alternative methods of assessment were incorporated into the system: the first based on self-organizing maps and the second employing a linear function of the input criteria. All three methods of operator assessment were evaluated using actual data taken from one cadastre information centre over a period of 70 weeks.
  • Fuzzy contrast enhancement for images in the compressed domain
    144 image enhancement, compressed domain, fuzzy algorithm, image processing Camelia Popa, Aurel Vlaicu, Mihaela Gordan, Bogdan Orza, pages 161 – 170. Our objective is to investigate image processing directly in the compressed domain, without full decompression. Compressed domain image processing algorithms provide a powerful computational alternative to classical pixel-level implementations. The field is just emerging and the algorithms reported in the literature are mostly based on linear arithmetic operations between pixels. In this paper, the problem of implementing a non-linear operator using compressed domain processing is addressed. A new algorithm for digital image enhancement, using fuzzy theory, adapted to the frequency content of each coefficient block in the DCT (Discrete Cosine Transform) encoded JPEG image, is developed and proposed.
  • A Comparative Analysis of Crossover Variants in Differential Evolution
    115 differential evolution, binomial crossover, exponential crossover, control parameters Daniela Zaharie, pages 171 – 181. This paper presents a comparative analysis of binomial and exponential crossover in differential evolution. Some theoretical results concerning the probability of mutating an arbitrary component and that of mutating a given number of components are obtained for both crossover variants. The differences between binomial and exponential crossover are identified, and the impact of these results on the choice of control parameters and on the adaptive variants is analyzed.
  • Information Extraction Systems and Nominal Anaphora Analysis Needs
    102 information extraction, nominal anaphora, corpus, decision tree Ireneusz Matysiak, pages 183 – 192. This paper presents the needs for anaphora analysis in Information Extraction from a practical point of view. It depicts the problem of the anaphora phenomenon and its role in texts, together with corpus examples. It includes a short survey of the computational treatment of nominal anaphora. In the end, a uniform approach to nominal anaphora analysis is proposed. Moreover, the paper describes requirements for corpus preparation and framework implementation.
  • Adaptive Differential Evolution: Application to Nonlinear Regression
    65 Global optimization, differential evolution, self-adaptation of control parameters, nonlinear regression. Josef Tvrdik, pages 193 – 202. Adaptation of control parameters in differential evolution is considered. Five adaptive variants of differential evolution are applied to the estimation of parameters in nonlinear regression models and their performance is compared experimentally with the adaptive controlled random search algorithm tailored especially for these problems. The NIST nonlinear regression datasets are used as a benchmark. Two of the five tested variants of adaptive differential evolution perform almost as reliably as the adaptive controlled random search algorithm, and one of these two variants converges only slightly more slowly, with time requirements almost comparable to those of the adaptive controlled random search.
  • Color Mining of Images Based on Clustering
    101 image, mining, color, clustering Krzysztof Walczak, Lukasz Kobylinski, pages 203 – 212. The increasing size of multimedia databases and the ease of accessing them by a large number of users through the Internet pose a problem of efficient and semantically adequate querying of such content. A metadatabase may be used to shorten query resolution time by limiting the number of images being thoroughly analyzed to a smaller subset having a high probability of finding the query image. In the article we propose a simple but fast and effective method of indexing such image metadatabases. The index is created by describing the images according to their color characteristics, with compact feature vectors that represent typical color distributions. We present experimental results of typical search schemes obtained by querying the metadatabase index created using a few different approaches.
  • Using of a Graph of Action Dependencies for Plans Optimization
    100 planning problems, plans optimization, heuristics Lukáš Chrpa, pages 213 – 223. There are many planning techniques which prefer faster computation of plans over the quality of these plans. However, in many cases we not only need a plan which is computed quickly, but also a plan of good quality. This paper presents a Graph of Action Dependencies as a tool which can be useful for plan optimization. This approach is well suited for detecting pairs of inverse actions in plans, actions not needed to achieve a goal, etc. Combining this approach with existing planners should help obtain higher-quality plans in a short time.
  • Manifestation of selective attention in Sigma-if neural network
    153 neural networks, classification, selective attention Maciej Huk, pages 225 – 236. Artificial neural networks are a very well known biologically-inspired machine learning technique. This technique has been widely applied in many domains, such as real-time signal filtering, modelling and synthesis, process control and classification (e.g. of images, diseases or star spectra). Other examples of use of neural networks include generation of rules for expert systems and knowledge discovery. However, effective analysis of multidimensional data sets still causes particular difficulties. It has been shown that processing too many data features is costly and has adverse effects on the classification properties of resulting models. Thus, while development of techniques for selecting features from very large data sets is not trivial, it is nonetheless very important. It can be observed that analogous problems are effectively solved by human low-level distributed selective attention mechanisms. For this reason, building a general (e.g. neural) model of such functionality would provide great benefits for the machine learning domain. This work addresses selective attention functionality found in the recently developed Sigma-if neural network. Experiments show how this selective attention model can reduce data acquisition and processing costs as well as the probability of classification errors.
  • Recognition of Structured Collocations in an Inflective Language
    151 corpus linguistics, collocation extraction, Polish, lexical-syntactic regularities, pattern recognition Maciej Piasecki, Bartosz Broda, Magdalena Derwojedowa, pages 237 – 246. We present a method of structural collocation extraction for an inflective language (Polish) based on a process divided into two phases: extraction and filtering of pairs of wordforms reduced to baseforms, and structural annotation of the extracted collocations with lexico-syntactic patterns. The parameters of the patterns are specified manually, but their instances are generated and tested on the corpus automatically. The extracted collocations were evaluated by applying them as rules in the morpho-syntactic disambiguation of Polish and by comparing them with lists of two-word expressions extracted from two Polish dictionaries.
  • Polish Morphological Guesser Based on a Statistical A Tergo Index
    150 morphological guesser, Polish, automatic extraction, corpus linguistics, statistical a tergo index Maciej Piasecki, Adam Radziszewski, pages 247 – 256. We present a direct method of construction of a morpho-syntactic guesser for Polish, which is a program producing morpho-syntactic descriptions for word forms unknown to the morphological analyser. The core of the method is the construction of a statistical a tergo index, in which pseudo-suffixes (endings) extracted by a statistical tree define morpho-syntactic properties of corresponding word forms. The secondary aim was to investigate to what extent it is possible to develop the morphological analyses exclusively on the basis of endings. Experiments in the extraction of a guesser for a domain of texts are also presented. The method can be applied to any other inflectional language with only minor technical changes.
  • Computational Efficiency of Suboptimal Nonlinear Predictive Control with Neural Models
    82 Model Predictive Control algorithms, neural networks, optimisation, quadratic programming Maciej Ławryńczuk, pages 257 – 266. This paper studies computational efficiency of suboptimal Model Predictive Control (MPC) with neural models. The algorithm requires solving on-line only a quadratic optimisation problem. Considering a nonlinear polymerisation process, for which the linear MPC algorithm is inadequate, it is shown that the suboptimal algorithm results in closed-loop control performance similar to that obtained in the fully-fledged nonlinear MPC technique, which hinges on non-convex optimisation.
  • Improved TBL algorithm for learning context-free grammar
    130 grammatical inference, context-free grammar, genetic algorithm Marcin Jaworski, Olgierd Unold, pages 267 – 274. In this paper we introduce some improvements to the tabular representation algorithm (TBL) dedicated to the inference of grammars in Chomsky normal form. The TBL algorithm uses a Genetic Algorithm (GA) to solve the partitioning problem, and the improvements described here focus on this element of TBL. The improvements involve: initial population block size manipulation, a specialized block-delete operator, and a modified fitness function. The improved TBL algorithm was experimentally shown to be less vulnerable to block size and population size, and able to find solutions faster.
  • Pattern Extraction for Event Recognition in the Reports of Polish Stockholders
    152 Event Recognition, rule extraction, Information Extraction, Memory Based Learning Michał Marcińczuk, Maciej Piasecki, pages 275 – 284. In the paper the application of general Memory-Based Learning to Event Recognition in the domain of reports of stock issuers is investigated. A multi-classifier scheme is applied in which the boundaries of annotations are identified first and then a heuristic algorithm merges them into pairs. A modified method based only on positive examples is proposed. Several types of simple features requiring only simple processing of text are tested. The proposed method can be trained on a small annotated corpus.
  • Collecting Polish-German Parallel Corpora in the Internet
    58 parallel corpora Monika Rosińska, pages 285 – 292. Parallel corpora have recently become indispensable resources in multilingual natural language processing. Manual preparation of a bilingual corpus is a laborious task. Therefore methods for the automated creation of parallel corpora are currently a topic of concern for many researchers. A number of sophisticated and effective algorithms for collecting parallel texts from the Internet have already been created. The aim of the research has been to verify the efficiency of existing algorithms for the collection of Polish-German parallel corpora, intended as a reference source for a Machine Translation system, and possibly, to propose a new algorithm best suited to the task.
  • A Rule Based Approach to Temporal Expression Tagging
    104 temporal expression recognition and normalisation, tern, local semantics, tagging Pawel Mazur, Robert Dale, pages 293 – 303. In this paper we present the DANTE system, a tagger for temporal expressions in English documents. DANTE performs both recognition and normalisation of the expressions in accordance with the TIMEX2 annotation standard. The system is built on modular principles, with a clear separation between the recognition and normalisation components. The interface between these components is based on our novel approach to representing the local semantics of temporal expressions. DANTE has been developed in two phases: first on the basis of the TIMEX2 guidelines alone, and then on the ACE 2005 development data. The system has been evaluated on the ACE 2005 and ACE 2007 data. Although this is still work in progress, we already achieve highly satisfactory results, both for the recognition of temporal expressions and their interpretation (normalisation).
  • Truncated Importance Sampling for Reinforcement Learning with Experience Replay
    71 Reinforcement Learning, Direct Adaptive Control, Importance Sampling Paweł Wawrzyński, Andrzej Pacut, pages 305 – 315. Reinforcement Learning (RL) is considered here as an adaptation technique of neural controllers of machines. The goal is to make Actor-Critic algorithms require less agent-environment interaction to obtain policies of the same quality, at the cost of additional background computations. We propose to achieve this goal in the spirit of experience replay. An estimation method of the improvement direction of a changing policy, based on preceding experience, is essential here. We propose one that uses truncated importance sampling. We derive bounds on the bias of that type of estimator and prove that this bias asymptotically vanishes. In the experimental study we apply our approach to the classic Actor-Critic and obtain a 20-fold increase in speed of learning.
  • Adaptive Temporal Planning at Airports
    129 temporal planning, plan repair, Simple Temporal Planning, multi-agent planning Pim van Leeuwen, Lian Ien Oei, Pieter Buzing, Cees Witteveen, pages 317 – 326. Airports are getting more and more congested, with ground handling activities during turnaround being one of the most constraining factors. To alleviate this bottleneck, robustness of the planning of these activities is of paramount importance. In this paper, we present a new idea to solve a strategic planning problem in a way that allows unforeseen, real-time disruptions to be handled in a straightforward and elegant manner. To that end we apply Hunsberger's decoupling algorithm to a Simple Temporal Network representation of the ground handling domain.
  • Advantages of an Easy to Design Fuzzy Predictive Algorithm: Application to a Nonlinear Chemical Reactor
    143 fuzzy control, model predictive control, fuzzy modeling Piotr Marusak, pages 327 – 336. Advantages of a fuzzy predictive control algorithm are discussed in the paper. The discussed fuzzy predictive algorithm is a combination of the DMC (Dynamic Matrix Control) algorithm and Takagi–Sugeno fuzzy modeling, inheriting the advantages of both techniques. The algorithm is numerically efficient. Moreover, information about a measured disturbance can be included in it in an easy way. A simple and easy-to-apply method for the synthesis of fuzzy predictive control algorithms is presented. The advantages of the fuzzy predictive control algorithm are demonstrated in an example control system of a plant with difficult dynamics – a nonlinear chemical reactor with inverse response.
  • Preferences in Evolutionary Multi-Objective Optimisation with Noisy Fitness Functions: Hardware in the Loop Study
    147 Evolutionary/Genetic algorithms; Decision making; Hardware in the loop; Control system design; Electric drive control Piotr Woźniak, pages 337 – 346. Multi-objective optimisation (MOO) is an important class of problem in engineering. The conflict of objectives in MOO places the issue of compromise in a central position. Since no single solution optimises all objectives, decision-making based on human preference is part of solving MOO problems. In this paper the application of evolutionary MOO to dynamic system controller design using hardware in the loop is presented. Thanks to this approach, problems of un-modelled plant dynamics and parameter uncertainty are alleviated because no mathematical model is needed. The a-priori search for a single solution does not require knowledge of the whole Pareto front.
  • On Dimensionality of Latent Semantic Indexing for Text Segmentation
    55 Text Segmentation, Latent Semantic Analysis, LSI, Information Retrieval Radim Rehurek, pages 347 – 356. In this paper we propose features desirable in linear text segmentation algorithms for the Information Retrieval domain, with emphasis on improving high-similarity search of heterogeneous texts. We proceed to describe a robust, purely statistical method, based on context overlap exploitation, that exhibits these desired features. Ways to automatically determine its internal parameter of latent space dimensionality are discussed and evaluated on a data set.
  • Information System Based on Natural Language Generation from Ontology
    62 ontology, semantics, dialog Wojciech Górka, Łukasz Bownik, Adam Piasecki, pages 357 – 364. Knowledge, facts and dependencies are usually recorded as texts. This method of gathering knowledge causes a situation in which some dependencies between certain concepts are not visible. The ontology-based recording of knowledge makes it easy to integrate data from many sources and to determine dependencies between them. In the course of a specific targeted project, a computer system was developed. The system (which operates on the Polish language) gives the user access to information on a wide range of topics. The adopted method of communication is based on natural language and makes use of information assets accumulated in ontologies.
  • Intelligent system for docking ligands to protein active sites
    111 molecular docking, computational package Zbigniew Starosolski, Andrzej Polański, pages 365 – 374. We present a computational package for the analysis of interactions in molecular ligand-protein complexes. The package was designed as an open project that can be modified in a simple manner, composed as a set of functional blocks with standardised I/O data. The final information from the package can be chosen interactively by the user from all signals between blocks. It is based on the Matlab high-level language environment, which makes it possible to perform various other computational analyses based on Matlab toolboxes. The block diagram of the package and an example of its use are presented.
  • Integrating Fuzzy Logic and Chaos
    83 fuzzy, chaos, integration, chaotification, modeling, fuzzy control Zhong Li, Wolfgang Halang, pages 375 – 388. Motivated by current studies on the interactions between fuzzy logic and chaos theory (for instance, fuzzy modeling of chaotic systems using Takagi-Sugeno (TS) models, linguistic descriptions of chaotic systems, fuzzy control of chaos, complex fuzzy systems, and combinations of fuzzy control technology and chaos theory in engineering practice), this survey paper aims to provide some heuristic research achievements and insightful ideas in order to attract more attention to the topic of the interactions and relationship between fuzzy logic and chaos theory, which are related at least within the context of human reasoning and information processing.
  • Automatic Form Filling Based on Ontology-Controlled Dialogue With the User
    51 ontology, semantics, dialog Łukasz Bownik, Wojciech Górka, Adam Piasecki, pages 389 – 398. The Semantic Web is now one of the most common directions in IT-oriented research. Here, the focus is put on issues related to the logical layer of semantic applications, i.e. inference methods within constantly broadening ranges of logic as well as the standardization of successive languages with increasing logical expressiveness. The article presents a solution to the issue of automatic form filling with the use of ontology-controlled dialogue with the user. This solution resulted in the implementation of a universal programming module. Practical application of semantic technologies based on previous achievements in the field, combining the possibilities of inference machines with the flexibility of the RDF language, allows certain benefits to be achieved for the users of an IT system. The authors assume that the reader possesses some basic knowledge of ontology structure as well as of the RDF and OWL languages.
  • Genetic Techniques in Modeling Visual Aspects of Non-Natural 3D Objects
    40 Genetic Algorithm, 3D modeling, Car Elzbieta Hudyma, Marcin Koszow, pages 399 – 407. This paper introduces the idea of using nature-based genetic algorithms to evolve visual aspects of a class of non-natural 3D objects. This pioneering design approach is presented on the example of car models. The problem definition is introduced in genetic algorithm terms and the results of a prototype system are presented. Genetic operators allow users to interactively cross the car models and mutate them. The implemented genetic algorithm is able to generate models that fit the provided profile. Alternatively, it can be used as a random generator – for creating inspirational car models. More experienced users can directly manipulate the object’s genes. The system renders the effects in real time and allows the objects to be viewed in a virtual 3D space. The presented approach can also be used for other classes of customizable objects, for example: clothes, furniture, pieces of architecture, etc.
  • An investigation of the mutation operator using different representations in Grammatical Evolution
    45 Grammatical Evolution, representation, mutation, locality, binary representation, grey code representation Jonatan Hugosson, Erik Hemberg, Anthony Brabazon, Michael O'Neill, pages 409 – 419. Grammatical evolution (GE) is a form of grammar-based genetic programming. A particular feature of GE is that it adopts a distinction between the genotype and phenotype similar to that which exists in nature, by using a grammar to map between the genotype and phenotype. This study seeks to extend our understanding of GE by examining the impact of different genotypic representations in order to determine whether certain representations, and associated diversity-generation operators, improve GE's efficiency and effectiveness. Four mutation operators using two different representations, binary and gray code representation respectively, are investigated. The differing combinations of representation and mutation operator are tested on three benchmark problems. The results provide support for the continued use of the standard genotypic integer representation, as the alternative representations exhibit neither higher locality nor better GE performance. The results raise the question as to whether higher locality in GE actually improves GE performance.
  • Knowledge based segmentation of fundus eye images
    42 colour image segmentation, knowledge representation, expert systems, rule induction systems, fundus eye images Leslaw Milosz Pawlaczyk, pages 421 – 435. In this article we describe a new method for segmentation of two anatomical structures visible in fundus eye images: vessels and the eye cup. The method is called the Universal Segmentation Scheme (USS) and is based on the knowledge stored in the rules of an expert system (ES) and the PNC2 rule induction system. The USS is a general method which is later adapted specifically for each of the anatomical structures. The results of segmentation are compared to the ground truth images, which show the relative segmentation error.
  • Genetic Programming for Dynamic Environments
    18 Adaptive, Dynamic, Genetic Programming, Option Pricing Application Zheng Yin, Anthony Brabazon, Conall O'Sullivan, Michael O'Neill, pages 437 – 446. Genetic Programming (GP) is an automated computational programming methodology which is inspired by the workings of natural evolution. It has been applied to solve complex problems in multiple application domains. This paper investigates the application of a dynamic form of GP in which the probability of crossover and mutation adapts during the GP run. This allows GP to adapt its diversity-generating process during a run in response to feedback from the fitness function. A proof of concept study is then undertaken on the important real-world problem of options pricing. The results indicate that the dynamic form of GP yields better results than are obtained from canonical GP with fixed crossover and mutation rates. The developed method has potential for implementation across a range of dynamic problem environments.

Computer Aspects of Numerical Algorithms

  • Applications of Finite Element Methods in synovial joint numerical calculations
    119 Finite Element Methods, mesh, synovial joint Anna Kucaba-Piętal, Jarosław Sęp, pages 449 – 455. The problem of meshing for numerical calculations of the influence of rapid human movement on the tribological features of a synovial joint is discussed. The calculations performed demonstrated the great effectiveness of the ADINA-F solver.
  • Molecular Dynamic computer simulations of nanoflows
    117 Molecular Dynamic Method, nanoflows, Ewald Sum, Anna Kucaba-Piętal, Janusz Bytnar, Zbigniew Walenta, pages 457 – 465. In this paper we present the results of utilizing scientific computing methodologies to address an engineering problem from nanotechnology. At the nano- and micro-scale, the calculations can only be done with a particle-based representation method. One of them is the Molecular Dynamics (MD) method. In the paper we describe the construction of the Molecular Dynamics Method and we present some results of MD simulations of water nanoflows [13, 14].
  • Linking of direct and iterative methods in Markovian models solving
    133 Gauss-Seidel method, block Gauss-Seidel method, preconditioning, Markov chains Beata Bylina, Jaroslaw Bylina, pages 467 – 477. The article identifies and assesses the effectiveness of two different kinds of methods applied to solve systems of linear equations which arise in the modeling of computer networks and systems with Markov chains. The paper considers both a hybrid of direct methods and a classic iterative method. Two varieties of Gaussian elimination are considered as examples of direct methods: the LU factorization method and the WZ factorization method. The Gauss-Seidel iterative method is also discussed. The issue of preconditioning and of dividing the matrix into blocks, where the blocks are solved using direct methods, is addressed. The paper presents the impact of the linked methods on both the time and the accuracy of determining the probability vector for particular networks and computer systems.
  • Airfoil shape optimization by coupling computational fluid dynamics with evolutionary multiobjective optimization
    177 computational fluid dynamics, multiobjective optimization, computational Grid Daniela Zaharie, Silviu Panica, Marius Stoia-Djeska, Mircea Dragan, Dana Petcu, pages 479 – 481. The problem of efficiently coupling computational fluid dynamics and evolutionary multiobjective optimization codes in order to solve problems of optimal design is discussed. Both the problem of providing an easy-to-use framework and that of the computational cost are addressed. Moreover, a user interface was designed to allow the execution of different instances of the combined code, with respect to the parameters of the evolutionary algorithm, on several machines from a Grid infrastructure.
  • Comparative Analysis of High Performance Solvers for 3D Elliptic Problems
    166 parallel algorithms, PCG method, preconditioner, MIC(0) factorization, circulant, performance Ivan Lirkov, Yavor Vutov, pages 483 – 492. The presented comparative analysis concerns two iterative solvers for 3D linear boundary value problems of elliptic type. After applying the Finite Difference Method (FDM) or the Finite Element Method (FEM) discretization, a system of linear algebraic equations has to be solved, where the stiffness matrix is large, sparse and symmetric positive definite. It is well known that the preconditioned conjugate gradient method is the best tool for efficient solution of large-scale symmetric systems with sparse positive definite matrices. Here, the performance of two preconditioners is studied, namely the Modified Incomplete Cholesky factorization MIC(0) and the Circulant Block-Factorization (CBF) preconditioning. Portable parallel codes are developed based on Message Passing Interface (MPI) standards. The comparative analysis is mostly based on the execution times to run the parallel codes. The number of iterations for both preconditioners is also discussed. The performed numerical tests on parallel computer systems demonstrate the level of efficiency of the developed algorithms. The obtained parallel speed-up and efficiency well illustrate the scope of efficient applications.
  • On the computer simulation of heat and mass transfer in vacuum freeze-drying
    68 freeze-drying, heat and mass transfer, ordinary and partial differential equations, Runge-Kutta methods, heat conduction equation, finite element and finite difference methods Krassimir Georgiev, Ivan Lirkov, Nikola Kosturski, Svetozar Margenov, pages 493 – 502. The paper is devoted to studying the problem of freeze-drying, which is a process of dehydrating frozen materials by sublimation under high vacuum. The mathematical and computer models describing this process are presented. The discretizations used and the numerical treatment of the corresponding ordinary and partial differential equations are discussed. The results of some test experiments and the corresponding analysis are presented.
  • Image reconstruction from incomplete data projections by means of iterative algebraic algorithms
    56 image reconstruction, iterative algebraic algorithms, computer tomography Nadiya Gubareni, Mariusz Pleszczynski, pages 503 – 515. In this paper we consider the problem of image reconstruction from incomplete projection data for some particular schemes of reconstruction. We present numerical reconstruction algorithms for the reconstruction of high-contrast objects from incomplete data. Numerical simulation results for a number of modeling objects with high contrast are presented and discussed.
  • Parallel PCG algorithms for voxel FEM elasticity systems
    63 FEM, PCG, MIC(0), AMG, parallel algorithms Svetozar Margenov, Yavor Vutov, pages 517 – 526. The presented comparative analysis concerns two parallel iterative solvers for large-scale linear systems related to µFEM simulation of human bones. The benchmark problems represent the strongly heterogeneous structure of real bone specimens. The voxel data are obtained by high resolution computer tomography. Non-conforming Rannacher-Turek finite elements are used for discretization of the considered problem of linear elasticity. It is well known that the preconditioned conjugate gradient method is the best tool for efficient solution of large-scale symmetric systems with sparse, positive definite matrices. Here, the performance of two parallel preconditioners is studied. Both are based on displacement decomposition. The first one uses modified incomplete Cholesky factorization MIC(0) and the other uses algebraic multigrid. The comparative analysis is mostly based on the computing times to run the codes. The number of iterations for both preconditioners is also discussed.
  • Implementing the Conjugate Gradient Method on a grid computer
    74 grid computing, large sparse linear systems, GridSolve, Conjugate Gradient Method, resource-aware partitioning Tijmen Collignon, Martin van Gijzen, pages 527 – 540. We study two implementations of the Conjugate Gradient method for solving large sparse linear systems of equations on a heterogeneous computing grid, using GridSolve as grid middleware. We consider the standard CG algorithm of Hestenes and Stiefel, and as an alternative the Chronopoulos/Gear variant, a formulation that is potentially better suited for grid computing since it requires only one synchronisation point per iteration, instead of two for standard CG. The computational work is divided into tasks which are dynamically distributed over the available resources using a resource-aware data partitioning strategy. We present numerical experiments that show lower computing times and better speed-up for the Chronopoulos/Gear variant. We also identify bottlenecks and suggest improvements to GridSolve.
  • Automatic First- and Second-Order Adjoints for Truncated Newton
    64 Truncated Newton, First- and Second Order Adjoints Uwe Naumann, Michael Maier, Jan Riehme, Bruce Christianson, pages 541 – 555. The analysis and modification of numerical programs in the context of generating and optimizing adjoint code automatically probably ranges among the technically and theoretically most challenging source transformation problems known today. A complete compiler for the target language (Fortran in our case) is needed to cover the technical side. This amounts to a mathematically motivated semantic transformation of the source code that involves the reversal of the flow of data through the program. Both the arithmetic complexity and the memory requirement can be substantial for large-scale numerical simulations. Finding the optimal data-flow reversal schedule turns out to be an NP-complete problem. The same complexity result applies to other domain-specific peephole optimizations. In this paper we present a first research prototype of the NAGWare Fortran compiler with the ability to generate adjoint code automatically. Moreover, we discuss an approach to generating second-order adjoint code for use in Newton-type algorithms for unconstrained nonlinear optimization. While the focus of this paper is mostly on the compiler issues, some information on the mathematical background will be found helpful for motivational purposes.

7th International Multidisciplinary Conference on Electronic Commerce

  • Legal Aspects of Deep Links on the Internet
    39 deep linking, search engine, news bulletins Artur Strzelecki, pages 559 – 563. Links are found everywhere and they are what creates the World Wide Web. The possibility of using the whole network is the essence of the polycentric, decentralized structure of the Internet. The user may quickly and easily surf web sites. Search engines reach editorial texts and photos presented in the news through deep links. Skipping the home page of a given web site normally means that a lower number of visits is recorded. The number of visits has an impact on the income from adverts. The problem of an effective defense against unwanted deep links occurs especially in the media, where content is generally protected by copyright. Many e-commerce enterprises want to hold the highest position in search results. Search engines copy up-to-date press news or refer to it directly. Disputes arise over ownership and the aim of using the copied information. The article discusses the significance and effects of using deep links to up-to-date press content coming from search engines. Practical examples of disputes arising from deep links are presented. The author proposes a method of using deep links as a sales driver in news services.
  • Application of stochastic processes in Internet survey
    127 Internet survey, uncontrolled sample, population, coherent system, estimation, reliability function Elżbieta Getka-Wilczyńska, pages 565 – 578. In this paper, Poisson processes and basic methods of reliability theory are proposed for the interpretation, definition and analysis of some stochastic properties of the process of Internet data collection. First, the notion of an uncontrolled sample is introduced and its random size is defined as a counting process. Second, the process of Internet data collection is considered as a life test of the population surveyed. The events which appear in an Internet survey are interpreted as the lifetime, arrival and death of an element of the population, and the basic reliability characteristics of the length of the population lifetime are described, calculated and estimated using the notions and methods of reliability theory.
  • Data model standardization for real-time e-commerce
    142 real-time markets, systems integration, market modeling Przemysław Kacprzak, Mariusz Kaleta, Piotr Pałka, Kamil Smolira, Eugeniusz Toczyłowski, pages 579 – 588. Real-time activity is an important rising trend in e-commerce. This kind of activity requires specialized, fully standardized and integrated market systems. In this paper we present and discuss basic requirements for real-time e-commerce systems. We also present M³, the Open Multi-commodity Market Data Model, which may provide flexible and universal market data and communication models for a wide range of markets, and thereby facilitate systems integration for real-time e-commerce purposes.
  • Purchasing power indicators as a tool for predicting market attractiveness for SMEs in a changing currency environment
    180 currency fluctuations, SMEs, e-commerce Jacek Wachowicz, pages 589 – 596. E-commerce gave SMEs the possibility to act on international markets. However, this introduced some new threats to business operations. Apart from technological, knowledge, workforce and increased-competition issues, they need to face changes in exchange rates and business-cycle fluctuations. This paper presents a concept of simple indicators which may very quickly select markets of great potential arising from financial market changes.
  • Even Swaps Method for Developing Assessment Capabilities of E-Negotiation System
    35 multiple criteria decision making, even swaps, negotiation support, ENSs Tomasz Wachowicz, pages 597 – 606. One of the features of e-negotiation systems is the capability to support negotiators in evaluating and comparing offers. This is usually done by means of an additive scoring system, which results in abstract scores assigned to all the offers. However, the process of assigning scores to the issues and options, required by an additive scoring system, may be perceived by some decision makers as affected and vague. In the paper we thus consider an alternative approach based on the even swaps method. It is part of the multiple attribute decision making methodology called PrOACT, proposed by Hammond, Keeney and Raiffa, and focuses on finding equivalent amounts that balance the units of one issue against the units of the others. The method is adapted to the negotiation context and programmed in a spreadsheet as prototype software.
  • The Brand Equity - Marketing and Financial Approach
    53 brand equity, virtual communities, valuing customers on Internet market Urszula Świerczyńska-Kaczor, Paweł Kossecki, pages 607 – 613. The authors emphasize two important issues of the management and measurement of brand equity on the Internet market: creating relationships between the company and virtual communities, and the process of valuing customers. Strengthening cooperation with virtual communities leads to enhanced brand equity in the areas of brand perception, customer loyalty and company reputation. In this way the potential of the company's brand and its portfolio is reinforced. Moreover, brand equity depends on the value of the customer base. We propose a method of valuing customers by referring to the level of customers' loyalty and their Customer Lifetime Value.

2nd International Workshop on Secure Information Systems

  • Integrated, Business-Oriented, Two-Stage Risk Analysis
    60 Information security management, ISMS, Risk analysis Andrzej Białas, Krzysztof Lisek, pages 617 – 628. This paper presents an integrated, business-oriented, two-stage risk analysis method related to the Information Security Management Systems (ISMS) concept. The current state of the work is presented, including risk analysis methods and their implementation. The concept assumes the integration of preliminary overviews as well as high- and low-level risk analyses. High-level risk analysis works with the needs of business processes and presents the criticality of these processes. Low-level risk analysis works with assets and selects safeguards in a cost-effective manner. It is assumed that the presented risk analysis concept can be used in other management systems: business continuity and IT services management. The paper summarizes the current state of the work and defines its further directions.
  • Securing Voice over Internet Protocol
    16 VoIP, Security, IPSec, Security Levels, Media Gateway Control Protocol Ahmad Ghafarian, Randolph Draughorne, Steven Grainger, Shelley Hargraves, Stacy High, Crystal Jackson, pages 629 – 639. In recent years, there has been a significant increase in VoIP and Internet telephony usage. The users, whether corporate or individual, are subject to the same security risks that have affected data networks for many years. This is mainly because voice networks are IP-based and all IP protocols for sending voice traffic contain flaws. In this paper, we study the security risks associated with VoIP, including vulnerabilities, man-in-the-middle attacks, and denial-of-service. We also review the protection measures that can be taken to make VoIP more secure, such as authorization, authentication, transport layer security, and media encryption.
  • Anomaly Based Intrusion Detection Based on the Junction Tree Algorithm
    89 intrusion detection, anomaly-based intrusion detection systems, junction tree algorithm Evgeniya Nikolova, Veselina Jecheva, pages 641 – 649. The aim of this paper is to present a methodology for the recognition of attacks during normal activities in the system. Since the proposed approach uses a graphical representation method, we apply the junction tree algorithm (JTA). Some results from the conducted simulation experiments are presented as well.
  • Model based code generation for fast-deployment security applications
    173 integrated security systems, code generation, fast deployment Gyula Simon, László Szabados, András Tóth, pages 651 – 660. A diverse set of sensors and actuators are key components of integrated security systems, which provide protection against various types of attacks and threats. Depending on the type of the protected objects and environment, the sensor/actuator components can be completely different, and the control logic, which makes decisions based on the sensor readings, must be configured to the actual scenario. Short-lifetime security systems require fast and cost-effective deployment, but the safety requirements are still high. In this paper an architecture and a corresponding model-based code generation scheme are proposed, which provide easy and fast deployment for security applications with various sensory needs.
  • Dealing With Network Security in Academic Institutions - a Case Study
    134 network security, firewall, IDS, IPS, FortiGate Ivan Dolezal, Jiri Grygarek, Ondrej Jakl, Karel Krecmer, pages 661 – 670. The paper deals with the real-life experience of the authors with their efforts at a radical security improvement of the academic computer networks that they administer at a large university and a medium-sized research institute. The solution, which started in 2004 and is still going on, has been based on hardware multi-threat security appliances with high throughput. The requirements on them included a combination of general purpose Intrusion Prevention System, HTTP/FTP antivirus capabilities and firewall functions. In particular, we describe our experience with the appliances of the FortiGate series, which have been deployed as the best solution available.
  • A Joint Meta-Linguistic Taxonomy of Intrusion Detection and Testing / Verification
    120 intrusion detection, verification, formal methods, languages, taxonomy Krzysztof Brzeziński, pages 671 – 680. Show abstract Current research into intrusion detection makes only marginal use of results obtained by the community concerned with formal verification and testing. To harmonize the ideas and methods used by these two separate disciplines, we develop a discourse space (a taxonomy) in which the linguistic problems common to testing (in particular passive testing) and intrusion detection are captured. It is shown that the currently accepted main intrusion detection paradigms can be described and explained by this taxonomy.
  • Picture Passwords Superiority and Picture Passwords Dictionary Attacks
    139 graphical authentication, picture passwords superiority, cryptanalysis, dictionary attacks, strong passwords, human factor Krzysztof Golofit, pages 681 – 690. Show abstract This paper explores authentication techniques based on pictures as a possible solution to the most important problems concerning traditional passwords. The aim of this work is to bring together technical (cryptological) and non-technical (psychological) awareness in research on passwords. Security issues of any authentication mechanism relying on knowledge should not be considered without an analysis of the human factor, since the users' human nature has been identified as a source of major weaknesses of conventional authentication. Several issues of security and the possibility of practical application are discussed in the paper. First, the statistically significant superiority of picture passwords over alphanumerical ones is presented. Then, it is shown that the crucial weaknesses of picture passwords follow less from picture locations and more from their meaning and recognizability. Moreover, techniques leading to resistance to ‘key logging’ and ‘mouse tracking’ are discussed. Finally, methods guaranteeing that users choose dissimilar, personalized and cryptographically strong graphical passwords are proposed.
  • Design and Implementation of a Portable ID Management Framework for a Secure Virtual Machine Monitor
    47 ID Management, Authentication, Virtual Machine Monitor, VMM, Hypervisor, Smart card, ID card Manabu Hirano, Takeshi Okuda, Eiji Kawai, Suguru Yamaguchi, pages 691 – 700. Show abstract A commonly used virtual machine monitor (VMM) allows multiple operating systems to share physical hardware resources as virtual resources in a safe manner. It provides a strong isolation mechanism between virtual machines (VMs). In this paper, we state the importance of ID management for a security-purpose VMM system to enforce security policy on an end-user environment. We present the design of a portable ID management framework for a security-purpose VMM. Our proposal employs a smart card (ID card) for user authentication. The proposed ID management framework can provide generic programming interfaces to existing VMM software. Our ID management framework realizes authentication between a VMM and its users, authorization of VM boot operations, storage of cryptographic keys for VMM-layer disk encryption/decryption, and access control for virtual/physical resources based on user identity. In this paper, we show a prototype implementation of our ID management framework and its integration into the proven VMM software QEMU.
  • Access Control for Cross-Organisational Web Service Composition
    70 Cross-Organisational Services, Access Control, SOA Security, Trust Michael Menzel, Christian Wolter, Christoph Meinel, pages 701 – 711. Show abstract Service Oriented Architectures (SOA) promise a flexible approach to utilizing distributed capabilities that may be located in independent trust domains. These capabilities can be exposed using Web Service technologies, which provide functionality to describe, discover, and invoke exposed services across organisational boundaries. A broad range of SOA platforms and toolkits are available, focusing on Web Service enablement and orchestration within an organisation. This paper addresses the evaluation and classification of different SOA platforms and security frameworks regarding secure cross-organisational service invocation. To overcome the revealed limitations of existing frameworks, a two-layered security architecture is introduced that satisfies the identified security requirements and abstracts from local access control models to enable secure federated cross-organisational service compositions.
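
The two-stage risk analysis concept summarized in the ISMS entry above (pages 617 – 628) can be illustrated with a minimal Python sketch. The process and asset data, the rating scales, and the cost-effectiveness rule below are hypothetical illustrations, not the authors' method:

      # Illustrative two-stage risk analysis (hypothetical data and scales).

      # Stage 1: high-level analysis -- rank business processes by criticality.
      processes = {"invoicing": {"financial": 4, "legal": 3, "reputation": 2},
                   "e-mail":    {"financial": 2, "legal": 1, "reputation": 3}}
      criticality = {name: max(impacts.values()) for name, impacts in processes.items()}

      # Stage 2: low-level analysis -- per-asset risk and cost-effective safeguard choice.
      assets = [  # (asset, supporting process, threat likelihood 1-5, impact 1-5)
          ("billing DB", "invoicing", 3, 5),
          ("mail server", "e-mail", 4, 3),
      ]
      safeguards = [  # (name, cost, risk reduction factor 0-1)
          ("daily backup", 2.0, 0.5),
          ("IPS sensor", 5.0, 0.4),
      ]

      for asset, proc, likelihood, impact in assets:
          risk = likelihood * impact * criticality[proc]        # simple multiplicative scale
          ranked = sorted(safeguards, key=lambda s: s[2] * risk / s[1], reverse=True)
          best = ranked[0]                                      # most risk reduction per unit cost
          print(f"{asset}: risk={risk}, pick '{best[0]}' "
                f"(residual={risk * (1 - best[2]):.1f}, cost={best[1]})")

A real ISMS analysis would of course use the organisation's own asset inventory, threat catalogue and risk acceptance criteria.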

International Conference on Principles of Information Technology and Applications

  • Application of Model Transformation in the Generic Framework for Traceability
    79 model transformation, metamodel, MDA, traceability, dependency area, UML Anna Derezinska, Jacek Zawlocki, pages 715 – 724. Show abstract The model transformation approach allows us to develop automatic and flexible solutions for software evolution. The application of model transformation concepts is shown for a generic framework for traceability in object-oriented designs. Three transformations within the framework are considered: the input transformation of any model to the internal format, the traceability analysis generating a dependency area for a given model, and the output transformation of the resulting dependency area. They can be realized as model-to-model transformations with respect to their metamodels and in accordance with independently specified transformation rules. In the input and output transformations the language and tools of the QVT standard proposed by the OMG are applied. In the remaining transformation, traceability rules are defined as automata with transitions labeled with conditions and actions.
  • Performance of Index trees on Flash Memory
    121 Database, Flash memory, index trees, DB indexing performance Hyun Seung Lee, Ha Yoon Song, Kyung-Chang Kim, pages 725 – 734. Show abstract Flash memory can be a viable solution for future embedded systems. Embedded systems usually carry a database as part of the embedded software. It is well known that flash memory is far faster than conventional hard disk storage, especially for reads; however, deletions on flash memory take much longer than reads or writes. For database systems, index trees on flash memory have been widely studied for their better performance, but there has been no standard measure of flash memory database performance. In order to determine the storage access performance of index trees, we measured index tree access times. We chose the B-tree, R-tree and MR-tree as prominent index trees, and hard disk and flash memory as storage devices. With several indexing profiles having different mixtures of insert, search (read) and delete operations, we measured access time for different combinations of index tree structure and storage device. We hope our results will serve as a basis for flash memory database system design and performance tuning. (A minimal illustrative sketch of the workload-mix measurement idea appears after this list.)
  • Monitoring services in Service Oriented Architecture
    159 SOA, monitor, SLA, quality of service Ilona Bluemke, Marcin Warda, pages 735 – 744. Show abstract At the Institute of Computer Science, Warsaw University of Technology, a tool for monitoring services in Service Oriented Architectures (SOA) was designed and implemented. This tool was used in experiments on a real enterprise integration architecture, e.g., the effectiveness and resource usage of some flows were measured. These experiments are described and some conclusions are given. The monitoring module may be very useful in the maintenance of complex SOA systems.
  • Compositional Abstractions for Interacting Processes
    24 Behaviour abstraction, communicating sequential processes, compositionality, algebra of abstractions Maciej Koutny, Giuseppe Pappalardo, Marta Pietkiewicz-Koutny, pages 745 – 754. Show abstract A promising way of dealing with the complex behaviours of networks of communicating processes is to use abstractions. In our previous work, interface abstraction, modelled through a suitable relation, allowed us to 'interpret' the behaviour of an implementation process as that of a specification process, even in the event that their interfaces differ. The proposed relation is compositional, in the sense that a composition of communicating sub-systems may be implemented by connecting their respective implementations. But so far abstraction has been shown to distribute only over network composition, which restricts its usefulness for compositional correctness analysis. In this paper we extend the treatment to other process constructs which have proved useful in the development of complex distributed applications.
  • The Concept of Quasi-objects in a Temporal Intelligent System
    15 quasi-object, temporal intelligent system, heterogeneous domains Maria Antonina Mach, pages 755 – 764. Show abstract The paper presents the notion and concept of quasi-objects that form the representation layer of a temporal intelligent system. The tasks of such a system are briefly discussed to justify the use of quasi-objects, the origin of the concept is presented, and the main features of quasi-objects are outlined. Two examples of quasi-objects are presented and discussed. Finally, the advantages of this concept are briefly shown.
  • Federated Method Invocation with Exertions
    96 distributed systems, service oriented programming, metacomputing Michael Sobolewski, pages 765 – 778. Show abstract Six generations of RPC systems can be distinguished, including Federated Method Invocation (FMI) presented in this paper. Some of them (CORBA, Java RMI, and Web/OGSA services) support distributed objects. However, creating object wrappers implementing remote interfaces does not have a great deal to do with object-oriented distributed programming. Distributed objects developed that way are usually ill-structured, missing core object-oriented traits (encapsulation, instantiation, inheritance, and network-centric messaging) and ignoring the real nature of networking. A distributed system is not just a collection of distributed objects; it is a network of dynamic objects. In particular, the object wrapping approach does not help to cope with network-centric messaging, invocation latency, object discovery, dynamic object federations, fault detection, recovery, partial failure, etc. The Jini™ architecture does not hide the network; it allows the programmer to deal with the network reality: leases for network resources, distributed events, transactions, and discovery/join protocols to form service federations. The exertion-based architecture presented in this paper implements FMI to support service-oriented metaprogramming. The new triple Command pattern architecture uses Jini service management for managing the network of FMI objects.
  • Dependability of the Explicit DMC Algorithm for a Rectification Process
    78 dependability evaluation, fault injection, DMC algorithm, rectification process Piotr Gawkowski, Maciej Ławryńczuk, Piotr Marusak, Janusz Sosnowski, Piotr Tatjewski, pages 779 – 788. Show abstract The paper studies the dependability of a software implementation of the explicit DMC (Dynamic Matrix Control) Model Predictive Control (MPC) algorithm applied to a rectification column. A process with two inputs and two outputs, with strong cross-couplings and significant time delays, is studied. The algorithm's control law is calculated off-line. Dependability is evaluated experimentally using a software-implemented fault injection approach. The injected faults influence the quality of the rectification process.
  • On communication management as a key element of successful IT program
    85 Rafal W. Cegielski, Jaroslaw Chudziak, Joanna Meyer, pages 789 – 798. Show abstract In this paper we discuss the role of communication management in each phase of a large implementation of IT solutions. It is well known that the majority of project failures in recent years are due not to a lack of technical competency in the supplying team, but rather to poorly prepared and executed communication between the various participants in the implementation process. In the paper we present various dimensions of the communication process, its role and objectives, as well as the main participants and several modern tools and channels that can be used in the process.
  • Shared Ontologies to Increase Systems Interoperability in University Institutions
    126 Interoperability, Ontologies Integration, Shared Ontology, Protégé-PROMPT, University Institutions, Methodology Richard Gil, Ana María Borges, Leonardo Contreras, pages 799 – 808. Show abstract The increasing demand for quality information has forced the methodological resources of systems engineering to be gradually refined. The practical materialization of diverse concepts and relations requires semantic technology and ontological engineering to deliver their full advances and potential. University institutions must stay abreast of these findings. The purpose of this paper is to show a first version of the ontological integration of some subsystem models that interact within and outside this type of institution. These subsystems were intentionally selected from previous studies. The Protégé-PROMPT tool is used as a resource for the integration of experimental ontologies of these subsystems, using as a methodological reference a framework that adopts a shared-terminology approach in order to integrate them into a compatible ontology. As a result, a method for obtaining a shared ontology is proposed.
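
The workload-mix measurement idea from the flash-memory indexing entry above (pages 725 – 734) can be sketched as follows. This is not the authors' benchmark: a sorted Python list maintained with bisect stands in for an index tree, the storage device is simply the interpreter's memory, and only the structure of the experiment (timing different insert/search/delete mixes) is illustrated:

      import bisect, random, time

      def run_profile(n_ops, mix):
          """Time a mix of insert/search/delete operations on a sorted-list 'index' (stand-in)."""
          keys = []                                   # stand-in for an index tree
          ops = random.choices(["insert", "search", "delete"], weights=mix, k=n_ops)
          start = time.perf_counter()
          for op in ops:
              k = random.randint(0, 10_000)
              i = bisect.bisect_left(keys, k)
              if op == "insert":
                  keys.insert(i, k)                   # keep the list sorted
              elif op == "search":
                  _ = i < len(keys) and keys[i] == k  # membership test
              elif op == "delete" and i < len(keys) and keys[i] == k:
                  keys.pop(i)
          return time.perf_counter() - start

      random.seed(0)
      # hypothetical profiles: (insert, search, delete) weights
      for name, mix in {"read-heavy": (1, 8, 1), "write-heavy": (6, 2, 2)}.items():
          print(name, f"{run_profile(20_000, mix):.3f} s")

A real study in the spirit of the paper would run such profiles against B-/R-/MR-tree implementations on actual flash and hard-disk devices.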

International Workshop on Real Time Software

  • Interactive real-time control labs with truetime and easy java simulations
    86 education, real-time, control, virtual lab, simulation, interactive, truetime, ejs, matlab, simulink Gonzalo Farias, Karl-Erik Årzén, Anton Cervin, pages 811 – 820. Show abstract This paper presents the development of interactive real-time control labs using TrueTime and Easy Java Simulations. TrueTime is a freeware Matlab/Simulink-based tool to simulate real-time control systems, and Easy Java Simulations allows rapid creation of interactive simulations in Java. Authors can use TrueTime to develop the simulation of a real-time control system, and then move to Easy Java Simulations to link the system and create the graphical user interface which provides the visualization and user interaction. The combination of these tools brings together the best of both.
  • Modelling and schedulability analysis of real-time sequence patterns using Time Petri nets and Uppaal
    32 real-time systems, time Petri nets, timed automata, UPPAAL, schedulability, firing sequence patterns Angelo Furfaro, Libero Nigro, pages 821 – 835. Show abstract This paper proposes an original approach to the schedulability analysis of real-time systems specified by Time Petri Nets (TPNs). The focus is on sequence patterns of transition firings (execution tasks). A TPN model is first translated into the Timed Automata formalism of the popular Uppaal tool. Then the schedulability properties of tasks are verified through reachability analysis. The approach is efficient and scalable. The paper demonstrates the concrete application of the approach through examples. Finally, conclusions are drawn together with an indication of on-going and future work.
  • Improving Dependability of Automation for Free Electron Laser FLASH
    105 Free-electron laser FLASH, automation, model checking, formal verification Boguslaw Koseda, Tomasz Szmuc, Wojciech Cichalewski, pages 837 – 847. Show abstract The free-electron laser FLASH (a 260-meter-long machine) [ISBN 3-935702-17-5] is a pilot facility for the forthcoming XFEL [ISBN 3-935702-17-5] (3 km) and ILC [ISBN 0-309-66039-4] (approx. 35 km) projects. Along with the growth of the experiment, service and maintenance are becoming so complex that a certain degree of automation seems inevitable. The main purpose of the automation software is to provide operators with computer-aided supervision of several hardware/software subsystems. The efforts presented in this contribution concern the elaboration of a general framework for the design and development of automation software for FLASH. The toolkit facilitates specification, implementation, testing and formal verification. The ultimate goal of the framework is to systematize the way automation software is developed and to improve its dependability. At present, the usefulness of the tools is being evaluated by testing the automation software for a single RF power station of FLASH.
  • Task jitter measurement under RTLinux operating system
    48 RTLinux, Jitter, Measurement Pavel Moryc, Jindrich Cernohorsky, pages 849 – 858. Show abstract This paper deals with real-time task jitter measurement under the RTLinux operating system. The first part describes the methods and tools developed to measure jitter in the RTLinux environment. The second part focuses on the discussion of results obtained on PC hardware and their interpretation. (A minimal illustrative sketch of the jitter-measurement idea appears after this list.)
  • Robust Real-Time Communication for Large Scale Distributed Control Systems
    87 distributed control system, real-time, redundancy, communication network, supervisory control, scheduling, OPC Mariusz Postol, pages 859 – 869. Show abstract Because of their scale, complexity and requirement of expandability, Large Scale Distributed Control Systems (LSDCS) are usually created in a multistep integration process. To succeed, this process has to be governed by a well-defined information architecture and appropriate communication infrastructure, with the supervisory role of the notion of time taken into consideration from the very beginning of the design stage. The mutual influence of the architecture and the underlying communication is discussed in the paper, and a novel systematic design methodology is proposed to greatly reduce the complexity. A dedicated communication component is proposed in this approach. The functionality and scheduling algorithms offered by these components make it possible to satisfy all the defined prerequisites of real-time distributed control and to design a robust system in a systematic and uniform way. The presented case study proves that the solution not only allows the real-time process requirements to be met, but also provides a platform for multi-enterprise collaboration.
  • Veriest: Reusing Verilog Designs in Esterel
    46 synthesizable verilog, esterel, synchronous languages Menachem Leuchter, Shmuel Tyszberowicz, Yishai Feldman, pages 871 – 881. Show abstract Veriest is an automatic translator that converts synthesizable Verilog designs into the synchronous language Esterel. The translation into a synchronous language can expose hidden flaws in the original design, including subtle race conditions. In addition, the extensive libraries of verified Verilog designs can now be reused in synchronous designs. Verilog and Esterel have different models and features, complicating the translation. For example, Verilog has flexible data types and operators for dealing with data buses of varying widths; it also supports three-state logic, which has no equivalent in languages not meant to describe hardware. Veriest creates functions in the hosting language (usually C) to concisely represent those features of Verilog that are not native to Esterel.
  • Towards the Safety Verification of Real-Time Systems with the Coq Proof Assistant
    97 hybrid systems, safety verification, discrete abstraction, proof assistants Olga Tveretina, pages 883 – 892. Show abstract Hybrid systems are systems involving the interaction of discrete and continuous dynamics. Hybrid systems have been used as a mathematical model for many safety-critical applications. One of the most important analysis problems for hybrid systems is the reachability problem. Approaches based on predicate abstraction are widely used for reachability analysis, but they are not efficient enough because they introduce additional transitivity along the series of abstract states. In this paper we give an approach to solving this problem for some classes of hybrid systems. A verification example formalized within the Coq proof assistant is provided.
  • A speed classification method for real-time controlled dynamic systems
    145 real-time control, digital control, scheduling, time-critical systems, FPGA, magnetic levitation Paweł Piątek, Wojciech Grega, pages 893 – 902. Show abstract Traditionally, control algorithms are designed without consideration of their real-time implementation details. The performance of a digital control system depends not only on the sampling period but also on many other variables, such as the control loop execution time, jitter, and the complexity of the control algorithm. In this paper attention is focused on the interaction between the parameters of the scheduled tasks and the performance of control loops closed with a digital controller. A new design approach based on a relative speed classification of the control system is proposed. The approach is illustrated by the analysis of control systems developed for a laboratory magnetic levitation process.
  • Dependability of Explicit DMC and GPC Algorithms
    81 dependability evaluation, safety, fault injection, DMC algorithm, GPC algorithm, robot Piotr Gawkowski, Maciej Ławryńczuk, Piotr Marusak, Janusz Sosnowski, Piotr Tatjewski, pages 903 – 912. Show abstract This paper studies the dependability of software implementations of the DMC (Dynamic Matrix Control) and GPC (Generalised Predictive Control) Model Predictive Control (MPC) algorithms. An explicit formulation of the algorithms is considered, in which the control laws are calculated off-line. Dependability is evaluated using a software-implemented fault injection approach. Tests are performed in the control system of a remotely controlled robot vehicle used in nuclear plants.
  • Incorporating Fault Tolerance into Component-based Architectures for Embedded Systems
    36 Fault tolerance, component-based development, software architecture, embedded systems, UML Shourong Lu, Wolfgang Halang, pages 913 – 922. Show abstract A component-based software architecture is presented to support the process of designing and developing fault-tolerant computerised control systems. To this end, we combine an idealised fault-tolerant component, the C2 architectural style and protective wrappers, and embed fault tolerance techniques into component definitions. The resulting architecture is described by normal- and abnormal-activity components aiming to support a wide range of fault tolerance features. Use of this architecture enables reasoning about system dependability from the earliest development stages on, and customising fault tolerance strategies according to application characteristics.
  • Using Preemption For Dependable Urban Vehicle Traffic
    110 real-time scheduling, real-time control, urban vehicle traffic, dependable systems Tiberiu Letia, Sergiu Barbu, Florin Dinga, pages 923 – 932. Show abstract A new approach to the design and implementation of an urban vehicle traffic control system is proposed. The vehicle streams on lanes are considered similar to the streams of instructions in multitask programs. Real-time scheduling algorithms are used to allocate the green lights to phases. An adaptive component is used to calculate new vehicle flow parameters when a failure appears as a consequence of an accident. The real-time schedulers use these parameters to obtain new feasible resource allocations.
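
The jitter-measurement idea from the RTLinux entry above (pages 849 – 858) can be sketched in user-space Python. The original measurements were taken with RTLinux kernel-space tasks, so the sleep-based loop below is only a stand-in showing how wake-up deviations from a nominal period can be collected and summarized:

      import time, statistics

      def measure_jitter(period_s=0.01, iterations=200):
          """Record deviations of actual wake-up times from the nominal release times."""
          deviations = []
          next_release = time.perf_counter() + period_s
          for _ in range(iterations):
              time.sleep(max(0.0, next_release - time.perf_counter()))
              now = time.perf_counter()
              deviations.append(now - next_release)   # lateness of this activation
              next_release += period_s                # absolute schedule, no drift accumulation
          return deviations

      dev = measure_jitter()
      print(f"max jitter  = {max(dev) * 1e6:8.1f} us")
      print(f"mean jitter = {statistics.mean(dev) * 1e6:8.1f} us")
      print(f"stdev       = {statistics.stdev(dev) * 1e6:8.1f} us")

On a general-purpose OS these numbers mostly reflect scheduler and timer granularity; the paper's point is to obtain and interpret such statistics for real-time tasks under RTLinux.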

Invited Papers for the Round Table Discussion on Real Time Software Engineering Education

  • The modular approach of a real-time course
    174 real time systems education Shmuel Tyszberowicz, pages 933 – 937. Show abstract This article describes some thoughts concerning the needed syllabus for a "Real-Time Systems" course for computer science students. The ideas described are based on several years of experience in teaching such a course both for graduate and advanced undergraduate (in the last year of studies) students at several academic institutes: Tel-Aviv University, the Open University of Israel, and the Academic College of Tel-Aviv Yaffo.
  • A Model for Educating Real-Time Software Engineers On-Demand
    178 Real-Time Education, Educational Model, Student projects Janusz Zalewski, pages 939 – 942. Show abstract This paper discusses educating software engineers at the undergraduate level to prepare them for taking immediately available jobs in industry, in response to the industry demand in the area of real-time systems. It is focused on the innovativeness of organizational aspects of the educational process, involving its stakeholders, rather than on the curricular aspects, traditionally considered first.
  • ILERT - International Learning Environment for Real-Time Software-Intensive Control Systems
    171 engineering education, globalization, real-time software, real-time control, safety Andrew J. Kornecki, Thomas B. Hilburn, Wojciech Grega, Jean-Marc Thiriet, Miroslav Sveda, pages 943 – 948. Show abstract Due to the heavily software-centric nature of modern reactive and time-critical systems, there is an increasing demand for efficient development of high-quality Real-Time Software-Intensive Control systems (RSIC). The study discussed in this paper is focused on the creation of an international curriculum framework centered on RSIC, an important aspect of computer-system-control-software engineering education. The study explores mechanisms for involving students from multilingual, geographically separated institutions in a coordinated educational experience. It exposes them to the problems, methods, solution techniques, infrastructure, technologies, regulatory issues, and tools in the domain of dependable real-time safety-critical software-intensive control systems. The ultimate objective is the creation of a model RSIC curriculum, which can be used by engineering schools both in the USA and in the EU.
  • Educational Objectives for Embedded Systems
    176 Education, real-time systems, embedded systems, concepts, professionalism, ethics Wolfgang Halang, pages 949 – 952. Show abstract Adequate concepts, professionalism and ethics are identified as the goals in properly educating systems engineers for professional life. Criteria and concepts appropriate to design feasible solutions must be based on fully understanding the peculiarities of the environments where embedded systems are employed. Bad practices of the computing profession need to be overcome to foster professionalism, and the consequences of the shift from hardware to software must be realised. This gives rise to some rules of ethics and proper conduct.

Workshop on Ad-Hoc Wireless Networks: Urban Legends and Reality

  • Acoustic Target Classification in Wireless Audio-Sensor Networks
    167 target classification and tracking, sensor network applications, audio applications Baljeet Malhotra, Ioanis Nikolaidis, Janelle Harms, pages 955 – 964. Show abstract Target tracking is one of the important problems in wireless sensor networks. We consider one aspect of tracking: the classification of targets based on the acoustic signals produced by vehicles. We present a naïve classifier and simple distributed schemes based on features extracted from acoustic signals. We demonstrate a novel way of using Aura matrices to create a new feature derived from the Power Spectral Density (PSD) of a signal, which performs on par with other existing features. An experimental study has been conducted using real acoustic signals of different vehicles in an urban setting. Our proposed schemes using a naïve classifier achieved highly accurate results in classifying different vehicles into two classes. Communication overheads were also computed to capture the tradeoff between energy cost and classification quality. (A minimal illustrative sketch of PSD-based classification appears after this list.)
  • Analysis of IEEE 802.11b/g Card Behavior in Multirate Ad-hoc Networks
    106 ad-hoc, IEEE 802.11b/g cards, measurements, multirate Katarzyna Kosek, Szymon Szott, Marek Natkaniec, Andrzej R. Pach, pages 965 – 974. Show abstract In multirate ad-hoc networks, mobile stations usually adapt their transmission rates to the channel conditions. This paper investigates the behavior of IEEE 802.11b/g cards in a multirate ad-hoc environment. WLAN network cards from different vendors are analyzed. A theoretical upper-bound estimate of the throughput in multirate ad-hoc networks is derived. The measurement scenarios and the obtained results are presented. For result validation, the theoretical and experimental values are compared.
  • Self-organizing Ad-hoc Architecture as a Solution for Coverage Issues in WiMAX Metropolitan Area Networks
    170 WiMAX, testbed, coverage, mesh Krzysztof Gierlowski, Józef Woźniak, Krzysztof Nowicki, pages 975 – 984. Show abstract The new WiMAX technology offers several advantages over currently available (GSM- or UMTS-based) solutions. It is a cost-effective, evolving, and robust technology providing quality of service guarantees, high reliability, wide coverage and non-line-of-sight transmission capabilities. All these features make it especially suitable for densely populated urban environments. In the paper we discuss design and implementation difficulties concerning network coverage, discovered in the test-bed implementation during measurements and tests. We point out unexpected “coverage white spots” that are not characteristic of WiMAX technology. As one possible solution to this significant drawback of a very promising technology, we consider a reconfigurable mesh organization of WiMAX base stations. We also suggest directions for further development of this kind of network operation, partly based on our practical experience.
  • Ad-hoc networking with low-cost devices: how to do it right
    162 ad-hoc networks, routing, small footprint Pawel Gburzynski, Wlodek Olesinski, pages 985 – 994. Show abstract Although simple wireless communication involving nodes built of microcontrollers and radio devices from the low end of the price spectrum is quite popular these days, one seldom hears about serious wireless networks built from such devices. Commercially available nodes for ad-hoc networking (somewhat inappropriately called "motes"; see for example http://www.xbow.com/) are in fact quite serious computers with tens of megabytes of RAM and rather extravagant resource demands. We show how one can build practical ad-hoc networks using the smallest and cheapest devices available today. In our networks, such devices are capable of sustaining sophisticated swarm-intelligent routing while offering enough processing power to cater to complex applications involving distributed sensing and monitoring.
  • Problems When Realizing Ad Hoc Networks: How A Hierarchical Architecture Can Help
    19 implementation, experiments, wrongful assumptions, hierarchical mesh Stefan Bouckaert, Dries Naudts, Ingrid Moerman, Piet Demeester, pages 995 – 1004. Show abstract Despite the fact that many electronic devices are equipped with wireless interfaces and a lot of publications on wireless ad hoc and mesh networking exist, these networks are seldom used in our everyday life. A possible explanation is that only a few of the numerous theoretically promising proposals lead to practical solutions on real systems. Currently, wireless network design is mostly approached from a purely theoretical angle. In this paper, common theoretical assumptions are challenged and disproved, and key problems that are faced when putting theory into practice are determined by experiment. We show how these problems can be mitigated, and motivate why a heterogeneous hierarchical wireless mesh architecture can help in making wireless ad hoc networking a reality.
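
The acoustic classification entry above (pages 955 – 964) combines PSD-derived features with a naïve classifier; its Aura-matrix feature is not reproduced here. The following hypothetical Python sketch only shows the general shape of such a pipeline: band energies of an FFT-based PSD as features, and a nearest-centroid decision between two synthetic signal classes standing in for vehicle types:

      import numpy as np

      FS = 8000          # sampling rate (Hz), illustrative
      N = 4096           # samples per frame
      BANDS = 8          # number of equal-width PSD bands used as features

      def band_features(signal):
          """FFT-based power spectral density, summarized as normalized band energies."""
          psd = np.abs(np.fft.rfft(signal)) ** 2
          feats = np.array([chunk.sum() for chunk in np.array_split(psd, BANDS)])
          return feats / feats.sum()

      def synth(f0, rng):
          """Synthetic 'vehicle' sound: a dominant tone at f0 plus noise (illustrative only)."""
          t = np.arange(N) / FS
          return np.sin(2 * np.pi * f0 * t) + 0.3 * rng.standard_normal(N)

      rng = np.random.default_rng(1)
      classes = {"class A": 400.0, "class B": 1800.0}       # dominant frequencies per class
      centroids = {c: np.mean([band_features(synth(f, rng)) for _ in range(20)], axis=0)
                   for c, f in classes.items()}

      sample = band_features(synth(1700.0, rng))            # unknown signal
      label = min(centroids, key=lambda c: np.linalg.norm(sample - centroids[c]))
      print("classified as:", label)

Real deployments would replace the synthetic tones with recorded vehicle audio and the nearest-centroid rule with the paper's naïve classifier and distributed schemes.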

1st Workshop on Advances in Programming Languages

  • Post object-oriented paradigms in software development: A comparative analysis
    67 separation of concerns, crosscutting concerns, aspect-oriented programming, composition filters Adam Przybylek, pages 1009 – 1020. Show abstract The object-oriented (OO) paradigm has been popularised as a solution to the problems encountered with the structured paradigm. It facilitates the understandability, extensibility, reusability and maintainability of systems. However, years of experience and analytical studies have shown that this is only partially true, and that there are still issues which have never been successfully resolved in the OO paradigm. These issues arise whenever programmers need to deal with peripheral requirements that spread across a system. A simple modification of these requirements typically involves intensive changes in code. Over the last decade interesting and worthwhile work has been done on the subject of implementing peripheral requirements. Perhaps the most successful outcomes have been obtained by aspect-oriented programming and composition filters. The main goals of the paper are: (1) to present open problems for the OO paradigm; (2) to analyse two post-OO paradigms that confront these problems; (3) to indicate possible applications of these paradigms.
  • Slicing wxHaskell modules to derive the User Interface Abstract Model
    123 GUI (graphical user interfaces), slicing, dependency graphs Daniela da Cruz, Pedro Rangel Henriques, pages 1021 – 1024. Show abstract In this paper, we discuss an experience in slicing a Haskell program to separate its GUI, developed with the graphics toolkit wxHaskell. The slicing operation was implemented in Strafunski, a Haskell-centered software bundle for generic programming and language processing. To analyze the user interface and extract the abstract model, we first parse the source Haskell program and build its Abstract Syntax Tree (AST). Then a strategic traversal of the AST is used to slice the wxHaskell GUI code and build the graph of dependencies between widgets. In this way it becomes possible to understand the execution flow of the program.
  • Pattern-based Program Visualization
    122 program comprehension, program visualization, program animation, software maintenance Daniela da Cruz, Pedro Rangel Henriques, Maria João Varanda Pereira, pages 1025 – 1036. Show abstract The aim of this paper is to discuss how our pattern-based strategy for the visualization of data and control flow can effectively be used to animate a program and exhibit its behavior. That result allows us to propose its use for Program Comprehension. The animator uses well-known compiler techniques to inspect the source code in order to extract the information necessary to visualize it and understand program execution. We convert the source program into an internal decorated (or attributed) abstract syntax tree and then visualize the structure by traversing it and applying visualization rules at each node according to a predefined rule base. No changes are made in the source code, and the execution is simulated. Several examples of visualization are shown to illustrate the approach and support our idea of applying it in the context of a Program Comprehension environment. (A minimal illustrative sketch of rule-based AST visualization appears at the end of this list.)
  • Machine Code Can Be Representation of Source Code With Optimization
    131 compilation, decompilation, machine code patching Samir Ribić, Adnan Salihbegović, pages 1037 – 1039. Show abstract In recent times the authors of this paper have been doing research on the possibility of developing a programming language which would be neither a compiler nor an interpreter. The concept is based on holding the complete program in native machine code, while a specialized editor can decompile the machine code and display it in a high-level language. The displayed code can be re-edited and saved again as pure machine code. This paper investigates the possibility of optimizing the generated code while still retaining decompilability.
  • Executable form of the IEC 61131-3 ST language programs in the CPDev environment
    132 IEC 61131-3, ST language, engineering environment, controller programming Dariusz Rzońca, Jan Sadolewski, Bartosz Trybus, pages 1041 – 1053. Show abstract A prototype compiler for the ST (Structured Text) language is presented, together with its operation and internal structure. The compiler is the principal part of the CPDev engineering environment for programming industrial controllers according to the IEC 61131-3 standard. CPDev is under development at Rzeszów University of Technology. The compiler generates universal executable code as its final result. The code can be interpreted on different platforms by target-specific virtual machines. Sample platforms include AVR, ARM, MCS-51 and PC.
  • Adapting Service-finding for Component Flexibility in Object-Oriented Languages
    30 Modular Design Dominik Dahlem, William Harrison, pages 1055 – 1056. Show abstract Developers assembling components into larger bodies of software generally make component selection choices early, and use a great deal of latent information in making those choices. In contrast, service-oriented software uses dynamic service-finding facilities, often called "service brokers". The involvement of brokers to provide a service implies that no assumptions can be made about latent information. The broker may supply any service having the characteristics being sought. Using late-bound selection with more information explicitly manifest is vital to coarse-grained business-to-business service interaction. But it would also be beneficial to allow for more flexible and dynamic use of components for in-house software, especially when improvement over time and contextual adaptation are important. This work explores applying the "new" operator to either interfaces or classes to obtain the capabilities of a service broker. This allows conventional object-oriented software to derive the advantages that accrue from the more flexible component selection adopted for service-oriented software.
  • OORS: An Object-Oriented Rewrite System with Applications in Retargetable Code Generation and Optimization
    66 code generator generator, pattern matching Gernot Gebhard, Philipp Lucas, pages 1057 – 1069. Show abstract Retargeting a compiler's back end to a new architecture is a time-consuming process. This becomes an evident problem in the area of programmable graphics hardware (graphics processing units, GPUs) or embedded processors, where architectural changes happen faster than elsewhere. We propose the object-oriented rewrite system OORS to overcome this problem. Using the OORS language, a compiler developer can express the code generation and optimization phases in terms of cost-annotated rewrite rules supporting complex non-linear matching and replacement patterns. Retargetability is achieved by organizing rules into profiles, one for each supported target architecture. Featuring a rule and profile inheritance mechanism, OORS makes the reuse of existing specifications possible. This is an improvement over traditional approaches. Altogether OORS increases the maintainability of the compiler's back end and thus both decreases the complexity and reduces the effort of the retargeting process. To show the potential of this approach, we have implemented a code generation and a code optimization pattern matcher supporting different target architectures using the OORS language and introduced them in a GPU compiler.
  • Extensions of Deductive Concept in Logic Programming and Some Applications
    33 automated theorem proving, ordered linear resolution, Prolog, descriptive logic programming, LOGPRO, intelligent tutoring system Ivana Berkovic, Biljana Radulovic, Petar Hotomski, pages 1071 – 1080. Show abstract The paper presents ATP, a system for automated theorem proving based on ordered linear resolution with marked literals, its use as the base of a Prolog-like language, and some applications. This resolution system is put at the base of a Prolog-like language as a surrogate for the concept of negation as definite failure. This logically complete deductive base is used for building a descriptive logic programming language, LOGPRO, which eliminates the defects of the PROLOG system (extension beyond Horn clauses, avoiding the treatment of negation as definite failure) while keeping the main properties of the PROLOG language and the possibilities of its extension. Some features of the system are described when it is used as the base for timetabling and scheduling, for a technique for resolving the implication problem for generalized data dependencies, and for an intelligent tutoring system.
  • Adaptive Language Approach to Software Systems Evolution
    103 Adaptive systems evolution, programming languages, program transformation, aspect oriented languages, domain specific languages, reflection Jan Kollar, Jaroslav Poruban, Peter Vaclavik, Jana Bandakova, Michal Forgac, pages 1081 – 1091. Show abstract From the viewpoint of adaptability, we classify software systems as being nonreflexive, introspective and adaptive. Introducing a simple example of an LL(1) language for expressions, we present its nonreflexive and adaptive implementations using the Haskell functional language. The multiple-metalevel concept is an essential requirement for a systematic language approach to building up adaptable software systems dynamically, i.e. to evolving them. A feedback reflection loop from data to code through metalevel data is the basic implementation requirement and the proposition for semi-automatic evolution of software systems. In this sense, the practical experiment introduced in this paper is related to the base level of the language, but it illustrates the potential for extensions primarily in the horizontal but also in the vertical direction of an adaptive system.
  • Separate compilation of grammars with Tatoo
    118 parser generator, separate compilation, modular grammar Julien Cervelle, Rémi Forax, Gilles Roussel, pages 1093 – 1101. Show abstract This paper presents an extension of the Tatoo compiler compiler that supports separate compilation of formal grammars. It allows the developer to define reusable libraries of grammars, such as those of arithmetic expressions or of classical control operators. The aim of this feature is to simplify the development of domain-specific languages, especially for non-specialists in grammar writing.
  • Flexible Object Representation using Properties
    69 field, property, representation, object-oriented, memory, join, inheritance, reuse, boilerplate Koen Vanderkimpen, Marko van Dooren, Eric Steegmans, pages 1103 – 1112. Show abstract Many object-oriented programming languages use fields to represent object state, but subclasses cannot sufficiently alter the representation of an object. Because of that, a number of subclassing problems are difficult or impossible to deal with if the superclasses were not designed to anticipate them, especially under multiple inheritance. Moreover, in languages such as Java and C#, encapsulation necessitates a lot of boilerplate code. In this paper, we improve the C# concept of properties, making them more flexible for use with inheritance and reducing their boilerplate code. Using our properties makes it easier for programmers to model programs in a more consistent manner. Furthermore, our properties allow redefining an object's attributes in ways that equal the possibilities for redefinition of virtual methods in many programming languages, which makes them better suited to deal with unanticipated reuse. Specifically, using our construct, it becomes possible to join several superclass attributes into a single attribute at the subclass level, thereby decreasing memory consumption.
  • A Generator of SQL Schema Specifications
    77 CASE Tools, SQL Generator, Database Schema Design and Implementation, Constraint Specification, IIS*Case Slavica Aleksić, Ivan Luković, Miro Govedarica, Pavle Mogin, pages 1113 – 1122. Show abstract IIS*Case is an integrated CASE tool that supports the automation and intelligent support of complex and highly formalized design and programming tasks in the development of an information system. IIS*Case generates relational database schemas in 3rd normal form with all relevant data constraints. SQL Generator is an IIS*Case tool that generates the implementation specification of a database schema according to the ANSI SQL:2003 standard. The generator may also produce a database schema specification for Microsoft SQL Server or Oracle DBMSs. The paper describes SQL Generator's traits, considers aspects of its application, and shows its use in the implementation of a complex database constraint using the procedural mechanisms of a particular relational DBMS. SQL Generator is implemented in Java in the Oracle JDeveloper environment.
  • A QTI Metamodel
    95 Assessment, Metamodeling, Model Driven Architecture Sonja Radenkovic, Nenad Krdzavac, Vladan Devedzic, pages 1123 – 1132. Show abstract The Question & Test Interoperability (QTI) specification provides a good starting point for modeling and designing assessment systems. This paper presents a metamodel for developing a QTI-based assessment system using the Model Driven Architecture (MDA) standard. The main reason for applying the MDA standard in the development of assessment systems is to make a clear distinction between conceptual and concrete modeling in order to automate the transfer and sharing of information and knowledge. The first prerequisite for this is a QTI metamodel.
  • An L-Attributed Grammar for Adjoint Code
    57 adjoint code, L-attributed grammar Uwe Naumann, Jan Riehme, pages 1133 – 1146. Show abstract Gradients of high-dimensional functions can be computed efficiently and with machine accuracy by so-called adjoint codes. We present an L-attributed grammar for the single-pass generation of intraprocedural adjoint code for a simple imperative language (a subset of C). Our ideas can easily be applied to any programming language that is suitable for syntax-directed translation. Moreover, the conceptual insights are useful in the context of multi-pass generation of adjoint code. Our focus is on correctness. The necessary domain-specific code optimizations are beyond the scope of this paper. We give references to corresponding work in this area. (A minimal illustrative sketch of the adjoint idea appears at the end of this list.)
  • On applying stochastic programming in mathematical theory of programming
    135 problem solving, type theory, category theory, stochastic programming, probabilistic programming Viliam Slodicak, Valerie Novitzka, Anita Verbova, pages 1147 – 1150. Show abstract Stochastic programs are mathematical programs in which some of the data incorporated into the objective or constraints are uncertain. Probabilistic programming is one of the methods of stochastic programming; it is programming in which the probabilities of the values of the variables are of interest. Our idea is that the solution of a problem can be constructed by logical reasoning over some mathematical theories. In this approach we use category theory for the construction of the logical system. We discuss here the application of one method of stochastic programming in our approach.
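
The rule-based visualization strategy described in "Pattern-based Program Visualization" above (pages 1025 – 1036) can be hinted at with a small Python sketch. The rule base, the use of Python's own ast module, and the textual rendering are illustrative assumptions, not the authors' animator, which works on a decorated AST of the analysed language:

      import ast

      # A tiny rule base: node type -> how to render it (hypothetical rules, not the paper's).
      RULES = {
          ast.FunctionDef: lambda n: f"function '{n.name}'",
          ast.Assign:      lambda n: "assignment",
          ast.For:         lambda n: "for-loop",
          ast.Call:        lambda n: f"call to '{getattr(n.func, 'id', '?')}'",
      }

      def visualize(node, depth=0):
          """Traverse the AST and apply the matching visualization rule at each node."""
          rule = RULES.get(type(node))
          if rule:
              print("  " * depth + rule(node))
          for child in ast.iter_child_nodes(node):
              visualize(child, depth + 1)

      source = """
      def total(xs):
          s = 0
          for x in xs:
              s = s + x
          return s
      print(total([1, 2, 3]))
      """
      visualize(ast.parse(source))

The paper's animator additionally decorates the tree with attributes and simulates execution; the sketch only shows the traversal-plus-rule-base structure.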
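
The adjoint-code idea underlying "An L-Attributed Grammar for Adjoint Code" above (pages 1133 – 1146) can be illustrated with a hand-written reverse sweep for a tiny function. The paper generates such code automatically from an L-attributed grammar; the Python sketch below (with a finite-difference cross-check) only shows what a generated adjoint computes:

      import math

      def f_with_adjoint(x1, x2):
          """y = sin(x1) * x2 + x2**2, with the adjoint (reverse) sweep written by hand."""
          # forward sweep: record intermediates
          v1 = math.sin(x1)
          v2 = v1 * x2
          v3 = x2 * x2
          y = v2 + v3
          # reverse sweep: propagate adjoints, seeded with dy/dy = 1
          y_b = 1.0
          v2_b = y_b
          v3_b = y_b
          x2_b = v3_b * 2 * x2 + v2_b * v1
          v1_b = v2_b * x2
          x1_b = v1_b * math.cos(x1)
          return y, (x1_b, x2_b)

      y, grad = f_with_adjoint(0.7, 1.3)
      print("y =", y, "gradient =", grad)

      # cross-check against central finite differences
      h = 1e-6
      fd1 = (f_with_adjoint(0.7 + h, 1.3)[0] - f_with_adjoint(0.7 - h, 1.3)[0]) / (2 * h)
      fd2 = (f_with_adjoint(0.7, 1.3 + h)[0] - f_with_adjoint(0.7, 1.3 - h)[0]) / (2 * h)
      print("finite differences:", (fd1, fd2))

The adjoint gradient matches the analytic values cos(x1)*x2 and sin(x1) + 2*x2 to machine accuracy, whereas the finite-difference estimate is limited by truncation and rounding error, which is the efficiency and accuracy argument the paper builds on.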