ICSOFT-EA 2016 Abstracts


Area 1 - Enterprise Software Technologies

Full Papers
Paper Nr: 15
Title:

Architecture Viewpoint for Modeling Business Collaboration Concerns using Workflow Patterns

Authors:

Ayalew Kassahun and Bedir Tekinerdogan

Abstract: Businesses today rarely operate in isolation but must collaborate with others in a coordinated fashion. To address collaboration concerns, business analysts need to design business processes. Business process designs have a direct impact on the required software systems and the corresponding architectural design. Conversely, the architectural design imposes constraints on the business process designs. Unfortunately, business processes and software architectures are often designed separately, leading to a misalignment between the two. To bridge this gap, we propose the architecture collaboration viewpoint to be used by teams of business analysts and software architects when addressing business collaboration concerns. The collaboration viewpoint uses elements from business process and architecture viewpoints to provide new modeling artifacts for alignment. The design artifacts are mapping tables and workflow pattern diagrams that are used to identify misalignments and redesign the business processes. The viewpoint facilitates the communication between business analysts and architects. We illustrate the collaboration viewpoint for a food supply chain transparency system from a real industrial case study.

Paper Nr: 29
Title:

Business Process Aware Identification of Reusable Software Components

Authors:

Lerina Aversano, Marco Di Brino and Maria Tortorella

Abstract: Enterprises need to follow the rapid evolution of their business processes and promptly adapt the existing software systems. A preliminary requirement is that the software components are available, working and interoperable. A widely adopted solution is to move the existing software toward an evolvable architecture, such as a service-based one. The objective of this paper is to propose an approach for supporting the identification of reusable components in software systems by analyzing the business processes using them. The proposed solution is based on the idea that a Service Oriented Architecture can be obtained by reusing a wide range of existing pieces of code. Such code components can be extracted from the existing software systems by identifying those supporting the business activities. The paper therefore proposes an approach for identifying the software components supporting a business process activity and nominating them as candidates for implementing a service. To this end, the recovery of the links existing between the business process model and the supporting software systems is exploited. An impact analysis activity is also performed starting from the initial traced components.

Short Papers
Paper Nr: 58
Title:

Practical Multi-pattern Matching Approach for Fast and Scalable Log Abstraction

Authors:

Daniel Tovarňák

Abstract: Log abstraction, i.e. the separation of the static and dynamic parts of a log message, is becoming an indispensable task when processing logs generated by large enterprise systems and networks. In practice, log message types are described via regex matching patterns that are in turn used to facilitate the abstraction process. Although the area of multi-regex matching is well studied, there is a lack of suitable practical implementations available for common programming languages. In this paper we present an alternative approach to multi-pattern matching for the purposes of log abstraction that is based on a trie-like data structure we refer to as a regex trie (REtrie). REtrie is easy to implement, and real-world experiments show its scalability and good performance even for thousands of matching patterns.
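A prefix-filtered multi-pattern matcher in this spirit can be sketched in a few lines of Python. This is an illustrative approximation only: the pattern set, prefixes and function names are hypothetical, and the REtrie structure described in the paper is more general than this flat prefix list.

```python
import re

# Hypothetical patterns: each message type pairs a literal prefix with the
# full regex, so only a small subset of regexes is tried per log message.
PATTERNS = [
    ("Accepted password for ", re.compile(r"Accepted password for (\S+) from (\S+)")),
    ("Failed password for ", re.compile(r"Failed password for (\S+) from (\S+)")),
    ("Connection closed by ", re.compile(r"Connection closed by (\S+)")),
]

def abstract_log(message):
    """Return (static template, dynamic fields) for a log message,
    or None when no known message type matches."""
    for prefix, regex in PATTERNS:
        if message.startswith(prefix):  # cheap trie-like prefix filter
            match = regex.match(message)
            if match:
                return regex.pattern, match.groups()
    return None

template, fields = abstract_log("Failed password for root from 10.0.0.5")
```

A real regex trie would share common prefixes across patterns in a tree, so the filtering step scales to thousands of patterns rather than scanning a flat list.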

Paper Nr: 59
Title:

MILP-based Approach for Optimal Implementation of Reconfigurable Real-time Systems

Authors:

Wafa Lakhdhar, Rania Mzid, Mohamed Khalgui and Nicolas Treves

Abstract: This paper deals with the design and implementation of reconfigurable uniprocessor real-time embedded systems. A reconfiguration is a run-time operation allowing the addition-removal of real-time tasks or the update of their parameters. The system is then implemented by different sets of tasks such that only one is executed at a particular time after a corresponding reconfiguration scenario according to user requirements. The problem is to optimize the system code while meeting all related real-time constraints and avoiding any redundancy between the implementation sets. Based on Mixed Integer Linear Programming (MILP), we propose a multi-objective optimization technique allowing the minimization of the number of tasks and their response times. An optimal reconfigurable POSIX-based code of the system is manually generated as an output of this technique. We apply the paper’s contribution to a case study for performance evaluation.

Posters
Paper Nr: 8
Title:

Consent Management Architecture for Secure Data Transactions

Authors:

Jarkko Hyysalo, Harri Hirvonsalo, Jaakko Sauvola and Samuli Tuoriniemi

Abstract: Digitalization of data intensive services presents several challenges, such as how to safely manage and use the multitude of personal data across various public, private and commercial service providers. Guaranteed privacy is especially critical in sensitive cases like health data management and processing. A key challenge and enabler for efficient data utilization is the need for an adequate consent management framework that meets the General Data Protection Regulation (GDPR). To facilitate sensitive secure data transactions where end-control always resides with the individual, a consent management architecture (CMA) is defined, utilizing the new MyData approach. The proposed CMA enables context-driven authorization of multi-sourced data for safe access by various health services. CMA proof-of-concept and experiences are described and discussed to concretize and evaluate the suggested architecture. Consent management and authorization topics are discussed as a service function of the MyData Operator. The technical APIs required for registering and authorizing data sources and data services via the Operator are demonstrated and analyzed to expedite development of this important area within the research and industrial communities.

Paper Nr: 11
Title:

Business Entity Warehouse: A New Design Method for Decision Support Systems from Business Entities

Authors:

Mounira Ben Abdallah, Imen Jellali, Nahla Haddar and Hanêne Ben-Abdallah

Abstract: Current business intelligence applications and most research on enterprise performance analysis focus on one part of the business in isolation: the data produced from either the information system or the business process. On the one hand, such a single perspective on the correlated data may produce incomplete or biased results. On the other hand, the integration of both data categories faces several challenges inherent to the differences in their semantics, structures and separate storage. To overcome these challenges, we herein propose the concept of a business entity warehouse, which builds a decision support system based on business entities. The business entity concept was introduced in the information system domain to bring together business operations and business data in a natural way. The business entity warehouse we introduce offers an integrated view of the four business perspectives of the enterprise (functional, behavioural, informational and organizational), and it provides for the analysis of the influence of the business process on the transactional data and vice versa. This paper presents a method to construct business entity warehouses from business entities extracted from information system and business process models.

Paper Nr: 18
Title:

A Knowledge Base Guided Approach for Process Modeling in Complex Business Domain

Authors:

Roberto Paiano and Adriana Caione

Abstract: Business process analysis requires an in-depth knowledge of factors such as the activities carried out, the actors involved, the domain or business context in which the activities are performed, the internal company structure, and the current regulatory framework. This involves the employment and collaboration of different professionals, such as business experts, domain experts and legal experts, along with a considerable effort in terms of time and resources. For the purpose of efficient and effective management of business processes, it is also important to ensure compliance with the company context and flexibility with regard to changes that may occur within the company or at the legislative level. This paper presents a methodological and architectural approach guided by a knowledge base that describes the application domain. The knowledge base is populated iteratively with information extracted from the analysis of documents, regulations and requirements. It is then used by the process designer as a guide for business process modelling and management.

Paper Nr: 22
Title:

Exploiting Web Technologies to Connect Business Process Management and Engineering

Authors:

Dario Campagna, Stefano Costanzo, Carlos Kavka and Alessandro Turco

Abstract: The Business Process Model and Notation (BPMN) standard can be used for representing low-level simulation and automation workflows for scientific, engineering and manufacturing processes. This paper focuses on removing the main obstacles that limit a more widespread adoption of the standard and the related technology: collaboration and data management. Web technologies can provide the necessary complementary features to the BPMN editing and execution activities: real-time collaboration, accessibility, and information and expertise sharing. The proposed prototype mimics a SaaS (Software-as-a-Service) platform offering public community support and a private working area which can be shared in real time with other users. The prototype includes an execution engine whose implementation has been tailored to support the data structures required by scientific and engineering applications. The ideas presented in this paper are supported by three use cases: a Multi Disciplinary Optimization case (a typical engineering-domain problem involving the design of complex items), a collaborative decision-making scenario (the negotiation process for generating a lecture timetable at a university) and a Lego-like decomposition of an optimization algorithm (whose constituent elements can be easily re-assembled and shared with our platform).

Paper Nr: 34
Title:

A Novel Approach for Real-time Extracting Data From NoSQL Embedded Data Bases

Authors:

Afef Gueidi, Hamza Gharsellaoui and Samir Ben Ahmed

Abstract: In today’s industry, embedded databases that mix multimedia signals and big data behind modal interfaces need to be highly reconfigurable in order to meet real-time constraints, address data store optimization problems, and satisfy requirements for high scalability and availability. This paper deals with the multi-objective problem of extracting, managing and interrogating data in reconfigurable real-time embedded databases. New methods are therefore required, and a NoSQL database is created to solve the mentioned sub-problems: to store big data effectively and to meet the demand for high performance when extracting, managing and interrogating these embedded databases. We also discuss the advantages and disadvantages of a NoSQL approach.

Paper Nr: 47
Title:

Cloud Data Warehousing for SMEs

Authors:

Sérgio Fernandes and Jorge Bernardino

Abstract: The emergence of cloud computing has caused a revolution in Information Technology. With cloud computing solutions it is possible to access powerful hardware and software features in less time and with considerably lower costs by using the "pay-as-you-go" model. At the same time, this shift has increased the amount of information, and data warehouses must respond to this new reality. Small and medium enterprises (SMEs) were deprived of owning a traditional data warehouse due to the costs involved, but the cloud has made it possible to overcome this barrier. This paper provides an overview of Data Warehouses (DW) in the cloud and presents the main characteristics of the following solutions: Amazon Redshift, IBM dashDB, Snowflake, Teradata Active Data Warehouse Private Cloud, Treasure Data, and Microsoft Azure.

Paper Nr: 62
Title:

Rule-based System for Quality of Life Evaluation in Socio-Cultural Field

Authors:

Martin Šanda and Jiří Křupka

Abstract: This article deals with quality of life in the European Union. Its objective is to analyse the possibilities of evaluating quality of life at the European level based on selected indicators. An expert system and fuzzy sets are used for the evaluation. The user provides values for a total of thirteen indicators in the socio-cultural field, which are divided into three areas. The indicators are selected from several methodologies for evaluating quality of life and are grouped into areas with similar principles and characteristics. The selected methodologies are the Active Aging Index, Eurofound, the Economist Intelligence Unit and the Better Life Index. The expert system determines a rating for each area and a total quality-of-life rating for the selected country. The conclusions of this paper discuss further options for adjustment and expansion.

Paper Nr: 65
Title:

Distributed Intelligent Systems for Network Security Control

Authors:

Mohamed Shili, Hamza Gharsellaoui and Dalel Kanzari

Abstract: The great number of heterogeneous interconnected operating systems gives greater access to intruders and makes it easier for malicious users to break a system’s security policy. Moreover, a single security control agent is insufficient to monitor multiple interconnected hosts and to protect distributed operating systems from hostile use. This paper shows the ability of distributed security controller agents to correlate data streams from heterogeneous hosts and to trace abnormal behavior in order to protect network security. An experimental study is conducted to evaluate our proposed approach.

Area 2 - Software Project Management

Short Papers
Paper Nr: 37
Title:

A Preliminary Mapping Study of Software Metrics Thresholds

Authors:

Elisabetta Ronchieri and Marco Canaparo

Abstract: Many papers deal with the topic of thresholds in software metrics to determine the quality level of a software project. This paper aims to identify the status of influential software metrics thresholds papers. We used the search facilities of the SCOPUS Web tool to identify cited papers published from 1970 to 2015, and classified the selected papers according to different factors, such as the main topic and the general type. The cited papers appeared more frequently in journals than in conference proceedings. We observed three main problems: an unclear explanation of the method for selecting the technique that calculates thresholds; a direct application of metric threshold values to different code contexts; and a lack of objective analysis of the calculated thresholds. To our knowledge, this paper is the only one that performs this kind of study. It can provide baselines to assist new research and development efforts. Due to the page limit, this paper contains a summary of the results.

Posters
Paper Nr: 4
Title:

Software Project Management Fallacies

Authors:

Ana M. Moreno and Lawrence Peters

Abstract: Software project management plays a critical role in software projects. Therefore, software project management actions have an important impact on software projects and organizations. However, software engineers often become software project managers with little or no training in project management. As a result, they sometimes have to rely on hearsay or their own assumptions to formulate strategies and a plan of action for managing software projects. This has led to several software project management misconceptions or fallacies that can have important negative effects on software projects. This paper examines some relevant fallacies based on the authors’ experience, and discusses published material which refutes them. This work contributes to the practice of software project management by identifying and correcting practices which can reduce the success rate of software projects.

Paper Nr: 51
Title:

Researching Human and Organizational Factors Impact for Decisions on Software Quality

Authors:

Luis Fernández Sanz, Josefa Gómez Pérez, Teresa I. Díez-Folledo and Sanjay Misra

Abstract: Quality is an essential factor for European competitiveness, as low-price strategies based on low labour costs can be difficult to implement. Although software quality assurance has a long tradition, there is a lack of research on some practical aspects. In particular, the extended study of the influence of human and organizational factors (HOF) on the quality of software development, maintenance and management has been neglected. However, different studies have identified these as key factors in software projects, measuring their impact quantitatively and qualitatively in terms of cost, quality and results. As part of the Iceberg project, funded under the Marie Curie IAPP EU-funded program, some relevant evidence of the influence of HOF on software quality has been reviewed and analysed to discuss the challenges in this area, confirming the need for deeper and wider research efforts.

Area 3 - Distributed and Mobile Software Systems

Short Papers
Paper Nr: 5
Title:

A Lightweight and High Performance Remote Procedure Call Framework for Cross Platform Communication

Authors:

Hakan Bagci and Ahmet Kara

Abstract: Remote procedure calls (RPC) have been widely used for building distributed systems for about 40 years. There are several RPC implementations addressing different purposes. Some RPC mechanisms are general-purpose systems and provide a number of calling patterns for functionality, hence they do not emphasize performance. On the other hand, a small number of RPC mechanisms are implemented with performance as the main concern. Such RPC implementations concentrate on reducing the size of the transferred RPC messages. In this paper, we propose a novel lightweight and high performance RPC mechanism (HPRPC) that uses our own high performance data serializer. We compare our RPC system’s performance with a well-known RPC implementation, gRPC, that addresses both providing various calling patterns and reducing the size of the RPC messages. The experiment results clearly indicate that HPRPC performs better than gRPC in terms of communication overhead. Therefore, we propose our RPC mechanism as a suitable candidate for high performance and real-time systems.
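The size-conscious serialization that such performance-oriented RPC mechanisms rely on can be illustrated with a minimal fixed-header frame codec. The frame layout below (u16 method id, u32 request id, u16 payload length) is an assumption for illustration only, not HPRPC's actual wire format.

```python
import struct

def encode_call(method_id: int, request_id: int, payload: bytes) -> bytes:
    """Pack an RPC call into a compact binary frame: a fixed 8-byte
    header (method id, request id, payload length) followed by the payload."""
    return struct.pack("!HIH", method_id, request_id, len(payload)) + payload

def decode_call(frame: bytes):
    """Unpack a frame produced by encode_call."""
    method_id, request_id, length = struct.unpack("!HIH", frame[:8])
    return method_id, request_id, frame[8:8 + length]

frame = encode_call(7, 42, b"\x01\x02\x03")  # 11 bytes total on the wire
```

A fixed binary header of this kind keeps per-message overhead constant, whereas self-describing formats spend extra bytes naming each field.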

Paper Nr: 41
Title:

New Methodology for Feasible Reconfigurable Real-Time Network-on-Chip NoC

Authors:

Imen Khemaissia, Olfa Mosbahi, Mohamed Khalgui and Zhiwu Li

Abstract: This paper addresses flexible reconfigurable real-time Network-on-Chip (NoC) in Multiprocessor System-on-Chip (MPSoC) architectures. A NoC is composed of several nodes, each consisting of a processor and a router. The reconfiguration of a processor is any operation that permits the addition, removal or update of periodic dependent OS (Operating System) tasks that share resources. For two added dependent tasks assigned to different processors, a message is added automatically on the NoC. After a reconfiguration scenario, several real-time constraints may no longer be satisfied, since a task can miss its deadline and a message can take a long time to arrive at its destination. In order to restore system feasibility, we propose a new approach called CRM (abbrev. Cynapsys Reconfigurable MPSoC). A multi-agent architecture based on a master/slave model is defined, where a slave agent is assigned to each node to control its local feasibility after any reconfiguration scenario, and a master agent proposes software or hardware solutions for the whole architecture if any perturbation occurs at run-time. A tool developed at the LISI laboratory and Cynapsys is applied to a real case study in order to evaluate the paper’s contribution.

Paper Nr: 61
Title:

Architecting the Recommendation Layer of a Platform-as-a-Service e-Marketplace

Authors:

Fatemeh AhmadiZeleti, Islam Hassan, Sonya Abbas, Adegboyega Ojo and Lukasz Porwol

Abstract: This paper addresses the problem of how to architect aspects of an Electronic Marketplace (e-Marketplace) so that engineers at software SMEs (Small and Medium Scale Enterprises) can easily discover the most appropriate Platform-as-a-Service (PaaS) offerings available in a marketplace. While there are existing architectural models for e-Marketplaces, these models largely ignore the semantic aspects of the descriptions of offerings in the marketplace. In addition, they provide little support for recommendations and decision making for consumers in the marketplace. These shortcomings make the reuse of existing e-Marketplace architectures inadequate for some categories of services, such as PaaS services, which are characterised by relatively complex technical specifications. We address this problem by integrating a Semantic Recommendation Layer into a PaaS e-Marketplace architecture. Requirements for this layer were obtained from a series of interviews with software SME engineers and PaaS providers within the context of a three-year EU project. We describe the major components of the layer and the underpinning recommendation and decision model. Results from this work should contribute to a domain-specific architecture for e-Marketplaces.

Posters
Paper Nr: 53
Title:

Implementation of a Low Cost IaaS using Openstack

Authors:

Tiago Rosado and Jorge Bernardino

Abstract: Cloud computing has emerged as an important paradigm enabling software, infrastructure, and information to be used as services over the network in an on-demand manner. Cloud computing infrastructures can provide adaptive resource provisioning with very little initial investment while scaling to a large number of commodity computing nodes. In this paper, we present Openstack, a modular and highly adaptive open source architecture for both public and private cloud solutions. We show an experimental implementation of an IaaS built on entry-level hardware, demonstrating the hardware and, most importantly, the basic software components needed to lay the foundation for a solid entry point to set up a low-cost Openstack cloud infrastructure.

Area 4 - Software Engineering Methods and Techniques

Full Papers
Paper Nr: 12
Title:

On Mutating UPPAAL Timed Automata to Assess Robustness of Web Services

Authors:

Faezeh Siavashi, Dragos Truscan and Jüri Vain

Abstract: We present a model-based mutation technique for testing the robustness of Web service compositions. The specification of a Web service composition is modeled with UPPAAL Timed Automata, and the conformance between the model and the implementation is validated by online model-based testing with the UPPAAL TRON tool. By applying a set of well-defined mutation operators, we generate model mutations. We validate all generated mutants and exclude the invalid ones. The remaining mutants are used for online robustness testing, providing invalid test inputs and revealing vulnerabilities of the implementation under test. We evaluated our method on a Booking System web service composition. The results show that from a total of 1346 generated mutants, 393 are found suitable for online model-based testing. After running the tests, 40 of the mutants revealed 3 new errors in the implementation. The experiment shows that our approach of mutating specifications is effective in detecting errors that were not revealed by conventional conformance testing methods.

Paper Nr: 17
Title:

Secure Data Storage Architecture on Cloud Environments

Authors:

Tran Thi Xuan Trang and Katsuhisa Maruyama

Abstract: Securing sensitive customer data outsourced to external servers in cloud computing environments is challenging. To maintain data confidentiality on untrusted servers, conventional data security techniques usually employ cryptographic approaches. However, most enterprises are unwilling to employ these approaches if they require high-performance client devices to encrypt the entire data set. In this situation, separating out the confidential data is beneficial, since only the confidential data are encrypted or stored on trusted servers. Although this idea has already been proposed, its support is still insufficient. This paper proposes a secure data storage model for cloud computing environments that is based on the concept of data slicing, and presents a prototype tool that supports the low-cost migration of existing applications. Our tool provides a structured query language (SQL) translation mechanism that provides transparent access to partitioned data without changing the original SQL queries. A simple case study shows how the proposed architecture implements secure data storage in cloud computing environments.
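The data-slicing idea can be sketched as a record-splitting step: confidential columns are separated out, and both slices keep the join key so the row can be transparently reassembled at query time. The column names and join key below are hypothetical, not taken from the paper's tool.

```python
# Columns assumed confidential for this illustration.
CONFIDENTIAL = {"ssn", "credit_card"}

def slice_row(row: dict, key: str = "id"):
    """Split a record into a public slice (untrusted storage) and a
    confidential slice (trusted or encrypted storage), both carrying
    the join key so the original row can be rebuilt."""
    public = {c: v for c, v in row.items() if c not in CONFIDENTIAL}
    private = {c: v for c, v in row.items() if c in CONFIDENTIAL or c == key}
    return public, private

def join_slices(public: dict, private: dict) -> dict:
    """Transparent reassembly: merge both slices back into one record."""
    return {**public, **private}
```

An SQL translation layer in this spirit would rewrite a query over the logical table into queries over the two physical slices joined on the key, so application queries stay unchanged.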

Paper Nr: 25
Title:

A New Approach to Feature-based Test Suite Reduction in Software Product Line Testing

Authors:

Arnaud Gotlieb, Mats Carlsson, Dusica Marijan and Alexandre Pétillon

Abstract: In many cases, Software Product Line Testing (SPLT) targets only the selection of test cases which cover product features or feature interactions. However, higher testing efficiency can be achieved through the selection of test cases with improved fault-revealing capabilities. By associating with each test case a priority value representing (or aggregating) distinct criteria, such as importance (in terms of faults discovered in previous test campaigns), duration or cost, it becomes possible to select a feature-covering test suite with improved capabilities. A crucial objective in SPLT then becomes to identify a test suite that optimizes a specific goal (lower test duration or cost) while preserving full feature coverage. In this paper, we revisit this problem with a new approach based on constraint optimization with a special constraint called GLOBAL CARDINALITY and a sophisticated search heuristic. This constraint enforces the coverage of all features through the computation of max flows in a network flow representing the coverage relation. The computed max flows represent possible solutions, which are further processed in order to determine the solution that optimizes the given objective function, e.g., the lowest test execution cost. Our approach was implemented in a tool called Flower/C and experimentally evaluated on both randomly generated instances and industrial case instances. Comparing Flower/C with MINTS (Minimizer for Test Suites), a state-of-the-art tool based on an integer linear formulation for performing similar test suite optimization, we show that our approach either outperforms MINTS or has comparable performance on random instances. On industrial instances, we compared three distinct models of Flower/C (using distinct global constraints), and the one mixing distinct constraints showed excellent performance with high reduction rates. These results open the door to industrial adoption of the proposed technology.
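For intuition about the feature-covering objective, a simple greedy baseline (not the paper's constraint-based Flower/C approach) picks the cheapest test per newly covered feature until every feature is covered; constraint optimization improves on such heuristics by searching for a globally optimal selection. Test names, feature sets and costs below are illustrative.

```python
def reduce_suite(tests, costs):
    """Greedy baseline for feature-covering suite reduction.
    `tests` maps test name -> set of covered features; `costs` maps
    test name -> execution cost. Returns the tests chosen, in order."""
    uncovered = set().union(*tests.values())
    chosen = []
    while uncovered:
        # pick the test with the lowest cost per newly covered feature
        best = min(
            (t for t in tests if tests[t] & uncovered),
            key=lambda t: costs[t] / len(tests[t] & uncovered),
        )
        chosen.append(best)
        uncovered -= tests[best]
    return chosen
```

Greedy selection can miss the global optimum on adversarial instances, which is exactly the gap that formulations like GLOBAL CARDINALITY constraints or integer linear programs (as in MINTS) are designed to close.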

Paper Nr: 40
Title:

Testing Distributed and Heterogeneous Systems: State of the Practice

Authors:

Bruno Lima and João Pascoal Faria

Abstract: In a growing number of domains, such as health care and transportation, several independent systems, forming a heterogeneous and distributed system of systems, are involved in the provisioning of end-to-end services to users. Testing such systems, running over interconnected mobile and cloud-based platforms, is particularly important and challenging, with little support being provided by current tools. In order to assess the current state of the practice regarding the testing of distributed and heterogeneous systems (DHS) and identify opportunities and priorities for research and innovation initiatives, we conducted an exploratory survey that was answered by 147 software testing professionals attending industry-oriented software testing conferences, and present the main results in this paper. The survey allowed us to assess the relevance of DHS in software testing practice, the most important features to be tested in DHS, the current status of test automation and tool sourcing for testing DHS, and the most desired features in test automation solutions for DHS. We expect that the results presented in the paper are of interest to researchers, tool vendors and service providers in this field.

Paper Nr: 57
Title:

Model-based Recovery Connectors for Self-adaptation and Self-healing

Authors:

Emad Albassam, Hassan Gomaa and Daniel Menascé

Abstract: Self-healing and self-configuration are highly desirable properties in software systems so that components can dynamically adapt to changing environments and recover from failure with minimal human intervention. This paper discusses a model-based approach for self-healing and self-configuration using recovery connectors. A recovery connector extends connectors in component-based software architectures and service-oriented architectures with self-healing and self-configuration capabilities so that a component or service can be dynamically adapted and recovered from failures. The design of the recovery connector is based on the MAPE-K loop model and can handle both recovery and adaptation.

Paper Nr: 64
Title:

User Control of Force-directed Layouts

Authors:

Wendy Lucas and Taylor Gordon

Abstract: Force-directed layouts are typically used for minimizing overlaps in node-link graphs. This can make it easier to interpret and derive meaning from the resulting visualization. Once such a layout is put in motion, however, the person interacting with it has little control over the “final” layout. This paper describes an approach that puts even inexperienced users in charge of force-directed layouts that are not limited to network diagrams. The visual interface to a powerful but relatively easy-to-use visualization grammar has been augmented with sliders for controlling the strength of constraints applied to visual objects. Users can change the balance of power between constraints while the visualization is running, specify different constraints for groupings of visual objects, turn off all or some of the constraints affecting the layout, or return a layout to its pre-constraint-solving specification. This approach is a step towards addressing the need for tools with which all users can control and interact with force-directed layouts.
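The balance-of-power idea behind such constraint sliders can be sketched as a blended force integration step. The force definitions, the single slider weight and the 1-D setting below are illustrative simplifications, not the paper's visualization grammar.

```python
def step(positions, edges, anchor, slider=0.5, dt=0.1):
    """One integration step of a toy 1-D force-directed layout.
    Spring forces pull connected nodes together; a constraint force pulls
    each node toward a per-node anchor; `slider` in [0, 1] is the
    user-controlled balance between the two."""
    new = list(positions)
    for i, x in enumerate(positions):
        spring = sum(positions[j] - x for j in edges.get(i, []))  # edge attraction
        constraint = anchor[i] - x                                # pull to anchor
        new[i] = x + dt * ((1 - slider) * spring + slider * constraint)
    return new
```

Moving `slider` to 0 lets the springs dominate (a pure force-directed result); moving it to 1 snaps nodes toward their constrained positions, which is the kind of live rebalancing the paper's sliders expose.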

Short Papers
Paper Nr: 9
Title:

Enterprise Experience into the Integration of Human-Centered Design and Kanban

Authors:

Eva-Maria Schön, Dominique Winter, Jan Uhlenbrok, Maria J. Escalona and Jörg Thomaschewski

Abstract: The integration of Human-Centered Design (HCD) and Agile Software Development (ASD) promises the development of competitive products comprising a good User Experience (UX). This study investigated the integration of HCD and Kanban with the aim of gaining industrial experience in a real-world context. A case study showed that requirements flow into the development process in a structured manner when a design board is added. As a result, transparency concerning recurring requirements increased. We contribute to the body of knowledge of software development by providing practical insights into Human-Centered Agile Development (HCAD). On the one hand, it is shown that the integration of HCD and Kanban leads to a product with a good UX and makes the development process more human-centered. On the other hand, we conclude that cross-functional collaboration speeds up product development.

Paper Nr: 14
Title:

A Verification Method of Time-response Requirements

Authors:

Yuuma Matsumoto and Atsushi Ohnishi

Abstract: In order to verify the correctness of functional requirements, we have been developing a verification method for functional requirements specifications using the Requirements Frame model. In this paper, we propose a verification method for non-functional requirements specifications, in particular time-response requirements written in natural language. We establish the verification method by extending the Requirements Frame model. We have also developed a prototype system based on the method using Java. The extended Requirements Frame model and the verification method are illustrated with examples.

Paper Nr: 16
Title:

Object-relational Mapping Revised - A Guideline Review and Consolidation

Authors:

Martin Lorenz, Günter Hesse and Jan-Peer Rudolph

Abstract: Object-relational mapping (ORM) is a mechanism to link classes of an object-oriented (OO) programming language to tables of a relational database management system (RDBMS). When designing a mapping for an application’s domain model, different strategies exist to map associations and inheritance relationships to database tables. Each strategy has a different impact on the application’s quality characteristics. Developers need to understand the impact of a mapping strategy to make informed decisions. In the absence of cost models to quantify the impact, guidelines and best practices have been developed to allow differentiated considerations of strategies. However, looking closer at these guidelines, two major flaws become apparent - incompleteness and inconsistency. In this paper, a comprehensive literature study is presented, which includes an analysis of guidelines and best practices from industry and academia. We propose a consolidation approach, which identifies relevant aspects of mapping strategies that impact a system’s quality characteristics. The approach derives a multi-level organization, which describes the relation between mapping strategy aspects and quality characteristics of a system. The identified mapping aspects and the organization can serve as a framework to improve existing guidelines and to resolve inconsistencies.
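The inheritance-mapping strategies discussed above can be made concrete. The following sketch (illustrative table and column names, not taken from the paper) contrasts two common strategies, single-table and joined table-per-class, using SQLite:

```python
import sqlite3

# Map a small hierarchy (Person <- Employee) two ways.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Strategy 1: one table for the whole hierarchy; a discriminator
# column tells the types apart, subclass columns are nullable.
cur.execute("""CREATE TABLE person_single (
    id INTEGER PRIMARY KEY,
    kind TEXT NOT NULL,    -- discriminator: 'person' | 'employee'
    name TEXT NOT NULL,
    salary REAL            -- NULL for plain persons
)""")

# Strategy 2: one table per class, joined via the shared primary key.
cur.execute("""CREATE TABLE person_joined (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL
)""")
cur.execute("""CREATE TABLE employee_joined (
    id INTEGER PRIMARY KEY REFERENCES person_joined(id),
    salary REAL NOT NULL
)""")

cur.execute("INSERT INTO person_single VALUES (1, 'employee', 'Ada', 5000.0)")
cur.execute("INSERT INTO person_joined VALUES (1, 'Ada')")
cur.execute("INSERT INTO employee_joined VALUES (1, 5000.0)")

# Reading an Employee: strategy 1 needs no join but wastes columns;
# strategy 2 keeps tables normalized but pays for the join.
single = cur.execute(
    "SELECT name, salary FROM person_single WHERE kind = 'employee'"
).fetchone()
joined = cur.execute(
    """SELECT p.name, e.salary FROM person_joined p
       JOIN employee_joined e ON e.id = p.id"""
).fetchone()
```

The trade-off visible here (read-path joins versus nullable columns) is one example of the quality-characteristic impacts that the surveyed guidelines attempt to capture.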

Paper Nr: 19
Title:

C-TRAIL: A Program Comprehension Approach for Leveraging Learning Models in Automated Code Trail Generation

Authors:

Roy Oberhauser

Abstract: With society's increasing utilization of (embedded) software, the amount of program source code is proliferating while the skilled human resources to maintain and evolve this code remain limited. Therefore, software tools are needed that can support and enhance program code comprehension. This paper focuses on program concept location and cognitive learning models, and contributes an automatic code trail generator approach called the Code Trail Recommender Agent Incorporating Learning models (C-TRAIL). Initial empirical results from applying the prototype to obfuscated code show promise for improving program comprehension efficiency and effectiveness.

Paper Nr: 23
Title:

Bug Report Quality Evaluation Considering the Effect of Submitter Reputation

Authors:

Lerina Aversano and Ermanno Tedeschi

Abstract: The quality of a bug report is a crucial aspect that influences the entire software life cycle. In many software projects, a relevant lack of information can be observed in submitted bug reports. Consequently, the resolution time of a software problem is strongly influenced by the quality of the report. In this paper, we investigate the quality of bug reports from the perspective of developers. We examined several metrics impacting the quality of bug reports, such as the length of descriptions, presence of stack traces, presence of attachments, completeness, and readability. In addition, different definitions of submitter reputation are compared and used. A quality model is then built for evaluating the quality of bug reports, and a software tool has been implemented to support the application of the proposed model. The validation has been conducted on real bug reports from open source software.
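As an illustration of the kind of metrics examined above, the following toy sketch computes a few report features and combines them into a score. The thresholds, weights, and the Java-style stack-trace pattern are illustrative assumptions, not the paper's actual quality model:

```python
import re

def bug_report_features(report: str) -> dict:
    """Extract a few quality indicators: description length,
    stack-trace presence, and a crude readability proxy."""
    words = report.split()
    sentences = [s for s in re.split(r"[.!?]+", report) if s.strip()]
    # Heuristic: a line like "    at pkg.Class.method(" suggests a
    # Java stack trace (pattern is an illustrative assumption).
    has_stack_trace = bool(
        re.search(r"^\s+at \w[\w.$]*\(", report, re.MULTILINE))
    avg_sentence_len = len(words) / max(len(sentences), 1)
    return {
        "length": len(words),
        "has_stack_trace": has_stack_trace,
        "avg_sentence_len": avg_sentence_len,
    }

def quality_score(features: dict) -> float:
    """Toy linear model: reward stack traces and adequate length,
    penalize very long sentences (a poor-readability proxy)."""
    score = 0.0
    score += 1.0 if features["has_stack_trace"] else 0.0
    score += 1.0 if features["length"] >= 20 else 0.0
    score += 1.0 if features["avg_sentence_len"] <= 25 else 0.0
    return score / 3.0
```

A real model, as the abstract notes, would also weigh attachments, completeness, and the submitter's reputation.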

Paper Nr: 24
Title:

From UML/MARTE Models of Multiprocessor Real-time Embedded Systems to Early Schedulability Analysis based on SimSo Tool

Authors:

Amina Magdich, Yessine Hadj Kacem, Adel Mahfoudhi and Mohamed Abid

Abstract: The increasing complexity of Real-Time Embedded Systems (RTES) should be met with equally sophisticated design methods. The recent Unified Modeling Language (UML) profile for Modeling and Analysis of Real-Time Embedded systems (MARTE) is well adapted to systems modeling. However, given the variety of schedulability analysis tools, bridging the gap between design models and the meta-models of the documented scheduling analysis tools becomes an important issue. In this paper, we discuss a Model To Text (M2T) transformation enabling the derivation of schedulability analysis models from UML/MARTE models. The generated schedulability analysis model serves as input for an analysis tool. As a proof of concept, we present the implemented code and experimental results.

Paper Nr: 30
Title:

Java Swing Modernization Approach - Complete Abstract Representation based on Static and Dynamic Analysis

Authors:

Zineb Gotti and Samir Mbarki

Abstract: GUIs are essential components of today's software. However, legacy applications do not benefit from the advantages of new user interface technologies that enhance the interaction and the quality of the system. Building a new system from an existing one is increasingly requested yet a very complex process. We therefore opted for an ADM approach based on the development of separate models capturing various aspects such as tasks, presentation, and the structure of system dialogue and behavior. For this purpose, the software artifacts should be analyzed and corresponding behavioral and structural models must be created. Two forms of this analysis were developed: a static analysis that retrieves information from the application source code, and a dynamic analysis that extracts information about application behavior at run time. This paper presents the automation of the extraction process, which permits understanding and analyzing the behavior of the legacy system, and compares the generated models to derive the best solution for an abstract representation of the existing GUI models.

Paper Nr: 39
Title:

Practical Improvements to the Minimizing Delta Debugging Algorithm

Authors:

Renáta Hodován and Ákos Kiss

Abstract: The first step in dealing with an input that triggers a fault in a piece of software is simplifying it as much and as automatically as possible, since a minimal test case ensures the efficient use of human developer resources. The well-known minimizing Delta Debugging algorithm is widely used for automated test case simplification, but even its last published revision is more than a decade old. Thus, in this paper, we investigate how it performs nowadays, especially (but not exclusively) focusing on its parallelization potential. We present new improvement ideas, give algorithm variants formally and in pseudo-code, and evaluate them in a large-scale experiment of more than 1000 test minimization sessions featuring real test cases. Our results show that with the help of the proposed improvements, Delta Debugging needs only one-fourth to one-fifth of the original execution time to reach 1-minimal results.
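For reference, the minimizing Delta Debugging algorithm (ddmin) that the paper improves can be sketched in a simplified, complement-only form. The full algorithm of Zeller and Hildebrandt also tests the subsets themselves; here `test(inp)` returns True when the input still triggers the fault:

```python
def ddmin(test, inp):
    """Shrink inp while test(inp) still reports the fault.
    Simplified ddmin: only complements of chunks are tried."""
    n = 2  # current granularity: number of chunks
    while len(inp) >= 2:
        chunk_len = len(inp) // n
        reduced = False
        # try removing each chunk, i.e. keep its complement
        for i in range(n):
            complement = inp[:i * chunk_len] + inp[(i + 1) * chunk_len:]
            if test(complement):
                inp = complement
                n = max(n - 1, 2)   # keep granularity roughly stable
                reduced = True
                break
        if not reduced:
            if n >= len(inp):       # finest granularity reached
                break
            n = min(n * 2, len(inp))  # refine: smaller chunks
    return inp
```

The sequential chunk loop above is exactly the part whose parallelization potential the paper investigates: the complements at one granularity can be tested concurrently.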

Paper Nr: 45
Title:

An Empirical Evaluation of AXIOM as an Approach to Cross-platform Mobile Application Development

Authors:

Christopher Jones and Xiaoping Jia

Abstract: AXIOM is a domain-specific modeling language for cross-platform mobile applications. AXIOM is based on more general techniques such as Model-Driven Architecture and generates native code for the iOS and Android platforms. Previous small-scale quantitative experiments suggested that AXIOM had the potential to provide significant productivity benefits. We have since conducted a limited set of more complex, mid-scale experiments and analyzed AXIOM’s capabilities using both quantitative and qualitative metrics to further define AXIOM’s ability to improve developer productivity when building cross-platform mobile applications. In this paper we describe the methodology of our mid-scale experiments and present the findings from source code and SonarQube analyses. We evaluate these findings and discuss what they mean to AXIOM in general. Finally, we look at possible changes to AXIOM’s syntax and capabilities.

Paper Nr: 52
Title:

A Multi-platform End User Software Product Line Meta-model for Smart Environments

Authors:

Vasilios Tzeremes and Hassan Gomaa

Abstract: End User (EU) architectures for smart environments aim to enable end users to create and deploy software applications for their smart spaces. EU Software Product Lines (SPL) extend EU architectures for smart environments with product line support to promote reuse and software application portability. This paper describes a meta-modeling approach for developing EU SPLs for smart environments. We present a meta-model as the basis for developing a framework for creating EU SPLs and deriving EU applications. The meta-model is composed of platform independent and platform specific meta-models. This paper describes in detail both parts of the meta-model and discusses the relationships and mappings between them. This paper also presents the XANA EU SPL framework that was developed using the proposed platform specific meta-model and discusses XANA’s product line creation and application derivation process.

Paper Nr: 56
Title:

Refinement of UML2.0 Sequence Diagrams for Distributed Systems

Authors:

Fatma Dhaou, Ines Mouakher, Christian Attiogbé and Khaled Bsaies

Abstract: A refinement process applied to UML2.0 Sequence Diagrams (SD) is adopted to deal with the complexity of modeling distributed systems. The various steps leading to the checking of the refinement of SDs, both theoretically and practically, are explained. A refinement relation possessing the necessary properties is formalized, and its implementation in the Event-B method is proposed in order to check the correctness of the refinement of SDs and to verify safety and liveness properties as well as the termination of the newly introduced events.

Paper Nr: 66
Title:

New Co-design Methodology for Real-time Embedded Systems

Authors:

Ines Ghribi, Riadh Ben Abdallah, Mohamed Khalgui and Marco Platzner

Abstract: Over the years we have witnessed an ever-increasing demand for functionality enhancements in embedded real-time systems. Along with the functionalities, the design itself grows more complex. Constraints such as time, space bounds, and energy consumption also require proper handling. In order to enhance the behaviour of such systems, we have developed I-codesign, a methodology for modelling, partitioning and simulating embedded real-time systems. The tasks in this methodology are described in a probabilistic manner and characterized with real-time parameters. A new partitioning technique aims, in each of its three phases, to respect first the inclusion/exclusion parameters, then the energy and memory constraints, and finally the real-time constraints. The output of I-codesign is an embedded controller that supervises the behaviour of the executing system and schedules the implementations/configurations of the software.

Posters
Paper Nr: 2
Title:

Modeling and Executing Component-based Applications in C-Forge

Authors:

Francisca Rosique, Diego Alonso, Juan Pastor and Francisco Ortiz

Abstract: This paper describes a model-driven toolchain for developing component-based applications that enables users to use the same models that define their application to execute it. In this way, the models always remain true to the final application, unlike other approaches in which a model transformation generates a skeleton of the final application after the first steps of the development process. Such approaches normally end up with models that represent a different application than the one present in the code.

Paper Nr: 31
Title:

Toward IFVM Virtual Machine: A Model Driven IFML Interpretation

Authors:

Sara Gotti and Samir Mbarki

Abstract: UML, standardized since 1997, is the first international modeling language. It aims at providing a standard way to visualize the design of a system, but it cannot model the complex design of user interfaces and interactions. According to the MDA approach, however, it is necessary to apply the concept of abstract models to user interfaces too. IFML, the Interaction Flow Modeling Language adopted as a standard by the OMG in March 2013, is designed for abstractly expressing the content, user interaction, and control behaviour of software application front-ends. IFML is a platform-independent language; it has been designed with executable semantics and can easily be mapped onto executable applications for various platforms and devices. In this article we present an approach to executing IFML. We introduce the IFVM virtual machine, which translates IFML models into bytecode that is then interpreted by the Java virtual machine.

Paper Nr: 43
Title:

Conformance Checking using Formal Methods

Authors:

Antonella Santone and Gigliola Vaglini

Abstract: Conformance checking is an important process mining task; it aims to detect inconsistencies between the model of a process and its corresponding execution log. This paper proposes an approach in which a declarative description, represented by a set of temporal logic properties, is given for the process model; the process discovered from the log is described by means of a process algebra, and conformance checking is performed through model checking of the discovered process against the properties. To discover the process, we consider additional information contained in the log and associated with the single events. Moreover, since discovered processes tend, in general, to be very large and complex, we look for a reduced process containing only the parts relevant to the satisfaction of the properties. In this way we reduce both the space needed for the discovered process and the time complexity of the property verification.

Paper Nr: 46
Title:

Accessibility Not on Demand - An Impaired Situation

Authors:

João de Sousa e Silva, Ramiro Gonçalves and António Pereira

Abstract: Digital accessibility is recognized as a fundamental tool for an egalitarian society. Nevertheless, software accessibility is an under-addressed topic in the discipline of software engineering and in academia in general. As a result, its development and implementation are compromised. This problem is depicted here with the help of some experiments that show the poor attention dedicated to this topic. Some hypotheses that try to explain this problem are formulated, and some possible solutions are debated. In conclusion, some insights are given and a possible new research avenue is presented.

Paper Nr: 55
Title:

Evaluating Data Integrity in the Cloud using the UPPAAL

Authors:

Sachi Nishida and Yoshiyuki Shinkawa

Abstract: There are several considerations when implementing a transaction processing system in cloud environments like Google App Engine (GAE). One of the most critical ones is the data integrity, since the cloud provides us with limited capability for it. Therefore we need to evaluate the applications and the cloud platform carefully from the data integrity viewpoint. This paper presents a model based data integrity evaluation method using the UPPAAL model checker. In order to make the model reusable, we built it as a set of application independent functional modules. On the other hand, the application unique functionalities are to be included in the model as UPPAAL functions written by the C-like UPPAAL language. The data integrity evaluation is performed in two different ways. One is a simulation based method in which the model is executed by the UPPAAL simulator to obtain the resultant variable values. The other is a verification based method in which the given integrity constraints are examined by the UPPAAL verifier using full state space search of the model.