ICSOFT 2020 Abstracts


Area 1 - Foundational and Trigger Technologies

Full Papers
Paper Nr: 33
Title:

Local Rotation Pattern: A Local Descriptor of Color Textures

Authors:

Hela Jebali, Noël Richard and Mohamed Naouai

Abstract: Describing color textures is an extremely challenging problem in pattern recognition and computer vision. In this paper, a new texture feature, the Local Rotation Pattern (LRP) descriptor, is proposed and investigated for color texture image classification. It is based on the quaternion representation of color images and on quaternion rotation: a color image is represented by encoding the three RGB channels into the three imaginary parts of a quaternion. The distance between two colors can then be expressed as the angle of rotation between two unit quaternions, computed with the geodesic distance, from which the LRP histograms are finally obtained. Performance in texture classification is assessed on three challenging datasets, the Vistex, Outex-TC13 and USPtex databases, against recent results from the state of the art. Results show the high efficiency of the proposed approach.
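
The core idea of the color distance can be illustrated with a minimal Python sketch (not the authors' implementation): an RGB color is encoded as a pure unit quaternion and the distance between two colors is the geodesic angle between them.

```python
import math

def color_to_unit_quaternion(r, g, b):
    # Encode RGB into the three imaginary parts of a pure quaternion
    # (0, r, g, b), normalized to unit length.
    n = math.sqrt(r * r + g * g + b * b)
    if n == 0:
        return (0.0, 0.0, 0.0, 0.0)
    return (0.0, r / n, g / n, b / n)

def geodesic_angle(q1, q2):
    # Geodesic distance between two unit quaternions on the 3-sphere,
    # here simply the angle between them via the dot product.
    dot = sum(a * b for a, b in zip(q1, q2))
    dot = max(-1.0, min(1.0, dot))  # clamp for numerical safety
    return math.acos(dot)
```

In the LRP descriptor these angles would be quantized and accumulated over local neighborhoods into histograms; the sketch above only shows the pairwise distance step.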

Paper Nr: 49
Title:

An Efficient GPGPU based Implementation of Face Detection Algorithm using Skin Color Pre-treatment

Authors:

Imene Guerfi, Lobna Kriaa and Leila A. Saidane

Abstract: Modern and future security and daily-life applications incorporate several face detection systems. Those systems have a demanding time constraint that requires high-performance computing. This can be achieved using General-Purpose computing on Graphics Processing Units (GPGPU); however, some existing solutions satisfy the time requirements at the cost of degraded detection quality. In this paper, we aim to reduce the detection time and increase the detection rate using the GPGPU. We developed a robust, optimized algorithm based on an efficient parallelization of the face detection algorithm, combined with a reduction of the search area using a mixture of two color spaces for skin detection. Serial and parallel Central Processing Unit (CPU) versions of the algorithm were developed for comparison's sake. A database was built using a classification method to evaluate our approach across all scenarios. The experimental results show that our proposed method achieved a 27.1x speedup over the CPU implementation, with a detection rate of 97.05%.
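
A skin-color pre-treatment of this kind can be sketched in a few lines of Python. The thresholds below are commonly cited illustrative ranges, not the ones tuned in the paper, and the two color spaces (YCbCr and HSV) are chosen as a plausible pairing.

```python
import colorsys

def rgb_to_ycbcr(r, g, b):
    # ITU-R BT.601 conversion (full-range approximation).
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    # A pixel is kept as a face-candidate only if it falls inside
    # skin ranges in BOTH color spaces; everything else is discarded
    # before the (expensive) face detector runs.
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    h, s, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    in_ycbcr = 77 <= cb <= 127 and 133 <= cr <= 173
    in_hsv = h <= 50 / 360.0 and 0.23 <= s <= 0.68
    return in_ycbcr and in_hsv
```

Pre-filtering with such a mask shrinks the search area handed to the detector, which is where the reported speedup partly comes from.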

Short Papers
Paper Nr: 16
Title:

IT-Application Behaviour Analysis: Predicting Critical System States on OpenStack using Monitoring Performance Data and Log Files

Authors:

Patrick Kubiak, Stefan Rass and Martin Pinzger

Abstract: Recent studies have proposed several ways to optimize the stability of IT services, with an extensive portfolio of processual, reactive or proactive approaches. The goal of this paper is to combine monitored performance data, such as CPU utilization, with discrete data from log files in a joint model to predict critical system states. We propose a systematic method to derive mathematical prediction models, which we test experimentally using a downsized clone of a real-life contract management system as a testbed. First, this testbed is used for data acquisition under variable and fully controllable system loads. Next, based on the monitored performance metrics and log file data, we train models (logistic regression and decision trees) that unify both numeric and textual data types in a single incident forecasting model. We focus on 1) investigating different cases to identify an appropriate prediction time window that allows countermeasures to be prepared, considering prediction accuracy, and 2) identifying variables that appear more likely than others in the predictive models.
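
The joint-model idea can be sketched as follows: a single feature vector mixes a numeric monitoring metric with counts derived from log lines, scored by a logistic function. This is a minimal illustration, not the paper's models; the weights here are hand-set placeholders where a fitted logistic regression would go.

```python
import math

def extract_features(cpu_util, log_lines):
    # Joint feature vector: one continuous monitoring metric plus
    # simple discrete counts extracted from log-file text.
    errors = sum(1 for line in log_lines if "ERROR" in line)
    warnings = sum(1 for line in log_lines if "WARN" in line)
    return [cpu_util, errors, warnings]

def predict_critical(features, weights, bias):
    # Logistic-regression score in [0, 1]; in practice the weights
    # would be fitted on labelled incident data, not set by hand.
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))
```

A threshold on the score (e.g. 0.5) then flags an impending critical state within the chosen prediction time window.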

Paper Nr: 19
Title:

A Parallel Many-core CUDA-based Graph Labeling Computation

Authors:

Stefano Quer

Abstract: When working on graphs, reachability is among the most common problems to address, since it is the basis for many other algorithms. With the advent of distributed systems that process large amounts of data, many applications must quickly explore graphs with millions of vertices, so scalable solutions have become of paramount importance. Modern GPUs provide highly parallel systems based on many-core architectures and have gained popularity for parallelizing algorithms that run on large data sets. In this paper, we extend a very efficient state-of-the-art graph-labeling method, namely the GRAIL algorithm, to architectures that exhibit a great amount of data parallelism, i.e., many-core CUDA-based GPUs. GRAIL creates a scalable index for answering reachability queries, and it relies heavily on depth-first searches. As depth-first visits are intrinsically recursive and cannot be efficiently implemented on parallel systems, we devise an alternative approach based on a sequence of breadth-first visits. The paper describes our efforts in this direction, and it analyzes the difficulties encountered and the solutions chosen to overcome them. It also presents a comparison (in terms of the time to create the index and the time to answer reachability queries) between the CPU and GPU-based versions.
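
The key contrast between the two traversals can be shown in a small Python sketch (illustrative only, not the CUDA implementation): a breadth-first visit proceeds frontier by frontier, and each frontier expansion is the part that maps naturally onto GPU threads, unlike the sequential recursion of a depth-first visit.

```python
from collections import deque

def reachable(graph, src, dst):
    # Level-by-level BFS: every vertex in `frontier` could be expanded
    # by a separate GPU thread, which is why BFS parallelizes where
    # the intrinsically recursive DFS does not.
    visited = {src}
    frontier = deque([src])
    while frontier:
        next_frontier = deque()
        for u in frontier:
            if u == dst:
                return True
            for v in graph.get(u, ()):
                if v not in visited:
                    visited.add(v)
                    next_frontier.append(v)
        frontier = next_frontier
    return False
```

GRAIL itself answers most queries from its interval labels and only falls back to traversal when the labels are inconclusive; the sketch covers only the traversal step.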

Paper Nr: 27
Title:

A Novel Approach for Android Malware Detection and Classification using Convolutional Neural Networks

Authors:

Ahmed Lekssays, Bouchaib Falah and Sameer Abufardeh

Abstract: Malicious software, or malware, has been growing exponentially in the last decades according to antivirus vendors. The growth of malware is due to the advanced techniques that malware authors use to evade detection. Hence, the traditional methods that antivirus vendors deploy are insufficient to protect people's digital lives. In this work, an attempt is made to address the problem of mobile malware detection and classification with a new approach for Android mobile applications that uses Convolutional Neural Networks (CNN). The paper suggests a static analysis method that aids malware detection through malware visualization. In our approach, we first convert Android applications in APK format into gray-scale images. Since malware from the same family shares patterns, we then designed a machine learning model to classify Android applications as malware or benign based on pattern recognition. The dataset used in this research combines self-made datasets, built using public APIs to scan APK files downloaded from open sources on the internet, with a research dataset provided by the University of New Brunswick, Canada. Using our proposed solution, we achieved an 84.9% accuracy in detecting mobile malware.
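
The APK-to-image step is simple to sketch: every byte of the binary becomes one grey pixel, and the byte stream is wrapped into rows of fixed width. This is a minimal sketch under assumed parameters (row width 256 is a common choice, not necessarily the paper's).

```python
def bytes_to_grayscale(data, width=256):
    # Interpret each byte of the APK as one grey pixel (0-255) and
    # wrap the stream into rows; the last row is zero-padded.
    pixels = list(data)
    pad = (-len(pixels)) % width
    pixels += [0] * pad
    return [pixels[i:i + width] for i in range(0, len(pixels), width)]
```

The resulting 2D array is what a CNN would consume; families of malware tend to produce visually similar textures in these images, which is the pattern the classifier exploits.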

Paper Nr: 31
Title:

Bi-Objective CSO for Big Data Scientific Workflows Scheduling in the Cloud: Case of LIGO Workflow

Authors:

K. Bousselmi, S. Ben Hamida and M. Rukoz

Abstract: Scientific workflows are used to model scalable, portable, and reproducible big data analyses and scientific experiments with low development costs. To optimize their performance and ensure efficient use of data resources, scientific workflows handling big volumes of data need to be executed on scalable distributed environments such as Cloud infrastructure services. The problem of scheduling such workflows is known to be NP-complete. It aims to find an optimal task-to-resource and data-to-storage mapping that meets the end user's quality-of-service objectives, especially minimizing the overall makespan or the financial cost of the workflow. In this paper, we formulate the problem of scheduling big data scientific workflows as a bi-objective optimization problem that aims to minimize both the makespan and the cost of the workflow. The formulated problem is then solved using our proposed Bi-Objective Cat Swarm Optimization algorithm (BiO-CSO), an extension of the bio-inspired CSO algorithm adapted to solve multi-objective discrete optimization problems. Our application case is the LIGO Inspiral workflow, a CPU- and data-intensive workflow used to generate and analyze gravitational waveforms from data collected during the coalescence of compact binary systems. The performance of the proposed method is then compared to that of multi-objective Particle Swarm Optimization (PSO), which has proven effective for scientific workflow scheduling. The experimental results show that BiO-CSO performs better than multi-objective PSO, since it provides more and better final scheduling solutions.
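
In any bi-objective scheduler of this kind, candidate schedules are compared by Pareto dominance over the (makespan, cost) pair. A minimal Python sketch of that comparison (generic, not BiO-CSO itself):

```python
def dominates(a, b):
    # a and b are (makespan, cost) tuples; a dominates b if it is no
    # worse in both objectives and strictly better in at least one.
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def pareto_front(solutions):
    # Keep only the non-dominated schedules: the trade-off curve the
    # end user picks from.
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions)]
```

"More and better final scheduling solutions" then means a larger and lower-lying Pareto front than the one produced by the competing algorithm.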

Paper Nr: 99
Title:

An Optimization Method for Entity Resolution in Databases: With a Case Study on the Cleaning of Scientific References in Patent Databases

Authors:

Emiel Caron

Abstract: Many databases contain ambiguous and unstructured data, which makes the information they contain difficult to use for further analysis. For these databases to be a reliable point of reference, the data needs to be cleaned. Entity resolution focuses on disambiguating records that refer to the same entity. In this paper we propose a generic optimization method for disambiguating large databases. The method is applied to a table of scientific references from the Patstat database, which holds ambiguous information on citations of scientific references. The research method described is used to create clusters of records that refer to the same bibliographic entity. The method starts by pre-cleaning the records and extracting bibliographic labels. Next, we construct rules based on these labels and use the tf-idf algorithm to compute string similarities. We create clusters by means of a rule-based scoring system. Finally, we perform precision-recall analysis using a gold-standard set of clusters and optimize our parameters with simulated annealing. We show that it is possible to optimize the performance of a disambiguation method using a global optimization algorithm.
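
The tf-idf similarity step can be sketched in plain Python (an illustrative textbook variant with smoothed idf, not necessarily the exact weighting used in the paper): references sharing rare tokens score higher than references sharing only common ones.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    # docs: list of token lists (e.g. tokenized reference strings).
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}  # smoothed idf
    return [{t: tf * idf[t] for t, tf in Counter(doc).items()}
            for doc in docs]

def cosine(u, v):
    # Cosine similarity between two sparse tf-idf vectors.
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

Pairs whose similarity clears a rule-dependent threshold contribute to the cluster score; the threshold and rule weights are exactly the parameters the simulated annealing step would tune.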

Area 2 - Software Engineering and Systems Development

Full Papers
Paper Nr: 6
Title:

Consistency Analysis of AUTOSAR Timing Requirements

Authors:

Steffen Beringer and Heike Wehrheim

Abstract: Applying formal methods in the automotive industry can significantly increase the correctness and reliability of the developed system architectures. This in particular demands a formal specification and analysis of requirements on systems. Automotive software architectures are, however, often described using the (semi-formal) AUTOSAR standard, which is based on various meta models serving as exchange formats; this complicates a formal analysis. In this paper, we provide a formalization of timing requirements within the AUTOSAR standard. Timing requirements specify constraints on events of the underlying software architecture. We provide a translation of timing requirements into logical constraints, which enables the use of SMT solvers to analyse the requirements. Specifically, we employ this translation to check the consistency of the requirements and use maximum satisfiability solving to localize erroneous requirements.

Paper Nr: 8
Title:

Software Test Automation Maturity: A Survey of the State of the Practice

Authors:

Yuqing Wang, Mika V. Mäntylä, Serge Demeyer, Kristian Wiklund, Sigrid Eldh and Tatu Kairi

Abstract: The software industry has seen an increasing interest in test automation. In this paper, we present a test automation maturity survey serving as a self-assessment for practitioners. Based on the responses of 151 practitioners from more than 100 organizations in 25 countries, we make the following observations regarding the state of the practice of test automation maturity: a) the level of test automation maturity in different organizations is differentiated by the practices they adopt; b) practitioners reported quite diverse situations with respect to different practices, e.g., 85% of practitioners agreed that their test teams have enough test automation expertise and skills, while 47% of practitioners admitted that there is a lack of guidelines on designing and executing automated tests; c) some practices are strongly correlated and/or closely clustered; d) the percentage of automated test cases and the use of Agile and/or DevOps development models are good indicators of a higher test automation maturity level; e) the roles of practitioners may affect response variation, e.g., QA engineers give the most optimistic answers and consultants the most pessimistic ones. Our results give an insight into present test automation processes and practices and indicate opportunities for further improvement in the industry.

Paper Nr: 50
Title:

Accelerating Interval Iteration for Expected Rewards in Markov Decision Processes

Authors:

Mohammadsadegh Mohagheghi and Khayyam Salehi

Abstract: Reachability probabilities and expected rewards are two important classes of properties computed in probabilistic model checking. Iterative numerical methods are used to compute these properties. Interval iteration and sound value iteration have been proposed in recent years to guarantee the precision of the computed values. These methods maintain upper and lower bounds on the values and update each bound in every iteration until the convergence criterion is satisfied. In this paper, we focus on the computation of the expected rewards of models and propose two heuristics to improve the performance of the interval iteration method. The first heuristic updates the upper and lower bounds separately to avoid redundant updates. The second heuristic uses the computed values of the lower bound to approximate a starting point for the upper bound. We also propose a criterion for the correctness of the approximated upper bound. The experiments show that in most cases, interval iteration with our approaches outperforms the standard interval iteration and sound value iteration methods.
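
The bound-bracketing idea behind interval iteration can be sketched for expected total rewards on a tiny Markov chain (a deliberately simplified illustration: no MDP nondeterminism, and the upper bound `upper0` is assumed to be given rather than derived):

```python
def interval_iteration(trans, reward, goal, upper0, eps=1e-6):
    # trans[s] = list of (prob, next_state); computes the expected
    # total reward accumulated before reaching `goal`.
    # lo starts at 0 (a valid lower bound), hi at a precomputed
    # upper bound; both converge monotonically toward the true value.
    states = list(trans)
    lo = {s: 0.0 for s in states}
    hi = {s: (0.0 if s == goal else upper0) for s in states}
    while max(hi[s] - lo[s] for s in states) > eps:
        for bound in (lo, hi):  # Bellman update applied to each bound
            new = {}
            for s in states:
                if s == goal:
                    new[s] = 0.0
                else:
                    new[s] = reward[s] + sum(p * bound[t]
                                             for p, t in trans[s])
            bound.update(new)
    return {s: 0.5 * (lo[s] + hi[s]) for s in states}
```

The paper's first heuristic corresponds to not updating both bounds in lock-step as done here, and its second heuristic to deriving `hi`'s starting point from the converged behavior of `lo`.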

Paper Nr: 76
Title:

Codeless Web Testing using Selenium and Machine Learning

Authors:

Duyen P. Nguyen and Stephane Maag

Abstract: The complexity of web systems makes development processes ever tougher to test. Testing phases are crucial in software and system engineering and are known to be very costly. While automated testing methods appear to take over the role of human testers, the reliability and capability of these testing methods still need to be ensured. In this paper, we focus on the automation of functional tests of websites. A single web page may contain a huge set of important functionalities leading to the execution of critical web service operations, and testing all of the functionalities implemented in a web page service is highly complex. Two currently popular research areas for automated web-based testing are codeless functional test automation and Machine Learning/Artificial Intelligence (ML/AI) in test automation. We therefore define and implement a framework to automate the web service product under test, in which the machine can detect or predict changes and adapt them to suitable generic test cases. In our work, we examine Selenium and the benefits of using machine learning in automated web application testing.

Paper Nr: 88
Title:

A Data-driven Methodology towards Interpreting Readability against Software Properties

Authors:

Thomas Karanikiotis, Michail D. Papamichail, Ioannis Gonidelis, Dimitra Karatza and Andreas L. Symeonidis

Abstract: In the context of collaborative, agile software development, where effective and efficient software maintenance is of utmost importance, the need to produce readable source code is evident. Towards this direction, several approaches aspire to assess the extent to which a software component is readable. Most of them rely on experts who are responsible for determining the ground truth and/or setting custom evaluation criteria, leading to results that are context-dependent and subjective. In this work, we employ a large set of static analysis metrics along with various coding violations towards interpreting readability as perceived by developers. In an effort to provide a fully automated and extensible methodology, we refrain from using experts; rather, we harness data residing in online code hosting facilities to construct a dataset that includes more than one million methods covering diverse development scenarios. After clustering based on source code size, we employ Support Vector Regression in order to interpret the extent to which a software component is readable on three axes: complexity, coupling, and documentation. Preliminary evaluation indicates that our approach effectively interprets readability as perceived by developers against these three primary source code properties.

Short Papers
Paper Nr: 1
Title:

MVCLang: A Software Modeling Language for the Model-View-Controller Design Pattern

Authors:

Mert Ozkaya and Irem Fidandan

Abstract: The Model-View-Controller (MVC) software design pattern promotes separating software systems into model, view, and controller elements. The views represent the user interfaces, the models represent the system data, and the controllers handle the requests sent by the views and coordinate the interactions between views and models. While many software frameworks are available for MVC-based software development, no attempt has been made to raise the level of abstraction of MVC development and provide a model-based approach. Indeed, none of the high-level software modeling languages support the MVC design pattern. Therefore, we propose in this paper a visual, MVC-based modeling language called MVCLang, which enables the modeling of MVC-based software architectures that can be easily analysed and implemented. MVCLang is supported by an Eclipse-based prototype toolset for specifying visual MVC architectures and analysing them against a number of well-formedness rules. MVCLang's toolset can further produce ASP.NET MVC code that reflects the architectural design decisions. We evaluated MVCLang with a software company that offers e-commerce solutions. Therein, 5 developers used MVCLang for their e-commerce project developments and provided feedback on a set of pre-determined questions.

Paper Nr: 2
Title:

A Case Study on Performance Optimization Techniques in Java Programming

Authors:

Ciprian Khlud and Cristian Frăsinaru

Abstract: Choosing the right programming platform for processor- or memory-intensive applications is a subject that is debated in all types of contexts. In this paper we investigate how a state-of-the-art implementation, part of a multi-threaded framework for sequence analysis (elPrep), could benefit from various optimization techniques dedicated to improving the runtime performance of Java applications. We show that, without changing the semantics of the algorithm, appropriate programming techniques can significantly improve the behavior of the Java implementation, to a point that may even alter the conclusions of the original study. We also show that, by changing the manner in which data is represented to better fit the particulars of Java memory management, we are able to improve the original scoring (based on computing time and memory consumption) by around one order of magnitude on the most expensive component (read/write).

Paper Nr: 5
Title:

Analysing Large-Scale Scrum Practices with Respect to Quality Requirements Challenges

Authors:

Wasim Alsaqaf, Maya Daneva and Roel Wieringa

Abstract: Published empirical research based on agile practitioners' perceptions has indicated several important Quality Requirements (QRs) challenges experienced in agile large-scale distributed projects. It has also indicated that a popular solution approach to those challenges is to inject some heavyweight practices into agile, for example adding documentation or authority roles for QRs. At the same time, agile methodologists have proposed several scaled agile frameworks specifically to serve agile organizations working on large and distributed projects. How do these frameworks address QRs? Do they put forward any heavyweight practices as a solution to QR challenges, or do they invent new agile practices fully aligned with the values of the Agile Manifesto? Currently, very little is known about the extent to which QR issues are accounted for in the design of these frameworks as proposed by agile methodologists. This paper attempts to narrow this gap of knowledge. It analyses Large-Scale Scrum (LeSS), a prominent scaled framework, from the perspective of QR engineering and the Agile Manifesto's values. To this end, we first applied the 4-Dimensional Analytical Tool to evaluate the degree of agility of the practices of the LeSS framework. We then analysed these practices and evaluated their applicability for mitigating the QR challenges reported in previous work.

Paper Nr: 10
Title:

Transformation- and Pattern-based State Machine Mining from Embedded C Code

Authors:

Andreas Grosche, Burkhard Igel and Olaf Spinczyk

Abstract: Automated extraction of state machine models from source code can improve the comprehension of software system behavior required for many maintenance tasks, and reuse in general. Furthermore, it can be used for subsequent automated processing such as refactoring and model-based verification. This paper presents an approach based on normalizing transformations of an input program and a pattern to find state machine implementations in the program and to extract the relevant information. The results are used to create state machine models containing states, transitions, events, guards and actions. Fine-grained traceability between the model and the source code enables navigation and refactoring of model elements. We evaluate the approach by applying a prototypical implementation to industrial automotive embedded code and show that 74% of the expected state machine implementations can be completely identified and 8% partially.
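
A drastically reduced version of such pattern matching can be sketched in Python for the common switch-based state machine idiom in C. This toy sketch matches one hypothetical pattern with regular expressions, whereas the paper works on a transformed program representation.

```python
import re

def mine_states(c_source):
    # Find the switch variable, the states handled in its `case`
    # labels, and candidate transitions (assignments to the variable).
    switch = re.search(r"switch\s*\(\s*(\w+)\s*\)", c_source)
    if not switch:
        return None
    var = switch.group(1)
    states = re.findall(r"case\s+(\w+)\s*:", c_source)
    transitions = re.findall(re.escape(var) + r"\s*=\s*(\w+)\s*;", c_source)
    return {"variable": var, "states": states, "transitions": transitions}
```

Guards and actions would require real parsing of the surrounding conditionals and statements, which is exactly what the transformation-based approach provides.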

Paper Nr: 20
Title:

Hybrid Context-awareness Modelling and Reasoning Approach for Microgrid’s Intelligent Control

Authors:

Soumoud Fkaier, Mohamed Khalgui and Georg Frey

Abstract: Modern microgrids promote the integration of information and communication technologies (ICT) in order to enhance emerging advanced power management functionalities, such as the integration of renewable energy sources, distributed storage optimization, demand-response strategies, electric vehicle charging, and power generation rate forecasting and scheduling. To this end, sophisticated sensing and smart metering infrastructures are incorporated in the equipment used as well as in the involved subsystems. Hence, much contextual data is becoming increasingly available, and taking it into account in control tasks is likely to provide promising results. However, making the microgrid control system understand the data and take the proper decisions based on the identified context is not an easy task. In fact, recognizing the relations and meanings of the sensed data is difficult and complex due to the heterogeneity and intricacy of the involved parts. Hence, providing context-aware modelling and reasoning mechanisms for microgrids becomes necessary. In this context, this paper contributes two main solutions. First, a formalized microgrid design providing an easy and understandable view of the system is provided. This definition respects the separation-of-concerns principle in order to tame the complexity of the system. Second, an ontology-based context modelling and a rule-based context reasoning framework for microgrids are provided. To show the suitability of the proposed processes, a formal case study is carried out. The proposed processes are shown to consume fewer resources than some of the existing works.

Paper Nr: 25
Title:

A Genetic Algorithm for Automated Test Generation for Satellite On-board Image Processing Applications

Authors:

Ulrike Witteck, Denis Grießbach and Paula Herber

Abstract: Satellite on-board image processing technologies are subject to extremely strict requirements with respect to reliability and accuracy in hard real-time. In this paper, we address the problem of automatically selecting test cases that are specifically tailored to provoke mission-critical behavior of satellite on-board image processing applications. Because such applications possess large input domains, it is infeasible to exhaustively execute all possible test cases. In particular, because of their complex computations, it is difficult to find specific test cases that provoke mission-critical behavior. To overcome this problem, we define a test approach that is based on a genetic algorithm. The goal is to automatically generate test cases that provoke worst case execution times and inaccurate results of the satellite on-board image processing application. For this purpose, we define a two-criteria fitness function that is novel in the satellite domain. We show the efficiency of our test approach on experimental results from the Fine Guidance System of the ESA medium-class mission PLATO.
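
The search loop of such a test generator can be sketched as a plain generational genetic algorithm. This is a generic sketch, not the paper's algorithm: the `fitness` callback stands in for the two-criteria function (e.g. combining provoked execution time and result inaccuracy), and individuals stand in for encoded test cases.

```python
import random

def evolve(fitness, make_individual, mutate, crossover,
           pop_size=20, generations=50, seed=42):
    # Generational GA with elitism: keep the two best individuals,
    # breed the rest from the fitter half of the population.
    rng = random.Random(seed)
    pop = [make_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        nxt = pop[:2]  # elitism
        while len(nxt) < pop_size:
            a, b = rng.sample(pop[:pop_size // 2], 2)
            nxt.append(mutate(crossover(a, b, rng), rng))
        pop = nxt
    return max(pop, key=fitness)
```

In the paper's setting, evaluating `fitness` means actually executing the image processing application on the candidate input, so keeping the population and generation counts small matters.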

Paper Nr: 39
Title:

Threat Modeling for Cyber-Physical Systems: A Two-dimensional Taxonomy Approach for Structuring Attack Actions

Authors:

Monika Maidl, Gerhard Münz, Stefan Seltzsam, Marvin Wagner, Roman Wirtz and Maritta Heisel

Abstract: Cyber-physical systems (CPSs) include devices that interact with the physical world. Hence, attacks against CPSs can lead to substantial damage and endanger life and limb. It is important to consider possible attacks already in the early stages of system development, i.e., during the design phase, by performing threat modeling. Threat modeling aims at identifying, analyzing and documenting potential attacks and threats against a given CPS in a structured way. However, the systematic identification of all relevant threats is not trivial. One challenge is that knowledge about threats or potential attack actions is not documented in a way that makes it easily accessible. To address this challenge, we propose a taxonomy approach for structuring attack actions. The distinguishing feature of the taxonomy approach is its use of two dimensions: attack action types and the attack surface. The attack surface consists of those points of a system at which interaction is possible; attackers can perform attack actions instead of the intended interaction at these points. As a CPS consists of a range of heterogeneous, connected components that can be accessed in various ways, the attack surface of a CPS is typically large, and that of a specific CPS is defined by its system architecture model. We developed the taxonomy approach to support threat modeling for CPSs. Starting from existing approaches in the context of threat modeling, we extended and modified them over several iterations to meet the challenges of threat modeling for CPSs in industrial projects. While the focus of this paper is on CPSs, the two-dimensional taxonomy approach can easily be applied to other domains.

Paper Nr: 47
Title:

Toward a Correct Implementation of LwM2M Client with Event-B

Authors:

Ines Mouakher, Fatma Dhaou and J. C. Attiogbé

Abstract: Within the Internet of Things (IoT), billions of connected devices can collaborate anytime, anywhere and in any form in various application domains. These devices, with minimal storage and computational power, are based on standards and lightweight protocols. Due to the critical nature of the application domains of IoT systems, the verification of various properties is crucial; to this end, the benefits of using formal methods are widely recognized. In this paper, we present an approach that integrates the modelling and verification techniques required for the specification of IoT systems, exploiting the OMA Lightweight M2M (LwM2M) enabler. We propose a formal model of the LwM2M client, which is located in an LwM2M device, by building several mathematical models of discrete transition systems using Event-B. Indeed, we opt for a systematic, refinement-based approach that helps us to model and verify the specification gradually. The Rodin tool is used to specify and verify the Event-B models. The generated Event-B models allow us to analyze and verify the behavior of an LwM2M client that supports the latest LwM2M 1.1 version. Furthermore, this is a first step towards providing formally proven LwM2M client implementations.

Paper Nr: 57
Title:

Investigating the Gap between Scrum Theory and Practice in Pakistan

Authors:

Muneeba Jilani and Naveed Ikram

Abstract: A gap between theory and practice highlights deviations, suggesting that changes which depart from the theory make it less beneficial. Among the companies that practice agile, the most common methodology is Scrum. Scrum advocates strict adherence to the original theory until its successful prevalence in the organization. In this research, an industrial investigation is conducted to reveal the theory-practice gap in the case of Scrum, providing detailed insight into the matter. It concludes with a set of guidelines for optimal Scrum implementation that address the gap in depth.

Paper Nr: 75
Title:

Detecting Model View Controller Architectural Layers using Clustering in Mobile Codebases

Authors:

Dragoş Dobrean and Laura Dioşan

Abstract: Mobile applications are among the most common software projects written nowadays. The software architectures used for building these types of products heavily impact their lifecycle, as architectural issues affect the internal quality of a software system, hindering its maintainability and extensibility. We present a novel approach, Clustering ARchitecture Layers (CARL), for detecting architectural layers using an automatic method that could represent the first step in the identification and elimination of various architectural smells. Unlike supervised machine learning approaches, the clustering method involved does not require any initial training data or modelling phase to set up the detection system. As a further element of novelty, the method works by considering as the codebase's hybrid features the information inferred from both the module dependency graph and the mobile SDKs. Our approach considers and fuses various types of structural as well as lexical dependencies extracted from the codebase; it analyses the types of the components, their method signatures and their properties. Our method is generic and can be applied to any presentational application that uses SDKs for building its user interface. We assess the effectiveness of our proposed layer detection approach on three public and private codebases of various dimensions and complexities. External and internal clustering metrics were used to evaluate the detection quality, obtaining an average accuracy of 77.95%. Moreover, the precision measure was computed for each layer of the investigated codebase architectures; its average (over all layers and codebases) is 79.32%, while the average recall over all layers is 75.93%.

Paper Nr: 77
Title:

Visualization Method of Important Regions by Combination of Webpage Structures and Saliency Maps

Authors:

Yuya Inagaki, Hajime Iwata, Junko Shirogane and Yoshiaki Fukazawa

Abstract: In this research, we propose a new method for visualizing the important areas of a webpage by calculating saliency at the element level, combining the structure of the webpage with a saliency map at the development stage. By arranging important information in areas where attention is likely to be focused, users can easily find such information, leading to efficient user acquisition. In addition, a summary map that condenses particularly important areas into one image should help designers grasp the page contents. Compared to a traditional saliency map, important areas are easier to see, allowing designers to accurately determine which elements are likely to be noticed when a user views the webpage during the development phase.
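
The element-level aggregation can be sketched in Python (illustrative only: it assumes element bounding boxes are already known from the page structure, and uses a plain mean where the paper may weight differently).

```python
def element_saliency(saliency_map, box):
    # Mean saliency of one page element; `box` = (x, y, width, height)
    # in the pixel coordinates of the saliency map (a 2D list).
    x, y, w, h = box
    vals = [v for row in saliency_map[y:y + h] for v in row[x:x + w]]
    return sum(vals) / len(vals) if vals else 0.0

def rank_elements(saliency_map, boxes):
    # Order element boxes from most to least salient; the top entries
    # are the candidates for the summary map.
    return sorted(boxes,
                  key=lambda b: element_saliency(saliency_map, b),
                  reverse=True)
```

Aggregating per element rather than per pixel is what makes the resulting visualization align with the units a designer actually manipulates.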

Paper Nr: 80
Title:

Investigating on the Relationships between Design Smells Removals and Refactorings

Authors:

Lerina Aversano, Mario L. Bernardi, Marta Cimitile, Martina Iammarino and Kateryna Romanyuk

Abstract: Software systems continually evolve, and this leads to architectural degradation due to the existence of numerous design problems. The presence of Design Smells is the main indicator of such problems, as it points out the use of constructs that generally hurt system evolution. In this work, an investigation of Design Smell removals has been performed, focusing specifically on the co-occurrence of refactorings and related changes performed on a software system. An empirical study has been conducted considering the evolution history of 5 software systems. Instances of multiple Design Smell types were detected across the entire history of the systems, along with refactoring activities. The empirical study shows that Design Smell removals are not correlated with the presence of refactoring. The analysis provides useful indications about the percentage of activities conducted on smelly classes, including refactoring (even if these activities in few cases lead to effective smell removals).

Paper Nr: 81
Title:

Failsafe Mechanism to Hazard Analysis and Risk Mitigation in Unmanned Aerial Vehicle based on NCES

Authors:

Mohamed Naija, Rihab Khemiri and Ernesto Exposito

Abstract: In the last few years, Unmanned Aerial Vehicles (UAVs) have received increasing attention for executing a wide variety of applications in fields such as the military, agriculture and medicine. UAVs are known to be highly vulnerable not only to unexpected faults in their software but also to the environment. For this reason, safety should be considered the main requirement at design time, since any unexpected behavior of the vehicle or any hazard would lead to potential risks. To maintain safe operation during missions, a failsafe mechanism based on Net Condition Event Systems (NCES) is proposed. The failsafe mechanism is a control logic that guides risk-reduction actions to be performed when hazards occur. To generate such a controller using formal models, the proposed process is decomposed into three phases: (1) the first phase consists of hazard identification and analysis according to reactive methods from the literature, (2) the second phase allows risk estimation using the standard ISO 13849, and (3) the third phase consists of performing reconfiguration scenarios for risk mitigation while analyzing safety requirements. The motivation behind the use of formal methods is that they have proven useful for making the development process reliable at early design stages. We demonstrate the applicability and feasibility of our proposal on an illustrative medical drone as a case study.

Paper Nr: 89
Title:

Data-centric Refinement of Database-Database Dependency Analysis of Database Program

Authors:

Angshuman Jana

Abstract: Since the pioneering work by Ottenstein and Ottenstein, the notion of the Program Dependency Graph (PDG) has attracted a wide variety of compelling applications in software engineering, e.g. program slicing, information-flow security analysis, debugging, code optimization, code reuse, code understanding, and many more. In order to exploit the power of dependency graphs in solving problems related to relational database applications, Willmor et al. first proposed the Database-Oriented Program Dependency Graph (DOPDG), an extension of the PDG that further takes database statements and their dependencies into consideration. However, the dependency information generated by the DOPDG construction algorithm is prone to imprecision due to its syntax-based computation, and therefore the approach may increase the susceptibility to false alarms in the above-mentioned application scenarios. Addressing this challenge, this paper highlights two main research objectives: (1) How can more precise dependency information (and hence a more precise DOPDG) be obtained? and (2) How can it be computed efficiently? To this aim, a data-centric approach is proposed to compute precise dependency information by removing false alarms. To refine the database-database dependency, the syntax-based DOPDG construction is augmented by adding three extra nodes and edges (as per the condition-action execution sequence) to each node that represents a database statement.

Paper Nr: 93
Title:

Formalization and Verification of Reconfigurable Discrete-event System using Model Driven Engineering and Isabelle/HOL

Authors:

Sohaib Soualah, Yousra Hafidi, Mohamed Khalgui, Allaoua Chaoui and Laid Kahloul

Abstract: This paper deals with the modelling and verification of reconfigurable discrete-event systems (RDESs) using model-driven engineering (MDE) and Isabelle/HOL. MDE is a software development methodology followed by engineers. Isabelle/HOL is an interactive/automated theorem prover that combines the functional programming paradigm with higher-order logic (HOL), which makes it efficient for developing solid formalizations. We are interested in combining these two complementary technologies by mapping elements of MDE into Isabelle/HOL. In this paper, we present a transformation process from Ecore models to the functional data structures used in proof assistants. This transformation method is based on model-driven engineering and defined by a set of transformation rules that are described using formal presentations. Furthermore, in order to avoid redundant computations in RDESs, we propose a new algorithm for improved verification. We implement the contributions of this paper using the Eclipse environment and the Isabelle tool. Finally, we illustrate the proposed approach through the FESTO MPS case study.

Paper Nr: 94
Title:

Improving UI Test Automation using Robotic Process Automation

Authors:

Marina Cernat, Adelina-Nicoleta Staicu and Alin Stefanescu

Abstract: Robotic Process Automation (RPA) is now one of the fastest-growing segments in enterprise software. This technology uses so-called “software robots” that can mimic humans interacting with various applications at the UI level. Thus, RPA achieves automation of various UI scenarios without writing dedicated software to implement them. In this position paper, we open a discussion on the opportunities and challenges of using RPA to improve test automation.

Paper Nr: 36
Title:

Decifarm: A Fuzzy Decision-support Environment for Smart Farming

Authors:

Jérôme Dantan, Hajer B. Zghal and Yann Pollet

Abstract: Farmers need smart tools to optimize their crops and production. They also need other agricultural experts, such as advisors and accounting companies, as well as systems and software tools for decision support. The proposed solution is a fuzzy decision-support environment for smart farming (Decifarm) intended to ensure better structuring of the data extracted from farms and automated calculations, reducing the risk of missing operations while ensuring data security. We designed a modular architecture to address these problems: first, we derive crop phenological stages from both historical and forecast weather open data, as well as historical data from previously deployed sensors located at the parcels, with large amounts of data stored in a No-SQL document database; second, we provide control of an automatic water system based on fuzzy logic; finally, a prototype of the hardware and software environments was designed from open hardware components, open-source languages and open data, promoting both interoperability and extensibility.

Paper Nr: 37
Title:

Towards Ubiquitous Learning Situations for Disabled Learners

Authors:

Nesrine Ben Salah, Ines B. Saadi and Henda Ben Ghezala

Abstract: Adaptation in ubiquitous learning environments is a major concern in research, especially for disabled learners. A number of studies have examined context-aware learning systems, but the learning situation has rarely been taken into account, particularly for learners with disabilities. There is a lack of studies on the description and identification of learning situations for disabled learners. Therefore, these situations need to be well defined to ensure that all features, including the guidance of the learning process, are properly adapted to this particular type of learner. The identified situation is linked to a situation model called a typical situation. In this paper, we propose an ontology for ubiquitous typical learning situations relating to disabled learners and, more specifically, to those with sensory disabilities. We will then be able to refer to these situations in order to identify the observed learning situations during the execution of the learning process. This identification will then be used to recommend and guide the learning process of learners with disabilities.

Paper Nr: 38
Title:

A Systematic Literature Mapping of Artificial Intelligence Planning in Software Testing

Authors:

Luis F. de Lima, Leticia M. Peres, André A. Grégio and Fabiano Silva

Abstract: Software testing is one of the most expensive software development processes, so techniques to automate it are fundamental to reducing software cost and development time. Artificial intelligence (AI) planning techniques have been applied to automate part of the software testing process. We present in this paper a systematic literature mapping (SLM), using the approach of Petersen et al. (2015), of methods, techniques and tools regarding AI planning in software testing. Using the mapping, we identify 16 papers containing proposals of methods, techniques, frameworks and tools, besides a survey. We identify testing techniques, testing phases, artifacts, AI planning techniques, AI planning tools, support tools, and generated plans in these selected papers. By analysing the mapping data, we identify a deficiency in the use of white-box and error-based testing techniques, as well as the recent use of AI planning in security testing.

Paper Nr: 71
Title:

An Approach That Stimulates Architectural Thinking during Requirements Elicitation: An Empirical Evaluation

Authors:

Preethu R. Anish, Maya Daneva, Smita Ghaisas and Roel J. Wieringa

Abstract: In many global outsourcing projects, software requirement specifications (SRS) are often orchestrated by requirements analysts who have sufficient business knowledge but are not equipped to ask the kind of questions needed to unearth architecturally relevant information from the customer. The resultant SRS therefore often lacks critical details needed by software architects to make informed architectural decisions. To remedy this, software architects either make assumptions or conduct additional stakeholder interviews, resulting in expensive refactoring efforts and project delays. Using an empirical method, we have designed an approach to using architectural knowledge that can serve as a communication medium between requirements analysts and software architects. In this paper, we present a detailed empirical evaluation of our proposed approach with practitioners from real-world organizations. Across two studies, we found that in the experience of the participating practitioners, the approach is relevant, easy to use and effective.

Paper Nr: 84
Title:

Metrics-driven DevSecOps

Authors:

Wissam Mallouli, Ana R. Cavalli, Alessandra Bagnato and Edgardo Montes de Oca

Abstract: Due to modern iterative development practices and the new automated software engineering tools and methods brought by the DevOps agile method, traditional metrics and evaluation methods are not enough to ensure software security. Besides, recent years have seen probably the most continuous and extreme software security attacks ever recorded against organizations in an assortment of enterprises. Security is now a vast area, critical for business success. The existing metrics must be redefined, and new security metrics should be determined based on multiple measures to increase the reliability of the values. Due to the short cycles of iterative processes in the DevOps method, feedback must come quickly, so measurement should be automated and continuous. Due to the massive amount of information, the results must be visualized at a suitable level of abstraction, which may differ for the various stakeholders. In this paper, we propose a unique metrics-driven approach to help improve software engineering processes by increasing the quality, adaptability and security of software while decreasing costs and time-to-market.

Paper Nr: 92
Title:

Emerging Design Patterns for Blockchain Applications

Authors:

Vijay Rajasekar, Shiv Sondhi, Sherif Saad and Shady Mohammed

Abstract: Blockchain, or Distributed Ledger Technology (DLT), introduces a new computing paradigm that is viewed by experts as a disruptive and revolutionary technology. While Bitcoin is the most well-known successful application of blockchain technology, many other applications and sectors could successfully utilize the power of blockchain. The potential applications of blockchain beyond finance and banking have encouraged many organizations to integrate and adopt blockchain into existing or new software systems. Integrating and using any new computing paradigm is expected to affect the best practices and design principles of building software systems. This paper summarizes our ongoing research on collecting, categorizing and understanding existing software design patterns for building blockchain-based software systems. It collects and categorizes the existing software (design and architectural) patterns that are commonly linked to blockchain and distributed ledger technology. We provide an informal analysis of the identified patterns to highlight their maturity. Finally, we discuss the current research gap in software engineering for blockchain-based applications and propose potential research directions.

Area 3 - Software Systems and Applications

Full Papers
Paper Nr: 9
Title:

A Function Dependency based Approach for Fault Localization with D*

Authors:

Arpita Dutta and Rajib Mall

Abstract: We present a scheme for hierarchically localizing software faults. First, functions are prioritized based on their suspiciousness of containing a fault. Then, the bug is localized within the suspected functions at the specific statement level. In our approach, a new function dependency graph is proposed, based on which function prioritization is performed. In order to differentiate between functions with equal suspiciousness values, function complexity metrics are considered. We propose two different dependency-edge weighting techniques, viz., the Distribution Specified Normalization (DSN) method and the Highest Weight Normalization (HWN) method. These techniques help measure the relevance of an edge in propagating a fault. We use the spectrum-based fault localization (SBFL) technique DStar (D*) to localize bugs at the statement level. We also extend our approach to localize faults in multiple-fault programs. Our experimental results show that the DSN and HWN scoring schemes reduce the number of statements examined by 43.65% and 38.88%, respectively, compared to the well-accepted SBFL technique DStar (D*).
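As background, the statement-level step relies on the DStar (D*) suspiciousness formula from the SBFL literature; a minimal sketch with the commonly used exponent of 2 is shown below. The toy coverage data and statement names are purely illustrative, not from the paper.

```python
# DStar (D*) suspiciousness: ncf^* / (ncs + nf - ncf), where
#   ncf = failed tests covering the statement,
#   ncs = passed tests covering it,
#   nf  = total number of failed tests.

def dstar(ncf, ncs, nf, star=2):
    denom = ncs + (nf - ncf)  # passed coverage + failed tests missing it
    return float('inf') if denom == 0 else ncf ** star / denom

# coverage[s] = set of tests executing statement s; test 0 passed, 1 and 2 failed
coverage = {"s1": {0, 1, 2}, "s2": {1, 2}, "s3": {0}}
passed = {0}
nf = 2
scores = {s: dstar(len(t - passed), len(t & passed), nf)
          for s, t in coverage.items()}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # ['s2', 's1', 's3']
```

Statements covered only by failing tests (here s2) get an infinite score and are examined first, which is the behavior the edge-weighting schemes above refine.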

Paper Nr: 17
Title:

Fuzzy Multi-objective Optimization for Ride-sharing Autonomous Mobility-on-Demand Systems

Authors:

Rihab Khemiri and Ernesto Exposito

Abstract: In this paper, we propose a novel three-phase fuzzy approach to optimize dispatching and rebalancing for Ride-sharing Autonomous Mobility-on-Demand (RAMoD) systems, i.e., fleets of self-driving vehicles that provide on-demand transportation and allow several customers to share the same vehicle at the same time. We first introduce a new multi-objective possibilistic linear programming (MOPLP) model for the dispatching and rebalancing problem in RAMoD systems, considering the imprecise nature of customer requests as well as two conflicting objectives simultaneously, namely improving customer satisfaction and minimizing transportation costs. Then, after transforming this possibilistic programming model into an equivalent crisp multi-objective linear programming (MOLP) model, the Goal Programming (GP) approach is used to provide an efficient compromise solution. Finally, computational results show the practicality and tractability of the proposed model as well as the solution methodology.

Paper Nr: 18
Title:

Generic GA-PPI-Net: Generic Evolutionary Algorithm to Detect Semantic and Topological Biological Communities

Authors:

Marwa Ben M’Barek, Amel Borgi, Sana Ben Hmida and Marta Rukoz

Abstract: Community detection aims to identify topological structures and discover patterns in complex networks. It is an important problem of great significance in many fields. In this paper, we are interested in the detection of communities in biological networks. These networks represent protein-protein or gene-gene interactions, which correspond to sets of proteins or genes that collaborate in the same cellular function. The goal is to identify such semantic and/or topological communities from gene annotation sources such as the Gene Ontology. We propose a Genetic Algorithm (GA) based technique as a clustering approach to detect communities in biological networks. For this purpose, we introduce four specific components to the GA: a fitness function based on a similarity measure and the interaction value between proteins or genes, a representation of a community with dynamic size, a heuristic crossover to strengthen links within communities, and a specific mutation operator. Experimental results show the ability of our Genetic Algorithm to detect communities of genes that are semantically similar and/or interacting.
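A hypothetical sketch of the kind of fitness function the abstract describes, mixing a semantic similarity term with a topological interaction term over the gene pairs of a candidate community. The similarity/interaction tables, the weight `alpha`, and the gene names are illustrative assumptions, not the paper's actual formulation.

```python
from itertools import combinations

def fitness(community, similarity, interaction, alpha=0.5):
    """Average pairwise semantic similarity and interaction strength,
    combined with weight alpha. community: a set of gene identifiers."""
    pairs = list(combinations(sorted(community), 2))
    if not pairs:  # singleton communities carry no pairwise signal
        return 0.0
    sem = sum(similarity.get(p, 0.0) for p in pairs) / len(pairs)
    top = sum(interaction.get(p, 0.0) for p in pairs) / len(pairs)
    return alpha * sem + (1 - alpha) * top

# made-up annotation similarities and interaction values
similarity = {("g1", "g2"): 0.9, ("g1", "g3"): 0.2, ("g2", "g3"): 0.1}
interaction = {("g1", "g2"): 1.0, ("g2", "g3"): 1.0}
print(fitness({"g1", "g2"}, similarity, interaction))  # ≈ 0.95
```

A GA would evolve variable-size communities and rank them by this score, rewarding groups that are both semantically coherent and densely interacting.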

Paper Nr: 43
Title:

Bee-route: A Bee Algorithm for the Multi-objective Vehicle Routing Problem

Authors:

Jamila Sassi, Ines Alaya and Moncef Tagina

Abstract: The vehicle routing problem has attracted a lot of interest over many decades because of its wide range of applications to real-life problems. This paper aims to test the efficiency and capability of bee colony optimization for this kind of problem. We present the Bee-route algorithm: a multi-objective artificial Bee Colony algorithm for the Vehicle Routing Problem with Time Windows. We have performed our experiments on well-known benchmarks from the literature to compare our proposed algorithm's results with those of other state-of-the-art algorithms.

Paper Nr: 45
Title:

A Bee Colony Optimization Algorithm for the Long-Term Car Pooling Problem

Authors:

Mouna Bouzid, Ines Alaya and Moncef Tagina

Abstract: Recently, the large number of vehicles on roadways and the rising use of private cars have created serious traffic congestion problems in large cities around the world. Severe traffic congestion can have many detrimental effects, such as time loss, air pollution, increased fuel consumption and energy waste. Public transportation systems have the capacity to decrease traffic congestion and be an answer to this increasing transport demand. However, they cannot be the only solution. Another recommended solution for reducing the harmful factors leading to such problems is car pooling. It is a collective transportation system based on the idea that a person shares their private vehicle with one or more people who have the same travel destination. In this paper, a Bee Colony Optimization (BCO) metaheuristic is used to solve the Car Pooling Problem. The BCO model is based on the collective intelligence shown in bee foraging behavior. The proposed algorithm is experimentally tested on benchmark instances of different sizes. Computational results show the effectiveness of our proposed algorithm when compared to several state-of-the-art algorithms.

Paper Nr: 53
Title:

Superlinear and Bandwidth Friendly Geo-replication for Store-and-forward Systems

Authors:

Daniel Brahneborg, Wasif Afzal, Adnan Čaušević and Mats Björkman

Abstract: To keep internet-based services available despite inevitable local internet and power outages, their data must be replicated to one or more other sites. For most systems using the store-and-forward architecture, data loss can also be prevented by using end-to-end acknowledgements. So far, we have not found any sufficiently good solutions for replicating data in store-and-forward systems without acknowledgements and with geographically separated system nodes. We therefore designed a new replication protocol, which takes advantage of the lack of a global order between messages and accepts a slightly higher risk of duplicated deliveries than existing protocols. We tested a proof-of-concept implementation of the protocol for throughput and latency in a controlled experiment using 7 nodes in 4 geographically separated areas, and observed throughput increasing superlinearly with the number of nodes, up to almost 3500 messages per second. It is also, to the best of our knowledge, the first replication protocol whose bandwidth usage scales with the number of nodes allowed to fail rather than the total number of nodes in the system.

Paper Nr: 54
Title:

On the Improvement of R-TNCESs Verification using Distributed Cloud-based Architecture

Authors:

Choucha C. Eddine, Mohamed O. Ben Salem, Mohamed Khalgui, Laid Kahloul and Naima S. Ougouti

Abstract: Reconfigurable discrete event control systems (RDECSs) are complex and critical systems, motivating the use of formal verification. This verification consists of two major steps: state space generation and state space analysis, both of which are usually expensive in terms of computation time and memory. This paper deals with state space generation (accessibility graph generation) during the verification of RDECSs modeled with reconfigurable timed net condition/event systems (R-TNCESs). We aim to improve the model checking used for formal verification of RDECSs by proposing a new approach to state space generation that considers similarities. In this approach, we introduce the modularity concept for verifying systems by constructing their accessibility graphs incrementally. Furthermore, we set up an ontology-based history to deal with similarities between two or several systems by reusing the state spaces of similar components computed during previous verifications. A distributed cloud-based architecture is proposed to perform parallel computation in order to control verification time and memory occupation. The paper's contribution is applied to a benchmark production system. The proposed approach is evaluated by measuring the temporal complexity of several large-scale system verifications. The results show the relevance of this approach.

Paper Nr: 68
Title:

Verifying the Application of Security Measures in IoT Software Systems with Model Learning

Authors:

Sébastien Salva and Elliot Blot

Abstract: Most of today’s software systems log the events that occur during their operation. Such logs are particularly useful for auditing security over time, but their growing size and lack of abstraction make them difficult to interpret manually. This paper proposes an approach combining model learning and model checking to help audit the security of IoT software systems. The approach takes as inputs an event log and generic security measures described with LTL formulas. It generates one formal model for every component of an IoT system and helps auditors make the security measures concrete in order to check whether the models satisfy them. The LTL formula instantiation is semi-automatically performed by means of an expert system and inference rules that encode expert knowledge, which can be applied again to the same kind of systems with less effort. We evaluate our approach on 3 IoT systems against 11 security measures provided by the European ENISA institute.

Paper Nr: 70
Title:

An in-Depth Requirements Change Evaluation Process using Functional and Structural Size Measures in the Context of Agile Software Development

Authors:

Hela Hakim, Asma Sellami and Hanêne Ben Abdallah

Abstract: The Agile methodology known as Scrum is increasingly used in software development as a response to the challenges of managing frequent requirements changes. However, a number of agile-based projects yield unsatisfactory results, mainly because of the lack of a well-defined change evaluation process. In fact, such a process should be set up early in the Software Life-Cycle (SLC). This paper proposes an in-depth evaluation process for requirements changes affecting either an ongoing sprint or an implemented sprint. This evaluation process involves two levels of detail: a functional change level and a structural change level, based respectively on the COSMIC functional size measurement method (ISO 19761) and the Structural Size Measurement (SSM) method. We investigate the use of both the COSMIC FSM and SSM methods for rapid and detailed evaluation-based measures of a requirements change request.
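As background on the functional level, COSMIC (ISO 19761) sizes a functional process by counting its data movements, each Entry (E), Exit (X), Read (R) or Write (W) contributing one COSMIC Function Point (CFP). A minimal sketch follows; the example change-request process is an invented illustration, not taken from the paper.

```python
# COSMIC sizing sketch: 1 CFP per data movement of type E, X, R or W.
COSMIC_TYPES = {"E", "X", "R", "W"}

def cosmic_size(movements):
    """movements: list of (movement_type, data_group) tuples."""
    assert all(t in COSMIC_TYPES for t, _ in movements)
    return len(movements)  # each data movement counts as 1 CFP

# e.g. a hypothetical 'handle change request' process: receive the
# request, read the sprint backlog, write the updated backlog,
# and send a confirmation back to the user
process = [("E", "change request"), ("R", "sprint backlog"),
           ("W", "sprint backlog"), ("X", "confirmation")]
print(cosmic_size(process), "CFP")  # 4 CFP
```

Comparing the CFP of a requirement before and after a change gives the rapid, functional-level impact figure; the SSM then refines it structurally.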

Paper Nr: 74
Title:

A Real-time Integration of Semantics into Heterogeneous Sensor Stream Data with Context in the Internet of Things

Authors:

Besmir Sejdiu, Florije Ismaili and Lule Ahmedi

Abstract: Recently, billions of Internet of Things (IoT) devices, including sensors, have been continuously producing sensed data as streams and transmitting these data to centralized servers. Due to the dramatic increase in streaming data, managing and exploiting such data has become increasingly important, and integrating semantics into sensor stream data in real time remains difficult. This research focuses on the real-time integration of semantics into heterogeneous sensor stream data with context in the IoT. In this context, an IoT real-time air quality monitoring system and different semantic annotations are developed for sensor stream data in the format of the Sensor Observation Service (SOS).

Paper Nr: 87
Title:

Temporal Convolutional Networks for Just-in-Time Software Defect Prediction

Authors:

Pasquale Ardimento, Lerina Aversano, Mario L. Bernardi and Marta Cimitile

Abstract: Defect prediction and estimation techniques play a significant role in software maintenance and evolution. Recently, several research studies have proposed just-in-time techniques to predict defective changes. Such prediction models let developers check and fix defects just at the time they are introduced (commit level). Nevertheless, early prediction of defects is still a challenging task that needs to be addressed, and its performance can still be improved. To address this issue, this paper proposes an approach exploiting a large set of features corresponding to source code metrics detected from the commit history of software projects. In particular, the approach uses deep temporal convolutional networks to make the fault prediction. The evaluation is performed on a large dataset concerning four well-known open-source projects and shows that, under certain considerations, the proposed approach has effective defect proneness prediction ability.
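For readers unfamiliar with temporal convolutional networks, a minimal sketch of the causal, dilated 1-D convolution they are built from: each output depends only on the current and past inputs, which is what lets commit-history metric series be modelled without looking into the future. The kernel weights and input series are made up; this is not the paper's actual network.

```python
def causal_dilated_conv(x, weights, dilation=1):
    """x: input series; weights: kernel taps, newest sample first."""
    out = []
    for t in range(len(x)):
        acc = 0.0
        for k, w in enumerate(weights):
            idx = t - k * dilation  # strictly current or past samples
            if idx >= 0:
                acc += w * x[idx]
        out.append(acc)
    return out

x = [1.0, 2.0, 3.0, 4.0]
print(causal_dilated_conv(x, [0.5, 0.5], dilation=1))
# [0.5, 1.5, 2.5, 3.5] -- each output averages the current and previous sample
```

Stacking such layers with growing dilations (1, 2, 4, ...) gives the exponentially large receptive field over commit history that deep TCNs exploit.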

Paper Nr: 97
Title:

3D Mobility, Resizing and Mobile Sink Nodes in Reconfigurable Wireless Sensor Networks based on Multi-agent Architecture under Energy Harvesting Constraints

Authors:

Hanene Rouainia, Hanen Grichi, Laid Kahloul and Mohamed Khalgui

Abstract: This paper deals with reconfigurable wireless sensor networks (RWSNs) composed of sets of sensor nodes that monitor the physical and chemical conditions of the environment. RWSNs dynamically adapt their behavior to their environment. The main challenge in an RWSN is to keep the network alive as long as possible. We apply a set of solutions to energy problems by using 3D mobility, resizing and mobile sink nodes. These solutions are based on a multi-agent architecture employing a wireless communication protocol. Moreover, we develop an application named RWSNSim that allows us to simulate an RWSN and apply the proposed solutions. The performance of the proposed approach is demonstrated through a case study: the surveillance of fire in a forest, simulated with the RWSNSim application.

Short Papers
Paper Nr: 11
Title:

Tool Support for Green Android Development: A Systematic Mapping Study

Authors:

Iffat Fatima, Hina Anwar, Dietmar Pfahl and Usman Qamar

Abstract: In order to make mobile apps energy efficient, we must find ways to support energy-efficient app development. However, there is a lack of support tools that aid practitioners in moving towards green Android development. Our goal is to establish the state of the art with respect to support tools that aid green Android development and to identify opportunities for further research. To achieve this goal, we conducted a systematic mapping study. After applying inclusion, exclusion and quality criteria, we selected 21 studies for further analysis. Current support tools that aid green Android development were classified into three categories: Profiler, Detector and Optimizer. Most Profiler tools provide a graphical representation of the energy consumed over time at various levels. Most Detector tools provide a list of energy bugs/code smells to be manually corrected by a developer to improve energy efficiency. Most Optimizer tools automatically generate refactored version(s) of APK/SC. The most typical technique used by Detector and Optimizer tools is static source code analysis using a predefined set of rules. Profiler tools use a wide range of techniques to measure energy consumption. However, these tools have limitations in terms of code smell/energy bug coverage, accuracy, and usability.

Paper Nr: 35
Title:

Developer Driven Framework for Security and Privacy in the IoMT

Authors:

Ceara Treacy, John Loane and Fergal McCaffery

Abstract: The Internet of Medical Things (IoMT) is a fast-growing domain as healthcare moves out of structured health services into care in the community. As a result, the sensitive personal and health data associated with the IoMT can potentially flow through a diversity of apps, systems, devices and technologies, and public and open networks. This exposes data in the IoMT to additional attack surfaces, which requires hardening the security and privacy of the data. Accordingly, the data are bound by regulatory safety, security and privacy requirements. Applying the regulatory compliance requirements is a struggle for developers in small to medium enterprises due to a lack of knowledge, experience and understanding. This paper proposes a framework, directed at developers in small to medium enterprises, to assist in meeting regulatory compliance for the security and privacy of data in flow in the IoMT. The framework considers both security and privacy properties for the protection of data in flow in the IoMT, and expands the established threat modeling steps to cover both. To mitigate the identified security and privacy threats, the framework includes a set of categorised technical security and privacy controls developed from medical device security standards. The originality of this framework lies in the inclusion of security and privacy requirements in the extension of the traditional threat modeling process, as well as in the security and privacy controls embedded in the medical security standards.

Paper Nr: 42
Title:

New Approach for Deadline Calculation of Periodic, Sporadic and Aperiodic Real-time Software Tasks

Authors:

Aicha Goubaa, Mohamed Khalgui, Georg Frey and Zhiwu Li

Abstract: A real-time system must react to events from the controlled environment while executing specific tasks that can be periodic, aperiodic or sporadic. These tasks can be subject to a variety of temporal constraints, the most important of which is the deadline; a reaction occurring too late may be useless or even dangerous. In this context, the main problem of this study is how to configure a feasible real-time system having periodic, aperiodic and sporadic tasks together. In this paper, we propose a new off-line approach that configures a feasible schedule for a combination of software real-time tasks, serving aperiodic tasks without jeopardizing the schedulability of the periodic and sporadic ones.
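As a point of reference, off-line feasibility configuration of this kind typically builds on classical schedulability tests. Below is a sketch of the textbook utilization test for preemptive EDF with periodic tasks and implicit deadlines (deadline equal to period); the task set is an illustrative assumption, not taken from the paper, which handles the richer mixed periodic/aperiodic/sporadic case.

```python
def edf_feasible(tasks):
    """Classical EDF test: a periodic task set with deadlines equal to
    periods is schedulable iff total utilization sum(C_i / T_i) <= 1.
    tasks: list of (wcet, period) pairs."""
    utilization = sum(c / t for c, t in tasks)
    return utilization <= 1.0

tasks = [(1, 4), (2, 6), (1, 8)]   # (C_i, T_i), made-up values
print(edf_feasible(tasks))          # True: U = 1/4 + 2/6 + 1/8 < 1
```

Slack left by such a test (here roughly 29% of the processor) is what an off-line configuration step can budget for serving aperiodic arrivals.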

Paper Nr: 48
Title:

Towards a Better Understanding of Genetic Operators for Ordering Optimization: Application to the Capacitated Vehicle Routing Problem

Authors:

S. Ben Hamida, R. Gorsane and K. G. Mestiri

Abstract: Genetic Algorithms (GA) have long been used for ordering optimization problems, with considerable effort devoted to improving their exploration and exploitation abilities. A great number of GA implementations have been proposed, varying from GAs applying simple or advanced variation operators to hybrid GAs combined with different heuristics. In this work, we propose a short review of genetic operators for ordering optimization, with a classification according to the information used in the reproduction step. Crossover operators can be position ("blind") operators or heuristic operators; mutation operators can be applied randomly or using local optimization. After studying the contribution of each class on solving two benchmark instances of the Capacitated Vehicle Routing Problem (CVRP), we explain how to combine the variation operators to simultaneously allow better exploration of the search space and higher exploitation. We then propose random and balanced hybridizations of the operators' classes. The hybridization strategies are applied to solve 24 CVRP benchmark instances. Results are analyzed and compared to demonstrate the role of each class of operators in the evolution process.
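As an illustration of the position ("blind") operator class named in the abstract, the following is a minimal sketch, not the authors' implementation, of order crossover (OX), a classic blind crossover for permutation-encoded tours such as CVRP routes:

```python
import random

def order_crossover(p1, p2, rng=random):
    """Order crossover (OX): copy a random slice from parent 1, then
    fill the remaining positions with parent 2's genes in their
    relative order. A 'blind' operator: it uses only positions,
    no problem-specific heuristic information."""
    n = len(p1)
    a, b = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[a:b + 1] = p1[a:b + 1]              # inherited slice from parent 1
    kept = set(child[a:b + 1])
    fill = iter(g for g in p2 if g not in kept)  # parent 2 order, minus slice
    for i in range(n):
        if child[i] is None:
            child[i] = next(fill)
    return child
```

A heuristic crossover would differ precisely in the fill step, choosing the next customer by, for example, distance rather than by parent 2's ordering.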

Paper Nr: 58
Title:

Exploiting Exclusive Higher Resolution to Enhance Response Time of Embedded Flash Storage

Authors:

Jung-Hoon Kim and Young-Sik Lee

Abstract: NAND flash-based embedded storage may spend a long time responding to a host storage system. Most flash translation layers (FTLs) of embedded flash storage use fine-grained page-level mapping. However, they pay little heed to the page-mapping management that causes the internal overhead of the page-level FTL. This overhead can degrade the response time, especially after random writes to the embedded flash storage. In this paper, we propose a novel method to reduce the internal overhead related to page-mapping writes. This method applies a virtually shrunk segment exclusively to the page-mapping table, implemented by our mapping-segmented flash translation layer (MSFTL). One mapping segment is composed of consecutive page mappings and is smaller in size than a logical page of the host system. As a result, MSFTL drastically reduces the amount of page-mapping data written and therefore improves both the average and worst-case response time compared with fine-granularity page-level FTLs.
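The abstract's key idea, flushing a small mapping segment instead of a whole mapping page, can be sketched as follows (hypothetical code and constants of our own, not MSFTL itself):

```python
# Toy model: the logical-to-physical page table is split into small
# segments; an update dirties one segment, and a flush writes back
# only dirty segments rather than full 4 KiB mapping pages.
PAGE_BYTES = 4096        # one logical page of the host (assumption)
ENTRY_BYTES = 4          # one 32-bit physical page number per entry
SEG_ENTRIES = 32         # segment much smaller than a page (assumption)

class SegmentedMapping:
    def __init__(self, n_pages):
        self.table = [0] * n_pages
        self.dirty_segments = set()
        self.bytes_flushed = 0

    def update(self, lpn, ppn):
        """Remap logical page lpn to physical page ppn."""
        self.table[lpn] = ppn
        self.dirty_segments.add(lpn // SEG_ENTRIES)

    def flush(self):
        """Persist only the dirty segments (vs. PAGE_BYTES per
        touched mapping page in a plain page-level FTL)."""
        self.bytes_flushed += len(self.dirty_segments) * SEG_ENTRIES * ENTRY_BYTES
        self.dirty_segments.clear()
```

Under these toy constants, a single random write costs a 128-byte segment write-back instead of a 4096-byte mapping-page write, which is the effect the abstract attributes to MSFTL.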

Paper Nr: 64
Title:

Blockchain Project Initiation and Execution: Decision-making Practices and Perceptions

Authors:

Bolatzhan Kumalakov and Yassynzhan Shakan

Abstract: Blockchain promises to revolutionise the way data management is perceived by business entities. Nonetheless, we know little about how to decide which data to protect such that the added value exceeds technology introduction and ownership costs. This paper presents our attempt to approach the issue via an international online survey conducted in Kazakhstan, Kyrgyzstan and Russia in late 2018 and early 2019. The paper contributes to the body of knowledge by establishing that, to date, blockchain introduction is de facto an unguided process. Despite multiple efforts to come up with a decision framework, real-world projects are initiated with little, if any, guidance on potential costs and benefits.

Paper Nr: 67
Title:

Fault Detection and Co-design Recovery for Complex Task within IoT Systems

Authors:

Radia Bendimerad, Kamel Smiri and Abderrazek Jemai

Abstract: The Internet of Things (IoT) allows the implementation of embedded devices that support multiple application processing. A device contains not only a processor core (CPU) but also a hardware accelerator (FPGA), permitting hardware-software (HW-SW) partitioning depending on application complexity. However, these devices have limited capacities, so they are exposed to faults. Thus, the system needs to be adapted so that it detects device faults, continues to function normally and performs its tasks to produce correct output. In this paper, a holistic approach to fault detection and recovery for complex tasks within the IoT is outlined. It offers a technique to diagnose the state of processing elements, and then combines task scheduling in HW and SW parts (co-design) in such a manner as to ease the recovery process when a system fault is detected. The proposed work ensures better performance for the entire system. An experimental study validates the effectiveness of the present strategy without impacting system performance, thanks to the contributions defined in this paper.

Paper Nr: 69
Title:

Towards a Data Science Framework Integrating Process and Data Mining for Organizational Improvement

Authors:

Andrea Delgado, Adriana Marotta, Laura González, Libertad Tansini and Daniel Calegari

Abstract: Organizations face many challenges in obtaining information and value from data for the improvement of their operations. For example, business processes are rarely modeled explicitly, and their data is coupled with business data and implicitly managed by the information systems, hindering a process perspective. This paper proposes a framework that integrates process and data mining techniques and algorithms, process compliance, data quality, and adequate tools to support evidence-based process improvement in organizations. It aims to reduce the effort of identifying and applying techniques, methodologies, and tools in isolation for each case, providing an integrated approach to guide each operative phase, which will expand the capabilities of analysis, evaluation, and improvement of business processes and organizational data.

Paper Nr: 85
Title:

Using Blockchain to Implement Traceability on Fishery Value Chain

Authors:

Estrela F. Cruz and António M. Rosado da Cruz

Abstract: Nowadays, consumers increasingly want to be informed about the products they are buying or consuming, especially when it comes to food, such as fish. Besides nutritional information, consumers want to know the fish's origin, whether it has been properly stored and transported, etc. At the same time, for public health reasons, authorities may need to know the current location of certain fish lots (which have been caught or produced in a specific location, stored in a certain place, transported by a certain truck, etc.). In other words, consumers and society in general demand transparency throughout the entire value chain of fish products. In this paper, we propose a blockchain-based platform that allows fish lots to be traced, back and forth, throughout the entire fisheries value chain. To implement the traceability platform, we define a smart contract to be used on the Ethereum blockchain.

Paper Nr: 98
Title:

Towards Quantitative Trade-off Analysis in Goal Models with Multiple Obstacles using Constraint Programming

Authors:

Christophe Ponsard and Robert Darimont

Abstract: Goal models capture system goals and their decomposition into operational requirements assigned to human, hardware or software agents. This refinement process supports alternatives, both when refining goals and when reasoning about and refining obstacles to goals. This leads to a large design space to explore in order to select a specific solution fulfilling a set of non-functional requirements (e.g. reliability, security, performance) or business goals (e.g. costs, satisfaction). This paper investigates how optimisation techniques can be used to efficiently explore the design space where multiple objectives have to be met simultaneously. This work extends previous work by allowing one not only to select a single alternative but also to combine different alternatives to produce a more robust design. In order to explore the potentially very large design space, we show how to translate a model with many goal and obstacle alternatives, expressed in the KAOS notation, into a constraint programming (CP) problem. The OscaR.CP engine is then used to compute a set of Pareto-optimal solutions with respect to the targeted evaluation objectives. Our method is implemented as a tool plugin of a requirements engineering platform and is benchmarked on a security case study close to attack trees.
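The Pareto-optimal selection performed by the CP engine can be illustrated with a small, hypothetical filtering function of our own (all objectives minimised; this is not the OscaR.CP implementation):

```python
def pareto_front(solutions):
    """Keep the non-dominated solutions among tuples of objective
    values. A solution s is dominated if some other solution t is
    no worse on every objective and differs from s somewhere."""
    return [s for s in solutions
            if not any(t != s and all(o <= v for o, v in zip(t, s))
                       for t in solutions)]
```

For example, with objectives (cost, risk), the candidate (2, 2) is dominated by (1, 2) and dropped, while (1, 2) and (2, 1) are both kept: neither beats the other on every objective, which is exactly why a set of solutions, rather than a single optimum, is returned to the analyst.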

Paper Nr: 100
Title:

On Decomposing Formal Verification of CTL-based Properties on IaaS Cloud Environment

Authors:

Chams E. Choucha, Mohamed Ramdani, Mohamed Khalgui and Laid Kahloul

Abstract: This paper deals with reconfigurable discrete event/control systems (RDECSs), which dynamically change their structures due to external changes in the environment or user requirements. RDECSs are complex and critical. The verification of these systems continues to challenge experts in both academia and industry, since the generated state spaces are ever bigger and the properties to be verified more complex. Reconfigurable Timed Net Condition/Event Systems (R-TNCESs) have been proposed as an extension of the Petri net formalism for the optimal functional and temporal specification of RDECSs. Models of real systems can encompass millions of transitions, which implies huge state spaces and complex properties to be verified. To control the complexity and reduce the verification time, a new method for verifying CTL properties in a cloud-based architecture is proposed. The novelty consists in a new method for state space generation and in the decomposition of complex properties for running an efficient verification. An algorithm is proposed for incremental state space generation. A case study illustrates the impact of using this technique. The current results show the benefits of the paper's contribution.

Paper Nr: 23
Title:

Towards Secure Data Sharing Processes in Online Social Networks: Trusty

Authors:

Gulsum Akkuzu, Benjamin Aziz and Mo Adda

Abstract: Web 2.0 has developed remarkably in today's world. This development has also driven the growth of online social networks (OSNs): Web 2.0 is the foundation of online social networks, since they are built on it. Users are given an environment in which they can communicate with others regardless of their locations. Communication in OSNs happens via sharing various contents, such as photos, texts, and videos. Sharing data sometimes causes privacy problems in OSNs, especially when the content involves several users' information. Users are notified after the content is shared and are allowed to remove tags, but the content remains available on OSN platforms; users therefore find ways to punish other users by unfriending them, or they quit OSNs. However, both cases contradict the main purpose of OSNs. Considering the above issues, we develop a framework in which users' opinions are taken into account in the data-sharing process and, based on the final decision taken by the user who posts the content, a punishing or rewarding technique is applied. We also evaluate the proposed work with user interactions.

Paper Nr: 24
Title:

Big Data Streaming Platforms to Support Real-time Analytics

Authors:

Eliana Fernandes, Ana C. Salgado and Jorge Bernardino

Abstract: In recent years, data has grown exponentially due to the evolution of technology. The data flow circulates in a very fast and continuous way, so it must be processed in real time. Therefore, several big data streaming platforms have emerged for processing large amounts of data. Nowadays, companies have difficulties choosing the platform that best suits their needs. In addition, the information about the platforms is scattered and sometimes omitted, making it difficult for a company to choose the right platform. This work focuses on helping companies or organizations choose a big data streaming platform to analyze and process their data flow. We provide a description of the most popular platforms, such as Apache Flink, Apache Kafka, Apache Samza, Apache Spark and Apache Storm. To strengthen the knowledge about these platforms, we also describe their architectures, advantages and limitations. Finally, a comparison among big data streaming platforms is provided, using as attributes the characteristics that companies usually need most.

Paper Nr: 29
Title:

A Retrospective Study of Taxonomy based Testing using Empirical Data from a Medical Device Software Company

Authors:

Hamsini K. Rajaram, John Loane, Silvana T. MacMahon and Fergal Mc Caffery

Abstract: Software defects in medical devices have caused serious injuries and deaths to patients. Medical devices are facing an increasing number of U.S. Food and Drug Administration (FDA) recalls due to poor-quality software. Research studies suggest that defect taxonomies are powerful tools to prevent and control defects. Defect taxonomies have been used to improve software quality in the safety-critical, business and telecommunications domains. Defect taxonomies can be used in testing and are efficient at finding defects at the earliest possible stage of software development. This paper discusses taxonomy-based testing in medical device software (MDS) development. SW91 is a new defect taxonomy for health software developed by the Association for the Advancement of Medical Instrumentation. This paper details a retrospective study conducted to investigate taxonomy-based testing by mapping empirical data from an MDS company in Ireland to SW91 defects. It explains the process and shows the benefits of taxonomy-based testing, which include defect minimisation and root cause analysis. It provides recommendations which can be followed when using taxonomy-based testing. It also details interviews conducted with the CEO, developers and the quality assurance engineer from Company A. Finally, it briefly details how taxonomy-based testing will be implemented at an MDS company by applying a framework developed from this research.

Paper Nr: 90
Title:

An Approach to Train and Evaluate the Cybersecurity Skills of Participants in Cyber Ranges based on Cyber-Risk Models

Authors:

Gencer Erdogan, Åsmund Hugo, Antonio Á. Romero, Dario Varano, Niccolò Zazzeri and Anže Žitnik

Abstract: There is an urgent need for highly skilled cybersecurity professionals, and at the same time there is an awareness gap and a lack of integrated training modules on cybersecurity-related aspects at all school levels. In order to address this need and bridge the awareness gap, we propose a method to train and evaluate the cybersecurity skills of participants in cyber ranges based on cyber-risk models. Our method consists of five steps: create a cyber-risk model, identify risk treatments, set up the training scenario, run the training scenario, and evaluate the performance of participants. The target users of our method are the White Team and Green Team, who typically design and execute training scenarios in cyber ranges. The output of our method, however, is an evaluation report for the Blue Team and Red Team participants being trained in the cyber range. We have applied our method in three large-scale pilots from academia, transport, and energy. Our initial results indicate that the method is easy to use and comprehensible for training scenario developers (White/Green Team), develops cyber-risk models that facilitate real-time evaluation of participants in training scenarios, and produces useful feedback to the participants (Blue/Red Team) in terms of strengths and weaknesses regarding cybersecurity skills.

Paper Nr: 91
Title:

Application of Computer Vision Technologies for Automated Utility Meters Reading

Authors:

Maria Spichkova and Johan Van Zyl

Abstract: This paper presents a study on automated reading of utility meters using two computer vision solutions: an open-source one, Tensorflow Object Detection (Tensorflow), and a commercial one, Anyline. We aimed to identify the limitations and benefits of each solution applied to utility meter reading, focusing especially on aspects such as accuracy and inference time. Our goal was to determine the solution that is the most suitable for this particular application area, where there are several specific challenges.

Paper Nr: 95
Title:

RiverConc: An Open-source Concolic Execution Engine for x86 Binaries

Authors:

Ciprian Paduraru, Bogdan Ghimis and Alin Stefanescu

Abstract: This paper presents a new open-source testing tool capable of performing concolic execution on x86 binaries. Using this tool, one can discover, ahead of time, potential bugs that can enable threats such as process hijacking and stack buffer overflow attacks. Although a similar tool, SAGE, already exists in the literature, it is closed-source, and we think that using its description to implement an open-source version of its main novel algorithm, Generational Search, is beneficial to both the industry and research communities. This paper describes, in more detail than previous work, how the components at the core of a concolic execution tool, such as tracers, dynamic tainting mechanisms and SMT solvers, collaborate to ensure code coverage. It also briefly describes how reinforcement learning can be used to speed up state-of-the-art heuristics for input prioritization. Research opportunities and the technical difficulties that the authors observed during the development of the project are presented as well.
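As a rough illustration (our own toy code, not RiverConc), the child-generation step of Generational Search, as publicly described for SAGE, expands one executed path by flipping each branch condition past a bound while keeping the prefix:

```python
def expand(path, bound):
    """Generational Search expansion (toy form): from a recorded
    path of branch outcomes, create one child per branch at index
    >= bound by keeping the prefix and negating that branch; the
    flipped suffix is dropped because the next concrete run will
    re-derive it. Each child carries its own bound so descendants
    never re-flip an earlier branch, avoiding duplicate paths."""
    return [(path[:i] + [not path[i]], i + 1)
            for i in range(bound, len(path))]
```

In the full engine, each child path-condition prefix is handed to the SMT solver to synthesize a concrete input; the bound is what lets one execution seed a whole generation of new inputs instead of a single one, as in plain depth-first concolic search.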

Paper Nr: 101
Title:

Machine Learning Models for Automatic Labeling: A Systematic Literature Review

Authors:

Teodor Fredriksson, Jan Bosch and Helena H. Olsson

Abstract: Automatic labeling is a type of classification problem. Classification has been studied with the help of statistical methods for a long time. With the explosion of new, better central processing units (CPUs) and graphical processing units (GPUs), interest in machine learning has grown exponentially, and we can use both statistical learning algorithms and deep neural networks (DNNs) to solve classification tasks. Classification is a supervised machine learning problem, and a large amount of methodology exists for performing such tasks. However, it is very rare in industrial applications that data is fully labeled, which is why we need good methodology to obtain error-free labels. The purpose of this paper is to examine the current literature on how to perform labeling using ML; we compare these models in terms of popularity and the data types they are used on. We performed a systematic literature review of empirical studies of machine learning for labeling and identified 43 primary studies relevant to our search. From these we were able to determine the most common machine learning models for labeling. The lack of labeled instances is a major problem for industry, as supervised learning is the most widely used approach and obtaining labels is costly in terms of labor and financial costs. Based on our findings in this review, we present alternative ways of labeling data for use in supervised learning tasks.
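One widely known route to automatic labeling that reviews of this kind typically cover is self-training. The following is a hypothetical sketch of our own (names and the pluggable `fit`/`predict_conf` interface are assumptions, not from the paper): a model trained on the labeled pool pseudo-labels the unlabeled points it is most confident about, which then join the pool.

```python
def self_train(labeled, unlabeled, fit, predict_conf, threshold=0.9):
    """Iteratively grow the labeled pool with confident pseudo-labels.
    labeled: list of (x, y) pairs; unlabeled: list of x.
    fit(labeled) -> model; predict_conf(model, x) -> (label, confidence)."""
    labeled, unlabeled = list(labeled), list(unlabeled)
    while unlabeled:
        model = fit(labeled)
        scored = [(x,) + predict_conf(model, x) for x in unlabeled]
        confident = [(x, y) for x, y, conf in scored if conf >= threshold]
        if not confident:            # nothing certain enough: stop
            break
        labeled.extend(confident)    # pseudo-labels join the pool
        unlabeled = [x for x, y, conf in scored if conf < threshold]
    return labeled, unlabeled
```

In practice `predict_conf` would come from a classifier's probability estimates; points the model never becomes confident about are returned unlabeled, which is where the costly manual labeling the abstract mentions would still be needed.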