DISIM New Faculty Seminars is a welcoming initiative for new members who have recently joined DISIM, offering them an opportunity to introduce themselves and summarize some salient aspects of their scientific activity. Details on the upcoming seminars are given below.
Roberto CIVINO, Algebraic tools for breaking ciphers

Abstract. The security of a private-key cryptosystem is sometimes hard to prove formally, and many standard constructions we rely on every day are only secure "until proven otherwise". The intriguing art of proving otherwise goes by the name of cryptanalysis. In this talk I will present some of the algebraic techniques that the cryptanalyst can use to detect undesired behaviors in symmetric ciphers, and I will show some positive and negative results.

Donato PERA, Numerical simulations of the 1461 and 1762 San Pio delle Camere (L'Aquila) earthquakes using a 3D physics-based model

Abstract. Abruzzo (central Italy) is a region of high seismic risk, having been hit by numerous earthquakes in the past, most recently in 2009. For many of these, however, which occurred in the pre-instrumental era, the available information is often lacking and/or inaccurate. Physics-based (PB) numerical simulations therefore represent a valuable tool to increase the understanding of the area, from both a geological and a historical point of view. In our work we propose a physics-based simulation of the two earthquakes that hit the area surrounding the city of L'Aquila in 1461 and 1762, with magnitudes 6.4 Mw and 6.0 Mw, respectively. Both events are placed, by the available literature, on the fault structure named San Pio delle Camere (AQ). The physical parameters characterizing each earthquake, such as the fault plane, epicenter, and magnitude, are kept fixed. Starting from these, three stochastic rupture scenarios are generated for each earthquake using three different slip distributions. The scenarios are evaluated on their ability to reproduce the macroseismic intensity field available from the historical catalogs. The simulated peak ground velocity values are then converted into macroseismic intensity through a suitable empirical relationship specifically derived for Italy.
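Empirical ground-motion-to-intensity relations of this kind typically take a log-linear form, I = a + b·log10(PGV). The sketch below illustrates only the conversion step; the coefficients are placeholders for illustration and are not those used in the study.

```python
import math

# Illustrative log-linear ground-motion-to-intensity conversion:
#   I = a + b * log10(PGV)
# The coefficients below are placeholders, NOT the ones used in the study.
A = 5.1   # intercept: predicted intensity at PGV = 1 cm/s
B = 2.4   # slope: intensity increase per decade of PGV

def macroseismic_intensity(pgv_cm_s: float) -> float:
    """Convert a simulated peak ground velocity (cm/s) to macroseismic intensity."""
    if pgv_cm_s <= 0:
        raise ValueError("PGV must be positive")
    return A + B * math.log10(pgv_cm_s)

# A tenfold increase in PGV raises the predicted intensity by B.
print(round(macroseismic_intensity(1.0), 2))   # 5.1
print(round(macroseismic_intensity(10.0), 2))  # 7.5
```

Applying such a relation to the simulated PGV field yields an intensity map that can be compared directly with the historical macroseismic catalogs.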
For the numerical simulations we used a three-dimensional soil model employed and validated in a previous study related to the 2009 L'Aquila earthquake. The considered slip distributions reproduce the macroseismic effect of the 1461 earthquake quite well, while none of the three scenarios satisfactorily reproduces the 1762 earthquake.

References:
- D. Pera, F. Di Michele, E. Stagnini, B. Rubino, R. Aloisio, P. Marcati. Numerical simulations of 1461 and 1762 San Pio delle Camere (L'Aquila) earthquakes using 3D physic-based model. Accepted for publication at the International Conference on Computational Science and its Applications 2023.
- P. Boncio, G. Lavecchia, B. Pace. Defining a model of 3D seismogenic sources for seismic hazard assessment applications: the case of central Apennines (Italy). Journal of Seismology 8, 407–425 (2004).

Emanuela RADICI, Approximation Methods for Existence and Regularity in Nonlinear Elasticity and Conservation Laws

Abstract. We present different approximation methods to study the existence and regularity of solutions of two problems of different nature: a variational problem describing the optimal elastic deformation of a medium with prescribed boundary conditions, and a time-dependent one in the form of a conservation law which models the evolution of a macroscopic density of agents avoiding congestion. Different techniques are borrowed from the theories of Calculus of Variations, Optimal Transport and Gradient Flows.

Juri DI ROCCO, Empowering Software Development with Robust Recommender Systems: Leveraging Open-Source Resources and Mitigating Data Biases

Abstract. The field of software development has become increasingly complex, driven by the diverse nature of its components, data sources, and tasks. Developers heavily rely on available resources, particularly open-source software (OSS) repositories, to carry out their daily activities.
These repositories provide a wealth of data sources, such as code snippets, documentation, and user discussions, which prove valuable in supporting development tasks. In our research, we have focused on building Recommender Systems (RSs) designed to offer developers various suggestions. These recommendations encompass areas such as third-party libraries, documentation on API utilization, and relevant API function calls, all of which aim to assist developers in their work. Furthermore, we have implemented multiple RSs within the context of Model-Driven Engineering (MDE). Our investigation revolves around the idea of leveraging insights from previous modeling experiences and providing modeling assistants that offer guidance and support during modeling tasks. Nowadays, many recommender systems rely on data-driven algorithms, triggering the need to build robust and resilient systems that can withstand malicious data injections and mitigate data biases, such as popularity bias. Through empirical observations, we have identified the significant impact of adversarial attacks and biased datasets on various state-of-the-art recommender systems in domains like API call, code snippet, and third-party library recommendation. These findings underscore the need for recommender systems in software engineering (RSSE) to employ robust defenses against adversarial attacks and to address data biases effectively. By doing so, we can ensure the reliability and accuracy of the recommendations provided by these systems.

Igor MELATTI, A Two-Layer Near-Optimal Strategy for Substation Constraint Management via Home Batteries

Abstract. Within electrical distribution networks, substation constraint management requires that the aggregated power demand from residential users be kept within suitable bounds. The efficiency of substation constraint management can be measured as the reduction of constraint violations with respect to unmanaged demand.
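As an illustration of this efficiency metric, the sketch below counts the bound violations of an aggregated demand profile and compares a managed profile against the unmanaged one. All names, bounds and demand values are hypothetical and chosen for illustration only.

```python
def count_violations(demand, lower, upper):
    """Count time steps where aggregated demand (kW) leaves the substation bounds."""
    return sum(1 for d in demand if d < lower or d > upper)

def efficiency(managed, unmanaged, lower, upper):
    """Fraction of the unmanaged profile's violations removed by the control strategy."""
    base = count_violations(unmanaged, lower, upper)
    if base == 0:
        return 1.0  # no violations to begin with: nothing to improve
    return 1.0 - count_violations(managed, lower, upper) / base

# Hypothetical aggregated demand profiles (kW) over six time steps,
# with substation bounds [10, 50] kW.
unmanaged = [60, 55, 40, 8, 45, 70]   # 4 violations
managed   = [50, 48, 40, 12, 45, 52]  # 1 violation (batteries shave the peaks)
print(efficiency(managed, unmanaged, lower=10, upper=50))  # 0.75
```

An efficiency of 1.0 means the managed demand never leaves the bounds; 0.0 means the control strategy removed no violations at all.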
Home batteries hold the promise of enabling efficient and user-oblivious substation constraint management. Centralized control of home batteries would achieve optimal efficiency. However, it is hardly acceptable to users, since service providers (e.g., utilities or aggregators) would directly control batteries at user premises. Unfortunately, devising efficient hierarchical control strategies that overcome this problem is far from easy. In this talk, I will present a recent work proposing a two-layer control strategy for home batteries that counteracts the issues discussed above. Namely, our control strategy avoids direct control of home devices by the service provider and at the same time yields near-optimal substation constraint management efficiency. The effectiveness of our approach with respect to a theoretically optimal centralized strategy has been evaluated by simulation on field data from 62 households in Denmark.

Phuong Thanh NGUYEN, Mining time-series data in open source software repositories with deep learning

Abstract. The proliferation of disruptive Machine Learning (ML) and especially Deep Learning (DL) algorithms has enabled a plethora of applications across several domains. Such techniques work on the basis of complex artificial neural networks, which are capable of effectively learning from data by means of a large number of parameters distributed across different network layers. We have studied and deployed various Machine Learning techniques in Software Engineering and other domains. In open source software repositories, there exist time-series artifacts that result from the interaction between developers and hosting platforms, e.g., the evolution of a software project on GitHub over time. In our recent work, we succeeded in conceptualizing recommender systems that provide upgrade plans for third-party libraries and API function calls.
The migration history of projects is mined to build matrices and train deep neural networks, whose weights and biases are used to forecast the subsequent versions of the related libraries, as well as relevant API calls. This showcases the potential of ML algorithms in mining time-series data in OSS ecosystems, including GitHub.

Guido DI PATRIZIO STANCHIERI, Integrated optical communication system for biotelemetry applications

Abstract. The research activity of Guido Di Patrizio Stanchieri will mainly concern the study, analysis, design, prototyping and characterization of electronic and photonic microsystems integrated in CMOS technology, such as ASICs (Application-Specific Integrated Circuits), SoCs (Systems-on-Chip), PICs (Photonic Integrated Circuits) and SiP (Silicon Photonics). His areas of expertise are integrated analog/digital microelectronics, optoelectronics and FPGA implementations. In particular, the main activity will concern the development of new integrated optoelectronic systems for data/signal processing and their optical communication (wireless and fiber) in biomedical applications, such as optical biotelemetry.

Luigi POMANTE, HEPSYCODE: a system-level methodology for HW/SW CO-DEsign of HEterogeneous Parallel dedicated SYstems

Abstract. Heterogeneous parallel architectures have recently been exploited for a wide range of application domains, for both general-purpose and dedicated electronic products. Such architectures include several processors (e.g., GPP, DSP, GPU, HW accelerators), memories, and a set of interconnections among them. A dedicated system is a digital electronic system with a custom HW/SW architecture designed to satisfy specific, a priori known application requirements. A dedicated system can be embedded in a more complex system and/or subjected to real-time constraints.
When dedicated systems are based on heterogeneous parallel architectures, they are so complex that the adopted design methodology plays a major role in determining the success of a product. Consequently, SW tools that support designers in managing the complexity of system development are of fundamental importance. Unfortunately, there are no general methodologies and toolchains defined for this purpose. In fact, very often, the best option is to rely on the indications of experienced designers, based on empirical criteria and qualitative assessments. In this context, this talk presents a methodology (with a related toolchain) for the system-level HW/SW co-design of heterogeneous parallel dedicated systems, called HEPSYCODE, able to support the development of such systems in different application domains. In the end, it is able to suggest an HW/SW partitioning of the application specification and a mapping of the partitioned entities onto an automatically defined heterogeneous parallel architecture. In particular, the talk illustrates the reference HW/SW co-design flow by describing its different steps and then focusing on the system-level design space exploration approach. Finally, the talk shows some reference examples and case studies.

Walter TIBERTI, Cybersecurity in the ICT world: hot topics, challenges and current research activities

Abstract. Despite being as old as computer science itself, Cybersecurity is often seen as a "new" and "modern" research area. In fact, only recently (in the last ~15 years) has the term "cybersecurity" acquired a meaning not strictly associated with illegal activities.
However, a dichotomy still exists: while research communities, conferences and peer-reviewed journals focus mostly on the ethical- or defence-related topics of cybersecurity, impactful cybersecurity advancements are brought by what could be called the "underground" scene, where the distinction between what is legal and what is not becomes blurred. Nevertheless, this scenario makes Cybersecurity a fast-evolving area, in which a single research topic or challenge can easily spread across multiple ICT domains. In this seminar, we will discuss the multi-disciplinary aspects of Cybersecurity, giving an overview of the security-related hot topics in the ICT areas, the research challenges and some of the ongoing research activities in our department.