Plenary Speakers
Short bio
Dr. Isabela Drămnesc is an Associate Professor in the Department of Computational Sciences and Artificial Intelligence of the Faculty of Computer Science at the West University of Timișoara. She teaches subjects such as Introduction to Robotics, Logic and Functional Programming, Graph Theory and Combinatorics, Automated Theorem Proving, Algorithm Synthesis, and Mathematical Theory Exploration. Her research interests include artificial intelligence and robotics, logic and automated reasoning, mathematical theory exploration, and proof-based algorithm synthesis, verification, and transformation.
She received her PhD in Computer Science in 2013 from West University of Timișoara, under the supervision of Tudor Jebelean from Johannes Kepler University Linz, Austria. After her PhD, she was a postdoctoral researcher at West University of Timișoara and at INRIA Sophia Antipolis, France.
She has experience in various European projects: she served as project manager of the Erasmus+ higher-education partnership ARC: Automated Reasoning in the Class and is currently the project manager of the ongoing Erasmus+ project AiRobo: Artificial Intelligence based Robotics.
In 2020, she received the Mihail Ghermănescu Award for Mathematics and Computer Science from the West University of Timișoara at the sixth edition of the Gala Premiilor UVT.
Abstract
This talk focuses on the automated certification of various sorting algorithms in Theorema. The certification of QuickSort, Patience Sort, MinSort, MaxSort, and Min–Max Sort is carried out by developing the required theory, defining the algorithms in functional style, and generating the correctness proofs. Theorema supports highly automated proofs presented in a natural style, close to mathematical textbook reasoning. Specifications use multisets to express element preservation, and the correctness arguments rely on a generalized induction scheme based on a well-founded ordering on lists induced by strict multiset inclusion. A brief comparison with Coq — where certification typically proceeds via interactive scripts and additional supporting lemmas — highlights differences in automation, proof engineering effort, and proof readability.
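As a rough illustration of the kind of specification these certifications establish (a minimal Python sketch with illustrative names, not Theorema syntax), the output of a sorting function must be ordered and must preserve the input as a multiset; the shrinking recursive argument also mirrors the well-founded ordering on lists used in the induction:

```python
from collections import Counter  # Counter serves as a multiset here

def min_sort(xs):
    """Functional-style MinSort: extract a minimum, then sort the rest."""
    if not xs:
        return []
    m = min(xs)
    rest = list(xs)
    rest.remove(m)  # drop one occurrence only, keeping duplicates
    # 'rest' is a strict sub-multiset of 'xs', so the recursion is well-founded.
    return [m] + min_sort(rest)

def is_sorted(xs):
    return all(a <= b for a, b in zip(xs, xs[1:]))

def meets_spec(inp, out):
    """Correctness specification: 'out' is sorted and has exactly the
    same multiset of elements as 'inp' (element preservation)."""
    return is_sorted(out) and Counter(out) == Counter(inp)

assert meets_spec([3, 1, 2, 1], min_sort([3, 1, 2, 1]))
```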
Short bio
Dr. Zsolt Adam Balogh is an associate professor in the Department of Mathematical Sciences, United Arab Emirates University.
His research interests encompass abstract algebra, medical image processing, and deep learning, and he has more than ten years of industrial development experience in the field of computed tomography.
Abstract
A CT scanner is a non-invasive diagnostic device that acquires cross-sectional X-ray images, referred to as projections. These projections encode the photon attenuation properties of the scanned object, from which the photon attenuation coefficient at each point of the original object can be computed. However, X-ray radiation emitted by CT scanners poses health risks, although the exact magnitude of these risks is difficult to quantify precisely. For this reason, medical imaging follows a general principle known as ALARA, an acronym for as low as reasonably achievable.
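For context, each log-normalised projection is, in the standard monochromatic-beam textbook model, a line integral of the attenuation coefficient along the X-ray path, via the Beer–Lambert law:

$$ I = I_0 \exp\!\Big(-\int_L \mu(s)\,ds\Big), \qquad -\ln\frac{I}{I_0} = \int_L \mu(s)\,ds, $$

so reconstruction amounts to recovering the attenuation map $\mu$ from a collection of such line integrals.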
The ALARA principle serves as guidance for the development of imaging systems and associated reconstruction algorithms, with the goal of achieving the lowest possible radiation dose without compromising diagnostic accuracy.
This presentation focuses on software-based approaches that enable the development of low-dose CT systems, commonly referred to in the literature as Low-Dose CT (LDCT). The topic encompasses modern model-based iterative reconstruction technologies, sparse-view reconstruction problems, reconstruction challenges associated with reduced-size detector configurations, as well as both traditional and deep-learning–based denoising algorithms.
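As a toy sketch of the model-based iterative idea (assumed setup: a random stand-in projection operator and a made-up one-dimensional "image", not a real acquisition geometry or any specific vendor algorithm), a SIRT-style loop repeatedly corrects the estimate against the measured projections:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_rays = 64, 40            # fewer rays than unknowns: a "sparse-view" setting
A = rng.random((n_rays, n_pixels))   # stand-in for the projection (system) matrix
x_true = rng.random(n_pixels)        # stand-in for the attenuation map
p = A @ x_true                       # simulated projections, i.e. -ln(I/I0) data

# SIRT-style update: x <- x + C * A^T * R * (p - A x),
# with per-ray (R) and per-pixel (C) normalisations.
R = 1.0 / A.sum(axis=1)
C = 1.0 / A.sum(axis=0)
x = np.zeros(n_pixels)
for _ in range(500):
    x += C * (A.T @ (R * (p - A @ x)))

# With fewer rays than pixels the system is underdetermined, so the plain
# iteration only approximates x_true; priors, regularisation, or learned
# denoisers are what make low-dose / sparse-view reconstruction practical.
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```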
Short bio
Dr. Máté Tejfel received his PhD in 2009 from Eötvös Loránd University (ELTE), Hungary, where he is currently an associate professor in the Programming Languages and Compilers Department of the Faculty of Informatics. His research interests cover the development, analysis, and refactoring of domain-specific languages, the formal semantics of languages, and the verification of programs. He participated in the design and implementation of several analysis and refactoring tools such as RefactorErl and P4Query.
Abstract
P4 is a domain-specific language whose first version was introduced in 2014. It was created mainly for describing packet-processing algorithms executable on different kinds of network elements (switches, routers). Within the framework of a software technology lab, we created an analyzer/refactoring tool for P4 in the Programming Languages and Compilers Department. In this presentation, I would like to examine what can or cannot make a domain-specific language successful and share some of the key lessons we learned during the development of the tool.
Short bio
Dr. Zoltán Gál is a professor at the Faculty of Informatics, University of Debrecen, where he serves as head of the Department of Information Systems and Networks. An IEEE member, he has extensive experience in academic leadership and university-level IT and infrastructure development. His research and professional interests include high-speed communication networks, satellite–terrestrial integration, Quality of Service, Internet of Things systems, and distributed and parallel computing. He actively contributes to higher education, research coordination, and large-scale national and international ICT projects.
Abstract
Low Earth Orbit (LEO) satellite mega-constellations are becoming a cornerstone of global connectivity, enabling broadband services with latency and throughput characteristics comparable to terrestrial networks. Starlink, as the most extensively deployed LEO system, provides a unique real-world reference for evaluating the technological, architectural, and performance aspects of satellite-based communication services. This plenary presentation offers an integrated overview of Starlink LEO satellite services, positioned within the broader context of modern satellite systems and their convergence with terrestrial mobile networks. The presentation highlights the fundamental architectural principles of LEO constellations, including orbital design, multi-beam coverage, and inter-satellite networking, and examines their role in extending connectivity beyond the limits of ground-based infrastructure. Particular attention is given to the integration of satellite systems into 5G Non-Terrestrial Network frameworks, following recent 3GPP standardisation efforts. Key challenges related to spectrum usage, mobility, interference, and end-to-end Quality of Service are discussed in the context of hybrid terrestrial–satellite operation. The overview is complemented by research results derived from large-scale measurement campaigns and analytical studies of Starlink services, revealing spatial and temporal performance variations in latency and reliability. These findings provide insight into critical operating regimes of LEO systems and contribute to understanding their role in future integrated B5G and 6G network architectures.
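As a back-of-the-envelope illustration of why LEO constellations can target terrestrial-like latency (a rough propagation-only estimate, assuming a nominal shell altitude of about 550 km; real paths add routing, scheduling, and processing delays):

```python
# Propagation-delay comparison between LEO and GEO links.
C_KM_S = 299_792.458     # speed of light in km/s
LEO_ALT = 550.0          # assumed nominal Starlink shell altitude, km
GEO_ALT = 35_786.0       # geostationary altitude, km (for comparison)

def one_way_ms(distance_km: float) -> float:
    return distance_km / C_KM_S * 1e3

print(f"LEO one-way (satellite overhead): {one_way_ms(LEO_ALT):.2f} ms")        # ~1.8 ms
print(f"LEO user->sat->gateway->sat->user: {4 * one_way_ms(LEO_ALT):.1f} ms")   # ~7.3 ms
print(f"GEO one-way, for comparison: {one_way_ms(GEO_ALT):.1f} ms")             # ~119 ms
```

The contrast with the geostationary figure is the propagation-level reason LEO services can approach terrestrial latency, even before inter-satellite routing and processing are taken into account.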
Short bio
Dr. Sándor Király received his PhD from the University of Debrecen, Hungary. He is currently the head of the Department of Information Technology. His main research interests include digital image processing and the methodology of teaching programming and computer science subjects.
As a secondary school teacher, he has prepared students who won the Nemes Tihamér Contest on eleven occasions and the OKTV (National Secondary School Academic Competition) ten times. Four of his students have represented Hungary at the International Olympiad in Informatics (IOI) on six occasions, earning silver and bronze medals. He contributed to the development of the two-level national baccalaureate examination system in informatics introduced in 2005. Currently, he works as an expert in one of the pillars of the Ferenc Krausz Talent Development Programme, focusing on programming education and talent support in public education.
Abstract
This keynote examines the evolution of programming education in Hungary from the 1980s to the present, interpreting technological changes through a methodological and pedagogical lens rather than a purely historical one. Four major eras are distinguished: the pre-internet period, the spread of personal computers without internet access, the era shaped by widespread internet availability and standardized national examinations, and the current phase marked by the emergence of large language models (LLMs). The presentation analyzes how curricular content (algorithmic thinking versus application-oriented programming), instructional practices, assessment methods, and the role of competitions (such as the Nemes Tihamér Programming Contest and the National Secondary School Competition) evolved across these periods. Particular attention is given to the implicit and explicit learning goals of programming education, and to how certain competencies gained or lost prominence as technological conditions changed. In its concluding section, the keynote offers a normative reflection on programming education in the age of AI, arguing which learning processes and skills cannot be meaningfully delegated to automated systems. It outlines pedagogical responses needed to ensure that programming education continues to foster deep understanding, problem-solving skills, and talent development in an AI-supported learning environment.