At this year's Linux Plumbers Conference, our PhD student Benno Bielmeier presented his work on probabilistic real-time analyses to a broader audience. With his techniques, Benno aims to probabilistically predict the real-time behaviour of complex software systems. His talk attracted considerable interest from Linux kernel developers and concluded with a fruitful discussion with core kernel developers. Congrats, Benno!
Abstract
Ensuring temporal correctness of real-time systems is challenging. The level of difficulty is determined by the complexity of hardware, software, and their interaction. Real-time analysis on modern complex hardware platforms with modern complex software ecosystems, such as the Linux kernel with its userland, is hard or almost impossible with traditional methods like formal verification or real-time calculus. We need new techniques and methodologies to analyse real-time behaviour and validate real-time requirements.
In this talk, we present a toolkit designed to evaluate the probabilistic Worst-Case Execution Time (pWCET) of real-time Linux systems. It utilises a hybrid combination of traditional measurement-based and model-based techniques to derive execution time distributions considering variability and uncertainty in real-time tasks. This approach provides an assessment of execution time bounds and supports engineers in achieving fast and robust temporal predictions of their real-time environments.
Our framework models runtime behaviour and predicts WCET in a streamlined four-phase process: (1) model relevant aspects of the system as a finite automaton, (2) instrument the system and measure latencies within the model, (3) generate a stochastic model based on semi-Markov chains, and (4) calculate pWCET via extreme value statistics. This method is applicable across system context boundaries without being tied to specific platforms, infrastructure or tracing tools.
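To illustrate the final phase, here is a minimal sketch (our own illustration with hypothetical data, not the framework's actual code) of how a probabilistic WCET bound can be obtained from measured transition latencies via extreme value statistics: block maxima of the observations are fitted to a generalised extreme value distribution, and a high quantile of that distribution serves as the pWCET estimate.

```python
import numpy as np
from scipy import stats

# Hypothetical data: observed latencies (microseconds) of one automaton
# transition, e.g. extracted from timestamped trace events.
rng = np.random.default_rng(seed=1)
samples = rng.gamma(shape=4.0, scale=25.0, size=100_000)

# Block-maxima approach from extreme value theory: keep only the maximum
# latency of each block of consecutive observations.
block_size = 1_000
maxima = samples.reshape(-1, block_size).max(axis=1)

# Fit a generalised extreme value (GEV) distribution to the block maxima.
c, loc, scale = stats.genextreme.fit(maxima)
gev = stats.genextreme(c, loc=loc, scale=scale)

# The pWCET estimate is a quantile with a chosen exceedance probability,
# e.g. the latency exceeded only with probability 1e-6 per block.
exceedance = 1e-6
pwcet = gev.ppf(1.0 - exceedance)
print(f"pWCET estimate (exceedance {exceedance:g}): {pwcet:.1f} us")
```

In practice, the latency samples would come from the instrumented transitions of the automaton described above rather than from a synthetic distribution.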
The framework requires injecting tracepoints to generate a lightweight sequence of timestamped events. This can be done with existing Linux tracing mechanisms such as BPF or ftrace. Benefits include a significantly reduced WCET measurement duration, from days to minutes, dramatically accelerating development cycles for open-source systems with frequent code updates such as Linux. This efficiency doesn't compromise accuracy; our hybrid approach ensures robust temporal predictions, enabling developers to quickly assess the real-time implications of changes and maintain system performance.
In our talk, we outline the steps taken towards this new evaluation method and discuss the limitations and potential impacts on the development process. We invite interaction from the community to discuss the benefits and limitations of this approach. Our goal is to refine this toolkit to enhance its utility for Linux kernel developers and maintainers, ultimately contributing to a more efficient and effective development process for real-time systems.
For one final time, the QLindA consortium met at OTH Regensburg to review the results and milestones achieved during the project's more than three-year lifespan.
These milestones encompass a broad selection of publications on quantum machine learning for industrial applications, jointly organised workshops at the IEEE Conference on Quantum Computing and Engineering, and an extensive software library, among other outcomes.
Fitting to the occasion, a beer barrel was tapped later in the evening to properly celebrate the project's conclusion. Prost!
We are happy that all our full paper submissions to IEEE QCE, one of the leading quantum computing conferences, have been accepted:
Multiple contributions to IEEE QCE, IEEE QSW and ACM SIGMOD conferences
We are happy that a whole lot of our submissions have recently met with a favourable response from reviewers:
Thanks to all the involved group members and collaborators for the hard and exciting work!
Joint contribution to the 27th International Conference on Computing in High Energy & Nuclear Physics (CHEP) by Maja Franz, Manuel Schönberger and Wolfgang Mauerer, together with international partners:
Our primary focus is on two key areas: Firstly, we estimate runtimes and scalability for common NHEP problems addressed via QUBO formulations by identifying minimum energy solutions of intermediate Hamiltonian operators encountered during the annealing process. Secondly, we investigate how the classical parameter space in the QAOA, together with approximation techniques such as a Fourier-analysis based heuristic, proposed by Zhou et al. (2018), can help to achieve (future) quantum advantage, considering a trade-off between computational complexity and solution quality. Our computational analysis of seminal optimisation problems suggests that only lower frequency components in the parameter space are of significance for deriving reasonable annealing schedules, indicating that heuristics can offer improvements in resource requirements, while still yielding near-optimal results.
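To make the Fourier-analysis heuristic concrete, the sketch below is a simplified reading of the idea of Zhou et al. (2018), not the code used for this contribution: smooth QAOA angle schedules are reconstructed from a small number of low-frequency amplitudes instead of optimising all 2p angles individually.

```python
import numpy as np

def schedules_from_fourier(u, v, p):
    """Map low-frequency Fourier amplitudes (u, v) to QAOA angle schedules
    (gamma_1..gamma_p, beta_1..beta_p), following the discrete sine/cosine
    parametrisation suggested by Zhou et al. (2018)."""
    q = len(u)                      # number of frequency components kept, q << p
    i = np.arange(1, p + 1)         # QAOA layer index
    k = np.arange(1, q + 1)         # frequency index
    # gamma_i = sum_k u_k * sin((k - 1/2) * (i - 1/2) * pi / p)
    gamma = (u * np.sin(np.outer(i - 0.5, k - 0.5) * np.pi / p)).sum(axis=1)
    # beta_i  = sum_k v_k * cos((k - 1/2) * (i - 1/2) * pi / p)
    beta = (v * np.cos(np.outer(i - 0.5, k - 0.5) * np.pi / p)).sum(axis=1)
    return gamma, beta

# Hypothetical example: a single low-frequency component per schedule already
# yields a smooth, annealing-like parameter ramp for a depth-20 circuit.
gamma, beta = schedules_from_fourier(u=np.array([0.8]), v=np.array([0.7]), p=20)
print(np.round(gamma, 3))
print(np.round(beta, 3))
```

Restricting the optimisation to such low-frequency amplitudes is precisely the trade-off between computational complexity and solution quality discussed above.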
Wolfgang Mauerer, Ralf Ramsauer, Petra Eichenseher and Simon Thelen, together with Prof. Dr. Jürgen Mottok (LaS³, Faculty of Electrical and Information Technologies), paid a visit to MuQuaNet at its site at the University of the Bundeswehr Munich. We are proud to take part in Bavarian research regarding quantum cryptography with our own brand-new quantum key distribution system and are looking forward to opportunities to deepen our relationship with UniBW in the future.
In a prestigious event hosted by the LfD, Bavarian State Minister for Science and the Arts Markus Blume inaugurated the OTH's new quantum key distribution system, provided by "Quantum Optics Jena" and funded by the "Hightech Agenda Bayern". The system will play an important role in future cross-faculty efforts to integrate quantum cryptography into education, science and industry.
The event featured many key players from the Bavarian quantum science and industry community, including Prof. Dr. Rudolf Gross from Munich Quantum Valley, Dr. Nils gentschen Felde from MuQuaNet, Prof. Dr. Christoph Marquardt from FAU, Laura Schulz from LRZ, Dr. Bettina Heim from OHB, Dr. Sebastian Luber from Infineon, Dr. Christoph Niedermeier from Siemens, Prof. Dr. Helena Liebelt from THD, Dr. Peter Eder from IQM, Dr. Andreas Böhm from Bayern Innovativ, Theresa Schreyer and Imran Khan from Keequant.
The LfD hosted an international, interdisciplinary exchange on quantum computing for nuclear and high-energy physics to discuss challenges and applications.
Guests from the USA, UK, Spain and Germany participated enthusiastically in the workshop and exchanged ideas on the latest advances in quantum computing and quantum machine learning.
We are thrilled to actively contribute to these pioneering efforts!
Markus Schottenhammer and Andreas Fellner successfully defend their Bachelor theses "Automatisierte Übungserfassung und Wiederholungserkennung mithilfe einer Smartwatch" (Automated exercise capture and repetition detection using a smartwatch) and "Vergleich von Variationalen Quantenschaltkreis-Strukturen für Quanten-Reinforcement Learning" (Comparison of variational quantum circuit structures for quantum reinforcement learning). Congratulations to both!
Vincent Eichenseher successfully defends his Bachelor Thesis "Comparative Analysis of Parameter Selection Heuristics for the Quantum Approximate Optimisation Algorithm". Congrats, Vincent!
Lukas Schmidbauer has joined the team as doctoral student in the field of quantum computing, contributing to the TAQO-PAM project. Welcome, Lukas!
Simon Thelen has joined the team as doctoral student in the field of quantum computing, contributing to the TAQO-PAM project. Welcome, Simon!
With the participation of Bavarian Minister for Economic Affairs Hubert Aiwanger, a new competence centre for digitalisation was presented to the public at the TechBase in Regensburg. The project Digital Innovation Ostbayern (DInO) is one of three European Digital Innovation Hubs (EDIH) in Bavaria. Within this project, the partners TH Deggendorf, OTH Regensburg, R-Tech GmbH and the Bavarian AI agency "baiosphere" support and advise small and medium-sized enterprises as well as public institutions on their digital challenges.
Owing to its long-standing and cross-domain expertise, the Labor für Digitales (LfD) is responsible within the project for the focus topics of artificial intelligence, machine learning and data analysis. Benno Bielmeier and Wolfgang Mauerer provide application-oriented support in the practical evaluation of risks, safety and quality assurance of sustainable AI concepts and solutions. Effects on the economy, politics and society are considered holistically, bridging the gap between the current state of research and practical requirements in order to drive transformative innovation and digitalisation.
New contribution to the CORE A* VLDB conference
Our leading research on quantum data management, driven by Manuel Schönberger, Immanuel Trummer, head of the Cornell Database Group, and Wolfgang Mauerer, uncovers the potential of quantum computing for databases. In our latest paper, accepted for publication in PVLDB and to be presented at VLDB'24, we derive a novel, tailored encoding method to enable the use of highly optimised, quantum-inspired Fujitsu Digital Annealer hardware to solve large instances of the long-standing join ordering problem.
Abstract
Finding the optimal join order (JO) is one of the most important problems in query optimisation, and has been extensively considered in research and practice. As it involves huge search spaces, approximation approaches and heuristics are commonly used, which explore a reduced solution space at the cost of solution quality. To explore even large JO search spaces, we may consider special-purpose software, such as mixed-integer linear programming (MILP) solvers, which have successfully solved JO problems. However, even mature solvers cannot overcome the limitations of conventional hardware prompted by the end of Moore’s law.
We consider quantum-inspired digital annealing hardware, which takes inspiration from quantum processing units (QPUs). Unlike QPUs, which likely remain limited in size and reliability in the near and mid-term future, the digital annealer (DA) can solve large instances of mathematically encoded optimisation problems today. We derive a novel, native encoding for the JO problem tailored to this class of machines that substantially improves over known MILP and quantum-based encodings, and reduces encoding size over the state-of-the-art. By augmenting the computation with a novel readout method, we derive valid join orders for each solution obtained by the (probabilistically operating) DA. Most importantly and despite an extremely large solution space, our approach scales to practically relevant dimensions of around 50 relations and improves result quality over conventionally employed approaches, adding a novel alternative to solving the long-standing JO problem.
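Both the DA and quantum annealers expect problems in quadratic unconstrained binary optimisation (QUBO) form, i.e. minimising x^T Q x over binary vectors x. The toy sketch below is purely illustrative and does not reproduce the paper's join-ordering encoding; it merely shows how an objective and a constraint are folded into a QUBO matrix, together with a brute-force check that is only feasible for tiny instances.

```python
import itertools
import numpy as np

# Toy QUBO: pick exactly one of three options, each with a cost. Annealing
# hardware minimises x^T Q x over binary vectors x; constraints are folded
# in as quadratic penalty terms.
costs = np.array([3.0, 1.0, 2.0])
P = 10.0                                   # penalty weight for the constraint
n = len(costs)

# Objective on the diagonal, penalty P*(x_0 + x_1 + x_2 - 1)^2 expanded into Q
# (the additive constant +P is irrelevant for the minimiser; x_i^2 = x_i).
Q = np.diag(costs - P)                     # linear terms
Q += P * (np.ones((n, n)) - np.eye(n))     # pairwise penalty terms

# Brute-force the optimum -- only viable for tiny instances; this is where
# annealing hardware takes over for realistic problem sizes.
best = min(itertools.product([0, 1], repeat=n),
           key=lambda bits: np.array(bits) @ Q @ np.array(bits))
print("optimal assignment:", best)         # (0, 1, 0): the cheapest option
```

Realistic join-ordering instances require far more elaborate encodings (and a readout step to recover valid join orders), which is exactly what the paper contributes.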
The new supplementary study programme "Quantum Technologies" will start for the first time in winter semester 2023/24!
Quantum technologies hold immense potential for solving the most complex problems that have so far pushed classical computers to their limits. The supplementary course offers students a gentle introduction to this complex topic -- from the basics of the fascinating phenomena of quantum mechanics and their exploitation in cryptography to the simulation of quantum circuits and problem formulations for quantum computers. The study programme prepares students in the best possible way for this technology of the future! Registration for the first round is now possible via the website of the Regensburg School for Digital Sciences (RSDS).
The additional study programme is an interdisciplinary initiative of Prof. Dr. Ioana Serban (Faculty of Applied Natural and Cultural Sciences), Prof. Dr. Jürgen Mottok (LaS³, Faculty of Electrical and Information Technologies) and Prof. Dr. Wolfgang Mauerer (LfD, Faculty of Computer Science and Mathematics).
Wolfgang Mauerer and colleagues from the QLindA consortium hosted a workshop on Quantum Machine Learning (QML) as part of the IEEE Conference on Quantum Computing and Engineering to discuss challenges and applications of QML in Bellevue, Washington, USA.
Researchers from all over the world participated enthusiastically in the workshop and exchanged ideas on the latest advances in Quantum Machine Learning.
We are thrilled to actively contribute to these pioneering efforts!
The LfD Quantum booth at the World of Quantum fair from 27 to 30 June in Munich attracted many curious visitors and led to a wide range of conversations on the topic.
The corresponding after-show party was well received, sparked engaging discussions about the future of quantum computing, and helped initiate new interdisciplinary collaborations.
Advancing Quantum Software Engineering at IEEE QSW'23
Two papers by Felix Greiwe, Tom Krüger and Hila Safi, the latter in joint work with Karen Wintersperger, have been accepted at the IEEE Quantum Software Week. They deal with the role of noise and imperfections in quantum software engineering, and uncover generic patterns in the performance of systems optimised by HW/SW co-design approaches. Congrats, Felix, Tom and Hila!
The papers arose from the BMBF-sponsored project TAQO-PAM. Of course, both are accompanied by extensive reproduction packages that allow independent researchers to confirm our results.
Our work on developing quantum applications for industry is presented by the Bavarian Ministry of Science as a research highlight in Bavaria.
Verband der Elektro- und Digitalindustrie, Arbeitskreis Funktionale Sicherheit ISO 26262, Untergruppe Software (German Electrical and Digital Industry Association, working group on functional safety ISO 26262, software subgroup)
Abstract
Consolidation of multiple systems of different criticality to one platform of mixed-criticality is an ongoing trend in various embedded industries due to the availability of powerful multicore processors. The isolation of different computing domains is the most crucial factor to guarantee freedom from interference. In this talk, Ralf Ramsauer presents the current state of Static Hardware Partitioning, a technique that leverages virtualisation extensions of modern CPUs to strongly isolate different computing domains on SMP platforms. He shows that it is possible to virtualise embedded real-time systems with (almost) zero runtime overhead and software interaction.
International Workshop on Quantum Data Science and Management organised by Wolfgang Mauerer jointly with Sven Groppe (University of Lübeck), Jiaheng Lu (University of Helsinki), and Le Gruenwald (University of Oklahoma) at the 49th International Conference on Very Large Data Bases.
Goals of the Workshop
For most database researchers, quantum computing and quantum machine learning are still new research fields. The goal of this workshop is to bring together academic researchers and industry practitioners from multiple disciplines (e.g., database, AI, software, physics, etc.) to discuss the challenges, solutions, and applications of quantum computing and quantum machine learning that have the potential to advance the state of the art of data science and data management technologies. Our purpose is to foster the interaction between database researchers and more traditional quantum disciplines, as well as industrial users. The workshop serves as a forum for the growing quantum computing community to connect with database researchers to discuss the wider questions and applications of how quantum resources can benefit data science and data management tasks, and how quantum software can support this endeavor.
Contribution at the ACM SIGMOD/PODS International Conference on Management of Data by Tobias Winker, Sven Groppe (University of Lübeck), Valter Uotila, Zhengtong Yan, Jiaheng Lu (University of Helsinki), Maja Franz and Wolfgang Mauerer.
Abstract
In the last few years, the field of quantum computing has experienced remarkable progress. Prototypes of quantum computers already exist and have been made available to users through cloud services (e.g., IBM Q experience, Google quantum AI, or Xanadu quantum cloud). While fault-tolerant and large-scale quantum computers are not available yet (and may not be for a long time, if ever), the potential of this new technology is undeniable. Quantum algorithms have the proven ability to either outperform classical approaches for several tasks, or cannot be efficiently simulated by classical means under reasonable complexity-theoretic assumptions. Even imperfect current-day technology is speculated to exhibit computational advantages over classical systems. Recent research is using quantum computers to solve machine learning tasks. Meanwhile, the database community has already successfully applied various machine learning algorithms to data management tasks, so combining the fields seems to be a promising endeavour. However, quantum machine learning is a new research field for most database researchers. In this tutorial, we provide a fundamental introduction to quantum computing and quantum machine learning and show the potential benefits and applications for database research. In addition, we demonstrate how to apply quantum machine learning to the optimization of the join order problem for databases.
Contribution to the 26th International Conference on Computing in High Energy & Nuclear Physics (CHEP) by Maja Franz, Pia Zurita (University of Regensburg), Markus Diefenthaler (Jefferson Lab) and Wolfgang Mauerer.
Abstract
Quantum Computing (QC) is a promising early-stage technology that offers novel approaches to simulation and analysis in nuclear and high energy physics (NHEP). By basing computations directly on quantum mechanical phenomena, speedups and other advantages for many computationally hard tasks are potentially achievable, albeit both the theoretical underpinning and the practical realization are still subject to considerable scientific debate, which raises the question of applicability in NHEP.
In this contribution, we describe the current state of affairs in QC: Currently available noisy, intermediate-scale quantum (NISQ) computers suffer from a very limited number of quantum bits, and are subject to considerable imperfections, which narrows their practical computational capabilities. Our recent work on optimization problems suggests that the Co-Design of quantum hardware and algorithms is one route towards practical utility. This approach offers near-term advantages throughout a variety of domains, but requires interdisciplinary exchange between communities.
To this end, we identify possible classes of applications in NHEP, ranging from quantum process simulation over event classification directly at the quantum level to optimal real-time control of experiments. These types of applications are particularly suited for quantum algorithms that involve Variational Quantum Circuits, but might also benefit from more unusual special-purpose techniques like (Gaussian) Boson Sampling. We outline challenges and opportunities in the cross-domain cooperation between QC and NHEP, and show routes towards co-designed systems and algorithms. In particular, we aim at furthering the interdisciplinary exchange of ideas by establishing a joint understanding of requirements, limitations and possibilities.
Planning and control of industrial manufacturing: the Quantum Learning Machine Atos QLM38 is being deployed in the BMBF research project TAQO-PAM.
An early Christmas present for the Labor für Digitalisierung at OTH Regensburg: a quantum simulation system worth one million euros has just been delivered and installed there. "Such high-tech systems are usually found at major institutes such as Forschungszentrum Jülich, the Leibniz Supercomputing Centre in Munich, the European Organization for Nuclear Research (CERN) and in the Munich Quantum Valley," says President Prof. Dr. Ralph Schneider, underlining the special dimension of the acquisition.
New professorship as part of the Hightech Agenda Bayern
Visually, the Christmas present is rather inconspicuous. Nevertheless, with the Quantum Learning Machine "Atos QLM38", OTH Regensburg joins the ranks of top-class research institutes. This is no coincidence: over the years, the Faculty of Computer Science and Mathematics has built up expertise in quantum computing. Most recently, the Free State of Bavaria announced that, within the programme to strengthen quantum professorships under the Hightech Agenda, a new professorship for algorithmics and quantum computing applications will go to OTH Regensburg.
Prof. Dr. Wolfgang Mauerer heads the Labor für Digitalisierung and is the chairing director of the Regensburg Center for Artificial Intelligence (RCAI). He has been working on concrete use cases of quantum informatics for more than 15 years and is considered a proven expert in the field. His concern is not mere academic exchange, but above all closing the gap between fundamental research and industrial application.
TAQO-PAM: strong partners from research and industry
The consortium project TAQO-PAM, initiated by Mauerer, is also dedicated to this goal. It is funded by the Federal Ministry of Education and Research (BMBF) with a total of 8.2 million euros, of which 2.6 million euros go to OTH Regensburg alone. Partners are BMW Munich, Siemens Munich and Karlsruhe, the Regensburg-based Optware GmbH, Friedrich-Alexander-Universität Erlangen-Nürnberg, and Atos Scientific Computing (Tübingen).
The increasing mass production of individualised goods and the complex logistics it requires within modern factories demand the solution of large-scale optimisation problems in real time. "Classical computers cannot process such problems sufficiently well and quickly; even with quantum computers, feasibility is not a given," remarks Mauerer. Under his lead, the project will therefore design hybrid quantum-classical special-purpose algorithms that enable the soon-to-be-available quantum computers with a few tens of qubits to contribute to solving these problems. This is achieved by integrating adapted quantum processors (QPUs) into existing scenarios and by extending established methods of factory automation and production planning.
Strengthening Regensburg as a high-tech location
"This is a prime example of the strong application-oriented and future-focused research at our university," agree President Schneider and Prof. Dr. Frank Herrmann, Dean of the Faculty of Computer Science and Mathematics. The million-euro Christmas package containing the high-performance computer is emblematic of this. Ralph Schneider also sees it as strengthening Regensburg as a high-tech location: by no means every university or industrial research institution can offer the expertise and resources needed to use a Quantum Learning Machine, and as a rule, not only newcomers to quantum computing have to rely on expensive purchases of computing time at large data centres.
Link to the OTH Regensburg press release
© Photos: OTH Regensburg/Michael Hitzek
At the Critical System Summit in Yokohama, Benno Bielmeier and Wolfgang Mauerer presented, in one of six sessions, a semi-formal approach to deriving statements about the runtime behaviour of complex mixed-criticality systems.
The presentation was recorded and can be found on YouTube.
As part of the Open Source Summit Japan, the event was hosted by the Linux Foundation and its corporate members, among them AT&T, Cisco, Fujitsu, Google, Hitachi, Huawei, IBM, Intel, Meta, Microsoft, NEC and many others, with more than 600 participants.
The approach links theoretical formalisms with empirically collected data from real-world applications and aims to remain interpretable and tangible. Its idea is to augment a simplified, formal model based on a priori knowledge about the system's intrinsics with empirical information from measurements on real-world scenarios, which then allows us to infer properties of interest for the certification of safety-critical systems.
Tom Krüger has joined the team as doctoral student in the field of quantum computing, contributing to the TAQO-PAM project. Welcome, Tom!
Wolfgang Mauerer, Ralf Ramsauer and Andrej Utz present results of the iDev 4.0 project at the SEMICON Europa 2022 in Munich.
Abstract: The advent of multi-core CPUs in nearly all embedded markets has prompted an architectural trend towards combining safety-critical and uncritical software on single hardware units. We present an architecture for mixed-criticality systems based on Linux that allows for the consolidation of critical and uncritical components onto a single hardware unit.
In the context of the iDev 4.0 project, the use-case of this technological building block is to reduce the overall amount of distributed computational hardware components across semiconductor assembly lines in fabs.
CPU virtualisation extensions enable strict and static partitioning of hardware by direct assignment of resources, which allows us to boot additional operating systems or bare metal applications running aside Linux. The hypervisor Jailhouse is at the core of the architecture and ensures that the resulting domains may serve workloads of different criticality and cannot interfere in an unintended way. This retains Linux’s feature-richness in uncritical parts, while frugal safety and real-time critical applications execute in isolated domains. Architectural simplicity is a central aspect of our approach and a precondition for reliable implementability and successful certification.
In this work, we present our envisioned base system architecture and elaborate on the implications of the transition from existing legacy systems to a consolidated environment.
Contribution to CORE A* conference ACM SIGMOD driven by Manuel Schönberger and Wolfgang Mauerer breaks new ground for quantum computing in the database community (PDF).
Abstract: The prospect of achieving computational speedups by exploiting quantum phenomena makes the use of quantum processing units (QPUs) attractive for many algorithmic database problems. Query optimisation, which concerns problems that typically need to explore large search spaces, seems like an ideal match for the known quantum algorithms. We present the first quantum implementation of join ordering, which is one of the most investigated and fundamental query optimisation problems, based on a reformulation to quadratic unconstrained binary optimisation problems. We empirically characterise our method on two state-of-the-art approaches (gate-based quantum computing and quantum annealing), and identify speed-ups compared to the best known classical join ordering approaches for input sizes that can be processed with current quantum annealers. However, we also confirm that limits of early-stage technology are quickly reached.
Current QPUs are classified as noisy intermediate-scale quantum (NISQ) computers, and are restricted by a variety of limitations that reduce their capabilities as compared to ideal future quantum computers, which prevents us from scaling up problem dimensions and reaching practical utility. To overcome these challenges, our formulation accounts for specific QPU properties and limitations, and allows us to trade between achievable solution quality and possible problem size.
In contrast to all prior work on quantum computing for query optimisation and database-related challenges, we go beyond currently available QPUs, and explicitly target the scalability limitations: Using insights gained from numerical simulations and our experimental analysis, we identify key criteria for co-designing QPUs to improve their usefulness for join ordering, and show how even relatively minor physical architectural improvements can result in substantial enhancements. Finally, we outline a path towards practical utility of custom-designed QPUs.
Felix Greiwe has joined the group as doctoral student in the field of quantum computing, contributing to the TAQO-PAM project. Welcome, Felix!
Contribution to the highly competitive Open Source Summit (with acceptance rates below 20%) in Yokohama, Japan by Benno Bielmeier and Wolfgang Mauerer.
Abstract: Software for safety-critical systems must meet strict functional and temporal requirements. Since it is impossible to exhaustively test the required qualities, formal verification techniques have been devised. However, these approaches are usually only applicable to small systems, and require software architecture and development to consider verification goals from the ground up. Safety-critical systems face an increasing demand for functionality, and need to handle the associated complexity. While the desired functionalities can be satisfied by embedded Linux, established verification techniques fail for code of such magnitude. We show a semi-formal, model-based approach to derive reliable statements about the run-time characteristics of embedded Linux. Using a-priori expert knowledge, we generate a finite automaton-based effective description of safety-relevant aspects. The real-world, system-dependent behaviour of the resulting automata, in particular timing statistics for state transitions, is empirically obtained via system instrumentation. We then show how to turn this into (statistical) guarantees on their behaviour. We show how this allows us to draw conclusions that can be used in certifying systems in terms of reliability, latencies, and other real-time properties.
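As a flavour of the measurement step, the following sketch is a simplified illustration with hypothetical event names, not the actual tooling: it replays a stream of timestamped trace events against a small finite automaton and collects per-transition latency statistics, which could subsequently feed the statistical guarantees mentioned above.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical finite automaton over trace events: (state, event) -> next state.
TRANSITIONS = {
    ("idle", "irq_enter"): "handling_irq",
    ("handling_irq", "wakeup"): "runnable",
    ("runnable", "sched_switch"): "running",
    ("running", "task_done"): "idle",
}

def transition_latencies(events):
    """events: iterable of (timestamp_us, event_name), in temporal order."""
    latencies = defaultdict(list)
    state, entered_at = "idle", None
    for ts, name in events:
        nxt = TRANSITIONS.get((state, name))
        if nxt is None:
            continue                      # event not relevant in this state
        if entered_at is not None:
            # Time spent in 'state' before taking the transition to 'nxt'.
            latencies[(state, nxt)].append(ts - entered_at)
        state, entered_at = nxt, ts
    return latencies

# Hypothetical timestamped trace, e.g. exported from ftrace or a BPF program.
trace = [(10, "irq_enter"), (14, "wakeup"), (30, "sched_switch"),
         (95, "task_done"), (120, "irq_enter"), (131, "wakeup"),
         (140, "sched_switch"), (210, "task_done")]

for edge, lat in sorted(transition_latencies(trace).items()):
    print(edge, "samples:", lat, "mean:", mean(lat), "max:", max(lat))
```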
QPU-System Co-Design for Quantum HPC Accelerators (with contributions by Hila Safi and Wolfgang Mauerer) was accepted by the 35th GI/ITG International Conference on the Architecture of Computing Systems (PDF).
Abstract: The use of quantum processing units (QPUs) promises speed-ups for solving computational problems, but the quantum devices currently available possess only a very limited number of qubits and suffer from considerable imperfections. One possibility to progress towards practical utility is to use a co-design approach: Problem formulation and algorithm, but also the physical QPU properties are tailored to the specific application. Since QPUs will likely be used as accelerators for classical computers, details of systemic integration into existing architectures are another lever to influence and improve the practical utility of QPUs.
In this work, we investigate the influence of different parameters on the runtime of quantum programs on tailored hybrid CPU-QPU-systems. We study the influence of communication times between CPU and QPU, how adapting QPU designs influences quantum and overall execution performance, and how these factors interact. Using a simple model that allows for estimating which design choices should be subjected to optimisation for a given task, we provide an intuition to the HPC community on potentials and limitations of co-design approaches. We also discuss physical limitations for implementing the proposed changes on real quantum hardware devices.
The paper is joint work with Siemens Technology, and the research was performed within the BMBF-sponsored project TAQO-PAM. A reproduction package allows independent researchers to confirm our results.
Uncovering Instabilities in Variational-Quantum Deep Q-Networks (with contributions by Maja Franz, Lucas Wolf and Wolfgang Mauerer) was accepted by the Journal of the Franklin Institute (Impact Factor: 4.25)
Abstract: Deep Reinforcement Learning (RL) has considerably advanced over the past decade. At the same time, state-of-the-art RL algorithms require a large computational budget in terms of training time to converge. Recent work has started to approach this problem through the lens of quantum computing, which promises theoretical speed-ups for several traditionally hard tasks. In this work, we examine a class of hybrid quantum-classical RL algorithms that we collectively refer to as variational quantum deep Q-networks (VQ-DQN). We show that VQ-DQN approaches are subject to instabilities that cause the learned policy to diverge, study the extent to which this afflicts reproducibility of established results based on classical simulation, and perform systematic experiments to identify potential explanations for the observed instabilities. Additionally, and in contrast to most existing work on quantum reinforcement learning, we execute RL algorithms on an actual quantum processing unit (an IBM Quantum Device) and investigate differences in behaviour between simulated and physical quantum systems that suffer from implementation deficiencies. Our experiments show that, contrary to claims in the literature, it cannot be conclusively decided if known quantum approaches, even if simulated without physical imperfections, can provide an advantage as compared to classical approaches. Finally, we provide a robust, universal and well-tested implementation of VQ-DQN as a reproducible testbed for future experiments.
The paper is joint work with Fraunhofer IIS, and arose from the BMBF-sponsored project QLindA. Of course, the publication is accompanied by an extensive reproduction package that allows independent researchers to confirm our results.
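For readers unfamiliar with the VQ-DQN idea, the following minimal sketch (our own simplified illustration using PennyLane, not the testbed published with the paper) shows the central ingredient: a parameterised quantum circuit whose Pauli-Z expectation values serve as Q-value estimates for a binary-encoded observation.

```python
import pennylane as qml
import numpy as np

n_qubits, n_layers, n_actions = 4, 2, 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def q_network(observation, weights):
    # Encode a binary observation into single-qubit rotations.
    for wire, bit in enumerate(observation):
        qml.RX(np.pi * bit, wires=wire)
    # Variational layers: trainable rotations followed by entangling CNOTs.
    for layer in range(n_layers):
        for wire in range(n_qubits):
            qml.RY(weights[layer, wire], wires=wire)
        for wire in range(n_qubits - 1):
            qml.CNOT(wires=[wire, wire + 1])
    # One Pauli-Z expectation value per action serves as its Q-value estimate.
    return [qml.expval(qml.PauliZ(w)) for w in range(n_actions)]

# Hypothetical random parameters and a 4-bit observation.
weights = np.random.uniform(0, 2 * np.pi, size=(n_layers, n_qubits))
q_values = q_network([1, 0, 1, 1], weights)
print("Q-value estimates:", q_values)
```

In a full VQ-DQN, such a circuit replaces the classical Q-network and its parameters are trained against temporal-difference targets; the instabilities studied in the paper arise in exactly this training loop.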
OTH Regensburg is consortium lead of the 3 million EUR lighthouse project Q-Stream
A proposal for a quantum lighthouse project in the Munich Quantum Valley, submitted under the lead of OTH Regensburg, has been selected for funding following the vote of an internationally staffed expert commission. The project Q-Stream: Quantum-Accelerated Data Stream Analytics will work on the construction of hybrid quantum-classical special-purpose hardware specialised in the analysis of data streams.
The project aims to use a hardware-software co-design approach to find applications for the currently available generation of quantum computers, which, owing to technical imperfections, still exhibit numerous shortcomings and disturbances that prevent their potentially enormous computational power from fully unfolding.
Of the total funding of 2.98 million EUR, around 750,000 EUR go to OTH Regensburg, which additionally contributes a position from the Hightech Agenda to the project. The Labor für Digitalisierung will concentrate on the conceptual design of problem-specifically adapted special-purpose computers. Further quantum expertise is contributed by the Fraunhofer Institute for Integrated Circuits (IIS) (transpilation and decomposition of quantum circuits) and Technische Hochschule Deggendorf (advance simulation of future quantum computers on classical HPC systems).
That OTH Regensburg made it onto the list of funded institutions alongside six Bavarian universities, the Max Planck and Fraunhofer societies, and the Bavarian Academy of Sciences also appears to confirm the long-standing quantum activities in research and teaching of Prof. Dr. Wolfgang Mauerer.
Information from the Bavarian State Ministry of Science and the Arts, from which the laboratory gratefully accepts the funding, can be found in a press release.
Static Hardware Partitioning on RISC-V - Shortcomings, Limitations, and Prospects was accepted by the IEEE IoT Forum Special Session: Virtualization for IoT Devices 2022.
Abstract: On embedded processors that are increasingly equipped with multiple CPU cores, static hardware partitioning is an established means of consolidating and isolating workloads onto single chips. This architectural pattern is suitable for mixed-criticality workloads that need to satisfy both real-time and safety requirements, given suitable hardware properties.
In this work, we focus on exploiting contemporary virtualisation mechanisms to achieve freedom from interference, that is, isolation between workloads. Possibilities to achieve temporal and spatial isolation while maintaining real-time capabilities include statically partitioning resources, avoiding the sharing of devices, and ascertaining zero interventions of superordinate control structures.
This eliminates overhead due to hardware partitioning, but implies certain hardware capabilities that are not yet fully implemented in contemporary standard systems. To address such hardware limitations, the customisable and configurable RISC-V instruction set architecture offers the possibility of swift, unrestricted modifications.
We present findings on the current RISC-V specification and its implementations that necessitate interventions of superordinate control structures. We identify numerous issues adverse to implementing our goal of zero interventions and thus zero overhead, both on the design level and especially with regard to handling interrupts. Based on micro-benchmark measurements, we discuss the implications of our findings, and argue how they can provide a basis for future extensions and improvements of the RISC-V architecture.
Ralf Ramsauer, Stefan Huber and Wolfgang Mauerer will discuss Zero-Overhead Virtualisation: It's a Trap! at the Embedded Linux Conference in Dublin.
Abstract: Embedded processors are increasingly equipped with powerful CPU cores. For mixed-criticality scenarios with multiple independent real-time appliances, this allows us to consolidate formerly distributed systems. This requires the absence of unintended interaction between different computing domains, which can be achieved by exploiting virtualisation extensions of modern CPUs. Though providing strong isolation guarantees, virtualisation comes with an overhead, which may endanger global real-time properties of the system. The statically partitioning, Linux-based hypervisor Jailhouse addresses this challenge and strives for zero-overhead virtualisation, which maintains the real-time capabilities of the platform by design. However, limitations of current architectures counter our architectural design goal of eliminating virtualisation overheads. In this talk, we extract architecture-independent common requirements on contemporary platforms to enable zero-trap virtualisation. We explore and compare the ARM, x86, and the RISC-V architecture, and inspect their architectural limitations for embedded zero-overhead virtualisation. We present common pitfalls and barriers of those platforms: issues that have been addressed, that are being fixed, and that need to be addressed in future.
The Bavarian Research Alliance has approved guest researcher stays at the Tokyo University of Science. Prof. Dr. Wolfgang Mauerer will contribute his systems expertise to a project on the statistical analysis of real-time guarantees.
PhD student Manuel Schönberger took second place in the graduate track of this year's Student Research Competition at the CORE A* SIGMOD conference in Philadelphia!
The Student Research Competition takes place annually for various ACM conferences including SIGMOD. In the first round, students submit an extended abstract about their research. Based on the quality of their submission, a select few students from universities around the globe, including Columbia University, University of Illinois at Urbana-Champaign, Hasso-Plattner-Institut and TUM were invited to present their research posters at the SIGMOD conference. Three students were selected for the third round, where they gave a more detailed presentation on their research. In the graduate category, Manuel reached the second place, competing against Alex Yao and Sughosh Kaushik, both from Columbia University, who took first and third places respectively.
In his research, Manuel analyses the applicability of quantum computing to database query processing. The research goes beyond merely mapping problems onto quantum hardware, and moreover addresses the co-design of future quantum systems, such that they become tailor-made for database problems. Congrats, Manuel, for achieving this international recognition!
Quantum technologies are on the verge of breaking out of their ivory tower existence and entering the general marketplace.
With the premiere of the World of Quantum, researchers and industry presented the latest findings on potential quantum applications and quantum hardware at the Laser World of Photonics in Munich. Research master's student Maja Franz and others visited the fair and explored the new platform for quantum technologies.
Exhibitors from industry and manufacturers of quantum computers gave a broad overview of current quantum technology; for instance, IBM Quantum let visitors see inside its quantum computer via augmented reality. Researchers in the field of quantum computing, such as from Fraunhofer IKS, offered a lively exchange on hybrid quantum-classical algorithms.
Thanks to the sponsorship of the Bayerisches Staatsministerium für Digitales and other partners, the World of Quantum was a success and an interesting experience.
State Minister announces extension of KI-Transfer Plus project headed by Wolfgang Mauerer
In the state-funded project "KI-Transfer Plus", AI regional centers such as the Regensburg Center for Artificial Intelligence (RCAI) support SMEs in getting started with artificial intelligence. At the closing event, Bavarian digital minister Judith Gerlach reviewed results of the first project round. The host of the event, Horsch Maschinen GmbH from Schwandorf, showed how artificial intelligence enhances its own agricultural machinery. Horsch developed an algorithm to recognize plants and their center points, which is important for autonomous driving in the field as well as for automated weed removal. The other five project participants from Upper Bavaria and the Upper Palatinate also presented innovative AI developments in a wide range of domains. Digital minister Judith Gerlach was pleased with the results and announced the expansion of the project. As a consequence of the program's success, Gerlach announced an extension for another year to prepare the Bavarian economy for the key technologies of the future. See a summary video of the impressive work engineers Nicole Höß and Matthias Melzer did together with our industry partners!
Master's student Mario Mintel presents his work on address-space duplication using the Scoot mechanism he designed at FGDB'22 in Hamburg.
1-2-3 Reproducibility for Quantum Software Experiments was accepted at Q-SANER 2022.
Abstract: Various fields of science face a reproducibility crisis. For quantum software engineering as an emerging field, it is therefore imperative to focus on proper reproducibility engineering from the start. Yet the provision of reproduction packages is almost universally lacking, and actionable advice on how to build such packages is rare, which is particularly unfortunate in a field with many contributions from researchers with backgrounds outside computer science. In this article, we argue how to rectify this deficiency by proposing a 1-2-3 approach to reproducibility engineering for quantum software experiments: Using a meta-generation mechanism, we generate DOI-safe, long-term functioning and dependency-free reproduction packages. They are designed to satisfy the requirements of professional and learned societies solely on the basis of project-specific research artefacts (source code, measurement and configuration data), and require little temporal investment by researchers. Our scheme ascertains long-term traceability even when the quantum processor itself is no longer accessible. By drastically lowering the technical bar, we foster the proliferation of reproduction packages in quantum software experiments and ease the inclusion of non-CS researchers entering the field.
QSAP@INFORMATIK 2022: "Workshop on quantum software and applications" was accepted at the annual conference of the German Gesellschaft für Informatik (GI) and will take place in September. It is co-organised by Stefanie Scherzinger (University of Passau) and Wolfgang Mauerer.
Beyond the Badge: Reproducibility Engineering as a Lifetime Skill was accepted at SEENG@ICSE 2022.
Abstract: Ascertaining reproducibility of scientific experiments is receiving increased attention across disciplines. We argue that the necessary skills are important beyond pure scientific utility, and that they should be taught as part of software engineering (SWE) education. They serve a dual purpose: Apart from acquiring the coveted badges assigned to reproducible research, reproducibility engineering is a lifetime skill for a professional industrial career in computer science. SWE curricula seem an ideal fit for conveying such capabilities, yet they require some extensions, especially given that even at flagship conferences like ICSE, only slightly more than one-third of the technical papers (at the 2021 edition) receive recognition for artefact reusability. Knowledge and capabilities in setting up engineering environments that allow for reproducing artefacts and results over decades (a standard requirement in many traditional engineering disciplines), writing semi-literate commit messages that document crucial steps of a decision-making process and that are tightly coupled with code, or sustainably taming dynamic, quickly changing software dependencies, to name a few: They all contribute to solving the scientific reproducibility crisis, and enable software engineers to build sustainable, long-term maintainable, software-intensive, industrial systems. We propose to teach these skills at the undergraduate level, on par with traditional SWE topics.
The Bavarian Research Alliance has approved guest researcher stays at the FORTH institute in Crete and at the University of Ioannina. Prof. Dr. Wolfgang Mauerer will contribute his software engineering and reproducibility expertise to a project on schema evolution in databases.
Peel | Pile? Cross-Framework Portability of Quantum Software was accepted at the QSA@ICSA 2022.
Abstract: In recent years, various vendors have made quantum software frameworks available. Yet with vendor-specific frameworks, code portability seems at risk, especially in a field where hardware and software libraries have not yet reached a consolidated state, and even foundational aspects of the technologies are still in flux. Accordingly, the development of vendor-independent quantum programming languages and frameworks is often suggested. This follows the established architectural pattern of introducing additional levels of abstraction into software stacks, thereby piling on layers of abstraction. Yet software architecture also provides seemingly less abstract alternatives, namely to focus on hardware-specific formulations of problems that peel off unnecessary layers. In this article, we quantitatively and experimentally explore these strategic alternatives, and compare popular quantum frameworks from the software implementation perspective. We find that for several specific, yet generalisable problems, the mathematical formulation of the problem to be solved is not just sufficiently abstract and serves as a precise description, but is likewise concrete enough to allow for deriving framework-specific implementations with little effort. Additionally, we argue, based on analysing dozens of existing quantum codes, that porting between frameworks is actually low-effort, since the quantum- and framework-specific portions are very manageable in terms of size, commonly in the order of mere hundreds of lines of code. Given the current state-of-the-art in quantum programming practice, this leads us to argue in favour of peeling off unnecessary abstraction levels.
Project volume: 8.2 million EUR; consortium lead: Wolfgang Mauerer.
The increasing mass production of individualised goods and the complex logistics it requires within modern factories demand the solution of large-scale optimisation problems in real time. Classical computers cannot solve such problems sufficiently well. In this project, hybrid quantum-classical algorithms are therefore to be designed that enable the soon-to-be-available quantum computers with a few tens of qubits to contribute to solving these problems. This is achieved by integrating adapted quantum processors (QPUs) into existing scenarios and by extending established methods of factory automation and production planning.
By focusing on local data processing directly on the factory floor instead of using external cloud services, the need to share fundamental knowledge and production data with third parties is avoided; moreover, time-critical computations do not suffer delays from data transfers. Assuming that suitable custom-built QPUs will become available in the medium term, the project addresses the lack of quantum algorithms for optimising manufacturing tasks, the missing integration of quantum computing into industrial processes, and the accessibility of the technology for users, to whom the results are to be made available without requiring deep knowledge of quantum mechanics and quantum informatics.
By systematically transferring real-world problems to methods that combine the advantages of quantum algorithms with those of classical algorithms, industrially exploitable use cases are to be solved successfully. In the longer term, the algorithms developed in this project can also be run and extended on more powerful quantum computers, enabling even more complex optimisations of production processes that further increase the productivity and competitiveness of companies (text source: BMBF).
Abstract: Computer-based automation in industrial appliances led to a growing number of logically dependent, but physically separated embedded control units per appliance. Many of those components are safety-critical systems, and require adherence to safety standards, which is inconsonant with the relentless demand for features in those appliances. Features lead to a growing amount of control units per appliance, and to an increasing complexity of the overall software stack, being unfavourable for safety certifications. Modern CPUs provide means to revise traditional separation of concerns design primitives: the consolidation of systems, which yields new engineering challenges that concern the entire software and system stack.
Multi-core CPUs favour economic consolidation of formerly separated systems with one efficient single hardware unit. Nonetheless, the system architecture must provide means to guarantee the freedom from interference between domains of different criticality. System consolidation demands for architectural and engineering strategies to fulfil requirements (e.g., real-time or certifiability criteria) in safety-critical environments.
In parallel, there is an ongoing trend to substitute ordinary proprietary base platform software components by mature OSS variants for economic and engineering reasons. There are fundamental differences in the processual properties of OSS and proprietary software development processes. OSS in safety-critical systems requires development process assessment techniques to build an evidence-based fundament for certification efforts that is based upon empirical software engineering methods.
In this thesis, I will approach from both sides: the software and system engineering perspective. In the first part of this thesis, I focus on the assessment of OSS components: I develop software engineering techniques that allow us to quantify characteristics of distributed OSS development processes. I show that ex-post analyses of software development processes can serve as a foundation for certification efforts, as is required for safety-critical systems.
In the second part of this thesis, I present a system architecture based on OSS components that allows for consolidation of mixed-criticality systems on a single platform. Therefore, I exploit virtualisation extensions of modern CPUs to strictly isolate domains of different criticality. The proposed architecture shall eradicate any remaining hypervisor activity in order to preserve real-time capabilities of the hardware by design, while guaranteeing strict isolation across domains.
Wolfgang Mauerer, Stefanie Scherzinger and Pia Eichinger recorded a making of video about the Reproducibility Engineering lecture.
KI-Transfer Plus was initiated as a pilot programme by the Bavarian State Ministry of Digital Affairs in 2021. The appliedAI Initiative takes on the central operational management and coordination.
Based on the expertise of appliedAI and the other regional AI centres, a strong network of regional AI centres is emerging that provides local, specific support for all questions relating to artificial intelligence. The regional centres accompany companies in the realisation and implementation of their own AI use cases and support the development of their employees' AI skills. A further goal is to develop a long-term strategic AI perspective together with the participating companies.
Further information is available at KIT+ and in the flyer.
Following the successful conclusion of the iDev project, Infineon AG Regensburg supports the Labor für Digitalisierung with the donation of a Kuka iiwa robot arm. Owing to the restrictions imposed by the Covid-19 pandemic, the handover, originally planned with site manager Jörg Recklies, could only take place in a very small circle. "We are delighted about the highly welcome support from Infineon," says Prof. Dr. Wolfgang Mauerer. "Laboratories at the OTH depend on help from industry to be able to offer modern teaching. With the new floor space that will be created at OTH Regensburg with funds from the Hightech Agenda Bayern, we are confident that operating the robot arm will be possible by 2030," Mauerer predicts. The laboratory is excited about the possibilities the state-of-the-art gripper will offer for research and development on mixed-criticality systems (image source: Kuka).
Recent advances in artificial intelligence (AI) have made it possible for AI systems to autonomously learn to play games such as chess or Go better than any human or computer before. The key technology behind this is called reinforcement learning and is now also used for learning-based control in industrial settings. The currently fast-growing capacity of quantum computers opens up the possibility of using quantum computers in AI systems, and offers the potential for groundbreaking performance gains that could trigger a technological revolution affecting a multitude of applications. The project aims to combine the latest advances in quantum computing and artificial intelligence, in particular reinforcement learning (RL), and to make them technically usable. Building on existing scientific contributions, it investigates how RL can be realised on quantum computers (QRL) in order to solve a variety of relevant problems from industrial practice: RL-based control optimisation in the process industry, the use of distributed automation systems in the smart factory, and optimisation in production planning. The approach, which is fundamentally different from classical algorithm design and tightly coupled to the hardware, requires researching the transferability of classical approaches to quantum algorithms even before error-corrected quantum computers become available. The project will develop novel algorithms, create a benchmark for evaluating the methods and a library for making them usable in industrial applications, and examine the possibilities and potential as well as existing limitations (text source: BMBF).
Ralf Ramsauer presents The Sound of Silence: Mining Security Vulnerabilities from Secret Integration Channels in Open-Source Projects at CCSW '20 – due to Corona, by video. In the paper, which has already been featured in The Register and on golem.de (German), we describe an approach to automatically detect patches that fix critical security issues before they are rolled out in the wild. We detect these patches (using the technology we described in our ICSE '19 paper) by the mere fact that they are not discussed on the mailing list. We analysed the seven months before the release of Linux 5.4 and found commits that address 12 vulnerabilities. For these vulnerabilities, our approach provided a temporal advantage of 2 to 179 days to design exploits before public disclosure takes place.
Prof. Dr. Wolfgang Mauerer and his students are currently working on improvements for operating systems used worldwide and are exploring the future of computers and AI systems on a Canadian quantum computer. Quantum computing is considered a great hope for the next generation of high-performance computers. The physically highly complex prototypes, which cost millions of euros, are expected to deliver a multiple of the computing power of a conventional system. Only a few quantum computers exist worldwide; one of them is located in Ontario, Canada. Half a world away lies the laboratory of Prof. Dr. Wolfgang Mauerer at OTH Regensburg. In a new project, he is testing what the Canadian computer is really capable of: the machine is used to understand and quantify its capabilities, especially for industrially relevant problems. On real-world tasks, Prof. Dr. Mauerer's team experimentally determines what advantages or disadvantages quantum computers bring. Anyone who has ever despaired at an airport gate knows the organisational chaos that reigns among tens of thousands of travellers. To keep track, artificial intelligence is increasingly relied on for gate assignment: minimising transport routes, shortening waiting times and saving fuel in airport traffic. Master's student Irmi Sax, together with the partner LMU München, is probing the potential superiority of quantum computers in this special field. Hopes are also placed in quantum computers for drug development and for handling power fluctuations in the smart grid. Comparatively early, OTH Regensburg was proactively involved as a university in the current trend topic of quantum computing: Prof. Dr. Mauerer has been publishing on a wide range of aspects of the topic for 15 years. For five years, he has been lecturing on quantum computing, thereby training engineers with the specialist knowledge required by the job market.
This exciting topic is just one of the research fields in which Prof. Dr. Wolfgang Mauerer's laboratory is in close contact with the international research community. Another is the further development of software structures, in close cooperation with the Linux Foundation, where the researchers can already look back on countless publications and several awards. A Regensburg tool called "Pasta" is close to being adopted as a worldwide standard development tool. "You will no longer find an industrial project without open-source software in it: MRI scanners, smartphones, cars," says Prof. Dr. Mauerer, emphasising the significance of this decision. The most important operating system, he notes, is Linux. The work of Mauerer's laboratory, above all of doctoral student Ralf Ramsauer, analyses the human component behind this software giant. In the end, it is people who are behind the millions of lines of code. Usually, several programmers work on a problem at the same time. Although they are in close contact by e-mail, it is often difficult to track changes; in the global back and forth of improvement proposals and change notes, important information is quickly lost.
With the help of clever software and machine learning, Ramsauer and the lab team manage to bring order into the human chaos behind Linux. "What we have developed is better than anything that previously existed worldwide in this area," Mauerer says. It is no coincidence that a Munich-based car manufacturer is also involved in the project, because the work concerns, above all, the heart of the operating system. "If something goes wrong in the kernel, the entire system comes to a halt," explains Prof. Dr. Mauerer. Throughout all this, it is important to the professor that, despite interest from the private sector, the research remains freely accessible. Everything the laboratory publishes is also open source. "That should be good scientific practice. Openness and reproducibility are at the core of science," Mauerer argues.
Mitchell Joblin received the 2018 dissertation award from the University of Passau for his dissertation "Structural and Evolutionary Analysis of Developer Networks".
For more information on the award visit https://www.uni-passau.de/forschung/wissenschaftspreise/dissertationspreise/.
Mitchell Joblin successfully defends his PhD Thesis. Congrats!
For more information visit https://www.oth-regensburg.de/fakultaeten/informatik-und-mathematik/nachrichten/einzelansicht/news/erfolgreiche-promotion-im-bereich-software-engineering.html.