Elektronik und Informatik
Internet der Dinge
(2020)
Zuschaltverfahren
(2021)
Magnetgetriebe
(2019)
Synchronization and Control of Modular AC- and DC-Sided Parallel-Connected Three-Level NPC Inverters
(2018)
Nowadays, businesses focused on consumer products are challenged by short production cycles, high pricing pressure, and the need to deliver new features and services at regular intervals. Currently, businesses tackle these challenges by automating their business processes while trying to remain flexible by introducing methods for process variability modeling. However, for larger processes and variability models, it becomes difficult to consider, maintain, and optimize all process variations in the various execution contexts. In software development, highly agile requirements are usually addressed with a flexible microservice architecture. Nonetheless, the fast-changing service landscape is often not fully reflected in the underlying business processes, leading to inefficiency and loss of profit. With this work, we extend our framework for process variability modeling with concepts of Microflows, allowing agile business process modeling and orchestration while utilizing the full flexibility of the underlying microservices. In addition, we present a case study showing how this approach is used in the context of an IoT application.
Parameter Identification and Optimization of an Oceanographic Monitoring Remotely Operated Vehicle
(2018)
Modelling and Essential Control of an Oceanographic Monitoring Remotely Operated Underwater Vehicle
(2018)
Additive manufacturing of optical elements out of polymer allows new design concepts for optics. The parts are built up layer by layer. Unlike binder-based printing of glass particles, which requires a secondary sintering step, polymer printing needs no additional step to create the final part. With more printers and transparent materials available, this technology becomes increasingly relevant for prototyping and custom optics. Therefore, a deep understanding of the optical effects in the part is desirable. A key property of optical elements is the refractive index. The materials for polymer printing are most commonly resins that cure under UV exposure and show lower refractive indices in the liquid phase than when cured. Assuming a dependency of the refractive index on the degree of polymerization, and therefore on the UV exposure, the layering process of additive manufacturing causes variations of the refractive index within the part. Using Scanning Focused Refractive Index Microscopy, the distribution of the refractive index within and between the layers is analyzed. The analysis includes comparisons between raw parts after printing and parts after UV post-curing. Additionally, layer-free samples from a Continuous Liquid Interface Printing system are examined for the homogeneity of their refractive index distribution. The purpose of the presentation is to give a detailed insight into the optical effects occurring at the layer interfaces of elements created by additive manufacturing. Possible use cases of the refractive index distributions within the part are also discussed.
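The assumed dependency of the refractive index on the degree of polymerization can be illustrated with a simple linear mixing model; this is a hypothetical sketch with assumed index values, not the relationship measured in the study:

```python
def refractive_index(cure_fraction, n_liquid=1.47, n_cured=1.52):
    """Illustrative linear model: index rises with the degree of polymerization.

    n_liquid and n_cured are assumed values for a UV-curable resin;
    real resins may behave nonlinearly.
    """
    if not 0.0 <= cure_fraction <= 1.0:
        raise ValueError("cure fraction must be in [0, 1]")
    return n_liquid + cure_fraction * (n_cured - n_liquid)

# Regions near a layer interface receive different UV doses and thus
# different cure fractions, which this model maps to index variations:
for cure in (0.0, 0.5, 1.0):
    print(cure, refractive_index(cure))
```

Under such a model, any spatial variation of UV dose across layers translates directly into a refractive index gradient, which is what the microscopy measurements probe.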
Linking Intrusion Detection System Information and System Model to Redesign Security Architecture
(2020)
Improved Direct Power Control Applied to Parallel Active Filtering Based on Fuzzy Logic Controller
(2018)
Microflows: Leveraging Process Mining and an Automated Constraint Recommender for Microflow Modeling
(2018)
The digital transformation occurring in enterprises results in an increasingly dynamic and complex IT landscape that in turn impacts enterprise architecture (EA) and its artefacts. New approaches for dealing with more complex and dynamic models and conveying EA structural and relational insights are needed. As EA tools attempt to address these challenges, virtual reality (VR) can potentially enhance EA tool capabilities and user insight, but further investigation is needed into how this can be achieved. This paper contributes a VR solution concept for visualizing, navigating, and interacting with EA tool dynamically-generated diagrams and models using the EA tool Atlas. An implementation shows its feasibility, and a case study using EA scenarios demonstrates its potential.
VR-EA: Virtual Reality Visualization of Enterprise Architecture Models with ArchiMate and BPMN
(2019)
The digital transformation occurring throughout enterprises results in an increasingly dynamic and complex IT landscape. As the structures with which enterprise architecture (EA) deals become more digital, larger, more complex, and dynamic, new approaches for modeling, documenting, and conveying EA structural and relational aspects are needed. The potential for virtual reality (VR) to address upcoming EA modeling challenges has as yet been insufficiently explored. This paper contributes a VR hypermodel solution concept for visualizing, navigating, and interacting with ArchiMate and Business Process Model and Notation (BPMN) models in VR. An implementation demonstrates its feasibility, and a case study is used to show its potential.
Software design patterns and the abstractions they offer can support developers and maintainers with program code comprehension. Yet manually-created pattern documentation within code or code-related assets, such as documents or models, can be unreliable, incomplete, and labor-intensive. While various Design Pattern Detection (DPD) techniques have been proposed, industrial adoption of automated DPD remains limited. This paper contributes a hybrid DPD solution approach that leverages a Bayesian network integrating developer expertise via rule-based micropatterns with our machine learning subsystem that utilizes graph embeddings. The prototype shows its feasibility, and the evaluation using three design patterns shows its potential for detecting both design patterns and variations.
While Virtual Reality (VR) has been applied to various domains to provide new visualization and interaction capabilities, enabling programmers to utilize VR for their software development and maintenance tasks has been insufficiently explored. In this paper, we present the Hyper-Display Environment (HyDE) in the form of a mixed-reality (HyDE-MR) or virtual reality (HyDE-VR) variant respectively, which provides simultaneous multiple operating system window visualization with integrated keyboard/mouse viewing and interaction using MR or in pure VR via a virtual keyboard. This paper applies HyDE in a software development case study as an alternative to typical non-VR Integrated Development Environments (IDEs), supporting software engineering tasks with multiple live screens in VR as an augmented virtuality. The MR solution concept enables programmers to benefit from VR visualization and virtually unlimited information displays while supporting their more natural keyboard interaction for basic code-centric tasks. Thus, developers can leverage VR paradigms and capabilities while directly interacting with their favorite tools to develop and maintain program code. A prototype implementation is described, with a case study demonstrating its feasibility and an initial empirical study showing its potential.
As the size of software program code bases in software development projects increases, insight into and comprehension of their underlying dependency structures presents a challenge for programmers. The increasing availability of virtual reality (VR) systems brings VR-based visualization of program code structures into practical reach for software developers and could support program comprehension and insight. However, the complete visual immersion with VR presents a cognitive burden and potential distractions. Applying gamification to such a VR visualization capability has hitherto been insufficiently investigated as to its potential motivation and program comprehension factors. This paper describes and evaluates a VR digital gamification approach for program code called VR Gamified Immersion in Software structures (VR-GaImS), which applies digital gamification to a multi-metaphor VR visualization of software program structures. The results of a preliminary empirical investigation utilizing our prototype indicate its potential to increase enjoyment and motivation, focus attention, and encourage the exploration of software structures.
A complex and dynamic IT landscape with evermore digital elements, relations, and content presents a challenge for Enterprise Architecture (EA). Disparate digital repositories, including Knowledge Management Systems (KMS), Enterprise Content Management Systems (ECMS), and Enterprise Architecture Tools (EAT), often remain disjointed. And even if integrated, insights remain hindered by current visualization limitations, making it increasingly difficult to analyze, manage, and gain insights into the digital enterprise reality. This paper contributes our nexus-based Virtual Reality (VR) solution concept VR-EA+TCK that enhances and amalgamates EAT with KMS and ECMS capabilities. By enabling visualization, navigation, and interaction in VR with dynamically-generated EA diagrams, knowledge/value chains, and KMS/ECMS digital entities, it sets the groundwork for stakeholder-accessible grassroots enterprise modeling/analysis and future collaboration in a metaverse. An implementation shows its feasibility, while a case study demonstrates its potential using enterprise analysis scenarios: ECMS/KMS coverage in the EA, business processes, knowledge chains, Wardley Maps, and risk analysis.
VR-EvoEA+BP
(2023)
Enterprise digitalization results in an evolving and dynamic IT landscape of digital elements, relations, knowledge, content, activities, and business processes (BPs), which are spread across disparate enterprise IT systems, repositories, and tools. To be relevant, useful, and actionable, Enterprise Architecture (EA) relies on comprehensive documentation based on underlying information corresponding to reality. Yet current diagram-centric 2D visualizations for EA and BP models are too limited in scope to express reality (intentionally simplifying), are typically static (and not kept up-to-date), and cannot express and integrate the changing complexities of the enterprise context. This misalignment with reality and a changing enterprise misinforms and constrains the context-awareness and perception of EA and BP for stakeholders, impeding analyses, management, and holistic insights into the enterprise digital reality. This paper contributes our nexus-based Virtual Reality (VR) solution concept VR-EvoEA+BP to support comprehensive enterprise context visualization in conjunction with EA and model evolution and BP mining and analysis. Portraying an organic, evolving, and dynamic enterprise while supplementing static enterprise structure depictions, our implementation demonstrates its feasibility. A case study based on enterprise analysis and BP scenarios exhibits its potential.
Enterprise Architecture (EA) Frameworks (EAFs) have attempted to support comprehensive and cohesive modeling and documentation of the enterprise. However, these EAFs were not conceived for today's rapidly digitalized enterprises and the associated IT complexity. A digitally-centric EAF is needed, freed from past restrictive EAF paradigms and embracing the new potential in a data-centric world. This paper proposes an alternative EAF that is digital, holistic, and digitally sustainable: the Digital Diamond Framework (D2F). D2F is designed for responsive and agile enterprises, for aligning business plans and initiatives with the actual enterprise state, and for addressing the needs of EA for digitized structure, order, modeling, and documentation. The feasibility of D2F is demonstrated with a prototype implementation of an EA tool that applies its principles, showing how the framework can be practically realized, while a case study based on the ArchiSurance example and an initial performance and scalability characterization provide additional insights into its viability.
Databases are becoming a ubiquitous and integral part of most software as the data era and the Internet of Everything unfold. Alternative database types such as NoSQL grow in popularity and allow data to be stored and accessed more simply or in new ways. Thus, software developers, not just database specialists, are increasingly likely to encounter and need to deal with databases. Virtual Reality (VR) technology has grown in popularity, yet its integration into the software development tool chain has been limited. One potential application area for VR technology that has not been sufficiently explored is database-model visualization. This paper describes Virtual Reality Immersion in Data Models (VRiDaM), a generic approach for visualizing, navigating, and conveying database-model information interactively. It describes and explores both native VR and WebVR solution concepts, with prototypes showing the viability of the approach.
DEKXTROSE: An Education 4.0 Mobile Learning Approach and Object-Aware App Based on a Knowledge Nexus
(2020)
The exponential growth in knowledge coupled with the decreasing knowledge half-life creates a challenging situation for educational programs, particularly those preparing software engineers for their very dynamic high-technology field. Teachers in high-technology education areas are challenged in selecting relevant knowledge and making it intuitively accessible to students, especially with regard to the highly dynamic digital and software technologies. This paper contributes a knowledge nexus-based multimedia approach aligned with Higher Education 4.0 for creating learning apps on mobile devices that support multiple didactic models, leverage intrinsic curiosity and motivation, support gamification, and enable digital collaboration. Object recognition is used to trigger learning paths, and various didactic methods are supported via workflow-like learning flows to support group- or team-based learning. A prototype app was realized to demonstrate its feasibility, and an empirical evaluation in software engineering shows the didactic potential and advantages of the approach, which can be readily generalized and applied to the arts, sciences, etc.
Software models in the Unified Modeling Language (UML) can be created or automatically reverse-engineered and used for quickly gaining structural insights into larger, legacy, or unfamiliar software. But as the size, structural complexity, and interdependencies between software components in larger systems grow, two-dimensional viewing and modeling have limitations, and new ways of visualizing larger models and the numerous associated diagrams of different types are needed to intuitively convey structural and relational insights. To investigate the feasibility of using Virtual Reality (VR) to create an immersive UML-based software modeling experience, this paper contributes a VR solution concept for visualizing, navigating, modeling, and interacting with software models using UML notation. An implementation shows its feasibility while an empirical evaluation highlights its potential.
The volume of program source code created, reused, and maintained worldwide is rapidly increasing, yet code comprehension remains a limiting productivity factor. For developers and maintainers, well-known software design patterns and the abstractions they offer can help support program comprehension. However, manual pattern documentation techniques in code and code-related assets, such as comments, documents, or models, are not necessarily consistent or dependable and are cost-prohibitive. To address this situation, we propose Hybrid Design Pattern Detection (HyDPD), a generalized, programming-language-agnostic approach that combines graph analysis (GA) and Machine Learning (ML) to automate the detection of design patterns via source code analysis. Our realization demonstrates its feasibility. An evaluation compared each technique and their combination for three common patterns across a set of 75 public single-pattern sample projects in Java and C#. The GA component was also used to detect the 23 Gang of Four design patterns across 258 sample C# and Java projects as well as in a large Java project. Performance and scalability were measured. The results show the advantages and potential of a hybrid approach combining GA with artificial neural networks (ANNs) for automated design pattern detection, providing compensating advantages such as reduced false negatives and improved F1 scores.
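The hybrid combination of graph-analysis and ML confidences, and the F1 evaluation mentioned above, can be sketched roughly as follows; the linear fusion and its weight are illustrative assumptions, not the paper's actual integration scheme:

```python
def fuse(ga_score, ml_score, weight=0.5):
    """Blend a rule-based graph-analysis confidence with an ML confidence.

    Both scores are assumed to be in [0, 1]; the fixed weight is a
    simplifying assumption for illustration.
    """
    return weight * ga_score + (1 - weight) * ml_score

def f1(true_positives, false_positives, false_negatives):
    """F1 score as the harmonic mean of precision and recall."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return 2 * precision * recall / (precision + recall)

# A candidate class flagged strongly by the rules but weakly by the model:
print(fuse(0.9, 0.4))  # fused confidence
print(f1(8, 2, 2))     # detector quality over a labeled sample set
```

Fusing two imperfect detectors in this spirit is what allows one signal to compensate for the other's false negatives, which is the advantage the abstract attributes to the hybrid approach.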
With the increasing pressure to deliver additional software functionality, software engineers and developers are often confronted with the dilemma of sufficient software testing. One aspect to avoid is test redundancy, and measuring test (or code or statement) coverage can help focus test development on those areas that are not yet sufficiently tested. As software projects grow, it can be difficult to visualize both the software product and the software testing area and their dependencies. This paper contributes VR-TestCoverage, a Virtual Reality (VR) solution concept for visualizing and interacting with test coverage, test results, and test dependency data in VR. Our VR implementation shows its feasibility. The evaluation results based on a case study show its support for three testing-related scenarios.
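Statement (line) coverage, one metric such a visualization builds on, is simply the fraction of executable lines exercised by at least one test; a minimal sketch using line-number sets (a simplifying representation, not the tool's actual data model):

```python
def statement_coverage(executed_lines, executable_lines):
    """Fraction of executable source lines hit by at least one test."""
    covered = executed_lines & executable_lines
    return len(covered) / len(executable_lines)

executable = {1, 2, 3, 5, 8, 9}      # lines the instrumenter marks executable
executed = {1, 2, 3, 9, 12}          # line 12 is not executable (e.g., a comment)
print(round(statement_coverage(executed, executable), 2))
```

Areas where this ratio is low are exactly the regions where new test development avoids redundancy, which is the focus the abstract describes.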
The increasing demand for software functionality necessitates an increasing amount of program source code that is retained and managed in version control systems, such as Git. As the number, size, and complexity of Git repositories increases, so does the number of collaborating developers, maintainers, and other stakeholders over a repository's lifetime. In particular, visual limitations of Git tooling hamper repository comprehension, analysis, and collaboration across one or multiple repositories with a larger stakeholder spectrum. This paper contributes VR-Git, a Virtual Reality (VR) solution concept for visualizing and interacting with Git repositories in VR. Our prototype realization shows its feasibility, and our evaluation results based on a case study show its support for repository comprehension, analysis, and collaboration via branch, commit, and multi-repository scenarios.
Repeatable processes are fundamental for describing how enterprises and organizations operate, for production, for Industry 4.0, etc. As digitalization and automation progress across all organizations and industries, including enterprises, business, government, manufacturing, and IT, evidence-based comprehension and analysis of the processes involved, including their variations, anomalies, and performance, becomes vital for an increasing set of stakeholders. Process Mining (PM) relies on process logs (and is thus evidence-based) to provide process-centric analysis data, yet the insights are not necessarily visually accessible to a larger set of stakeholders (who may not be process or data analysts). Towards addressing certain challenges described in the Process Mining Manifesto, this paper contributes VR-ProcessMine, a solution for visualizing and interacting with PM results in Virtual Reality (VR). Our realization shows its feasibility, and a case-based evaluation provides insights into its capabilities.
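Process mining techniques such as those visualized here commonly start from a directly-follows graph mined from an event log; a minimal sketch, assuming a simple list-of-traces log format:

```python
from collections import Counter

def directly_follows(traces):
    """Count how often activity a is directly followed by activity b
    across all traces of an event log."""
    dfg = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# A tiny illustrative event log (activity names are assumptions):
log = [
    ["register", "check", "approve"],
    ["register", "check", "reject"],
    ["register", "check", "approve"],
]
print(directly_follows(log))
```

Edge frequencies of this kind are what a PM tool renders as arc thickness or color, so the counts above are the raw material behind a process map's variations and performance indicators.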
As systems grow in complexity, the interdisciplinary nature of systems engineering makes the visualization and comprehension of the underlying system models challenging for the various stakeholders. This, in turn, can affect validation and realization correctness. Furthermore, stakeholder collaboration is often hindered due to the lack of a common medium to access and convey these models, which are often partitioned across multiple 2D diagrams. This paper contributes VR-SysML, a solution concept for visualizing and interacting with SysML models in Virtual Reality (VR). Our prototype realization shows its feasibility, and our evaluation results based on a case study show its support for the various SysML diagram types in VR, cross-diagram element recognition via our backplane followers concept, and depicting further related (SysML and non-SysML) models side-by-side in VR.
VR-V&V
(2023)
Building quality into a software (SW) system necessitates supporting quality-related lifecycle activities during software development. In software engineering, software Verification and Validation (V&V) processes constitute an inherent part of Software Quality Assurance (SQA) processes. A subset of the V&V activities involved are: 1) bidirectional traceability analysis of requirements to design model elements, and 2) software testing. Yet the complex nature of large SW systems and the dependencies involved in both design models and testing present a challenge to current V&V tools and methods regarding support for trace analysis. One of software's essential challenges remains its invisibility, which also affects V&V activities. This paper contributes VR-V&V, a Virtual Reality (VR) solution concept for supporting immersive V&V activities. By visualizing requirements, models, and testing artifacts with their dependencies and trace relations immersively, they become intuitively accessible to a larger stakeholder audience, such as SQA personnel, while supporting digital cognition. Our prototype realization shows the feasibility of supporting immersive bidirectional traceability as well as immersive software test coverage and analysis. The evaluation results are based on a case study demonstrating its capabilities; in particular, traceability support was demonstrated with ReqIF, ArchiMate models, test results, test coverage, and test-source-to-test-target dependencies.
VR-SysML+Traceability
(2023)
As systems grow in complexity, the interdisciplinary nature of systems engineering makes the visualization and comprehension of the underlying system models challenging for the various stakeholders. This, in turn, can affect validation and realization correctness. Furthermore, stakeholder collaboration is often hindered due to the lack of a common medium to access and convey these models, which are often partitioned across multiple 2D diagrams. This paper contributes VR-SysML, a solution concept for visualizing and interacting with Systems Modeling Language (SysML) models in Virtual Reality (VR). Our prototype realization shows its feasibility, and our evaluation results based on a case study show its support for the various SysML diagram types in VR, cross-diagram element recognition via our Backplane Followers concept, and depicting further related (SysML and non-SysML) models side-by-side in VR.
VR-GitCity
(2023)
The increasing demand for software functionality necessitates an increasing amount of program source code that is retained and managed in version control systems, such as Git. As the number, size, and complexity of Git repositories increases, so does the number of collaborating developers, maintainers, and other stakeholders over a repository's lifetime. In particular, visual limitations of command line or two-dimensional graphical Git tooling can hamper repository comprehension, analysis, and collaboration across one or multiple repositories when a larger stakeholder spectrum is involved. This is especially true for depicting repository evolution over time. This paper contributes VR-GitCity, a Virtual Reality (VR) solution concept for visualizing and interacting with Git repositories in VR. The evolution of the code base is depicted via a 3D treemap utilizing a city metaphor, while the commit history is visualized as vertical planes. Our prototype realization shows its feasibility, and our evaluation results based on a case study show its depiction, comprehension, analysis, and collaboration capabilities for evolution, branch, commit, and multi-repository analysis scenarios.
VR-EDStream+EDA
(2023)
With increasing digitalization, the importance of data and events, which comprise its most fundamental level, cannot be overemphasized. All types of organizations, including enterprises, business, government, manufacturing, and the supporting IT, depend on these fundamental building blocks. Evidence-based comprehension and analysis of the underlying data and events, their stream processing, and their correlation with enterprise events and activities thus becomes vital for an increasing set of (grassroots or citizen) stakeholders. Hence, further investigation of accessible alternatives for visually supporting the analysis of data and events is needed. This paper contributes VR-EDStream+EDA, a solution for immersively visualizing and interacting with data and event streams or pipelines and for generically visualizing Event-Driven Architecture (EDA) in Virtual Reality (VR). Our realization shows its feasibility, and a case-based evaluation provides insights into its capabilities.
Learning for E-Learning
(2020)
VR Live Motion Capture
(2021)
Design and Implementation of a Plug-In Repetitive Controller for a High Precision Axis System
(2021)
Statistische Versuchsplanung
(2021)
IT-Sicherheit
(2018)
Can one 3D print a laser?
(2020)
Speichereinsatz versus Netzausbau - Methoden der Bürgerkommunikation am Beispiel des Projekts NEOS
(2020)
Red Teaming
(2019)
At the beginning of the thesis, the theoretical foundations of penetration testing, auditing, and red teaming are described. The section on the legal framework examines the relevant laws.
Subsequently, market research consisting of a primary and a secondary study was conducted. The secondary research describes the offerings and service providers on the red teaming market. From these, companies from the DACH region were selected for the interviews of the primary research.
The red team must achieve a previously defined goal. In coordination with the white team, the red team carries out attacks that may involve technical, physical, and human components. Which components are used differs by project and service provider. The blue team's task is to detect the attacks and respond to them.
Using the collected theoretical knowledge and the interviews, the methods were compared and a methodology for their classification was created. Red teaming, penetration tests, and audits are useful in different situations. To facilitate the choice, it is sensible to define the goals that the test is intended to achieve.
The scope is helpful as a third indicator, since a penetration test is a technical examination, whereas an audit or red teaming often takes a holistic view of the organization.
The last section describes a practicable method for conducting red teaming. For this purpose, theses were formulated on the basis of the collected knowledge.
The thesis concludes with a summary and an outlook on the future of red teaming.
Automated Software Engineering Process Assessment: Supporting Diverse Models using an Ontology
(2013)
Leveraging Augmented Reality to Support Context-Aware Tasks in Alignment with Business Processes
(2021)
The seamless inclusion of Augmented Reality (AR) in Business Process Management Systems (BPMSs) for Smart Factory and Industry 4.0 processes remains a challenge. Towards this end, this paper contributes an approach integrating context-aware AR into intelligent business processes to support and guide manufacturing personnel tasks, enable live task assignment optimization, and support task execution quality. Our realization extends two BPMSs (Camunda and AristaFlow) and supports various AR devices. Various AR capabilities are demonstrated via a simulated industrial case study.
A Context and Augmented Reality BPMN and BPMS Extension for Industrial Internet of Things Processes
(2022)
In the context of Industry 4.0, smart factories enable a new level of highly individualized and very efficient production, driven by highly automated processes and connected Industrial Internet of Things (IIoT) devices. Yet the IIoT process context, crucial for operational process enactment, cannot be readily represented in processes as currently modeled. Despite automation progress, manual tasks performed by humans (such as maintenance) remain, and while complicated tasks can be supported by Augmented Reality (AR) devices, they remain insufficiently integrated into global production processes. To seamlessly integrate process automation, IIoT context, and AR, this paper contributes BPMN-CARX, a Context and Augmented Reality eXtension (CARX) for BPMN (Business Process Model and Notation) and the CARX Framework, which enables AR and IIoT context integration with existing Business Process Management Systems (BPMSs). An Industry 4.0 case study demonstrates its feasibility and applicability.
Production processes in Industry 4.0 settings are usually highly automated. However, many complicated tasks, such as machine maintenance, must be executed by human workers. In current smart factories, such tasks can be supported by Augmented Reality (AR) devices. These AR tasks rely on a large number of contextual factors, like live data from machines or work safety conditions, and are mostly not well integrated into the global production process. This can lead to various problems like suboptimal task assignment, over-exposure of workers to hazards like noise or heat, or delays in the production process. Current Business Process Management (BPM) Systems (BPMS) are not capable of readily taking such factors into account. Therefore, this contribution proposes a novel approach for context-integrated modeling and execution of processes with AR tasks. Our practical evaluations show that our AR Process Framework can be easily integrated with prevalent BPMSs. Furthermore, we have created a comprehensive simulation scenario, and our findings suggest that the application of this system can lead to various benefits, like better quality of AR task execution and cost savings regarding the overall Industry 4.0 processes.
Although production processes in Industry 4.0 settings are highly automated, many complicated tasks, such as machine maintenance, continue to be executed by human workers. While smart factories can provide these workers with some digitalization support via Augmented Reality (AR) devices, these AR tasks depend on many contextual factors, such as live data feeds from machines in view, or current work safety conditions. Although currently feasible, these localized contextual factors are mostly not well-integrated into the global production process, which can result in various problems such as suboptimal task assignment, over-exposure of workers to hazards such as noise or heat, or delays in the production process. Current Business Process Management (BPM) Systems (BPMS) were not particularly designed to consider and integrate context-aware factors during planning and execution. This paper describes the AR-Process Framework (ARPF) for extending a BPMS to support context-integrated modeling and execution of processes with AR tasks in industrial use cases. Our realization shows how the ARPF can be easily integrated with prevalent BPMS. Our evaluation findings from a simulation scenario indicate that ARPF can improve Industry 4.0 processes with regard to AR task execution quality and cost savings.
Industry 4.0 production comprises complicated highly automated processes. However, human activities are also a crucial component of these processes, e.g., for machine maintenance. Task assignment of human resources in this domain is challenging, as many factors have to be taken into account to ensure effective and efficient activity execution and satisfy special conditions (like worker safety). To overcome the limitations of current Business Process Management (BPM) Systems regarding activity resource assignment, this contribution provides a BPM-integrated approach that applies fuzzy sets for activity assignment. Our findings suggest that this approach can be easily applied to complex production scenarios, while providing efficient performance even with a large number of concurrent activity assignment requests. Additionally, our evaluation shows its potential for improved work distribution which can lead to cost savings in Industry 4.0 production processes.
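The fuzzy-set idea for activity assignment can be sketched as scoring each worker with membership functions over relevant factors and aggregating them with a fuzzy AND (minimum); the factors, thresholds, and function shapes below are illustrative assumptions, not the paper's model:

```python
def membership_skill(skill_level):
    """Suitability grows linearly with skill level (assumed 0..10 scale)."""
    return min(skill_level / 10.0, 1.0)

def membership_proximity(distance_m, max_distance_m=100.0):
    """Closer workers are more suitable; zero beyond max_distance_m."""
    return max(0.0, 1.0 - distance_m / max_distance_m)

def suitability(skill_level, distance_m):
    """Fuzzy AND (minimum) over the individual membership degrees."""
    return min(membership_skill(skill_level), membership_proximity(distance_m))

# Hypothetical workers: (skill level, distance to the machine in meters)
workers = {"A": (8, 20.0), "B": (10, 90.0)}
best = max(workers, key=lambda w: suitability(*workers[w]))
print(best)
```

The minimum aggregation means a worker who is excellent on one factor but very poor on another (like highly skilled worker B far from the machine) does not win the assignment, which mirrors the safety- and efficiency-constrained assignment goal described above.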