VR-ISA
(2024)
Software is, in its essence, an inherently invisible digital construct, and thus its comprehension and its visualization remain a challenge. All software involves some underlying structure(s), and Software Architecture (SA) comprises the (intended) conceptual abstractions and structuring principles across this invisible construct. Agile development methods, DevOps, and continuous development result in a changing implementation and an associated SA that is evolving and continually in flux. Any presumed SA understanding and any (perhaps outdated or inconsistent) associated SA documentation may also diverge from reality, while any shared SA concept across stakeholder minds may vary or differ, potentially resulting in a lack of conceptual integrity. In contrast, an Informed Software Architecture (ISA) is grounded in reality based on actual data and evidence, rather than being influenced by out-of-sync models, documentation, misconceptions, or assumptions. Yet the challenge remains of how best to visually convey ISA aspects, such as internal static software structures and behavioral and operational dynamics, to support evidence-based design, comprehension, and insights in an accessible way for a wider stakeholder spectrum. This paper contributes VR-ISA, a Virtual Reality (VR) solution concept to immersively support an ISA with the visualization of structural, behavioral, and operational aspects. To exemplify our solution concept, three VR-based viewpoints, framing different concerns for different stakeholder groups, are used to illustrate the potential of VR to support ISA: 1) components and connectors, for depicting dynamic distributed event and data streams, 2) modules and dependencies, for depicting static internal module composition and their dependencies, and 3) execution observability, for depicting operational execution, tracing, and observability aspects. Our realization shows its feasibility, while a case-based evaluation provides insights into its capabilities and potential.
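To make the viewpoint idea concrete, the following is a minimal sketch (in Python, with purely illustrative names; it is not the VR-ISA implementation) of how the three example viewpoints could be modeled as data objects that frame concerns and evidence sources for particular stakeholder groups:

# Illustrative only: assumed viewpoint model, not the VR-ISA implementation.
from dataclasses import dataclass

@dataclass
class Viewpoint:
    name: str                    # e.g. "Components & Connectors"
    concerns: list[str]          # the questions this viewpoint frames
    stakeholders: list[str]      # the stakeholder groups it addresses
    evidence_sources: list[str]  # runtime/static data that keeps the SA "informed"

ISA_VIEWPOINTS = [
    Viewpoint("Components & Connectors",
              ["dynamic distributed event and data streams"],
              ["architects", "operators"],
              ["event broker topics", "runtime telemetry"]),
    Viewpoint("Modules & Dependencies",
              ["static internal module composition", "module dependencies"],
              ["developers", "maintainers"],
              ["static code analysis"]),
    Viewpoint("Execution Observability",
              ["operational execution", "tracing", "observability"],
              ["operators", "SQA personnel"],
              ["distributed traces", "logs", "metrics"]),
]

if __name__ == "__main__":
    for vp in ISA_VIEWPOINTS:
        print(f"{vp.name}: {', '.join(vp.concerns)} -> {', '.join(vp.stakeholders)}")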
VR-DevOps
(2024)
DevOps, with its tools and practices that integrate and automate software development tasks, has become mainstream and offers a host of benefits for modern software development. While the main goal is to foster collaboration with stakeholders, the plethora of platforms and tools and the uniqueness of each pipeline, coupled with a non-uniform display of information (graphical or not), can hinder collaboration and insights, especially for non-developers. Our solution concept VR-DevOps contributes a cross-platform immersive visualization of DevOps pipelines in Virtual Reality (VR). Our prototype realization shows its feasibility, while a case-based evaluation provides insights into its capabilities.
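As a rough illustration of the cross-platform aspect, the sketch below (assumed, simplified field names; not the VR-DevOps code or any platform's real API schema) normalizes pipeline runs from different DevOps platforms into one neutral model that an immersive front end could render uniformly:

# Illustrative only: assumed, simplified payloads; not a real platform API schema.
from dataclasses import dataclass

@dataclass
class StageRun:
    name: str
    status: str          # e.g. "success" | "failed" | "running" | "skipped"
    duration_s: float

@dataclass
class PipelineRun:
    platform: str        # e.g. "GitLab CI", "GitHub Actions", "Jenkins"
    pipeline_id: str
    stages: list[StageRun]

def from_gitlab_like(raw: dict) -> PipelineRun:
    """Map a hypothetical GitLab-style payload to the platform-neutral model."""
    return PipelineRun(
        platform="GitLab CI",
        pipeline_id=str(raw["id"]),
        stages=[StageRun(j["name"], j["status"], float(j.get("duration", 0.0)))
                for j in raw.get("jobs", [])],
    )

if __name__ == "__main__":
    run = from_gitlab_like({"id": 42, "jobs": [
        {"name": "build", "status": "success", "duration": 73.4},
        {"name": "test", "status": "failed", "duration": 120.9},
    ]})
    for s in run.stages:  # a VR scene could map status to color, duration to length
        print(f"[{run.platform}] {s.name}: {s.status} ({s.duration_s:.1f}s)")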
VR-SDLC
(2024)
As systems grow in complexity, so does their associated lifecycle and with it the need to manage the various elements, relations, and activities involved in the Software (or Systems) Development Life Cycle (SDLC). Various notations for system, software, and process modeling have been specified, such as the Systems Modeling Language (SysML), the Unified Modeling Language (UML), and the Business Process Model and Notation (BPMN), respectively; yet due to their two-dimensional (2D) diagram focus, they are ill-suited for visualizing a comprehensive, contextualized view of the entire systems engineering or software engineering lifecycle. To address this need, the Lifecycle Modeling Language (LML) utilizes a relatively simple ontology and three primary diagrams while supporting extensibility. Yet lifecycle comprehension, analysis, collaboration, and contextual insights remain constrained by current 2D limitations. This paper contributes our Virtual Reality (VR) solution concept VR-SDLC for holistic visualization of SDLC elements, relations, and diagrams. Our prototype implementation utilizing LML demonstrates its feasibility, while a case study exhibits its potential.
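A small sketch of the underlying idea, loosely in the spirit of LML's entity-plus-typed-relation ontology (the entity and relation names below are illustrative rather than the normative LML specification), shows how SDLC elements could be stored and queried before being laid out in VR:

# Illustrative only: entity/relation names are assumptions, not the LML standard.
from collections import defaultdict

class LifecycleModel:
    def __init__(self):
        self.entities = {}                  # id -> (kind, name)
        self.relations = defaultdict(list)  # source id -> [(relation, target id)]

    def add(self, eid, kind, name):
        self.entities[eid] = (kind, name)

    def relate(self, source, relation, target):
        self.relations[source].append((relation, target))

    def related(self, eid, relation):
        return [t for r, t in self.relations[eid] if r == relation]

if __name__ == "__main__":
    m = LifecycleModel()
    m.add("A1", "Action", "Elicit requirements")
    m.add("S1", "Statement", "The system shall export reports")
    m.add("A2", "Action", "Acceptance test: report export")
    m.relate("A1", "produces", "S1")
    m.relate("A2", "verifies", "S1")
    print("A2 verifies:", [m.entities[t] for t in m.related("A2", "verifies")])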
VR-V&V
(2023)
To build quality into a software (SW) system necessitates supporting quality-related lifecycle activities during software development. In software engineering, software Verification and Validation (V&V) processes constitute an inherent part of Software Quality Assurance (SQA) processes. A subset of the V&V activities involved comprises: 1) bidirectional traceability analysis of requirements to design model elements, and 2) software testing. Yet the complex nature of large SW systems and the dependencies involved in both design models and testing present a challenge to current V&V tools and methods regarding support for trace analysis. One of software’s essential challenges remains its invisibility, which also affects V&V activities. This paper contributes VR-V&V, a Virtual Reality (VR) solution concept towards supporting immersive V&V activities. By immersively visualizing requirements, models, and testing artifacts together with their dependencies and trace relations, these become intuitively accessible to a larger stakeholder audience, such as SQA personnel, while supporting digital cognition. Our prototype realization shows the feasibility of supporting immersive bidirectional traceability as well as immersive software test coverage and analysis. The evaluation results are based on a case study demonstrating its capabilities; in particular, traceability support was demonstrated with ReqIF, ArchiMate models, test results, test coverage, and test-source-to-test-target dependencies.
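To illustrate the kind of bidirectional traceability involved, the following sketch (assumed identifiers and structures; not the VR-V&V implementation) indexes trace links between requirement IDs, design model elements, and test cases, and derives a naive requirement coverage value:

# Illustrative only: assumed identifiers and structures, not the VR-V&V code.
from collections import defaultdict

class TraceIndex:
    def __init__(self):
        self.forward = defaultdict(set)   # source artifact -> target artifacts
        self.backward = defaultdict(set)  # target artifact -> source artifacts

    def link(self, source, target):
        self.forward[source].add(target)
        self.backward[target].add(source)

    def traces_from(self, artifact):      # forward traceability
        return self.forward[artifact]

    def traces_to(self, artifact):        # backward traceability
        return self.backward[artifact]

def requirement_test_coverage(trace, requirements, tests):
    """Share of requirements reached directly by at least one test case."""
    covered = sum(1 for r in requirements if trace.traces_to(r) & set(tests))
    return covered / len(requirements) if requirements else 0.0

if __name__ == "__main__":
    t = TraceIndex()
    t.link("REQ-001", "archimate:ApplicationService/Billing")  # requirement -> model
    t.link("TC-17", "REQ-001")                                 # test -> requirement
    print(sorted(t.traces_to("REQ-001")))                                   # ['TC-17']
    print(requirement_test_coverage(t, ["REQ-001", "REQ-002"], ["TC-17"]))  # 0.5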
VR-SysML+Traceability
(2023)
As systems grow in complexity, the interdisciplinary nature of systems engineering makes the visualization and comprehension of the underlying system models challenging for the various stakeholders. This, in turn, can affect validation and realization correctness. Furthermore, stakeholder collaboration is often hindered due to the lack of a common medium to access and convey these models, which are often partitioned across multiple 2D diagrams. This paper contributes VR-SysML, a solution concept for visualizing and interacting with Systems Modeling Language (SysML) models in Virtual Reality (VR). Our prototype realization shows its feasibility, and our evaluation results based on a case study show its support for the various SysML diagram types in VR, cross-diagram element recognition via our Backplane Followers concept, and depicting further related (SysML and non-SysML) models side-by-side in VR.
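As an approximation of the general idea behind cross-diagram element recognition (the Backplane Followers concept itself is realized in VR; the names below are assumptions), one underlying SysML element can be registered with all of its diagram occurrences so that selecting it anywhere highlights every occurrence:

# Illustrative only: an approximation of cross-diagram element identity tracking.
from collections import defaultdict

occurrences = defaultdict(list)  # model element id -> [(diagram, node id on diagram)]

def register(element_id, diagram, node_id):
    occurrences[element_id].append((diagram, node_id))

def select(element_id):
    """Return every diagram occurrence that should be highlighted together."""
    return occurrences[element_id]

register("blk:PowerSubsystem", "BDD: System Structure", "node-3")
register("blk:PowerSubsystem", "IBD: Power Flows", "node-11")
register("blk:PowerSubsystem", "SD: Startup", "lifeline-2")

for diagram, node in select("blk:PowerSubsystem"):
    print(f"highlight {node} on '{diagram}'")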
VR-GitCity
(2023)
The increasing demand for software functionality necessitates an increasing amount of program source code that is retained and managed in version control systems, such as Git. As the number, size, and complexity of Git repositories increase, so does the number of collaborating developers, maintainers, and other stakeholders over a repository’s lifetime. In particular, visual limitations of command line or two-dimensional graphical Git tooling can hamper repository comprehension, analysis, and collaboration across one or multiple repositories when a larger stakeholder spectrum is involved. This is especially true for depicting repository evolution over time. This paper contributes VR-GitCity, a Virtual Reality (VR) solution concept for visualizing and interacting with Git repositories in VR. The evolution of the code base is depicted via a 3D treemap utilizing a city metaphor, while the commit history is visualized as vertical planes. Our prototype realization shows its feasibility, and our evaluation results based on a case study show its depiction, comprehension, analysis, and collaboration capabilities for evolution, branch, commit, and multi-repository analysis scenarios.
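One plausible (assumed, simplified) mapping from repository history to city-metaphor geometry is sketched below: per-file change counts derived from git log drive the building heights. This is only an illustration of the metaphor, not the VR-GitCity pipeline, and it must be run inside a Git working copy with the git CLI available:

# Illustrative only: one assumed mapping from history metrics to building geometry.
import subprocess
from collections import Counter

def changes_per_file():
    """Count how often each file appears in the commit history (git CLI required)."""
    out = subprocess.run(["git", "log", "--numstat", "--pretty=format:"],
                         capture_output=True, text=True, check=True).stdout
    counts = Counter()
    for line in out.splitlines():
        parts = line.split("\t")
        if len(parts) == 3:            # numstat lines: "<added>\t<deleted>\t<path>"
            counts[parts[2]] += 1
    return counts

def building_dimensions(counts, base=1.0, scale=0.5):
    """Map change frequency to building height; footprint kept constant here."""
    return {path: {"width": base, "depth": base, "height": base + scale * n}
            for path, n in counts.items()}

if __name__ == "__main__":
    dims = building_dimensions(changes_per_file())
    for path, d in sorted(dims.items(), key=lambda kv: -kv[1]["height"])[:10]:
        print(f"{path}: height={d['height']:.1f}")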
Software design patterns and the abstractions they offer can support developers and maintainers with program code comprehension. Yet manually-created pattern documentation within code or code-related assets, such as documents or models, can be unreliable, incomplete, and labor-intensive. While various Design Pattern Detection (DPD) techniques have been proposed, industrial adoption of automated DPD remains limited. This paper contributes a hybrid DPD solution approach that leverages a Bayesian network integrating developer expertise via rule-based micropatterns with our machine learning subsystem that utilizes graph embeddings. The prototype shows its feasibility, and the evaluation using three design patterns shows its potential for detecting both design patterns and variations.
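A toy illustration of the hybrid idea (not the paper's actual model or parameters) combines rule-based micropattern evidence with a learned classifier's output via a naive Bayesian update, assuming the two evidence sources are conditionally independent given pattern presence:

# Illustrative only: toy numbers and a naive independence assumption.
def posterior(prior, likelihoods):
    """likelihoods: list of (P(evidence | pattern), P(evidence | no pattern))."""
    p_yes, p_no = prior, 1.0 - prior
    for lp, ln in likelihoods:
        p_yes *= lp
        p_no *= ln
    return p_yes / (p_yes + p_no)

if __name__ == "__main__":
    # Hypothetical evidence for an Observer candidate: micropattern rules fired,
    # and the graph-embedding classifier is treated as a calibrated likelihood pair.
    rule_evidence = (0.9, 0.2)   # P(rules fire | Observer), P(rules fire | not)
    ml_evidence = (0.8, 0.3)     # P(classifier says yes | Observer), P(... | not)
    p = posterior(0.1, [rule_evidence, ml_evidence])
    print(f"P(Observer | evidence) = {p:.2f}")   # ~0.57 with these toy numbers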
VR-EDStream+EDA
(2023)
With increasing digitalization, the importance of data and events, which comprise its most fundamental level, cannot be overemphasized. All types of organizations, including enterprises, businesses, governments, manufacturing, and the supporting IT, are dependent on these fundamental building blocks. Thus, evidence-based comprehension and analysis of the underlying data and events, their stream processing, and their correlation with enterprise events and activities become vital for an increasing set of (grassroots or citizen) stakeholders, and further investigation of accessible alternatives to visually support the analysis of data and events is needed. This paper contributes VR-EDStream+EDA, a solution for immersively visualizing and interacting with data and event streams or pipelines and for generically visualizing Event-Driven Architecture (EDA) in Virtual Reality (VR). Our realization shows its feasibility, and a case-based evaluation provides insights into its capabilities.
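As a rough sketch of how stream data could feed such a visualization (assumed names and windowing; not the VR-EDStream+EDA pipeline), per-topic throughput over a sliding window could drive a visual property of a topic node, such as its size:

# Illustrative only: assumed topics and windowing, not the actual event pipeline.
import time
from collections import defaultdict, deque

WINDOW_S = 10.0
events = defaultdict(deque)   # topic -> timestamps within the sliding window

def record(topic, ts=None):
    now = ts if ts is not None else time.time()
    q = events[topic]
    q.append(now)
    while q and now - q[0] > WINDOW_S:   # drop events that left the window
        q.popleft()

def throughput(topic):
    """Events per second over the sliding window for one topic/stream."""
    return len(events[topic]) / WINDOW_S

if __name__ == "__main__":
    t0 = time.time()
    for i in range(25):
        record("orders.created", t0 + i * 0.3)   # simulated event arrivals
    print(f"orders.created: {throughput('orders.created'):.1f} events/s")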
Today’s Industry 4.0 Smart Factories involve complicated and highly automated processes. Nevertheless, certain crucial activities that require human involvement remain, such as machine maintenance. For such activities, many factors have to be taken into account, such as worker safety or worker qualification. This adds to the complexity of selecting and assigning optimal human resources to the processes and of the overall coordination. Contemporary Business Process Management (BPM) Systems provide only limited facilities for activity resource assignment. To overcome these limitations, this contribution proposes a BPM-integrated approach that applies fuzzy sets and rule processing for activity assignment. Our findings suggest that our approach has the potential for improved work distribution and cost savings for Industry 4.0 production processes. Furthermore, the scalability of the approach provides efficient performance even with a large number of concurrent activity assignment requests, and it can be applied to complex production scenarios with minimal effort.
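A minimal sketch of the fuzzy idea (the membership functions, scales, and the single rule below are assumptions, not the paper's rule base) grades each worker's suitability for a maintenance activity and ranks the candidates:

# Illustrative only: assumed membership functions, scales, and a single fuzzy rule.
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def suitability(distance_m, qualification_level):
    near = tri(distance_m, -1, 0, 200)               # "near" peaks at 0 m
    qualified = min(qualification_level / 5.0, 1.0)  # level 0..5, normalized
    # Single illustrative rule: suitable IF near AND qualified (min as fuzzy AND).
    return min(near, qualified)

if __name__ == "__main__":
    workers = {"W1": (30, 4), "W2": (150, 5), "W3": (50, 2)}  # (distance m, level)
    for w in sorted(workers, key=lambda w: suitability(*workers[w]), reverse=True):
        print(w, f"{suitability(*workers[w]):.2f}")   # W1 0.80, W3 0.40, W2 0.25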
Although production processes in Industry 4.0 settings are highly automated, many complicated tasks, such as machine maintenance, continue to be executed by human workers. While smart factories can provide these workers with some digitalization support via Augmented Reality (AR) devices, these AR tasks depend on many contextual factors, such as live data feeds from machines in view, or current work safety conditions. Although currently feasible, these localized contextual factors are mostly not well-integrated into the global production process, which can result in various problems such as suboptimal task assignment, over-exposure of workers to hazards such as noise or heat, or delays in the production process. Current Business Process Management (BPM) Systems (BPMS) were not particularly designed to consider and integrate context-aware factors during planning and execution. This paper describes the AR-Process Framework (ARPF) for extending a BPMS to support context-integrated modeling and execution of processes with AR tasks in industrial use cases. Our realization shows how the ARPF can be easily integrated with prevalent BPMS. Our evaluation findings from a simulation scenario indicate that ARPF can improve Industry 4.0 processes with regard to AR task execution quality and cost savings.
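The following sketch only illustrates context-integrated gating of an AR task assignment (the context attributes, thresholds, and gating logic are assumptions; the ARPF itself extends a BPMS rather than reimplementing assignment like this):

# Illustrative only: assumed context attributes, thresholds, and gating logic.
from dataclasses import dataclass

NOISE_EXPOSURE_LIMIT_MIN = 120   # assumed per-shift limit for noisy zones (minutes)

@dataclass
class WorkerContext:
    worker_id: str
    noise_exposure_min: int      # minutes already spent in noisy zones this shift
    has_ar_device: bool
    certified_for_machine: bool

def can_assign(task_in_noisy_zone: bool, ctx: WorkerContext) -> bool:
    """Gate an AR task on live context instead of static role assignment alone."""
    if task_in_noisy_zone and ctx.noise_exposure_min >= NOISE_EXPOSURE_LIMIT_MIN:
        return False             # avoid over-exposing the worker to noise
    return ctx.has_ar_device and ctx.certified_for_machine

if __name__ == "__main__":
    candidates = [WorkerContext("W1", 130, True, True),
                  WorkerContext("W2", 40, True, True)]
    eligible = [c.worker_id for c in candidates if can_assign(True, c)]
    print("eligible for the AR maintenance task:", eligible)   # ['W2']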