FROM TEXT TO IMAGES: LINKING SYSTEM REQUIREMENTS TO IMAGES USING JOINT EMBEDDING
Editor: Kevin Otto, Boris Eisenbart, Claudia Eckert, Benoit Eynard, Dieter Krause, Josef Oehmen, Nad
Author: Chen, Cheng; Carroll, Cody; Morkos, Beshoy
Institution: University of Georgia
Section: Design Methods
DOI number: https://doi.org/10.1017/pds.2023.199
Smart manufacturing enterprises rely on adapting to rapid engineering changes while minimizing the resulting risk. Making informed decisions about engineering changes and guarding against unexpected costs requires extracting more information from limited data. The limited information available in early-stage design comes in multiple forms, chiefly text and images. Innovative design tools and processes that link multisource data are essential for helping designers build model-based engineering (MBE) systems; however, the formal computational linking of multisource data has yet to be realized in MBE. We propose a framework that applies transfer learning and integrates domain-specific knowledge to bridge this information gap. A synthetic dataset is created using web scraping techniques based on keywords extracted from the requirements. Requirement-image pairs are then used to fine-tune a contrastive language-image pretraining (CLIP) model so that it acquires domain knowledge. The results demonstrate how the content of images can be used to indicate all affected requirements when tracing engineering changes in a complex system.
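The fine-tuning step described above relies on CLIP's symmetric contrastive objective, which pulls the embedding of each requirement toward the embedding of its paired image and pushes it away from all other images in the batch. As a hedged illustration (not the authors' implementation), the sketch below computes that objective in NumPy on precomputed embeddings; the function name, temperature value, and batch shapes are assumptions chosen for clarity.

```python
import numpy as np

def clip_contrastive_loss(text_emb, image_emb, temperature=0.07):
    """Symmetric contrastive (InfoNCE) loss over a batch of
    requirement-text and image embeddings, as in CLIP-style training.
    Row i of each array is assumed to be a matched requirement-image pair."""
    # L2-normalize so the dot product is cosine similarity
    t = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    v = image_emb / np.linalg.norm(image_emb, axis=1, keepdims=True)
    logits = t @ v.T / temperature          # (N, N) similarity matrix
    n = logits.shape[0]                     # matched pairs sit on the diagonal

    def cross_entropy(l):
        # log-softmax with the usual max-subtraction for numerical stability
        l = l - l.max(axis=1, keepdims=True)
        logp = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -logp[np.arange(n), np.arange(n)].mean()

    # average the text->image and image->text directions
    return (cross_entropy(logits) + cross_entropy(logits.T)) / 2

# Toy usage: identical embeddings (perfectly matched pairs) should score a
# lower loss than deliberately misaligned ones.
rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 16))
loss_aligned = clip_contrastive_loss(emb, emb)
loss_shuffled = clip_contrastive_loss(emb, np.roll(emb, 1, axis=0))
```

During actual fine-tuning this scalar would be backpropagated through both encoders; the NumPy version only shows the objective being minimized, which is what lets image content retrieve the affected requirements once training converges.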