Digital twins are an emerging class of enterprise software that monitor the state of things to produce actionable insights and better business outcomes. Most companies adopting Internet of Things (IoT) technologies are implementing digital twins to improve situational awareness and automate business responses to changing conditions. But implementing well-designed digital twins is complicated. To ensure that a digital twin can fulfil its full business role, it is vital to begin with the end in mind before designing and implementing it.
Kanoo Elite presents this whitepaper as a guide for Application Leaders to apply a consistent, optimized digital twin design approach to their needs, using the four-building-block digital twin reference model to improve design and implementation.
All Digital Twins Have Two Primary Roles for Improving Business Outcomes:
- Improve Situation Awareness: Functionally, all digital twins — at a minimum — monitor data from things (and often related contextual information) to improve our situational awareness. This improved insight alone can support better decision making.
- Automate Your Response: To scale up our use of digital twins, we need to automate our business responses.
A digital twin implements a virtual representation of an IoT-connected entity used to provide situation awareness that helps improve business decision making and outcomes. Digital twins can be implemented for things such as equipment, processes, and organizations, as well as for people. At a minimum, digital twins always monitor things, but digital twin data is often analysed to predict future state, and at times is also used to simulate the behaviour of physical things, processes, or people. Every digital twin design includes the following four basic building blocks:
- Entity metadata — Information describing the twinned object, including its physical components, how they are assembled, and the object’s behaviour and specifications.
- Generated data — IoT sensor-based time-series data, external contextual data, and whatever other data is used by analytical models.
- Analytical models — Software algorithms that ingest generated data and produce events which increase situation awareness.
- Software components — Application logic, visualization tools and other functionality to act, based on events produced by analytical models.
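The four building blocks above can be sketched in code. The following is a minimal illustrative sketch, not any product’s API; all class and field names (`EntityMetadata`, `Reading`, `DigitalTwin`, and so on) are assumptions chosen for clarity:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class EntityMetadata:
    """Identifying (not state) information about the twinned object."""
    entity_id: str
    components: list[str]
    specifications: dict[str, str]

@dataclass
class Reading:
    """One time-stamped name/value pair of generated state data."""
    timestamp: float
    name: str
    value: float

@dataclass
class DigitalTwin:
    metadata: EntityMetadata
    generated_data: list[Reading] = field(default_factory=list)
    # Analytical models: ingest generated data, produce event strings.
    models: list[Callable[[list[Reading]], list[str]]] = field(default_factory=list)
    # Software components: act on events produced by the models.
    handlers: list[Callable[[str], None]] = field(default_factory=list)

    def run(self) -> list[str]:
        """Run every model over the generated data, then dispatch events."""
        events = [e for model in self.models for e in model(self.generated_data)]
        for event in events:
            for handler in self.handlers:
                handler(event)
        return events

# Example: a twin of a pump whose one model flags overheating.
twin = DigitalTwin(
    metadata=EntityMetadata("pump-1", ["motor", "impeller"], {"max_temp_c": "80"}),
    generated_data=[Reading(0.0, "temperature", 85.0)],
    models=[lambda rs: [f"overheat:{r.value}" for r in rs
                        if r.name == "temperature" and r.value > 80.0]],
)
```

Calling `twin.run()` here yields one `overheat` event, illustrating the flow from generated data through analytical models to software components.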
Digital Twin Reference Model — Four Major Building Blocks

Digital twin software executed in a distributed runtime compute environment includes these additional technical elements:
- Data sources — Any form of information used as input into a digital twin’s entity metadata or as input into a digital twin’s generated data.
- Digital twin enabling technology — Whatever application infrastructure middleware is used for digital twin software development and runtime.
- Related applications — Business applications — e.g., manufacturing execution systems, supply chain planners, ERP — that are integrated with digital twins to make them act on events generated by digital twin analytics.
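To make the “related applications” element concrete, the sketch below shows one common integration pattern: routing digital twin events to subscribing business applications through a publish/subscribe bus. This is an assumed in-process illustration; the topic name and event shape are invented for the example, and a real deployment would typically use a message broker:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process publish/subscribe dispatcher."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a handler (e.g., an MES or ERP adapter) for a topic."""
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        """Deliver an event to every handler subscribed to the topic."""
        for handler in self.subscribers[topic]:
            handler(event)

# A related application (here just a list) subscribes to twin alerts.
received = []
bus = EventBus()
bus.subscribe("twin/pump-1/alerts", received.append)

# The digital twin publishes an analytics-generated event.
bus.publish("twin/pump-1/alerts", {"type": "overheat", "value_c": 92.5})
```

After the publish call, `received` holds the event, standing in for a work order raised in a manufacturing execution system.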
Entity metadata is whatever identifying (not state) information we need for an object that we have twinned. The amount of metadata detail needed varies widely, in part based on who needs it. A key challenge for digital twin designers is producing entity metadata models that are “right-grained,” that is, neither too fine-grained nor too coarse-grained. The key to success is correctly anticipating the minimum viable resolution of data required to support decision making. Another challenge is choosing what data to manage within the digital twin software versus what data may already be managed within existing applications, such as building information modelling (BIM) systems. At times, metadata will be divided between digital twins and other business applications, and thus integration will be needed to keep metadata synchronized. To avoid data becoming obsolete, policies will be needed to ensure that digital twin metadata is updated when changes occur.
Generated data is whatever state (not identifying) information we need about an object that we have twinned. In IoT-enabled digital twins, sensors generate IoT time-series data for “twinned” objects to support analytics. IoT time-series data is time-stamped, name/value pairs of state information. All generated data must be normalized (e.g., temperatures converted to Celsius) and stored for use by a digital twin’s runtime analytics and software.
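The normalization step described above can be sketched as follows. This is an illustrative example only; the raw record layout and field names are assumptions, and the example normalizes Fahrenheit temperatures to Celsius and epoch timestamps to UTC ISO-8601 strings:

```python
from datetime import datetime, timezone

def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32.0) * 5.0 / 9.0

def normalize_reading(raw: dict) -> dict:
    """Normalize one raw sensor record before storage.

    raw: {'ts': epoch_seconds, 'name': str, 'value': float, 'unit': str}
    """
    value, unit = raw["value"], raw["unit"]
    # Convert temperatures reported in Fahrenheit to Celsius.
    if raw["name"] == "temperature" and unit == "F":
        value, unit = fahrenheit_to_celsius(value), "C"
    return {
        # Canonical UTC ISO-8601 timestamp for the time-series store.
        "timestamp": datetime.fromtimestamp(raw["ts"], tz=timezone.utc).isoformat(),
        "name": raw["name"],
        "value": round(value, 2),
        "unit": unit,
    }

normalized = normalize_reading(
    {"ts": 0, "name": "temperature", "value": 212.0, "unit": "F"}
)
```

The resulting record carries a Celsius value and a UTC timestamp, so every downstream analytical model sees data in one canonical form.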
A key challenge is to avoid collecting too little data — which creates gaps or errors in monitoring and analysis — or too much data — which can obfuscate whatever data matters. Successfully collecting the right data is a technical and organizational challenge. It is a technical challenge because data can be processed, for example, either on the edge or in the cloud. Which approach you choose can impact computing and network capacity and complicate integration. It is an organizational challenge because your data scientists and operations staff must collaborate to identify which generated data must be collected, and at what cadence, to achieve the desired outcomes.
Analytical models are virtual representations of the behaviour of twinned entities that improve situational awareness. These models capture the business purposes of the digital twin. To provide such insights, the models must include representations of key features and critical variables, and describe those features and variables as algorithms. Depending on the scope of the digital twin and the business cases it supports, the algorithms might be based on mathematical representations of physics for a physical system, of economics for a financial system, or of the behaviour of an organization. The outputs result from a series of data-fed computations carried out on systems of interconnected mathematical relationships known as “systems models.” This mathematical engine can also train machine learning algorithms, accelerating the speed at which digital twins can guide business decisions about their paired entities.
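A very simple instance of such an analytical model is a rolling-statistics anomaly detector that turns generated time-series data into situational-awareness events. The sketch below is an assumption for illustration; the window size and z-score threshold are arbitrary choices, not values from the text:

```python
from collections import deque
from statistics import mean, stdev

def anomaly_events(values, window=5, z_threshold=3.0):
    """Return (index, value) events for readings far from the rolling mean."""
    history = deque(maxlen=window)  # sliding window of recent readings
    events = []
    for i, v in enumerate(values):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            # Flag a reading whose z-score exceeds the threshold.
            if sigma > 0 and abs(v - mu) / sigma > z_threshold:
                events.append((i, v))
        history.append(v)
    return events

# Steady readings around 10, then a sudden spike to 50.
events = anomaly_events([10, 10.1, 9.9, 10, 10.2, 50])
```

Here the spike at index 5 is reported as an event; in a fuller design, such events would feed the software components that visualize conditions or trigger automated responses.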
Analytical models are approximations of true behaviour. The predictions and recommendations coming from these analytical models should be studied and considered, and contingencies should be planned and made available in case a model’s output is wrong. At their most valuable, analytical models are learning tools that can be successively refined over time with experience.
Making a highly detailed, granular analytical model can be very expensive and time-consuming, and adding too much detail can itself introduce significant errors. Conversely, overly simplistic analytical models can filter out important behaviours, producing misleading results. “Rightsizing” analytical models comes with experience and the discipline to understand the reasons for bad outcomes and continually improve on them.
Software components choreograph data ingestion, analytics, event generation, and — if needed — any visualization and workflow (for automating business responses). Digital twins are typically designed to achieve specific outcomes and are implemented as collections of fit-for-purpose software components (e.g., separate digital twins for equipment, people or processes) that are incrementally improved over time to expand their precision and capabilities.
This fit-for-purpose, incremental approach to software delivery is a natural characteristic of an emerging application design approach that we call the “Composable Enterprise.” In that architecture, digital twins play the role of “packaged business capabilities” (PBCs). In a composable enterprise, digital twins are integrated (composed) with other PBCs representing the business capabilities of enterprise applications.
A successful digital twin relies on smart technology and an in-depth knowledge of physical assets—but it also requires leadership, credibility and vision. At Kanoo Elite, we connect our rich heritage of engineering with the latest technological capabilities to co-create digital twins for our partners and clients.
Our solutions help clients overcome challenges, identify new opportunities, and make informed decisions more quickly. They also contribute to a better and more sustainable society, tackling issues such as urbanisation, environmental sustainability, and the pressing need for clean water and improved sanitation.