By Jon Gold
Digital-twin technology is attractive to businesses trying to get the most out of their physical assets and increasingly to organizations attempting a systematic study of complex systems, such as smart cities and oil-and-gas supply chains.
The Digital Twin Consortium, announced last month, is an attempt to make digital-twin technology more powerful and usable by addressing one of the key problems slowing its development: interoperability. The consortium is an open-standards organization under the auspices of the Object Management Group, backed by Microsoft, Dell, Ansys and Lendlease, among many others.
Digital twins are, put simply, virtual copies of real-world pieces of equipment. The idea is to offer a way to let the designers, manufacturers and operators of that equipment turn real-world data into accurate predictions and simulations of what might happen in various use cases. Creating a digital twin involves physicists, mathematicians and data scientists projecting how real-world forces affect the equipment being simulated. The systems being twinned can be as simple as a mileage calculator for a car or as complicated as a model of an entire city’s traffic patterns.
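As a concrete illustration of the simplest case the article mentions, here is a minimal, hypothetical sketch of a mileage-calculator twin: a small object that mirrors real-world sensor readings (odometer and fuel level) and uses them to run a "what if" prediction. All names and numbers are invented for illustration, not drawn from any product.

```python
from dataclasses import dataclass, field

@dataclass
class CarTwin:
    """A toy digital twin of a car: mirrors odometer and fuel-level
    readings and uses them to predict range on a full tank."""
    tank_capacity_l: float
    readings: list = field(default_factory=list)  # (odometer_km, fuel_l)

    def ingest(self, odometer_km: float, fuel_l: float) -> None:
        # A real twin would be fed by IoT sensors; here we append by hand.
        self.readings.append((odometer_km, fuel_l))

    def km_per_litre(self) -> float:
        # Average efficiency over the observed history.
        (o0, f0), (o1, f1) = self.readings[0], self.readings[-1]
        return (o1 - o0) / (f0 - f1)

    def predicted_range_km(self) -> float:
        # The "simulation" step: project observed behaviour forward.
        return self.km_per_litre() * self.tank_capacity_l

twin = CarTwin(tank_capacity_l=50.0)
twin.ingest(odometer_km=10_000.0, fuel_l=45.0)
twin.ingest(odometer_km=10_300.0, fuel_l=25.0)
print(twin.predicted_range_km())  # 300 km on 20 L -> 15 km/L -> 750.0 km
```

The structure is the point: the twin holds a mirrored state, an update path from real-world data, and a prediction method; a city-scale traffic twin differs in scale and physics, not in shape.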
At its highest level, digital twinning is a management tool, according to Gartner research analyst and vice president Al Velosa. It abstracts a layer of complexity out of the basic processes of monitoring and managing systems.
“I don’t care how you get the data about that thing or that process, I just want that data so I can make better decisions,” he said.
The instrumentation and data collection efforts needed for digital twinning frequently rely on IoT sensors, and the two technologies are closely intertwined. Gartner said last year that, while few businesses are currently using digital twins operationally, nearly two-thirds of those they surveyed had plans to begin using them in the near future.
Most early digital twins – and the technology is still relatively new, so this encompasses many of the digital twins in use today – are fairly simple. A turbine on a wind farm has a digital twin that its manufacturer can check to see whether it's working correctly and that the farm's operator can use for maintenance purposes. If that's all a system is supposed to do, there's no problem.
However, issues arise when trying to put large numbers of objects – say, a smart building, with environmental sensors, HVAC controls, lighting and security tools – into the same model. The various components of that system stand a vanishingly small chance of all coming from the same manufacturer, let alone using interoperable networking protocols, so it can be enormously complicated to try to forge all of those systems into a coherent digital twin of the whole building.
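The interoperability problem can be made concrete with a hypothetical sketch: two vendors report the same physical quantity in incompatible formats, and building a coherent twin means writing (and maintaining) an adapter for every vendor. The payload shapes and field names below are invented for illustration; an open standard would replace the per-vendor branching with one agreed schema.

```python
# Hypothetical payloads from two vendors whose formats don't match.
vendor_a = {"sensor": "hvac-01", "temp_f": 71.6}          # Fahrenheit, "sensor" key
vendor_b = {"id": "lighting-02", "temperature_c": 22.0}   # Celsius, "id" key

def normalize(payload: dict) -> dict:
    """Map each vendor's format onto one common schema:
    {"device_id": str, "temperature_c": float}."""
    if "temp_f" in payload:  # vendor A reports Fahrenheit
        return {"device_id": payload["sensor"],
                "temperature_c": round((payload["temp_f"] - 32) * 5 / 9, 1)}
    return {"device_id": payload["id"],
            "temperature_c": payload["temperature_c"]}

# The building twin can only reason over readings once they share a schema.
building_twin = [normalize(p) for p in (vendor_a, vendor_b)]
```

With only two vendors this is trivial; with the dozens found in a real smart building, the adapter code grows with every vendor pairing, which is the cost open standards aim to eliminate.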
Forrester principal analyst Paul Miller said that this is the key challenge in the field at the moment, and an indicator of the direction in which the technology is heading.
“We’re moving from a world where digital twins are small and company-specific to a world where they’re large and have more stakeholders,” he said. “To do that, they need some of these standards to be more established.”
That’s where the DTC and other standards organizations come in. By creating open standards for sensors and other equipment made by a wide array of companies across a diverse set of verticals, they can unlock the potential to build larger and more complex digital twins.
There are really two aims behind the creation of an organization like the DTC, according to the group’s executive director Richard Soley, who has been a part of the larger Object Management Group since 1989. Along with smoothing the way for open standards to play a larger part in the development of digital twins, the group is focused on getting the technology into as many markets as possible.
But the nature of many of those markets, which include infrastructure, aerospace and defense, and mining, oil and gas, means that there’s some entrenched resistance, since most vendors in those verticals have a lot of proprietary technology that they’re eager to protect.
“You have to take the time to convince vendors that they can have a slice of a larger pie, rather than a larger slice of a smaller pie,” said Soley.
Miller said that resistance from vendors could ease once it becomes clearer that what they are being asked to share isn’t necessarily of critical importance to their trade secrets.
“Yes, there are areas of intellectual property you want to keep under wraps, but there are other aspects that there’s no disadvantage in sharing,” he said. “The voltage generated by a wind turbine isn’t something that needs to be hidden.”
Velosa said that the consortium’s early focus seems likely to be in connected buildings, given the identity of the major backers.
This story, “Creating complex digital twins requires sharing intellectual property,” was originally published by Network World.
Jon Gold covers IoT and wireless networking for Network World.
Copyright © 2020 IDG Communications, Inc.