Digital Twin (DT) technology has proven itself as a powerful tool for improving analytics and data visualization across a broad range of industries. It can change how quickly and efficiently companies solve arduous problems, because it permits remote, virtual interaction with company assets, accompanied by full analytics of event sequences. But DT technology is still in its infancy, and choosing the right initial approach makes a big difference in cost/benefit over time. Pitfalls include siloed data that frustrates the data-flexibility principle of the Industry 4.0 movement. A poorly architected digital twin infrastructure can cause time delays and cost overruns in the implementation and operation of the digital twin system, leading to frustration and sometimes outright failure of the approach. With this in mind, how does a company effectively implement a Digital Twin in a way that respects Industry 4.0 and big data best practices? Fortunately, there are some brilliant initiatives that model the right approach. One of these is Nextspace, a New Zealand data management and optimization company founded by one of the fathers of the voxel, Mark Thomas, and other data visualization veterans. Bruce OS, the software-as-a-service developed by Nextspace, is a comprehensive tool for integrating and visualizing data, enabling a future in which you can see your operations data in real time, and in which the analytics possibilities are endless and forever flexible.
How does a Digital Twin work?
Given that about 75% of a construction project’s lifecycle costs are attributable to its post-construction management, we know that annual maintenance costs will be very high for any aging facility. But imagine running simulations or playing out what-if scenarios with various maintenance processes, services, and products that could improve the operating efficiency of the facility over time, all before investing any direct capital in real-world implementations. That is why Digital Twin (DT) technology is so promising. In its simplest form, a Digital Twin is a virtual replica of a physical thing, existing only as a digital representation; it seeks to objectively model the behavior of a system, with inputs and outputs that mimic the system’s real-world behavior. This new way of managing data lets you connect with a physical asset without being present with it; it shows you what is happening with your physical asset and why; and it can visualize future scenarios. DT is on the front line of the Industry 4.0 movement.
DT technology represents the concept of integrating many product-related data flows, a response to the increasing digitization of product development, production, and products themselves. In essence, data becomes something much more fluid with this technology through the integration of these multivariate data flows. DT technology includes many technology environments and tools, such as Internet of Things (IoT), Cloud Computing, Building Information Modeling (BIM), Light Detection and Ranging (LiDAR), Geographic Information System (GIS), and others. Today, DT has permeated a broad range of applications across many industries, including construction, agriculture, healthcare, manufacturing, distribution, oil & gas, mining, and many others.
Creating a digital replica of a physical thing can significantly improve one or more of the following processes: design, building, and operations. Across these processes, DT involves three main phases: Digital Model, Digital Shadow, and Digital Twin.
In the Digital Model phase, there is no automated data exchange between the physical object and the digital model; all data exchange is done manually, and a change in the state of one has no direct effect on the other. This phase is used to run simulations and experiment with possibilities: for example, a building is designed virtually and then virtual design modifications are simulated to test solutions for optimal performance.
The Digital Shadow phase is quite different from the Digital Model phase, because it has an automatic flow of data from the physical object to a digital ‘shadow’ of the object, which means that a change in the physical object can affect the digital one. For example, a DT project involving the transportation of containers between countries can provide real-time data about the geolocation, humidity, and temperature of each container (if sensors are deployed properly). A company can use this kind of information to assess quality during transport, or to judge whether a shipment will arrive at its destination damaged. Simulating altered conditions can then provide cost comparisons among various remedial measures.
In the Digital Twin phase, the process is fully automated: the flow of data between the physical asset and the digital version is bidirectional. Data from the physical asset can affect the digital version, and data from the digital version can in turn affect the physical asset. This dynamic, integrated data flow gives rise to a number of benefits, such as better predictive-maintenance outcomes.
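The distinction between the three phases comes down to which data flows are automated. The sketch below illustrates that progression in Python; the class and attribute names are invented for illustration and do not reflect any vendor's implementation.

```python
class Asset:
    """The physical thing, e.g. a shipping container with sensors."""
    def __init__(self, temperature=20.0):
        self.temperature = temperature

class DigitalModel:
    """Phase 1: no automated exchange -- state is entered by hand."""
    def __init__(self, temperature=20.0):
        self.temperature = temperature

class DigitalShadow(DigitalModel):
    """Phase 2: automatic one-way flow, physical -> digital."""
    def sync_from(self, asset):
        self.temperature = asset.temperature

class DigitalTwin(DigitalShadow):
    """Phase 3: bidirectional -- the digital side can also actuate."""
    def push_setpoint(self, asset, setpoint):
        asset.temperature = setpoint

asset = Asset(temperature=31.0)
twin = DigitalTwin()
twin.sync_from(asset)            # physical -> digital (shadow behavior)
twin.push_setpoint(asset, 25.0)  # digital -> physical (twin behavior)
```

Each phase inherits the previous one's capabilities and adds a new direction of automated flow, which mirrors how real deployments typically mature from model to shadow to twin.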
For example, oil refineries typically maintain thousands of pieces of equipment. A Digital Twin receiving temperature and oil-flow data from wireless sensors attached to heat exchangers could enable engineers to better predict when each heat exchanger will foul and require cleaning. This can reduce downtime and decrease both the frequency and the severity of repairs. With thousands of heat exchangers, the cost savings are substantial. Moreover, data analytics from the digital twin system permit fine-tuned modulation of cycling to further optimize efficiency, save energy, reduce wear and tear on equipment, and reduce staffing requirements.
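One simple way such a prediction could work: fouling typically shows up as a gradual decline in a heat exchanger's overall heat-transfer coefficient (U), so a twin can fit a trend to recent U estimates and extrapolate when it will cross a cleaning threshold. This is a minimal sketch under that assumption; the numbers and the linear-trend model are illustrative, not taken from any real refinery system.

```python
def days_until_cleaning(u_history, threshold):
    """Estimate days until U drops below the cleaning threshold.

    u_history: list of (day, U) samples, oldest first.
    Returns None if no downward (fouling) trend is detected.
    """
    n = len(u_history)
    xs = [d for d, _ in u_history]
    ys = [u for _, u in u_history]
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    # Ordinary least-squares slope of U versus time.
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
            sum((x - x_bar) ** 2 for x in xs)
    if slope >= 0:
        return None  # U is flat or improving: no fouling trend
    intercept = y_bar - slope * x_bar
    cross_day = (threshold - intercept) / slope  # day U hits threshold
    return max(0.0, cross_day - xs[-1])

# U fell from 500 to 460 over 20 days; at -2/day it reaches 400 in 30 more.
print(days_until_cleaning([(0, 500.0), (10, 480.0), (20, 460.0)], 400.0))
```

A production system would use a physically grounded fouling model and uncertainty bounds, but even this toy version shows how sensor streams turn into a maintenance date.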
There are many challenges in the implementation of Digital Twins in each of the three phases mentioned above, such as (1) the efficient design of information flow; (2) integration of different domains in the product engineering process; (3) interfaces for standardized information exchange; (4) accurate capture of physical properties; and (5) system-wide data management.
To be considered a Digital Twin, the technology must be able to model both objects (e.g., buildings, cities, cars, airplanes, desks, or anything physical) and the relationships between objects (e.g., a city transportation system, an irrigation system, or a shopping process). A DT must accurately capture an object’s physical properties, simulate behaviors, and adjust scale. That is no simple task, since it requires (1) sensor networks to capture specific data; (2) design parameters that integrate precisely with complex technologies, such as CAD, GIS, and LiDAR; and (3) system-wide monitoring and analysis of device state for compliance with normal operating parameters, among many other behaviors.
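The requirement to model both objects and the relationships between them is naturally captured by a typed graph. Here is a minimal sketch of that idea; the class name and the example identifiers ("pump1", "zone_a") are hypothetical, chosen only to illustrate the data structure.

```python
class TwinGraph:
    """A tiny object-and-relationship store for a digital twin."""

    def __init__(self):
        self.objects = {}    # object id -> dict of properties
        self.relations = []  # (source_id, relation_name, target_id)

    def add_object(self, obj_id, **properties):
        self.objects[obj_id] = properties

    def relate(self, source, relation, target):
        self.relations.append((source, relation, target))

    def related_to(self, obj_id, relation=None):
        """Return targets linked from obj_id, optionally by relation type."""
        return [t for s, r, t in self.relations
                if s == obj_id and (relation is None or r == relation)]

# An irrigation system is relationships, not just objects:
g = TwinGraph()
g.add_object("pump1", type="pump", flow_lpm=120)
g.add_object("zone_a", type="irrigation_zone", area_ha=3.5)
g.relate("pump1", "feeds", "zone_a")
```

Real DT platforms layer geometry, sensor bindings, and simulation onto such a graph, but the graph itself is what makes "an irrigation system" more than a pile of unconnected pumps and pipes.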
Another challenge is the integration of different data standards from various data sources. There must be a uniform exchange of information across the entire product or service engineering process; data standardization and data management are basic challenges of any DT implementation.
Mark Thomas, founder of Nextspace, is one of the fathers of digital twin technology and a data visualization veteran. Mark has worked with big tech companies such as Adobe and SAP, both of which use technology developed by Mark’s previous company, Right Hemisphere, in their current 3D rendering platforms. If you have used any of the tools in Adobe’s 3D viewer or SAP’s 3D renderer, you’ve been impacted by Mark’s work.
Big data has revealed itself as a tool set with which to tackle complex asset optimization problems. The challenge has been in defining, understanding, and using data relationships.
Digital twin technology founder
Nextspace and its DT technology suite Bruce OS
Because Bruce OS is a cloud-based system accessed through a web browser, it has several powerful features; one is an incredible capacity to deal with different data standards on a very easy-to-use, intuitive platform. Bruce’s data framework facilitates flexible “standards management,” which enables users to merge, develop, and prototype their own standards, or to adopt international standards. For example, with Bruce you can easily import CAD (computer-aided design), SHP (Shapefile), and LAS (LASer) files in the same project flow. This lets you mix and match data schemes, merge them into a unified data schema, and improve data management and analytics reporting.
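At its core, merging heterogeneous sources into a unified schema is a field-mapping problem: records from, say, a CAD export and a shapefile attribute table name the same concept differently. The sketch below shows the general idea; the field names and mapping tables are invented examples, not Bruce OS's actual API or schemas.

```python
# Per-source mappings from native field names to a unified schema.
# These names are hypothetical, for illustration only.
MAPPINGS = {
    "cad": {"part_no": "asset_id", "elev": "elevation_m"},
    "shp": {"FID": "asset_id", "ELEVATION": "elevation_m"},
}

def to_unified(source, record):
    """Rename a record's fields into the unified schema.

    Fields without a mapping entry pass through unchanged.
    """
    mapping = MAPPINGS[source]
    return {mapping.get(key, key): value for key, value in record.items()}

# Records from two different formats land in one comparable shape:
print(to_unified("cad", {"part_no": "P-101", "elev": 12.5}))
print(to_unified("shp", {"FID": 7, "ELEVATION": 12.5}))
```

Once every source funnels through such a mapping, downstream analytics and reporting can treat all assets uniformly regardless of which tool originally produced the data.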
Bruce has been applied in many fields thanks to its powerful and varied functionality, such as shadow analysis, multiple basemaps, high-resolution printing, one-click 2D to 3D, 3D virtual tours, and schema mapping.
Bruce’s other features include: schema and data mapping, data editing, data cleansing and updating, custom forms, pop-ups and reports, and 3D stacking diagrams and planograms. With all of these rich features, Bruce has established a new standard for state-of-the-art Digital Twin implementation. And because Bruce’s visualization engine can be used from any web browser, it is widely accessible anywhere, at any time, and by any authorized user for visualizing complex data relationships and their effects. This has the added benefit of drastically reducing the challenges around project collaboration.
And finally, when appropriate, Bruce works according to the concept of “federated Digital Twins,” which means that Bruce can connect many Digital Twins together, within organizations and across projects, cities, regions, and states. For example, in agriculture, Bruce provides for the planning and mapping of systems, robotics and connected data devices, operational projects (e.g., stock rotation, feed per head count, and harvest planning), environmental optimization experiments (e.g., predicting and managing effluent effects, and noise and vibration), as well as tracking the security, activity, and operational status of a multitude of remote assets.
Nextspace.ai was founded by Digital Twin experts from the aerospace and automotive industries. We enable the management of complex data relationships, machine learning capabilities, and advanced visualization to facilitate complex asset management and optimization tasks. Bruce OS is Nextspace’s answer to transforming data into intelligent tools to meet the challenges we face now and into the future, offering a pure ‘data ontology’ approach to Digital Twins.
The future of Digital Twin technology
A future that sees the full realization of Digital Twin technology will bring solutions to some of the world’s largest problems; however, this cannot be achieved with a top-down approach. Rather, it must be built from the bottom up, starting with well-architected data systems and linking multiple digital twins together through a common operating system. Through this approach, challenges such as accommodating different data standards inside a single system can be overcome. This will require significant effort, but fortunately there are exciting initiatives already making progress on this front, such as Bruce OS from Nextspace. As the world applies systems like Bruce to more tightly integrate the physical and digital worlds, the possibilities for safe, efficient, and surprising systems engineering will be endless and forever flexible.