The whole theory behind Digital Twins is that the information contained in or generated by the design tool can be used by other information systems, conveying many more valuable datasets than we had when our only output was drawings. If we can do that, then a model that is accurate and reliable can be used from the earliest phase of building through the entire lifecycle. Of course, everyone aspires to build a road, or a bridge, an asset in a permanent location on the Earth that does not move. Those are the kinds of models, with a 30-year life span, that we can add value to. In fact, the construction industry can add value to the whole process.
For the construction and infrastructure industry, a Digital Twin is a digital replica of an infrastructure asset that enables you to visualize the asset across its entire lifecycle, track change, and perform analysis to optimize asset performance. Digital Twins are the backbone for a lot of decision-making in the construction sector. They tell you what we are building, whether we are behind schedule, and the kinds of risks we are facing. These are the kinds of things that we humans try to derive from information. Decisions are sometimes poor; I heard a story about a project in which, due to unreliable data, they ordered twice as much material as they needed. These are the kinds of problems that won’t happen when using Digital Twins.
However, I will add a slight caveat: the definition of a Digital Twin is broad. In a Digital Twin, we would include the Internet of Things (IoT), control systems, and automation systems. With the mainstreaming of IoT, there will be a manifold increase in the use of Digital Twins. We will be using the information in our BIM systems for a lot of other purposes. In the future, there will be many new opportunities to leverage that information for things that were not even conceived of earlier.
The biggest game changer is that the information in your BIM model can evolve and change as the asset is modified. With any large-scale asset, not much changes at any given moment, but something is always changing: something gets redesigned, something fails, something gets improved. When the BIM model is static and is used for one purpose, it gets outdated. But if the model is kept up to date so that it is a true reflection of the current state of the asset, its value can multiply many times over. For example, everyone is out installing sensors in the world. Any large-scale asset people build nowadays, they put diagnostic tools in it. There are many ways to know the health of the asset. These sensors send data somewhere, but does anyone know where the sensors are, how to respond if they fail, and how to use their huge stream of information?
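As a loose illustration of that idea, a "living" model can be sketched as a registry that maps sensor readings onto model elements and records every change. Everything here (class names, fields, the `pump-7` element) is hypothetical and not any particular BIM or Digital Twin API:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class Element:
    """One component of the asset (a beam, a pump, ...)."""
    element_id: str
    status: str = "ok"


@dataclass
class LivingModel:
    """A minimal 'living' model: elements plus a change history."""
    elements: Dict[str, Element] = field(default_factory=dict)
    # Each history entry: (element_id, old_status, new_status)
    history: List[Tuple[str, str, str]] = field(default_factory=list)

    def add(self, element: Element) -> None:
        self.elements[element.element_id] = element

    def apply_reading(self, element_id: str, status: str) -> None:
        """Update an element from a sensor reading and record the change."""
        el = self.elements[element_id]
        if el.status != status:
            self.history.append((element_id, el.status, status))
            el.status = status


model = LivingModel()
model.add(Element("pump-7"))
model.apply_reading("pump-7", "overheating")
print(model.elements["pump-7"].status)  # overheating
print(len(model.history))               # 1
```

The point of the sketch is the history list: because every change is recorded against a model element, the model stays a reflection of the asset rather than a snapshot of its design.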
The context of the 3D model is vital for properly reflecting the information that you are receiving from your sensors. No one talked about it 10 years ago because it was not possible; prerequisites like connectivity, sensor technology, and wireless transmission were not in place. Now, we have Cloud infrastructure and bandwidth. It is no longer a matter of whether you have connectivity; what matters is how good your connectivity is. Nobody worries that the communications will not be able to handle our needs. The underlying technology has evolved quite radically in our industry. The way software and cloud computing work today provides the basis for creating such a system. We could not have built it 10 years ago.
I think the tools that we use to visualize, create, and modify are going to be connected to other information systems in a way that you could not have done in the past. In the future, you will ask your vendors whether their BIM tools are flexible enough to connect to your machine tools and inventory systems. All of that needs to be part of the BIM process, which is not the case today. So, when we thought about it, we realized that we would need a different software architecture. Most of the software that exists today for building and engineering models runs only on a desktop and on a very specific file format. We are looking to change all that and build on the concept of an open ecosystem of tools that work on common data formats and sources. I think it does not exist today, and that is why we are trying to create it. We call it the “iTwin” mission.
Bentley’s current mission is to eliminate the need for construction drawings and documentation. While the concept will continue to be relevant, instead of generating paper upfront, you can generate reports on demand. Bentley’s strategy is to address the real root of the inefficiencies and shortfalls, which emanate from tools that were designed and conceived to generate documentation.
If you had to spend a lot of money on sensors, you would think hard about whether or not to use them. If the sensor cost were a significant percentage of the construction cost, you would minimize their number and try to be very efficient in placing them where they would be most effective. But that has changed: sensors do not cost much, they do not take much power to run, and if you never process the data that comes from them, you have not lost anything. So, a lot of people have decided to sense everything. Now you have the opposite problem of too much information, and you are trying to filter out or ignore the data that is not giving value.
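The "sense everything, then filter" approach can be sketched with a simple deadband filter that keeps only the readings that change meaningfully; the function name and the threshold value are made up for illustration:

```python
def deadband(readings, threshold):
    """Keep only readings that differ from the last kept value by more
    than `threshold`. Cheap sensors stream everything; this drops the
    samples that add no new information."""
    kept, last = [], None
    for value in readings:
        if last is None or abs(value - last) > threshold:
            kept.append(value)
            last = value
    return kept


# Six raw temperature samples; only three carry real change.
stream = [20.0, 20.1, 20.05, 23.8, 23.9, 20.2]
print(deadband(stream, threshold=1.0))  # [20.0, 23.8, 20.2]
```

This is the crudest possible version of "ignore stuff that is not giving value": the downstream systems see the jump to 23.8 and the recovery, not every redundant sample in between.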
It also has to do with the types of sensors. Earlier, we used to be able to sense only temperature. But now there are very sophisticated tools for sensing any dimension of activity you want: motion, stress, temperature. You can basically know everything that is knowable in the field, if you trust your sensors. We take advantage of that in the tools that we use to operate, maintain, update, or modify our assets. The problem, of course, is that most of the assets in the world are already built. The big question that faces much of the world is how we retroactively add sensors and make observations of things that are out there where there are no sensors. That is another problem we work on.
Our Machine Learning tools help in recognizing rust, cracks, and defects in structures. Having a human climb a pole to see what is rusty can be dangerous and expensive. So, drones and cameras are amazing new technologies that did not exist ten years ago. A drone has a small camera and a long flight cycle and can be used to gather a reliable set of pictures every day. Who could have thought of such a thing a decade ago? These are examples of the underlying technologies. BIM is not a finished art by any means. It could get a new name and take advantage of new technologies. There are people who make their living writing software. We always worry about them, because if technology gets advanced enough, maybe one of the advances will render them jobless.
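A crude, purely illustrative stand-in for such a detector: flag an image for human inspection when a large enough fraction of its pixels look red-brown. Real systems use trained models on drone imagery; every threshold and function name here is invented:

```python
def rust_fraction(pixels):
    """Fraction of (r, g, b) pixels that look red-brown, i.e. red
    clearly dominates green and blue. A toy proxy for a trained
    defect-detection model."""
    if not pixels:
        return 0.0
    rusty = sum(
        1 for r, g, b in pixels
        if r > 120 and r > 1.5 * g and r > 1.5 * b
    )
    return rusty / len(pixels)


def flag_for_inspection(pixels, threshold=0.2):
    """Send a human (or a closer drone pass) only when warranted."""
    return rust_fraction(pixels) > threshold


clean = [(90, 90, 95)] * 10                          # grey steel
corroded = [(180, 80, 50)] * 4 + [(90, 90, 95)] * 6  # 40% rust-colored
print(flag_for_inspection(clean))      # False
print(flag_for_inspection(corroded))   # True
```

The economic argument in the interview is captured by `flag_for_inspection`: the expensive, dangerous human climb happens only for the small subset of images the automated pass cannot clear.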
Yes. I have talked to many engineers who feel very threatened by Machine Learning being able to automate the engineering process. Yes, it does bother me, though I do not think it will happen in my lifetime. Anything that can be solved with Machine Learning will be solved with Machine Learning. Why wouldn’t someone devise a tool that writes software?
To tell you the truth, my biggest worry is that people will resist, because people do not like change. You can think of a lot of reasons why someone might see this as a threat, or people may realize that something they are so used to is no longer there. I think that resistance to change is our biggest risk. There is no reason why Digital Twins should not become the norm in five years. But will they? I do not know, because business processes do not change very quickly. I have been in this business for 35 years, and basically we have been doing almost everything the same way we did in the original ‘paper days.’
I think a part of it has to come through contracts. The owners, the people to whom the benefits will accrue, have to demand a Digital Twin. When they do, the people who are contracted to perform the work will eventually deliver it. Their contracts will have to be very different from what they are today. If you are paying for the value of a Digital Twin, it has to be measured by a different metric. I do not know how it will happen. Maybe government is a good place to start, because governments have the latitude to say “we want to change this because it is good for the people” and legislate it. Another part of the problem is: if you are successful and profitable, why change? Often, there is no motivation other than external.
As technology becomes more accessible and cheaper, it will remove some barriers. Technology works well in combination. For instance, if you had an iPhone but did not have the apps, it would not be as popular. It is about the ecosystem built around the technology. The Digital Twin concept is something that you can plug specialisations into to make things far easier. Open standards, open source, open data formats: these are the prerequisites for this to happen.
The motivation behind connecting data systems will come into play. There will be many vendors; I am not suggesting that Bentley will be the only supplier of this technology. The motivation to make things work well together will be the key. I think there is a need for consistency and much more natural integration, achieved by eliminating vendor lock-in and giving technology suppliers a reason to interoperate. It is like a tipping point. Before the personal computer existed, computers were things that you walked to; there was one per company, and the idea at the time was, ‘you know, we can make a computer so cheap that it can be given to one person.’ When Bentley was started, I thought the change this was going to bring about in the world would be massive.
You had to be at a certain rank in the organization before you were deemed worthy enough to have a computer dedicated to you. Now, everyone has a computer; the one in your phone is far more capable than those we used back then. I said two years ago that the amount of change the infrastructure world could gain from this new generation of cloud computing in particular is ten times larger than what we saw at the dawn of the personal computer. So, I think it is an inflection point: we are at a point where people are willing to say, “I will do it differently.” When cloud computing came, people resisted. They said, “No way. I would not put my data on someone else’s computer.” Do you ever hear someone say today that they do not want cloud computing? Security and data residency issues have been addressed pretty well, so the Cloud paradigm is going to change our business and the software industry for sure, and in turn the industries of our users. There it will have a knock-on effect, because when people and engineering professions change, their consumers will change too, and I think this concept will have many waves of propagation.
I think it is required to be competitive. The reason was the determination to become a very open, compatible, and transparent technology supplier. I think if the world demands it, then someone will supply it, and then everyone will use it. My biggest fear is for those who are last to embrace the open wave, so I do not think it is really a choice; it is just a matter of how, because that is how every other software business is configuring itself. But I also believe it is a good thing for the efficiency of business owners. The world needs more efficient infrastructure, and the planet needs to become more efficient or bad things are going to happen.
I feel good that we can help make that come about faster by embracing a new business paradigm for our systems. I am confident that there will be Digital Twins in everyone’s future. I said that five years ago, and I stand by it. I believe it will be built on an open platform, whether ours or someone else’s. Universities will quickly begin adopting Digital Twin applications in their own learning processes and teaching them to the next generation of engineers.