At the NAMUR 2024 Annual General Meeting, Emerson presented Boundless Automation℠, a vision of what a software-centric automation and operations architecture of the future could look like. Peter Zornio, Chief Technology Officer (CTO) at Emerson, explained to the atp editorial team in an interview on the afternoon of the first day of the congress why a standardized data infrastructure is essential for this and why the concept can drive forward the merging of IT and OT.
Mr. Zornio, what were your expectations when you travelled to the NAMUR Annual General Meeting and, above all, were they fulfilled?
I expected a healthy debate and discussion on various technical topics in the automation world with leading automation experts. We presented our ideas in our keynote and showed how they align with the NAMUR concepts. We also anticipated that attendees would provide feedback or possibly even have different points of view. I really appreciate this kind of exchange because it is the best kind of feedback on our product direction that we can get. I have been in this industry for almost 40 years now and I have not been disappointed.
As a CTO based in the USA, how do you perceive NAMUR's activities?
The chemical and pharmaceutical industry is very important for Emerson, and NAMUR is one of the most important international organizations when it comes to the interests of this industry. It is no longer just focused on Germany, but has a strong presence throughout Europe, China and the USA. That's why it was so important for Emerson to sponsor this event and present our ideas to this highly qualified audience and show what we can do for users of automation technology. There really is no other user community in the industries we serve that deals so directly with standards and concepts around automation.
A wonderful transition, because you have introduced a new, open automation concept with Boundless Automation℠. What exactly is behind it?
Boundless Automation℠ expands and changes the automation architecture and the way we handle data in the future. From new operating technologies to data aggregation and processing, the concept encompasses new ideas that ultimately help users to understand their operational systems even better and optimize them accordingly. Some of the core ideas on which Boundless Automation℠ is based are software-defined control, greater use of cloud technologies, a new security paradigm and expanding the scope of the integrated software suite for operations beyond today's DCS through a so-called Unifying Data Fabric.
What is special about this new standardized data infrastructure?
Data has always been important, even in analog times. And it has always been about improving certain key performance indicators, such as productivity or plant availability. Until now, however, it was often the case that different areas such as maintenance or safety selected or created their own software. They then fed this software with the specific data these tools required, usually relating to the same systems or processes. For others to access this data, new connections often had to be established at great expense. The result today is many different systems that all require and collect specific data.
In other words: classic data silos.
Exactly, and to a certain extent this effect was even intensified by the digital transformation, because every division and every business unit then looked for the best and fastest way to reach its goal. This happened without any malicious intent, but simply because it was necessary and made sense. This resulted in many different data models that were developed individually for specific purposes.
...and unfortunately do not speak “the same language”.
What is “water pump 1” in one system may be “FIC105” in another and something else in a third. And it is precisely this diversity in the data models and the data itself that is the biggest hurdle the chemical and pharmaceutical industry currently faces when trying to optimize operations in the various functional areas, which now include sustainability goals. For this reason, Boundless Automation℠ places great emphasis on the data to ensure greater consistency and standardization.
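To make the naming problem concrete, the following minimal Python sketch resolves divergent tag names from several hypothetical source systems to one canonical asset identifier; all system names, tags and identifiers are illustrative assumptions and not taken from any Emerson product.

```python
# Minimal sketch: reconciling divergent tag names from separate systems
# into one canonical asset identifier. All names below are hypothetical.

# Each source system refers to the same physical asset by its own tag.
SOURCE_TAGS = {
    ("maintenance_cmms", "water pump 1"): "PLANT1.AREA2.PUMP_COOLING_01",
    ("control_system",   "FIC105"):       "PLANT1.AREA2.PUMP_COOLING_01",
    ("historian",        "P-105.FLOW"):   "PLANT1.AREA2.PUMP_COOLING_01",
}

def canonical_id(system: str, tag: str) -> str:
    """Resolve a system-specific tag to the plant-wide canonical identifier."""
    try:
        return SOURCE_TAGS[(system, tag)]
    except KeyError:
        raise KeyError(f"No canonical mapping for {tag!r} in {system!r}") from None

if __name__ == "__main__":
    # All three local names resolve to the same asset in the unified model.
    for system, tag in [("maintenance_cmms", "water pump 1"),
                        ("control_system", "FIC105"),
                        ("historian", "P-105.FLOW")]:
        print(f"{system:17s} {tag:14s} -> {canonical_id(system, tag)}")
```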
This is certainly easier with the data that will be aggregated in the future than with the huge data lakes that already exist and that various tools already access, isn't it?
This is one of the biggest challenges in process engineering in particular, as a lot of data has already been incorporated into a large number of applications and is therefore virtually “in use”. Creating standardized designations, and thus a single source of truth that can be used by all systems, is a difficult task. Many are currently hoping that artificial intelligence will be able to do this work for us in a much shorter time. Emerson is also working with various start-ups to drive this development forward and ultimately enable an AI to merge the various data models into an overarching, ISA-95-compliant metamodel. The first step is to create a powerful data infrastructure that can connect to all these data silos and become this single source of truth.
The data is then no longer stored in individual silos, but centralized in one place?
That would be ideal from our point of view. It is also necessary for this metamodel to be much broader than existing models, so that not only numerical data but also documents, vibration spectra or even images, for example, can be recorded. Such a plant data model would then also have to be available centrally so that all systems can select what they need individually. It would not matter where the original data source is located in the plant. You could then even implement cross-domain applications. Together with AspenTech, we are developing precisely such automation and optimization applications that are “integrated by design” from the outset.
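What such a broader, centrally available asset record could look like is sketched below in Python; the field names and structure are hypothetical and do not represent the actual Unifying Data Fabric schema.

```python
# Minimal sketch of an asset record that holds more than numerical process
# values: documents, a vibration spectrum and image references live alongside
# measurements. Field names are hypothetical, not an actual Emerson schema.
from dataclasses import dataclass, field

@dataclass
class Measurement:
    tag: str            # canonical tag, e.g. "PLANT1.AREA2.PUMP_COOLING_01.FLOW"
    value: float
    unit: str
    timestamp: str      # ISO 8601

@dataclass
class AssetRecord:
    asset_id: str                                            # canonical plant-wide identifier
    measurements: list[Measurement] = field(default_factory=list)
    documents: list[str] = field(default_factory=list)       # e.g. P&ID or manual URIs
    vibration_spectrum: list[float] = field(default_factory=list)  # amplitude per frequency bin
    images: list[str] = field(default_factory=list)          # e.g. thermography snapshots

record = AssetRecord(
    asset_id="PLANT1.AREA2.PUMP_COOLING_01",
    measurements=[Measurement("PLANT1.AREA2.PUMP_COOLING_01.FLOW",
                              12.4, "m3/h", "2024-11-07T14:03:00Z")],
    documents=["doc://pumps/cooling_01/datasheet.pdf"],
    vibration_spectrum=[0.02, 0.11, 0.07],
    images=["img://thermal/cooling_01/2024-11-07.png"],
)
print(record.asset_id, len(record.measurements), "measurement(s)")
```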
Historian systems have been developed and implemented precisely for this purpose. Why are they not sufficient?
Correct, historians were originally designed for this purpose, and many companies continue to store their production data in such systems and use them effectively in certain applications. But especially when it comes to plant availability, more than numerical data is needed, as just mentioned. Boundless Automation℠ and the standardized data infrastructure represent a new state of the art and are designed to do exactly what historians cannot. This new concept has also helped Emerson itself to better integrate our partners' systems.
But in the process industry in particular, analog data transmission is still very common, and the basis for comprehensive digitalization has barely been laid. How does this fit in with Boundless Automation℠, which you describe as the “next level of technology”?
We know the current situation in companies, where analog signals are being converted to digital with some effort. But even if we equip all new systems with Ethernet APL, for example, only a fraction of the installed base will be APL-capable. And even in the greenfield projects of the coming years, it will not be the case that 100% of the devices will support APL. This has also been the case in the past when switching to other digital technologies. That's why we've developed what we believe is the perfect migration path, combining APL and analog signaling into a single, field-ready architecture. And this hybrid signaling is likely to remain in place for a long time to preserve field data.
What is your solution for the installed base that already collects digital data?
From our point of view, most of the devices used in the chemical industry are already capable of providing digital data. Emerson and many other suppliers stopped supplying devices that do not support digital protocols such as HART as early as 1995. However, older control systems still mainly use the analog signal. The digital data is then trapped, so to speak, because there is no way out of the device. In addition, these control systems apply their proprietary data models to the sensor data. This is why existing standardized data models such as PA-DIM play a decisive role for us, as they can integrate semantically processed field data into systems. In addition, we have developed a data server that enables users to bring the collected data directly to the cloud via MQTT.
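How field data with a semantic payload could be handed to a cloud broker over MQTT can be sketched as follows, here using the open-source paho-mqtt client; broker address, topic and payload layout are assumptions for illustration, not the interface of Emerson's data server.

```python
# Minimal sketch: publishing a semantically tagged field value to a cloud
# broker over MQTT. Broker, topic and payload layout are illustrative
# assumptions, not Emerson's actual data-server interface.
# Requires the paho-mqtt package (version 2.x API shown here).
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"                    # hypothetical cloud endpoint
TOPIC = "plant1/area2/pump_cooling_01/flow"      # hypothetical topic hierarchy

payload = {
    "asset": "PLANT1.AREA2.PUMP_COOLING_01",     # canonical identifier
    "parameter": "flow",
    "value": 12.4,
    "unit": "m3/h",
    "timestamp": "2024-11-07T14:03:00Z",
}

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect(BROKER, 1883, keepalive=60)
client.loop_start()                              # background network loop
info = client.publish(TOPIC, json.dumps(payload), qos=1)
info.wait_for_publish()                          # block until the broker acknowledges
client.loop_stop()
client.disconnect()
```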
In other words, they rely on edge computing to process the aggregated data close to the source. However, there is still a lack of suitable bridges to bring the data into the IT-related processing systems, isn't there?
This is not surprising when you look at the automation pyramid, which still has a certain validity due to the older existing systems and security paradigms. However, a lot has already happened here in recent years, partly because OT and IT are merging more and more. Today, we see that IT specialists are heavily involved in OT topics and vice versa. There are a lot of bridge builders, but providing data for IT systems is only the first step.
What comes next?
As an industry, we need to continue to leverage technologies that were originally associated with IT, such as the cloud, protocols and data formats like MQTT and JSON, and computing environments like containers at the edge of the network. This is important for the data that is critical to the broader task of enterprise-level optimization across all functional areas: security, production, reliability and sustainability.
Will the IT/OT fusion also be the driver that fuels the digital transformation once again?
In my opinion, cybersecurity is probably driving the most investment and action. It is responsible for many upgrades in the OT world, because in the hierarchy of needs, secure operations take precedence over optimized operations. The influence of the IT department is leading to an increasing “stay-current-to-stay-secure” mentality. This attitude is not necessarily widespread in OT. Put simply, the mentality there tends to be “if it ain't broke, don't fix it”. But companies have to do both: operate securely and optimize.
Hasn't the convergence of IT and OT already achieved a lot here?
The last “great leap” in the integration of OT and IT technologies (which, by the way, will always be functionally different) took place after OT moved to the same technological base as IT in the 1990s: WinTel, Ethernet networks, etc. This at least made it easier to achieve a certain level of integration using protocols such as OPC, which is still the most widely used in the automation industry. But converting the data was still a big task, as we have already discussed, and became the biggest hurdle of any integration project. However, the IT software infrastructure has evolved. Today, there is the cloud, new data exchange standards, containers, SaaS models, edge technology and much more. Incorporating these technologies and rethinking a permanent, universal solution for data integration, as we've discussed here, is necessary to truly achieve what many consider to be the “digital transformation” of their organization.
What is necessary for this transformation to really succeed?
We also need more IT mentality in the automation system itself. Apart from the control software, a modern Distributed Control System (DCS) runs exclusively on open platforms. Today, it is therefore nothing more than “another critical IT system”, like Enterprise Resource Planning (ERP), albeit usually one built on older open technology. And although open technology is the foundation of the DCS, we don't treat it like an IT system, but more like a physical pump. This mindset needs to change.
Doesn't it already because automation today is increasingly software-driven?
Yes, but it's still happening too slowly, even though Emerson now has nine or more software developers for every hardware designer. Unfortunately, I still often see people talking about the DCS and then showing pictures of control hardware. However, pure software automation on standard platforms is certainly not a foregone conclusion, because hardware-based systems have their advantages, especially in terms of latency and availability. That is why we are focusing on edge and containerized solutions to support Boundless Automation℠, which are expected to be available from March 2025.
Will this make hardware-based control systems obsolete?
No, most users from the chemical industry sitting here in the audience at the NAMUR Annual General Meeting will not use them to control their plants. These new systems still lack redundancy, which ultimately jeopardizes plant availability, and latency remains a concern. They are more of a first step, and we assume that users will adopt them in the future.
But why should the chemical and pharmaceutical industry look at software-based automation?
First of all, to avoid the infamous vendor lock-in that comes with proprietary hardware. Software-centric automation effectively prevents this because it runs on non-proprietary hardware. It also maximizes the benefits of Moore's Law, as open systems always run on the latest generation of hardware and therefore always benefit from more performance than proprietary systems ever could. However, the most important reason is virtual controllers and the flexibility they provide. If I rely on hardware, I always have to buy and integrate a new controller when I need more capacity. Adding a virtual controller, on the other hand, is a simple software task.