Open-source protocols are the way forward for the majority of market players
Published on: Saturday, 20-08-2022
Apoorva Dawalbhakta, Associate Director – Research, Quadrant Knowledge Solutions.

Seamless connectivity seems to be a topic generating much interest. Why would production entities need autonomous communication?
Most industries today are digitally transforming and automating their facility layouts, and those layouts have traditionally relied on fixed, wired connectivity. Whether the goal is the creation of customised products or a simple upgrade to a new product range, fixed assets and fixed cabling make reconfiguring or modifying the entire factory set-up a real challenge. To actively implement smart manufacturing, successive stages within the plant must be digitally connected and readily accessible. Precise machine-to-machine and autonomous communication, together with closed-loop and real-time communication, enables effective task completion through advanced robotics or robotic process automation. In a nutshell, dedicated, high-speed, seamless wireless connectivity unlocks a wide range of practical use cases built on autonomous communication: deploying autonomous mobile robots (AMRs) for real-time manufacturing and supply-chain automation, building digital twins for operations optimisation, using augmented reality for rigorous quality inspections, and increasing uptime through asset-condition monitoring.
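To make the machine-to-machine idea concrete, here is a minimal sketch of two plant-floor participants exchanging status messages over MQTT, a lightweight open publish/subscribe protocol widely used in industrial IoT. The broker address, topic name and payload fields are illustrative assumptions rather than any specific vendor's interface; the example uses the open-source paho-mqtt Python client.

```python
# Minimal sketch: one machine publishing telemetry, another subscribing,
# over MQTT (an open pub/sub protocol). Broker, topic and payload fields
# are hypothetical placeholders, not a real plant's interface.
import json
import time

import paho.mqtt.client as mqtt

BROKER = "broker.plant.local"            # hypothetical on-premise broker
TOPIC = "plant1/line3/amr42/status"      # hypothetical topic naming scheme


def on_message(client, userdata, msg):
    status = json.loads(msg.payload)
    # A supervisory system could react in (near) real time here,
    # e.g. re-route the AMR or raise an asset-condition alert.
    print(f"{msg.topic}: battery={status['battery_pct']}% state={status['state']}")


# With paho-mqtt >= 2.0, pass mqtt.CallbackAPIVersion.VERSION1 to Client().
subscriber = mqtt.Client()
subscriber.on_message = on_message
subscriber.connect(BROKER, 1883)
subscriber.subscribe(TOPIC)
subscriber.loop_start()

publisher = mqtt.Client()
publisher.connect(BROKER, 1883)
for _ in range(3):
    payload = json.dumps({"battery_pct": 87, "state": "moving"})
    publisher.publish(TOPIC, payload, qos=1)
    time.sleep(1)

subscriber.loop_stop()
publisher.disconnect()
subscriber.disconnect()
```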
Is the need for real time communication also experienced over the enterprise boundaries? For example, with the vendor subsystems, with delivery subsystems? How about across multiple plants?
Essentially, enterprises never operate in absolute isolation; they thrive in an environment where information flows and is exchanged continuously. As an organisation grows, its interdependencies with partners, suppliers and customers multiply. Organisational agility in refining strategy in real time is imperative for recognising these dependencies and building robust communication between the crucial components of the enterprise ecosystem. Rather than focusing only on enriching industrial assets (for instance, investments in infrastructure and technology), growth comes in leaps and bounds from smooth interaction between internal processes, and that is possible only when fast, real-time and effective communication channels are deliberately developed and nurtured across enterprise boundaries. To dig deeper, real-time communication is not just another tool up the industrial sector's sleeve; it is an absolute imperative for smooth functioning, interoperability and transactions across supply-chain processes, the production line, key administrative departments, partner and vendor lines, and delivery channels. It follows that consistent, unbiased, real-time communication between multiple plants naturally heightens organisational efficiency and has become a prerequisite industry practice that cannot be done without.
Historically major vendors have developed their own communication interfaces and protocols. Each such protocol was embraced by their partners. Equally, since ages there has been a call out for open protocols from the side of major buyers. What actually defines an open protocol? When can a system be said to support an open protocol?
I feel open-source protocols are the way forward for the majority of market players because of the convenience they offer. These protocols tend to be safer and more secure than proprietary ones, since a large number of people and organisations contribute actively and continuously to building them. They are implemented with open-source software (as opposed to proprietary protocols, which rely on a vendor's own software) and allow anyone to contribute through technology research and subsequent testing, which further validates their standards. In short, a protocol that is free for anyone to use, that supports scalability at a reduced cost of implementation, that is easy to modify and deploy, and that leaves users free to select among a variety of protocols based on technical and financial needs can truly be called an open protocol, delivering maximum value to vendors and end users alike. If a system supports a variety of manufacturers, service organisations and vendors, and allows relevant capabilities to be added to its offering in real time, it can be said to support open protocols.
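As an illustration of what "supporting an open protocol" can look like in practice, the sketch below reads a single value from an OPC UA server using the open-source python-opcua (FreeOpcUa) client. OPC UA is one widely adopted open standard maintained by the OPC Foundation; the endpoint URL and node identifier here are placeholders, not references to any real system.

```python
# Minimal sketch: reading one value over OPC UA with the open-source
# python-opcua library. Endpoint and node id are hypothetical placeholders.
from opcua import Client

ENDPOINT = "opc.tcp://192.168.0.10:4840"   # hypothetical machine endpoint
NODE_ID = "ns=2;i=1001"                    # hypothetical temperature tag

client = Client(ENDPOINT)
client.connect()
try:
    node = client.get_node(NODE_ID)
    temperature = node.get_value()
    print(f"Current temperature reading: {temperature}")
finally:
    client.disconnect()
```

Because the protocol itself is openly specified, the same few lines work against equipment from any vendor that exposes a standards-compliant OPC UA endpoint, without a proprietary gateway in between.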
In automation industry, which is a competent body to provide accreditation and registration for protocols? Which is the entity who would define an open protocol? What commercial incentive would such entity hope for? How would upgrades and revisions be handled?
Associations such as the IEEE and the ISA are among the key bodies providing accreditation and registration for protocols, while collectives such as the OPC Foundation define and uphold the standards and quality of these open protocols. It must be noted, however, that such accreditation bodies are essentially the torch bearers and custodians responsible for maintaining and upgrading the standards as required. The real legwork is done by the participating industries and organisations, which contribute freely to the technology and research that enhance the quality of these open protocols. The biggest takeaway, or commercial incentive, for any participating organisation is the fact that it is contributing to its own industry and market by raising and maintaining the baseline standard practices; in that sense, it is giving back to its domain of operation and to society. Updates and revisions should be handled on a real-time, priority basis, so that standards are enhanced continuously once there is sufficient data to justify a revision. Ultimately, it is the end user who benefits most from this regular updating of standards, since the resulting ease of management and efficiency is passed on directly through upgraded products delivered to customers.
How do upcoming technologies such as cloud systems and big data systems propose to deal with this topic?
Multiple research studies point to the fact that investment in big data by major industrial automation vendors is increasing exponentially every year. Big data fundamentals and the resulting visualisation techniques enable 'visual discovery' by presenting current industrial automation information graphically. Optimising business processes in real time is made possible by the IoT value chain, from enterprise applications down to the devices, and can collectively be described as the union of the Internet of Things and big data. This is achieved by collecting data from cloud systems and various other points, such as machine-generated data, enterprise applications and human-generated data, and processing it with big-data analytics. Those analytics support the creation of appropriate decision-making models, through which industry performance goals can be achieved efficiently. Several challenges in implementing cloud and big data systems, namely the need to lower time-to-insight, meet growing scalability requirements and handle events in real time, are also eased by incorporating the Internet of Things. It remains imperative, however, for industrial organisations to review the features and applications relevant to their specific requirements to ensure effective big-data adoption in the industrial automation space and to facilitate truly open communication between different systems. Ultimately, the better the use made of the scattered, disparate and ever-growing volumes of end-user and enterprise-level data, the better the quality of product that can be delivered, tailored to personalised needs, real-time communication and custom requirements. The use of big data systems in industrial automation, in the context of real-time and open-source communication, is therefore a crucial imperative that deserves ample resources and strategic attention from the enterprise.
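As a toy illustration of turning machine-generated data into a decision signal, the sketch below loads a batch of sensor readings into pandas, computes a rolling average and flags readings that drift too far from it. A real deployment would run on a cloud or big-data platform fed by edge gateways or historians; the column names, window size and threshold here are assumptions for illustration only.

```python
# Toy sketch: flagging anomalous machine readings with a rolling mean.
# Column name, window size and threshold are illustrative assumptions.
import pandas as pd

# In practice these values would arrive from edge gateways, historians or
# a cloud ingestion pipeline rather than an in-memory list.
readings = pd.DataFrame({
    "vibration_mm_s": [2.1, 2.0, 2.2, 2.1, 6.8, 2.2, 2.1, 2.3, 7.1, 2.2],
})

window = 5
readings["rolling_mean"] = (
    readings["vibration_mm_s"].rolling(window, min_periods=1).mean()
)
# Flag readings that deviate from the local trend by more than 2 mm/s.
readings["anomaly"] = (
    (readings["vibration_mm_s"] - readings["rolling_mean"]).abs() > 2.0
)

print(readings[readings["anomaly"]])
```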
Does the use of open communication compromise cybersecurity?
As already established, an open communication protocol promotes high interoperability between vendors, eliminating the need for formal proprietary gateways or interfaces. Contrary to popular belief, open protocols, and the open communication they enable, are considered safer because of the strong technological features made possible by contributions from a large number of people and organisations. The more people and organisations that share input about the data and processes they handle, the stronger the database of threats and the protection plans that can be built from it. And the stronger the standards and protocols put in place through such practices, the harder it is for anyone with malicious intent to execute a successful cyber attack. Open communication thus supports enhanced cybersecurity, reflecting a strong commitment to providing the most up-to-date and relevant security features. In a nutshell, ensuring open communication across and within enterprises generally leads to safer practices and more secure systems, helping keep cyber attacks at bay.
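To show that openness and security can go hand in hand, here is a minimal sketch of an MQTT client connecting over TLS with credentials, using the standard security hooks exposed by the open-source paho-mqtt client. The broker address, certificate path, topic and credentials are placeholders, not real infrastructure.

```python
# Minimal sketch: securing an open-protocol (MQTT) connection with TLS and
# authentication. Broker, CA certificate path, topic and credentials are
# hypothetical placeholders.
import paho.mqtt.client as mqtt

# With paho-mqtt >= 2.0, pass mqtt.CallbackAPIVersion.VERSION1 to Client().
client = mqtt.Client()
client.tls_set(ca_certs="/etc/ssl/certs/plant-ca.pem")  # verify the broker's certificate
client.username_pw_set("line3-gateway", "example-password")
client.connect("broker.plant.local", 8883)               # 8883 = MQTT over TLS
client.publish("plant1/line3/press7/status", '{"state": "running"}', qos=1)
client.disconnect()
```

The point is that the security mechanisms (TLS, authentication, certificate validation) are themselves openly specified and widely reviewed, rather than hidden behind a proprietary interface.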
(The views expressed in interviews are personal, not necessarily of the organisations represented)
Apoorva Dawalbhakta is an Associate Director – Research with Quadrant Knowledge Solutions (QKS). With a decade of industry experience in software and strategic technology research, he is currently responsible for planning, managing and conducting global strategic market outlooks, market insights, technology guides, Quadrant's proprietary 'SPARK Matrix Analysis', and user consulting assignments at QKS. Apoorva's primary focus spans domains including Intelligent Process Automation, Data Management, Analytics and AI, IT Networking and Supply Chain Management, amongst others. He is actively involved in multiple consulting and research projects and heads a team of 25 analysts within the organisation. Apoorva has consistently worked with users from different industries on consulting projects, including technology architecture planning, cloud transformation initiatives, vendor selection strategies and operational due diligence.
Apoorva has been a distinguished rank holder throughout his academics, holding an MBA in Marketing Management (Gold Medallist) and B-Tech in Mechanical Engineering (First Class with Distinction) from Pune University, India. He has also authored a book titled – ‘Roots in Oblivion’ and is currently working on his second book – ‘Mind in Oblivion’.
LinkedIn Profile: https://www.linkedin.com/in/apoorvadawalbhakta/