IIoT For Engineers | MARCH 2020
Inside: Confab at ARC Forum p1 | Improve OEE and uptime p2 | Digital transformation constraints p11
Supplement to Periodicals Publication
Graphic courtesy: HighByte

PRODUCTIVITY AND BEST PRACTICES: EDITOR'S COLUMN

The promise of a digital twin

By Kevin Parker, Editor
It was with regret that this editor the last several years didn't have a chance to attend the ARC Forum, but this year he did and was glad of it, as it was a good opportunity to get caught up on discussions about automation-based IT with suppliers, analysts and fellow editors.

Tell the truth

A conversation with Peter Zalevsky, an industry director, OSIsoft, indicated that the company intends to stay true to its roots in its evolution from being the seminal data historian supplier to the more expansive concept of operational intelligence, not straying into the development of machine learning or artificial intelligence applications.

"The truth is most companies get value from the data because of its real-time availability. Take, for example, an instance where the performance of a process degrades rapidly over a six-month period," Zalevsky said. The company's engineers were on the verge of changing their maintenance strategy, but they were able to look at those pumps in the enterprise system and realized all the failing pumps were part of the same lot order.

"What was important was being able to combine operational and enterprise data. IIoT streams from nontraditional sources will lead to better decision making. On the other hand, on the IT side, there's been more acceptance of the idea of a master data pool or an operational data pool. This always seems to lead to arguments about the validity of the data."

For Yokogawa, with its control and measurement solutions, digital transformation is a vehicle that leads users to a portfolio of digital twin solutions, based on first-principles models and technical expertise that at the end of the day tells interested parties "what good looks like," said Kevin Pinnan, a systems consultant with Yokogawa.

Like looking in the mirror

As is well known, Bentley Systems provides software for engineers, geospatial professionals and others involved in the design, construction and operation of industrial plants, utilities and digital cities. The company says it's committed to advancing the digital twin in all its manifestations. For example, Bentley's iTwin Services fundamentally advance BIM and GIS to 4-D digital twins.

"Many companies have embraced some aspects of the digital twin. But we see ourselves as 'all digital twin, all the time.' This means taking applications that have been trapped and making them more accessible to a wider range of users. One example would be PlantSight, which brings ERP or other enterprise data to a wider range of people," said Alan Kiraly, a SVP for asset performance, Bentley Systems.

Jointly developed for the process industries by Siemens and Bentley, the software brings plant data and information together, contextualizes it, and visualizes it, transforming raw data into a digital twin. The paradigm of accessing information through a 3-D model is one people are familiar with, and a good match for those in augmented planning or operational roles, because it allows them to access information not part of their core job function without recourse to multiple systems. SYNCHRO, on the other hand, introduces the time element into projects that involve multiple disciplines and ensures that changes made in one area of a project don't cause problems for other disciplines. Or users can temporally "scroll" through a project, comparing actual with the planned or budgeted scope of work.

Finally, Richard Howells, VP of digital supply chain, SAP, told us that what users wanted from Industry 4.0 or IIoT is "personalization, visibility into supply networks, improved productivity and sustainability." The SAP asset intelligence network makes visible an installed base of equipment, such as pumps. Operators access maintenance strategies and manuals from manufacturers — and manufacturers automatically receive asset usage and failure data, potentially across an entire industry.
www.controleng.com/IIoT
Industrial Internet of Things
THE AGE OF ANALYTICS
Use IIoT to improve OEE, increasing ROI
Use-case examples illustrate the point
By Michael Risse
In the year 2020, discussing overall equipment effectiveness (OEE) seems a little out of date. Whatever it's called — OEE, downtime reporting or asset utilization — it's not a new topic. It might also have scenario-specific names like clean in place (CIP), a variation on a theme: how to quickly cycle a process and return assets to availability. Articles a decade or two old describe how to calculate OEE or measure asset downtime. It's been a foundation of Lean and Six Sigma methodologies for years.

So, what does it have to do with the Industrial Internet of Things (IIoT) and analytics? Use of OEE as a metric improves with IIoT and advanced analytics. First comes relief from years of misery mired in spreadsheets investigating downtime issues. Second, significant savings arise from insights uncovered by downtime analytics. Finally, accelerated decision making and action lead to business transformation.

These substantial benefits reinforce the importance of downtime reporting. But before we look at success stories, let's take a deeper dive into downtime reporting and OEE.
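Those older articles describe the same simple arithmetic still in use today: OEE is the product of availability, performance and quality. A minimal sketch, with hypothetical shift figures:

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """OEE is the product of availability, performance and quality,
    each expressed as a fraction between 0 and 1."""
    return availability * performance * quality

# Hypothetical shift: 420 of 480 planned minutes running (availability),
# 900 units produced against an ideal rate of 1,000 (performance),
# 930 of 950 inspected units good (quality).
availability = 420 / 480
performance = 900 / 1000
quality = 930 / 950

print(round(oee(availability, performance, quality), 3))  # → 0.771
```

The point of the later sections is not this calculation, which is trivial, but how the inputs to it are investigated when the number comes in below plan.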
Downtime is descriptive

Downtime reporting is an example of "descriptive analytics" because it describes what happened, a term associated with reports and data visualization: static and historic. Even if the reports are updated periodically, often using a dashboard, they are still descriptive analytics. Although the numbers may not be static, the calculations and parameters (fault codes, measurements, etc.) are fixed at the point of design (Figure 1).
'Downtime reporting is an example of descriptive analytics because it says what happened.'
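A descriptive downtime report reduces to a fixed aggregation over logged events, defined once at design time. A sketch with hypothetical fault codes and durations:

```python
from collections import Counter

# Hypothetical downtime event log: (fault_code, minutes)
events = [("JAM", 12), ("CIP", 45), ("JAM", 8), ("ELEC", 30), ("CIP", 50)]

def downtime_by_fault(events):
    """Fixed-at-design-time aggregation: total minutes per fault code,
    sorted worst-first -- a classic descriptive (Pareto) report."""
    totals = Counter()
    for code, minutes in events:
        totals[code] += minutes
    return totals.most_common()

print(downtime_by_fault(events))  # → [('CIP', 95), ('ELEC', 30), ('JAM', 20)]
```

Everything about this report — which codes exist, which field is summed — is decided before the data arrives, which is exactly the limitation the next paragraphs describe.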
Therefore, descriptive analytics is considered basic. It calls out what is looked for, defines calculations in advance and then measures and presents in past or present form. But as we all know, things are always changing in industrial plants and facilities, with various factors laying waste to the best-laid plans. As Helmuth von Moltke famously said back in 1880, "No plan of operations reaches with any certainty beyond the first encounter with the enemy's main force." The enemy here is change and its potential negative effects on OEE, downtime and asset utilization. The fact that downtime reporting is
based on a calculation determined in advance of plant operations means that as soon as something interesting happens — as soon as there is an encounter with the enemy — then the report’s value as a picture of the past rapidly decreases. A new approach is therefore required to deal with the enemy, with process engineers leading the charge.
FIGURE 1. Reports as depicted in this screen view are a form of descriptive analytics, describing what happened or what is happening. All figures courtesy: Seeq Corp.

Engineers into the breach

When insight is needed to solve an unknown issue — for example, to figure out what happened that wasn't expected, why the metric is below plan or to understand causation — then downtime analysis is required, along with a subject matter expert such as a process engineer. This approach puts process engineers where they should be: at the frontlines of insight. With advanced analytics they have a solution that is radically faster than the spreadsheet approach.

In analytics terms, what has happened is a transition from descriptive analytics to diagnostic analytics (Figure 2). If descriptive analytics is a report, diagnostic analytics is interactive investigation and discovery, in some cases a root-cause analysis. Diagnostic analytics is how cause is determined and correlations found, and it can include discoveries of best practices and comparisons. All these insights are arrived at very quickly by leveraging computer science innovations (big data, machine learning, etc.) that enable the "advanced" in advanced analytics applications.
FIGURE 2. Process engineers and other experts can use advanced analytics to quickly diagnose problems and find root causes.

Diagnostics in action

To take one example, when drying different products, it is important to understand the impact of various parameters on drying time in order to identify the optimum endpoint. According to Dr. Robert Forest, a development engineer with Bristol-Myers Squibb (BMS), the fundamental question is: "How long should we dry the wet cake to meet our drying endpoint?" If the product is dried for too long, it increases cycle times unnecessarily. But if the product is not dried long enough, it could fail the process control sample, which is a waste of analytic resources.

Prior to using Seeq, BMS collected many different data points from OSIsoft PI tag data, using a simple summary of the statistics to help determine drying times. The company would review the minimum, maximum and average product temperatures; agitator speed; jacket temperature; and drying time. BMS would then compare these data points to the actual solvent loss measured by taking samples throughout the drying process. Manually collecting the data was tedious, time-consuming and error-prone. BMS needed a more automated way to collect and analyze these data.

The typical filter drying process goes through three distinct stages (Figure 3). Stage 1 (static drying) starts out by heating without agitation. Stage 2 continues to heat, but with intermittent agitation. In stage 3, heating continues with continuous agitation. BMS wanted to identify some of the key parameters for each drying stage to optimize drying for a range of different batches. To do this, the company needed to automatically find when the dryer was operating by associating the drying phase with the drying time, jacket temperature, product temperature (maximum, minimum and average) and agitator speed.
FIGURE 3. Coresight view: Visualize three-stage drying process. It was possible to quickly diagnose this drying system and optimize its operation.
The next step was to separate the operations data by the distinct drying stages. For stage 1, where there is no agitation, Seeq was used to find the needed data by simply searching for periods of time when the agitator was turned off for extended periods (rather than intermittently), and combining these search results with a high jacket temperature (to indicate the dryer was operating). To find the summary data for stage 3, a search was set up to first find those periods of time when the agitator was turned on for an extended period, and then combine these with high jacket temperatures.

Identifying the parameters for stage 2 was a little trickier because the agitator is turned on and off intermittently, so there is no constant signal value upon which to base the search. To find stage 2 data, BMS used the system's pattern-searching capability to find all batches in which stage 2 agitator on/off behavior is displayed. Seeq enabled BMS to create a pattern search for the square waveform of the agitator intermittently cycling on and off. This pattern search allows the user to specify a similarity heuristic to hone the search results. It also enables users to combine the pattern search results with the jacket temperature. With the combined results, BMS could exclude periods of time before drying actually started, i.e., stage 2.

As illustrated above, Dr. Forest's team at BMS, in its journey of diagnostic discovery, was able to separate the data into all three stages and automatically calculate the needed statistics. According to Dr. Forest, the ability to use Seeq to search by specific data and overlay batches reduced the time needed to collect the data and saved on average one hour of analysis time per batch.
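Seeq's stage searches are, of course, its own implementation, but the underlying idea for stages 1 and 3 — finding sustained runs of a condition in a sampled signal — can be sketched in a few lines. All signal values, thresholds and durations below are hypothetical:

```python
def sustained_runs(samples, predicate, min_len):
    """Return (start, end) index pairs where predicate holds for at
    least min_len consecutive samples -- e.g. agitator off for an
    extended period while the jacket is hot."""
    runs, start = [], None
    for i, s in enumerate(samples):
        if predicate(s):
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_len:
                runs.append((start, i))
            start = None
    if start is not None and len(samples) - start >= min_len:
        runs.append((start, len(samples)))
    return runs

# Hypothetical samples: (agitator_on, jacket_temp_degC), one per minute.
data = [(False, 80)] * 10 + [(True, 80), (False, 80)] * 5 + [(True, 82)] * 12

# Stage 1: agitator off for an extended period while the jacket is hot.
stage1 = sustained_runs(data, lambda s: not s[0] and s[1] > 60, min_len=8)
# Stage 3: agitator continuously on while the jacket is hot.
stage3 = sustained_runs(data, lambda s: s[0] and s[1] > 60, min_len=8)

print(stage1, stage3)  # → [(0, 10)] [(20, 32)]
```

The intermittent on/off of stage 2 defeats this simple run-length test, which is why a pattern (waveform-similarity) search is needed there.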
Sharing results

Using advanced analytics, as insights are found, they can be published to colleagues as web pages, PDF documents or updated images. Engineers using advanced analytics thus have an integrated approach to
both diagnostic (investigation) and descriptive (publishing) analytics. Of course, integrated investigation and publishing is a benefit compared to using two different tools, but just as important is the fact that engineers can create, publish and update the analytics — without any required intervention from IT personnel or specialists.

Unlike traditional OEE reports, which are set up by IT departments or plant administrators based on a design that can drift away from plan (new assets, new recipes, new season, new raw materials, etc.), advanced analytics enables the front-line process engineer to quickly iterate the underlying calculations and source data as necessary to represent current plant conditions and associated opportunities. The investigation is faster, publishing is integrated and usage is self-service for the employee closest to the process or asset.
Predicting the future

Process engineers take the next step in their efforts by incorporating predictive analytics (Figure 4). Predictive analytics completes the range, extending descriptive (what happened) and diagnostic (why it happened) analytics to what is expected to happen. Instead of an unplanned downtime event, engineers can avert downtime with an early warning system that provides anything from hours to weeks of advance notice, depending on the process, the data and the engineer's expertise.

For customers interested in moving from scheduled to predictive maintenance, for example, the ability to evaluate when maintenance is required and what parts to order is critical. The goal isn't to measure the history of asset failure in downtime or OEE reports, but rather to avoid unnecessary downtime to the greatest extent possible.
Fixed bed catalyst prediction

The challenge for one refinery was to optimize near- and long-term economics by predicting end-of-run for a fixed-bed catalyst system. This required selection and examination of historical data for training the correlations, which were auto-updated as new data became available. Another challenge was to provide insights to enable collaborative analysis and investigation between the refinery licensor and the catalyst vendor.

The solution was to use Seeq formulas to implement first-principles equations to calculate normalized weighted average bed temperature (WABT) for the fixed-bed reactor system. The next step was to normalize WABT for feed rate, feed and product quality, treat gas ratio and so forth. Seeq prediction features were then used to create a model to predict normalized WABT as a function of time within steady-state conditions. This enabled the refinery to determine the end-of-run date versus the known WABT performance threshold, and to apply this methodology to its other fixed-bed catalyst processes.

Benefits included monitoring of catalyst deactivation to allow co-optimization of near-term economics and risk-based maintenance planning. Better prediction of end-of-run allowed more effective analysis of the tradeoff between rate reduction and maintenance costs. Calculation of end-of-life for the catalyst enabled rapid detection of unexpected changes and performance of corrective actions.

FIGURE 4. Predictive analytics empower engineers to identify potential problems before they impact operations.
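Stripped of the normalization for feed rate and quality (which the Seeq model handles), the end-of-run estimate reduces to fitting the deactivation trend in normalized WABT and extrapolating to the threshold. A first-order sketch with made-up numbers:

```python
def end_of_run_day(days, wabt, threshold):
    """Fit a least-squares line wabt = a + b*day to the normalized WABT
    history and extrapolate to the day it crosses the threshold."""
    n = len(days)
    mx = sum(days) / n
    my = sum(wabt) / n
    b = sum((x - mx) * (y - my) for x, y in zip(days, wabt)) / \
        sum((x - mx) ** 2 for x in days)
    a = my - b * mx
    return (threshold - a) / b

# Hypothetical: WABT creeping up ~0.05 degC/day toward a 400 degC limit.
days = [0, 30, 60, 90, 120]
wabt = [370.0, 371.5, 373.0, 374.5, 376.0]

print(round(end_of_run_day(days, wabt, 400.0)))  # → 600
```

Re-fitting as new data arrives is what makes sudden deviations from the expected deactivation slope stand out, which is the "rapid detection of unexpected changes" noted above.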
Final words

Diagnostic, descriptive and predictive analytics can be performed on existing data from SCADA, historian and other manufacturing data sets — and there's one more form of analytics. Unlike the other terms, this last analytics type doesn't have an established name, but it is the ultimate objective of all analytics: do the right thing. Yes, it sounds silly, but engineers using Seeq have been emphatic in their enthusiasm for its effectiveness in quickly figuring out the right thing to do to improve operations. In the past, the time to insight in a spreadsheet was so long, and the work so onerous, that they simply were not able to do the necessary calculations in time to impact and improve production outcomes. But with Seeq applications, days and weeks become minutes and hours, enabling analytics over time to make a difference, consider tradeoffs and optimize for the big picture.

Downtime reporting and its variants like OEE are not a new concept. They are a staple of every operations playbook. But that doesn't mean there isn't an opportunity for a fresh look at the potential of advanced analytics applications to improve all types of analytics. IIoT

Michael Risse is the CMO and vice president at Seeq Corp., a company building advanced analytics applications for engineers and analysts that accelerate insights into industrial process data. He was formerly a consultant with big data platform and application companies, and prior to that worked with Microsoft for 20 years. Michael is a graduate of the University of Wisconsin at Madison, and he lives in Seattle.
BIG INDUSTRIAL DATA
How to make industrial data fit for purpose
Seven steps as a practical guide to data management

By John Harrington
It seems to me that most of the manufacturers I talk to are drowning in data, and yet they are struggling to make it useful. A modern industrial facility can easily produce one terabyte of data each day. With a wave of new technologies for artificial intelligence and machine learning — on top of real-time dashboards and augmented reality — we should see huge gains in productivity. Unplanned maintenance of assets and production lines should be a thing of the past. However, even in 2020, this is not the case.

Access to all this data does not mean it is useful. Industrial data is very raw. The data must be made "fit for purpose" to extract its true value. Also, the tools used to make the data fit for purpose must operate at the scale of an industrial facility. With these realities in mind, here is a practical, step-by-step guide for manufacturers and other industrial companies to make their industrial data fit for purpose.
STEP 1: Start with the use case

Information technology (IT) and operations technology (OT) projects should start with clear use cases and business goals. For many manufacturing companies, projects may focus on machine maintenance, process improvements or product analysis to improve quality or traceability. As part of the use case, company stakeholders should identify the project scope and applicable data that will be required. Make sure the right cross-functional stakeholders are in the room from the project beginning, and that all stakeholders agree to prioritize the project and can reach consensus on the project goals.
'Make sure the right cross-functional stakeholders are in the room from the project beginning.'
STEP 2: Identify target systems

With the use cases and business goals identified, the next step requires identifying the target applications that will be used to accomplish these goals. Characterize the target application by asking these questions:
• Where is this target application located: at the Edge, on-premises, in a data center, in the Cloud or elsewhere?
• How can this application receive data: MQTT, OPC UA, REST, database load or other?
• What information is needed for this application?
• How frequently should the data be updated and what causes the update?

Document your responses and then move on to the next step, in which you will identify your data sources for the project.
STEP 3: Identify the data sources

Industrial data is an important component for addressing industrial and business use cases. However, there are some major challenges with accessing this data and converting it into useful information.

Volume. The typical modern industrial factory has hundreds to thousands of pieces of machinery and equipment constantly creating data. This data is generally aggregated within programmable logic controllers (PLCs), machine controllers or distributed control systems (DCS) within the automation layer, though newer approaches may also include smart sensors and smart actuators that feed data directly into the software layer.

Correlation. Automation data was primarily put in place to manage, optimize and control the process. The data is correlated for process control, not for asset maintenance, product quality or traceability purposes.

Context. Data structures on PLCs and machine controllers have minimal descriptive information — if any. In many cases, data points are referenced with cryptic data-point naming schemes or references to memory locations.

Standardization. The automation in a factory evolves over time, with machinery and equipment sourced from a wide variety of hardware vendors. This hardware was likely programmed and defined by the vendor. The result is unique data models for each piece of machinery, and a lack of standards across the factory and company in all but the very largest and most sophisticated manufacturers.

You can better understand the specific challenges you will need to overcome for your project by documenting your data sources. Characterize the data available to meet the target system's needs by asking these questions:
• What data is available?
• Where is it located: PLCs, machine controllers, databases, etc.?
• Is it real-time data or informational data (metadata)?
• Is the data currently available in the right format or will it need to be derived?
STEP 4: Select the integration architecture

Integration architectures fall into two camps: direct application programming interface (API) connections (application-to-application) or integration hubs (DataOps solutions).

Direct API connections work well if you only have two applications that need to be integrated, the data does not need to be curated or prepared for the receiving application, and the source systems are very static. This is typically successful in environments where the manufacturing company has a single SCADA or MES solution that houses all the information, and there is no need for additional applications to get access to the data.

Direct API connections do not work well when industrial data is needed in multiple applications like SCADA, MES, ERP, IIoT platforms, analytics, QMS, AMS, cyber threat monitoring systems, various custom databases, dashboards or spreadsheet applications. They also do not work well when many data transformations must occur to prepare the data for the consuming system. These transformations can easily be performed in Python, C# or any other programming language, but they are then "invisible" and hard to maintain.

Finally, direct API connections do not work well when data structures are frequently changing. This happens when the factory equipment, or the programs running on it, are frequently changed. For example, a manufacturer may have short-run batches that require loading new programs on the PLCs; the products produced may evolve and require changes to the automation; the automation may be changed to improve efficiency; or the equipment may be replaced due to age and performance. The API approach buries the integrations in code. Stakeholders may not even be aware of integrated systems until long after the equipment has been replaced or changes have been made, resulting in undetected bad or missing data for weeks or even months.
FIGURE 1. Integration hubs are an alternative to using application programming interfaces (APIs) solely when managing the many types of data associated today with automation projects. Diagram courtesy: HighByte
An alternative to direct API connections is a DataOps integration hub. Integration hubs are a new approach to data integration and security that aims to improve data quality and reduce time spent preparing data for use throughout the enterprise. An integration hub acts as an abstraction layer that still uses APIs to connect to other applications, but provides a management, documentation and governance tool to connect data sources to all the required applications.

An integration hub is purpose-built to move high volumes of data at high speeds, with transformations performed in real time while the data is in motion. Since an integration hub is an application itself, it provides a platform to identify impact when devices or applications are changed, perform data transformations and provide visibility to these transformations.
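The practical difference between glue code and a hub is largely one of visibility: each mapping is named, inspectable and fanned out to every consumer. A toy sketch of that idea (all source, target and tag names are invented, and a real hub adds governance, buffering and change detection):

```python
class IntegrationHub:
    """Toy DataOps-style hub: named, inspectable transformations between
    a source payload and each target system, instead of point-to-point
    glue code buried in individual integrations."""

    def __init__(self):
        self.flows = {}  # (source, target) -> transform function

    def register(self, source, target, transform):
        self.flows[(source, target)] = transform

    def publish(self, source, payload):
        # Fan out one source payload to every registered target.
        return {t: f(payload) for (s, t), f in self.flows.items() if s == source}

hub = IntegrationHub()
hub.register("plc/dryer1", "MES",
             lambda p: {"asset": p["asset"], "temp_C": p["t"]})
hub.register("plc/dryer1", "Analytics",
             lambda p: {"asset": p["asset"], "temp_F": p["t"] * 9 / 5 + 32})

out = hub.publish("plc/dryer1", {"asset": "DRYER-1", "t": 80.0})
print(out["Analytics"]["temp_F"])  # → 176.0
```

Because the registry is data, not buried code, a changed source tag shows up as a single broken mapping rather than weeks of silently bad data downstream.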
STEP 5: Establish secure connections

Now that the project plan is in place, begin system integration by establishing secure connections to the source and target systems. Understand the protocols being worked with and the security risks and benefits they provide. Many systems support open protocols to define the connection and communication. Typical open protocols include OPC UA, MQTT, REST, ODBC and AMQP, among others. There also are many closed protocols and vendor-defined APIs for which the application vendor publishes the API protocol documentation.

Ask yourself: Does the protocol support secure connections, and how are these connections created? Some protocols and systems support certificates exchanged by the applications. Other protocols support usernames and passwords, or tokens entered manually into the connecting system or through third-party validation. In addition to user security, some protocols support encrypted data packets, so that in the event of a "man in the middle" attack the data being passed cannot be read.
Finally, some protocols support data authentication: even if the data is viewed by a third party, it cannot be changed. Security is not just about usernames, passwords, encryption and authentication, but also about integration architecture. Protocols like MQTT require only outbound openings in firewalls, which security teams prefer because hackers are unable to exploit the protocol to get on internal networks.

'Protocols like MQTT require only outbound openings in firewalls, which security teams prefer.'
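Whatever the protocol, the certificate-verification posture can be made explicit in code rather than left to defaults. A standard-library sketch of the client-side TLS context an MQTT or REST client would be handed for its outbound connection (mutual-TLS file names are placeholders):

```python
import ssl

def make_client_context() -> ssl.SSLContext:
    """Client TLS context that verifies the server's certificate chain
    and hostname -- the 'secure connection' posture Step 5 asks about.
    An MQTT or HTTPS client library is handed this context when it
    opens its outbound connection through the firewall."""
    ctx = ssl.create_default_context(purpose=ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy TLS
    # ctx.load_cert_chain("client.crt", "client.key")  # mutual TLS, if required
    return ctx

ctx = make_client_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED, ctx.check_hostname)  # → True True
```

The defaults from `create_default_context` already require certificate and hostname verification; the sketch simply pins that down so a reviewer can see it.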
STEP 6: Model the data

The corporate-wide deployment and adoption of analytics or IIoT is often delayed by the variability of data coming off the factory floor. From one machine to the next, each industrial device may have its own data model. Historically, vendors, systems integrators and in-house controls engineers have not focused on creating data standards. They refined the systems and changed the data models over time to suit their needs. This worked for one-off projects, but today's IIoT projects require more scalability.

The first step in modeling data is to define the standard models required in the target system to meet the business goals of the project. At the core of the model is the real-time data coming off the machinery and automation equipment. Most of the real-time data points will map to single-source data points. However, when a specific data point does not exist, data points can be derived by executing expressions or logic using other data points. Data also can be parsed or extracted from other data fields, or additional sensors can be added to provide required data.

These models also should include attributes for any descriptive data, which are typically not stored in the industrial devices but are very useful when matching and evaluating data in the target systems. Descriptive data could be the location of the machine, the asset number of the machine, unit of measure, operating ranges or other contextual information.

Once the standard models are created, they should be instantiated for each asset, process or product. This is generally a manual task, but it can be accelerated if the mapping already exists in Excel or other formats, if there is consistency from device to device that can be copied, or if a learning algorithm can be applied.
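A standard model of this kind pairs mapped real-time points with derived points and static descriptive attributes. A minimal sketch — the tag addresses, the derived "temperature spread" point and the metadata fields are all invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class AssetModel:
    """One instance of a standard model: raw tag mappings, a derived
    data point, and descriptive context the PLC itself does not carry."""
    asset_id: str
    location: str
    tag_map: dict                 # model attribute -> source tag address
    metadata: dict = field(default_factory=dict)

    def apply(self, raw: dict) -> dict:
        # Map cryptic source tags to standard attribute names.
        out = {name: raw[tag] for name, tag in self.tag_map.items()}
        # Derived point: computed from other points, not stored on the device.
        out["temp_spread"] = out["temp_max"] - out["temp_min"]
        # Attach descriptive attributes (location, units, asset number).
        out.update(self.metadata)
        return out

model = AssetModel(
    asset_id="DRYER-1", location="Line 3",
    tag_map={"temp_max": "N7:0", "temp_min": "N7:1"},
    metadata={"unit": "degC", "asset": "DRYER-1"},
)
print(model.apply({"N7:0": 82.0, "N7:1": 74.5})["temp_spread"])  # → 7.5
```

Instantiating the same model for each dryer on the line is then a matter of supplying a different tag map and metadata, which is the copy-and-accelerate step described above.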
STEP 7: Flow the data

When the modeling is complete, the data flows should be controlled model-by-model. This typically is performed by identifying the model to be moved, the target system and the frequency or trigger for the movement. Over time, data flows also will require monitoring and management.
Wrap up

Making industrial data fit for purpose will be critical to manufacturers looking to scale their IIoT projects and wrangle data governance. I hope this article serves as a practical guide to getting started on your next project. IIoT

John Harrington is co-founder and chief business officer at HighByte.
OPERATIONS AND INFORMATION TECHNOLOGIES
Small-to-midsize manufacturers face digital transformation constraints
IT and OT software and platform convergence moves forward

By Joe Brooker
A successful digital transformation involves increasing integration between the enterprise systems that govern enterprise transactions and the systems that manage factory operations. Potential returns from industrial internet of things (IIoT) investments won't be realized unless information technology (IT) and operations technology (OT) are more closely aligned. Yet fewer than 10% of companies have combined OT and IT departments, with the IT function more frequently representing corporate interests and the OT function representing plant or unit priorities. Given these challenges, it is no surprise that almost 75% of companies with IIoT initiatives say they do not consider them a complete success.

Despite this low success rate, managements see digital transformation as key to long-term success: nine out of 10 industrial companies are investing in digital factories. If manufacturers successfully digitalize production, it can pave the way for flexible, just-in-time production, with the goal of reducing inventories even while increasing revenue.

While the IT and OT disconnect is a hurdle impacting digital transformation success, it is not the only one. More than 90% of U.S. manufacturing companies are small businesses, and the majority have fewer than 20 employees. These small companies probably don't have dedicated IT departments, so they face a more basic question: Can we dedicate capital to an experimental project that may fail? The answer: probably not.
Connected applications

Transforming operations in a small-to-midsize company is no small feat and will likely encounter pushback from various business and supply chain functions, including the back office, suppliers, customers and the plant floor. The many challenges associated with digital transformation mean few companies will complete the transition alone. Strategic partnerships with IT and OT providers are important.

As IIoT platforms are integrated and implemented, connected applications have emerged. The value of investing in IIoT platforms will come from applications that take advantage of connectivity to address business needs. The platform is, or will become, a commodity: if every business has one, there's no competitive advantage. If the focus is on improving operational efficiency, this doesn't necessarily require major business transformation.
Connected applications may not necessarily be branded as IIoT. Cloud-based solutions allow smaller companies without dedicated IT departments, operating under capital and labor constraints, to implement connected solutions that deliver business value.
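To illustrate how lightweight such a connected application can be, here is a minimal sketch in Python. The machine ID, topic name and payload fields are invented for the example and do not reflect any specific vendor's API; publishing is shown only as a comment.

```python
# Sketch: packaging one machine reading as a JSON payload that a
# cloud-connected application could publish. All names (machine ID,
# topic, field names) are illustrative assumptions, not a vendor API.
import json
import time

def build_telemetry(machine_id, temperature_c, cycle_count):
    """Return one machine reading as a timestamped JSON string."""
    return json.dumps({
        "machine": machine_id,
        "temp_c": temperature_c,
        "cycles": cycle_count,
        "ts": int(time.time()),
    })

payload = build_telemetry("press-07", 68.4, 15203)
# With an MQTT client library (for example, paho-mqtt), the payload
# would then be published to the cloud broker, along the lines of:
#   client.publish("plant/press-07/telemetry", payload)
print(payload)
```

The point is that the plant-floor side of a connected application can be this small; the heavy lifting (storage, dashboards, analytics) lives in the cloud service.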
Finding IIoT that works

Companies that can dedicate capital and labor to an IIoT initiative will be able to exploit the benefits of digital transformation. This transformation will require IT (enterprise) and OT (factory/operational) changes.

FIGURE 1: Cambashi connects nine IIoT market areas. All graphics courtesy: Cambashi

Having available resources doesn't guarantee success, however. Operating at scale is often accompanied (particularly in larger, well-established companies) by siloed, disconnected operations. While IT departments seek to exploit emergent technologies for competitive advantage, the OT outlook is very different. Reliability is critically important. Operations executives are more concerned about what works than what's new. If technology does need replacing or upgrading, they
want something that can be implemented with minimal disruption.

Nor is this disconnect confined to the factory. IT and OT departments each have preferred suppliers they are accustomed to engaging with. That separation adds to the complications of digital transformation projects.

Operations technology providers, such as Siemens, Bosch Rexroth and Honeywell, among others, have invested heavily in IIoT and connected applications. IIoT offerings that once spanned several solution providers are now gathered in one business unit. GE spun off its IoT unit, GE Digital, into a separate entity. These units work at the plant level rather than the enterprise level for security and technological reasons.

Despite the restructuring, it has not been plain sailing for global giants GE and Siemens. GE's software business has struggled to achieve profitability since being spun off into a separate subsidiary. For Siemens, software revenues increased despite a third-quarter decline in Digital Industries revenue. Siemens says partner-network applications for its IIoT platform, MindSphere, are growing more than 25% per quarter. A brief review of the online marketplace for MindSphere applications finds offerings to explore IoT data, develop dashboards, and automate analytics to create actionable intelligence. Operations-rooted applications emphasize machine connectivity, control and efficiency.

It seems inevitable that operations technology providers will increasingly incorporate IT standards and protocols in their solutions. Blurred IT/OT distinctions, however, present challenges to IT providers as well: it will be harder for IT providers to offer OT software because of the machine-level domain knowledge it requires.
Help from IT providers

Enterprise software providers offer IIoT solutions that allow users to adopt IIoT incrementally. These include IBM, Software AG and PTC, which offer cloud networks, IoT platforms and connected applications. Microsoft (with Azure) and Amazon (with AWS) offer open, cloud-based capabilities and support partner ecosystems. Focusing on a digital transformation initiative can seem overwhelming to even the largest corporations. Enterprise providers increasingly offer industry-focused, packaged software solutions.
Revenue growth of OT/industrial and IT/enterprise providers can be compared for 2017 and 2018, the most recent years with available figures. The data shows IT/enterprise providers grew by 65%, compared with 29% for OT/industrial players. OT providers started from a higher base value, but IT providers are clearly making advances in their connected application offerings, consistent with statistics showing IT/enterprise providers outperforming their OT counterparts in year-over-year growth. Both OT and IT providers are increasing their focus on the Asia-Pacific region.
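The comparison can be made concrete with a small worked calculation. The base revenue figures below are invented placeholders, chosen only to reproduce the reported growth rates; they are not actual Cambashi data.

```python
# Worked example of year-over-year growth. The revenue values are
# hypothetical; only the resulting percentages (65% for IT/enterprise,
# 29% for OT/industrial, 2017 -> 2018) come from the article.
def growth_pct(prev, curr):
    """Growth from prev to curr, as a percentage of prev."""
    return (curr - prev) / prev * 100.0

it_2017, it_2018 = 100.0, 165.0   # IT/enterprise (hypothetical units)
ot_2017, ot_2018 = 400.0, 516.0   # OT/industrial: higher base, slower growth

print(round(growth_pct(it_2017, it_2018)))  # 65
print(round(growth_pct(ot_2017, ot_2018)))  # 29
```

Note that with a higher base, the OT players' 29% can still represent a larger absolute revenue gain than the IT players' 65%, which is why both the rate and the base matter when reading such comparisons.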
More blended IoT offerings

Analysis of the IIoT market shows a changing distinction between the IT- and OT-rooted approaches. OT suppliers increasingly are offering IT solutions, while both IT and OT providers are forming strategic partnerships.

In one example, PTC and Rockwell Automation have entered into a strategic partnership to expand coverage, from design operations to plant floors, through creation of a joint factory software suite. The software combines Rockwell Automation's automation technology and domain expertise with PTC's product design and product lifecycle management knowledge. These hybrid solutions are in place in Ford Motor Co.'s manufacturing systems for automobile and truck manufacturing.

Volkswagen announced it will jointly develop the Volkswagen Industrial Cloud using Amazon AWS technology, with Siemens as the integration partner. Siemens will be responsible for ensuring the equipment and machinery at Volkswagen's 122 plants are efficiently networked in the cloud. Siemens also will make connected applications from its own IIoT platform, MindSphere, available in the Volkswagen Industrial Cloud. This will enable greater data transparency and analysis, laying the technological foundations for productivity improvements at Volkswagen. The agreements with AWS and Siemens pave the way for Volkswagen to digitalize production and logistics across a global supply chain.

Connected market areas vary in growth. In 2017, for example, connected asset and connected production were prevalent, with many diverse companies entering the market to offer applications. Connected asset and connected production remain the big players, but other market areas are emerging over time, including connected transportation and connected product initiatives.

Joe Brooker is industrial IoT analyst with Cambashi.

www.controleng.com/IIoT
Advertisement: Tadiran Batteries

IIoT devices run longer on Tadiran batteries. Proven 40-year operating life.*

Remote wireless devices connected to the Industrial Internet of Things (IIoT) run on Tadiran bobbin-type LiSOCl2 batteries. Our batteries offer a winning combination: a patented hybrid layer capacitor (HLC) that delivers the high pulses required for two-way wireless communications; the widest temperature range of all; and the lowest self-discharge rate (0.7% per year), enabling our cells to last up to 4 times longer than the competition.

Annual self-discharge: Tadiran 0.7%; competitors up to 3%.

Looking to have your remote wireless device complete a 40-year marathon? Then team up with Tadiran batteries that last a lifetime.

* Tadiran LiSOCl2 batteries feature the lowest annual self-discharge rate of any competitive battery, less than 1% per year, enabling these batteries to operate over 40 years depending on device operating usage. However, this is not an express or implied warranty, as each application differs in terms of annual energy consumption and/or operating environment.

Tadiran Batteries, 2001 Marcus Ave., Suite 125E, Lake Success, NY 11042; 1-800-537-1368; 516-621-4980; www.tadiranbat.com
Advertisement: WAGO

MOVE SECURELY INTO THE CLOUD. IIoT READY. BUILT-IN SECURITY.

Supported clouds: WAGO Cloud, Amazon Web Services, Microsoft Azure, IBM BLUEMIX and other cloud services.

Direct field-to-cloud connection with the PFC Series Controllers:
• IIoT-ready with native MQTT and TLS encryption
• Built-in VPN and firewall for increased network security
• Simplify data routing and reduce latency
• Interface with existing controls via onboard fieldbus gateways

www.wago.us/pfccloud