
Control, Instrumentation and Automation in the Process and Manufacturing Industries March 2017

www.controlengeurope.com

OPC Unified Architecture – The new world standard
Addressing the big data knowledge gap
Machine vision: Looking on the bright side
Proof testing: Calibration by another name?


Together facing a brighter tomorrow At Yokogawa, we believe the sky’s the limit. And to reach beyond today’s horizons, we work step-by-step with you to make the unimagined a reality. That’s how we move forward, through the synergy of co-innovation partnership. Join hands with us, and together we can sustain a brighter future. Yokogawa: Building a better tomorrow with you today.

Please visit www.yokogawa.com/eu


CONTENTS

OPC UA continues to move forward at a pace…

Editor Suzanne Gill suzanne.gill@imlgroup.co.uk Sales Manager Nichola Munn nichola.munn@imlgroup.co.uk Production Sara Clover sara.clover@imlgroup.co.uk Business Development Manager Iain McLean iain.mclean@imlgroup.co.uk Group Publisher Dan Jago Production Manager David May Studio Designer Stuart Pritchard

2016 was a busy year for the OPC Foundation, mainly due to the record growth in adoption of OPC UA. Industry analysts are now forecasting that OPC adoption will exceed 120 million installations by the end of this year. OPC Foundation has also promised some exciting things at Hannover Messe in April. There will, for example, be an OPC pod on the Microsoft stand, while SAP will be demonstrating OPC UA on its stand too. We have also just heard that the VDMA in Germany (the largest union of machine builders in Europe) is now also working collaboratively with OPC Foundation. So, interest in the technology shows no sign of slowing down. We report on the latest OPC UA news and developments in a special feature section in this issue (pg 19).

Digitisation is also moving at a great pace. The Industry Report in this issue looks at work being undertaken to create more flexible production workflows and enable shorter production runs. It all comes down to the creation of a digital twin for parts being produced, which will enable each part to make its own decision on the best route through production! (pg 4) There will be a demonstration of this idea at Hannover Messe on Stand C22, Hall 2.

Suzanne Gill – Editor
suzanne.gill@imlgroup.co.uk

INDUSTRY REPORT
4 Work currently being undertaken by the Fraunhofer Institute for Production Technology is putting us on the road towards ever shorter production runs, with the help of digital twins.

EDITOR'S CHOICE
6 Getting a clear view of high flow rates; Interactive communication platform displays live data.

CALIBRATION
8 Many oil & gas operators have managed to dramatically reduce their operating costs. But how does this situation affect the world of calibration?
10 Find the right calibration schedule and comply with the ISO 9001 risk-based thinking approach.
11 Calibration and maintenance can offer some money-saving opportunities for the oil and gas industry by making better use of available data.

MACHINE VISION
12 Machine vision is not as challenging as it used to be. Setup, testing, and start-up is now easier and faster. Mark Hoske reports.
14 A recent whitepaper looks into the causes of image brightness variation in machine vision applications and offers some possible solutions.

BIG DATA MANAGEMENT
16 Big data can help synthesise information and provide users with the data needed to make informed, intelligent decisions. However, developing a process that benefits everyone in the plant from the top down is crucial.
18 SME manufacturers must properly identify and analyse actionable data to ensure they remain competitive.

SPECIAL FOCUS: OPC UA
19 OPC UA sees record growth in adoption.
20 Collaboration between Microsoft and Leuze electronic has resulted in a sensor whose data can be transferred directly to the Azure Cloud via OPC UA without the need for an intermediate gateway.
22 OPC UA is developing fast! Suzanne Gill caught up with some automation vendors to find out more about their latest OPC UA technology developments.

MULTIVARIABLE CONTROL
30 Exploring the base concepts of multivariable control.

FINAL WORD
35 Dr James Truchard, CEO and co-founder of National Instruments (NI), talks about the changes in test and measurement technology in the past 40 years.

Control Engineering Europe is a controlled circulation journal published six times per year by IML Group plc under license from CFE Media LLC. Copyright in the contents of Control Engineering Europe is the property of the publisher. ISSN 1741-4237 IML Group plc Blair House, High Street, Tonbridge, Kent TN9 1BQ UK Tel: +44 (0) 1732 359990 Fax: +44 (0) 1732 770049

Control Engineering Europe

Control Engineering (USA) Frank Bartos, Mark Hoske, Renee Robbins, Vance VanDoren, Peter Welander Circulation Tel: +44 (0)1732 359990 Email: subscription@imlgroup.co.uk Completed print or on line registration forms will be considered for free supply of printed issues, web site access and on line services.


Qualified applicants in Europe must complete the registration form at www.imlgrouponthenet.net/cee to receive Control Engineering Europe free of charge. Paid subscriptions for non-qualifying applicants are available for £113 (U.K.), £145 (Europe), £204 (rest of world); single copies £19.



INDUSTRY REPORTS

ON THE WAY TO production runs of one

Work being undertaken by the Fraunhofer Institute for Production Technology is putting us on the road towards ever shorter production runs, with the help of digital twins.

Traditionally, machines have produced parts in networked, preprogrammed production runs: pieces are turned, milled and measured in a set order. However, if a machine fails or a customer order is changed, production needs to be re-configured, which can be time-consuming and expensive. What if, instead of a central control program issuing commands, the workflow could develop flexibly, with each part deciding for itself the best route through production? Sound like a pipe dream? Well it may not be, as this is what developers at the Fraunhofer Institute for Production Technology (IPT) in Aachen are currently working towards. Entitled 'Service-Oriented Architecture for Adaptive and Networked Production', it functions in a similar way to an automobile

Siemens and Bentley Systems formalise strategic alliance Siemens and Bentley Systems have formalised a strategic alliance agreement to drive new business value by accelerating digitalisation to advance infrastructure project delivery and asset performance in complementary business areas. The two companies have agreed to invest over 50 million euros in the development of joint solutions to enlarge their respective offerings for infrastructure and industry. This work will make use of new


cloud services for a connected data environment to converge respective digital engineering models from both companies. In addition approximately 70 million euros of secondary shares of Bentley’s common stock have been acquired by Siemens, under a company program that will continue until such time as Bentley Systems’ stock is publicly traded. The new investment initiatives will involve virtually all Siemens


divisions accumulating intelligence from Siemens solutions throughout Bentley's complementary applications for design modeling, analytical modeling, construction modeling and asset performance modeling. As a result, the integrated and accessible digital engineering models, such as the 'digital twin' viewed through an immersive 3D interface, will, according to Siemens, enable unprecedented operational performance, visibility and reliability.


navigation system that uses current data to determine the best route in real time. Each part carries information regarding the next production stage – which machine will be called into operation is purposely left undecided. Only when a production stage is pending does the system select a machine from those that are readily available. Each part bears a QR-Code identifying it as a unique entity.

Digital twins

The software remembers what was done to each part at each production stage. For example, 'Hole is drilled with machine parameter A and tool X'. A digital twin emerges from this history, displaying at any time where its physical counterpart is in the production process. Digital twins can offer benefits to manufacturers of a wide variety of goods because updating or changing a production run does not require a system overhaul. The 'Smart Manufacturing Network' manages the digital twin, always analysing and reusing its process data to improve process robustness and product quality. "Networking machines with parts will enable companies to produce one-off products in the future – production runs of one," said Michael Kulik, project leader at Fraunhofer working on the software development. A unique aspect of the system is the menu which configures a production sequence. Using drag-and-drop, the user selects individual steps from a list of all services and arranges them in the desired order like building blocks. If a machine fails, a part is simply rerouted to another available machine. "Many machines in a production line can perform a variety of tasks," continued Kulik. "For example, a five-axis milling machine can also do the job of a simpler three-axis milling machine. In the future, the Smart Manufacturing Network's service-oriented software will be able to

flexibly decide whether to do the job on an idle five-axis machine." To enable flexible production it is important that machines from various manufacturers are able to easily integrate into the Smart Manufacturing Network. IPT is working on this with partners from science and industry in Fraunhofer's 'Networked, Adaptive Production' performance center. "The plug-and-play that we know from everyday

technology does not yet exist in industry,” said Dr. Thomas Bobek, coordinator of the Fraunhofer performance center. “However, our goal is to make plug-and-produce possible.” Fraunhofer’s researchers will demonstrate how the digital twin, service-oriented software and Smart Manufacturing Network collaborate at Hannover Messe at Stand C22, Hall 2.
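To make the idea concrete, the sketch below shows one way the "select a machine only when a step is pending" behaviour could be expressed in code. It is a minimal illustration written for this article, not Fraunhofer IPT's actual software, and every class, service and machine name in it is invented.

```python
# Minimal sketch of the dispatching idea described above: each part carries its
# remaining process steps, and a machine is only chosen when a step is pending.
# This is NOT Fraunhofer IPT's software; all names here are invented for illustration.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Machine:
    name: str
    services: set            # capabilities offered, e.g. {"drill", "mill_3axis"}
    busy: bool = False

@dataclass
class Part:
    part_id: str              # e.g. the QR-Code identity
    pending_steps: List[str]  # ordered list of required services
    history: list = field(default_factory=list)   # the "digital twin" record

def dispatch_next_step(part: Part, machines: List[Machine]) -> Optional[Machine]:
    """Pick any idle machine that offers the part's next required service."""
    if not part.pending_steps:
        return None
    needed = part.pending_steps[0]
    for m in machines:
        if not m.busy and needed in m.services:
            m.busy = True
            part.history.append({"step": needed, "machine": m.name})
            part.pending_steps.pop(0)
            return m
    return None   # no capable machine is idle right now; try again later

# Example: a 5-axis mill stands in for a 3-axis job because it is the one idle.
machines = [Machine("mill_3axis_01", {"mill_3axis"}, busy=True),
            Machine("mill_5axis_01", {"mill_3axis", "mill_5axis", "drill"})]
part = Part("QR-0001", ["mill_3axis", "drill"])
chosen = dispatch_next_step(part, machines)
print(chosen.name if chosen else "waiting")   # -> mill_5axis_01
```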

Industrial robot deployment grows

The global market for robotics in manufacturing is steadily gathering pace, says Frost & Sullivan. This, it says, is driven by the Industrial Internet of Things (IIoT). Digitisation and human-robot collaboration are set to transform manufacturing business models, it says, despite many manufacturing companies currently being unclear about its benefits. Major industry contenders are now investing in intuitive large robots for factory operations. As a result, adoption of industrial robots in factories is expected to witness a compound annual growth rate (CAGR) of 14.4% during 2016-2023. "Cloud, a major enabler of IoT and data analytics, will disrupt industrial manufacturing as manufacturers turn to software/data-driven services apart from legacy automation systems," said Sharmila Annaswamy, Industrial Automation & Process Control research analyst. "The convergence of information technology (IT) and operations technology (OT) will drive collaborations between robot manufacturers and communication and software providers. By 2023, the global industrial robotics market is expected to reach USD 70.26 billion." The 'Industrial Robotics – Decoding the Robotics Impact on Manufacturing' study is part of Frost & Sullivan's Industrial Automation & Process Control Growth


Partnership Service programme that includes insights on Big Data for manufacturing, digital factories, supply chain evolution, Services 2.0, safety-security for connected enterprises, Industry 4.0, contract manufacturing, and emerging economies of scale. Benefits such as better utilisation of factory floor space and 25% reduction in installation cost will draw customers to collaborative robots over their traditional counterparts. However, market participants must strictly consider existing risk assessment methodologies and implement improvised safety regulations to ensure customers get the best value. This will encourage deployment of collaborative robots for niche applications such as assembling electrical or automotive parts. Key challenges facing industrial robotics companies include industrial cloud security concerns, low awareness, making cloud implementation seamless and cost effective, and boosting the skillset of resources to keep pace with the evolving manufacturing technologies. "Emphasis is also required on making industrial robots futuristic and economical through new business models, such as collaboration-as-a-service, plug-and-play, and robotics-as-a-service, which focus on quick returns on investment and lasting customer satisfaction," added Annaswamy.



EDITOR’S CHOICE

Getting a clear view of higher flow rates

Mass Flow ONLINE B.V., the e-commerce channel of Bronkhorst High-Tech B.V., has added two new models to its MASS-VIEW series – the MV-108 mass flow meter and MV-308 mass flow regulator. These new additions extend the MASS-VIEW product line to gas flow rates up to 500 ln/min (N2-equivalent), fulfilling demands from industrial users for gas supply in burner applications,

Functional safety frequency-to-DC transmitter

Forming part of its FS Functional Safety Series, Moore Industries has released a SIL 3 capable SFY Functional Safety Frequency-to-DC transmitter with display to offer accurate monitoring of frequency or pulse signals in Safety Instrumented Systems (SIS). This can help provide overspeed protection by sending signals that warn the logic solver to alarm or shut down the monitored device for plant, process and personnel safety. The SFY conforms to IEC 61508: 2010 standards for safety-related applications. It is designed for use in both process and factory automation SIS, including turbine flow meters, magnetic pickups, dry contact closures, variable frequency drives, turbine tachometer generators, rotating equipment, motor and conveyor speed as well as pulse and frequency output transducers. Users can monitor frequency, period, high or low pulse width, and contact closure signals. The SFY converts the input signal to a proportional input-to-output isolated 4-20mA output ready for direct interface with a safety system, readout instrument, recorder, PLC, DCS, or SCADA system.


chemical processes, and systems in the food & beverage industry. The instruments can provide local display as well as an electronic output signal and offer accuracies of ±1% RD plus ±0.5% FS up to 250 ln/min and ±2% RD for higher flow rates. Both models have four pre-installed ranges (100%, 40%, 20% and 10% of maximum FS range) selectable via a

menu, using a four-way navigation button. The user can select from one of 10 pre-installed gases – Air, N2, O2, Ar, CO, CO2, N2O, CH4, C3H8, C4H10 – eliminating the need to recalibrate for different gases. The maximum operating pressure is 10 bar(g).
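For readers new to combined reading/full-scale accuracy specifications, the short sketch below shows how such figures are commonly turned into an error band for a given reading. The way the two terms are combined here is a conventional interpretation of the published numbers, not a vendor-supplied calculation, so the instrument datasheet remains the authority.

```python
# Worked example of a "% of reading (RD) + % of full scale (FS)" accuracy spec,
# using the figures quoted above for the 500 ln/min instrument. How the two terms
# combine is our reading of the spec, not the vendor's official method.
FULL_SCALE = 500.0   # ln/min (N2-equivalent)

def error_band(reading_ln_min: float) -> float:
    """Return the +/- error band in ln/min for a given reading."""
    if reading_ln_min <= 250.0:
        return 0.01 * reading_ln_min + 0.005 * FULL_SCALE   # ±1% RD + ±0.5% FS
    return 0.02 * reading_ln_min                            # ±2% RD above 250 ln/min

for flow in (50.0, 250.0, 400.0):
    print(f"{flow:6.1f} ln/min -> ±{error_band(flow):.2f} ln/min")
# 50.0 -> ±3.00, 250.0 -> ±5.00, 400.0 -> ±8.00
```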

Interactive communication platform displays live production data

Bosch Rexroth has launched an interactive communications platform which, it says, is set to play a key role in the adoption of smart manufacturing. ActiveCockpit is a software solution that collates, filters and visualises data continuously, enabling operators to react quickly to any changes in the production process. It provides a digital connection between operator, product, workstation and process, collating data on a touchscreen whiteboard located close to the production process. Key production data is displayed live, in both stationary and mobile forms. The system translates data into status

charts, annotation, notes and ‘to do lists’ in a format that will make sense to all staff members. The latest figures are displayed in real-time on the interactive touchscreen and can be directly analysed through annotation functionalities.

Simulation solution transforms product development

An engineering simulation software solution from ANSYS can be used across the entire product lifecycle. ANSYS 18 is said to expand the boundaries of simulation upfront in the development process to digital exploration, as well as downstream with digital twins, extending simulation to the operations and maintenance of products. The software can be used upfront in the development process to evaluate changes in design and downstream of the product lifecycle to analyse real-time


operational data. Traditionally, early design decisions can lock in most of a product's costs. Through digital exploration capabilities ANSYS 18 changes this as users are able to test hundreds of 'what-if' concepts early in the design phase to quickly assess product performance for strength, power, thermal, pressure, flow rate, electrical or a number of other performance requirements. This helps identify optimal combinations while eliminating outlying designs – saving time and money.


Visit us! 15-16th March, London ExCeL Centre, Stand 1836

Transparency at your fingertip: www.br-automation.com/factoryautomation

APROL factory automation – Smart factory solutions for your production
APROL EnMon – Energy consumption at a glance
APROL ConMon – Reduced downtime and maintenance costs
APROL PDA – Line monitoring, manufacturing intelligence – Seamless data acquisition and analysis

B&R UK & Ireland HQ: B&R Industrial Automation Ltd. | office.uk@br-automation.com | Southgate Park PE2 6YS Peterborough UK | +44 / 1733 371320


CALIBRATION

Proof testing: calibration by a different name?

Today, the focus in the oil and gas industry is clearly on operational excellence – and many operators have already managed to dramatically reduce their operating costs. Andy Morsman discusses how this situation is affecting the world of calibration.

The fall in the price of oil has resulted in the delay or cancellation of many investment projects. Additionally, a large proportion of the assets operating in and around the North Sea are having their operational lives extended beyond that originally designed. Of course this is not limited to offshore rigs – the gas receiving plants that were built in the mid 1980s were designed using the 'CRINE' contract terms – essentially every part of the installation was selected under 'lowest cost wins' rules. The result was the opposite of standardisation. A typical


gas plant could have over 3,000 instruments, but from perhaps ten different manufacturers! Calibration of instrumentation used in regulatory control is mandatory – the introduction of ISO9001 made this very clear for any company using the standard as the basis of their Quality Processes. While ISO9001 has been revised, the task facing an instrumentation technician today is changing. In the past it was acceptable to calibrate every instrument on the plant annually when there was sufficient staff to calibrate 10 to 20 instruments every day. Now Operational Excellence


(OE) task teams are rightly asking questions such as can we make the instrumentation tasks more efficient, can we change the calibration intervals based on the risk of failure or inaccurate readings – can we do more with less staff? Let's just look at the variety of tasks being performed by instrumentation on an asset. There are the day to day operations of the plant, the pressure, temperatures, flows and levels associated with wellheads, separators, compressors, injection systems etc. Then there are the fiscal measurements – those skids that result in tax being paid – so typically


specialist flow rigs that measure the quantity and quality of oil and gas being shipped off a platform or received at an onshore plant. Traditionally, the task of calibrating fiscal and process instruments has been performed by two completely separate teams – and often separate contractors. Safety systems are increasingly becoming totally integrated with their associated Process Control Systems – although there is typically separate hardware and instrumentation, the user interface is similar or even the same. The Safety Instrumented System (SIS) is designed to ensure a process goes to a safe condition if a critical event occurs that could lead to an unsafe condition or even a catastrophic event, such as an explosion or an uncontrolled pollution event. IEC 61511 is used to define the design, installation, operation (and decommissioning) of safety systems, but in practical terms this means that instrumentation engineers are legally required to test the operation of the safety system loops and record the information, making it available for inspection by a government agency. In many respects it is very similar to the regulations faced by instrumentation engineers in the pharmaceutical industry, where the emphasis is on electronic signatures, the competence of staff and integrity of data.

Proof testing

In terms of instrumentation, the proof testing tasks are very similar to a typical calibration, but the key difference is in the frequency of the testing. Of course, the risk-based approach to having an instrument calibrated on its due date is desirable, but it is not exactly critical. With an SIS proof test it is the exact opposite – the test intervals will very likely vary from loop to loop. Each loop is designed to meet a particular Safety Integrity Level (SIL) determined by the possible effect of a critical failure. Each element of the loop, each instrument, valve and switch will have been designed and manufactured to meet the required SIL. However, it all depends on the proof testing interval of the instrument as defined in the safety manual. So an instrument from vendor A will meet, for example, SIL2, but needs to be tested annually, while another apparently similar instrument from vendor B (hidden in the small print of the safety manual) may require testing every three months in order to meet its SIL, and often loops are designed to have similar instruments from different vendors to remove any systemic failure modes. While IEC 61511 applies to SIS, there is also the task of the inspection and testing of all the control and instrumentation infrastructure in hazardous areas. The regular inspection of Ex rated switches, junction boxes, safety showers, etc. is now covered by IEC 60079-17. Again the emphasis is on the legal requirement – not just to perform the inspections but to be able to prove the inspections have been carried out. Often the inspections are visual only to record the condition of the item and raise a corrective work order if necessary.

An obvious similarity

There is an obvious similarity in the requirements of all these various calibrations, proof tests and inspections. So the question is can we take the multiple standard operating procedure documents, put them in an electronic form that we can carry around the platform and efficiently perform the tasks, recording the results and transferring them back to SAP or similar? While the majority of oil and gas customers are currently using separate teams or contractors to perform these tasks, you can see why several companies are 'rationalising' the workforce – making fewer technicians perform more and varied tasks. It makes a great deal of sense, therefore, if they are all able to use the same tools, hardware and software, to perform calibrations, maintenance checks and proof tests, and probably more importantly, from an operations point of view, to have the resulting records readily available and traceable.

Andy Morsman is the business development manager for Beamex Ltd in the UK


Connectivity On Board For Fast and Accurate Weighing The new ACT350 weight transmitter was designed for use in automation. PLC connectivity is supported by pre-configured Device Description Files for easy integration. www.mt.com/ind-act350


CALIBRATION

Taking a risk-based approach to optimise efforts

Find the right calibration schedule and comply with the ISO 9001 risk-based thinking approach.

Regular calibration, based on a thorough process-risk assessment, can help stabilise production while conserving time and resources. Testing too infrequently risks undetected accuracy problems, while testing too often interrupts production. The 2015 revision of ISO 9001 brought a major change regarding the risk-based thinking approach that is found throughout the entire standard. It also impacts the calibration process. Its most relevant update is the introduction of risk management in the quality management process. Risk identification, assessment and management are key activities to maintain quality. To maintain their ISO 9001 certification, companies need to adapt to the new standard by September 2018.

The only way to ensure that measurements deliver the cost, quality and revenue results that help improve profitability is to choose the right procedure and schedule to periodically recalibrate scales. However, industrial environments are tough on weighing equipment. If scales fail or deliver inaccurate measurements, there is a risk of production downtime, poor product quality, regulatory noncompliance, increased liability and potential profit losses.

Risk-based thinking

Many companies are already familiar with risk-based thinking from other standards, such as Good Manufacturing Practice or safety standards. Risk is usually analysed in two parts – the likelihood (probability)

Recommended performance verification activities:
Service contract – Maintenance and repair: Yearly
Tests – Calibration (by service): Yearly; Minimum weight certificate: Yearly; Linearity: Yearly; Eccentricity (by user): Yearly; Repeatability (by user): Quarterly; Sensitivity (by user): Weekly
Weights – Weight 1: 1 000 kg, Class M2 or better; Weight 2: 100 kg, Class M2 or better; Recalibration interval of weights: Every two years
Test tolerances – Sensitivity (Weight 1): Warning limit 1.66 kg, Control limit 5 kg; Repeatability (Weight 2): Warning limit 0.25 kg, Control limit 0.75 kg

Example of a GWP verification for a scale including the risk assessment and recommended testing activities.


that something happens and the severity (impact) if it does. Translating risk-based thinking into the calibration process starts with evaluating the impact of inaccurate weighing results on the business process. Examples could include loss of material and time, out-of-specification results, production stops, product recall and reputational damage. Additionally, the impact of wrong measurements on people and the environment has to be assessed. Finally, it is important to estimate how probable the detection of the faulty measurement is. The more accurate a weighing process has to be, and the higher the negative impact of faulty measurement, the more testing is required. A systematic risk-based review of the weighing process could be the most important step an organisation has to take to optimise calibration effort and comply with the revised ISO 9001 standard. Mettler Toledo has developed Good Weighing Practice (GWP), a standardised scientific methodology for secure selection, calibration and operation of weighing equipment, based on a thorough risk analysis. During a consultation, an expert will objectively assess risks related to the weighing equipment; will help to develop the right risk-management process to prevent failures before they occur; eliminate unnecessary testing when risks are low; and ensure top performance when risks are high. Full documentation for a transition to ISO 9001:2015 can also be provided. Learn more about optimising calibration in a Mettler Toledo webinar: www.mt.com/ind-iso9001

Weighing risk as a function of required accuracy and the business impact of inaccurate measurements:

Accuracy (+/-) | Minor  | Moderate | Significant | Severe
0.01%          | Medium | Serious  | High        | High
0.10%          | Medium | Medium   | Serious     | High
1%             | Low    | Medium   | Medium      | Serious
10%            | Low    | Low      | Medium      | Medium

The higher the required weighing accuracy and business impact of faulty measurements, the more testing is needed.
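The matrix above can also be expressed directly as a lookup, which is convenient when embedding the risk assessment in a maintenance-planning script. The sketch below mirrors the printed table only; how each risk class maps to an actual test frequency remains a site-specific decision.

```python
# Minimal sketch of the risk matrix shown above, mapping required accuracy and
# business impact to a weighing-risk class. The lookup mirrors the printed table;
# translating each class into a test frequency is site-specific.
IMPACT_COLS = ["Minor", "Moderate", "Significant", "Severe"]

RISK_MATRIX = {
    "0.01%": ["Medium", "Serious", "High",    "High"],
    "0.10%": ["Medium", "Medium",  "Serious", "High"],
    "1%":    ["Low",    "Medium",  "Medium",  "Serious"],
    "10%":   ["Low",    "Low",     "Medium",  "Medium"],
}

def weighing_risk(accuracy: str, impact: str) -> str:
    """Look up the risk class for a required accuracy and business impact."""
    return RISK_MATRIX[accuracy][IMPACT_COLS.index(impact)]

print(weighing_risk("0.10%", "Significant"))   # -> Serious
```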



CALIBRATION

An uncertain solution to optimise flow meter management

Calibration and maintenance can offer some moneysaving opportunities for the oil and gas industry by making better use of available data, says Craig Marshall.

Calibration and maintenance tasks are often completed on a calendar-based timescale despite the fact that this method does not take into account the quality of the measurement. During calendar-based calibrations, a meter that is operating well within acceptable performance limits could still be removed and sent for calibration – an added cost that is not actually required. Conversely, the meter could have exceeded its calibration performance within the first few months of service – operating with an undetectable error for a significant length of time.

Risk-based approach

Neither of these options is ideal. It would make more sense financially to have another system in place, particularly when the number of meters requiring calibration in any one installation is considered. One alternative method is a risk-based approach where uncertainty and the financial exposure of the measurements are used to optimise the calibration period. Essentially, this method makes use

of historic calibration data and uses this to predict the meter's expected performance over the next year. Intuitively, this makes sense as many of the operating conditions and factors that affect performance will not change significantly from year to year. Therefore, the same meter, in the same conditions, should operate in a similar manner to the way it has previously. The use of historic data also allows for different stream conditions to be given different criteria for calibration, rather than a blanket calendar cycle. If a stream was particularly erosive, for example, it would make sense that the meter would drift, exceeding its calibration performance limits more quickly than a meter in a less erosive stream.

Using uncertainty

Using previous history alone is a valid method, but coupling it with uncertainty in measurement, and hence financial exposure, can create a powerful tool for optimising calibrations. Uncertainty analyses are essential to determine whether meter measurement systems are capable of meeting performance targets, but they can also be used to determine where improvements in the measurement can

be made. There are different sources of uncertainty that all contribute to the final value. Some are constant over time, such as calibration uncertainty, and there are some that accumulate over time, causing measurements to drift from calibration values such as bearing wear on mechanical devices. Assessing the contribution of each source to the overall measurement uncertainty estimate allows end-users to better reduce uncertainty by focusing efforts on the largest source, or sources. In terms of calibration optimisation, sources of uncertainty based on meter drift are important as they determine the financial exposure faced by the company through changes in measurement performance from calibration conditions. If the calibration timescale question is then answered in terms of financial exposure and cost, it provides a more useable criterion on which to base the calibration. For example, knowing the additional exposure caused by drift and the average cost of calibration/maintenance of equipment over time, it is possible to evaluate the optimal calibration period – and hence the most financially efficient timescale to calibrate the meter and minimise costs. The above method essentially bases the calibration and maintenance timescale on financial risk as opposed to a routine calendar period. Another advantage of the risk method is that it can be updated every time a new calibration data set is obtained. By this method, the produced result becomes increasingly statistically relevant as it is based on the latest data. By using data readily available for most end-users it is possible to set calibration timescales of equipment based on the financial risk from the measurement itself. Considering the number of measurement instruments in use today, this could save a significant amount of time, effort and money. Craig Marshall is a flow measurement consultant at NEL, part of the TÜV SÜD Group.
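As a purely illustrative example of the trade-off described above, assume that drift-related uncertainty grows linearly with time in service and that the resulting financial exposure is proportional to that uncertainty and to the annual value of product measured. Under those simplifying assumptions (ours, not NEL's published method), the expected annual cost has a closed-form minimum, sketched below with made-up figures.

```python
# Illustrative only: a toy model of the trade-off described above. Assumptions
# (ours, not NEL's method): drift-related uncertainty grows linearly at k per year,
# and exposure per year is proportional to the average uncertainty over the
# interval times the annual value of product measured. All numbers are made up.
import math

def optimal_interval_years(cal_cost: float, annual_value: float, drift_per_year: float) -> float:
    """Interval T that minimises cal_cost/T + annual_value * drift_per_year * T / 2."""
    return math.sqrt(2.0 * cal_cost / (annual_value * drift_per_year))

def annual_cost(T: float, cal_cost: float, annual_value: float, drift_per_year: float) -> float:
    """Expected yearly cost: calibration spend plus drift-related exposure."""
    return cal_cost / T + annual_value * drift_per_year * T / 2.0

# Hypothetical figures: 20k per calibration, 50m of product metered per year,
# drift adding 0.05% (0.0005 fractional) of uncertainty per year in service.
T = optimal_interval_years(20_000, 50_000_000, 0.0005)
print(f"optimal interval = {T:.2f} years, "
      f"expected annual cost = {annual_cost(T, 20_000, 50_000_000, 0.0005):,.0f}")
```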



MACHINE VISION

Machine vision lessons learned

Two recent automotive case studies demonstrated the value of high-powered machine vision applications, as discussed at NIWeek 2016. A high-speed, clear, plastic bottle inspection also demonstrated how processing power and smart software can help speed integration of machine vision applications. The case studies were part of the Vision Summit, which looked at high-speed, high-resolution continuous motion inspection.

Fast-moving web blurs defects

Inspecting thread quality in moulded, clear, plastic bottles can be a challenging vision application.

Machine vision is not as challenging as it used to be. Setup, testing, and start-up can be easier and faster than has been the case in the past. Mark Hoske reports on application advice shared at NIWeek 2016.

In the production of automotive 20ft-wide rolls of ceiling foam, human-eye detection in a 600ft/minute web process was impossible, especially with defects in the foam or attached fabric as small as 0.020in, explained Craig Borsack, president at G2 Technologies. Human inspectors were found to be missing tears of over a foot in length in the foam. The vision system integrator used a fourth-generation automated web inspection platform, which evolved from reflective memory,

Four-channel compact GigE Vision system

The new EOS-1300 from ADLINK is a compact embedded vision system designed for use in demanding machine vision and automation applications requiring multiple cameras, high computing power and time deterministic solutions. The EOS-1300 includes 6th Generation Intel Core processors, four Gigabit Power over Ethernet ports, four USB 3.0 ports, and FPGA-implemented digital I/O functions, all in a small footprint. The EOS-1300 is said to be suited to use in industrial applications. Its robust construction includes extended


shock and vibration protection and long-term component availability to eliminate the failure points commonly found when non-industrial PCs are used in industrial vision applications. The EOS is also tolerant to any instant power loss, making it suitable for use in harsh environments. In addition, all I/O connectors are located on one side of the unit for easy installation. The processors used in the EOS provide the computing power needed for demanding machine vision applications. Four independent PoE (power over Ethernet) ports deliver

data transfer rates up to 4.0 Gb/sec, allowing a wide variety of high-resolution Gigabit Ethernet cameras to be used even in high throughput inspection applications. Advanced encoder functions are also provided, making the system perfectly suited to conveyor applications in production lines. Stemmer Imaging has developed an embedded lockdown utility which can be easily deployed to make the system tamperproof.


local storage, to field programmable gate array (FPGA)-based, real-time image processing. "The integrated technologies provided automatic defect detection, marking of defect areas and thickness detection, using off-the-shelf components," said Borsack. The FPGA handled image processing, acquiring the 20ft-wide image at 70 MB/sec bandwidth. A multi-threaded process was used, with information going to the HMI to apply a binary mask and edge detection, which is used for continuous width measurement. A second thread was used for image defect processing and another for image storage to a database for data analysis and recording of defect location and images of the defect. The installation used nine cameras, in three sets of three, to provide the unique orientations that were required to highlight the defects. This solution has reduced waste as it allows defects to be highlighted and the material to be offered at a discount. "The information provided allows customers to easily work around the defects in the material," said Borsack.

Edge detection, pattern matching

FCC Indiana rebuilds clutches for Honda. The vision system originally used by the company was not always able to make the distinction between dirt and scratches, according to Tanner Blair, embedded systems engineer at National Instruments. Issues with the vision system on this application included inconsistent lighting, metallic surfaces, dirt, grease, and scratches. Also in the field of vision were varied rivet types. The customer wasn't interested in changing the lighting and focused on changing the algorithms for image analysis. Within the field of interest, the inspection software focused on circular edge detection from the inside out, dark to light, ignoring a center hole, and providing an accurate inspection, he said.

Bottle thread inspection

There can be challenges with inspecting thread quality in moulded, clear, plastic bottles, explained Vivian Le, application engineer with Graftek Imaging. Sometimes hanging flash can be mistaken for multiple threads. The area of interest was an 'S' measurement, an industry standard for the distance between the top of the bottle and the top edge of the top-most thread. The bottle was backlit and imaged through the full 360°, with an image acquired every 10° of rotation. Within the area of interest, five-measurement pattern matching was used. "The challenge was that the thread shape changed with angle, and the top of the bottle looked similar to threading," said Le. "Pattern matching was used so the software could learn the threads as the template, since views varied depending on rotation. Three pattern matches were used, and the software selected the top score of all the matches. Rakes were used for edge detection, looking at the top edge of the threads and the top of the bottle, and calculating the distance between two points. Continuous acquisition identifies features and dimensions, with variations as small as 0.05in," concluded Le.

Mark Hoske is content manager at Control Engineering. www.controleng.com
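The workflow Le describes (several learned templates, keeping only the best-scoring match, then measuring a distance from detected edges) can be approximated with open-source tools. The sketch below uses OpenCV template matching rather than the NI/Graftek software used in the actual application, and the image and template file names are placeholders.

```python
# Rough approximation of the approach described above using OpenCV, not the
# NI/Graftek toolchain used in the actual application. File names are placeholders.
import cv2

def best_thread_match(image_path: str, template_paths: list) -> tuple:
    """Run normalised cross-correlation against several thread templates and
    return (best_score, top_left_xy) for the highest-scoring one."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    best_score, best_loc = -1.0, None
    for tp in template_paths:
        templ = cv2.imread(tp, cv2.IMREAD_GRAYSCALE)
        result = cv2.matchTemplate(img, templ, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        if max_val > best_score:
            best_score, best_loc = max_val, max_loc
    return best_score, best_loc

score, loc = best_thread_match("bottle_010deg.png",
                               ["thread_templ_a.png", "thread_templ_b.png", "thread_templ_c.png"])
print(f"best template score {score:.2f} at {loc}")
```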



MACHINE VISION

Looking on the bright side

A recently published whitepaper looks into the causes of image brightness variation in machine vision applications and offers some possible solutions.

Machine vision systems are used to make many critical measurements as part of a quality control approach where it is essential that the measurements are both accurate and repeatable. Since machine vision measurements are made from an image of the object on the sensor rather than the object itself, it is important to optimise every element of the imaging process to achieve the best possible image. Illumination is vital in the generation of an image with good enough quality for processing and measurement and LED light is most commonly used for this in machine vision applications. While the type, orientation and wavelength of the illumination can all play a crucial role in delivering good quality images, stability and repeatability are absolutely essential for a vision system to perform


consistently, so undefined variations in illumination are unacceptable. The image brightness is a function of the amount of light arriving at the image sensor. Clearly the illumination source plays a critical role in this, but other components in the optical system, and external factors will have an influence too. Any imaging system which has thresholding, or is looking for subtle features such as surface defects or colour inspection will benefit from improving the repeatability of the light levels. Consistent illumination allows threshold parameters to be set closer to the background level, allowing for the detection of finer features, and increasing the contrast between target features and the background. Applications involving dimensional checking or the detection of gross defects, such as the presence of features, will be more tolerant to variations in illumination intensity.


LED lighting control methods

Continuous lighting: Continuous operation is the simplest mode of LED lighting control. The light is on all the time and the light intensity is proportional to the current supplied to the LED by the lighting controller. The maximum light intensity achievable is 100% of the LED manufacturer's rating.

Pulsed lighting: Pulsed lighting control offers a number of benefits. In pulsed mode the lighting is switched on only when needed and the controller receives a trigger signal when a pulse is required. Pulsing makes it possible to freeze the image of moving objects, making it suited to high speed imaging applications. An additional benefit of pulsed operation is that it is possible to obtain more than 100% brightness from an LED by driving it with more current for short pulses.

Intelligent lighting: The intelligent lighting approach, taken by Gardasoft's Triniti intelligent lighting platform, is said to take lighting control to a new level by networking the LED lighting, camera and imaging software to provide an integrated application with a single graphical interface for set up and control.

Issues affecting output

Controlling the light output from the LED is a fundamental requirement for maintaining constant image brightness and there are a number of factors which affect light output. These include:

Age of the light: The light output from an LED will deteriorate over time and, typically, an LED would be replaced when output falls to below 70% of its initial rating. The rate of decrease of light intensity will therefore be a function of how the LED is driven for a given application and this is more difficult to predict.

Temperature of the light: Gardasoft has identified that as LEDs heat up from 25°C to 90°C, brightness can drop by up to 40%. This is a huge variation that may not be seen during system commissioning, but which can cause variability during normal running.

Variations in the drive to the light: LED light output is proportional to the current through the device, not the voltage, so all LED device manufacturers specify that current control is advised for efficient use. Making use of the characteristic profile of output vs current (provided by the light manufacturer) can enable very linear brightness control, even for high levels of overdriving. Accurate regulation of the current is essential, since 1% variation in current can lead to 1% variation in light intensity.

Environmental effects: It is possible for the environment of the machine vision system to affect LED output. Another factor to consider is condensation when the LED is cold. This may affect light output due more to temperature (cooling) effects than absorption of the light by the condensed liquid.
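The figures above (intensity roughly proportional to drive current, and up to a 40% brightness drop between 25°C and 90°C) lend themselves to a back-of-the-envelope compensation estimate. The sketch below assumes a simple linear derating between those two temperatures; it is an illustration only, not Gardasoft's method, and a real controller would work from the manufacturer's characteristic curves.

```python
# Back-of-the-envelope sketch only: estimates the drive current needed to hold a
# target brightness as the LED heats up. It assumes intensity is linear in current
# and that brightness falls linearly from 100% at 25 degC to 60% at 90 degC (the
# "up to 40%" figure quoted above). Real systems use the manufacturer's curves.
def thermal_derating(temp_c: float) -> float:
    """Fraction of nominal brightness available at a given LED temperature."""
    if temp_c <= 25.0:
        return 1.0
    if temp_c >= 90.0:
        return 0.60
    return 1.0 - 0.40 * (temp_c - 25.0) / (90.0 - 25.0)

def required_current(target_fraction: float, temp_c: float, rated_current_ma: float) -> float:
    """Current (mA) needed for target_fraction of nominal brightness at temp_c,
    assuming brightness is proportional to current."""
    return rated_current_ma * target_fraction / thermal_derating(temp_c)

# Example: holding 80% of nominal brightness on a 700 mA rated LED at 70 degC.
print(f"{required_current(0.80, 70.0, 700.0):.0f} mA")   # ~775 mA
```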

Other factors

While accurate regulation of light intensity is critical in maintaining image brightness, there are a number of other factors which affect the amount of light reaching the image sensor. These include:

Ambient light: Most machine vision systems try to exclude ambient light from the camera view to remove this as a variable factor. However, this is not always possible. Reducing the exposure time will reduce the effect of ambient light. Machine vision lighting generally needs to be proportionally brighter, possibly by using the overdriving feature of a lighting controller. Keeping the exposure time the same but increasing the lighting brightness

allows the iris to be closed, also reducing the effect of ambient light. Sometimes it is possible to separate the two types of light. For example, if visible light is needed for the normal working environment, it may be possible for a robot vision system to use near infrared light. Using narrow range visible lighting, such as LEDs or filtered conventional lighting, prevents the visible light affecting the infra-red imaging.

Variations in lighting and camera exposure can also affect imaging brightness. When using pulsed lighting, it is essential to get the timing of the lighting pulse and the camera exposure perfectly aligned in order to optimise image brightness. If they are not in alignment, then the image will appear dark or, in the worst case, no image will be seen. By adjusting the lighting pulse delay, the two can be brought into alignment. The Triniti intelligent lighting system, for example, enables the user to set up the timing for a whole machine vision system, with cameras and strobe-mode lighting, all from one place.

As well as dust, dirt, liquids or vapours sticking to the LED, it is also possible for them to adhere to the surfaces of the lens system used, reducing light throughput, which will, in turn, have a detrimental effect on overall image brightness. Compensation can be achieved either by increasing the camera gain or in the software processing of the image. This can include adjusting thresholds and other parameters based on the operator's judgement.


However, starting with an image which is not at the ideal brightness means that the image processing could result in less dynamic range in the final image. It is usually better to maintain the brightness of the original camera image, rather than compensate in the image processing with possibly compromised performance.

Conclusion

When considering variability in image brightness it is essential to think about the tolerance in brightness that is acceptable – either within an individual system or from system to system. Having decided what measures to take to achieve this tolerance, careful consideration should be given to what extra performance or reduced maintenance times could be achieved by taking additional measures to maintain constant image brightness. A copy of the original whitepaper – Guaranteeing consistent illumination – can be downloaded at: www.gardasoft.com/Downloads/



BIG DATA MANAGEMENT

Developing a process to make more informed decisions Big data can help synthesise information to help users make informed, intelligent decisions. Developing a process that benefits everyone in the plant from the top down is crucial, says Patty Feehan.

There is a tremendous amount of information available today and dissecting it can be overwhelming, especially when the information comes from different sources and may not be accessible to everyone who needs it. Big data can help synthesise this information and provide users with the exact information they need to make informed, intelligent decisions. However, knowing how to capture the big data and make it useful is vital. In an enterprise manufacturing system, it is important to be able to see the whole picture when evaluating a problem. For example, an operator on the plant floor can use supply chain information to understand upcoming production schedules, and executives can use the production capacity of their various plants to shift production overloads from one location to another. Just look at trying to estimate a plant's production. The company would need information about production-line capacity, warehouse capacity, personnel utilisation, and sales forecasts, for example. All of this data comes from different sources, and individuals throughout the company may be required to work many hours to build reports in different formats.

Simplifying the process

Manufacturing business intelligence software is designed to help simplify this process. The business intelligence systems available today provide the ability to bring real-time and business data together into a centralised location and allow users to align data collection with goals and objectives. Reports and information generated from these systems are standardised and repeatable and can be available enterprise-wide. Business software systems provide data aggregation,

This schematic shows how the abstraction layer can be used to consolidate data from different sources. Instead of having several bits of data in different locations, all the information about the equipment is combined within a filler object. Data is aggregated into useful information that will now be accessible enterprise-wide.



actionable alerts, and predictive analysis. They allow users to optimise the information and create business decisions quickly and intelligently. The business intelligence software systems include both data collection and visualisation, but the key item that brings everything together is the abstraction layer. This helps normalise the manufacturing process data and make the information relevant to the operational teams. This software is normally a configuration utility or modeling tool which allows you to model and contextualise data pulled from the many disparate systems throughout the manufacturing process. ISA-95 modeling standards were followed in the development of these tools and so it is best practice to utilise the same standards when developing a system. For example, when collecting data from a production process, such as filling, there may be several types

of fillers enterprise-wide, and the users may want to calculate overall equipment effectiveness (OEE) for each. Even though the various fillers may have different inputs and outputs they have the same basic properties. When a filler is defined, the user will want to know the capacity, number of items filled, and downtime. The user will also want to calculate planned production time, which would require production schedules and personnel availability. Lab or quality data could be used to help calculate rejected or scrap material.
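As a concrete illustration of the filler model described above, the standard OEE decomposition (availability x performance x quality) can be computed from exactly those fields. The sketch below is generic; the field names are invented for illustration and are not taken from any particular business intelligence product.

```python
# Illustrative OEE calculation for the "filler" example above, using the standard
# availability x performance x quality decomposition. Field names are invented;
# they are not taken from any particular business-intelligence product.
def oee(planned_time_min: float, downtime_min: float,
        ideal_rate_per_min: float, items_filled: int, items_rejected: int) -> dict:
    run_time = planned_time_min - downtime_min
    availability = run_time / planned_time_min
    performance = items_filled / (ideal_rate_per_min * run_time)
    quality = (items_filled - items_rejected) / items_filled
    return {"availability": availability, "performance": performance,
            "quality": quality, "oee": availability * performance * quality}

# Example shift: 480 min planned, 45 min down, filler rated at 120 items/min.
print(oee(planned_time_min=480, downtime_min=45, ideal_rate_per_min=120,
          items_filled=48_000, items_rejected=600))
```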

A complication

A major complication with making big data useful is the expertise that is needed from so many different resources throughout the manufacturing process. Organising the data within these model-driven software systems provides a collaborative and secure environment

where data is accessible to everyone who needs it. Users will have the ability to access the data they need through pre-developed dashboards and reports or via ad-hoc capabilities using Microsoft Excel and other client tools for creating trends and charts. Successful implementation requires a team effort and all users need to buy into the system. Find champions in various areas of the process to define business requirements and to model the data so it makes sense to everyone. It is best to start small and introduce new data or new functionality slowly. This will ensure the system will be useful for everyone – especially when the company starts considering multi-site implementations. Patty Feehan is a senior developer at Maverick Technologies, a US-based automation solutions provider. This article originally appeared on www.controleng.com


Addressing the big data knowledge gap

Leor Barth warns that SME manufacturers need to properly identify and analyse actionable data to ensure they remain competitive.

Manufacturers are often told that insights from big data analytics will offer a competitive advantage. Research published at the end of 2016 by Warwick Analytics, on behalf of the Alan Turing Institute, found that tools such as ERP, Business Intelligence (BI) and Customer Relationship Management (CRM) are helping manufacturers 'identify hidden information which can be used by organisations to provide valuable insights.' However, for many new adopter manufacturers – particularly SMEs – there remains a lack of knowledge about how this works in practice. Half of respondents to the same survey stated that they did not clearly understand the difference between business intelligence, big data analytics, and predictive analytics. Essentially, big data is the large volume of data – structured and unstructured – that inundates a business on a day-to-day basis. Big data, or rather big data analytics, in manufacturing is about more than the amount of data in the industry. In the words of Matthew Littlefield at LNS Research: 'Big data analytics in manufacturing is about using a common data model to combine structured business system data with structured operational system data with unstructured internal and external data to uncover new insights through advanced analytical tools.' Big data analytics is not all about technology – it's also strategy, and if SMEs have both, they are off to a good start. So, for companies looking to better understand how big data can help their business, the first step is to see how much data the company has at its disposal. Most companies collect volumes of process data but typically use it only for tracking purposes, not as a basis for improving operations. For these players, the challenge is to invest in the systems and skillsets that will allow them to optimise their existing process information. For example, by centralising data from multiple sources, data analysts can draw actionable insights and useful, decision-making information. A key outcome is being able to predict future activities.

An SME big data strategy

For SMEs looking to implement a big data strategy, the following steps provide a good structure to get started:
• Define a strategy – Set priorities and goals and build a roadmap – define what you're looking to achieve.
• All aboard! – Get the right players in your court, all C-level management, IT and an outside consultant if needed.
• Set / define CSFs – Critical Success Factors (CSFs) determine the value of your project from the outset, so establish, implement and agree before you begin.
• Data requirements – Spell it out beforehand, ask (and answer) questions like: What's the volume of data received? Who/what will be the recipient of this data? How do we retrieve it, process it, manage it, secure it?
• Structure is key – Use open, flexible and scalable tools that yield quick (but viable) results.
• Shop wisely – There's no 'one tool fits all' so carefully scan and evaluate the market and its many vendors.
• Pilot phase – Before you embark, it's recommended to sponsor and run a pilot project to build management's confidence and validate your project strategy and roadmap.

Big data analytics is set to have a big impact on the manufacturing world when it is really put to good use – when it is analysed, not just collected and stored. When companies come head-on with vast amounts of data, they need to act fast, generating the 'right' data – the actionable data. Manufacturers need data to drive growth and push performance in an ever more competitive industrial environment.

Leor Barth is vice president of R&D at Priority Software.


INDUSTRY NEWS

First Cyber Factory training facility in the UK

Professor Mehmet Karamanoglu, Design Engineering and Mathematics Head of Department at Middlesex University (left), with Babak Jahanbani, Head of Learning Systems at Festo Didactic (GB) in the new Middlesex University Cyber Factory.

The UK's first Cyber Factory training facility has been installed at Middlesex University London's new Ritterman Building, coinciding with the £18m building's official opening on 2 February. The Cyber Factory has been fitted with the latest technology for smart factories, with training equipment supplied by Festo. Commenting at the launch, Professor Mehmet Karamanoglu, Design Engineering and Mathematics Head of Department at Middlesex University, said: "Middlesex University prides itself in providing the best facilities to support its students. Everything we have developed over the years has been specified to industry standard. Expanding our provision in automation and smart technologies is no exception. "We have partnered with Festo since the early 1990s and have kept that partnership active for over two decades. We are grateful to Festo for bringing along some of its partner companies to enhance our provision, such as Siemens, as all of our automation labs are now fitted with technologies provided by the company. The benefit for our students, of working with such industry partners, is immense and will have a huge impact

on their employability.” Babak Jahanbani, head of learning systems at Festo Didactic (GB), said: “Festo and the University share a common goal to ensure that the key skills necessary to deliver the full potential of industrial automation are

being developed alongside advances in the technology. We have collaborated closely on a number of topics and also partner the University for the annual Mechatronics Competition of World Skills UK.” The training equipment supplied by Festo comprises a comprehensive six-station table-top unit (two production cells of three stations), as well as two bridging stations that enable an Automated Guided Vehicle (AGV) to deliver the logistics/transport between the cells. Fully automated, the facility also features an energy monitoring system; RFID – a method for tracking components and goods by means of tags which transmit a radio signal; a digital maintenance system; augmented reality; near field communication (NFC) – which enables any object equipped with a chip to exchange information directly without the need for a computer or communications network; and a manufacturing execution system (MES). Essentially, this all means that the training facility is Industry 4.0 ready.

Emerson opens oil and gas education facility Emerson has opened new education facilities at its Solutions Centre in Aberdeen, to provide a range of training courses and competency assessment programmes, enabling engineers and technicians to enhance their automation technology skills to help maximise operational performance of the assets on which they work. “North Sea operators are increasingly looking to achieve more from existing assets, and by upskilling their workers it is possible to maximise the potential of installed automation technology and create operational performance improvements,” explained Mark Boyes, business director Scotland at Emerson Automation Solutions. “Our new education facilities in Aberdeen provide training that will help engineers and technicians working in the industry maintain and enhance the skillsets needed to drive


long-term efficiency improvements today and in the future.” Training and competency assessments will focus on the installation, operation and maintenance of automation technology. The facilities include a live liquid flow loop – the first of its kind in the UK – which has three metering streams that can be run in a number of different configurations. This enables training in all aspects of fiscal measurement. Emerson’s training enables operators to assess the competency of their workers in an onshore environment. Replicating real working conditions in this way makes training much safer and more cost-effective. The Solutions Centre is accredited for delivering the level 3 Scottish Vocational Qualification (SVQ) in measurement processes and for delivering a number of courses and tests to ECITB standards.



INDUSTRY FOCUS: FOOD & BEVERAGE

Achieving more accurate fryer control

For a producer of tortilla chip snacks, the complex interaction between temperature, cooking time and oil replenishment in its fryers is critical to maintaining the high quality of its product.


VEGA radar sensors were already in use at the site, monitoring the cooking of raw corn in kettles before it is softened, processed and shaped into the triangular tortilla shape. The chips require precise frying to achieve a crispy, crunchy texture. They are cooked in vegetable oil inside a large fryer. The tortilla chips are fed in at one end via a conveyor, float through the hot oil on a recirculating current, and are then lifted out by another conveyor at the other end. Afterwards they are checked, seasoned and immediately bagged. During this cooking process the quality of the chips is controlled via a complex mixture of cooking time, temperature, recirculation and oil volume. The oil depth and temperature also need to be maintained, and replenishment needs to be constantly and minutely controlled. The fryer runs at an ideal temperature of 186°C and there is a likelihood of some build-up from the oil and other deposits. The oil also changes in density and electrical properties through heating and product contamination. Ideally, any level measurement system needs to have no moving parts and should be easy to clean regularly. The oil level from the fryer is fed into a complex PID algorithm, which seeks to maintain both oil temperature and quantity. The level measurement range is over 220mm and takes place inside a small chamber off the side of the large frying vessel,


with each mm representing hundreds of litres of cooking oil. It is important that the whole range is measured, both during charging of the oil and heating of the process as well as in full production. Heat input needs to be carefully controlled in any fryer system for optimal efficiency and safety.
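To make the control concept more concrete, the sketch below shows a minimal discrete-time PID loop of the kind described above, driving an oil replenishment valve from the measured level. It is an illustration only: the gains, setpoint, sample time and valve interface are hypothetical assumptions, not the plant’s actual algorithm, which also has to coordinate temperature and recirculation.

```python
# Minimal discrete-time PID sketch for fryer oil replenishment.
# Illustrative only: gains, setpoint and the valve interface are
# hypothetical, not the algorithm used at the plant described here.

SETPOINT_MM = 180.0          # target oil level in the measurement chamber (mm)
KP, KI, KD = 4.0, 0.6, 0.1   # hypothetical tuning constants
DT = 1.0                     # control period in seconds

def pid_step(level_mm, state):
    """One control iteration: returns valve demand (0-100 %) and updated state."""
    error = SETPOINT_MM - level_mm
    integral = state["integral"] + error * DT
    derivative = (error - state["prev_error"]) / DT
    output = KP * error + KI * integral + KD * derivative
    output = max(0.0, min(100.0, output))   # clamp to the valve's working range
    return output, {"integral": integral, "prev_error": error}

# Example with a made-up radar level reading of 172.4 mm:
state = {"integral": 0.0, "prev_error": 0.0}
valve_pct, state = pid_step(172.4, state)
print(f"Replenishment valve demand: {valve_pct:.1f} %")
```

In a real fryer loop the output would also be protected by anti-windup logic and interlocked with the heat-input control mentioned above.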

The small dimensions of the 80 GHz radar make it suitable for small vessels in the food industry. It also has hygienic fittings and approvals.

Finding a solution
Engineers at the plant initially tried a high-temperature guided wave radar. However, all guided wave radar sensors have a physical performance restriction when working with poorly reflecting liquids, such as oil-based products. Oil is a poor reflector and so is not easily detected at the extremes of the range, resulting in only the middle part of the range being measured accurately. A capacitance probe was installed in its place on the fryer with some success; it has no moving parts and detected the oil over the full measuring range. However, the oil changes in dielectric strength as it heats and cooks the chips – due to changing carbon levels in the oil – which directly affects the level accuracy of this type of probe. Cleaning a rod-based device wasn’t ideal either, but it offered a better solution than a mechanical float system. The capacitance unit worked acceptably and, until recently, this was the method used – with a ‘best fit’ of oil dielectric in the calibration to maintain control. Ideally, the engineers wanted something more accurate. At the same time, VEGA was looking at new challenges for its pilot 80GHz radar, the VEGAPULS 64, and the

www.controlengeurope.com

fryer application was considered to be a good test of its abilities. Engineers at the factory were interested in how accurate this device could be and whether it could deliver the incremental improvements they were seeking to help increase efficiency. When the radar was installed and set up, one thing was immediately apparent: it had the sensitivity to detect the oil almost as soon as filling began. Sharp focusing meant it was also able to measure the level of liquid inside the chamber over the whole range with no interference from the side connections. The accuracy of control was very consistent and also ‘to the mm’, with no effect from the ‘dielectric’ change. Feedback from this and other pilot units showed how VEGA could further enhance the performance of its antenna systems, enhancements which today are fully implemented on its production devices. The consistency and better control provided by the VEGAPULS 64 now enable the factory to optimise both energy and oil input to an even greater degree, and it is now looking to install the radar sensors on all of its fryers.



INDUSTRY FOCUS: FOOD & BEVERAGE

Drum motors: conveyor system benefits

Chris Middleton explains why you should consider replacing geared motors on food conveyor systems with drum motors.


Conveyors are the veins that carry the lifeblood of a food facility and it is critical that they perform reliably and at the highest level of efficiency. The majority of food processing plants still use conventional geared motors in their conveyor systems. However, this integral piece of equipment has now evolved into the space-saving, sealed-for-life, maintenance-free drum motor. Despite this, gear motors remain popular because they are perceived as being more cost-efficient. Up front, many gear motors do cost only a fraction of the price of drum motors, but with drum motors being more hygienic, compact and energy-efficient, they can actually save money in the long run. For food processing facilities looking to make the switch to drum motors, there are some things to consider. Traditional gear motors tend to be bulky and have exposed surfaces and crevices in their design, allowing water to seep in and bacteria to thrive. To make drum motors USDA compliant, manufacturers have sealed the motor, gearbox and bearings inside a stainless steel shell. A well-designed drum motor will be encapsulated in a smooth outer casing that is conducive to hygiene, can withstand high-pressure hosing and will require minimal maintenance.

Materials are also a critical factor in equipment design. A plant’s equipment must be durable enough to withstand everyday exposure to products as well as routine sanitation programs. Stainless steel is popular among drum motor makers because of its durability and resistance to corrosion. The best design will be compact in order to capitalise on a facility’s space. Traditional gear motors extend beyond the belt with chains, pulley and guarding. This adds to the risk of exposure and also reduces the number of conveyors that can be installed. Conversely, a streamlined design that fits within the confines of the conveyor system will maximise space and increase production.

Efficiency
In the long run, energy-efficient equipment will pay for itself. Drum motors tend to be more energy-efficient than conventional motors. Although the savings may appear insignificant, companies could potentially save thousands of pounds in annual energy bills, depending on the number of motors in their factories. Increased energy efficiency also typically means that less heat is emitted from the equipment, which can translate into lower air conditioning costs and a more comfortable work environment for workers. Another benefit for companies

producing food products that are sensitive to temperature fluctuations is how limiting heat exposure could help maintain product integrity. Certain drum motors are also embedded with encoders that can provide data on how many rotations the drum makes in a given period. This can be used to monitor throughput and can provide critical insight to increase efficiency during production.
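As a rough illustration of how such encoder data can be turned into a throughput figure, the sketch below converts a pulse count collected over a sample window into belt speed and an approximate pack rate. The pulse resolution, drum diameter and product pitch are invented values for the example, not vendor specifications.

```python
import math

# Hypothetical drum motor and conveyor parameters (illustrative, not vendor data).
PULSES_PER_REV = 1024        # encoder pulses per drum revolution
DRUM_DIAMETER_M = 0.113      # drum diameter in metres
PRODUCT_PITCH_M = 0.30       # spacing between packs on the belt in metres

def throughput(pulse_count, window_s):
    """Convert pulses counted over window_s seconds into belt speed and pack rate."""
    revolutions = pulse_count / PULSES_PER_REV
    belt_speed_ms = revolutions * math.pi * DRUM_DIAMETER_M / window_s
    packs_per_min = belt_speed_ms / PRODUCT_PITCH_M * 60.0
    return belt_speed_ms, packs_per_min

speed, rate = throughput(pulse_count=4650, window_s=10.0)
print(f"Belt speed: {speed:.2f} m/s, approx. {rate:.0f} packs/min")
```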

Compliance
It goes without saying that food companies should be taking the necessary measures to ensure they are in full compliance with industry standards, such as those set by the USDA and FDA. With the increasing evolution and enforcement of food safety laws, it is important to verify that drum motor certifications on new products are valid and up-to-date. Drum motors should carry a standard sealing certification, which indicates that the equipment is able to withstand wet and high-pressure washdown applications. It is also beneficial to check whether there are warranties for the drum motor’s components and the drum motor itself. There are also some important factors to consider when transitioning from traditional gear motors to drum motors. While it requires proper research and budgeting, the return on investment can be significant, extending beyond just the bottom line. Chris Middleton is managing director of Interroll UK.

Drum motors can offer many benefits in food industry conveyor applications.





INDUSTRY FOCUS: FOOD & BEVERAGE

A glimpse into the future of the food plant

Ever wondered what the food manufacturing environment might look like in the future? Suzanne Gill spoke to some key industry vendors to get their thoughts.


In a market where consumer purchasing intelligence is becoming increasingly powerful, where internet ordering is growing rapidly, and where logistics are becoming highly dynamic, food manufacturers need to set their sights on becoming more agile and flexible to satisfy the changing and increasingly demanding requirements of both retailers and consumers. The UK currently ranks as the second worst country for productivity among the industrialised nations and, if we could raise productivity to the levels of the USA, each household would be £21,000 per year better off. “For the machine designer it is simply not possible to adapt many existing solutions to achieve the step change in productivity that manufacturers are looking for,”


said Martin Leeming, CEO at Trakrap. “The whole system architecture needs to change from the bottom up and you quite literally have to go back to the drawing board and start again.” Leeming agrees with many others, saying that the answer lies in the digitisation of machines and processes and in the acquisition and proper use of big data. “A big step change in productivity improvement does not lie in simply undertaking the same process with less people. It requires companies to rethink other things like speed to market, asset utilisation, energy reduction, reduced changeover times and making to order, not to stock,” continues Leeming. However, he admits that there are a lot of barriers that need to be broken down to achieve this – only 48% of manufacturers claim to be ready


for Industry 4.0 and it is thought that 40 to 50% of existing machinery will need to be replaced to make the step change the country needs. Keith Thornhill, business manager – Food and Beverage at Siemens UK & Ireland, says that the current rate of technological change within the industrial environment shows no sign of slowing, and as such, accurately predicting how the food industry will be implementing technology solutions in the middle of the next decade is not an exact science. However, he has been able to highlight some definitive trends which look set to influence progress in the next few years. “With competitiveness and changing consumer demands increasingly being seen as industry drivers, it is clear that automated processes – whether they be physical or digital – can assist food manufacturers to deliver cost-effective, repeatable and safe products,” he said. However, to achieve this, there is a need to focus on critical areas of productivity, efficiency and agility. By looking to maximise effectiveness it is possible to take significant steps towards creating the technology-driven solutions necessary for the competitive years ahead. According to Thornhill, there are some identifiable areas that require particular attention. He advises food manufacturers to set a clear mid- to long-term vision of digital integration so that factories can adopt maximum transparency across all areas of product development and production. “This includes a need for closed-loop data integration from the Enterprise Resource Planning (ERP) level right through to production, individual asset performance and back,” he said. “Detailed


machinery and automation specifications also need to be agreed, so that data transparency can underpin strategic decision making and productivity targets can be addressed.” Thornhill also advises that companies seek to reduce obsolescence, both to reduce costs and to ‘future-proof’ available assets.

Investment drivers
Andrew Macpherson, food & beverage manager at Festo, agrees that the pressure from retailers and consumers to safely produce more food, of greater variety, at sustainable prices and high quality, is driving investment in automation. He predicts that the level of automation in the food sector in 10 years will completely change the food production environment. “In the future, food production machines from different manufacturers will need to share data and communicate with each other, using open communication protocols, and will be able to make necessary adjustments automatically. These machines will also be able to tell the operator if a problem is developing, whether performance is dropping or energy consumption is rising, and more importantly what needs to happen to fix it,” said Macpherson. Food manufacturers, suppliers and customers will also become more closely linked. Data relating to consumer demands and trends will be shared to ensure that production is adjusted based on real-time demands. “All of this available data will bring

new challenges, relating to security and how to interpret it. Engineers will have to work more closely with their IT colleagues to get the maximum benefit for their business,” he said. “As the link between the consumer and the food manufacturer gets closer, production machinery will need to become more flexible and able to change to smaller batch quantities quickly.” Macpherson believes that the materials used in the production machines of the future may also be completely different to what is being used as standard today. Take, for example, the development of nanocoating technologies such as plasma coating. “This could allow standard materials to be coated to provide increased bacterial resistance, potentially replacing stainless steel in the food sector,” he said. New technologies will also emerge that help extend the shelf life of food products. These might involve new processes or the use of different gases – for example ozone. “This will raise further challenges for food machinery producers, such as how to handle these new processes while ensuring it is a safe working environment for their staff,” concludes Macpherson.


Flexibility
“The requirement for efficiency monitoring, product tracking and traceability, faster, safer, lower cost processing and packaging solutions and plant flexibility are



growing across all sectors of the food industry,” said Chris Evans, marketing and operations group manager at Mitsubishi Electric – Automation Systems Division UK. This has resulted in rapid advances in automation technology – one of the biggest is the increasing use of small articulated arm and SCARA-style robots to perform repetitive tasks. “Mitsubishi hosts regular seminars on the use of robotics in the food industry and attendees are usually surprised by the low package cost, combined with ease of use and simple integration,” said Evans. “Platform integration is a major theme when it comes to delivering multiple benefits. Users expect an incremental improvement when replacing individual automation components such as drives, servo systems and PLCs. However, if you take a holistic view of your automation system, then you can make significant gains.” For example, a powerful PLC managing a production line can coordinate everything from guided operator pick-to-light systems that improve quality and throughput for manual workers, to conveyor systems, process plant, ovens and chillers, high-speed packaging machines and robot cutting, packing and stacking lines. “The future for automation in the food industry is most certainly going to be centred around increased plant and automation platform integration, simply because the universal advances that producers require cannot be delivered without it,” concludes Evans.


Personalisation trend
The days of mass production appear to be coming to an end, with consumers demanding more highly personalised products. The automotive industry gives us a good example of where this is happening today. It is already possible to choose your own car down to minute detail, and it will be made for you on a flexible production line. The same choices are now also coming to the food market. It is already possible to design your own chocolates and have them made for you on a 3D chocolate printing machine, while here in the UK Boomf, a personalised confectionery producer, has seen business increase by 600% in its second year. “Great though all this choice is for the consumer, it often represents a challenge to the traditional business model which is built on high volume, mass production. If you want to survive then you need to differentiate, and do something different,” said Steve Arnold, business manager Food & Packaging at SMC. New technologies are emerging to help

businesses meet the need for more flexible, agile, smart manufacturing. Data appears to be the key – first collecting it and then being able to interpret it – to allow businesses to make more informed decisions. “New buzz words like ‘cyber physical systems’ are already allowing machine builders to transform a humble electric drive or process valve into a smart sensor, allowing manufacturers to purchase standard production lines but with the capability to run many different product formats in large and small batches and still turn a profit,” said Arnold. “The automotive sector has been doing this for many years and the challenge for the food sector is to embrace this new technology too, allowing for greater flexibility and agility to meet changing consumer demands.”

Keeping pace
According to Mike Wilson, sales & marketing manager at ABB Robotics UK, robotic automation will become a more attractive solution for both production and packaging applications in the food and beverage industry, offering producers a way to keep pace with the demand for greater variety, higher quality and faster delivery. “The UK food and beverage sector has



been investing more in automation in recent years to help it survive the supermarket price war. However, further investment in automation is needed to ensure continued competitiveness,” said Wilson. “Other countries are automating at a much faster rate, with the most productive country in the food and beverage sector being Finland.” According to data supplied by the International Federation of Robotics, Finland adopted 55 more robots per 10,000 employees than the UK in 2014. As factories around the world start to ramp up their use of smart technology, UK food manufacturing needs to focus its mind on the subject too. “As the tools which perform the various processes involved in smart manufacturing, robots are the primary technology that should be considered before implementing a full Industry 4.0 strategy,” advises Wilson. “As cyber-physical systems that use data to operate and communicate with other elements in the factory, robots are one of the most prevalently featured automated machines in any Industry 4.0 model. They are also beneficial to anyone wishing to overcome the typical challenges facing food producers, such as inconsistent product quality, flexibility and reliability problems. Robots can also perform processes which have been identified as high risk by the Health & Safety Executive (HSE).”

A high labour economy
“The UK food industry is a high labour economy and I don’t really see that changing any time soon,” said Robert Brooks, European industry marketing manager – Food & Beverage for Omron Europe. “However, we are seeing a continuing move towards ‘redeployment’ of the workforce for better resource efficiency and this is being driven by both cost and need.” Brooks highlighted the juxtaposition now facing industry – that of a skills shortage coupled with the need for greater automation driven by a need for greater productivity and flexibility. “This means that any automation solutions, such as robotics, will need

to be easy to use,” he said. Brooks believes that there is now an increasing appetite within the food industry for robotic solutions. Around-the-process robotics, such as autonomous intelligent vehicles (AIVs), will also form a fundamental part of the future food manufacturing plant. Initially, applications for such solutions, which are designed to dynamically move materials around a plant without the need for facility modifications, might include raw stock material replacement and replenishment and bringing spares and repairs to the line. “Moving people away from repetitive tasks, and with the potential to link these robots into the production and maintenance processes or ERP systems, the food industry could find a host of applications for this new breed of robots. There are so many opportunities for their adoption within the food manufacturing environment and again, this will be driven by the need for greater efficiency.” Brooks also expects to see increasing adoption of simulation technologies to provide a low-cost solution to visualise the total cost of ownership and proof of concept of any potential new automation solutions before purchase. “Artificial Intelligence (AI) is another technology that is knocking on the door of the food factory,” continued Brooks. “We should expect to see these cognitive thinking systems being employed to make sense of the increasing amounts of data being made available from smart systems. AI technology could be used, for example, to help manufacturers link consumer buying patterns to the weather and environment. There are many areas where cognitive thinking solutions could find applications in the food industry.” For now, though, Brooks advises that engineers need to look more closely at the tools they already have available to them. “I believe that there is far more that could be achieved with today’s robotic and vision solutions. It is important to ensure that the best is made of what is already available while also keeping a close eye on emerging technologies. It is important to take time out to find out how these technologies might fit into your organisation – now and in the future – because you can be sure that some of your competitors will be doing just that.”



HAZARDOUS AREAS

Avoiding the big bang!
Mark Shannon explains why food manufacturers need to be taking dust explosion risks seriously.


A 2011 study carried out by HFL Risk Services, specialists in process safety management, identified that 2,000 dust explosions occurred in factories and refineries across Europe, with 50 of these being in the UK. In spite of the explosion risk management requirements set out in the Dangerous Substances and Explosive Atmospheres Regulations 2002 (DSEAR) and ATEX, devastating dust explosions still occur. As far as the food industry is concerned, there have been some notable dust explosions in the last decade. In 2006 a man was killed in a dust-borne explosion at a milk powder production facility in Holland. The Visalia milk powder processing plant in California succumbed to an explosion that blasted a hole through the


side of the facility in 2008. In 2012, another explosion at a powdered milk facility caused nearly AU$300,000 worth of damage. Sugar dust too has shown its destructive potential in several incidents around the world, one of the most notable in recent times being the Imperial Sugar Plant explosion of 2008 in the US. Accumulations of sugar dust in the factory were ignited, probably by a very small spark. However, the ensuing secondary explosions and fire claimed 14 lives and left nearly 40 people with serious burn injuries. In 2003 in the UK, the British Sugar refinery in Cantley (Norfolk) also suffered a series of devastating explosions. They propagated throughout the facility, causing severe damage, starting in the bucket


elevators and moving through dust extraction systems. The most alarming fact about this incident was that the factory was not operational and hadn’t been for some time. The ignition of the sugar dust was caused by welding repairs on the exterior of a bucket elevator. The real shock was how fast and vigorous the explosion propagation was from the moment of ignition by the welder’s spark. It begs the question, has awareness about combustible dusts grown? And is enough being done to prevent dust explosions?

Combustible dusts
It is commonly known across many processing industries that wood-based dusts are highly flammable. Biofuel, recycling and wood pelleting processes are documented causes of dust explosions. We expect these substances to be a fire or explosion risk. However, they are closely followed by dusts created during the food manufacturing process – familiar suspects like flour, sugar, powdered milk, powdered starches, tea and coffee. Indeed, the HFL Risk Services study identified that 24% of dust explosions occur in the food industry. However, the range of dangerous dusts is far more expansive, encompassing organic materials that most people would not even consider to be an explosion risk. So, is there a need to improve industry’s damage limitation knowledge base? In the US, OSHA has compiled a comprehensive list of combustible dusts that pose an explosion risk during processing (www.osha.gov/Publications/combustibledustposter.pdf). The list comprises a myriad of organic dusts that are generated in various industries, ranging from food manufacturing, pharmaceuticals and petro-chemicals to recycling and agriculture. Substances listed include lesser-known


food dust sources such as tomatoes, egg white and onion, which become potentially combustible materials when processed in powdered form. In 1989 a paper presented by B. Porter at ‘Dust Explosions: Assessment, Prevention and Protection’ gave one of the most accurate documented sets of dust explosion statistics in the public domain. Apart from the expected suspects, like wood-based products and metals, foodstuffs rank quite high on the risk list.

How an explosion occurs
The originating causes of dust explosions may be varied: a spark, friction on badly maintained machinery, an electrical fault, or grinding and milling friction. However, the fuelling and propagation of a possibly fatal explosion is almost always caused by suspended dust in the atmosphere. Such levelling explosions are caused by the trinity of ignition, fuel and oxygen. When these three elements come together in an enclosed area, with rising pressure and rapid increases in temperature, a deflagration can occur. This primary explosion can cause a pressure wave carrying with it a flame that disturbs accumulated dust. Once the agitated dust is in suspension, the extremely dangerous secondary explosion risk is created, with the ability to spread to other parts of the processing equipment and risk its complete destruction. It travels through pipes, ductwork and silos until no part of the processing facility is safe.

Protective measures
To eliminate the relationship between chance and disaster, the UK and European process industries are subject to guidance when it comes to protective measures. In the UK, employers are subject to certain basic requirements according to DSEAR, which simply requires employers to assess and identify the risk; eliminate the risk; provide controls to minimise and protect against the risk; and provide additional controls

European and IEC classification – Definition of zone or division
Zone 0 (gases/vapours) – An area in which an explosive mixture is continuously present or present for long periods.
Zone 1 (gases/vapours) – An area in which an explosive mixture is likely to occur in normal operation.
Zone 2 (gases/vapours) – An area in which an explosive mixture is not likely to occur in normal operation and, if it does occur, will exist only for a short time.
Zone 20 (dusts) – An area in which an explosive mixture is continuously present or present for long periods.
Zone 21 (dusts) – An area in which an explosive mixture is likely to occur in normal operation.
Zone 22 (dusts) – An area in which an explosive mixture is not likely to occur in normal operation and, if it does occur, will exist only for a short time.

to mitigate the consequences, for staff and equipment. ATEX specifies what controls employers should use to prevent explosion risks, divided into danger ‘zones’ as shown in the table above. In theory, when one element of the incendiary trinity is interrupted or controlled, an explosion can be prevented. This invites different methods of protection to mitigate fire risk, ranging from laboratory dust testing and better housekeeping to reduce or eliminate dust accumulation, to venting systems, spark detection devices and explosion suppression systems. For example, chemical suppression and isolation systems detect the start of an explosion (the point of ignition) and deliver dry chemical extinguishing agents into a developing internal deflagration. This suppresses further flame propagation and protects interconnecting process equipment from any spreading explosion damage. Mechanical isolation uses rapid-response valves that isolate each stage of the process so that, should an ignition occur, it won’t spread throughout the process equipment, fuelled by dust along the way. Spark detection devices detect hot particles, sparks and flame that might become the ignition source for a fire or explosion if allowed to travel


on through pneumatic ducting and conveyors towards other materials handling equipment. The most advanced flameless vents intercept, extinguish and retain all burning materials, preventing them from hazardous release into the surrounding environment. These are all approved methods by which ignition, combustion and explosions can be safely controlled. From a business viewpoint, economic loss from halted production due to the destruction of property and equipment is an undesirable and unprofitable position to be in. There are other advantages to adequately safeguarding your process facility against dust explosion risk, such as possible insurance benefits in terms of reduced premiums if an employer demonstrates due care and diligence by installing correct explosion controls. As an employer there is a legal obligation to comply with regulations for the purposes of safety. Given how dangerous food dust explosions can be, it is vital to remain vigilant. Endangering or losing the lives of workers as a result of a dust explosion is a loss too great to put a price on and a cost to reputation that is damaging to future prosperity. Mark Shannon is European sales manager at BS&B Safety Systems.



HAZARDOUS AREAS

ATEX and the control panel
Control Engineering UK finds out more about the creation of unique control panels suited for use in hazardous environments.


Potentially explosive atmospheres, or zoned areas, can appear in applications spread across a number of industries – from obvious ones like oil & gas to less well recognised areas such as food & drink. The primary emphasis for all equipment operating in a zoned area is safety. The identification of different zones categorises the risk and therefore the level of protection required to avoid danger. Any component or system used in a zoned area must meet the relevant standards and must be installed by suitably qualified engineers in order for certification to be valid. The task of designing process control systems that meet DSEAR, ATEX and other similar standards requires considerable levels of competence in order to deliver a suitably compliant process that meets the demands of the application. The primary concern

in such conditions is safety and often this may need to be maintained at the expense of performance.

Understanding the application
Designing and implementing efficient and reliable process control systems also requires a good understanding of the application. Manufacturers have developed a huge range of products and systems that can be combined and integrated to produce bespoke process control systems. However, when the application involves operation within potentially explosive atmospheres, designers could be forgiven for thinking that the choice of design might be restricted. Many manufacturers of control components produce an ‘ATEX’ range or similar and several system integrators have the required expertise to create control solutions for zoned areas. The difficulty lies in creating a design that minimises the number

of component suppliers and ensures compatibility between those that are selected. In terms of the individual components, manufacturers spend considerable amounts of time and effort producing products that are certified for use in particular applications under specific conditions. In situations where the requirement is simply for a replacement part, it is a relatively easy task to ensure continued compliance with the regulations. However, when there is a more wide ranging project – such as modifying an existing control system or building a completely new one – the task of delivering a fully certified installation is more complex.

Centralised or decentralised
Many smaller systems and standalone equipment use a centralised control concept, which is based around a

The Menden Systemhaus maintains a high level of expertise in delivering control cabinets for installations in potentially explosive atmospheres.



control cabinet that contains all of the necessary components to operate the control valves, including the PLC, network connections, input/output systems and valve islands. In many cases, especially in larger facilities, working towards a decentralised system of process control can provide a number of benefits compared to the more traditional approach. This concept uses intelligent, pneumatically operated process valves at the field level which can be equipped with all the required automation components such as a pilot valve with manual actuation, electrical feedback units and optical status indication, field bus interfaces and even positioners and process controllers. To bridge the gap between centralised and decentralised automation concepts, flexible pneumatic valve units and compact automation systems can be used. These units are typically wall-mounted directly inside small cabinets that can be installed close to the process in question. Small, pre-configured and standardised units can eliminate the long runs to valves and field devices, and can be maintained more easily.

Unique scenarios
Each and every client will have a unique scenario that requires the manufacturer of the control panel to take a very individual approach to the project. Bürkert, for example, addressed this requirement with the creation of a network of specialist design and manufacturing facilities. Known as a Systemhaus, this network has been in operation for several years with the purpose of delivering precise fluid control systems designed to meet very specific process requirements and regulatory standards. The Menden Systemhaus, for example, maintains a high level of expertise in delivering control cabinets for installations in potentially explosive atmospheres. From initial concept to on-site commissioning, a dedicated team is assigned to each project to

ensure continuity and understanding of the project requirements are maintained. Due to the multiple competencies within each team, it is possible to design compact, integrated solutions that often contain both electrical and pneumatic circuits and even fluidic connections. In this way it is possible to deliver a single control cabinet, minimising space requirements, as well as all the equipment that will deliver the process control itself, which reduces commissioning time and improves final delivery.

Very often the theoretical design of a control cabinet for a zoned area is the simplest task; finding the components that will deliver the design in reality, while also meeting the required ATEX or IEC-Ex standard, can be more difficult. This is especially true for electro-pneumatic automation systems designed for installation in Zone 1 – a type of product available from only a select number of manufacturers. Such a system is particularly suited to decentralised process control tasks in fine chemicals, pharmaceuticals, cosmetics and oil & gas. It can be used, for example, to automate the filling of potentially volatile solvents, alcohols or lacquers. In these application areas it is necessary to have a compact electro-pneumatic automation solution that allows the integration of EEx ia rated pneumatic valves without additional wiring. Systems such as this can only really deliver the expectations of the client if they are conceived, designed and manufactured by a single, dedicated team with the experience to appreciate the fine detail involved.

While many control panel manufacturers can develop designs, source components and assemble complete solutions, getting these tested and certified with all the necessary documentation can be a more difficult task. Some suppliers may need to develop additional partnerships in order to possess all the required skills to deliver a complete project, and this could lead to an extended project duration. Clearly, the overriding concept for process control within zoned areas is that of safety. However, it is possible to design and deliver efficient and compact systems that provide all of the control and precision required. The key is to understand the application and to be able to deliver a control solution that will conform to all the required standards.



SYSTEMS INTEGRATION

Picking the right systems integrator
Demand for fully turnkey automated lines is accelerating; using a specialist system integrator can de-risk the operation.

Paul Wilkinson offers advice on de-risking the systems integrator selection process.


In the UK there are many companies specialising in delivering automation projects. Some design and provide the technology, components or sub-systems, while others provide full turnkey applications. Then there are system integrators, who specialise in crafting solutions, overseeing the engineering of automated systems and ensuring that all the elements on a manufacturing or packing line work harmoniously together to optimise productivity, reduce waste and address rising manufacturing labour costs. The well-publicised engineering skills shortage has resulted in fewer companies employing skilled in-house engineering experts, so many manufacturers value being able to access the external technical and engineering assistance that these specialists can supply. High-level engineering, automation, operational, communication and IT skills are all necessary, but this


needs to be balanced with sector specific experience too. There is often tremendous value to be gained from working with companies that have refined their skills by taking on complex and bespoke projects and have a track record of integrating disparate automation platforms. Robotic devices are commonplace today and anyone can buy a robot arm. Yet, it is the engineering knowledge and the ability to design the best possible solution for each application that will result in the most cost-effective, long-lasting, reliable and trouble-free solution. To help craft the best solution, employ a well-resourced engineering and project design team which can provide a combination of engineering expertise and deep sector knowledge. It might sound obvious, but make sure you select a system integrator that you feel comfortable collaborating freely with, and one that is happy to hold regular status meetings and that will


advise you on the elements that fall outside of your realm of expertise.

Keeping up
The pace of innovation – coupled with customer demand for innovative products – is accelerating. System integrators need to have a wide breadth of experience from different industries, which will enable them to share best practice from one application to another. Ideally, look for an integrator that will present several automation options for consideration at the early project stages, with different benefits, features and cost implications. Don’t allow yourself to get saddled with an overly complex or expensive system as this might be less flexible in the long run. An emerging requirement is for multi-functional and flexible equipment that can easily be reconfigured for future product cycles. As well as optimising OEE and ROI, much of this equipment is now also compatible with future smart factory and Industry 4.0 trends. Being able to see examples in operation – either at the integrator’s demonstration facility or at a local customer site – can also help in the decision-making process.

Track record
It is important to consider the longevity of your prospective system integrator’s existing relationships, and not just the quality of the strategic partners they have. Finally, placing an order should just be the start of the support you receive from your integrator. Educational workshops, open days and regular site visits are more than just valuable extras. They can help to boost workforce awareness about automation and the productivity benefits that can be realised, so do make sure that these will be available to you. Paul Wilkinson is commercial & information systems manager at Pacepacker.



NEW PRODUCTS

A world first for motor control?
Fairford PLUSS, a patented technology from Fairford Electronics, allows two or more small soft starters to work together to control larger motors. This is said to result in increased operational life, cost savings and improved system reliability for thousands of soft starter applications. Soft starters are often used to control the initial voltage to motors for a jerk-free start in industrial applications, ensuring there is no jump in energy requirement or mechanical stress. In industry alone, 75% of all energy is used by electric motors in applications such as

In-line accelerometer signal conditioning
The new QV/04 in-line signal conditioner joins the QV/02 and replaces the outgoing QV/01. All products in the QV range of in-line converters, from DJB Instruments (UK), provide charge/voltage conversion when supplied with a standard IEPE supply current in the range of 4 to 20mA. The low-impedance line drive is said to maintain signal integrity even over distances of several hundred metres, requiring minimal configuration to interface with vibration analysers and data acquisition systems. The small size and low mass of the QV/04 make it suitable for use in compact multi-channel solutions. Its small package size minimises the mass effect on cables when installed into signal cable lines. The QV/04 is available with a range of gain options to suit a variety of different charge output accelerometers.


HVAC, chillers and freezers, mixers and pumps, so, according to the company, this technology will enable massive energy reductions. Fairford PLUSS can also be applied to thyristor-based power controllers such as solid-state relays, and to applications ranging from electrical heating to the control of DC link voltages in motor drives. The system requires the soft starters to be joined with a communications cable. The PLUSS software on each soft starter then synchronises the starters to operate alternately on each cycle of the input power. One soft starter takes the load for one half of the cycle and the

other soft starter takes over for the rest of the cycle, effectively swapping the load between the two starters up to 100 times a second. The more soft starters that are connected with the technology, the greater the load sharing.
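To picture the alternating scheme described above, the toy sketch below assigns successive half-cycles of a 50Hz supply to two starters in turn and tallies the share each carries. It is a conceptual illustration only, with invented parameters, and does not represent Fairford’s patented implementation.

```python
# Toy illustration of half-cycle load sharing between two soft starters.
# Conceptual only; it does not represent Fairford's patented implementation.
from collections import Counter

SUPPLY_HZ = 50                         # mains frequency (assumed)
HALF_CYCLES_PER_SECOND = SUPPLY_HZ * 2

def share_load(num_starters: int, seconds: float) -> Counter:
    """Assign each half-cycle to a starter in round-robin order and count them."""
    counts = Counter()
    for n in range(int(HALF_CYCLES_PER_SECOND * seconds)):
        counts[f"starter {n % num_starters + 1}"] += 1
    return counts

print(share_load(num_starters=2, seconds=1.0))
# Each starter conducts 50 of the 100 half-cycles in one second.
```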

Barcode readers for proof of origin in the food industry
Barcode readers are increasingly being used for the automated identification, tracking and tracing of products in the food industry. Cognex has helped implement an efficient track and trace solution at one food producer’s site with its DataMan 302 barcode reader, allowing pickled gherkins to be traced all the way back to the respective grower. Cognex DataMan 302 barcode readers form part of the track and trace system that enables the company to prove the precise origin of the classified gherkins. Right at the start of the production chain, the DataMan 302 detects and reads the labels found on the containers holding the classified batches of gherkins. As soon as the manufacturer and product data read has been verified successfully, the gherkins enter downstream production. The long-term storage of the data enables batches to be traced back, even after a period of years, to the respective field and grower. The fixed-mount DataMan 302 barcode readers are available with a


selection of lighting and lens variants for reliably reading the most difficult-to-read barcodes and data matrix codes – including on high-speed lines. Both the integrated and controllable lighting and the liquid lens with adjustable focus enable the optimum setting of working distance, depth of field and field of view in order to achieve the best possible read rates. An intelligent auto-tuning function automatically selects the optimum settings for the integrated lighting, autofocus and imager. The barcode readers can also read damaged, distorted, blurred and low-contrast codes and transfer the information to the controller. This allows the data to be processed immediately and stored for the long term.


SPECIAL FOCUS: OPC UA

OPC UA sees record growth in adoption

In 2016 OPC UA experienced record growth in adoption, with industry analysts now forecasting that adoption will exceed 120 million installations by the end of 2017. On the following pages we look at why everyone is so excited about this technology.


The OPC Foundation has been busy! It manages and supports the OPC Unified Architecture (UA), which was released in 2008 as a platform-independent, service-oriented architecture to integrate all the functionality of the individual OPC Classic specifications into one extensible framework. Things look set to remain busy for some time to come too, as the adoption of OPC UA across all industry sectors continues at a rapid pace. “The significant value of the OPC UA technology has been recognised and is now being realised across the world, with literally thousands of products being released by the OPC community,” said Thomas Burke, president and executive director at the OPC Foundation. “The value

proposition of OPC UA extends the OPC classic functionality in significant ways. Its big advantages include platform independence, security, reliability and most importantly information modeling. The information modeling aspect of OPC UA technology allows us to collaborate with many organisations beyond industrial automation; taking complex data models and plugging them into OPC UA. “We have announced so many collaborations over the last 18 months my head is literally spinning. It’s amazing all the things that you can do with data and information and how important this is in the world of the industrial Internet of things (IIoT),” continued Burke. What is interesting about these collaborations and initiatives is how the


end-user community now appears to be driving industrial interoperability across the globe. OPC UA technology has been chosen as the communications and information integration infrastructure across a range of industries. Key initiatives include the MCS-DCS Interface Standardisation (MDIS) network, formed as a result of subsea vendors, oil companies and DCS vendors collaborating to develop a standardised object model to harmonise data across their industry; the pharmaceutical serialisation effort called Open-SCS, which sees pharmaceutical companies working with their suppliers to develop an OPC UA companion specification addressing the serialisation demanded by regulators across international boundaries; and the partnership between the Organization for Machine Automation and Control (OMAC) and the OPC Foundation, where the industry has demanded an OPC UA companion specification to support the packaging industry, with key companies like Nestlé and Arla Foods in the driving seat for developing a standardised information model that plugs directly into OPC UA. In January 2016, the OPC Foundation announced that it had over 45 million installations of OPC internationally, based on analysis from ARC. Industry analysts are now forecasting that OPC adoption will exceed 120 million installations by the end of 2017. “Much of this is a direct result of OPC UA actively being used as the IIoT technology solution,” said Burke. “The market for M2M, IoT and IIoT is breathtaking in terms of the volume and types of devices and applications necessary to meet and exceed the requirements of these important initiatives.” For more information about OPC go to www.OPCfoundation.org



SPECIAL FOCUS: OPC UA

A sensor solution with Industry 4.0 capabilities

The result of collaboration between Microsoft and Leuze electronic is a sensor whose process data and metadata can be transferred directly to the Azure Cloud via OPC UA, without the need for an intermediate gateway.


A prerequisite for high data transparency is an intelligent and standardised data interface. However, this alone is not enough to realise Industry 4.0 systems. The RAMI 4.0 reference architecture model of the Industry 4.0 platform offers a representation for industry. RAMI depicts the properties of Industry 4.0 components in three dimensions. Firstly, the product life cycle is described – here, product data such as production data, data sheets, configuration data, etc., is collected. The second dimension records a hierarchy, while the third covers the IT representation. Industry 4.0 components need to be describable using the RAMI 4.0 model, which means that a true Industry 4.0 sensor must be able to communicate across all IT levels. Currently, this is not possible for a sensor with a classic fieldbus interface, which communicates exclusively with the control level and does not pass data to the upper IT levels.

A future M2M standard?
Unlike classic fieldbus interfaces, an interface that is expanded with the OPC UA communication model is able to transport data to higher IT levels of the RAMI model. OPC UA includes a security implementation that consists of

20

March 2017

The reference Architecture Model of Industry 4.O (RAMI).

authentication, authorisation, encryption and data integrity with signatures. Unlike communication methods typically used in industrial environments, it allows for secure communication. From the field level of the automation pyramid, OPC UA can communicate via two different mechanisms – either via client/server communication or via a publisher process. With client/server communication, an OPC UA server is integrated in the data source – a sensor that can deliver data to a data recipient. With the publisher process, an OPC UA publisher is integrated in the data source. This can then make its data available to various data recipients. If there is more than one data source in the system, the data recipient can decide which data it would like to receive from

www.controlengeurope.com

which publisher. This means that the recipient does not always need to accept the data from all publishers. Using this process, communication from data sources to data recipients is possible. A data cloud can also retrieve data directly from the data source. Communication in the opposite direction – from the cloud to the sensor – will also be possible in the future. Industry 4.0 requires compatible communication and OPC UA is able to virtually ‘tunnel through’ the layers of the automation pyramid and transport data to the higher levels of the RAMI model. This makes standardised communication of sensors and actuators from various manufacturers directly with a cloudbased ERP system possible. Thanks to the secure communication, even Control Engineering Europe


SPECIAL FOCUS: OPC UA the exchange of data between different systems via public channels is conceivable.
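To make the client/server mechanism described above tangible, here is a minimal sketch of a data recipient reading a value from a device's embedded OPC UA server, again assuming the open-source python-opcua package. The endpoint address and node identifier are placeholders, and the newer PubSub mechanism is not shown.

```python
# Client/server read: the data recipient connects to the sensor's embedded
# OPC UA server and requests a value on demand. Endpoint and node id are
# placeholders for a real device.
from opcua import Client

client = Client("opc.tcp://192.168.0.10:4840")
client.connect()
try:
    node = client.get_node("ns=2;s=FillerMachine.LineSpeed")
    print("Current value:", node.get_value())
finally:
    client.disconnect()
```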

The Microsoft Azure cloud
However, the provision of data from components via OPC UA communication alone is not enough for an Industry 4.0 application. Additional mechanisms are required for data acquisition into the cloud. To realise telemetry data without additional components, such as an Industry 4.0 gateway, Leuze electronic and Microsoft have therefore collaborated to find a solution. The result was demonstrated at the SPS IPC Drives event in November 2016. Sensor data from the Leuze electronic BCL 348i bar code reader can be transmitted via the OPC UA Publish/Subscribe Communication Model (PSCM) to Microsoft's Azure IoT Hub. This data is recorded by the IoT Hub and made available to the Azure cloud services for analysis and visualisation.

The collaboration demonstrates how an embedded device can be controlled from the cloud. Using the bar code reader as an example, Leuze electronic has shown how a device can be addressed from the cloud on the lowest RAMI level without the need for another gateway. The reading gate of the bar code reader can be controlled from anywhere in the world by any mobile device via the Azure Cloud. The sensor data recorded by the IoT Hub can also be analysed in the cloud according to predetermined criteria, and this can trigger events in the Industry 4.0 total system.
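The following is a rough sketch of how a device-side application could push a reading into an Azure IoT Hub using Microsoft's azure-iot-device Python SDK. It is not Leuze electronic's embedded implementation (which publishes via OPC UA rather than a device SDK); the connection string and payload fields are placeholders.

```python
# Sketch: send one telemetry message to an Azure IoT Hub using the
# azure-iot-device SDK. Connection string and payload are placeholders.
import json

from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()
try:
    reading = {"device": "barcode-reader-01", "code": "4006381333931", "quality": 98}
    client.send_message(Message(json.dumps(reading)))
finally:
    client.disconnect()
```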

OPC UA certified HMI/SCADA package
The latest version of Codra's Panorama Suite HMI/SCADA solution has gained OPC UA certification. "Codra is pleased to have received the OPC Foundation's certification for Panorama Suite," said Eric Oddoux, president and CEO at Codra. "We share in the Foundation's goal to expand OPC technology into new applications, including the Internet of Things. Certifications such as these help our customers and partners make informed decisions as to what software will work best in which environments."

Panorama Suite 2016 (Panorama E² / Panorama COM / Panorama HISTORIAN) is a software suite for building applications in the areas of data acquisition, real-time monitoring, data archiving and reporting. The web-enabled solution is used across a wide range of industries, giving users the ability to better manage the process – from the control centre to the boardroom.

Do you want your products to be seen by Senior Engineers responsible for designing, managing & maintaining plants & factories in the UK & across Europe?

Don't miss Control Engineering Europe's April edition for a preview of Hannover Messe

Control, Instrumentation and Automation in the Process and Manufacturing Industries

With over 30 years' experience of connecting buyers and sellers we can help you achieve your business objectives. Magazine & digital media options including:
• Direct Lead Generation
• Webinar Opportunities
For more information contact Nichola Munn +44 (0)1732 359990 or email nichola.munn@imlgroup.co.uk

Suzanne Gill – Editor +44 (0)1732 359990 or email suzanne.gill@imlgroup.co.uk


SPECIAL FOCUS: OPC UA

Industry 4.0 glue?
OPC-UA technologies are developing fast! Suzanne Gill caught up with some automation vendors to find out more about their latest OPC-UA technology developments.

The growth of the Industrial Internet of Things (IIoT) and Industry 4.0 is driving the need for open and secure connectivity between devices. This has resulted in the OPC Foundation's Unified Architecture – OPC UA – increasingly being considered as the enabling standard to make this happen. Rather than replacing existing communication standards, OPC UA complements them by creating a common layer for exchanging information. Freed from the need to run on a specific operating system, the open data connectivity that OPC UA embodies can be offered on virtually any networked platform. It supports interoperability between a range of manufacturing processes and equipment, protecting legacy investments for many companies. There are security benefits too – the OPC UA standard was designed from the ground up with security in mind.

However, according to Arun Anathampalayam, senior product marketing manager at Honeywell, just choosing OPC UA isn't enough. He said: "The importance of an effective software development kit (SDK) cannot be underestimated, as it provides the means for control suppliers to quickly and efficiently implement OPC UA into their products. With the right SDK, control suppliers can minimise their development time and effort."

Now, more than ever, manufacturing companies need to be able to make sense of the vast quantities of data that could have a critical impact on plant performance. "As decision-makers transition towards a more connected, intelligent plant and the IIoT continues to transform the industrial manufacturing industry, the role of OPC UA will only grow in importance," continued Anathampalayam. "The choice of SDK may not immediately be viewed by leaders as a priority, but finding one that accurately fits specific requirements will be vital in making the smooth transformation to Industry 4.0 a reality."

The choice of SDK will depend on the objectives. Suppliers in the automation market, for example, require an effective SDK for deploying IIoT connectivity across their products. This must include robust, user-friendly tools that can embed OPC UA functionality into the supplier's device or microchip quickly, so no time is lost in moving new products to market. Vendor development teams, on the other hand, are more likely to seek a fully scalable SDK that can interconnect industrial software systems, regardless of platform, operating system or size.

The integrating 'glue'
According to Thomas Hilz, business development manager at Moxa, OPC UA is the glue that integrates machines and devices from different vendors, with different protocols and technologies, making the ever-increasing amounts of data manageable. For this reason Moxa was an early investor in the development of a comprehensive OPC offering for its customers, including OPC UA loggers and servers.

In most remote data acquisition systems, additional human resources are needed to collect data manually from remote storage devices and load it into a database – all during regular daily operation. "Even with RTUs remotely collecting data over the network, software is needed to convert and upload these data logs," said Hilz. "Moxa's MX-AOPC UA logger makes real-time data collection easier, and it also simplifies the conversion of historical data into database-ready formats." The logger interacts directly with the company's OPC UA server, working as a bridge between field data and stored databases or spreadsheets. Furthermore, the MX-AOPC client converts and uploads data logs to the central database. The logger can collate tags from individual Moxa RTUs or remote I/O devices into the same database or spreadsheet, freeing users from the need to manipulate data after processing.

A real-time solution
Although, from a technical standpoint, it would certainly be feasible to add real-time capability to OPC UA itself, doing so would involve considerable effort and would still have disadvantages. This is why many automation and robotics manufacturers have joined forces to move in a different direction, and why OPC UA will take advantage of Time-Sensitive Networking (TSN). TSN is a set of extensions that will be included in the IEEE 802.1 standard; the goal is to provide real-time data transmission over Ethernet. The TSN standard has the automotive industry behind it, so the required semiconductor components will be available very quickly and relatively inexpensively.

OEMs and system integrators have high hopes for OPC UA TSN. Until recently, these hopes were based on theoretical concepts and technologies still under development. With its partner companies, B&R has now proven the ability of OPC UA TSN to meet communication requirements from the line level up to the ERP level under real-world conditions. "B&R has performed intensive field testing together with TSN network specialist TTTech. The results are impressive," said Stefan Schönegger, marketing manager at B&R. "And, in some aspects, OPC UA TSN has outperformed our expectations." Time-critical applications at the line level, such as synchronisation of conveyor belts with various other equipment, require cycle times as low as two milliseconds. "We've gone even lower than that on our test installations," explained Schönegger. With jitter measurements as low as 100 nanoseconds, the results were on a par with the best fieldbus systems on the market today.

With its bus controller implementation, B&R has also tested a new feature of the OPC UA specification. The publisher-subscriber (pub/sub) model plays a key role in allowing OPC UA TSN to achieve the necessary performance. Until recently, OPC UA has used a client/server mechanism, where a client requests information and receives a response from a server. On networks with large numbers of nodes, traffic increases disproportionately and impairs the performance of the system. The publisher-subscriber model, in contrast, enables one-to-many and many-to-many communication. A server sends its data to the network (publish) and every client can receive this data (subscribe). This eliminates the need for a permanent connection between client and server, which is particularly resource intensive.
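The traffic argument can be illustrated with a deliberately simplified, plain-Python publish/subscribe sketch. This is not OPC UA PubSub itself, only the pattern: the publisher sends each sample once and any number of subscribers receive it, with no per-client request/response cycle.

```python
# Conceptual publish/subscribe sketch (not OPC UA PubSub itself): one
# publisher, many subscribers, no per-client request/response traffic.
from typing import Callable, Dict, List


class Publisher:
    def __init__(self) -> None:
        self._subscribers: List[Callable[[Dict], None]] = []

    def subscribe(self, callback: Callable[[Dict], None]) -> None:
        self._subscribers.append(callback)

    def publish(self, sample: Dict) -> None:
        for callback in self._subscribers:   # one message reaches everyone
            callback(sample)


drive = Publisher()
drive.subscribe(lambda s: print("PLC received:", s))
drive.subscribe(lambda s: print("MES received:", s))
drive.subscribe(lambda s: print("Cloud dashboard received:", s))
drive.publish({"axis": 1, "position_mm": 152.4, "cycle": 10981})
```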

Driving development
As a founder member of the OPC Foundation, Siemens is continuing to drive the development of automation, optimising the interoperability of technologies and boosting the integration of different automation layers in its customers' factories. According to Heinz Eisenbeiss, head of marketing, factory automation, Siemens Division Digital Factory, this commitment is already bearing fruit. He said: "OPC UA is implemented in various Siemens products on all automation layers – our industrial network products, HMIs, PLCs, RFID systems and motor management systems, for example."

"OPC UA is a standard that is particularly relevant," continued Eisenbeiss. "With its semantic interoperability abilities it is a key element for increasing data exchange of components in factory and process automation." Siemens was among the first companies whose products were certified – its RFID reader, for example. "This already supports the OPC UA AutoID companion specification defined by the AIM group, in which we are actively involved," said Eisenbeiss. "It allows system integrators to save time integrating our RFID reader into the plant structure by relying on a standardised interface with clear semantics."

The Industry 4.0 journey
On the journey towards Industry 4.0, OPC UA has played a key role for many automation suppliers, including Festo. Eberhard Klotz, Festo AG Industry 4.0 spokesperson, explains: "During the last two years OPC-UA has been installed as the key communication backbone in Festo's latest Technology Plant in Scharnhausen, bringing machine KPIs and energy data together with building management and infrastructure data. New machines are being constructed with OPC UA directly integrated, but even more important are retrofit projects." For many older machines, Festo controllers are piggybacked onto the existing controls to enable them to communicate within the latest OPC-UA networks. Festo uses SAP Plant Connectivity (PCo) with OPC UA to connect the plant data to the manufacturing execution system (MES).

Klotz continues: "OPC UA is integrated into all Festo controllers utilising CODESYS. This includes IP65 controllers which are part of the Festo decentralised installation concept based on solenoid valve terminals, IP20 compact controllers or HMIs."

For complete pick-and-place handling and gantry systems, or other decentralised mechatronic motion solutions utilising electric drives and/or pneumatic actuators, Festo uses advanced diagnostic and condition monitoring concepts based on the latest VDMA standard 24582 plus OPC UA, where the system is run by Festo controllers.

This year at Hannover Messe, Festo will be implementing pilot projects with its own IoT Gateway, collecting OPC UA data from several network devices and converting it via AMQP (Advanced Message Queuing Protocol) or MQTT (Message Queuing Telemetry Transport) to the Festo Cloud. App-based software packages will utilise features in the connected hardware, or provide cloud-based services and analytics.

Festo Didactic also offers complete cyber-physical training systems on Industry 4.0, which allow OPC UA communication plus cloud adaptation to Microsoft Azure or SAP HANA. The first such system has recently been installed at Middlesex University in the UK, where students are able to work with other Industry 4.0 technologies including augmented reality and condition monitoring.
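As a loose sketch of the gateway idea Klotz describes above, reading OPC UA data at the edge and forwarding it to a cloud broker, the code below combines the open-source python-opcua and paho-mqtt packages. It is not Festo's IoT Gateway; every address, node identifier and topic is a placeholder.

```python
# Edge-gateway sketch: poll one OPC UA node and republish it over MQTT.
# Not a vendor implementation; addresses, node ids and topics are placeholders.
import json
import time

import paho.mqtt.client as mqtt
from opcua import Client

opc = Client("opc.tcp://192.168.0.20:4840")
opc.connect()

broker = mqtt.Client()
broker.connect("broker.example.com", 1883)
broker.loop_start()

try:
    node = opc.get_node("ns=2;s=ValveTerminal.EnergyConsumption")
    while True:
        payload = json.dumps({"signal": "energy_kwh",
                              "value": node.get_value(),
                              "timestamp": time.time()})
        broker.publish("plant/line1/energy", payload, qos=1)
        time.sleep(5)
finally:
    broker.loop_stop()
    opc.disconnect()
```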

Other activities
Interesting research activity has also started on implementing OPC UA on top of existing Industrial Ethernet protocols, and the IO-Link association is discussing how to prepare its data format for direct integration into the OPC UA frame. Klotz concludes: "We can be sure this research will proceed, and we can expect other interesting standards developments, for example the forthcoming wireless IO-Link. For now, as part of the 'Platform Industry 4.0', OPC UA is the number-one initiative for data collection for horizontal and vertical integration of devices to the IoT."

Will OPC-UA negate the need for multiple gateways?
Factory operators who need to integrate machines from different suppliers often still face the dilemma that those machines cannot communicate naturally with each other. Integration – whether of a new plant or of new machines into existing infrastructures – therefore remains a challenge. "The call for standardisation was answered by OPC-UA," says Stefan Selke, segment manager machine building at Eaton. "Today, nearly every week, additional companies from the machine automation market are joining the OPC family."

While the initial idea behind OPC-UA was to facilitate M2M communication, it has developed into other fields of application. Although OPC-UA with its client/server architecture was designed for smart factory communication, the next generation of the protocol, offering pub/sub functionality, is already under way. This upcoming version of OPC-UA opens the world to cloud communication based on standards.

"SME machine builders and system integrators often don't have the opportunity or the means to invest in secure, high-performance IoT and cloud technology. The conventional way to monitor a worldwide installation base for multiple customers is to use a dial-in system to collect the necessary data. This is not an effective way to monitor system conditions," continued Selke. "It depends on the end customer opening up its local network to external access. Using the potential of OPC-UA for M2M and M2Cloud communications sets all machine interfaces to a single architecture."

Eaton and T-Systems have bundled their key competencies to create a multi-IoT platform that networks machine equipment via the cloud. Based on Microsoft Azure, it seamlessly collects data from machines and applications based on a unified architecture that just needs to be adapted to individual requirements. The data is transported and secured safely and can be provided in customised dashboards to different groups of users. Each subscriber is provided with exactly the information needed, and there is just one secured line of communication from the smart factory into the cloud. All these users access the stored data directly, no longer having to bypass the end user's firewall.

Many new machines are integrated into existing infrastructures that may not have the capability to communicate via the OPC protocol. For such cases there is a variety of possibilities to update machines carefully, without the need to touch the PLC. By simply adding new visualisation devices or interfaces with the right capabilities, even existing machines can communicate with the cloud based on OPC-UA.

Component level
One question that still needs to be answered is whether there is a need for, or benefit in, bringing OPC-UA down to the component level. Currently, Industrial Ethernet is heterogeneous in nature. The one common theme is that, regardless of the Ethernet protocol installed on the machine, it all adheres to the same basic TCP/IP communication layer and is therefore compatible with OPC-UA. A potential way to connect up to OPC-UA would be to use a workable set of components to create a Cyber Physical System (CPS) that executes such functionality – a 'smart to manage' CPS. Different machines from different suppliers will feature several CPSs and they would all communicate using the same protocol. In the future, each machine might consist of several independent, modular CPSs that communicate directly through the OPC-UA protocol. This will enable every application to talk to another without needing multiple gateways. By bridging the communication gap via the OPC-UA standard down into the machine, the once clearly defined levels of the automation pyramid become well and truly blurred. This will open up new possibilities to design modular machines faster and more cost-efficiently.


Join our OPC booth at Hanover Fair: Hall 9, A11

OPC Unified Architecture – The Industrial Interoperability Standard
Much more than a protocol … that is why it is recommended for Industrie 4.0 and the Internet of Things. OPC UA is a framework for industrial interoperability:

➞ Modeling of data and interfaces for devices and services
➞ Integrated security by design with configurable access rights for data and services – validated by German BSI security experts
➞ Extendable transport protocols: Client/Server and Publisher/Subscriber, with a roadmap for TSN
➞ Scalable from sensor to IT enterprise and cloud
➞ Independent from vendor, operating system, implementation language and vertical markets

Information models of different branches are mapped onto OPC UA to make them interoperable with integrated security. The OPC Foundation closely cooperates with organisations and associations from various branches.

Download the technology brochure: opcfoundation.org/resources/brochures/
www.opcfoundation.org


IIOT SECURITY

PUTTING THE FOCUS ON IIOT SECURITY
There are ways for companies to get an Industrial Internet of Things (IIoT) project initiated while overcoming security challenges, but that requires a culture change and a different mindset, says Eric Byres.

The Industrial Internet of Things (IIoT) has been a major topic over the past year. You can't go to a trade show or read an industry magazine without getting overwhelmed with new IIoT products or services that promise to completely revolutionise your business. But what exactly is the IIoT? Can it really help your company? And will it expose your plant floor to new security risks?

If you can't answer those questions, you are not alone. Most business executives don't understand the IIoT either. Many don't understand what it can (or can't) do for their company. And even fewer have a plan detailing how they could deploy IIoT effectively. According to a 2015 Accenture survey, only 36% of 1,400 business leaders said their senior managers have fully grasped the implications of IIoT. Added to that, only 7% had developed a comprehensive strategy for IIoT with matching investments.

There are enough real-world IIoT deployments happening to allow the careful engineer to separate hype from reality. Companies that have successfully rolled out IIoT projects have discovered it really does have the potential to unlock tremendous value in their manufacturing chain. Like all new technologies, IIoT is not without its challenges. According to a survey of IIoT experts conducted by Convetit, a company that organises online advisory boards and think tanks for Fortune 500 companies, the top four challenges of IIoT are:

• The interoperability of different silos and systems
• The resistance to organisational change
• Problems implementing IIoT into existing processes, and
• Increased security risks.

Manage any of these poorly and an IIoT project can hinder rather than help a company. For every IIoT success story, there have also been some very difficult and failed IIoT projects. Good or bad, the same issues and solutions show up again and again. There are ways, however, to get an IIoT project focused while overcoming the security challenges facing IIoT implementations.

Rethinking IIoT
The Internet of Things (IoT), a term first coined in 1999, defines our era of connected devices. It has most recently been characterised by the explosive rate of interconnectivity between intelligent objects that are "network-connected" in order to enable information sharing. It isn't a revolutionary concept in and of itself – most people have been interacting for years with some of the most useful, disruptive, and life-altering connected devices, such as the smartphone. Other popular examples of consumer-related IoT goods include home light and temperature controls and wearable biometric devices.

In the industrial world we have been connecting smart devices for decades – network-connected remote terminal units (RTUs), programmable logic controllers (PLCs), and human-machine interfaces (HMIs) are nothing new. What has changed is the depth of integration, its complexity, and the range of devices available. Until recently, most plant data stayed on the plant floor. Any connectivity was largely between controllers, input/outputs (I/Os), and operator stations.

What has changed with the IIoT is that massive amounts of industrial data can now flow either up into the corporation and the cloud or down into increasingly smart field devices. Information previously locked into proprietary databases on a plant floor server can now be accessed by corporate applications around the world. Perhaps most importantly, information doesn't have to flow only up from the plant floor to management. It can simultaneously flow in multiple directions from multiple sources to different "data consumers". At one major U.S. automotive parts manufacturer, measurements from field sensors in hydraulic presses are now being combined with feedback from customers to get a better understanding of the indicators of premature product failure.

This interconnectivity requires new ways of looking at how the entire company can effectively integrate and use all the data available in our industrial processes. And it requires new ways of understanding how our industrial processes can use the data available from other business units and the end customer to create a safer and more reliable product. "IIoT is the new label for something which has actually been developing for decades: the growing interconnectivity of 'cyber' devices which control physical systems," said Steven C. Venema, chief security architect at Polyverse Group.

Fear of change
The unprecedented scale of information exchange means IIoT is often a transformative process for businesses. Unfortunately, transformations of the workplace often result in deep-seated concerns among staff at all levels. These range from macro reasons, such as the natural fear of change, to delaying factors ranging from the excessive review of possible risk elements to confusion concerning the actual technologies and protocols to be used.

Consider the daily status meeting, a feature of manufacturing management for over a century. When an IIoT project is deployed, companies find their daily meetings miss huge opportunities to change operations in real time as new information comes in. A meeting format that is more responsive to real-time information is often needed. Yet some staff will be reluctant to give up a meeting they have attended for decades. For an IIoT project to achieve its full benefit, it needs to address these concerns up front. Questions like, "How will this information get routed to the decision-makers? What systems will they use to evaluate it? If something dramatic changes, who gets told? And how do we make sure the right people can access the information?" all need answers before the IIoT project is launched. Businesses must strategise with a clear outlook regarding why, what and how their specific organisation will implement IIoT technologies.

Not the Field of Dreams
"If you build it, they will come" is not a model for successful IIoT rollouts – but it is a frequent stumbling block for many companies. When creating an IIoT infrastructure, companies gain the most value by creating it with the end in mind, so they should prepare for it with the skillsets needed to securely implement IIoT in existing processes and to effectively interpret the resulting data. IIoT infiltrates the entire company; it is a mentality as much as it is a tool. A company culture must be such that it embraces – rather than resists – such a huge organisational overhaul.

As the foundation of such a strategy, it is often wise to find a platform for alliances. Enlisting the help of organisations that provide a platform for experts to convene on a variety of subjects is a good idea. These external experts can engage online with your company's team, either for short timeframes of intense discussion or more routinely over a longer timeframe. Tom O'Malley, founder and chief executive of Convetit, has seen companies struggle to align their visions with their IIoT strategies. "Lots of folks are trying to figure out why," said O'Malley. "What is your business hoping to gain? Why should senior management decide to implement IIoT? Why is IIoT the optimal strategy?"

It's essential to interact with IIoT experts whose successes are relevant to your industry. These experts demonstrate by example, explaining their own pitfalls and triumphs to help you make the right decisions and steer you toward the types of projects which produce real value. Above all else, remember IIoT is all about driving business value. It's not just how you're collecting data through interconnectivity; it's why you want to do this in the first place.

Eric J. Byres is a leading expert in the field of industrial control system (ICS) and Industrial Internet of Things (IIoT) security.


IIOT DATA ANALYSIS

Easier way to use SCADA data for IIoT
Industrial Internet of Things (IIoT) intelligent data analysis can be obtained more easily with modular, on-demand subscription software offering economical predictive process analytics to improve efficiencies, says Bert Baeck.

To collect and store data and monitor systems, most manufacturers are using technologies at least 30 years old. With the competition in today's global market, most industrial companies hesitate to take advantage of new opportunities promised by the Industrial Internet of Things (IIoT), with concerns related to difficulty and cost. A recent LNS Research survey of more than 400 manufacturing executives showed the vast majority of companies do not have plans to invest in IIoT technology in the near future. It is understandable why industrial companies are reluctant to invest in new technology when considering the expense of existing systems. Old technologies that have been tweaked to try to take advantage of IIoT opportunities may require current systems to be removed. Affordable technologies developed for the Internet age can work with existing systems to help manufacturers gain deep insight into process behaviour that translates into fast return on investment (ROI).

Valuable SCADA information
Supervisory control and data acquisition (SCADA) systems were originally designed to collect data and monitor processes. Since SCADA systems generate enormous amounts of data, historians were added to store this data. Initially, historians were used to fulfill regulatory requirements, such as generating reports for government agencies. Leading industrial companies recognised the data hidden inside historians could provide valuable information on plant processes and production, but accessing and using that data could be very difficult because historians weren't designed for "read" purposes or a two-way transfer of information.

Manufacturing execution systems (MES) were introduced in the early 1990s in an attempt to bridge the gap between plant-floor SCADA systems and enterprise ERP software. They also promised to provide analytics, such as key performance indicator (KPI) data, to improve plant-floor operations. MES can provide more advanced capabilities than SCADA systems, but are expensive and often require extensive engineering for implementation. MES was developed for a business era in which systems were still largely siloed, and Internet optimisation was largely an afterthought.

Avoid a locked-in upgrade cycle
With the amount of time and money industrial companies have spent on traditional software, we can understand the reluctance of some manufacturers to enhance existing systems. They fear that a new solution would be expensive, require extensive engineering and training for employees, and lock the company into a cycle of difficult and expensive upgrades, patches, and limited scalability. Next-generation software offers ease of use and affordability.


Search engine for industry
As mentioned, accessing historian data and turning it into actionable information to improve operations has been time-consuming and difficult. Data modeling applications required extensive engineering and data scientists to perform. As a result, only mission-critical applications were targeted, leaving vast areas of improvement opportunities hidden.

In 2008, engineers from Covestro (then known as Bayer MaterialScience) leveraged time-series data by examining different analytics models and identifying limitations for scaling up beyond pilot projects. Using deep knowledge of process operations, the engineers created "pattern search-based discovery and predictive-style process analytics" for the average user. Unique multidimensional search capabilities of this platform enable users to find precise information quickly and easily, without expensive modeling projects and data scientists.

A simple example of how this works is the song title recognition application Shazam (by Shazam Entertainment Ltd.). While the technology Shazam uses is different, the concept is similar. Instead of trying to map every note in a song to its vast database of songs, Shazam uses pattern recognition software that seeks the "high-energy content", or most unique features, of a song and then matches it to similar patterns in its database. This is a simple explanation of a complex process, but the point is that it enables users to quickly find a song title with a high rate of accuracy.

Industry demands more sophisticated algorithms: rather than stopping at search, the software connects to existing historian databases and then implements a column-store database layer as an index. This makes it easy to find, filter, overlay, and compare interesting time periods across batch or continuous processes.

Combining live data with historical context shortens the analysis latency to immediate analysis, providing an opportunity to take actions even before an event can affect process performance. Courtesy: Trendminer

Moreover, this next-generation software enables users to search for particular operating regimes, process drifts, operator actions, process instabilities or oscillations. By combining these advanced search patterns, users unlock the information they need. For example, an operator can compare multiple data layers or time periods to discover which sensors are deviating most from the baseline, and then make adjustments to improve production efficiency.
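A loose illustration of the pattern-search idea (not the commercial engine described above): slide a saved query pattern over an archived tag and rank the best-matching windows by Euclidean distance. The data below is synthetic and NumPy is assumed.

```python
# Pattern-search sketch: find historical windows that resemble a saved
# pattern. Synthetic data; ranking is a plain Euclidean distance.
import numpy as np

rng = np.random.default_rng(0)
history = np.sin(np.linspace(0, 60, 6000)) + 0.1 * rng.standard_normal(6000)
pattern = history[1200:1260]          # a 60-sample shape an engineer flagged


def best_matches(series: np.ndarray, query: np.ndarray, top: int = 3):
    w = len(query)
    dists = np.array([np.linalg.norm(series[i:i + w] - query)
                      for i in range(len(series) - w)])
    order = np.argsort(dists)
    return [(int(i), float(dists[i])) for i in order[:top]]


print(best_matches(history, pattern))  # index 1200 should rank first
```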

Contextualisation, prediction
In addition to easier search, attention to process data contextualisation and predictive analytics capabilities is needed. Engineers and operators can add annotations to provide greater insight. Predictive analytics capabilities enable early warning of abnormal and undesirable process events by comparing saved historical patterns with live process data. Calculating possible trajectories of the process makes it possible to predict process variables and behaviour before an event happens. This gives operators the ability to see whether recent process changes match the expected process behaviour and to proactively adjust settings when they do not.
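A minimal sketch of the early-warning comparison under stated assumptions: the saved reference pattern, the live data and the tolerance band below are all invented, and a real system would use historian data and model-based trajectories rather than a fixed ramp.

```python
# Early-warning sketch: flag a live trend that drifts away from a saved
# reference pattern. Reference, live data and threshold are invented.
import numpy as np

reference = np.linspace(20.0, 80.0, 30)        # expected ramp, e.g. degC
live = reference + np.linspace(0.0, 6.0, 30)   # live run drifting upwards
tolerance = 3.0                                # allowed deviation per sample

deviation = np.abs(live - reference)
if np.any(deviation > tolerance):
    first_bad = int(np.argmax(deviation > tolerance))
    print(f"Warning: trajectory left the tolerance band at sample {first_bad}")
else:
    print("Trajectory matches the expected behaviour")
```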

Online subscription model
To move beyond traditional software challenges, online subscription pricing can make process analytics affordable to all companies and frees businesses from having to spend time and money on adding licenses and upgrades. When users log in, they automatically get the latest version of the software. Companies can enhance the investment made in high-quality historians by connecting them to low-cost predictive analytics software that complements existing historians to provide more valuable business insights. Affordable, plug-and-play software can uncover new areas for improving operational efficiencies for the emerging IIoT generation. Companies can no longer operate on existing systems alone if they want to stay competitive.

Bert Baeck is CEO of Trendminer.



MULTIVARIABLE CONTROL

Exploring the basic concepts of multivariable control
Multivariable controllers can balance competing objectives. Process controllers that can juggle multiple process variables simultaneously are becoming more common and more powerful, but they can still be difficult to design and implement.

Competing process control objectives can be met by carefully using multivariable controllers. Single-variable controllers such as proportional-integral-derivative (PID) loops are by far the most popular controllers for industrial applications. A single-variable controller measures the one-and-only process variable, decides if its value is acceptable, applies a corrective effort if necessary, then repeats. This routine works well for process control problems with just one variable or with several variables that can be manipulated independently.

As suggested, the problem gets trickier when the control system is required to achieve multiple objectives using multiple actuators that each affect all the process variables simultaneously. A situation like this requires multivariable controllers that can balance the actions of all the actuators at once. Consider, for example, regulating the temperature and humidity in a commercial office. Lowering the temperature with a chiller also lowers the relative humidity. Raising the humidity with steam also raises the temperature. The required balance of cooling and steam injection can be difficult to determine.

Figure 1: Like a spinning top, a spacecraft being pulled by gravity tends to precess as it rotates, so a roll manoeuvre causes a slight yawing motion as well. Yaw causes pitch, and pitch causes roll. A multivariable attitude controller can take advantage of these coupling effects rather than ignoring them the way three single-variable controllers would. All graphics courtesy: Control Engineering

Multivariable control gets trickier still if it is possible to meet all the desired objectives with several different combinations of control efforts. The most efficient multivariable controllers can select the combination that is cheapest to implement. Some can also take into account the potential cost of not applying the correct control effort. Costs can include not only financial considerations, such as energy spent versus energy saved, but safety and health factors as well.

Multivariable controllers are most common in the petrochemical, aeronautical, and energy industries. In a distillation column, for example, there can be hundreds of temperatures, pressures, and flow rates that all must be coordinated to maximise the quality of the distilled product. A jet aircraft control system must coordinate the plane's engines and flight control surfaces to keep it flying.


Multivariable control techniques
So how does a multivariable controller do all this? There are just a few basic multivariable control techniques, but oddly enough, the ubiquitous PID algorithm isn't one of them. Neither PID nor any other single-variable control technique can account for the effects that one controller has on the others without help from more complex algorithms. Most single-variable controllers also ignore the cost of applying a control effort. Their only objective is to reduce the error between the setpoint and the process variable, regardless of how much energy is expended along the way. PID alone is not the answer.

On the other hand, if control costs are in fact negligible and the interactions between process variables are all relatively weak, then multiple single-variable controllers can be combined to regulate multivariable processes. NASA tried this approach with some of its early spacecraft, as shown in Figure 1. NASA used three independent controllers to regulate the pitch, yaw, and roll of a Gemini capsule. Each controller reacted to the effects of the other two as if it were handling an external disturbance. This scheme worked well enough, but the controllers tended to work against each other and ended up burning considerably more fuel than necessary.

Decoupling and process variables
Single-variable controllers can also be used for multivariable applications if the process variables can be decoupled mathematically. Figure 2 shows how a simple process with two controllers and two process variables can be decoupled so that each controller ends up affecting only one process variable. The decouplers (C21 and C12) are designed to cancel the crossover effects that each controller has on the other process variable (P21 and P12). The decouplers allow both controllers to operate as if each were in control of its own independent process.

Figure 2: This two-variable decoupling controller could be applied to a two-variable HVAC process. If the temperature in the room were defined as process variable 1 and the humidity were defined as process variable 2, then box P12 would represent the effect that a change in humidity has on the temperature, and box P21 would represent the effect that a change in temperature has on the humidity. To negate these coupling effects, the decoupler C12 would have to cut back the heating coil whenever hot steam is injected into the airstream following a call for higher humidity. Similarly, C21 would have to cut back the steam whenever the room temperature is lowered, since a cooler room requires less steam to maintain the same relative humidity.

The simplest approach to decoupling addresses just the steady-state effects of crossover. A series of open-loop step tests will show the long-term effect that each controller has on each process variable. If, for example, a unit step from controller 1 increases process variable 1 by X% (via P11) and a unit step from controller 2 increases the same process variable by another Y% (via P12), then the decoupler C12 can be set to a gain of -Y/X. That will give the second controller a zero net effect on the first process variable.

Although a steady-state decoupler is relatively simple to design and implement, its use is limited to applications where only the long-term values of the process variables are important. If the process variables' near-term fluctuations must also be controlled, more elaborate decouplers are required to account for the dynamic behaviour of the process. Furthermore, even decouplers that are designed to accommodate both the dynamic and steady-state effects of crossover will work only if the crossover behaviour is either very weak or very well understood. Otherwise, the decouplers will not be able to negate the crossover effects completely. Decoupling can also fail if the behaviour of the process changes even slightly after the decouplers have been implemented.
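The -Y/X rule can be checked numerically. The sketch below uses invented step-test gains for the two-by-two process of Figure 2, derives both steady-state decoupler gains, and verifies that each controller's net steady-state effect on the other loop's process variable is cancelled.

```python
# Steady-state decoupler sketch for a 2x2 process (gains invented).
# p[i, j] is the steady-state change in process variable i caused by a
# unit step from controller j, as measured in open-loop step tests.
import numpy as np

p = np.array([[2.0, 0.5],    # P11, P12  (e.g. temperature responses)
              [0.8, 1.5]])   # P21, P22  (e.g. humidity responses)

c12 = -p[0, 1] / p[0, 0]     # the -Y/X rule from the text
c21 = -p[1, 0] / p[1, 1]

# Check: with the decouplers in place, a unit step from controller 2 has
# (ideally) zero net steady-state effect on PV1, and vice versa.
net_pv1 = p[0, 1] + p[0, 0] * c12
net_pv2 = p[1, 0] + p[1, 1] * c21
print(c12, c21, net_pv1, net_pv2)   # -0.25, -0.533..., ~0, ~0
```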

Minimum-variance control
A minimum-variance control algorithm is generally much more effective for controlling multiple process variables simultaneously. Variance is a measure of how badly a process variable has recently fluctuated around its setpoint. It is computed by periodically squaring the measured error between the process variable and the setpoint, and adding the results into a running total. For a multivariable process, the overall variance is a weighted sum of the variances computed for each individual process variable.

Figure 3: Variance is a measure of how much a process variable has fluctuated around its setpoint recently. Minimising the variance allows the process to be operated closer to its constraints: high variance requires lowering the setpoint to avoid violating the upper constraint, whereas lowering the variance allows a higher setpoint.

A minimum-variance controller coordinates all of its control efforts to minimise the overall variance. It can also minimise the cost of control by treating each actuator as if it were another process variable with a setpoint of zero. The weighting factors used for the overall variance calculation can be chosen to dictate how much emphasis the controller places on eliminating errors versus minimising control efforts. In the HVAC example, the controller can be designed to be more or less aggressive depending on the relative benefits of reducing energy expenditure versus keeping the room's occupants comfortable.

Minimum-variance controllers incorporate mathematical models of the process in order to predict the future effects of current control efforts. This advance warning allows the controller to choose its next set of control efforts to minimise future variances between the process variables and their respective setpoints.
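The weighted overall variance the controller minimises is straightforward bookkeeping. The sketch below uses invented errors and control moves for the HVAC example, treating the actuator moves as extra variables with a setpoint of zero, as described above.

```python
# Weighted-variance bookkeeping sketch (all numbers invented). Squared
# setpoint errors for each process variable, plus the control moves
# treated as extra "variables" with a setpoint of zero.
import numpy as np

temp_error = np.array([0.4, -0.2, 0.1, -0.1])      # degC from setpoint
humidity_error = np.array([2.0, 1.5, -1.0, 0.5])   # %RH from setpoint
chiller_move = np.array([5.0, 3.0, 1.0, 0.5])      # % valve change per step
steam_move = np.array([1.0, 0.5, 0.2, 0.1])

weights = {"temp": 1.0, "humidity": 0.5, "chiller": 0.05, "steam": 0.05}

overall_variance = (weights["temp"] * np.sum(temp_error ** 2)
                    + weights["humidity"] * np.sum(humidity_error ** 2)
                    + weights["chiller"] * np.sum(chiller_move ** 2)
                    + weights["steam"] * np.sum(steam_move ** 2))
print(overall_variance)
```

Raising the actuator weights makes the controller gentler on the equipment at the cost of slower error elimination, which is exactly the trade-off the weighting factors are meant to express.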

Constraints for controllers
The process model also allows the controller to impose limits or constraints on its control efforts and the process variables. If the model is accurate, the controller can look ahead to see where its control efforts and the process variables are headed, then change course to avoid violating constraints in the future. However, if the model does not reflect the behaviour of the process with sufficient accuracy, the constraints may be violated anyway.

Constraints often represent the physical limitations of the process; for example, valves cannot be opened more than 100%, and mechanical actuators cannot be moved dangerously fast. Some process variables also must be constrained to remain close to their setpoints, no matter what the cost. Overshooting the setpoint may be the fastest way to achieve the desired temperature in an oven, but it may also incinerate the product inside. Satisfying such constraints is one of the principal motivations for using minimum-variance control for multivariable applications.

Figure 3 shows how variance impacts the choice of a process variable's setpoint. If the controller can successfully minimise each process variable's variance, the associated setpoint can be moved much closer to the nearest constraint. This allows the process to operate at the very edge of its physical limits, where productivity generally is highest. Unfortunately, the benefits of minimum-variance and most other forms of multivariable control come at a price: the mathematical formulation of these algorithms is tedious and more complex than traditional PID.

This article first appeared on www.controleng.com.


PRODUCT FORUM •

Enter Link Code on www.controlengeurope.com to read the full story

HIGH-PRECISION AUMA ACTUATORS FOR HEATING/COOLING AND METERING APPLICATIONS AUMA’s compact electric actuator ranges offer fixed and variable speed options for precise, robust and reliable flow control for chemical, food, and other industries. Applications include fluid metering and demanding temperature control systems, for example preheating and cooling systems as well as low-temperature processes. The fixed speed range comprises SBA linear actuators and ED/EQ part-turn actuators. Commands and setpoints are implemented by means of binary or analogue voltage or current signals. SBA linear actuators provide high positioning accuracy and are ideally suited for modulating applications. Covering a thrust range from 0.6 kN to 25 kN and a stroke range from 35 mm to 100 mm, they are often deployed in heating and cooling systems. ED/EQ part-turn actuators are the perfect choice when precise opening, closing or control are required for shut-off butterfly and ball valves or venting and flue gas dampers. They cover a torque range from 24 Nm to 600 Nm and swing angles from 90° to 180°. AUMA’s variable speed Smart Range includes SDL/SDG linear actuators, SVC globe valve actuators and SGC part-turn actuators. All these actuators are equipped with variable-speed motors that provide soft starts and stops, ensuring gentle treatment and long life for all mechanical components. Variable-speed operating profiles, meanwhile, help to avoid critical pressure surges and cavitation. Parameter setting via software is another key feature, and both Modbus RTU and Profibus DP interfaces are available. SDL/SDG linear actuators provide thrusts from 4 kN to 15 kN, with strokes from 55 mm to 300 mm. They work over a wide range of input voltages, so they are insensitive to voltage fluctuations. Thanks to their extremely low power requirements, the devices can easily be supplied by self-sufficient power systems such as solar PV. SVC globe valve actuators offer torques between 10 Nm and 100 Nm, with strokes from 60 mm to 70 mm. SGC part-turn actuators provide torques between 25 Nm and 1,000 Nm and swing angles between 82° and 98°. Auma Actuators Ltd Tel: +44 (0) 1275 871141 Email: mail@auma.co.uk www.auma.com


More info - Enter Link code 131888

BIFOLD PNEUMATIC ACTUATOR CONTROLS The Best Technology at the Lowest Cost – At Least 22% to 39% Cost Saving Bifold offer configurable valve control products, simplified for ease of selection with the highest safety factors and reduced spares requirements, offering the highest flow at the lowest cost. Maximise flow of typical tubed systems with Bifold components With traditional tubed pneumatic control circuits Bifold elements are capable of the highest possible flow in the market but the output is limited by the change in the bore size through the tubes and fittings. Maximise flow of typical nippled systems with Bifold components With traditional nippled pneumatic control circuits Bifold elements are capable of the highest possible flow in the market but the output is restricted by the diameter of the nipples. Optimise flow with Bifold patented modular solution Bifold developed an unrestricted common bore system that optimises the high flow capability of their components. Bifold have furthered this logic and developed a range of patented Filter Boosters removing as many elements as possible from the flow line including the filter and regulator, amplifying the possible flow for any given tube size. This principle results in smaller tube sizes being used and can be supplied in up to a 2” connection. If you are interested in the lowest cost solution for your application, please visit bifold.co.uk for further details and a video outlining the above. You can also contact Bifold on +44 (0) 161 345 4777. Bifold are dedicated to maintaining the excellence of their products and their new facility confirms their commitment to shortening lead times and meeting customer demands. Bifold Group would like to take this opportunity to thank all their customers for their continued support. Bifold Fluidpower Ltd Tel. +44 (0) 161 345 4777 Email: gbancroft@bifold.co.uk Web: bifold.co.uk

More info - Enter Link code 131890



KONRAD TECHNOLOGIES GMBH AND SET GMBH TO WORK HAND IN HAND IN THE AREA OF ADVANCED DRIVER ASSISTANCE SYSTEMS (ADAS), SENSOR FUSION, AND HARDWARE IN THE LOOP (HIL)
Due to rising demands on mobility and autonomous driving, integrated and automated testing solutions are needed for Advanced Driver Assistance Systems (ADAS). The combination of ADAS sensor fusion with a Hardware in the Loop (HiL) testing system is necessary to enable a new level of innovative, automated testing solutions in the automotive space. Therefore, Konrad Technologies GmbH and SET GmbH are joining their knowledge and skills to develop custom testing systems in the area of driver assistance systems. Together, the expertise of Konrad Technologies GmbH in ADAS sensor fusion and of SET GmbH in HiL forms a complete, flexible set of tools from design to development, implementation and validation to production. These synergies allow optimal solutions to be offered to shared customers. The agreement provides for a collaboration between the two long-time, award-winning National Instruments Alliance partners. Both companies are successful system providers for innovative testing solutions in the automotive industry – from the initial idea all the way to end-of-line (EOL) tests. The focus is increasingly on National Instruments products, which are extended with their own products and solutions to enable high-performing and high-precision system solutions. Based on the in-depth experience of both companies, the focus of solution design is on considering the Total Cost of Test (TCoT), which allows a significant reduction thereof through suitable standardization efforts. To test modern ADAS sensors efficiently, Konrad Technologies GmbH has developed target simulators for camera, radar, and LIDAR sensors, which have been successfully used in the field by clients in recent years. The software for radar target simulation offers the opportunity to simulate typical driving scenarios in a laboratory environment. This allows a controllable and reproducible testing framework, which is a prerequisite for an objective and independent performance analysis of the sensor system. Based on the National Instruments VST, a Vector Signal Transceiver, Konrad developed an editor that allows the user to create complex scenarios that can be automatically reproduced. With the aid of Hardware in the Loop (HiL), individual parts of an integrated system can be simulated and tested in a virtual environment in real time. SET GmbH has specialized in HiL for a number of years and was thus named the Hardware in the Loop Specialty Partner by National Instruments. HiL simulation allows for an efficient assessment of systems while reducing testing time and costs and increasing reliability by replacing complex, real tests. In regard to ADAS sensor fusion, the complexity lies in integrating target simulators, classic analogue and digital signals (possibly including signal conditioning), control and evaluation of the vehicle buses CAN and FlexRay, as well as the consideration of a car model. The HiL environment guarantees the parallel execution of all individual components in real time.
Tel: +49 (0) 721 98 77 93 17 www.konrad-technologies.com
More info - Enter Link code 131903

CARLO GAVAZZI’S CONB SERIES IMPROVES CONNECTOR PERFORMANCE Carlo Gavazzi UK continues to expand its series of IP67 connector and cables with the latest CONB series. In addition to the standard PVC jacket, the CONB series is also offered with a 100% PUR jacket. PUR jackets are suited to applications where there are high levels of oil, coolants and grease as it provides greater resistance to abrasion and mechanical stress resulting in improved flexibility and durability. The PUR jacket is also tear and abrasion resistant. Both connectors and cables are available in M8 or M12, straight or 90° angled for 3, 4 and 5 wires with some versions (NPN or PNP) having built in LEDs. The connector range has a rated current of <4A, IP67 degree of protection. CONB range is suited to operating temperatures of -40°C to 90°C whilst the CONB14NF connectors are -25°C to 90°C. The full CONB series of connectors and cables ensure optimal performance in industries such as wood, materials handling, food & beverage, plastic & wrapping, and machine tools. The CONB connectors are UL certified ensuring an excellent performance for harsh environments. Tel: +44 (0)1276 854110 www.carlogavazzi.co.uk

More info - Enter Link code 131904

ALICAT ADDS BACKLIT MONOCHROME DISPLAY TO MASS FLOW AND PRESSURE INSTRUMENTS Improved readability in any lighting condition with low power consumption Tucson, Arizona (24 January, 2017) – Alicat Scientific has added backlighting as a standard feature on monochrome LCD displays for its core range of mass flow meters, mass flow controllers and pressure controllers. The backlighting illuminates Alicat’s full-information, menu-driven, multi-parameter LCD display screens for easy reading, regardless of lighting conditions. Though reliable and dependable, with low power consumption needs, monochrome LCD screen crystals lack strong contrast, meaning these screens are best read in well-lit situations. With the new illuminated display, users can toggle the backlight on and off at the press of the button for easier reading and programming in darker environments. The new feature now comes standard on most Alicat instruments with monochrome LCD displays. Illuminated Thin-film Transistor multi-colour display and display-free instrument options remain available to suit customer preference. To learn more about backlit monochrome display options, visit www.alicat.com, or call +1 520 290-6060.

i More info - Enter Link code 131301

FINAL WORD

REFLECTING ON 40 YEARS OF TEST AND MEASUREMENT Dr James Truchard, CEO and co-founder of National Instruments (NI), talked about the changes he has seen in test and measurement technology in the past 40 years.

Since 1976 the test and measurement industry has gone from being driven by vacuum tube technology in the era of General Radio, through a time when the transistor ruled with Hewlett-Packard, up to the present day, when software truly is the instrument – a transition helped by NI. Moore’s law has taken us for a wild, fast ride, and just when you think it has run its course, process innovations extend into new dimensions (literally) and push performance even further.
When Jeff Kodosky, Bill Nowlin and I started NI in 1976, we saw room for innovation in how engineers and scientists interacted with, and built, test and measurement equipment. We believed there was a better way to serve the test and measurement needs that we – engineers and scientists – faced. The general purpose interface bus (GPIB, IEEE 488) was our gateway. While others might have seen GPIB as a hardware play, we recognised it for what it enabled in terms of software. As the PC industry evolved, the GPIB cable made it easy to analyse and present data in a customised way. Users were no longer confined to the front panel of an instrument, and to pencils and notepads, for data acquisition.
The opportunity to innovate then shifted to the software world, where programming languages needed instrument drivers for the connected boxes. Our strategy of writing and supporting those drivers offered a critical service that continues today. Engineers and scientists still needed to use tools designed for computer science to perform engineering, test and measurement tasks. Our answer was twofold: LabWindows/CVI, to offer engineering-specific tools in ANSI C programming, and LabVIEW, a graphical programming paradigm that took the way we think about solving a problem (in flowcharts and pictures) and turned it into compiled code.
The story was simple – acquire, analyse, and present. Do it in software tools designed for a customer’s use case that were easy to learn yet extremely powerful. We coined the phrase ‘The software is the instrument’ to describe this approach.
Because LabVIEW is graphical it is tailor-made for parallel processing. LabVIEW users were among the first programmers to migrate from single-core processors to multiple threads and multiple cores and see almost instant speed improvements. However, the toolchains and programming constructs involved were inaccessible to most mechanical engineers who were not digital design experts; we recognised this in the late 1990s and addressed it with LabVIEW’s graphical paradigm.
When you think about software as we have, it is easy to think differently about hardware too, and the creation of modular, PC-based plug-in boards was a natural by-product: make the hardware as lightweight and cost-effective as possible and focus on ADCs, DACs, signal conditioning, and data movement.
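To make the ‘acquire, analyse, and present’ flow concrete in today’s terms, here is a minimal sketch using the open-source PyVISA library to talk to a GPIB instrument from Python. The GPIB address and the SCPI commands are assumptions for illustration only; this is not an NI toolchain example, and real instruments will use different commands.

```python
# A minimal 'acquire, analyse, present' sketch over GPIB using the open-source
# PyVISA library. The GPIB address and SCPI commands are assumptions and will
# differ for any real instrument.
import statistics

import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::22::INSTR")   # hypothetical GPIB address

print(dmm.query("*IDN?"))                    # ask the instrument to identify itself

# Acquire: take ten DC voltage readings (command assumed to be SCPI-style).
readings = [float(dmm.query("MEAS:VOLT:DC?")) for _ in range(10)]

# Analyse and present: even a simple mean and spread already goes beyond
# what a front panel, a pencil and a notepad offered in 1976.
print(f"mean = {statistics.mean(readings):.6f} V, "
      f"stdev = {statistics.stdev(readings):.6f} V")
```

The point mirrors the one made above: once the readings are in software, the analysis and presentation are whatever the engineer chooses to write, rather than whatever the front panel offers.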

The future
There are glimpses of the future everywhere. Modern factories feature ‘cyber-physical systems’, which combine software-centric computing technology with electromechanical systems and human operators to improve safety, efficiency and cost structures. The ‘acquire, analyse, and present’ concept is still valid, but we have added ‘sense, compute, and connect’ as a parallel flow for IoT devices. Wireless technology in general is pervasive, and the more you connect things, the more you can take advantage of data.
As our capabilities become more advanced and the scale of the problems we try to solve grows, tools must become easier to navigate. Just as machine language migrated to assembly and then to object-oriented languages, other paradigms, including graphical dataflow programming, are critical to offer the right level of abstraction.
No great innovation will be done alone. The best platforms we use today are effective because they have fostered an ecosystem. Our software-centric approach spawned a partner network of more than 1,000 companies and 300,000 active LabVIEW users. The rise of mobile devices and ‘apps’ is possible only because of a healthy ecosystem built on developer-friendly platforms. Team-based development, code sharing and community support will soon no longer be novel or best in class; they will be expected.
My advice to any new engineer is simple – develop a vision for the future and pursue it with intensity. And, at the end of the day, don’t be afraid to have fun.
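As a brief aside on the ‘sense, compute, and connect’ flow mentioned above, the sketch below shows one common shape such an IoT node can take: read a sensor, do a little local computation, then publish the result. The broker address, topic and sensor-reading function are assumptions invented for this example, and the open-source paho-mqtt library used here is simply one of many ways to handle the ‘connect’ step.

```python
# A minimal sketch of a 'sense, compute, connect' loop for an IoT node,
# using the open-source paho-mqtt library. The broker address, topic and
# sensor-reading function are assumptions for illustration only.
import json
import random
import time

import paho.mqtt.publish as publish

def read_temperature_c():
    """Stand-in for a real sensor driver (hypothetical)."""
    return 20.0 + random.uniform(-0.5, 0.5)

for _ in range(60):                      # one reading per second for a minute
    sample = read_temperature_c()        # sense
    alarm = sample > 20.4                # compute: send a result, not raw data
    publish.single(                      # connect: hand the result to whatever listens upstream
        "plant/line1/temperature",
        json.dumps({"value_c": round(sample, 2), "alarm": alarm}),
        hostname="broker.example.com")   # assumed broker address
    time.sleep(1.0)
```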



The ultimate for small tanks! The future is 80 GHz: a new generation of radar level sensors

When it comes to contactless level measurement of liquids in small containers, smaller is better. With the smallest antenna of its kind, VEGAPULS 64 is simply the greatest! With its excellent focusing and insensitivity to condensation or buildup, this new radar sensor is truly exceptional. Simply world-class! www.vega.com/radar

Wireless adjustment via Bluetooth with smartphone, tablet or PC. Compatible retrofit to all plics® sensors manufactured since 2002.

