THE FUTURE DEPENDS ON OPTICS™
IMAGING BY EDMUND OPTICS®
Edmund Optics® Imaging – Your Imaging Solutions Provider
• 900+ Employees
• 200+ Engineers
• >1.5 Million Imaging Lenses Sold
• 170,000+ TECHSPEC® Imaging Lenses produced per year – Edmund Optics® Designed, Manufactured & Guaranteed
• 3 Design Centers: Arizona, New Jersey & China
• 5 Factories: US, Germany, Japan, China & Singapore
• 6 Warehouses: US, China, Korea, UK, Singapore & Japan
• 35+ Tradeshow Exhibits Per Year
• >$4 Million Products in Stock, Ready to Ship
Edmund Optics® Imaging has over 20 years of experience designing, manufacturing, and delivering optical lens assemblies. We have a great team of respected designers experienced with designing and building complex, advanced opto-mechanical systems. With the help of design tools such as Zemax, Code V®, FRED®, SOLIDWORKS®, Abaqus, and Comsol®, we make sure we deliver precision, high-performance optical lens assemblies optimized for our customers’ applications. Whether it’s lens design, analysis, or optimization, we use our manufacturing knowledge to design with manufacturability and cost effectiveness in mind. Our designers are committed to creating reliable customer solutions.
Mary Turner, Ph.D. Principal Engineer, Senior Optical Designer (7+ years at Zemax, 3+ years at Edmund Optics®)
NEW: High Resolution Lenses for Large Format Sensors
A new range of lenses specifically designed for large format, high resolution sensors. Get the best out of your sensor and see more with an imaging lens from Edmund Optics®!
• CA Series: Optimized for APS-C sensors, featuring TFL mounts for improved stability and performance.
• LH Series: An ultra-high resolution design for 120 MP sensors in APS-H format.
• LS Series: Designed to support up to 16K, 5 micron, 82 mm line scan cameras with minimal distortion.
Find out more at: www.edmundoptics.eu
MACHINE VISION & AUTOMATION
FORWARD VISION PROTECTING THE BLACK RHINO
SENSOR TECHNOLOGY
MERGERS AND ACQUISITIONS
UK: +44 (0) 1904 788600 | GERMANY: +49 (0) 6131 5700-0 | FRANCE: +33 (0) 820 207 555 | sales@edmundoptics.eu
ISSUE 24 - DECEMBER/JANUARY 2020/2021
mvpromedia.eu
THE FUTURE DEPENDS ON OPTICS™
Effective? Here’s how. Perfect images at high speed
Precise inspection of fast processes. No limits when using the LXT cameras, which integrate the latest Sony® Pregius™ or Gpixel sensors and a 10 GigE interface. This way, you benefit from resolutions of up to 65 megapixels, excellent image quality, high bandwidth and cost-efficient integration.
Learn more at: www.baumer.com/cameras/LXT
MVPRO TEAM
Alex Sullivan Publishing Director alex.sullivan@mvpromedia.eu
Cally Bennett Group Business Manager cally.bennett@mvpromedia.eu
Spencer Freitas Campaign Delivery spencer.freitas@cliftonmedialab.com
Sam O’Neill Senior Media Sales Executive sam.oneill@cliftonmedialab.com
Jacqueline Wilson Contributor jacqueline.wilson@mvpromedia.eu
Becky Oliver Graphic Designer
Contributors: Bart Dierickx, Stephen Hayes, Andrea Pufflerova, Markus Tarin, Frans Vermeulen

Visit our website for daily updates
www.mvpromedia.eu

CONTENTS
6 EDITOR’S WELCOME
8 INDUSTRY NEWS - Who is making the headlines?
13 PRODUCT NEWS - What’s new on the market?
16 EMVA - Thoughts from EMVA as we bring 2020 to a close
18 ALUMINIUM NORF CASE STUDY - Covering Both Sides
20 MVTEC - Setting New Standards in Machine Vision
22 FLIR TECHNOLOGY - Protecting the black rhino Munu
25 PLEORA - We talk to newly appointed President of Pleora Jonathan Hou
28 VISION VENTURES - Mergers and acquisitions in the vision sector
30 IENSO - Seven things you need to know for embedded vision for 2021
32 XIMEA - Humans and robots ultimate challenge – table foosball!
36 XILINX - Detecting pneumonia in chest X-rays using AI Inferencing
38 EDMUND - The latest developments in CMOS sensor technology
40 ADVANCED ILLUMINATION - High intensity light boards
41 ALYSIUM - On why persevering is important
42 IDS - Under the Surface
46 MACHINE VISION - Focus on Machine Vision Interfaces
50 HEAT JOBS
MVPro Media is published by IFA Magazine Publications Ltd, 3 Worcester Terrace, Clifton, Bristol BS8 3JW Tel: +44 (0)117 3258328 © 2020. All rights reserved ‘MVPro Media’ is a trademark of IFA Magazine Publications Limited. No part of this publication may be reproduced or stored in any printed or electronic retrieval system without prior permission. All material has been carefully checked for accuracy, but no responsibility can be accepted for inaccuracies.
WELCOME

As I sat down to write this welcome I did wonder where to start, given that it has been an extraordinary year for us all. For an industry that is built on collegiality and networking, and that excels at showcasing products at events at an international level, we’ve all had to rise to the challenge and adapt our business models to succeed in the new normal, one of the points discussed by Sebastien Dignard, iENSO President, later in this issue. The end is in sight, however, with the announcement of successful vaccines, so I very much hope that this time next year we will have met in person at scheduled events. Until then, here at MVPro we will continue to focus on the industry’s innovations, collaborations and successes.

As well as our usual topical insights into industry news, case studies and product launches, in this issue we talk to Vision Ventures Managing Partner Gabriele Jansen and Director Chris Yates about a busy year of mergers and acquisitions in the vision sector and how they see a continued strategic need for vision in 2021. We look at how a FLIR solution of thermal imaging, visible cameras and NVR has provided round-the-clock monitoring for a critically endangered blind South Western Black rhino.

As the international market for machine vision systems is growing rapidly and standard software solutions play an important role, Mario Bohnacker, Technical Product Manager HALCON, MVTec Software GmbH, tells us about their latest release, HALCON 20.11. With medical technology uppermost in our minds, Subh Bhattacharya, Lead on Healthcare, Medical Devices and Sciences at Xilinx, discusses how AI is helping to detect pneumonia in chest X-rays. And for all foosball players out there, XIMEA and the Swiss Federal Institute of Technology take us through the ultimate challenge between humans and robots, using high-speed cameras to discover the winner.

We wish all of our friends and colleagues a happy and peaceful Christmas and look forward to working with you in 2021.

Alex Sullivan
Publishing Director
Alex Sullivan, Publishing Director
alex.sullivan@mvpromedia.eu
3 Worcester Terrace, Clifton, Bristol BS8 3JW
MVPro: B2B digital platform and print magazine for the global machine vision industry
www.mvpromedia.eu
INDUSTRY NEWS
PPMA WELCOMES NEW CHAIRMAN

The PPMA Group of Associations (“PPMA Limited”) is pleased to announce the appointment of David Barber as its new Chairman from 3 December. Barber, who was elected to the PPMA board of directors in 2017 – and is the Head of R&D for multinational metal detection specialist Mettler-Toledo Safeline – succeeds Tim Paul after two successful years in the chair. His experience in business transformation and in helping to bring a range of innovative products and services to market aligns strongly with the PPMA’s ethos in how it continues to support its 500+ membership.

Commenting on Barber’s appointment, former PPMA Group of Associations chairman Paul said: “David has significant experience of implementing successful projects and embraces the Group’s strategic objectives in how it reaches out to even more members.

“Through the onset of the coronavirus and the UK’s future trading relationship with Europe, the PPMA has had to change its modus operandi due to the postponement of live events, face-to-face networking opportunities, and training and seminars – to name a few. We have had to be flexible, adapt to change, and look at other mediums to support members during the imposed restrictions.

“Under David’s chairmanship, and with the continued support of the PPMA board of directors and PPMA staff, I believe that we’re in a strong position to run our flagship PPMA Show next year (2021), as well as other bespoke industry events. It is a huge credit to the way the PPMA has been run over many years and testament to those it continues to serve.”

In response to his new appointment, Barber paid tribute to Paul for his hard work, achievements, and helping to navigate the Group through a period of unprecedented change. “These are challenging times, but it has reinforced the value of trade associations to receive advice and for members to utilise the PR opportunities available to promote their products and services.

“It’s fair to say that the processing and packaging industry has probably outperformed many other industry sectors due to being heavily involved with the food, beverage, and pharmaceutical sectors.

“Economic challenges are often when the most opportunities arise because companies are forced to think outside of the box. Therefore, the PPMA needs to be an enabler to support members; whether through live or virtual events, business information, or lobbying-type activities in conjunction with associated third-party organisations.

“As a business, we have to continually adapt to change and be on the front foot. It’s important to look beyond our immediate sector to share information and efficiencies, and to leverage from a wide network of events and other industry resources.”

While Barber believes that the PPMA should always remain true to its roots in being able to stage world-class events, he recognises that the PPMA has come a long way since the Association was founded in 1987 to what it can offer its membership today. He believes that digital technology will have a greater role to play in how the Group engages with its members and the positive impact it can have on business efficiencies.

“The world is changing, business is changing, and we need to make sure that the PPMA Group of Associations is equipped and flexible in its approach to serve all those it supports,” added Barber. “The Group has always been keen to embrace new ways of working; therefore, embracing new technologies and industry unification are key to the ongoing success of the PPMA and UK manufacturing at large.” MV
VISION START-UP OF 2020 WINNER REVEALED
HD Vision Systems convinces judges with the topic of ‘Light field and Deep Learning-based Machine Vision’

HD Vision Systems has been unveiled as the winner of the VISION Start-up of 2020.

The successful entrant was picked live from six candidates during an exciting digital pitch session on 11 November 2020. HD Vision Systems made the best impression with the topic of “Light field and Deep Learning-based Machine Vision” and was chosen as the winner with 39 per cent of the votes cast.

“We are absolutely delighted with the award as the VISION Start-up of 2020! The VISION start-up pitch of VISION and the German Engineering Federation (VDMA) is a fantastic opportunity for us as a young company to make us and our technology better known to a wide audience of specialists,” said Benedikt Karolus, COO/CFO of HD Vision Systems. “VISION represents total success for us: in 2016 as a university project and in 2018 as a newly established startup, we were able to acquire our first customers and forge many valuable contacts during the trade fair in Stuttgart.

“The VISION start-up pitch has already proved worthwhile for our company: we received the first customer inquiry less than five minutes after the end of the competition! We are truly delighted and are already looking forward to the next VISION in 2021.”

Jury member Sigrid Roegner, head of business innovation & ecosystem, IDS Imaging Development Systems, said: “I am delighted that HD Vision Systems won, because they showed us in their presentation how to solve previous problems easily with AI and image processing.”

The winner can now look forward to free participation in the VISION Start-up World in 2021 and media reporting by various channels in the PR activities of the German Engineering Federation. Together with the VDMA Machine Vision (part of the VDMA Robotics + Automation Association) and the VDMA Startup-Machine network, VISION paved the way in which young start-ups could present their company with all their innovations. “There were so many exciting proposals, many of them with non-industrial applications. The selection of the six finalists out of the 35 proposals was very difficult for the jury,” said Dr Klaus-Henning Noffz, jury member, chairman of the VDMA Machine Vision Division and director new business development, BASLER. “This shows how dynamic the industry is and how much potential vision technology has! It is particularly interesting to see how quickly the start-ups convert the technologies of the future into business ideas.” MV
BMW CHOOSES INSPEKTO TO BRING AI TO THE FACTORY FLOOR

The BMW plant in Steyr, Austria, is the BMW Group’s largest engine plant worldwide. At the facility, the top priorities are quality, efficiency and the transformation to Industry 4.0 through the implementation of digital technologies at shop floor level. To improve all three, plant managers embraced Autonomous Machine Vision, a new category of machine vision for quality inspection developed by German-Israeli company Inspekto.

In line with BMW’s mission to improve production processes using digitalisation, the Steyr plant has several state-of-the-art machine vision solutions in place to inspect the quality of its engines. However, even the most sophisticated traditional machine vision solutions suffer from pseudo-errors, where the solution flags a defect in components that were actually made to specification. Pseudo-defects create an extra and unnecessary loop in production, since flagged items need to be rechecked manually. “The most problematic issue is that with increasing pseudo-defect rates, employees at the repair station could let their guard down and assume that an actually defective item is good,” explained David Bricher, PhD candidate and expert in innovation and digitalisation at BMW. “BMW is not willing to compromise on quality and we want to prevent this scenario at all costs.” MV
EURESYS DESIGN THE FUTURE WITH COAXPRESS-OVER-FIBER

Euresys is leading the development of CoaXPress with its Coaxlink series of frame grabbers and CoaXPress-over-Fiber. CoaXPress is arguably the number one interface for high-bandwidth computer vision applications. It provides robust, stable, jitter-free image acquisition and is suitable for machines and applications that require the highest standard of reliability.

“In 2019, our CoaXPress frame grabber sales have exceeded our Camera Link frame grabber sales”, says Marc Damhaut, CEO of Euresys. “Coaxlink is now the number one product of Euresys”. This trend is continuing in 2020 with very strong sales and growth.

Today, requirements for more bandwidth between the camera and the computer, higher frame rates and higher image resolutions continue to increase. Applications such as OLED inspection, 3D AOI (Automated Optical Inspection), 3D SPI (Solder Paste Inspection) and 3D inspection in general require much more data to be acquired.

“We are designing the future of CoaXPress,” added Damhaut. “For two years, Euresys and Sensor to Image, a Euresys company, have been working on CoaXPress-over-Fiber, an innovative interface for computer vision applications.” The team at Sensor to Image has designed a way to run the CoaXPress protocol, as it is, unmodified, over standard Ethernet connections.

Using the Ethernet physical layer ensures that the interface is future proof, with constant evolution towards higher bandwidths such as 100 Gbps and 200 Gbps. In addition, using the Ethernet physical layer provides an easy way to migrate towards fibre optics, already a requirement for many applications.

“Euresys, Sensor to Image and the CoaXPress Workgroup within the JIIA (Japan Industrial Imaging Association), which includes most major vision manufacturers, are working to make sure that this new concept is adopted by the machine vision community,” said Sachio Kiura, Chairman of the JIIA.

A CoaXPress-over-Fiber Bridge IP Core is already available from Sensor to Image for evaluation by interested camera manufacturers. The Coaxlink QSFP+, a four-connection CoaXPress-over-Fiber frame grabber with one QSFP+ port compliant with 40 Gbit/s optical modules, is also available from Euresys for evaluation. MV
DEMOCRATISING QUALITY

BMW is a strong believer in bringing AI-based technologies to the shop floor, so plant managers were on the lookout for an intuitive quality assurance technology.

“The BMW Start-Up Garage helps us find new companies with ground-breaking ideas to improve our processes with digitalisation,” said Bricher. “The team researched several companies with innovative machine vision solutions, but only Inspekto offered exactly what we were looking for — a system that is so intuitive that any employee can set it up.”

“Our INSPEKTO S70 is a self-contained product,” explained Harel Boren, CEO and co-founder of Inspekto. “It is self-learning, self-setting and self-adjusting — in other words, fully autonomous. This eliminates the lengthy and complex integration phases that characterise traditional machine vision projects.”

Convinced by the capabilities of the INSPEKTO S70, BMW Group Plant Steyr purchased four systems and started a pilot phase to check their suitability for complex applications. Following its success, the systems are now operational in two different use cases — a connector with many small, hardly visible components and a fuel pipe. In both cases, the plant has noticed an improvement in quality and a noticeable reduction of false detection instances. MV
INSPEKTO WHITEPAPER REVEALS HOW TO TAP INTO MULTI-BILLION-DOLLAR MACHINE VISION MARKET

Good news for automation vendors: Inspekto, the pioneer of Autonomous Machine Vision, has released a whitepaper on the future of quality assurance in the industrial automation market. The paper addresses the needs of automation vendors and distributors who wish to add the first off-the-shelf product for industrial vision inspection to their portfolio. The paper is available to download for free from the company’s website.

Industrial automation vendors provide their customers with all the equipment and components they need to automate their plants – from PLCs to motors, from cables to HMIs. Through their efficient corporate distribution channels, automation vendors can quickly deliver their products, ensuring that end-customers get what they need in a matter of hours. However, there’s one thing that they usually can’t deliver — solutions for quality inspection. Traditionally, machine vision solutions for quality inspection required the case-specific integration of numerous components including lenses, lighting equipment, frame grabbers and software, which have to be chosen and assembled by a systems integrator. As a consequence, automation vendors can offer the components to assemble a machine vision solution, but not the solution itself. The advent of Autonomous Machine Vision changes the status quo, allowing automation vendors to tap into the multi-billion-dollar machine vision market for the first time in history. It is this turn of events that Inspekto’s whitepaper explores.

“The first Autonomous Machine Vision system on the market, the INSPEKTO S70, is actually a standalone product, ready to use out of the box,” explained Harel Boren, CEO and co-founder of Inspekto. “It requires no integration and can be installed by any employee, without external support, in 30 to 45 minutes. As a consequence, it fits the broad distribution model typically adopted by industrial automation vendors.”

The new whitepaper explains the limits of traditional solutions for industrial vision inspection. Through a wide array of data, quotes and statistics, it demonstrates why these solutions are not compatible with the distribution model of automation vendors. The paper goes on to describe the characteristics of the only alternative on the market — Autonomous Machine Vision. It ends with a comprehensive overview of the size and potential of the machine vision market for quality inspection, a space that automation vendors can now finally explore.

“The efficiency, affordability and ease of installation of Autonomous Machine Vision products will finally unlock the potential of the machine vision market, burdened by the complexity and high-cost of traditional solutions,” concluded Boren. “Autonomous Machine Vision marks a new era for industrial automation, the era where visual inspection takes its rightful place as a member of the industrial automation family.”

For your copy of the whitepaper, please e-mail spencer.freitas@cliftonmedialab.com MV
IDS REVEAL NEW DEVELOPMENT SITE IN SERBIA “We are a company with a strong focus on development and are constantly coming up with new products and new technologies. In order to gain access to these new technologies, contact with science is particularly important. This is where we get the knowledge for our innovative products,” Jürgen Hartmann, founder and owner of IDS, explains the step. The company’s investment in the branch demonstrates the strategic importance of IDS in addressing the topic of artificial intelligence. “The topic of artificial intelligence will change our market in the long term,” says IDS managing director Jan Hartmann. Together with Alexander Lewinsky, he is also the managing director of the new IDS Imaging Development Systems Serbia d.o.o, while Professor Rastislav Struharik is responsible for the management on site. His team consists of specialists from the corresponding faculty of the University of Novi Sad. IDS Imaging Development Systems has unveiled the impact of its subsidiary in Serbia, which has been operational since July. Based in Novi Sad, Serbia’s second largest city and considered the country’s technological centre, the pure development unit is closely linked to the Faculty of Technical Sciences of the city’s university. The digital industrial camera manufacturer is thus intensifying its cooperation with science and research – especially in the field of artificial intelligence.
This is the first foreign development site for IDS and paves the way for even more active AI research and development. After all, “cameras and artificial intelligence are a combination that allows IDS to reinvent itself and contribute to shaping the future,” continues Jan Hartmann. In addition to the new subsidiary in Serbia, IDS already has own branches in the USA, Japan, South Korea and Great Britain, as well as two sales offices in France and the Netherlands. MV
THE PLUG & INSPECT® EXPERIENCE

“One of the most interesting characteristics of the INSPEKTO S70 is its ease of installation,” confirmed Bricher. “At BMW one of our main targets is to bring the potential of AI closer to the production field. We have to get rid of the mystique that surrounds the technology — people have to be able to say: I work with AI.” This philosophy mirrors Inspekto’s mission to democratise machine vision.

With the INSPEKTO S70, the user simply switches on the controller and ensures that the field of view (FOV) covers the location to be inspected. The user then presents an average of 20 to 30 good items to the system, which will automatically learn their characteristics. The INSPEKTO S70 knows when it has enough information about a product and informs the user that inspection can start. Anything that is different from the memorised characteristics will be flagged as an anomaly.

“Typically, machine vision solutions require a long training process, during which they are exposed to hundreds of defective parts. But in a manufacturing environment dedicated to the highest levels of quality, we don’t have that many defective parts available,” explained Bricher. “The INSPEKTO S70 only needs good parts, which is a huge advantage.”

The benefits of Autonomous Machine Vision are already visible at BMW Steyr. “By eliminating pseudo-defects, the system has allowed us to avoid the extra loop of manual double-checks. But the more important thing for me is that with the INSPEKTO S70 we have brought AI closer to our production employees. In this sense, BMW and Inspekto have a shared vision.” MV
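The teaching workflow described above (learn from a small set of good parts only, then flag anything that deviates) can be illustrated with a deliberately simplified sketch. This is not Inspekto's algorithm, which is proprietary; it is a generic "golden sample" comparison on aligned images, and every threshold in it is an assumption chosen purely for illustration.

```python
import numpy as np

def learn_reference(good_images):
    """Learn per-pixel mean and spread from a small set of known-good, aligned images."""
    stack = np.stack(good_images).astype(np.float64)
    return stack.mean(axis=0), stack.std(axis=0) + 1e-6

def is_anomalous(image, mean, std, z_thresh=4.0, pixel_fraction=0.01):
    """Flag a part if too many pixels deviate strongly from the learned reference."""
    z = np.abs(image.astype(np.float64) - mean) / std
    return float((z > z_thresh).mean()) > pixel_fraction

# 20-30 captures of good parts stand in for the teaching phase described above.
good = [np.random.rand(128, 128) for _ in range(25)]
mean, std = learn_reference(good)
print(is_anomalous(good[0], mean, std))  # a good part should not be flagged
```

In practice the hard problems (alignment, lighting changes, learning when enough samples have been seen) are exactly what a commercial system has to solve; the sketch only shows the good-samples-only principle.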
PRODUCT NEWS
IDS LAUNCHES NEW UEYE CAMERA FAMILY

The new compact and powerful uEye XLE camera family from IDS has been specially developed for high-volume and price-sensitive projects. Possessing a space-saving design, practical USB3 interface and support of the USB3 Vision Standard, the industrial cameras can be easily integrated into any machine vision system. Users can choose between single-board cameras with or without C-/CS-mount or S-mount as well as variants with coated plastic housing. The first models will be equipped with the light-sensitive 5 MP sensor from ON Semiconductor.

“It will prove its worth in small appliance construction, measurement technology, transport and even agricultural applications,” said Jürgen Hejna, product manager at IDS. The cameras also show their strengths in classic industrial applications such as surface inspection. Thanks to their compact dimensions, the models fit into the smallest of spaces, for example as embedded vision solutions. The price-optimised design makes the cameras particularly interesting for applications where costs are the main concern. Therefore, the company also offers a large number of inexpensive lenses for the cameras.
All uEye XLE models feature a USB3 interface (SuperSpeed USB, 5 Gbps) and are 100 percent GenICam-compliant. The cameras can be used with any software that supports the USB3 Vision Standard. For an optimal user experience, it is recommended to use them in conjunction with IDS peak. The free SDK includes all necessary components from source code samples to transport layer, so that customers can start developing their own applications right away. MV
THE NEW HYPERSPECTRAL CAMERA GENERATION – XISPEC IS BACK! XIMEA and imec have launched a new generation of the xiSpec camera series. After the intensified cooperation, XIMEA and imec have jointly developed an improved version of the xiSpec hyperspectral cameras. While maintaining the previous dimensions of the cameras of only 26.4 x 26.4 x 32 mm and a flyweight of only 32g, the camera housing and the bandpass filters have been optimised to improve the spectral performance of the cameras. Each xiSpec2 camera is individually calibrated. The new xiSpec2 camera series consist of the following models: New starter kits will be offered to match the new cameras to ensure a smooth start-up. The snapshot cameras will be delivered with imec’s HSI-Mosaic suite, access to imec’s hyperspectral API and XIMEA’s camera SDK. The line scan cameras will be sold to OEM customers after clarification of the project use. As part of the collaboration, XIMEA and imec offer xiSpec customers
mvpromedia.eu
tailored support to integrate the hyperspectral cameras into their application. Camera model Description
Spectral range
Spectral bands
MQ022HG-IM- Snapshot visible light
480 – 625
10
600 – 860
15
665 – 960
24
SM4X4-VIS2 MQ022HG-IM- Snapshot Red-NIR SM4X4-RN2 MQ022HG-IM- Snapshot NIR SM5X5-NIR2 MQ022HG-IM- Line scan 150 VIS-NIR 470 – 900
150
LS150-VN2
The standard xiSpec2 cameras come with a micro-B USB3 interface. For use in compact integration projects, models with USB3 flat-ribbon connection or PCIe will also be available in a later phase. The PCIe interface enables bandwidths of 10 Gbit/s with lowest power consumption and allows access to the maximum framerates of the sensors used. The new xiSpec2 cameras can be ordered and delivered starting December 2020. MV
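For readers new to snapshot mosaic imaging, the short sketch below shows the basic idea of turning a raw mosaic frame into a spectral cube. It is purely conceptual: the frame size, the 4x4 mosaic period and the random data are assumptions, and real xiSpec2 processing goes through imec's HSI-Mosaic suite and hyperspectral API with per-camera calibration rather than a simple pixel rearrangement.

```python
import numpy as np

# Illustrative only: a raw frame from a hypothetical 4x4 spectral mosaic sensor.
rng = np.random.default_rng(0)
raw = rng.integers(0, 1024, size=(1088, 2048), dtype=np.uint16)

n = 4  # mosaic period: each n x n tile of pixels samples n*n different filters
# Band (i, j) of the cube is every n-th pixel starting at row offset i, column offset j.
cube = np.stack([raw[i::n, j::n] for i in range(n) for j in range(n)], axis=-1)

print(cube.shape)  # (272, 512, 16): reduced spatial resolution, 16 spectral planes
# Of the 16 filter positions, only a calibrated subset (e.g. the 10 active bands of
# the VIS2 model in the table above) is normally used after spectral correction.
```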
SVS-VISTEK FXO CAMERAS: EXTREMELY COMPACT TOP PERFORMANCE
Exceptionally high image quality and an extremely small design characterize the new 10GigE camera series FXO from SVS-Vistek. SVS-Vistek uses the new Pregius S sensors of the fourth generation from Sony and the powerful 10GigE interface for their new FXO camera series. The new sensors deliver excellent image quality with a dynamic range of approximately 70 dB. Even with insufficient object lighting, when the camera is operated with a gain > 0, the FXO series enables images with excellent homogeneity and exceptional quality. The basis for this is, on the one hand, the integrated Pregius S Gen4 sensor with its relatively small pixels of 2.74 x 2.74 µm and, on the other hand, a sophisticated thermal concept. The heat distribution in a camera plays a decisive role in image homogeneity. This applies even more to cameras that, like the FXO, are equipped with a 10GigE high-speed interface that generates considerable heat when in operation. In order to be able to operate the FXO cameras cool and with the maximum sensor frame rate, SVS-Vistek has implemented an elaborate thermal design with dust-free, external cooling. With 10GigE, SVS-Vistek relies on one of the most modern machine vision interfaces currently available for data transmission and enables a stable Ethernet connection with full 10GigE bandwidth thanks to an intelligent frame buffer concept. The Power over Ethernet module (PoE) allows FXO cameras to be operated with a single supply line that implements the power supply as well as the 10GigE data transmission. The 10GigE connection is
made via a standardized GenTL interface and thus offers a high degree of flexibility in the choice of software for image evaluation. The multi-channel strobe controller integrated in the camera and the precise I/O module with the sequencer are also addressed via GenTL in order to enable flexible software selection. The 10GigE interface does not require a frame grabber and is therefore a very economical interface. For applications where even more power is required, SVS-Vistek also offers the FXO models with the CoaXPress-12 interface. The new FXO camera series includes models with resolutions from 16.1 to 24.5 megapixels and frame rates from 30.1 to 45 frames per second. With only 50 x 50 mm, the cameras of the FXO series have the smallest dimensions on the market and are therefore particularly easy to integrate into applications. The many mounting holes on the solid housing are typical for cameras from SVS-Vistek and simplify mechanical integration. MV

Further information:
SVS-Vistek GmbH
Muehlbachstr. 20
82229 Seefeld, Germany
Phone: +49 8152 9985-0
Fax: +49 8152 9985-79
info@svs-vistek.com
www.svs-vistek.com
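As a rough plausibility check on the figures quoted for the largest FXO model (24.5 megapixels at 30.1 frames per second over 10GigE), and not a vendor specification, the raw data rate can be estimated as pixels per frame times frame rate times bits per pixel. The pixel formats used below are assumptions and protocol overhead is ignored.

```python
# Back-of-the-envelope link budget for a 24.5 MP sensor at 30.1 fps over 10GigE.
resolution_px = 24.5e6   # pixels per frame (from the article)
frame_rate = 30.1        # frames per second (from the article)

for bits_per_pixel in (8, 10, 12):   # assumed pixel formats
    gbit_per_s = resolution_px * frame_rate * bits_per_pixel / 1e9
    print(f"{bits_per_pixel}-bit pixels: {gbit_per_s:.1f} Gbit/s of raw image data")
# Roughly 5.9, 7.4 and 8.8 Gbit/s - all below the 10 Gbit/s line rate.
```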
ROBOT HANDS ONE STEP CLOSER TO HUMAN THANKS TO WMG AI ALGORITHMS While dexterous manipulation of objects is a fundamental everyday task for humans, it is still very challenging for autonomous robotic hands to master. Researchers at WMG, University of Warwick, have developed novel artificial intelligence algorithms so the robot can learn how to manipulate objects just like humans do. In simulated environments, the robotic hands learn on their own how to coordinate movements and execute tasks like throwing a ball to each other and spinning a pen. Robot hands can be used in many applications, such as manufacturing, surgery and dangerous activities like nuclear decommissioning. For instance, robotic hands can be very useful in computer assembly where assembling microchips requires a level of precision that only human hands can currently achieve. Thanks to the utilization of robot hands in assembly lines, higher productivity may be achieved whilst securing reduced exposure from work risk situations to human workers.
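The training setup described above follows the usual pattern of reinforcement learning in simulation: an agent repeatedly acts in a simulated environment and is rewarded for useful behaviour. The sketch below shows that interaction loop in its most generic form, using the open-source Gymnasium toolkit and a random policy as a stand-in. It is not the WMG method, which combines trajectory optimisation with reinforcement learning on dexterous-hand simulations; the environment chosen here is only a simple placeholder.

```python
# Generic RL interaction loop in a simulated environment (illustration only).
import gymnasium as gym

env = gym.make("Pendulum-v1")          # placeholder task; hand simulators work the same way
obs, info = env.reset(seed=0)
total_reward = 0.0
for _ in range(200):
    action = env.action_space.sample()  # a trained policy would map obs -> action here
    obs, reward, terminated, truncated, info = env.step(action)
    total_reward += reward
    if terminated or truncated:
        obs, info = env.reset()
print("episode return with a random policy:", total_reward)
```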
You can read more on the WMG work in the paper ‘Solving Challenging Dexterous Manipulation Tasks With Trajectory Optimisation and Reinforcement Learning’, by researchers Professor Giovanni Montana and Dr Henry Charlesworth from WMG, University of Warwick. The authors have written a second paper, ‘PlanGAN: Model-based Planning With Sparse Rewards and Multiple Goals’, to be presented at the 2021 NeurIPS conference. MV

NEW: 3520 Smart 3D Snapshot Sensor
BLUE LED STRUCTURED LIGHT, 5-MEGAPIXEL STEREO CAMERA DESIGN
Automate your quality inspection with smart 3D snapshot scanning. Leverage high XY resolutions with extended field of view for robot-driven quality control and stop-and-go part verification.
visit www.lmi3D.com/snapshot
BREXIT CONCERNS INCREASE AS YEAR-END APPROACHES
MVPro asked the European Machine Vision Association (EMVA) for thoughts as we bring a difficult year to a close.

2020 will surely be remembered as the year when the first pandemic of the modern era swept the world, resulting in significant harm and impact to many people and businesses. However, as the year ends with positive indications of an effective virus vaccine, and many companies having successfully adapted to the new normal, other topics are increasingly of concern.

The Covid-19 pandemic certainly captured more media attention than the impending exit of the UK from the European Union, but the absence of a defined trade deal between the UK and the EU is introducing uncertainty and risk into businesses on both sides of the English Channel, with both conformity and trade tariffs likely to be of significant concern to the vision sector.

Conformity of products for sale within the European Economic Area has been simply addressed since 1985 by the use of the CE mark, coupled to the manufacturer’s declaration of conformity and technical files. This system is intended to be replaced within the UK by the United Kingdom Conformity Assessed (UKCA) marking, which will be required for all products placed onto the UK market.

The approach is expected to be very similar to the CE mark, but companies will need to check on specific details, particularly in respect of using certified bodies for demonstrating compliance. Rapid clarification of statutory requirements is necessary, with seemingly minor issues having the potential to create significant disruption, such as the need to place the UKCA mark directly on the product rather than either on the product, packaging, or documentation, as for the CE mark.

Import and export tariffs can appear complex and, while it may be the case that typical components of vision systems are not subject to import tariffs, in the absence of a defined trade deal between the EU and the UK, certain industries which are large users of vision technology will be negatively affected by new tariffs.

Notably, the automotive industry is likely to be subject to 10% tariffs on import under current guidance. Potential knock-on effects could include a reduction in consumer demand or regional consolidation of assembly operations, and other impacts which may reduce the need for investment in new projects and vision technology. Despite the additional uncertainty created by the Brexit transition across the continent, adapting to any new trade relationship should progress smoothly overall. Additional operational overheads are almost certainly inevitable in many areas, but the common cultural approach to trade, established over the past 40 years, is a strong foundation for future economic growth for both the UK and EU. At the close of 2020 we can reflect on a year in which significant difficulties were met with a flexible and practical approach to business, providing optimism that any challenges for the vision sector associated with the Brexit transition will be effectively addressed. MV
CASE STUDY
COVERING BOTH SIDES ALUMINIUM INGOT SCALPING OPTIMIZATION WITH GOCATOR® 3D LINE PROFILERS
Aluminium Norf (Alunorf) GmbH was founded in April 1965 and is today the largest aluminium rolling and remelt plant in the world, setting new standards for the processing of aluminium to sheet and plate for a wide range of applications. Alunorf employs more than 2,200 people, with production running around the clock, 7 days a week. Rolling and remelt volume has continuously increased and now totals approximately 1,500,000 tons per year.

THE APPLICATION
This project was triggered by the necessity to improve geometry and topography measurement of raw aluminium ingots (8.6 m x 2.2 m x 0.6 m). The rolling surface of each ingot has a variable amount of excess material that must be scalped off in order to remove the metallurgical shell from the casting process.

THE CHALLENGE
The plant’s existing system of moving point laser scanners was not capable of covering the whole ingot in length and width. The system only covered the cross profile at certain positions along the ingot’s surface. The ingot surface in between these positions could not be measured using this method, and had to be compensated for with several additions in the scalping depth (called “safety” additions). Such “safety” additions led to unnecessary extra scrap material for each ingot that was scalped.
In addition, the use of a measurement system with moving sensor heads led to mechanical wear, which in turn caused maintenance downtimes and even production outages. Finally, the technology of a traversing laser made it necessary to interrupt the ingot transport over roller tables when measurement was in progress.
THE SOLUTION Given the numerous disadvantages of the old system, it became clear that the application requirements would be best fulfilled by implementing a system of multiple fixed-mount laser line sensors that could cover both sides of the ingot simultaneously. Such a system, which is now called AIT (Alunorf Ingot Topography System), would be able to cover the variations in ingot widths (between 900mm and 2200mm). Finally, an open and well documented data interface became obligatory in order to implement the measured data into an existing infrastructure of ibaPDA-systems. Gocator® 2100 series sensors were chosen for their large scan width (field of view), which minimized the number of required sensors and made the system affordable (10 Gocator sensors total - 5 sensors per side). In addition, the lack of moving sensor parts made mechanical maintenance due to wear obsolete. And, the installation at Alunorf marked the first system that was able to measure ingots on both sides while traveling inline.
THE GOCATOR® ADVANTAGE
• Multi-sensor networking capability captures 360° scans of each ingot
• Easy to calibrate all sensors in the network to world coordinates and return highly accurate measurements
• Easy system integration with the ability to seamlessly connect one or many Gocator sensors to Alunorf’s proprietary interface (ibaPDA-system)
• Easy to develop custom algorithms for on-sensor measurement tools (e.g., vibration correction) and off-sensor data analysis using GDK/SDK
• Ability to ensure 100% inline quality of ingot shape between different casthouses and molds

“The use of a Gocator sensor for our AIT-System made the process of implementation and calibration as easy as possible. Simultaneous to this the Gocator delivers a high grade of detail in measurement, covering a wide measurement area at a relatively low price. Finally, the possibilities of configuring measurements based on our own requirements are infinite. Both minor and larger topography effects are measurable with this system while transport movements do not affect the measurement itself.” – Stefan Schulz, M.Sc., Process Engineer, Alunorf

THE RESULT
With the introduction of the new AIT-System, Alunorf is now able to measure the geometry of each individual ingot in full detail. Engineers can immediately detect the thinnest point of the ingot, rather than having to add “safety” factors that increase scalping depth and therefore scrap. Even a reduction of 1 mm per side leads to enormous amounts of scalping chips that do not have to be remolten again.

Alunorf is also able to generate KPIs on their central ibaPDA-System using the measured data from Gocator, which accurately describes Good or Bad ingot shape (i.e., deformation factor). This data helps determine individual or systematic problems in the casting process. For the scalping process itself, the measured data is used to correctly position the ingot in the scalper and to position the scalper heads for optimal operation.
NEXT STEPS The old scalping recipe system, which required certain “safety” factors built into the scalping depth, is now undergoing a factory-wide upgrade. With this new system in place Alunorf is focusing on achieving a reduction of scalping depth to the absolute minimum. MV For more information see https://www.alunorf.de/alunorf/alunorf.nsf/id/home-en
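To make the logic of the result concrete (scalp to the measured worst point instead of adding a blanket safety factor), here is a minimal sketch with entirely made-up numbers; the shell thickness, safety margin and synthetic surface roughness are assumptions for illustration, not Alunorf process data.

```python
import numpy as np

# Synthetic height map of one rolling face, in millimetres relative to the
# nominal surface plane (negative values lie below the plane). Illustrative only.
rng = np.random.default_rng(1)
height_map = rng.normal(loc=0.0, scale=0.15, size=(860, 220))

shell = 2.0           # assumed metallurgical shell that must be removed, mm
blanket_safety = 1.0  # assumed fixed "safety" addition used without full topography, mm

# Old recipe: nominal shell depth plus a blanket safety margin on every ingot.
depth_old = shell + blanket_safety

# With full topography: scalp just deep enough below the lowest measured point.
depth_new = shell + max(0.0, -height_map.min())

print(f"recipe depth: {depth_old:.2f} mm, topography-based depth: {depth_new:.2f} mm")
print(f"material saved per side: {depth_old - depth_new:.2f} mm")
```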
SETTING NEW STANDARDS IN MACHINE VISION: MVTEC PRESENTS HALCON 20.11

The international market for machine vision systems is growing rapidly and standard software solutions play an especially important role. Mario Bohnacker, Technical Product Manager HALCON, MVTec Software GmbH, tells us why MVTec provides outstanding new and optimized features in its latest release: HALCON 20.11.
MVTec HALCON 20.11 raises machine vision core technologies to the next level.

In HALCON 20.11, which was released on November 20, 2020, the MVTec experts have optimized a number of core technologies. A new 2D code type known as DotCode has been added that is based on a matrix of dots and can be printed very quickly, making it especially suitable for high-speed applications – for example, in the tobacco industry. With another new feature called Deep OCR, MVTec introduces a holistic deep-learning-based approach for optical character recognition (OCR). Deep OCR can localize numbers and letters much more robustly, regardless of their orientation, font type, or polarity. The ability to group characters automatically allows whole words to be identified. This significantly improves recognition performance and avoids the misinterpretation of
characters with similar appearances. Deep OCR contains algorithms that bring optical character recognition a crucial step closer to human reading capabilities. Deep OCR also demonstrates that MVTec does not view deep learning exclusively as a unique technology for solving applications that were previously difficult or even impossible to implement. It vividly shows how “traditional” machine vision technologies can also rise to an entirely new level with the aid of deep learning. An important goal, and part of MVTec’s product strategy, is to establish synergies between deep learning and existing core technologies in order to further improve HALCON’s quality and user-friendliness for customers wherever possible.
IMPROVED USABILITY AND FASTER 3D MATCHING The core technology shape-based matching has also been optimized in HALCON 20.11. More parameters are now estimated automatically, which improves both user-friendliness and the matching rate in low contrast and high noise situations. To give an example, the workpiece holder on a machine has special markings, known as reference marks. These marks must be located quickly right from the start in order to align the remaining
algorithms with them. However, these workpiece holders are typically used on a machine for very long periods. Over time, the marks become dirty and start to rust. Their contrast in the image diminishes more and more, and they become difficult to detect. MVTec optimized shape-based matching for this very application. A new operator is now available that enables users to find the right parameters very easily. In these cases, the reference marks can also be detected robustly, which improves usability significantly. The new release demonstrates substantial improvements in the 3D environment as well. Edge-supported, surface-based 3D matching is now much faster for 3D scenes with many objects and edges. Usability has also been improved by eliminating the need to set a viewpoint. HALCON 20.11 makes things much easier, not only for users but also for developers. A new language interface enables programmers who work with Python to seamlessly access HALCON’s powerful operator set. In this way, MVTec takes into account the increasing importance of Python as a programming language, especially in scientific and university settings. The integrated development environment HDevelop has also been given a facelift. The entire design has been made more attractive thanks to a simpler and more consistent icon language. In addition, it now offers more options for individual configuration, such as a modern window docking concept. Moreover, themes are available to improve visual ergonomics and adapt HDevelop to personal preferences.
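As a flavour of what that Python access can look like, the fragment below sketches a minimal blob count. It assumes MVTec's HALCON/Python interface, in which HDevelop operators are exposed as snake_case functions under an importable halcon module; the exact operator names, signatures and example image are assumptions here and should be verified against the HALCON 20.11 documentation.

```python
# Minimal sketch of HALCON's new Python interface (assumed API surface - check
# operator names and signatures against the official HALCON 20.11 documentation).
import halcon as ha

img = ha.read_image('particle')       # an example image shipped with HALCON (assumed)
region = ha.threshold(img, 120, 255)  # segment bright structures
blobs = ha.connection(region)         # split the region into connected components
print('blobs found:', ha.count_obj(blobs))
```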
COMING SOON: A NEW VERSION OF THE DEEP LEARNING TOOL The new release is available in both a Steady and a Progress edition. This means that the full range of new Progress features is now also available to HALCON Steady customers. For the first time, they can also use functions like deep-learning-based Anomaly Detection. This feature enables users to achieve outstanding results during automated error inspections within just a short period of time, with a small amount of training data, and without any labeling. Moreover, a new version of the MVTec Deep Learning Tool will also be available in December soon after the release of HALCON 20.11. Users will then be able to evaluate their trained network directly in the tool. With this addition, the Deep Learning Tool now covers the entire deep learning workflow for the first time.
PRECISE EDGE DETECTION WITH DEEP LEARNING HALCON 20.11 includes a new and unique method for robustly extracting edges with the aid of deep learning. Especially for scenarios where a large number of edges are visible in an image, it takes very few images to train the deep-learning-based edge extraction function to reliably extract only the desired edges. This greatly reduces the programming effort for processes of this type. Out of the box, the pretrained network is able to robustly detect edges in low contrast and high noise situations, which also makes it possible to extract edges that cannot be identified using conventional edge detection filters. In addition, “Pruning for Deep Learning” now enables users to subsequently optimize a fully trained deep learning network. They can now control the priority of the parameters speed, storage, and accuracy and, in this way, precisely modify the network according to applicationspecific requirements.
The new release contains many deep-learning-based features. gremlin via Getty Images
CONCLUSION MVTec HALCON 20.11 sets another milestone when it comes to standard machine vision solutions. This new release raises machine vision core technologies to the next level. Thanks to numerous sophisticated, mature, and field-proven features, users can improve the efficiency of their machine vision processes. The consistent further development of all the included technologies highlights HALCON’s role as a leading standard library and software for machine vision. MV
FLIR SOLUTION SECURES CRITICALLY ENDANGERED BLIND RHINO

When South Africa conservationist Brett Barlow needed a robust security solution to protect Munu, a blind South Western Black rhinoceros whose species is critically endangered, he opted for a FLIR solution, comprising thermal cameras, visible cameras and an NVR, for around-the-clock monitoring, early detection and real-time response.
FLIR technology has played an instrumental role in protecting Munu’s life and livelihood. Throughout the 20th century, big-game hunters, settlers and poachers decimated Africa’s black rhino population. In the early 1970s, there were approximately 65,000 black rhinos, and by 2018, that number was reduced to 5,630. In 2020, there are three remaining subspecies of the black rhino—one of the most vulnerable being the South Western Black Rhinoceros, also known as Diceros bicornis bicornis, of which there are only 254 left in South Africa. Munu, a 20-year-old blind male rhino, is one of these critically endangered animals. When Munu was in danger, South Africa conservationist Brett Barlow stepped in to save Munu’s life. Barlow teamed up with FLIR Systems to use state-of-the-art thermal and visible security cameras to act as Munu’s eyes, detecting threats, increasing safety and enhancing his overall quality of life.
RESCUING MUNU
In 2019, rangers working at a South African National Park found a black rhino walking in circles and visibly disoriented. They knew they had to do something. After safely tranquilizing him, an ophthalmic surgeon confirmed that this rhino, today known as Munu, had suffered two detached retinas and was completely blind, likely as a result of disputes with other rhinos in the area. As soon as he heard about the situation, leading South African conservationist Brett Barlow spoke with the South African National Park and offered to permanently house and protect Munu. “Every rhino matters,” Barlow adamantly affirmed. “You wouldn’t put down a blind child, so why would you put down a blind rhino?”
SUPPORT FROM LOCAL DONORS The South African National Park later transferred Munu to Barlow’s care. However, Barlow wasn’t the only one who wanted to help Munu. Adrian Gardiner, globally renowned conservationist famous for founding the Shamwari Game Reserve and the Sanbona Wildlife Reserve in South
Africa, extended the invite for Munu to stay on one of his properties, the Mantis Founder’s Lodge. Wasting no time at all, Barlow relocated Munu to the lodge knowing it would increase his quality of life. The property, spanning 850 hectares, is home to five white rhinos as well as other game including a zebra and a giraffe. The White Lion Foundation, in which Gardiner and Barlow are both executive board members, donated funds to construct Munu’s boma, comprising a secure covered boma and a five-hectare open grazing area. Additional support came from a local Internet provider, who donated free Internet services for the project. American Humane, a non-profit organisation committed to ensuring the safety, welfare and well-being of animals, funded one year of feed for Munu. All donations for Munu go directly to the project with no administration costs deducted.
ONGOING THREATS
Though under Barlow’s care and in a safe enclosure, Munu still faced threats. Because of Munu’s highly valuable horn, he remains a prime target for poaching. Much of Munu’s horn was removed to protect him, but the amount of horn which remains is still worth thousands of dollars. Experts say that one pound of rhino horn is worth at least $3,000 universally – and ten times that on Asian black markets. Thus, even with much of his horn removed, Munu was still in danger. Self-harm was a risk should Munu charge into the boma. Munu’s next door neighbour, Rodney, a white bull rhino, was also a concern should a territorial fight occur. For all these reasons, Barlow looked for ways to enhance Munu’s safety.
MUNU’S NEW EYES
Previously, the Mantis Founder’s Lodge employed two guards for Munu’s security. However, Barlow believes guards should only be a second line of defence, a visual deterrent that responds to threats.

“I wanted to go down to the electronic security system route as technology doesn’t sleep,” said Barlow.

The first security manufacturer Barlow hired charged high prices for their security products. More than this, once installed, Barlow discovered that these devices were unable to deliver quality images in conditions such as mist or rain—both of which are commonplace at the Lodge. As such, he decided Munu needed a more robust and reliable system.

In 2019, Wilke Pretorius, distribution sales manager for Sub Sahara Africa at FLIR Systems, was working with Barlow on a separate project. When Barlow told Pretorius about Munu, Pretorius informed the FLIR team who immediately got involved. FLIR donated an end-to-end surveillance system, featuring thermal and visible cameras, in order to protect Munu from poachers. FLIR’s powerful thermal and visible imaging cameras deliver intrusion detection at much longer ranges and complete, 24-hour perimeter protection, regardless of weather conditions.

“Other camera manufacturers don’t compare. Their cameras can’t see through mist or rain. FLIR delivers images 24/7, rain or shine, darkness or light,” Barlow said. “Technology like FLIR thermal cameras allows for early warnings for perimeter breaches. Even though rhinos have weak eyesight—without any sight, they are basically defenceless. So, in essence, FLIR became Munu’s eyes.”

Beyond FLIR’s high-performing technology, Barlow loved working with the FLIR staff.

“What drew me to FLIR was the people. Wilke and the rest of the FLIR team have been so passionate and resourceful—always available and willing to help when issues arise,” Barlow said.

“When I started working at FLIR, Chief Executive Officer (CEO) Jim Cannon said our mission is to save lives and livelihoods,” Pretorius explained. “These words stuck with me. Working on the Munu project, it was clear that saving lives and livelihoods are indeed a passion of FLIR employees. I am proud to be a part of a company so eager and passionate to produce solutions and technology that make a positive impact in the world.”
INSTALLATION
Installing the new security system was not an easy task. Merely two days prior to the arrival of FLIR cameras in March 2020, South Africa was ordered into an immediate lockdown due to the COVID-19 pandemic. But Barlow was eager to begin the installation process, so he set out to do it himself.

By early May, a two-person crew had manually dug over 600 metres of trenching to run cable and conduit through the lodge’s hard African soil. Barlow also installed a solar array to power the system; he cut bushes; installed polling; and connected the entire system to FLIR’s central network video recorder (NVR) to view the camera feeds both inside and surrounding Munu’s boma. The result, today, is a fully functioning, comprehensive security system.

TECHNOLOGY IN ACTION
Barlow worked closely with Pretorius to strategically design and lay out the FLIR security system based on a two-tier model. The perimeter is shaped as a big triangle about 110 yards away from the boma enclosure. Six FLIR Elara™ FB-Series ID thermal security cameras, which use onboard analytics to classify human or vehicular intrusions, are installed to monitor the outer perimeter or the first tier. There are also 11 Ariel Full HD IP Bullet cameras deployed, which deliver 1080p video for high motion, complex and low-light scenes.

To watch Munu’s boma, six FLIR Saros™ Dome DH-390 cameras, designed to deliver actionable alerts and alarm data, surround the enclosure. One FLIR Saros™ DM-Series camera is mounted inside the boma to capture every minute detail of Munu’s movement in all conditions. To manage the video from all the cameras, FLIR also supplied its Meridian™ product, a compact, all-in-one network video recorder (NVR), specially designed to support dozens of channels. Meridian also features a FLIR United VMS EZ Client web interface, which simplifies viewing capabilities and saves the cost of additional workstations. Powering, processing and managing this system are six edge servers, FLIR’s USS Edge Appliances, containing 12TB of storage and preloaded with United VMS software, built to seamlessly manage multiple, varied devices.

LOOKING FORWARD
Thanks to FLIR’s technology, Barlow is confident that Munu can be an ambassador for his species. He hopes Munu’s story may inspire future conservancies around the world to partner with manufacturers, like FLIR, for heightened perimeter protection. What advice would Barlow give to other conservationists considering similar security technologies? He said: “Speak to the right people. Make sure you talk to someone who understands the product. See the solution in action. View a live site and see how it works. Work with the right people to implement that for yourself.”

The longer Munu lives, the better he’ll do. Barlow plans to expand Munu’s boma as he acclimates to his new home. And he has already begun using the FLIR Saros DM-Series’ livestream capabilities to invite learners around the world to observe Munu up close. The plan for Munu is to mate with a female within his own subspecies, thereby directly contributing to the survival of his kind. If Munu does sire a calf, Barlow plans to donate the calf back to the South African National Park that Munu came from to help with the genetic diversity for the reserve. MV
DRIVING STRATEGY FORWARD
Newly appointed President of Pleora Jonathan Hou took time out of his busy schedule to talk to MVPro about his vision and the challenges they have met successfully in the company's 20th anniversary year.
CONGRATULATIONS ON BECOMING PRESIDENT. GIVEN YOU'VE ONLY BEEN WITH THE COMPANY A RELATIVELY SHORT TIME, HOW MUCH OF A SURPRISE WAS THIS AND WHAT DOES THE ROLE OF PRESIDENT ENTAIL? Thank you! I joined Pleora almost three years ago as CTO because I saw numerous opportunities in the machine vision space as everyone looks towards Industry 4.0 and the factories of the future. Over the last couple of years, we've introduced exciting technologies such as our AI Gateway and unique plug-in AI architecture for upgrading quality inspection systems at existing factories. We've been able to leverage our solid understanding and history of vision standards, such as GigE Vision, to help deploy AI more effectively so that companies can re-use existing cameras and software. Being named president wasn't really a surprise; there was a very thorough succession plan developed with
our Board of Directors and my predecessor, Harry Page. Harry led Pleora for six years, and is retiring after successfully transforming us into a company focused on real-time sensor networking for industrial, medical, and security and defense applications. We have spent significant time in the background preparing for a smooth transition with the new executive team. One thing that is unique: we've been able to expand our executive team by filling roles internally with people who've worked hard to build the company and help move us forward with our strategy. The role of president entails driving our strategy forward, based on our mission and vision around "simplifying real-time sensor applications." That ranges from working with OEM device partners and providing simple solutions for them to add AI processing and transport capabilities to devices, all the way to system integrators, where we're simplifying how to deploy AI in an easy, scalable way for factories. In this new role, I'm looking forward to speaking to potential, new, and existing customers, understanding their challenges, and aligning the organization on delivering Pleora solutions based on our mission of simplifying sensor applications.
WILL YOU MISS THE CTO ROLE – AND BEING HANDS-ON DEVELOPING TECHNOLOGY? The role of CTO had been focused on looking at industry trends and technologies to build platforms and products that expand our growth in our industrial automation, X-ray medical imaging, and defense video solutions verticals. As president, leading a pioneer in the machine vision and video over Ethernet space, I see the role as a larger-scale extension of what I've been doing. It's really an opportunity to speak more directly with both existing and new customers to learn about their business and challenges, then take that back to our team to come up with new ideas and ways to solve problems and simplify the industry as a whole.

HOW DOES THE ROLE FIT WITHIN THE MANAGEMENT / EXECUTIVE STRUCTURE? I'll be leading our executive team and working in close collaboration with them to deliver on our growth strategy. We have a very close management team and are transparent across all levels within the organization. This transparency is a very important part of our corporate culture, and has been critical to our 20 years of success. Our key values include "Simplify, Focus, Solve, Commit, Deliver" and everyone is aligned on our strategy of simplifying the overall machine vision space and making it easier for our customers to take advantage of next-gen AI and vision technologies.

WHAT WILL YOU BRING TO THE ROLE AND WHAT CHANGES WILL YOU LOOK TO IMPLEMENT? My background and expertise are in machine vision, display & graphics, and inspection technologies, and my objectives are around leading the company to continued growth by identifying key market opportunities and problems that we can uniquely solve. As with any leadership role, communication both within and outside of the organization is key. A primary focus area will be continuing our culture of innovation, including sharing information and trends in the industry to identify application areas that we can focus on to solve our customers' problems.
CAN YOU TELL US ABOUT PLEORA? (WHAT YOU DO, TECHNOLOGIES, SECTORS, BACKGROUND) Pleora is a leader in real-time sensor networking. The roots of our business are in GigE Vision and USB3 Vision sensor interfaces, primarily for industrial automation applications. Over recent years, we’ve significantly expanded the focus of the company, with new hardware and software solutions for embedded, medical imaging, and security & defense markets. Our latest product, the AI Gateway, is an embedded edge device that allows integrators to add AI deep learning capabilities in retrofit and new visual quality inspection applications to reduce costs and increase efficiencies.
HOW CHALLENGING A YEAR HAS THIS BEEN FOR THE BUSINESS? Despite the challenges posed by COVID, from the cancellation of all major events, to lockdowns, and working from home, we just finished a very successful year from a business perspective. In particular, our products for thermal screening and X-ray applications have been in very high demand. We were very quick to take proactive measures in March. The majority of our employees are working from home,
and we have reorganized work spaces to help ensure social distancing and safety for our essential employees so Pleora can continue to manufacture products. One of the biggest changes is how we demonstrate products, where typically we’d be demonstrating solutions at a customer site or trade show. Our applications engineering team really took on the challenge, and we’ve been effective in demoing the new AI Gateway online to potential customers around the world.
EARLIER THIS YEAR YOU PRESENTED ON HOW EMBEDDED SOLUTIONS, MACHINE LEARNING, AND ARTIFICIAL INTELLIGENCE ARE POISED TO CHANGE THE VISION MARKET. WHAT IS PLEORA DOING WITH REGARDS TO THIS AND HOW WILL THEY CHANGE THE VISION INDUSTRY? We've been invited to speak at a number of events on the evolution of embedded technologies and their impact on inspection applications today and potentially in the future as we look ahead to the Industrial Internet of Things (IIoT). You're right in saying AI will significantly change the vision market, but there's a lot of confusion and concern around the technology when we speak with integrators, brand owners, and end-users. For example, many manufacturers have a significant investment in hardware, software, and proven processes that they can't simply walk away from in favour of AI. There's also a lot of concern about the cost and complexity of developing and training AI algorithms. Our AI Gateway really aims to simplify AI, from both a training and deployment perspective. The embedded device comes pre-packaged with "plug-in" AI skills for quality inspection and hyperspectral imaging that are easy to train and then deploy on the AI Gateway in a production environment, as well as support for open source or custom algorithms. In the coming months we'll be introducing a new software platform that makes algorithm development as simple as "drag-and-drop". For deployment, the AI Gateway works with existing cameras and processing
software, meaning you can add AI capabilities in existing systems or choose the hardware and software that’s best for your application.
THIS IS ALSO A SIGNIFICANT YEAR FOR PLEORA – ITS 20TH ANNIVERSARY. HOW IS THE BUSINESS MARKING THE OCCASION? Great question in these unusual times! In normal circumstances we would likely be having a pretty big in-person celebration. Hopefully we can get together in 2021 and mark the occasion properly. It is a pretty remarkable success story. George Chamberlain, who is still our CEO and very active in the company, co-founded Pleora with Alain Rivard based on the belief that real-time video could be transported over Ethernet. There was a lot of skepticism, but 20 years later we have shipped over 300,000 hardware and software products to more than 2,000 global customers in the industrial automation, Industry 4.0, medical imaging, and security & defense markets.
PLEORA HAS RECENTLY ANNOUNCED A MOVE INTO THE EUROPEAN MARKET WITH A STRATEGIC PARTNERSHIP. WHAT ELSE IS IN THE STRATEGY TO GROW THE BUSINESS AND REACH NEW MARKETS? Earlier this year we did announce a new channel partner in Scandinavia to help broaden our geographic reach in Europe. Looking ahead, it will be an exciting year for the company. You can expect more product announcements as we expand the capabilities of the AI Gateway, including a "drag and drop" algorithm development platform and the addition of new plug-in AI skills from Pleora and our technology partners. We're really at the beginning stages of understanding the potential impact of AI on vision, and are working with medical and defense manufacturers on ways to use machine learning to increase awareness and provide key decision support. The same technology is attracting interest in other industrial applications, for example railway inspection and mining operations, where machine learning can help alert or inform end-users. It is a fast-moving industry, and we're always discovering novel ways to employ our skills to solve challenges for our customers and simplify their lives. MV
STRATEGIC NEED FOR VISION CONTINUES TO DRIVE MERGER AND ACQUISITION ACTIVITY

Vision Ventures Managing Partner Gabriele Jansen and Director Chris Yates reflect on a proactive year of mergers and acquisitions in the vision sector and the continued appeal of the vision industry in 2021.
Mergers, acquisitions, and corporate transactions are an important cog in the machinery of free market economies, providing the means for companies to obtain technology, access markets, and secure new revenue streams, while creating the returns for investors that allow recycling of finance to further invest in growth. Whilst 2020 will long be remembered for the Covid-19 pandemic and the associated disruptions to our lives, it has also been an interesting year for transactions in the vision industry. It is clear there is a continued strategic appetite for vision technology which is undiminished by any practical difficulties created by the pandemic. As we approach the end of 2020, over 20 transactions have been completed during the year in the dedicated vision technology sector alone.

Highlights from 2020 include the acquisition of ISRA Vision by Atlas Copco, representing the largest ever machine vision transaction and the establishment of a new operating division. In another cross-border transaction, German embedded design expert Dreamchip was acquired by Shanghai-listed Goodix Technology, opening a path to greater vision processing capability within Goodix and additional technical expertise. In an example of the continued desirability of profitable vision companies, the 3D laser profile and infrared imaging specialist Automation Technology was acquired by Munich-based private equity group Pinova Capital. These
acquisitions continue a broad trend over the past few years for higher revenue multiples within the vision sector.
European venture capital investments over recent years, showing total funds invested, number of rounds closed, and growth stage (to end of Q3 2020).
Other acquisitions have been completed by companies such as Cisco, Ziehm Imaging, Zalando, and Antares Vision, demonstrating the ability of companies to close deals in an effective manner, despite widespread restrictions on travel which inevitably limit face-to-face meetings, historically an important aspect of any corporate transaction. At the same time, the recent high value acquisitions of ARM by Nvidia and Xilinx by AMD, reinforce the trend, and are also likely to affect the vision industry in the years to come, with many vision products incorporating their processors and IP. At each stage of company growth there continues to be strong interest from investors. For venture capital, typically providing finance to earlier stage companies, 2020 remains on course to be a record year for both company investments as well as for raising of new finance
to fund future investments. Whilst the absolute levels of venture investment remain high, investors are also clearly focusing on those sectors which are expected to either benefit or are less impacted from the pandemic, with software, digitalisation, automation, and biotechnology all seeing increased interest. Vision technology has natural synergies with all these segments, as well as many start-up companies using a vision foundation in the delivery of novel solutions, for example the AI based inspection solutions developed by Inspekto in Israel and Neurala in the US, or the approach taken by Winnow Solutions in the UK to help reduce food waste. Within the private equity ecosystem, significant funds have been raised for deployment, with an estimated 220 billion Euro available at the start of 2020 within Europe alone. Private equity activity has increased during the third quarter after an understandable reduction in deal flow during the second quarter, and there are indications that the focus is primarily on funding new investments. This strategy will allow additional capital to be deployed within investments, and likely continue to underpin attractive valuations for well managed companies in desirable sectors such as vision. Exits from private equity funds have decreased over the year, possibly reflecting overall uncertainty in the global economy, and posing a potential short-term risk to the traditional cycles of private equity funds.
European private equity activity over recent years, showing total financial activity, number of deals closed, and regional focus (to end of Q3 2020).
One aspect of the pandemic which has become obvious is the tremendous difference in impact on different companies. Certain sectors such as travel and hospitality are suffering devastating reductions in demand, whilst others such as consumer electronics have generated increased revenue. Notably within the automation sector, this has also provided additional confidence for public companies and multinationals with already strong balance sheets to continue to execute their strategic plans through acquisitions. Over recent months ABB has completed its acquisition of Codian Robotics, Datalogic has acquired AWM Smartshelf, increasing its exposure to the retail market, and UK-based food delivery specialist Ocado has acquired two complementary robotics companies in the US: Haddington Dynamics and Kindred AI.
Historical purchase price valuations of vision companies shown as a multiple of company revenue.
The strong interest in vision technology investments is certainly part of a macro-trend which has seen a continued flow of cash into venture capital and private equity funds, itself driven by the actions of central banks in continuing to deploy massive stimulus packages in an effort to support markets and provide liquidity. Central banks have already injected nearly nine trillion dollars into the markets, with expectations that the support will continue well into 2021 and beyond. When coupled with historically low interest rates, this produces a strong appetite for equity investments, notably visible in the disconnect between the performance of major stock markets over the year and the economic impacts of the pandemic.

Independent of the macroeconomic climate, vision remains an attractive investment and strategic technology in its own right, underpinning continued investment and acquisition activity. Machine vision is understood to be one of the most powerful sensing elements which can be applied to automation, and the tremendous success of AI and neural network techniques, specifically using image data as the primary input, has significantly raised the profile of the industry. Looking ahead to 2021, it is probably no exaggeration to state that all major companies developing automation solutions, whether for factories, farms, hospitals, transport, or cities, view vision as a strategic technology for the future. MV

For more insight into M&A go to: www.vision-ventures.eu

Sources: Quantitative data is sourced from Pitchbook, Dun & Bradstreet, and Vision Ventures internal research.
PANDEMIC SPARKS RISING DEMAND FOR VISION CAPABILITY AND EDGE AI INNOVATION

Seven things you need to know if embedded vision is on your product roadmap for 2021
Businesses should act now if they want to bring their embedded vision products to market next year and meet the rising demands driven by the Covid-19 pandemic. That’s the view of iENSO President Sebastien Dignard. He said: “A growing variety of consumer products do, and will, incorporate embedded vision and Edge AI. iENSO’s conversations with product companies over the past six months suggest the pandemic has accelerated this trend.” iENSO, the embedded vision collaboration partner for major international brands, is also tracking the turbulent impact which COVID-19 is having on the design, manufacturing and time to market of any new embedded vision product. “With the pandemic having such a disruptive impact on the predictable flow of goods across borders, product companies are advised to think hard about how best to mitigate their risks in bringing a new embedded vision and Edge AI-enabled product to market,” says Dignard.
From iENSO's perspective, consumers are looking for greater convenience as they spend more time at home. Employers and health officials need AI-driven and automated tools to monitor public spaces and job sites and ensure health and safety protocols are being followed. "All this is impacting the design of everything from home appliances and automation systems, to construction site surveillance and security systems, and even interactive electronic toys to help busy parents keep kids and pets occupied as they continue to work from home," says Dignard.

For example, a decade ago, a vision system embedded in the door of a refrigerator was intended to serve as a simple window – browse for a snack without holding the door open and wasting energy. "Today, such vision capability is coupled with the added intelligence to track brand penetration, expiration dates and reduce food waste, create a grocery list or suggest a recipe, and collect information about consumer tastes and habits," says Dignard. "All of this adds value and is of critical importance to the future of our customers' market share."

Meanwhile, market research company Omdia forecasts that global AI edge chipset revenue for vision will grow from US$7.7 billion in 2019 to $51.9 billion by 2025, with use cases across a variety of industries.

If embedded vision is on your product roadmap for 2021, here are some tips:

1. HOW PRODUCTS ARE ENGINEERED AND WHERE THEY ARE MANUFACTURED WILL CHANGE.
This is being driven by the fact that component suppliers and contract manufacturers are primarily based in China. Pandemic-related delays and political tensions have significantly impacted delivery times for components and the complexities of supply.

2. WORKING WITH OVERSEAS CONTRACT MANUFACTURERS HAS NEVER BEEN FOR THE FAINT OF HEART.
Product companies looking to advance their product roadmaps on time, on budget, and to a suitable standard of quality have always faced challenges with managing Asian contract manufacturers due to differences in culture, language, time, and IP protection standards. The pressures of the pandemic have only intensified this dynamic. With personnel shortages and production line shutdowns, contract manufacturers are prioritizing larger orders and longstanding relationships over new entrants into the market.

3. RECENT DECLINES IN SOME COMPONENT AND DESIGN COSTS DON'T MEAN THAT EMBEDDED VISION HAS BECOME "EASY."
The complexities of configuring and tuning an embedded vision system for its intended application have arguably increased and require specialized expertise. Although the combination of cost-effective processors and sensors has many product managers dreaming of competitive pricing for their own products, these cost-effective components introduce a certain amount of risk during the engineering and integration phase, as tuning is needed in the design phase. When combined with new Edge AI needs and the requirement to seamlessly connect to the cloud, companies are looking for more than the original design manufacturer (ODM) model.

4. INTEGRATION REMAINS THE BIG CHALLENGE.
Further to that point, components must be sourced, integrated, and tuned appropriately for the intended application. The module must be designed to be manufactured at scale, at a cost point, and to a consistent standard of quality that will support a product company's business objectives. With the pandemic causing disruptions in the supply chain, the availability of components is throwing a wrench into product roadmap planning.

"We have even seen instances where components had to be quickly integrated and qualified to replace parts that were now causing unacceptable supply chain risks for our customers," said Mike Liwak, CPO at iENSO. "We had to make sure image quality and reliability were not compromised during this process and that our clients had ease of mind knowing we were leading the charge with the offshore contract manufacturer (CM)."

5. WHAT ABOUT THE DOMESTIC ECOSYSTEM?
This is the question as we head into 2021. The pandemic and politics have cast in sharp relief the weaknesses of globalized supply chains. Repatriation of some offshored industries is inevitable and gathering steam, as many customers are asking for transition options. What does this mean for the embedded vision industry and your product roadmap? Can you and your team easily move from your current ODM/CM without risking your business and future? All this remains to be seen. The key thing is that your organization needs current market knowledge and the flexibility to make the most of both domestic and offshore resources to get the job done.

6. AND DON'T FORGET ABOUT DATA SECURITY.
Any IoT system that is collecting, correlating, and relaying data must ensure the privacy of the individual is protected. The California Consumer Privacy Act (CCPA) is just one example of the regulatory compliance burden that may apply to your product in today's market. The future is Edge AI, the cloud and big data analytics, and that future is already here. Are you sure that your customers' data and your core IP are protected?

7. IS THIS REALLY A PROJECT TO UNDERTAKE IN-HOUSE?
Ultimately, this is the question for any organization as it considers its internal competencies, the cost of acquiring what is lacking, and the time that must be dedicated to forging the right supply chain relationships. The alternative is to consider working with a proven design and manufacturing partner that has the required technical expertise and established supply chain relationships. MV
HUMAN AGAINST ROBOT: HEAD-TO-HEAD AT THE FOOSBALL TABLE!

XIMEA and the Swiss Federal Institute of Technology use high-speed cameras to discover the winner in the ultimate challenge between humans and robots – table foosball!
INTRODUCTION
Professors and students at the Swiss Federal Institute of Technology in Lausanne are developing a robotic player, installed in a foosball table, that can be more accurate, faster and more strategic than any human player. The idea is to replace a human player with a system able to sense the position of the ball on the table, evaluate a strategy to score, and command the player controls with precision motors to execute this strategy. The latest performance improvement to vision and strategy concerns the ability of the robot to keep the position of the ball under control, made possible by the implementation of a XIMEA xiQ high-speed camera. The camera, positioned under the table and running at 500 frames per second (fps), creates a solid basis for detection of the ball in real time. At its current state of development, the foosball table robot is able to beat any regular player it encounters.
PROJECT DETAILS The Automatic Control Baby-foot project is one of several projects at EPFL (École polytechnique fédérale de Lausanne). People of all ages are fascinated by the concept and implementation every time it is on display, thanks to the ingenious way the lab worked to bring an old game
to the 21st century. It is an opportunity to improve students' skills in multiple subjects and to realize a complete and multidisciplinary project for engineers interested in robotics and automation at a high level.
EPFL student and professor playing against the foosball table
At first glance, the foosball table located in the middle of the Automatic Control Laboratory looks perfectly normal. However, looks can be deceiving. One of the levers has a mechanical arm capable of propelling the ball into the opposing goal at a speed of 6 meters per second. “This is already enough to beat the average player,” said the researcher Christophe Salzmann, who heads the project. And this is only the beginning.
Among many disciplines, this project requires knowledge of mechanics, analog and digital electronics, programming, and more. The current idea is to replace a human player with a system able to sense the position of the ball on the foosball table, evaluate a strategy to score and command the levers to execute this strategy.
The complete automation of the rotating arms still needs to be improved to implement more complex computations during the game – not only defensive strategies, but also attacking play and early evaluation of the opponent's moves, which is one of the project's goals. The overall goal is to provide the Automatic Control Laboratory with a concrete, effective, functioning demonstration system of both software and hardware assemblies.

BABY-FOOT PROJECT EVOLUTION
As for many other projects in similar fields, good results require time, effort and, most importantly, persistence. Eight years of effort have now yielded impressive results. Everything started in the fall of 2012, when students passionate about this project took a semester to focus entirely on the Baby-foot project and put into practice the theoretical knowledge they had acquired during lectures. These students focused on the creation of the mechanical part of the foosball table and the automation of movements that match human performance. In 2013, successive students focused on the improvement of the cameras, so that the basis for ball positioning via laser could be developed. Moreover, the mechanical design was improved by rethinking the shape of the mechanical components and finally improving the control system.

Starting in 2015, student attention shifted to the improvement of the tactics of the game: fast positioning, correct information delivery to the proper engine control, and measurement of the opponents' performance. Focus on these aspects created the first steps towards autonomous strategic decision making.

2016 and 2017 were the years of vision control and strategy systems, when students started focusing on ball control and position tracking, which were systematically further improved in 2019. During that period, vision calibration, ball control and capture systems were developed, measured, and finally tested. The following section pays particular attention to the vision characteristics needed for the detection task, which offered the most substantial improvements to the system.
BALL CAPTURE – VISION AND STRATEGY
Without any doubt, one of the skills needed to win a foosball game is the player's ability to capture and manage the ball during play. For humans, time and experience are needed to master this, whereas the robot relies on mechanical movement control of the arm and a reliable understanding of ball position. Built from start to finish by several student groups, the robotic arms depend on two algorithms running on one computer: one to control the mechanical movement of the arm and the other to track the position of the ball. One goal of the current project is to upgrade the automated control arms to intercept the ball and retain static possession. To do so, the ball's speed and position, and the necessary dampening, must be computed so that the mechanical arms can be moved in advance to intercept the ball.
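The EPFL code is not published here, but a minimal Python sketch of the kind of constant-velocity intercept prediction described above might look like the following; the function name, units and sampling assumptions are illustrative only and not the lab's actual implementation.

import numpy as np

def predict_intercept(positions, timestamps, rod_x):
    # Estimate when and where the ball will cross a player rod at x = rod_x.
    # positions  : recent (x, y) ball centres in metres, from the 500 fps camera
    # timestamps : matching capture times in seconds
    # Returns (y_at_rod, time_until_impact) or None if the ball is moving away.
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)

    # Least-squares fit of a constant-velocity model over the recent samples.
    vx, x0 = np.polyfit(timestamps, positions[:, 0], 1)
    vy, y0 = np.polyfit(timestamps, positions[:, 1], 1)

    if abs(vx) < 1e-6:
        return None                    # ball travelling parallel to the rod

    t_hit = (rod_x - x0) / vx          # time at which the ball reaches the rod
    if t_hit <= timestamps[-1]:
        return None                    # crossing already happened, or moving away

    y_hit = y0 + vy * t_hit            # lateral position to move the player to
    return y_hit, t_hit - timestamps[-1]

In practice the lead time returned by such a prediction is what the motor controller has available to position and damp the player figure before the ball arrives.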
Ball detection system shown from XIMEA's xiQ camera
In the current implementation of the project, image acquisition has played a tremendous role in capturing ball position. In order to position itself correctly, the robot has to have a clear idea of the location of the ball in real time. Students replaced the bottom of the foosball table with a transparent material and placed XIMEA high-speed cameras on the ground beneath it to film the game board. The ball vision system relies on the frames acquired by a XIMEA xiQ camera, which has a 648 x 488 resolution and can achieve 500 fps. Thanks to its small size and light weight, the xiQ camera series is a perfect match, combining high efficiency with extremely low power consumption. The camera is used at its maximum frame rate of 500 fps to guarantee the precision needed for fast-paced object tracking, and is coupled with a wide-angle TAMRON M13VM246 lens.

XIMEA's high-speed xiQ camera positioned underneath the transparent table for the vision control and strategy systems

XIMEA MQ003CG-CM, VGA at 500+ fps, with flat ribbon flex cable ideal for embedded vision systems, USB3 Vision compliant

By running the algorithms with the camera connected to the USB3 port, it is possible to reach a tracking rate of between 400 and 500 Hz. This means a new ball position is recorded every 2.5 ms in usual conditions. Knowing that a professional player can move the ball at up to 15 m/s, this would mean one measurement every 37 mm of movement. Nevertheless, when trying to shoot as fast as possible, the students only reached speeds of about 3 m/s, which means they got a measurement every 7.5 mm. This is far more reasonable, given that the rows of mechanical players are spaced 150 mm apart. With this information, it was possible to theoretically compute the accurate speed of the ball and predict its position before it reaches the next row of players.
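As a quick sanity check of those figures, the spacing between consecutive ball measurements is simply speed multiplied by the sampling interval; the short snippet below is a worked illustration (not project code) that reproduces the numbers quoted above.

def measurement_spacing_mm(ball_speed_m_s, tracking_rate_hz):
    # Distance the ball travels between two consecutive position measurements.
    interval_s = 1.0 / tracking_rate_hz
    return ball_speed_m_s * interval_s * 1000.0  # metres -> millimetres

# Figures quoted in the article, at a 400 Hz tracking rate (2.5 ms between samples):
print(measurement_spacing_mm(15.0, 400))  # professional shot -> ~37.5 mm between samples
print(measurement_spacing_mm(3.0, 400))   # student shot      -> 7.5 mm between samples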
HOW DOES THE FUTURE LOOK?
It is clear that the project is taking shape and continues to move in the direction of full automation. Several further steps must be taken to finally conclude the project: in particular, better speed estimation for the ball, improved predictive placement, more advanced attacking and passing strategies and, ultimately, better sighting of the ball at the edges of the table. These are the areas where Christophe Salzmann and his research students will focus in the future.

CONCLUSION
Humans competing with robots is nothing new to the 21st century, but it nevertheless remains one of the most discussed topics and continuously finds new forms in different fields. That is what happened with the students of EPFL, whose desire to develop an automated foosball table transformed the Automatic Control Laboratory into a training camp. The robotic foosball table has been in use for several years, but recent upgrades have given the system a speed, accuracy and power boost – so much so that EPFL students reckon that human players of average skill are regularly being beaten by the robotic table soccer player. The developers reckon that the winning advantage relies on millimeter precision and high-speed acceleration. Despite the recent upgrades to the vision control mechanism and the improvement of the first strategies, the table will continue to benefit from improvements in software and hardware. One thing is certain, however: it is only a matter of time before the robotic foosball table is unbeatable.

XIMEA
XIMEA develops and distributes industrial and scientific cameras for various applications, from high-performance cameras, embedded and multi-camera systems to hyperspectral cameras. A 50/50 mix of OEM and series production guarantees innovative and technology-driven developments as well as reliable supplies and support. XIMEA offers specialised products for the most demanding requirements and searches for challenges in the development of imaging technology. MV
SPONSORED
DETECTING PNEUMONIA IN CHEST X-RAYS USING AI INFERENCING AT THE EDGE

Subh Bhattacharya, Lead, Healthcare, Medical Devices & Sciences at Xilinx
The use of artificial intelligence (AI), specifically machine learning (ML), is fast becoming a transformational force in healthcare. Using AI and various image processing techniques within radiological modalities like X-rays, ultrasound and CT scans can lead to better diagnosis and better patient outcomes. Additionally, use of AI can lead to increased operational efficiencies and significant cost reduction in healthcare procedures. Chest X-rays used to detect respiratory diseases, like pneumonia, are the most used radiological procedure, with over two billion scans performed worldwide every year – that's 548,000 scans every day. Such a huge quantity of scans imposes a heavy load on radiologists and taxes the efficiency of the workflow. Some studies show ML, Deep Neural Network (DNN) and Convolutional Neural Network (CNN) methods can outperform radiologists in speed and accuracy, particularly under stressful conditions during a fast decision-making process where the human error rate could be quite high. Aiding the decision-making process with ML methods can improve the quality of the results, providing radiologists and specialists with an additional tool.

Figure 1
Healthcare companies are also now looking for effective point-of-care solutions to provide cost-effective and faster diagnosis and treatment in the field or in locations away from large hospitals. As a result, there is rising demand to perform accurate image inference to efficiently detect respiratory diseases like pneumonia in the field and provide clinical care using small, portable, point-of-care devices at the edge. Spline.ai (a partner of Xilinx) has developed a model using curated, labeled images for X-ray classification and disease detection (Figure 1). The model – trained using datasets from the National Institutes of Health (NIH), Kaggle and the likes of Stanford and MIT – can detect pneumonia with greater than 94% accuracy today. The model is then deployed and optimized on Xilinx's
Zynq® UltraScale+™ MPSoC running on the ZCU104 platform, acting as an edge device. The model runs on Xilinx's Deep Learning Processing Unit (DPU), a soft-IP tensor accelerator that enables low inference latency of less than 10 microseconds, and it will be retrained periodically to improve the accuracy of its results. Xilinx technology offers a heterogeneous and highly distributed architecture, delivering more compute for image processing and enabling healthcare companies to offer effective and faster diagnosis. The highly integrated Xilinx Zynq UltraScale+ MPSoC, with its adaptable Field Programmable Gate Array (FPGA) fabric, integrated accelerators for deep learning (DPU), integrated digital signal processing (DSP) and its ARM® multi-processor system, performs accurate image classification and detection with AI inferencing in close to real time with low latency and low power consumption. PYNQ™, the open-source Python programming platform for the Zynq architecture, together with Xilinx's latest AI toolkit, Vitis AI™ version 1.1, is used to compile the deep learning models for accelerated inference, making this solution cost-effective. Xilinx's unified software platforms – Vitis for application development and Vitis AI for optimizing and deploying accelerated ML inference – mean that data scientists and algorithm developers can easily use advanced devices like the MPSoC in their projects.
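As an illustration of how a compiled classification model is typically executed on the DPU, the sketch below uses the VART Python API shipped with Vitis AI. The file name pneumonia_classifier.xmodel, the pre-processing and the class ordering are placeholders rather than the actual Spline.ai deployment code, and quantisation scaling is omitted for brevity.

import numpy as np
import xir
import vart

def make_dpu_runner(xmodel_path):
    # Load the compiled model and select the DPU subgraph to run on the accelerator.
    graph = xir.Graph.deserialize(xmodel_path)
    subgraphs = graph.get_root_subgraph().toposort_child_subgraph()
    dpu_subgraphs = [s for s in subgraphs
                     if s.has_attr("device") and s.get_attr("device").upper() == "DPU"]
    return vart.Runner.create_runner(dpu_subgraphs[0], "run")

def classify(runner, image):
    # 'image' is assumed to be a pre-processed, quantised array already shaped
    # to the model's input tensor.
    input_tensor = runner.get_input_tensors()[0]
    output_tensor = runner.get_output_tensors()[0]
    input_data = np.asarray(image, dtype=np.int8).reshape(tuple(input_tensor.dims))
    output_data = np.empty(tuple(output_tensor.dims), dtype=np.int8)
    job_id = runner.execute_async([input_data], [output_data])
    runner.wait(job_id)
    return output_data.reshape(-1)   # raw class scores, e.g. [normal, pneumonia]

runner = make_dpu_runner("pneumonia_classifier.xmodel")  # placeholder file name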
New healthcare workflows need to deliver more compute for image processing, data privacy, security, patient safety and accuracy in much smaller edge devices. Heterogeneous and adaptable distributed systems, which can be small and portable, are key to solving this problem. Xilinx devices like the Zynq UltraScale+ MPSoC and the Vitis software platform are ideal for delivering optimized clinical devices enabled for AI inferencing at the edge. MV
SPONSORED
SENSOR MANUFACTURING DESIGNS & METHODS: ADVANCEMENTS IN CMOS SENSOR TECHNOLOGY

Over the last couple of decades, advancements in mobile phone and smartphone camera technology have been at the forefront of developments in complementary metal-oxide-semiconductor (CMOS) technology. In turn, this has driven improvements to both the sensors and the methods for their fabrication. During this time, general manufacturing advancements have also reduced noise in CMOS sensors and increased their reliability.
Figure 1: The configuration for a back-lit illuminated pixel and a front-lit illuminated pixel.
One specific alteration consisted of changing CMOS sensor structure from front-lit illumination to back-lit illumination.
Figure 2: Micro-lenses are used to capture as much light as possible from wider angles onto the sensor.
Another major improvement to the CMOS sensor design was the incorporation of micro-lenses to maximize the light capture to improve sensor efficiency.
CMOS USAGE OVERTAKES CCD USAGE
After these years of technological advancement, the use of CMOS sensors has surpassed the use of charge-coupled devices (CCDs) for several key reasons. CMOS devices are able to capture images while consuming less power than CCDs, and are less expensive to manufacture, making them less expensive (by about a factor of 10) to purchase. By February 25, 2015, CMOS sensor technology was so popular that Sony had announced it would cease production of CCD sensor technology.
APPLICATIONS DEMAND HIGHER RESOLUTION
As applications become more demanding, higher image quality and resolution are needed. CMOS manufacturers attempted to create sensors with higher resolutions by decreasing pixel size and increasing pixel counts. This was moderately successful; however, it came with some issues, including increased sensor noise. To combat this issue, manufacturers returned to slightly larger pixel sizes, but on sensor formats larger than 1.1". This method increased sensor resolution and maintained good signal-to-noise ratios (SNR). As resolution demands continue to increase, sensor manufacturers are not only making use of larger sensor formats but are also finding new ways to decrease pixel sizes without sacrificing image quality. One new sensor example is Sony's 4th generation Pregius S 24.5 MP IMX530 CMOS, a 4/3" sensor (diagonal 19.3mm) with a 2.74µm pixel size (37% smaller pixel size vs. 3.45µm). However, as pixel sizes decrease and sensor sizes increase, significant changes to the optical designs must be made to make full use of the increased performance. This requires imaging lens designs to incorporate additional optical elements, making imaging lenses larger in volume and heavier in weight. These two constraints put a strain on lens designers to create lenses with mounts larger than the C-mount and more robust and reliable than the consumer F-mount.

Figure 3: Pixel sizes on sensors and overall sensor sizes have changed to accommodate higher resolutions.

Lens mount types such as the TFL and TFL-II mounts feature compact flange distances and larger diameters for larger sensor formats like APS-C, APS-H, and full-frame sensors. These mounts are also threaded types that offer superior stability, support for heavy lenses, and alignment reliability over bayonet types like the F-mount.

Figure 4: The TFL and TFL-II mounts accommodate a larger maximum sensor diagonal.
Sensor manufacturers are releasing the next generation of CMOS sensors with extremely high resolutions. The Canon 120MXS CMOS sensor features 120 MP with 2.2 µm pixels, and the Canon 2U250MRXS CMOS features 250 MP with 1.5 µm pixels. Both of these sensors have pixels much smaller than the industry's typical pixel sizes. The new 4th generation of Sony Pregius sensors features a smaller form factor and an imaging performance improvement of about 1.7X, with pixel sizes that have decreased from 3.45 µm to 2.74 µm. As machine vision applications demand higher resolution, CMOS manufacturers will have to continue reducing the size of individual pixels and increasing the overall size of sensors to improve image quality and effective resolution.
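As a quick back-of-the-envelope check, pixel pitch and pixel count translate directly into sensor size; the array dimensions below are approximate values assumed for illustration, not manufacturer specifications.

import math

def sensor_size_mm(h_pixels, v_pixels, pixel_um):
    # Active-area width, height and diagonal in mm from pixel count and pixel pitch.
    w = h_pixels * pixel_um / 1000.0
    h = v_pixels * pixel_um / 1000.0
    return w, h, math.hypot(w, h)

# Approximate active-array dimensions (assumed, for illustration only):
print(sensor_size_mm(5328, 4608, 2.74))   # Sony IMX530  -> ~14.6 x 12.6 mm, ~19.3 mm diagonal
print(sensor_size_mm(13280, 9184, 2.2))   # Canon 120MXS -> ~29.2 x 20.2 mm (APS-H class)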
MV
SPONSORED
CEMENT BOARD INSPECTION WITH HIGH INTENSITY BAR LIGHTS FROM Ai
MACHINE VISION CHALLENGE: CEMENT BOARD INSPECTION A customer inspecting 4 ft. by 10 ft. white fiber-cement boards required a lighting solution to highlight small 5cm long and 1mm thick surface scratches and defects. They also identified significant surface concavity created during the manufacturing process on some of the 2cm thick boards. As a result, they required the flexible lighting solution to also assist their efforts to develop an identification and grading system to pass/fail board concavity.
The cement boards were inspected in motion, running 1 foot/second through an enclosed tunnel. The inspection setup required a low-angle, narrow beam throw for uniform illumination across the full surface of the board to identify the scratches. The tunnel setup also required the use of longer lights for full board coverage and to facilitate installation. An additional inspection using the same area-scan camera, but with bright-field oriented lighting, necessitated that the two lighting orientations be independently controlled with an on/off trigger. The customer contacted Control System Integrators (CSI), Inc. to configure a lighting solution for this inspection challenge.

Ai SOLUTION: HIGH INTENSITY BAR LIGHTS
Via Ai's distribution partner, Kendall Electric, CSI, Inc. reached out to Advanced illumination to provide a dark field lighting solution for their customer. To match the size and uniformity needs, Ai suggested the LL174 High Intensity Bar Lights with diffuse light conditioning. These lights are highly customizable and available in 24" lengths that matched the demands of the application. The configurable control options also allowed for the use of the iCS 3 cable Inline Controller to enable constant or strobed power to the light. The longer length and controller options met the customer's initial requirements to fit within the existing machine vision setup constraints.

ILLUMINATION RESULTS: SUPERIOR CONTRAST FOR INSPECTION

Left: Board inspection with a competitive bar light. Right: Board inspection with the LL174 High Intensity Bar Light from Ai.
When deployed in the inspection setup, the LL17424WHII3D Bar Lights provided a uniform light spread across the entire length of the cement board within the inspection tunnel. In comparison to competitive lights tested for consideration in this application, the High Intensity Bar Lights provided superior contrast to identify the scratches and other defects, even with the occasional concave boards. According to Daniel Alcala, Vision Automation Engineer for CSI, Inc., “we try to provide our clients with the best possible solution, and the small form factor, high intensity, and high uniformity of the Ai lights made them the best fit for the job.” MV
SPONSORED
PERSEVERING IS IMPORTANT

It is no secret that we are facing a combination of significant and unique challenges. We are heading into autumn and are under constant siege from media reports that do nothing for confidence and wellbeing. One could be forgiven for thinking it might be best to hibernate, hunker down for the long haul and wait for the whole thing to blow over. Yet during the pandemic, technologies are playing a crucial role in keeping our society functioning.

Moore's law is dead: the notion that the number of transistors on a microchip doubles every two years has become harder and harder to achieve as chip components get closer and closer to the size of individual atoms. A new paradigm is emerging – tentatively named Huang's law, after Nvidia's chief executive and co-founder Jensen Huang – which describes how the chips that power artificial intelligence are doubling in performance every two years, with the latest GPU performance closer to tripled. This progress is enabling some of the most radical changes to the way we are keeping society functioning in times of lockdowns and quarantines: online shopping, online entertainment, big-data-enabled supply chain management, delivery drones, medical robotics and telepresence, distance learning – even simply staying in touch with family and loved ones.

All of these have an element in common: they rely on robust, reliable and high-speed communications. Whether in wide-area mobile and wireless networks such as the upcoming 5G and millimetre wave systems, in local, integration-level links such as PCIe 4.0, 10Gb and 25Gb Ethernet, InfiniBand and USB 3.2 Gen 2, or right down to board-to-board connectivity with ever higher throughput requirements for FFC products, we can see common traits in the interconnect requirements.
At Alysium, we continuously embrace these new challenges, bringing new products to market to satisfy the insatiable demand for speed and reliability, dovetailed with industry-specific requirements such as dust- or water-proofing, high flex life for robotic and C-track use, and extended assembly lengths. Integrating all these requirements into affordable products is our contribution to ensuring that all your enhanced requirements are met and that you can, in turn, extend your competitive advantage. Recent examples of new product releases include a new M12 X-coded 90-degree plug, supporting 10GigE in the smallest form factor. We have also recently concluded a new raw cable development for 12G CoaXPress assemblies, and introduced new material to support legacy CameraLink™ assemblies at a lower cost. In the field of optical products, Alysium introduced both CLHS (10G per lane) and USB Gen 1 AOC assemblies some years ago, to ensure that extended-length and/or high-flex options exist beyond the technical limits of copper. These products have also been continuously improved, with the latest CLHS AOC assemblies rated for 14G per lane in a smaller form factor. Progress in technology will undoubtedly have an impact lasting long beyond Covid-19. Progress will not wait for anyone. We keep on building and making progress. We keep on persevering. Let us help you make your next project a bigger success. MV
WHAT TROUBLE LIES UNDER THE SURFACE?

In yacht and boat building, the most commonly used materials are so-called composites. By cleverly combining the positive properties of at least two components, they often have outstanding features – some of them even specially adapted to their respective areas of application. It is hardly surprising that they are also used for high-performance applications in the aerospace industry, in power engineering for the likes of wind turbines, in medical technology, in sports equipment manufacturing and in the automotive industry. However, wherever there are bonds, these can also "break" – be it through external influences or material fatigue. Regular tests are therefore indispensable. The iX-600 portable inspection system from UNX Technologies in Taiwan provides contactless inspection of multilayer composites, even for large surfaces. It uses image processing to detect weak points that are invisible to the human eye. In addition to a thermal imaging camera, a uEye board-level camera with autofocus from IDS Imaging Development Systems is used.
In maritime applications, the demand for composites is continuously increasing. In addition to floor elements, side walls, ceiling panelling or doors, toilet compartments and separation units made of composite materials can be supplied. Moreover, load-bearing parts such as floors, walls or roof panels are possible and provide considerable weight savings. A classic example of a composite material used here is CFRP – carbon fibre reinforced plastic. It has a high stiffness and is very light at the same time. Another is GFRP – glass fibre reinforced plastic, a material made of plastic and fibreglass, which is cost-effective and yet very high quality. With such lightweight construction materials, the individual components are not combined into a single material, but exist virtually side by side. Therefore, damage such as bubbles, pores, foreign inclusions and delamination (peeling off of layers) can occur during the manufacturing processes. Even during operation, i.e. while the vehicle is moving, the strength of the material can be impaired, for example by an impact event. An impact against a composite panel can leave no visible damage on the outside. Inside, however, it can still lead to a reduction in compressive strength. Regular testing of the materials is therefore not only sensible but also vital for safety. Yacht manufacturers usually use the ultrasonic or impact-echo method to detect defects. This is based on the use of sound waves. It was developed in
the 1980s for testing reinforced concrete components and is currently widely used. However, this process requires contact between the parts and is time consuming.
APPLICATION
UNX Technologies offers a real alternative. The portable inspection system iX-600 provides non-contact and therefore guaranteed non-destructive inspection (NDI) of composite materials. Testing can be carried out with regard to detachment, delamination or porosity. The material properties are analysed beneath the surface to locate any defects in the interior of the material. The system also captures large inspection areas or curved structures effortlessly – without requiring a material sample in advance. For composite materials with multilayer structures, the iX-600 can also be used alongside the manual sequential ultrasonic scanning described above, replacing point information with area information.
The special feature: the fault location can be determined quickly and precisely. In addition to the infrared thermal image for evaluating the general temperature distribution on the test surface, the UNX system receives parallel image information from the integrated industrial camera. Thus, it determines the exact defect status of the material. The decisive factor is therefore the combination of an infrared camera and a board-level camera with autofocus from IDS, paired with a built-in industrial computer with a seven-inch touch screen and the associated software. The instrument, patented in Taiwan, is mobile and does not require an additional external computer.
Using the example of the outer skin of a motor yacht, the procedure can be explained in simple terms. The board level camera records the current image of the area to be inspected exactly parallel to the thermal imaging camera during the inspection. With the help of image processing technology, the system can quickly detect defective areas by analysing and matching the image data and display their relative position. All measurement results are clearly displayed and documented in the image. The delamination defect for example (green frame in the picture) can then be easily located for repair.
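The UNX software itself is proprietary, but a generic sketch of this kind of processing – flagging thermally anomalous regions and marking them on an already registered visible image – could look like the following (OpenCV/NumPy, with illustrative threshold values).

import cv2
import numpy as np

def mark_thermal_anomalies(thermal, visible, z_thresh=3.0, min_area_px=50):
    # thermal : 2-D float array of temperature values (or raw radiometric counts)
    # visible : HxWx3 uint8 image from the board-level camera, same size as 'thermal'
    # Flags pixels whose temperature deviates strongly from the surface mean and
    # draws bounding boxes around them on the visible image.
    mean, std = float(thermal.mean()), float(thermal.std()) + 1e-6
    anomaly = (np.abs(thermal - mean) / std) > z_thresh
    mask = anomaly.astype(np.uint8) * 255

    # Remove speckle and group anomalous pixels into candidate defect regions.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    annotated = visible.copy()
    for c in contours:
        if cv2.contourArea(c) >= min_area_px:
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(annotated, (x, y), (x + w, y + h), (0, 255, 0), 2)
    return annotated, mask

In a real instrument the decision rule would be calibrated against known reference panels rather than a simple global statistic, but the overall flow – anomaly map, region grouping, overlay on the visible image – is the same.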
INDUSTRIAL CAMERA WITH AUTOFOCUS UNX Technologies was looking for a compact camera for the mobile system that would also allow quick changes between different focal planes. “The iX-600 is a portable
device and must therefore be small and light. The same applies to its components. In addition, the integrated camera must have a computer-controlled autofocus function due to the different working distances,” explains Dr Yu-An Lai of UNX Technologies.
The uEye LE AF USB 3.1 Gen 1 board-level camera from IDS meets all these criteria. It is suitable for the use and control of liquid lenses and offers a practical autofocus function. This allows optimally focused images at variable object distances. The automatic system is based on "active" liquid lens control and can be easily triggered by software. Depending on the application, the autofocus can be individually configured and ensures perfectly sharp images in no time at all. UNX has specifically chosen the model UI-3881LE AF with S-mount lens mount. The camera is equipped with the 1/1.8" rolling shutter CMOS sensor IMX178 from the Sony STARVIS series and enables a resolution of 6.41 MP (3088 x 2076 px) at a frame rate of up to 58 fps. In addition to its high resolution, the sensor scores with speed and sensitivity, so that excellent images are achieved even under low-light conditions.

The UI-3881LE AF has a twist-proof USB Type-C connector as well as USB power delivery and can be easily integrated into embedded systems. Due to its compact design (36 x 36 x 20 mm) it can be installed in the smallest of spaces and is perfectly suited for classic machine vision applications and tasks in the fields of microscopy, medical technology, metrology and traffic monitoring. It is also predestined for installation in small industrial devices, such as the iX-600 from UNX Technologies.

The Taiwanese company uses the IDS Software Development Kit to integrate the camera. The IDS Software Suite is a free software package that is exactly the same for all uEye industrial cameras (model designation "UI") and can easily handle mixed operation of USB 2.0, USB 3.0 and GigE uEye cameras. In addition to the camera drivers, it contains sample programs in various programming languages such as .NET. "This shortened our development time considerably," explains Dr Lai.

OUTLOOK
The iX-600 inspection system focuses on the market for non-contact and non-destructive testing of composite components – a market that holds a lot of potential. This is because the areas of application for composites are often sensitive. Whether in maritime applications, as described here, in space travel, medical technology or sports equipment construction, the load-bearing capacity of the materials used is essential and makes regular testing indispensable. However, UNX Technologies is also considering applications that will require industrial cameras with higher resolution or a C-mount lens connection in the future. Here the company has the "sister camera" UI-359xLE AF in mind. Just like the UI-3881LE AF model used in the iX-600, it offers the possibility of focusing liquid lenses conveniently via the camera using software. On the sensor side, however, it is equipped with a 1/2" rolling shutter CMOS colour sensor from ON Semiconductor, which delivers an extremely high resolution of 18.10 MP (4912 x 3684, 20.0 fps), 4K Cinema (4096 x 2304, 38 fps) and Ultra HD (3840 x 2160, 40 fps). No matter which variant is used, these compact and powerful industrial cameras open up countless application possibilities. In mobile solutions, such as the iX-600, they make everyday work easier for the user in every regard and at the same time ensure greater safety. Those who can "see through" know what is happening under the surface. MV
DIVE INTO DEEP LEARNING
Open eVision
EasyClassify – Deep Learning classification library
AT A GLANCE
• Includes functions for classifier training and image classification
• Able to detect defective products or sort products into various classes
• Supports data augmentation, works with as few as one hundred training images per class
• Compatible with CPU and GPU processing
• Includes the free Deep Learning Studio application for dataset creation, training and evaluation
• Only available as part of the Deep Learning Bundle

EasySegment – Deep Learning segmentation library
AT A GLANCE
• Unsupervised mode: train only with "good" images to detect and segment anomalies and defects in new images
• Works with any image resolution
• Supports data augmentation and masks
• Compatible with CPU and GPU processing
• Includes the free Deep Learning Studio application for dataset creation, training and evaluation
• Only available as part of the Deep Learning Bundle

www.euresys.com
INTERFACES FOR MACHINE VISION

Choosing the right interface for your machine vision application is a key decision in your camera selection process. With different types of cables and connectors available for machine vision applications, there are associated pros and cons. FLIR aims to demystify the options. Your requirements for resolution, frame rate, cable length and host system configuration should all be considered to ensure you get the performance you require without spending more than you need. FLIR's machine vision cameras support all three trusted and widely available interfaces.
DEDICATED INTERFACES
Dedicated interfaces are useful for applications where extremely high speeds or ultra-high resolution necessitate their use; for example, line-scan cameras used to inspect continuous-flow processes like paper or plastic film production, where cameras frequently work in the kHz range. However, these interfaces tend to be significantly more expensive, less flexible and add to system complexity. CameraLink (supporting up to 6.8 Gbit/s of data) and CoaXPress (supporting up to 12 Gbit/s) are dedicated machine vision interfaces typically used in such applications. In addition to the cameras, systems using these interfaces require frame grabbers. Dedicated machine vision interfaces also use proprietary cables, making integration with other peripherals a little more challenging.
COAXPRESS (CXP)
The CoaXPress interface was launched in 2008 to support high-speed imaging applications. CXP interfaces use 75 ohm coaxial cables and support data transfer speeds of up to 6.25 Gbit/s per channel, with the option of using multiple channels for even faster data transfer rates. A CXP cable can supply up to 13 W of power and requires that both the 'device' and the 'host' support the GenICam camera programming interface. While single-lane coaxial cables are inexpensive, the cost of multi-lane cable assemblies and frame grabbers adds up very quickly.
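To put those per-lane figures in context, here is a minimal back-of-the-envelope sketch in Python. The camera parameters are hypothetical, chosen only for illustration, and the payload fraction is an assumed allowance for encoding and protocol overhead rather than a value from the CXP specification.

```python
import math

def required_cxp_lanes(width_px, height_px, bit_depth, fps,
                       lane_gbps=6.25, payload_fraction=0.8):
    """Estimate how many CXP-6 lanes a sensor/frame-rate combination needs.

    payload_fraction is a rough allowance for 8b/10b encoding and protocol
    overhead -- an illustrative assumption, not a value from the CXP spec.
    """
    data_rate_gbps = width_px * height_px * bit_depth * fps / 1e9
    usable_per_lane_gbps = lane_gbps * payload_fraction
    return math.ceil(data_rate_gbps / usable_per_lane_gbps), data_rate_gbps

# Hypothetical camera: 12 MP (4096 x 3000), 8-bit mono, 80 fps
lanes, rate = required_cxp_lanes(4096, 3000, 8, 80)
print(f"Raw sensor data: {rate:.2f} Gbit/s -> {lanes} CXP-6 lane(s) needed")
```

For this hypothetical camera the raw data rate is just under 8 Gbit/s, so two CXP-6 lanes would be needed, and the cabling and frame grabber cost scales accordingly.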
CAMERALINK
The CameraLink standard was launched in 2000 by the Automated Imaging Association (AIA) and has been upgraded progressively to support higher data speeds, with some versions requiring two cables for transmission. The three main configurations are Base (2.04 Gbit/s), Medium (5.44 Gbit/s) and Deca/Extended (6.8 Gbit/s). Like CXP interfaces, CameraLink requires frame grabbers, which additionally need to be compatible with the Power over Camera Link (PoCL) standard in order to supply power. CameraLink lacks any error-correction or resend capability, requiring expensive and cumbersome cable setups to maximise signal integrity and so minimise dropped images.

Your requirements for resolution, frame rate, cable length and host system configuration should all be considered to ensure you get the performance you require without spending more than you need. FLIR's machine vision cameras support all three trusted and widely available interfaces.
CONSUMER INTERFACES
These interfaces enable machine vision cameras to connect with host systems using widely available USB and Ethernet standards. For most machine vision applications, the USB 3.1 Gen 1 and Gigabit Ethernet consumer interfaces provide a winning combination of convenience, speed, simplicity and affordability. Furthermore, consumer interfaces are supported by widely available hardware and peripherals: USB and Ethernet hubs, switches, cables and interface cards can be purchased anywhere, and most PCs, laptops and embedded systems include at least one port each of Gigabit Ethernet and USB 3.1 Gen 1. The most obvious difference between these categories of interfaces is their bandwidth. A faster interface enables higher frame rates for a given resolution (Fig. 1), letting you capture more images each second or capture higher-resolution images without sacrificing throughput.
Fig. 1. Bandwidth available for each interface vs. sensor resolution and the resulting frame rate.
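To make the relationship in Fig. 1 concrete, here is a minimal sketch in Python. The USB and GigE bandwidth figures are the image-data figures quoted in this article; the 10GigE figure is the nominal link rate, so real-world frame rates will be somewhat lower, and the example sensor is chosen only for illustration.

```python
# Approximate usable image-data bandwidth per interface (Gbit/s), as quoted
# in this article for USB 3.1 Gen 1 and GigE; the 10GigE figure is the
# nominal link rate, so achievable frame rates will be somewhat lower.
INTERFACE_GBPS = {"USB 3.1 Gen 1": 4.0, "GigE": 1.0, "10GigE": 10.0}

def max_frame_rate(width_px, height_px, bit_depth, interface):
    """Rough upper bound on frame rate for a given resolution and interface."""
    bits_per_frame = width_px * height_px * bit_depth
    return INTERFACE_GBPS[interface] * 1e9 / bits_per_frame

# Example: a 5 MP (2448 x 2048) sensor with 8-bit pixels
for name in INTERFACE_GBPS:
    print(f"{name:>14}: ~{max_frame_rate(2448, 2048, 8, name):.0f} fps")
```

For a 5 MP, 8-bit sensor this works out to roughly 25 fps over GigE, 100 fps over USB 3.1 Gen 1 and around 250 fps over 10GigE, which is the trade-off Fig. 1 illustrates.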
UNIVERSAL SERIAL BUS (USB)
USB is everywhere. Most USB machine vision cameras use the USB 3.1 Gen 1 interface, which provides up to 4 Gbit/s of image data bandwidth between the camera and the host system. The USB3 Vision standard helps ensure compatibility between a wide range of cameras and software by defining a common set of device detection, image transfer and camera control protocols.

USB supports Direct Memory Access (DMA). With this DMA capability, image data can be transferred from the USB controller directly into memory, where it is available for use by software. DMA, coupled with the widespread support for USB and the availability of drivers for USB controllers on virtually any hardware platform, makes USB ideal for use in embedded systems. USB 3.1 Gen 1 can also simplify system design by supplying up to 4.5 W of power to a camera.

High-flexibility USB cables help maximise the lifespan of cables in systems where the camera must be moved repeatedly. Active optical cables (AOCs) may be used to greatly extend the working distance and provide Electromagnetic Interference (EMI) resistance. The performance of active optical cables depends on the throughput requirements and the host system configuration. When using optical cables, even those that supply power via the cable, FLIR recommends powering cameras externally via GPIO. Additionally, locking USB cables provide a secure connection between cables, cameras and host systems. Prior to purchasing locking cables, FLIR recommends checking the locking screw position and spacing compatibility, as several options are available.

Fig. 2. USB 3.1 Gen 1 cable (USB to locking USB)

Fig. 3. Different types of USB connectors
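For completeness, here is what talking to a USB3 Vision camera typically looks like in software – a minimal sketch assuming FLIR's Spinnaker SDK and its Python bindings (PySpin) are installed. The call names follow FLIR's published acquisition examples, but treat them as assumptions to check against your SDK version; the same code also enumerates GigE Vision cameras.

```python
import PySpin

# Discover attached cameras (USB3 Vision and GigE Vision devices alike)
system = PySpin.System.GetInstance()
cam_list = system.GetCameras()
print(f"Cameras detected: {cam_list.GetSize()}")

if cam_list.GetSize() > 0:
    cam = cam_list.GetByIndex(0)
    cam.Init()                      # open the device and build its node map
    cam.BeginAcquisition()          # start streaming
    image = cam.GetNextImage(1000)  # wait up to 1000 ms for a frame
    if not image.IsIncomplete():
        print(f"Grabbed a {image.GetWidth()} x {image.GetHeight()} frame")
    image.Release()
    cam.EndAcquisition()
    cam.DeInit()
    del cam

cam_list.Clear()
system.ReleaseInstance()
```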
GIGABIT ETHERNET (GIGE)
GigE provides up to 1 Gbit/s of image data bandwidth. Its combination of simplicity, speed, 100 m maximum cable length and ability to supply power to cameras over a single cable makes it an extremely popular camera interface. Ethernet cables are available with robust shielding, which is ideal for environments with high electromagnetic interference, such as close proximity to the powerful motors found in some robots and metrology equipment. FLIR GigE cameras also support a packet resend feature, which further boosts transmission reliability. Unlike USB, GigE does not support DMA: packets containing image data are transmitted to the host, where they must be reassembled into image frames before being copied to software-accessible memory. This process is trivial for modern PCs, though it may introduce latency on low-power embedded systems with limited resources. The widespread adoption of Gigabit Ethernet across many industries means there is an incredibly wide range of supporting products, from cables to switches, ready to meet any project requirement, as well as many specialised cables and connectors for particular use cases. Some industrial devices use an X-Coded M12 connector (Fig. 5, right) for increased shielding; however, for most applications the familiar RJ-45 connector is good enough and offers greater convenience at lower cost.

Fig. 4. Gigabit Ethernet / GigE cable (RJ45 to RJ45)

10GIGABIT ETHERNET (10GIGE)
10GigE builds on the strengths of GigE by increasing the bandwidth to 10 Gbit/s, making it an ideal interface for high-resolution 3D scanning, volumetric capture and precision metrology. GigE and 10GigE can be combined in numerous ways: multiple GigE cameras can be connected to a 10GigE switch, allowing them all to run at full speed over a single 10GigE port on the host system. 10 Gbit/s is a lot of data. Modern PC systems with high-speed CPUs, PCIe 3.0 and dual-channel memory can handle this well, while higher-performance systems can support multiple 10GigE cameras. Embedded systems with reduced system resources will generally lack the memory bandwidth and processor speed required to keep up with the incoming image data.
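As a quick sanity check on that aggregation idea, a small sketch (Python; the camera resolution and frame rate are hypothetical) estimates how many fully loaded GigE cameras a single 10GigE uplink can carry:

```python
def cameras_per_10gige_uplink(width_px, height_px, bit_depth, fps,
                              uplink_gbps=10.0, per_camera_cap_gbps=1.0):
    """How many GigE cameras, each streaming freely, fit on one 10GigE port."""
    per_camera_gbps = min(width_px * height_px * bit_depth * fps / 1e9,
                          per_camera_cap_gbps)  # each camera is capped by its 1 Gbit/s link
    return int(uplink_gbps // per_camera_gbps), per_camera_gbps

# Hypothetical cameras: 2.3 MP (1920 x 1200), 8-bit, 40 fps over GigE
count, per_cam = cameras_per_10gige_uplink(1920, 1200, 8, 40)
print(f"Each camera streams ~{per_cam:.2f} Gbit/s -> "
      f"about {count} cameras per 10GigE uplink")
```

In this hypothetical case each camera produces roughly 0.74 Gbit/s, so around a dozen of them could share one 10GigE port before protocol overhead and burstiness are taken into account.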
SUMMARY
Both consumer and dedicated interfaces are used across many machine vision applications, and the pros and cons outlined above determine which is best suited to a specific use case. However, the combination of performance, ease of use, widespread availability and low cost makes consumer interfaces an attractive choice for most machine vision applications. MV
Fig. 5. The ubiquitous RJ45 connector (left) and the less common X-Coded M12 connector (right)
CURRENT CANDIDATES THAT WE ARE WORKING WITH EXCLUSIVELY:

Functional Safety Engineer
- Degree-educated, TÜV-certified engineer with over 20 years' experience in the automation industry, spanning pharmaceuticals (batch), chemicals, oil & gas and robotics.
- Experienced with the full project lifecycle, including hands-on programming on Siemens PCS 7 comprising redundant F&H safety controllers.
- Recently completed a project with over 10,000 I/O from scratch, from the FDS with the client right through to start-up and handover.
Robotics Engineer
- Degree-educated robotics engineer with over 10 years' experience in the automation and robotics space, focusing on the pharmaceutical, automotive, baggage handling and specialist industries.
- Technically astute with various systems including 6-axis robots, WinCC 7.4, TIA Portal, Omron HMI graphics, Omron Sysmac, KUKA, Stäubli and Modbus.
- Most recent project was upgrading the baggage handling system at a well-known airport using Omron and KUKA, installing new controllers and upgrading existing systems.
PLC Engineer
- Degree-educated PLC engineer with over 15 years' experience in the controls and automation industry, spanning robotics, water and chemicals.
- Hands-on, full-lifecycle experience with various controllers including Omron, Rockwell's Allen-Bradley and Siemens S7 PLCs.
- Recently completed the upgrade of a sewage plant, upgrading existing Allen-Bradley controllers while installing new ControlLogix and Wonderware SCADA.
DCS Engineer
- Masters-educated DCS engineer with over 15 years' experience in the controls and automation industry, predominantly focused on the pharmaceutical space.
- Full project lifecycle experience with Emerson's DeltaV DCS, with in-depth FDS and programming experience in the S88 batch environment.
- Most recent project was adding new lines to an existing plant, which involved reverse-engineering existing controllers, installing new systems, and testing and commissioning DeltaV.
CURRENT LIVE JOBS THAT YOU CAN APPLY FOR TODAY:
We are currently partnered with a number of organisations that are looking to expand their high-performing engineering departments throughout the country. Below are a few examples of opportunities that are currently live.
PLC Engineer (South East England)
A pioneer in the water and wastewater industry is looking to expand its jewel-in-the-crown automation engineering department with a junior PLC engineer. This is an opportunity for an engineer to grow within a reputable organisation in the water industry and get certified on multiple systems while executing greenfield projects. The successful candidate for this role must have:
- A degree in engineering (ideally Automation or Mechatronics).
- Hands-on programming experience with either Siemens S7 or Rockwell Allen-Bradley.
- The ability to work under pressure and to a deadline.
DCS Engineer (Midlands)
We are currently teamed with an established system integrator in the Midlands region that is looking for an experienced DCS engineer to join its ranks. The engineer will be joining an established team and will have the opportunity to get trained on new systems while working on some of the largest projects in the pharmaceutical, water and chemical industries. The successful candidate for this role must have:
- Hands-on experience with Emerson DeltaV (essential).
- Hands-on programming experience, ideally in the pharmaceutical/life sciences industries.
- Experience within the batch processing environment (not essential).
- The ability to work overseas when necessary.
Building Automation Service Engineer
We are currently partnered with a number of building automation companies that are seeking to add to their talented servicing departments in the South East. The roles will predominantly focus on the Trend and Tridium software platforms, where you will be tasked with reverse-engineering both software and hardware to overcome each task. The successful candidate for this role must have:
- Hands-on experience with Trend, Tridium and Schneider's StruxureWare platforms.
- Certifications for the specific courses required to work on these systems.
- A valid driving licence.
- The ability to work alone and as part of a wider team.
Building Automation Commissioning and Project Support
I am currently supporting a well-known organisation in its search for a new commissioning engineer to join its building automation department in the South East. The role will be to support the projects team on the back end of projects through test and commissioning tasks, alongside executing smaller projects. This is a great opportunity for an engineer to gain more project exposure, with a clear plan for the engineer to work their way into the projects department. The successful candidate for this role must have:
- Hands-on experience with Trend or Tridium systems.
- Experience with commissioning and servicing.
- Experience with smaller projects (not essential).
- A valid driving licence.
If any of these opportunities are of interest to you or somebody you know, please reach out to me at sam.williams@heatrecruitment.co.uk for a confidential conversation about your current situation and to see where my team and I can assist you moving forward.
NOVEMBER IN A NUTSHELL
So, November has come and gone along with Lockdown 2.0, but what has happened in the automation world? Well, there has been great news, with more and more projects being given the green light to go ahead in both December and 2021; it feels like we are almost getting back onto the course of normality, whatever that may be. Cougar Automation has been recognised as one of the top organisations to work for in the UK, and Rockwell Automation has acquired yet more cyber-security firms; they are shaping up to be a driving force in the Industry 4.0 shift towards interconnectivity. I for one am excited to see how the end of the year pans out and especially intrigued to see what Q1 2021 has to offer.
WHAT TO DO NOW?
I am always happy to talk with professionals within the industry, so if you are looking for your next role, want some advice, or are looking to expand your workforce heading into 2021, let's have a conversation. Drop me a message at sam.williams@heatrecruitment.co.uk and we can organise a time that works best.
TELEPHONE: 0330 335 8347
Visit the Heat Recruitment website for more details of these and hundreds of other jobs: www.heatrecruitment.co.uk
What Type of Strobe Controller is Best for Machine Vision?
Shifting illumination conditions can create massive problems for machine vision software. Achieving consistent illumination is a core requirement.
But what type of controller is best?
A strobe controller will guarantee perfect illumination of the object and help improve the speed and reliability of the process. It may also save costs.
The illumination an LED produces is proportional to the current flowing through it, so the most logical way to drive LEDs is via a constant current controller. Current control has many benefits but sometimes needs careful power management. An alternative is a Pulse Width Modulation (PWM) controller, which is based on voltage drive and may be susceptible to variations in the power supply. PWM can also be less reliable at speed.
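To illustrate why voltage (PWM) drive is more sensitive to supply variation than constant-current drive, here is a minimal sketch using a simple series-resistor LED model; the forward voltage, resistor value and supply figures are illustrative assumptions, not figures from any particular controller.

```python
def led_current_voltage_drive(v_supply, v_forward=3.0, r_series=2.0):
    """Series-resistor (voltage-drive) model: I = (Vsupply - Vf) / R."""
    return max(v_supply - v_forward, 0.0) / r_series

def led_current_constant_drive(i_set=1.0):
    """An ideal constant-current controller holds the set-point regardless of supply."""
    return i_set

nominal, low, high = 5.0, 4.75, 5.25   # assumed +/-5% supply variation
for v in (low, nominal, high):
    print(f"Supply {v:.2f} V -> voltage drive {led_current_voltage_drive(v):.3f} A, "
          f"constant current {led_current_constant_drive():.3f} A")
# With these numbers the voltage-drive current swings by about +/-12.5%;
# light output follows the current, so brightness tracks the supply,
# whereas the constant-current output does not move at all.
```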
The choice of strobe controller is an important one. You can learn more about the different types of controller at www.gardasoft.com/voltage-drive-current-drive/
Semiconductor | PCB Inspection | Pharmaceuticals | Food Inspection
Telephone: +44 (0) 1954 234970 | +1 603 657 9026
Email: vision@gardasoft.com
www.gardasoft.com
FILTERS: A NECESSITY, NOT AN ACCESSORY.
INNOVATIVE FILTER DESIGNS FOR INDUSTRIAL IMAGING
MIDOPT.COM
TECHNICAL RESOURCE GUIDE FOR IMAGING SOLUTIONS The resource guide contains over 50 pages of in-depth technical information covering machine vision fundamentals as well as many advanced imaging topics. Learn how to choose the right lens for your application and get the best out of your system.
TOPICS INCLUDE:
1. Getting Started
2. Understanding Lens Specifications
3. Understanding Lens Design Limitations
4. Real World Performance
5. Telecentricity and Perspective Error
6. Advanced Design Concepts
7. Understanding Hardware
8. Liquid Lenses in Machine Vision
9. Filtering in Machine Vision
10. Using Infinity Corrected Objectives
11. Cameras
12. Illumination

[Pictured: the 2020 Imaging Optics catalogue, designed and manufactured by Edmund Optics®, featuring the industry-leading LH Series fixed focal length lens for 120 MP APS-H format sensors. Global solutions and engineering support when you need it: Expertise (application-focused engineering and design), Testing (imaging performance and ruggedization for harsh environments) and Assembly (component and custom manufacturing). www.edmundoptics.eu/eo-imaging]
The 180-page imaging resource guide also contains Edmund Optics' full range of imaging products, complete with detailed specifications and pricing.
REQUEST YOUR FREE CATALOG TODAY!
OR SCAN THIS QR CODE!
EMAIL: info@edmundoptics.eu I PHONE: +44 (0) 1904 788 600 OR GET YOUR COPY AT: www.edmundoptics.eu/catalog
WE SPEAK 6 LANGUAGES – HOW TO REACH US*
+44 (0) 1904 788 600 | +49 (0) 6131 5700-0 | +33 (0) 820 207 555 | +44 (0) 1904 788 600
+39 800 875 211 | +34 914 197 354 | +7 499 350 3901 | FAX: +44 (0) 1904 788 610
www.edmundoptics.eu | sales@edmundoptics.eu
*if you are in the country indicated by the symbol, please call the respective number.

ONLINE CHAT: www.edmundoptics.eu/contact, Monday - Friday, 2:00 - 24:00 CET (after official business hours in English only)
APPLICATION SUPPORT: +44 (0) 1904 788 600, techsup@edmundoptics.eu, Monday - Friday, 9:00 - 18:00 CET