Editor: Lou Reade Advertising Sales: David Chadd
Summer 2022
Industry update from the UK Industrial Vision Association
Making more of machine vision
Editor: Dr Denis Bulgin Advertising Sales: David Chadd
After the nightmare of Covid lockdowns, it was a relief for many to meet face-to-face and swap ideas at the Machine Vision Conference (MVC) earlier this year. For 2023, UKIVA is planning to double the duration of the show – and co-locate it with a robotics event.
Live events have always been the lifeblood of manufacturing – from major exhibitions at the NEC to small, focused events that target a particular industry. This is true of MVC, which has maintained a particular character since its inception. “First and foremost, it’s a conference – and that’s not going to change,” says Neil Sandhu, UKIVA chairman.
However, change is in the air as UKIVA plans a unique new event for next year. MVC 2023 will run for two days, in parallel with Automation UK – a new show on robotics organised by the British Automation and Robot Association (BARA). For Sandhu, this is a huge opportunity. “There’s a symbiosis between robotics and machine vision,” he says. “Running these events side-by-side makes perfect sense.”
The two events will run at the CBS Arena in Coventry on 20-21 June 2023. While MVC will remain a conference with an attached exhibition, Automation UK is intended to be a “national automation and robotics show”.
The new two-day MVC offers those interested in machine vision more choice and flexibility. For starters, visitors can choose which day to attend (or simply attend both). This will allow them to gain greater insight into machine vision systems – as well as robotics and automation. Sandhu thinks the two-day event will help to boost interest in both areas. “The time is right for MVC to become a two-day event,” says Sandhu. “Combining it with Automation UK makes for a stronger show.”
CONTENTS Application Articles ......................... 3-10 Highlights from MVC 2022 .............. 13-16 UKIVA Member Directory ..................... 16 Product Updates ........................... 18-27
To contribute to Vision in Action, email the editor on ViAeditor@outlook.com. For advertising enquiries, contact David Chadd (david.chadd@ppma.co.uk) on +44 (0)20 8773 5505.
2D/3D Profile Sensors
Automated Weld Seam Guidance In Robot Cells
In fully automatic robot welding cells, the exact position of each joint must be determined before welding. For this purpose, a 2D/3D profile sensor is mounted on the robot, directly in front of the welding torch. The design, created especially for this application, is particularly compact in order to minimise the space required for mounting on the welding torch. The sensor detects the joint via laser triangulation. The guide point is determined by the uniVision software and sent to the control system, which applies a path correction so the weld seam is placed accurately.
2D/3D Profile Sensor weCat3D MLZL
• All-in-one solution: sensor, software & interfaces in one
• Compact housing design specifically for welding torches
• High profile quality (1,280 points per profile)
• Integrated cooling and flushing
wenglor sensoric ltd. Unit 2, Beechwood | Cherry Hall Road | Kettering Business Park | Kettering, Northants NN14 1UE | Tel. +44 (0)1536 313580 | info.uk@wenglor.com
ASSOCIATION NEWS
MVC EXPANDS INTO A TWO-DAY EVENT
The thing with conferences is this: no sooner has one finished, plans begin for the next edition. This is true of the Machine Vision Conference, though next year delegates can look forward to a longer, more extensive event.
As well as running across two days, MVC 2023 will be co-located with Automation UK – a new event on robotics and automation, organised by the British Automation and Robot Association (BARA). BARA, like UKIVA, is part of PPMA. The new event will run at the CBS Arena in Coventry on 20-21 June 2023.
“Automation UK and MVC will be the largest annual event in the UK to showcase leading robotic and vision suppliers,” says Mark Stepney, a PPMA board member. “Its legitimacy is underlined by the backing of two leading industry associations – BARA and UKIVA.” The convergence of vision and robotics means that the overlap between the two markets continues to grow. While the two events run side by side, they will be quite different in nature: Automation UK is squarely an exhibition – with many exhibitors showcasing robotics and automation offerings on larger stands – whereas MVC will continue to be a conference surrounded by a compact exhibition. As in previous years, booths at MVC will be identical in size.
Despite the obvious synergy between them, the two events will be physically separated – and not simply occupy opposite ends of the same exhibition hall. In this sense, MVC will continue to be self-contained – yet offer interested delegates a huge opportunity to investigate robotics and automation technologies in greater detail.
The event format of MVC is unlikely to change radically – in that it will still have an extensive conference programme with keynote speakers. However, running it over two days gives UKIVA potential options – such as whether to run presentations on both days. These, and other ideas, are still being considered. David Harrison, CEO of PPMA, says: “There are many benefits to running MVC and Automation UK together. It doubles up on reasons to visit the overall event.”
FOR MORE INFORMATION: MVC 2023: www.machinevisionconference.co.uk AUTOMATION UK: www.automation-uk.co.uk
APPLICATION ARTICLES
Editorial material in this section is provided by UKIVA Members. Content accuracy is the responsibility of individual UKIVA Members.
ACTIVE SILICON
www.activesilicon.com
Global players use automated optical inspection
Active Silicon says that “several global players” are using its automated optical inspection (AOI) systems to visually inspect PCBs – in order to detect imperfect ones, or identify those that should be removed from the production line for repair.
The company’s Camera Link and CoaXPress frame grabbers are suitable for industrial environments where light sources can be controlled. As components get smaller, higher image resolution and faster inspection are necessary. This means faster imaging – which has seen more manufacturers moving to high-speed CoaXPress cameras and frame grabbers, says the company.
In addition, many production lines capture both 2D and 3D images, which increases image processing time. High-speed image acquisition helps keep inspection time at the optimum level. Typically, images are acquired synchronised to the manufacturing line and lighting system. Strobing of the lighting system ‘freezes’ the motion of the part, or the part is momentarily stopped and the camera triggered under continuous lighting.
Several “global players” are using Active Silicon’s AOI systems to inspect PCBs
There are three main variations of programming an AOI system. The first, template matching, involves pre-examination of a ‘golden’ board – against which other boards are compared. The second, pattern matching, makes comparisons with good and bad examples that the system has already learnt. Thirdly, statistical pattern matching uses statistical methodologies to decide which deviations from the norm are acceptable, and which will result in rejection. Images and data about PCBs are programmed into the AOI system to train it to know what to look for, and the system can be up and running quickly in most production lines.
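As a rough illustration of the first of these approaches, template matching against a ‘golden’ reference can be sketched in a few lines of OpenCV. This is a generic, hedged example – not Active Silicon’s software – and the file names and acceptance threshold are assumed values.

```python
# Illustrative 'golden board' template match - a generic sketch, not Active Silicon's
# AOI software. File names and the pass threshold are assumptions.
import cv2

golden = cv2.imread("golden_board.png", cv2.IMREAD_GRAYSCALE)      # known-good reference
board = cv2.imread("board_under_test.png", cv2.IMREAD_GRAYSCALE)   # captured image

# Normalised correlation of the reference against the captured image
scores = cv2.matchTemplate(board, golden, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_loc = cv2.minMaxLoc(scores)

PASS_THRESHOLD = 0.9   # assumed; tuned per product and lighting setup in practice
print(f"best match {best_score:.2f} at {best_loc}")
print("PASS" if best_score >= PASS_THRESHOLD else "FAIL - route board for repair")
```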
IDENTIFY DIRECT
www.identifydirect.com
Inspecting aerosol cans for pharmaceutical use
A major aerosol manufacturer recently commissioned Identify Direct to develop an on-line vision system. The system visually inspects laser-etched engravings on the bottom of aerosol cans – which are manufactured for the pharmaceutical industry. Several key issues had to be solved in order to read the cans quickly and accurately. Firstly, the writing and DataMatrix were both laser-etched – rather than being printed – meaning it had very low contrast. The writing was not straight across, but in an annulus around the base of the can. In addition, the bottom of the aluminium can was concave and shiny.
Identify Direct developed a ‘flat dome’ to accurately read laser etchings on aluminium cans
Inspection accuracy needed to be 100%, despite the cans moving along a production line and varying their positions across the track. Identify Direct developed an in-house ‘flat dome’ which – combined with a classic dome and ring light – gave a consistent image on all the cans. The company also developed software based on a ‘polar unwrap’ tool. This used the central DataMatrix mark as a ‘locator’ to position the polar unwrap region of interest (ROI) correctly on each can, with a DALSA camera capturing the images.
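The ‘polar unwrap’ step can be pictured with OpenCV’s warpPolar, which remaps an annulus of text into a straight strip that standard reading tools can handle. This is a minimal sketch of the general technique, not Identify Direct’s software; the centre, radius and output size are assumed values.

```python
# Minimal sketch of a 'polar unwrap' around a central locator mark, using OpenCV's
# warpPolar. Illustrative only - not Identify Direct's software - and the centre
# coordinates, radius and output size are assumed values.
import cv2

img = cv2.imread("can_base.png", cv2.IMREAD_GRAYSCALE)

centre = (640.0, 512.0)   # assumed centre of the DataMatrix 'locator' mark, in pixels
max_radius = 400.0        # assumed outer radius of the annulus of text, in pixels

# Unwrap the annulus into a rectangular strip so OCR/verification tools can read it
unwrapped = cv2.warpPolar(img, (1024, 256), centre, max_radius,
                          cv2.WARP_POLAR_LINEAR + cv2.INTER_LINEAR)
cv2.imwrite("can_base_unwrapped.png", unwrapped)
```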
ADVERTORIAL
Intelligent robotics for laundries closes automation gap
The textile and garment industry is facing major challenges with supply chains and energy. Recovery is also threatened by factors that hinder production, such as labour and equipment shortages, which put companies under additional pressure. The competitiveness of the industry depends on how affected companies respond to these conditions. One solution is to move the production of clothing back to Europe: shorter transport routes, and significant savings in transport costs and greenhouse gases, speak in favour of this. On the other hand, higher wage costs and the prevailing shortage of skilled workers in Europe must be offset. The German deep-tech start-up sewts GmbH has focused on the great potential that lies in the automation of textile processing. It develops solutions with which robots – much like humans – anticipate how a textile will behave, and adapt their movement accordingly.
In the first step, sewts has set its sights on an application for large industrial laundries. With a system that uses both 2D and 3D cameras from IDS Imaging Development Systems, the company is automating one of the last remaining manual steps in large-scale industrial laundries: the unfolding process. Although 90% of the process steps in industrial washing are already automated, the remaining manual operations account for 30% of labour costs. The potential savings through automation are therefore enormous at this point.
“The particular challenge here is the malleability of the textiles,” explains Tim Doerks, co-founder and CTO. While the automation of the processing of solid materials, such as metals, is comparatively unproblematic with robotics and AI, available software solutions and conventional image processing often still reach their limits when it comes to easily deformable materials. Accordingly, commercially available robots and gripping systems have so far only been able to perform simple operations – such as gripping a towel or piece of clothing – inadequately. The sewts system VELUM changes this. With the help of intelligent software and easy-to-integrate IDS cameras, it is able to analyse dimensionally unstable materials such as textiles. Thanks to the new technology, robots can predict the behaviour of these materials during gripping in real time. This enables VELUM to feed towels and similar terry-cloth linen easily and crease-free into existing folding machines, closing a cost-sensitive automation gap.
“We need a 3D camera that is cost-effective, because we use two to three 3D cameras depending on the system configuration. Above all, it must ensure high accuracy of the depth data,” says Tim Doerks. “Beyond that, we need 2D cameras that are light-sensitive, deliver high dynamic range and are suitable for use in a multi-camera system.” The founders found what they were looking for in the IDS portfolio: for the VELUM multi-camera system, the choice fell on the new Ensenso S10 3D camera as well as models from the uEye CP camera series. Their task is to identify, both in 2D and 3D, interesting features and gripping points of the textiles, which are fed into the system after washing and drying in an unordered manner – in a container or on a conveyor belt. The shape and position of the individual objects cannot be predicted. The cameras capture the different textures of the materials. They distinguish which hems there are on a towel and where its corners are. “We match the images from the 2D and 3D cameras to have a higher 2D resolution together with the 3D data. So we use the respective advantages of the 2D camera, in this case the higher resolution, and the 3D camera, i.e. the precise depth data.” Equipped with a 1.6 MP Sony sensor, the Ensenso S10 uses a 3D process based on structured light: a narrow-band infrared laser projector produces a high-contrast dot pattern, even on objects with difficult surfaces or in dimly lit environments. Each image captured by the sensor provides a complete point cloud with up to 85,000 depth points. Artificial intelligence enables reliable assignment of the detected laser points to the hard-coded positions of the projection.
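Matching 2D and 3D data in this way generally comes down to projecting each point of the 3D cloud into the 2D camera’s image using the calibrated pose between the two cameras. The sketch below shows that pinhole-projection step in NumPy; it illustrates the principle only – it is not sewts’ or IDS’s code – and all calibration values are placeholders.

```python
# Illustrative sketch: map 3D points from a depth camera into a 2D camera image so
# high-resolution texture can be combined with depth data. All calibration values
# below are placeholders, not real camera parameters.
import numpy as np

K = np.array([[2400.0, 0.0, 1224.0],     # assumed 2D-camera intrinsics
              [0.0, 2400.0, 1024.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                            # assumed rotation, depth camera -> 2D camera
t = np.array([0.05, 0.0, 0.0])           # assumed translation in metres

points_3d = np.array([[0.10, -0.02, 0.80],   # example points from the 3D point cloud (m)
                      [0.12,  0.01, 0.82]])

cam = (R @ points_3d.T).T + t            # transform into the 2D camera's frame
uv = (K @ cam.T).T                       # pinhole projection
uv = uv[:, :2] / uv[:, 2:3]              # divide by depth to get pixel coordinates
print(uv)                                # where each depth point lands in the 2D image
```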
The complementary uEye CP industrial camera from IDS Imaging Development Systems delivers near-noise-free, high-contrast 5 MP images. It offers maximum functionality with extensive pixel pre-processing and is perfect for multi-camera systems thanks to the internal 120 MB image memory for buffering image sequences. At around 50 g, the small magnesium housing is as light as it is robust, making the camera ideal for space-critical applications and for use on robot arms. With systems like VELUM, laundries can significantly increase their throughput regardless of the staffing situation. “By closing this significant automation gap, we can almost double the productivity of a textile washing line,” explains CEO Alexander Bley.
www.ids-imaging.com
APPLICATION ARTICLES MATROX IMAGING
www.matrox.com
Brake disk assessment relies on efficient algorithms
Mosaic, an Italy-based specialist in industrial vision systems, recently designed a solution to inspect the surface quality of brake disks.
Its aim was to find a way of spotting and removing inferior products from the production line more reliably. “It’s natural that products will be assessed differently by different operators,” says Marco Pistilli, machine-vision and software developer at the company. “With a vision system, inspections are repeatable 24 hours a day and the machines perform faster than any operator.”
An operator can quickly assess brake disk quality on the moving line
The system – which finds defects that are almost invisible to the naked eye – relies on flowchart-based Matrox Design Assistant X vision software. The software runs on an HP workstation equipped with an Intel Core i9 processor and three Matrox Concord PoE frame grabbers – which acquire and process images from nine cameras. A custom-built lighting system uses filters on camera lenses and spotlights. The system interacts with a programmable logic controller (PLC) via fast-scan modules and an encoder, while hardware triggers come directly from the PLC. There were three main challenges during development. Firstly, Mosaic wanted to inspect every surface angle without stopping production. A short cycle time made physical displacement of the disk challenging. Using nine different cameras – each with a different shooting angle – helped to overcome this. Secondly, using the same vision setup for more than 200 different disk models required an effective ‘background removal’ algorithm. The same logic and processing algorithm processed all disk models, removing the need to redefine regions of interest. “Matrox Design Assistant X let us create efficient algorithms for analysis – implementing loops and reconfiguring steps dynamically to obtain the best performance results,” says Pistilli. Finally, precise image acquisition was critical. Using the Operator View, operators can manage recipes, create different sets of parameters for each camera, and customise the image colour-map to best match their needs. A filmstrip recalls previous pictures and relative datasets for further analysis.
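The general idea behind ‘background removal’ can be illustrated by differencing each image against a reference shot of the empty fixture, so that the same downstream logic can run on any disk model. The OpenCV sketch below is a simplified, generic illustration – not the algorithm Mosaic built in Matrox Design Assistant X – and the file names and threshold are assumptions.

```python
# Generic illustration of 'background removal' by differencing against a reference
# image. A simplified sketch, not Mosaic's Design Assistant X implementation.
import cv2

reference = cv2.imread("empty_fixture.png", cv2.IMREAD_GRAYSCALE)  # assumed background shot
frame = cv2.imread("disk_on_fixture.png", cv2.IMREAD_GRAYSCALE)    # assumed live image

diff = cv2.absdiff(frame, reference)                        # pixels that changed vs. background
_, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)   # assumed threshold value
foreground = cv2.bitwise_and(frame, frame, mask=mask)       # keep only the disk pixels
cv2.imwrite("disk_foreground.png", foreground)
```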
OMRON
www.industrial.omron.co.uk
Automated inspection of curtain fabric boosts quality
Vadain, a Netherlands-based manufacturer of custom curtains, has introduced a machine vision system to automate the inspection of incoming fabrics. Although staff are highly skilled in spotting errors, the process is time-consuming and inefficient – especially as Vadain stocks “tens of kilometres” of fabric. The company says that the new system will speed up the detection of flaws in fabric – and avoid costly defects in the final product.
The system was developed with Omron, Sycade and machine builder Eisenkolb.
Vadain has automated the inspection of its curtain fabrics
The system performs checks quickly and accurately, and reduces complaints and cutting losses – which ultimately results in cost savings. Sycade set up a solution in which unrolled fabric passes over an assessment surface with an integrated cutting unit – located inside a dark unit with vision technology from Omron. The lighting and camera inspection system detects small deviations, thanks to intelligent custom software from Sycade and ‘customised’ standard hardware from Omron. However, finding a fault in a roll of fabric is not enough: fabrics are never of equal thickness or transparency, and come with different weave structures, colours and reflectivity. This means that tests must be carried out to determine the correct light and camera adjustments and programming settings. For a stain or weaving error, the machine identifies the location of the defect – including how much fabric has been unrolled up to that point. After cutting at the correct location, the partial roll is marked with a sticker. This gives an overview of how many metres of flawless fabric are in stock, and ensures that the workshop knows exactly which partial – and error-free – roll can be used most efficiently for an order. The total of the sub-rolls remains linked to the original length of the parent roll, making it easy and efficient to re-order fabrics. As well as spotting fabric errors, the solution has reduced work in the warehouse. Because partial rolls are now precisely measured, registered and pre-cut, complete rolls no longer need to be picked up, cut and returned to the warehouse. This has saved 50% in loading and unloading time.
MACHINE VISION CAMERAS AND COMPONENTS FOR AUTOMATION, CONTROL AND QUALITY INSPECTION
Do you need to read or verify Barcodes and Labels reliably at high speeds? Our powerful Machine Vision Software Solutions provide the answers.
Do you need to track and trace, decode and carry out complex quality inspection checks on your process line? We provide Industrial scanners and Machine Vision Systems for your applications. Do you need Multispectral or Hyperspectral inspection for your process line? ALRAD supplies Vision Systems for all wavelengths from Ultraviolet through visible, near-infrared, shortwave infrared (SWIR), thermal (LWIR) and terahertz. Do you need to monitor ‘Hot Spots’ for pre-emptive maintenance or design inspection? Our Thermal Vision systems give you the hidden picture.
Do you need to inspect or measure in 3D or Align your production processes? Please call us to discuss how our StingRay Light Pattern Generating Lasers can help to provide your solution. ALRAD INSTRUMENTS LIMITED has been providing high quality components and scientific equipment to the OEM market, industry and research for the past 50+ years. Our technology areas cover: Imaging, Electronics, Photonics, Vacuum, Thermal and Logistics products.
Telephone: 01635 30345 Email: sales@alrad.co.uk Web: www.alrad.co.uk
The NEW ImageIR® 6300 Z
Compact zoom camera and new SWaP detector (Size, Weight and Power)
Smaller, lighter and without time-consuming lens changes. The ImageIR® 6300 Z is predestined for demanding applications in research and development as well as for stationary or airborne inspection and monitoring tasks. Further applications can be found in quality assurance, materials testing or in a wide range of OEM integration solutions. Features:
■ Cooled focal plane array photon detector operating in snapshot mode with (640 × 512) IR pixels
■ Standard built-in 7.5x zoom lens with motor focus
■ Focal length of (15 ... 115) mm or (25 ... 170) mm
■ Operation via smartphone or tablet with integrated web interface
■ Compact size and low weight for space-saving system integration
■ Storage of large amounts of data on the integrated SSD
FOR MORE INFORMATION
InfraTec distributor in the UK and Ireland: Quantum Design UK and Ireland Ltd, 1 Mole Business Park, Leatherhead, Surrey KT22 7BA Tel: +44 (0)1372 378822 | Email: info@qd-uki.co.uk
APPLICATION ARTICLES SCORPION VISION
www.scorpionvision.co.uk
Trimming vegetables with higher accuracy
At MVC 2022, Scorpion Vision showed how its 3D neural camera can be used to enhance the presentation of fresh produce in supermarkets. The bespoke system uses stereo vision combined with a neural network to improve the way that vegetables are topped and tailed. Carrying this out more accurately leads to less – or zero – waste, and a more appealing appearance on the shelf.
A good example is seen in leek trimming, where it can be difficult to determine the stem plate when it is obscured by roots or other debris. Cutting the leek too short causes it to dry out; cutting it too long leaves unsightly roots behind.
However, a camera with AI can analyse each individual vegetable before deciding how to process it. The machine is shown examples of the stem plate in various conditions, and it will learn what to look for – enabling it to formulate its own conclusion about what it is seeing. “Off-the-shelf cameras with built-in AI are widely available – and attractive from a cost perspective – but won’t match the levels of repeatability that we can guarantee with our bespoke systems,” says Paul Wilson, managing director of Scorpion Vision.
Scorpion’s 3D neural camera improves the trimming of fresh produce
The system on show at MVC was based on the 3D Stinger, designed by Tordivel for use in 3D stereo vision applications. As well as leek trimming, Scorpion systems have been applied to lettuce de-coring, sprout trimming, carrot batoning and swede slicing.
ALRAD INSTRUMENTS
www.alrad.co.uk
UAV thermography in crop cultivation
Cultivating the target properties of any new crop variety is a time-consuming process – and success depends on the correct characterisation of a large number of potential gene resources.
Given this complexity, infrared thermal camera manufacturer Workswell ran experiments to test the genetic resources of minor crops. Those evaluated included winter and spring wheat, winter barley and both forms of triticale. The temperature conditions of the experimental varieties were scanned with a Workswell WIRIS Agro R thermal camera, mounted on a UAV. The flight was pre-programmed via UGCS Ground Station. The data was then processed in the Pix4D program and analysed in the Workswell CorePlayer and Workswell Thermoformat software.
An infrared thermal camera on a drone has boosted the cultivation of minor crops such as winter barley
In 2015-2016, field experiments were carried out: 325 units of winter wheat, 82 units of winter barley and 60 units of winter triticale were planted; 224 wheat genotypes and 10 triticale genotypes were sown within the spring forms. Preliminary analysis showed the benefit of this approach for the process of cultivation and phenotyping – when varieties have specific symptoms and significant differences in both visible and infrared radiation. Thermographic imaging via UAV platforms can offer new possibilities for the accurate description of genetic resources. By extending the classifier range – in terms of the relationship between plants’ thermal behaviour and their transpiration responses – the thermal camera and software tools expand the scope of information available to farmers and cultivators. Alrad supplies the complete range of Workswell infrared thermal cameras, which are available as stand-alone, OEM or ready-to-fly UAV/drone solutions.
CLEARVIEW IMAGING
www.clearview-imaging.com
Automated inspection for sheet steel
UK-based Hydram Engineering has begun using an automated machine vision system from ClearView Imaging to boost quality assurance.
The company converts sheet steel into high-quality components for customers in a number of industries. Until now, it has relied on shop-floor inspectors to check finished parts. However, this is prone to human error – and means staff are not free to perform other tasks. Now, using ClearView’s Vision Box, Hydram puts each part in, lines it up and presses a button to initiate inspection. In a few seconds, it receives a green box (for a passed part) or a red box for a rejected part – with details on why the part was rejected. Vision Box was designed around Matrox Design Assistant X, a flexible vision development environment that ClearView used to build the system’s operator interface. This cohesive interface works as a hub for quality inspection, as it easily connects to label printers for QC labels, barcode scanners, and back-end database systems.
Hydram Engineering is using ClearView’s Vision Box to improve quality assurance
Training was offered to key staff members at Hydram, allowing them to autonomously manage their vision system with as few call-outs as possible. The system has helped Hydram prove part quality to customers.
APPLICATION ARTICLES SICK (UK)
www.sick.co.uk
Deep learning helps Nestlé get the scoop
Nestlé in Germany is using a vision system – trained by deep learning – to check that transparent measuring scoops have been added to cans of food supplement.
Vision system helps Nestlé confirm the presence of a measuring scoop in food supplements
The company is using a Sick picoCam2 2D snapshot camera – which runs the company’s cloud-based deep learning system – to inspect the containers at its plant at Osthofen. Towards the end of the manufacturing process – for products such as sip feeds and supplements – a measuring scoop is added to each container, prior to automated powder filling. Previously, a vision camera with a colour pixel counting tool was used to determine the presence of the plastic scoop – at a speed of over 80 cans per minute. However, Nestlé has since changed to a near-colourless plastic scoop in order to improve recycling rates. The transparent scoop, with its slight grey appearance, is difficult for a conventional image processing system to detect. This is made harder because it sits close to a similarly coloured aluminium foil lid – which is also corrugated, embossed and reflective. Using Sick’s dStudio cloud-based service, neural networks were trained by being shown images of the enclosed scoop in many orientations. Sick’s decision-making algorithm was then downloaded to the camera. Its neural network can be retrained to accommodate new products. The intuitive user interface does not require any specialist knowledge in AI or image processing. If the system detects that a scoop is missing, it stops the line. Once it detects that a scoop has been added, it allows the process to continue without a manual restart. “We conducted tests both in our laboratory and on site to demonstrate just how confident we were with the solution,” said Klaus Keitel, national account manager for strategic customers at Sick. The system – which was easy to implement – helped Nestlé achieve high reliability – and the flexibility to expand the system to run new applications in future.
STEMMER IMAGING
www.stemmer-imaging.com
Applying vision systems to EV battery production
Machine vision systems can be used along the entire production chain of the new generation of electric vehicle (EV) batteries, according to Stemmer Imaging.
Inspection can be automated at every point in the process – from electrode production to packaging, it says: optical quality control can increase inspection accuracy, reduce rejects and lower costs. “For the transition to e-mobility to succeed, uncompromising operational safety and high performance are necessary,” says Baptiste Guldner, managing director of Stemmer Imaging France. “Machine vision provides reliable ways to monitor each step of the process, making battery production more efficient and resource-friendly.”
Machine vision systems can be used at various stages of the EV battery production chain
Stemmer lists 10 stages of the process that can be enhanced through the use of vision systems. For instance – near the start of the process – coatings can be inspected using line scan bars. This helps to identify coating defects and dual-side coating alignment. Later in the process, a 3D inspection system from LMI – which inspects micro-features on small parts – can be used in slicing, stacking guidance and to inspect welds. For such a highly automated process, vision-guided robotics also plays a key role in battery production. “Each step of battery production has its own challenges – and we can deliver successful optimisation solutions for every one of them,” says Guldner.
BYTRONIC
www.bytronic.com
Cutting pouch waste and production line costs
Bytronic has helped a multinational food corporation in Indonesia to reduce waste and cut production line costs caused by downtime from contamination. Its HotSpot SealCheck thermal camera system helped the food company to ensure that pouches made at its facility were correctly sealed. Incorrectly sealed pouches can leak in transit – contaminating an entire delivery – or rupture during production, leading to a line shutdown. “The manufacturer was producing much more waste than it wanted – mainly because there are all kinds of variables in the packaging process,” said John Dunlop, CTO at Bytronic.
These variables included temperature, the heated jaws that seal the pouches and many other factors.
Bytronic’s non-contact thermal camera system ensured that pouches were correctly sealed
Bytronic visited the factory with its portable trial kit, which can distinguish between the baseline temperature of the pouches and the temperature of the plastic after being sealed – which is 175°C. This can identify and isolate any pouches that are incorrectly sealed in 50-100 milliseconds and remove them from the line before they leak or rupture. “This thermal, non-contact inspection method can see things that humans simply can’t,” said Dunlop.
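At its core, the check can be pictured as comparing the seal region of each radiometric frame against the expected sealing temperature. The sketch below illustrates that idea only – it is not Bytronic’s HotSpot SealCheck implementation – and the region of interest, tolerance and frame values are assumptions.

```python
# Simplified illustration of a thermal seal check: compare the seal region of a
# radiometric frame against the expected sealing temperature. Not Bytronic's
# implementation; the ROI, tolerance and frame values are assumptions.
import numpy as np

SEAL_TEMP_C = 175.0      # sealing temperature quoted in the article
TOLERANCE_C = 15.0       # assumed acceptable deviation

frame_c = np.random.uniform(20.0, 30.0, size=(240, 320))   # stand-in thermal frame (deg C)
frame_c[100:110, 80:240] = 172.0                            # simulate a freshly sealed strip

seal_roi = frame_c[100:110, 80:240]                         # assumed location of the seal
ok = np.mean(seal_roi) > (SEAL_TEMP_C - TOLERANCE_C)
print("seal OK" if ok else "reject pouch")
```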
Agile and powerful
Matrox Design Assistant® X is a flowchart-based integrated development environment that takes the gymnastics out of vision application development. This powerfully agile software removes the need for coding and is equally well-suited for simple application development or solving complex vision projects. The same environment also enables communication with various automation equipment and enterprise systems as well as the creation of a web-based operator interface. Field-proven Matrox Design Assistant X software is a perfect match for PCs and smart cameras. Get a leg up on development with an environment that delivers image analysis using deep learning, and provides traditional vision tools to inspect, locate, measure, and read in images and 3D scans.
Matrox Design Assistant X: Powerfully agile flowchart-based vision application development
www.matrox.com/imaging/design_assistant/ukiva
HIGHLIGHTS FROM MVC 2022
This year’s Machine Vision Conference (MVC) – the first since 2020, due to the Coronavirus pandemic – was a welcome return for the industry. While technology developments, customer projects and solution deployments have continued, this has largely happened out of view – as the pandemic prevented them from being showcased. This year’s MVC went some way to addressing this. Six presentation theatres – each themed to a different area of the industry – presented delegates with some of the latest developments. In addition, suppliers presented their technologies and services on the exhibition floor. The seminars covered six separate areas:
• Deep learning & embedded vision
• Optics & illumination
• Systems & applications
• Camera technology
• Understanding vision technology
• 3D vision
In this section, you can read some of the highlights from each theatre. Elsewhere in the issue, we cover some of the products and applications that were presented at the event.
Nearly 400 delegates attended this year’s Machine Vision Conference in Milton Keynes
This is not an exhaustive review of everything that appeared at MVC. Instead, it gives a flavour of some of the event’s highlights. MVC 2022 delegates may appreciate being informed on elements that they missed, while those who missed the show can get a sense of its scope – which may encourage them to attend MVC 2023.
On that subject, plans are already underway to host next year’s event. MVC 2023 will be held on 20-21 June 2023 at the CBS Arena in Coventry. It runs across two days for a reason: it will be co-located with the new BARA show – called Automation UK – which covers robotics and automation. BARA and UKIVA are sister organisations within PPMA. Some companies are members of both organisations. “It makes perfect sense to run these events together,” says Neil Sandhu, chairman of UKIVA. “Both sets of members will get huge benefit.”
DEEP LEARNING & EMBEDDED VISION
Vision systems can always be evolved to handle changing needs. Sophisticated add-ons, such as powerful algorithms, can raise performance to a higher level. For example, training a vision system to recognise multiple objects, using deep learning, helps to automate processes – making them far more efficient and cost-effective. This was illustrated by Seb Millar of UK recycling specialist Recycleye – whose deep learning vision system uses a huge dataset of images to sort waste streams more effectively. Government-set collection targets have elevated the importance of waste recycling – meaning that more effective sorting systems have huge potential. Recycleye’s system of identifying waste – using RGB sensors and a clever algorithm – is more cost-efficient than techniques such as NIR sensors and manual sorting, he added. It can distinguish food-grade plastic (such as milk bottles) from non-food-grade (detergent bottles) with 96% accuracy – something that NIR cannot achieve, he said. The system is still in development and has been tested in large recycling facilities.
Recycleye’s deep learning vision system uses a huge dataset of images to allow automated sorting of waste streams
Part of its power comes from how it processes data. For instance, a technique called quantisation allows it to round data without introducing significant errors. This allows much faster line speeds, with an accuracy reduction of just 1%. Millar said that developing the system for sorting waste streams has made it “highly robust” – so it could be applied to other sectors.
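Quantisation in this context means mapping a network’s floating-point values onto a small set of integers, trading a little precision for speed. The toy NumPy sketch below shows symmetric 8-bit quantisation of a weight tensor; it is a generic illustration, not Recycleye’s pipeline.

```python
# Toy illustration of quantisation: round 32-bit floating-point weights to 8-bit
# integers plus a scale factor, trading a small amount of precision for speed.
# A generic sketch, not Recycleye's deep-learning pipeline.
import numpy as np

weights = np.random.randn(1000).astype(np.float32)

scale = np.abs(weights).max() / 127.0          # one scale for the whole tensor
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequantised = q.astype(np.float32) * scale     # what the network effectively 'sees'

print("max rounding error:", np.abs(weights - dequantised).max())
```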
On the edge
Deep learning may sound daunting but it need not be, said Paul Cunningham of Acrovision. While there is a gap between rule-based and deep-learning vision systems, it can be bridged through ‘edge learning’ (or ‘simple deep learning’). This performs on-device processing using pre-trained algorithms. Advantages of this approach include faster set-up, the ability to train ‘on the camera’, and the need for fewer ‘training’ images: while deep learning requires thousands of images, edge learning needs between five and 10. Processing time shrinks to minutes – rather than hours – and detailed understanding of deep learning is not needed. Recent edge learning applications include classifying wheel bearings, finding packaging defects and checking pharmaceutical blister packs. It can typically be carried out in combination with a smart camera.
Embedded vision
Paolo Bai of Teledyne FLIR defined an embedded system as a combination of hardware and software that is designed to perform a dedicated function or specific task. Embedded vision systems can have a number of advantages, he said, such as lower cost and improved performance. At the same time, they can enable new applications such as portability and the chance for real-time feedback. He cited an example of a deep learning solution for inspecting hand-soldered PCBs. The system’s aim was to classify good solder against bad/missing solder and other defects. It needed to inspect multiple regions of interest (ROIs). This included features such as solder bridges, stray solder spatters, lifted pads and untrimmed leads. Choosing the right computing system and camera can be vital when designing an embedded vision system. Richard Oakley, of Allied Vision Technologies, described how the company carried out performance testing on a variety of camera models on an identical NVIDIA Jetson TX2 development kit – in either USB3 or CSI-2. The cameras ranged from 5 MP to 20 MP. “CSI-2 cameras seem to create much less CPU load than USB3 cameras – especially for higher resolutions or lower frame rates,” said Oakley.
OPTICS & ILLUMINATION
It’s a fundamental rule of photography that the most important factor is lighting. To this end, it is critical that any imaging system gets this right. This can be easier said than done. Jack McKinley, product manager at TPL Vision, explained some of the difficulties in creating smooth lighting for a large field of view (FOV). “It’s challenging to have both brightness and homogeneity with large FOVs,” he said. Factors such as uneven lighting and reflections lead to variable spectral distribution on the surface. This can lead to hotspots – and inaccurate readings. TPL’s answer is to use angle changers to create ‘curved brightness’ output. This helps to banish hotspots by smoothing out the perceived brightness on the surface. It has been applied to large FOVs and long working distances, he said.
Timing is everything
Sequencing and timing can also be important factors in enhancing machine vision lighting, according to Douglas Bourne, product manager at Gardasoft. Techniques such as using a sync signal – or ‘pulsed light’ – can raise imaging performance. For instance, a system that runs millions of images can be disrupted by a ‘trigger’ event such as a noise pulse. This can be overcome by using a ‘sync signal’ to realign each cycle of images, after the noise pulse has been detected. Fast pulsing of light – compressing it into as short a period as possible – is helpful when imaging fast-moving objects and can minimise ‘pixel smear’. “Some applications require the pulse width to be 1 microsecond or less,” he explained. Benefits of pulsing include: using light only when it is needed; extending the life of LEDs; and consuming less power. Precise sequencing and timing can reduce system cost, allow novel imaging configurations and raise performance, he added.
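The pixel-smear argument is simple arithmetic: the pulse must be short enough that the part moves by less than one pixel’s footprint on the object while the light is on. A worked example with assumed numbers:

```python
# Back-of-envelope pixel-smear calculation with assumed numbers: how short must a
# light pulse be so that a moving part blurs by less than one pixel?
fov_width_mm = 200.0        # assumed field of view across the part
sensor_width_px = 2000      # assumed horizontal resolution
part_speed_mm_s = 5000.0    # assumed line speed (5 m/s)

pixel_footprint_mm = fov_width_mm / sensor_width_px     # 0.1 mm of object per pixel
max_pulse_s = pixel_footprint_mm / part_speed_mm_s      # 20 microseconds
print(f"keep the pulse under {max_pulse_s * 1e6:.0f} us to stay below one pixel of smear")
```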
Outdoor use
An increased need for outdoor vision systems requires components such as lenses to be made more rugged. Shocks and vibration can shift components in a standard lens out of place, according to Thomas Armspach-Young, vision solutions engineer at Edmund Optics. This can be solved with measures such as fixing the aperture and removing the focus ring – to reduce the number of moving parts. For stability, lenses can be further ruggedised – such as by glueing optics into place to ensure that optical pointing and positioning is maintained, even after vibration. One more factor – that of ingress protection – can be improved by adding O-rings, for example, to prevent moisture and debris from entering.
Lenses can be ruggedised to protect them from shocks and vibration, said Edmund Optics
A more complicated process – requiring a new lens design – is ‘athermalisation’, which protects the lens from temperature extremes. Athermal lenses are usually customised, but the company offers some off-the-shelf versions, he said.
SYSTEMS & APPLICATIONS
While machine vision is straightforward in principle – capture an image, analyse it and act on the result – it uses many techniques that can be applied to a variety of industries. A relatively new example is Terahertz (THz) technology, covering the spectrum between microwave and infrared. Julian Parfitt of Alrad said it has many uses – from quality control in manufacturing and agriculture to medical diagnostics and security screening. He cited the example of determining whether nuts – in their nutshells – contain harmful fungus. THz cameras can identify signs that nuts and other foods such as corn are harbouring fungi such as Aspergillus flavus – which can produce dangerous substances called aflatoxins. The technique has also been used to probe the internal structure of wood for features such as voids and water content, and to check if opaque packaging is short of contents (such as pills).
Fruit shoot
Despite its humble beginnings to help children get into computing, the Raspberry Pi has found its way into industrial vision applications. Although it has many limitations, it has enough features to be incorporated into a variety of vision systems, for applications ranging from inspection and scanning to drones and medical imaging. Paul Wilson, managing director at Scorpion Vision, cited an impressive case study – a ‘Cubesat’ called Gaspacs, developed to test inflatable structures in space. As part of its mission, it transmitted photographs – taken with a Raspberry Pi camera – back to Earth.
Stringent rules
The pharmaceutical industry is notoriously stringent (with good reason) – and has rules in place that govern the use of vision systems, according to Carl Leyland of Acrovision. These are covered under regulations such as 21 CFR Part 11, which insist that electronic data has the same security as a printed document – and cannot be altered. This means that equipment like cameras must include 21 CFR Part 11 features, such as active directory authentication and security audit trails for change tracking. A verification system – of the type designed by Acrovision – must, for instance, be able to guarantee barcode quality and readability across the whole supply chain. “Barcode verification ensures any degradation in a barcode’s quality is detected at the first instance,” he said.
Sense of porpoise
F1 fans will be familiar with ‘porpoising’ – a bouncing motion that has affected cars since the introduction of new constructor regulations. Keith Gallant of Reliability Maintenance Solutions told delegates that a system called motion amplification can help to detect porpoising. “It can measure movement that is not visible to the human eye – or not easily measured using traditional methods,” he said. The technique turns every pixel into a ‘displacement sensor’ to measure vibration and motion. It is widely used elsewhere – such as when circuit boards are drop-tested, then placed on shaker tables to look for loose components. Other examples include fan blade design and testing both automotive and aerospace parts.
Field testing
Back in agriculture, vision systems can be applied to food production in many innovative ways. Mark Williamson of Stemmer Imaging described several examples of machine vision ‘in the field’ – for tasks ranging from tilling and cultivation to harvesting and sorting. “Weeds are detected in the field and partially burned with laser beams,” he said. “The use of fertilisers and pesticides is also controlled with machine vision.”
An NDVI filter – mounted on a drone – can find stressed plants in a field very efficiently
He cited the example of detecting plant stress using a normalised difference vegetation index (NDVI) filter – mounted on a drone. A standard image may show a field of green, but a processed image highlights stressed plants. This is because they reflect different parts of the spectrum to a healthy plant. Other examples include: using segmentation algorithms to separate objects – such as crops – from the background; and combining vision and robotics to harvest crops such as broccoli selectively.
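The NDVI itself is a simple per-pixel calculation on the near-infrared and red bands: NDVI = (NIR − Red) / (NIR + Red), with stressed vegetation scoring lower than healthy vegetation. A generic NumPy sketch – using placeholder arrays rather than real drone imagery – looks like this:

```python
# NDVI is computed per pixel from the near-infrared and red bands.
# Generic sketch with placeholder arrays, not Stemmer's or MidOpt's software.
import numpy as np

nir = np.random.uniform(0.0, 1.0, size=(480, 640)).astype(np.float32)  # stand-in NIR band
red = np.random.uniform(0.0, 1.0, size=(480, 640)).astype(np.float32)  # stand-in red band

ndvi = (nir - red) / (nir + red + 1e-6)      # small epsilon avoids division by zero
stressed = ndvi < 0.4                        # assumed stress threshold
print("stressed pixels:", int(stressed.sum()))
```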
CAMERA TECHNOLOGY
Something that mainstream camera users can find bewildering is the variety of camera interface standards – including GigE, CoaXPress and USB. Each has its own pros and cons, so must be selected carefully. (See more in the ‘Understanding Vision Technology’ section.) Jason MacDonald of Matrox Imaging explained how GigE – which he called “the Ethernet standard for mainstream vision systems” – can have downsides. A common problem is that GigE cameras are often ‘locked’ to a specific network interface card (NIC). Using a different NIC will typically lead to longer processing times – from issues such as high CPU usage and bandwidth utilisation. This can be overcome with a standard ‘offload NIC’. Matrox offers cards that work for 1, 5, 10 and 25 GigE cameras – and can support two cameras at once.
Data integrity
Andreas Lange, of Teledyne Dalsa, presented several examples to show the importance of data integrity in a GigE environment. One involved an automotive customer with up to 40 cameras for a paint line inspection system. The cameras were connected to 48-port Cisco switches using Intel NICs. However, the customer ran into a problem of frames being lost when run at the required speed. After analysis, Teledyne found that the Intel X550 driver had issues with link speed auto-negotiation. The answer was to replace the Cisco switches – and set the driver to manual speed negotiation (or use an older driver).
Short wave
Cameras that work in the short-wave infrared (SWIR) region of the spectrum (around 1,000 to 2,500 nm) have found multiple uses in industrial vision. Torsten Wiesinger, of Lucid Vision Labs, said that because SWIR is sensitive to water it can be used to detect moisture in fruits and vegetables. This can be used as an indicator of bruising. “Normal wavelengths are unable to do this,” he told delegates. It has also been used to inspect silicon wafers – which become translucent at wavelengths above 1,150 nm. While visible light can find defects on the front side, SWIR can find imperfections on both sides of the wafer, he said. Stefan Waizmann, of SVS-Vistek, told delegates that a SWIR-based system also required the correct choice of lens “that is designed, optimised and coated for the SWIR wavelength range”. One designed for the visible spectrum will produce lower resolution images and higher chromatic aberrations – resulting in a blurred focal point, he explained.
Custom cameras
Cameras contain many separate elements – including sensor, electronics and firmware. All of these can be customised – said Benoit Ostiz of Photonfocus – in order to fit exact needs.
Photonfocus stressed the importance of customised products
For instance, he said the correct sensor could be chosen for its resolution, speed, sensitivity or pixel size. Once this has been decided, a processor can be selected according to its compatibility with the sensor, speed, calculation power and power consumption. Similar choices are needed for other elements such as interface, mechanics and software.
UNDERSTANDING VISION TECHNOLOGY
The diversity of camera interfaces can be confusing. They include USB, GigE, Camera Link and CoaXPress – as well as several emerging interfaces. Their cost and complexity varies from low to high – and, as a vision system user, negotiating them can be tricky. Hashem Khan of ClearView Imaging tried to guide delegates through making these decisions, based on a simple framework of four key parameters: bandwidth; cable length; budget; and complexity. “There will be factors to consider outside these areas, but this is a good starting point,” he said.
Four factors
Bandwidth basically means how much data is required; cable length determines the distance between camera and computer; budget can range from low to high; and complexity is about the ease of operation – not all are ‘plug and play’. He presented a few examples. A simple ‘vision box’ – a static, off-line system – was best served by GigE, he explained, due to the low cost and complexity, and relatively short cable length. Automated vehicles, however, required high resolution and speed – as well as having to carry out some pre-processing. He recommended CoaXPress here – partly because it offers power over cable, negating the need to run extra power cables around the vehicle. “Although complex, once you have a basic grasp of the key parameters of each interface you can use the four pillars to assist with interface selection,” said Khan. His colleague, Jak McCarthy, saw machine vision as a ‘puzzle’ with seven pieces. These pieces, he said, were: Algorithms (of three main types: pattern matching; analysis; and character recognition/verification); Complexity – how easy it is to understand; Flexibility; Setup; Acceptance; Cost; and Knowledge. “Not all of these pieces are technical, but they are all vital for a working system,” according to McCarthy.
Customer care
Many customers have a pre-conceived idea about the type of system they need in advance, according to Dominic Obiedzinski of Wenglor Sensoric. Companies like his need to use their expertise to help customers find the right solution – offering “high-level input to offer options and considerations”. This may go against what customers “think they need”, he said. In many cases, it can come down to a choice between a 2D camera and a profile sensor. He cited one example of an automated cell that used a cobot to weld a number of bosses onto a cylinder. The vision system needed to: classify the cylinder type; locate all bosses and measure their diameters and output centre points; and have an accuracy of 0.5 mm. Wenglor proposed using profile sensors with protective housings designed for welding applications. It used two reference positions to find cylinder diameter and angle/tilt – then used this data to position the sensor (to find boss locations and dimensions). This allowed it to calculate a welding path. Benefits of using the profile sensor included: one ‘recipe’ for all cylinder sizes; no need for added lighting; and the fact that sensors were factory-calibrated.
Wenglor used a profile sensor to inspect the automated welding of bosses on a cylinder
Other potential uses of profile sensors that he highlighted were tote stack depalletising and picking/placing of cardboard stacks.
3D VISION
The extra dimension of 3D vision can help users to gather more detailed data. Despite their extra complexity and cost, these systems can still be affordable. Mark Williams of Basler said two mobile imaging techniques – Quad Bayer Coding (QBC) and Dual Photodiode (DPD) – offer benefits including high dynamic range. Combining the two (giving a QBC with a 2 x 2 on-chip lens) could create a “next-generation high-performance CMOS image sensor” – which could make 3D systems more compact, cheaper and easier to integrate.
Time factor
Another way of gathering 3D data quickly is with Time of Flight (ToF), which captures whole scenes in one frame, rather than point by point. “This makes high speed applications possible,” said Tim Dodd of IFM. The most common method is continuous wave ToF – in which a continuously modulated light is sent out and the phase of the reflected light is used to calculate the distance to the object.
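The phase measurement maps to distance through a single formula, d = c·Δφ / (4π·f_mod), with an unambiguous range of c / (2·f_mod). A small worked example with assumed values (these are not IFM camera parameters):

```python
# Continuous-wave ToF in one line of arithmetic: the measured phase shift of the
# reflected modulated light maps to distance. Values are assumptions for illustration.
import math

C = 299_792_458.0          # speed of light, m/s
f_mod = 30e6               # assumed modulation frequency, 30 MHz
phase_shift = math.pi / 2  # assumed measured phase shift, radians

distance_m = C * phase_shift / (4 * math.pi * f_mod)   # about 1.25 m
ambiguity_m = C / (2 * f_mod)                           # range beyond which phase wraps (~5 m)
print(f"distance: {distance_m:.2f} m (unambiguous up to {ambiguity_m:.1f} m)")
```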
IFM's Tim Dodd explained how ToF can be used to gather 3D data quickly
A number of factors need to be considered when specifying a ToF camera. Firstly, it will have a fixed resolution and lens aperture, which will restrict the field of view (FOV) and smallest detectable object. In the latter case, a manufacturer will typically quote a minimum object size at a certain distance. In addition, blind spots have to be overcome: mounting a camera too close to a reflective surface can cause noise.
Light show
Pattern projection lighting can help to increase inspection stability in 3D imaging, said Cameron Millar of Keyence. Here, a set of eight lights project ‘stripes’ of light onto a 3D surface – allowing a 3D image to be generated. The distance between light source and object must first be calibrated – after which it remains fixed. By using a third dimension, the technique can handle backgrounds with low contrast. It is typically used to inspect parts such as plastic mouldings but has also been applied to check circuit boards by assessing surface height.
Added benefits
Using multiple 3D profile sensors has several advantages – not least an improvement in image accuracy. However, combining them can be a challenge, because they need to be synchronised. This is typically done by assigning a primary device that will receive triggers from an incremental encoder – or other trigger source – and propagate the signal to secondary devices, said Jason MacDonald of Matrox. Reasons for using multiple sensors include: scanning larger objects; overcoming occlusion due to laser fan angle; and acquiring images from different vantage points. This last example has been used on a fish-inspection line.
UKIVA MEMBER DIRECTORY
ACROVISION LTD T: 0845 337 0250
www.acrovision.co.uk
ACTIVE SILICON LTD T: 01753 650600
www.activesilicon.com
ALLIED VISION TECHNOLOGIES GMBH T: +49 36428 677-0
www.alliedvision.com
ALRAD IMAGING T: 01635 30345
www.alrad.com
BASLER AG T: + 49 4102 463 500
www.baslerweb.com
BYTRONIC VISION AUTOMATION T: 01564 793174
www.bytronic.com
CLEARVIEW IMAGING LTD T: 01844 217270
www.clearviewimaging.co.uk
COGNEX UK LTD T: 0121 296 5163
www.cognex.com
CREST SOLUTIONS T: 01536 560275
www.crestsolutions.co.uk
DAHUA TECHNOLOGY UK LTD T: 01628 613500
www.dahuasecurity.com
DEMCON T: +31 (0)88 - 115 20 00
www.demcon.com
EDMUND OPTICS LTD T: 01904 788600
www.edmundoptics.co.uk
FISHER SMITH LTD T: 01933 625162
www.fishersmith.co.uk
GARDASOFT VISION LTD T: 01954 234 970
www.gardasoft.com
GET CAMERAS T: +31 408 514 838
www.get-cameras.com
GS CONTROLS T: 01799 528254
www.gscontrols.co.uk
HIKROBOT T: 01628 902140
www.hikrobotics.com
IDS IMAGING DEVELOPMENT SYSTEMS T: 01256 962910
www.ids-imaging.com
IFM ELECTRONIC LTD T: 020 8213 0000
www.ifm.com/uk
IMPERX, INC T: +1 561 989 0006
www.imperx.com
INDUSTRIAL VISION SYSTEMS LTD T: 01865 823322
www.industrialvision.co.uk
JEKSON VISION LTD T: +356 9977 0961
www.jeksonvision.com
KEYENCE UK LTD T: 01908 696900
www.keyence.co.uk
KNIGHT OPTICAL (UK) LTD T: 01622 859444
www.knightoptical.com
LAMBDA PHOTOMETRICS LTD T: 01582 764334
www.lambdaphoto.co.uk
LMI TECHNOLOGIES GMBH T: 03328 93600
www.lmi3d.com
MATROX IMAGING T: +1 514 822 6020
www.matrox.com/imaging
MULTIPIX IMAGING COMPONENTS LTD T: 01730 233332
www.multipix.com
MURRELEKTRONIK LTD T: 0161 728 3133
www.murrelektronik.co.uk
OEM AUTOMATIC LTD T: 0116 284 9900
www.oem.co.uk
OLMEC - UK LTD T: 01652 631960
www.olmec-uk.com
QUANTUM DESIGN UK AND IRELAND LTD T: 01372 378822
www.qd-uki.co.uk
SCANDINAVIAN MACHINE VISION LTD T: 0845 519 0484
www.scandinavianmv.co.uk
SCORPION VISION LTD T: 01590 679333
www.scorpionvision.co.uk
SMART VISION LIGHTS T: +1 (231) 722-1199
www.smartvisionlights.com
SMARTMORE CORPORATION LTD T: 0742 1984124
www.smartmore.global
SONY IMAGE SENSING T: 01932 816000
www.image-sensing-solutions.eu
SPOOKFISH INNOVATIONS LTD T: 0117 963 9663
www.spookfish.co.uk
STEMMER IMAGING LTD T: 01252 780000
www.stemmer-imaging.com
TAMRON EUROPE GMBH T: +49 221 669 5440
www.tamron.eu/de/industrial-optics
TELEDYNE DALSA GMBH T: +49 89 89545730
www.teledynedalsa.com
TPL VISION T: 01738 310392
www.tpl-vision.com
ZEBRA TECHNOLOGIES EUROPE LTD T: 01628 556000
www.zebra.com
ADVERTORIAL
uEye XC: closes the gap between webcam and industrial camera
Automatically perfect images, even with changing object distances and lighting conditions? No problem for the new 13 MP autofocus camera uEye+ XC from IDS Imaging Development Systems. It closes the market gap between industrial camera and webcam. The integrated autofocus ensures sharp images and videos, for example in kiosk systems, logistics and robotics applications. Its components are characterised by long availability – a must for industrial applications and a plus compared to consumer webcams. Useful functions such as 24x digital zoom, automatic white balance and colour correction ensure that users can capture all details perfectly.
uEye Warp10: Ultra-fast, ultra-high resolution 10GigE cameras
For users with particularly high demands on resolution, image quality and transmission speed, IDS is launching industrial cameras with a 10GigE high-speed interface and a wide range of CMOS sensors with resolutions of up to 45 MP. The sturdy, GenICam-compliant uEye Warp10 cameras can precisely capture even high-speed processes and transmit image information in the Gigabit Ethernet-based network virtually without delay. This allows them to show their strengths in inspection tasks, for example, when scenes are to be captured, monitored and analysed in all details and without motion blur. The first models are expected for summer 2022.
IDS NXT: AI-based automation without programming knowledge
The development of an AI vision solution usually requires expertise, programming effort and investment in computing and storage hardware. However, the threshold for entry can be lowered: the AI vision system IDS NXT already contains all the necessary tools and workflows, from AI training software and assistance tools to powerful industrial cameras. Users need no special knowledge of Deep Learning or camera programming. The latest update introduced the block-based editor: with the combinable blocks and intuitive user interface, anyone can build projects using AI-based image processing – such as object detection or classification – without having to know the syntax of a specific programming language.
www.ids-imaging.com
PRODUCT UPDATES
ACTIVE SILICON
www.activesilicon.com
Frame grabbers designed for fast image acquisition
Active Silicon’s latest Firebird CoaXPress frame grabbers support CoaXPress v2.0, and are designed to deliver high-speed image acquisition.
They offer a fast PCI Express 4-lane and 8-lane Gen3 interface, zero CPU acquisition and can be used with long cables. Firebird Camera Link frame grabbers support GenICam for Camera Link cameras and maintain real-time acquisition – even when used with multiple camera applications.
Firebird-CoaXPress-4xCXP12-3PE8
The company’s 4xCXP-12 boards support up to 12.5 Gbps data rates on each link, enabling up to 50 Gbps in total along with device power up to 13W and device control up to 42 Mbps – all on a single coax cable. Camera Link options are GenICam compliant and include up to 80-bit and Dual 80-bit models, providing reliable, robust image transfer without CPU intervention. A software development kit, ActiveSDK, is included. Camera and frame grabber control is delivered by ActiveCapture front-end software. All Firebird frame grabbers support GPU processing.
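As a back-of-the-envelope check on what four CXP-12 links mean in practice, the sketch below (illustrative only) estimates the maximum achievable frame rate for a given image size from the aggregate link bandwidth, ignoring most protocol overhead.

```python
def max_frame_rate(link_gbps=12.5, links=4, width_px=4096, height_px=3000,
                   bits_per_px=8, overhead=0.10):
    """Rough upper bound on frame rate over a multi-link CoaXPress connection.

    link_gbps : raw bit rate per CXP link (CXP-12 = 12.5 Gbps)
    links     : number of aggregated links
    overhead  : fraction of bandwidth lost to packetisation and control traffic
    """
    usable_bps = link_gbps * 1e9 * links * (1 - overhead)
    bits_per_frame = width_px * height_px * bits_per_px
    return usable_bps / bits_per_frame

# Illustrative 12 MP, 8-bit camera on a 4 x CXP-12 connection
print(f"~{max_frame_rate():.0f} frames per second")
```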
ALLIED VISION
www.alliedvision.com
Fast, feature-rich USB cameras
Allied Vision has added two new models to its Alvium camera series with USB3 Vision interface. The 1800 U-052 and 1800 U-291 cameras incorporate Sony third-generation IMX CMOS sensors with Pregius S global shutter technology.
Thanks to a new analogue-to-digital converter (ADC), the new cameras can be operated at varying sensor bit depths. By choosing a readout mode of 12, 10 or 8 bits per pixel (bpp), the user can run the camera at up to twice its base speed, depending on application needs. If a smaller region of interest is selected, speed can be increased even further.
The additions to the Alvium USB camera series use Sony third-generation IMX CMOS sensors
The cameras are suited to fast-moving inspection applications in factory automation, such as fruit or material sorting and bottle inspection. Alvium 1800 U-052 delivers images with a resolution of 0.5 MP at up to 626 frames per second. A pixel size of 9 x 9 µm, from Sony’s IMX426 sensor, gives it very high sensitivity. Alvium 1800 U-291, which uses the Sony IMX421 sensor, combines a small sensor size (2/3in) with a readout rate of up to 141 frames per second (in 8-bit mode). In addition to the ADC, the cameras feature a convolution filter with a 5 x 5 matrix including an adaptive noise reduction mode. This can reduce noise in the image while preserving corners and edges – which is especially important for applications selecting objects by edge detection.
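To make the bit-depth and region-of-interest trade-offs concrete, here is a small, purely illustrative sketch (not Allied Vision’s own scaling model) estimating how frame rate grows as the readout payload per frame shrinks.

```python
def estimated_fps(base_fps, base_bpp=12, bpp=8, roi_fraction=1.0):
    """Very rough scaling: fewer bits per pixel and a smaller region of
    interest both reduce the data read out per frame, so the achievable
    frame rate rises roughly in proportion (real cameras also have
    fixed per-frame overheads, ignored here)."""
    return base_fps * (base_bpp / bpp) / roi_fraction

base = 313  # hypothetical full-resolution, 12-bit frame rate
print(f"8-bit full frame: ~{estimated_fps(base):.0f} fps")
print(f"8-bit, half-height ROI: ~{estimated_fps(base, roi_fraction=0.5):.0f} fps")
```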
IFM ELECTRONIC
www.ifm.com/uk
Vision sensors for inspection and QC
IFM Electronic’s new Dualis 2D vision sensors can easily be configured to perform demanding inspection and recognition tasks, thanks to user-friendly application software. In standard applications, set-up can be completed in as little as two minutes. The sensor can check parameters including patterns, shapes, areas, dimensions, contours and even target contrast.
Typical applications include: detecting missing components in assemblies; identifying malformed threads, missing holes or incorrectly applied adhesive; sensing incorrect product orientation or positioning; and counting the number of items present in the target area.
Dualis 2D vision sensors from IFM Electronic
IFM’s Vision Assistant application software, supplied as standard with the sensors, incorporates ‘wizards’ that provide detailed set-up guidance for many of these applications. It also offers an advanced mode allowing experienced users to access additional functionality. Dualis sensors feature integrated illumination and are available in both infrared and RGB-W versions. With RGB-W versions, items can be distinguished by colour; a built-in polarising filter eases the task of detecting highly reflective objects. To minimise the effect of extraneous light, the sensors incorporate a daylight filter. In strongly fluctuating light conditions, they can be configured to take up to five images – with different exposure times – and automatically select the best one for analysis.
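The ‘take several exposures and keep the best’ behaviour described above can be illustrated with a simple selection heuristic; the sketch below is a generic illustration, not IFM’s algorithm, and scores each exposure by how well its histogram avoids clipping before picking the winner.

```python
import numpy as np

def best_exposure(images, low=10, high=245):
    """Pick the exposure with the fewest under- and over-exposed pixels.

    images   : list of 8-bit greyscale frames taken at different exposure times
    low/high : grey levels treated as clipped (too dark / too bright)
    """
    def clipped_fraction(img):
        return np.mean((img <= low) | (img >= high))
    scores = [clipped_fraction(img) for img in images]
    return int(np.argmin(scores))

# Five hypothetical exposures of the same scene
rng = np.random.default_rng(0)
frames = [np.clip(rng.normal(mean, 30, (480, 640)), 0, 255).astype(np.uint8)
          for mean in (20, 60, 120, 180, 240)]
print("Best exposure index:", best_exposure(frames))
```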
Plug and Play for industrial image processing With Murrelektronik‘s Vision Installation Solutions, you can get started with industrial image processing and immediately benefit from the advantages with little effort, guaranteeing a quick return on investment.
murrelektronik.co.uk
100 percent hand, mate! Manual production: On-the-job-training has never been easier!
The smart worker assistance system. No frills at all. High assembly variance, personnel fluctuation and complex process sequences lead to a permanently high need for training in manual production – combined with an equally high commitment of resources when the experienced colleagues support the new ones with advice and assistance. Unless you rely on the new ifm mate worker assistance system in the future. A system that is so simple in design, so easy to use and so smart without bells and whistles such as trackers or VR glasses – and yet helps you to reliably ensure quality, save time and generate more added value more quickly. How do we know that? We have developed and tested it ourselves. For us. For you. ifm – close to you.
ifm.com/uk/mate
Visit ifm electronic at PPMA 2022, Stand B142
PRODUCT UPDATES
MURRELEKTRONIK
www.murrelektronik.co.uk
Decentralised automation for vision systems
Murrelektronik has begun offering decentralised installation specifically for industrial image processing.
The components include: the Xelity Hybrid Switch, which has connections for up to four cameras and allows smooth, error-free data communication; the Master Breakout Box, which is a power and signal distributor; and the Injection Box – a voltage and signal feeder. Each can be mounted directly in the machine environment near the vision system. Together, the components minimise installation effort and raise performance, says the company. “Our decentralised installation concept saves customers the time and expense of control cabinet installation,” said Carl Tyler of Murrelektronik UK.
This will allow machine and system builders to create ‘plug and play’ solutions for connecting power supplies and enabling signal/data management for smart networking of cameras in industrial production processes or logistics, it says. The solution includes switches, distributors and feeders, plus cables and connectors.
Components are critical to decentralised installation
Switch cabinets are not needed, as everything is IP67 rated and designed to be mounted in the field. Other advantages of the modular solution include: vision systems can be put into operation quickly and flexibly; further diagnostics can be carried out during operation, to reduce system and machine downtime. “Our solution works with all the major camera manufacturers,” said Tyler.
QUANTUM DESIGN
www.qd-uki.co.uk
Infrared zoom camera needs no lens changes
InfraTec’s new infrared camera, the ImageIR 6300 Z, is small and light, and has no need for time-consuming lens changes.
The device includes an integrated 7.5x zoom lens as standard. In combination with its motorised focus, it allows fast, stepless adjustment to a wide range of object distances and lens sizes with high accuracy. In addition, the zoom camera is radiometrically calibrated over the entire focal length range, for very flexible use. The new SWaP camera core at the heart of the technology comprises a cooled MWIR focal plane array photon detector with XBn (HOT) technology in the format of (640 × 512) IR pixels. The XBn detector material – with a higher operating temperature (HOT) of 150 K – allows the use of a new linear cooler generation. This is characterised by a much smaller design, lower power consumption and longer service life. Combined with a small pixel pitch of 10 µm, this reduces the dimensions and weight of the device and extends its maintenance-free service life.
The compact zoom camera, available in the UK and Ireland from Quantum Design, is suitable for numerous applications. It is used for precise thermographic temperature measurements and incorporates a number of innovative technologies in the optics, detector and electronics.
The ImageIR 6300 Z IR camera is available from QDUKI
The maximum IR frame rate of 180 Hz can be increased to 600 Hz with the binning function in high-speed mode. Switching between the two speed modes is done via software and allows precise metrological tracking of fast thermal processes.
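For readers sizing such a camera, the following sketch shows how the detector format and 10 µm pixel pitch translate into field of view and per-pixel angular resolution; the focal length used is an assumed example value, since the zoom covers a range of focal lengths not quoted here.

```python
import math

def thermal_fov(h_px=640, v_px=512, pitch_um=10.0, focal_length_mm=50.0):
    """Field of view and instantaneous FOV (IFOV) for a cooled detector.

    focal_length_mm is an assumed example value, not a specification of
    the ImageIR 6300 Z, whose integrated zoom spans many focal lengths.
    """
    sensor_w_mm = h_px * pitch_um / 1000.0
    sensor_h_mm = v_px * pitch_um / 1000.0
    hfov = 2 * math.degrees(math.atan(sensor_w_mm / (2 * focal_length_mm)))
    vfov = 2 * math.degrees(math.atan(sensor_h_mm / (2 * focal_length_mm)))
    ifov_mrad = pitch_um / 1000.0 / focal_length_mm * 1000.0  # mrad per pixel
    return hfov, vfov, ifov_mrad

h, v, ifov = thermal_fov()
print(f"FOV: {h:.1f} x {v:.1f} degrees, IFOV: {ifov:.2f} mrad")
```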
WENGLOR
www.wenglor.com
Profile sensor automates weld seam tracking
Wenglor says that its new profile sensor for automated weld seam tracking is compact and easy to install. Its latest weCat3D 2D/3D profile sensor – the MLZL – has a slimline housing with integrated cooling and flushing, and can be installed directly on the welding torch. Its small housing dimensions (33 × 183 × 69.8 mm) allow the robot to operate in narrow corners.
It is optionally equipped with a red or blue laser, and users can choose from three laser classes: 2M, 3R or 3B.
“The MLZL does not require any additional protective housing, nor does it need to be tilted for alignment,” said Sascha Reinhardt, product manager at Wenglor. “The design also offers protection against welding spatter and disturbing ambient light.”
The weCat3D MLZL sensor needs no extra protective housing
The sensor has been optimised for the demands of welding robots – especially the task of optical tracking of weld seams – in both hardware and software. The MLZL runs in combination with Wenglor’s uniVision software, which is now on version 2.5.0. It has a standalone module specifically for weld seam tracking, allowing welding applications to be set up in a few clicks. “Tracking points can be reliably determined even in the event of faults in the joint course, such as with datum points,” said Reinhardt. “Predefined templates, where all common joint types are saved, significantly reduce the configuration work.”
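The guide-point-to-path-correction step can be pictured with a small sketch (a generic illustration, not the uniVision interface): the sensor reports the joint position in its own coordinate frame, and the offset from the programmed path is turned into a bounded correction vector for the robot controller.

```python
import numpy as np

def path_correction(detected_joint_xy, programmed_xy, gain=1.0, max_step_mm=2.0):
    """Compute a bounded correction vector from the measured joint position.

    detected_joint_xy : joint position measured by the profile sensor (mm)
    programmed_xy     : position the robot path expected the joint to be (mm)
    gain              : fraction of the error applied per control cycle
    max_step_mm       : clamp to avoid sudden robot moves on outliers
    """
    error = np.asarray(detected_joint_xy, float) - np.asarray(programmed_xy, float)
    step = gain * error
    norm = np.linalg.norm(step)
    if norm > max_step_mm:
        step *= max_step_mm / norm
    return step

# Joint found 1.4 mm to the left and 0.3 mm above the programmed path (illustrative)
print("Correction (mm):", path_correction([-1.4, 0.3], [0.0, 0.0]))
```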
PRODUCT UPDATES
ACROVISION
www.acrovision.co.uk
Maintaining compliance in pharmacy labelling
Acrovision has introduced the Validator CFR21 – which it says will reduce the cost of managing and documenting the entire labelling lifecycle within the pharmaceutical industry.
By removing paper-based systems – and increasing productivity, efficiency and compliance – the all-in-one system can help companies meet their regulatory requirements without the usual expense.
Acrovision’s updated Validator provides 21 CFR Part 11 compliance in a number of areas
In addition, the US Food & Drug Administration (FDA) continues to extend compliance requirements into new areas, such as new automation systems and drug discovery. Having the Validator in place will ensure ongoing compliance, according to Acrovision. The standard Validator specification can check for: correct and legible barcodes and legible text (such as sell-by date codes); correctly positioned promotional labels; and packaging quality, such as fill level and missing cap detection. The updated Validator provides 21 CFR Part 11 compliance in a number of areas, including: electronic signatures; individual and multi-level passwords for all users; audit trails; back-up and restore routines; disaster recovery; relevant documentation; and code control. Acrovision says the new offering provides a more complete solution to the biotechnology, drug and medical equipment manufacturing industries, which are always looking to improve quality and profitability at an affordable cost. The company can work with label applicators and manufacturers of packaging machinery to provide a complete 21 CFR Part 11-compliant solution to the end-user.
ALRAD INSTRUMENTS
www.alrad.co.uk
Seamless coverage from 400 to 1700nm
Alrad Instruments is offering two new visible/SWIR cameras from Omron Sentech.
The 0.3 MP and 1.3 MP cameras use the new Sony IMX990 and IMX991 sensors. They are available with three interface options: GigE Vision (PoE compatible); Camera Link; and USB3 Vision.
Visible/SWIR cameras from Omron Sentech are suitable for standard, multispectral and hyperspectral imaging
Providing seamless coverage from the visible (400nm) through to the shortwave infrared (1700nm) spectrum, the cameras are suitable for standard, multispectral and hyperspectral imaging applications. They can be combined with Alrad’s visible, NIR and SWIR machine vision lighting and visible/SWIR lenses in a range of applications including automation, machine vision, UAV and remote sensing, medical, security and research. Supporting these are Alrad’s visible/SWIR lenses from Computar, which are compatible with the latest InGaAs sensor cameras – specifically those using the Sony IMX990 and IMX991 sensors. Key features of these lenses include: full correction of chromatic aberration from visible light (400nm) to near-infrared (1700nm); a VSWIR light-absorbent coating, which prevents light scattering and provides clear image quality; and an APO floating design, allowing focus shift to be corrected at any wavelength and working distance.
SICK (UK)
www.sick.co.uk
Collaboration leads to miniature vision sensor
A miniature version of Sick’s Visionary-T 3D snapshot vision sensor has been developed through a collaboration with Microsoft. The Visionary-T Mini incorporates precise, rapid, high-resolution image capture into a robust, compact and lightweight design for use in many industrial applications.
Sick says that it uses time-of-flight snapshot technology to set new standards of data accuracy at high speed. It captures the 3D depth and 2D intensity values of every pixel at high (512 × 424 px) resolution in a single shot of light, at up to 30 3D frames per second.
The Visionary-T Mini snapshot vision sensor was developed in collaboration with Microsoft
As a result of the collaboration, the Visionary-T Mini achieves a competitive price-performance level with high 3D and 2D data quality in rugged industrial conditions – including under very bright or dark lighting. With a low weight – and no moving parts – it can deliver robust performance despite shocks and vibrations. Its high repetition rate ensures there are no blurring effects caused by movement of the camera or the object. It is well suited to dynamic applications – where lower weight is an advantage or installation space is at a premium. The automatic high dynamic range ensures that even widely varying contrasts and lighting conditions across a scene are captured in each frame, with no need for complex set-up or expert knowledge of time-of-flight settings. The sensor makes it easy to integrate the data stream into applications, particularly where the target is moving, says Sick.
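For developers integrating the 3D data stream, the conversion from a per-pixel depth map to Cartesian points typically follows the standard pinhole model; the sketch below uses assumed intrinsics purely for illustration (the real values come from the sensor’s calibration).

```python
import numpy as np

def depth_to_points(depth_m, fx, fy, cx, cy):
    """Convert an (H, W) depth map in metres to an (H*W, 3) point cloud
    using the pinhole camera model: X = (u-cx)*Z/fx, Y = (v-cy)*Z/fy."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.stack([x, y, depth_m], axis=-1).reshape(-1, 3)

# Assumed intrinsics for a 512 x 424 px time-of-flight imager (illustrative only)
depth = np.full((424, 512), 1.5)          # flat scene 1.5 m away
cloud = depth_to_points(depth, fx=365.0, fy=365.0, cx=256.0, cy=212.0)
print(cloud.shape)  # (217088, 3)
```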
FIRST CLASS CUSTOMER SERVICE
Providing Market Leading Components For Machine Vision Call, email or message us via our webchat
sales@scorpion.vision
PRODUCT UPDATES
CLEARVIEW IMAGING
www.clearview-imaging.com
Streamlined vision inspection boosts QA
Vision Box is an affordable, end-of-line part inspection system – designed by ClearView Imaging – that is available from its system integration partners.
The company used Matrox Design Assistant X to create a streamlined vision inspection interface that works as a hub for quality inspection. It can easily be connected to label printers for QC labels, barcode scanners and back-end database systems. The system uses multiple FLIR Blackfly S GigE Vision cameras fed into a Matrox 4Sight EV6 embedded vision system: the user places a part in the imaging bay, clicks ‘inspect’ and within seconds receives a rigorous, detailed pass or fail validation. The system is easy to set up, use and update.
Vision Box is ClearView’s affordable, end-of-line part-inspection system
Designed with Tier 2 and 3 manufacturers in mind, Vision Box is an effective stop-gap for ensuring high quality of parts leaving the shop floor. Inconsistent quality assurance – caused by human error – can affect profitability, but an automated system such as Vision Box can help to save money while raising the accuracy of the QC process, said ClearView.
COGNEX
www.cognex.com
Deep learning in an easy-to-use package
Cognex has released its In-Sight 2800 vision system – which it says puts the power of a full-featured vision system into an easy-to-use package that can be set up in minutes. A combination of deep learning and traditional vision tools gives users the flexibility to solve many inspection applications. Operators simply select the tool designed to deliver the highest possible accuracy for a task. Tools can be used individually for simple jobs or chained together for more complex logic sequences. “It has never been easier to apply deep learning to a production line,” said Carl Gerst, executive vice president of products, platforms and solutions at Cognex. “In-Sight 2800 can be trained with just a few images to automate everything from simple pass/fail inspections to advanced classification and sorting – with no PC or programming needed.”
The EasyBuilder interface guides users through application development, making it simple for new vision users to set up a job. Experienced users will appreciate how an intuitive, point-and-click interface simplifies complex application development and keeps operations moving quickly.
In-Sight 2800 from Cognex can be set up in minutes
The toolset also includes ViDi EL Classify. Using as few as five images, this classifying tool can be trained to identify and sort defects into different categories and correctly identify parts with variation. The ability to classify by multiple features or characteristics allows users to solve more tasks with a single vision solution. The system offers a variety of accessories and field-changeable components to help users adapt to changes such as new parts, faster line speeds and higher quality standards.
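The idea of training a classifier from as few as five labelled images can be illustrated generically; the sketch below is not Cognex’s ViDi EL algorithm, just a minimal nearest-centroid example over crude image features, to show why a handful of examples per class can be enough for simple sorting tasks.

```python
import numpy as np

def features(img):
    """Crude feature vector: mean brightness plus horizontal and vertical edge energy."""
    gx = np.abs(np.diff(img.astype(float), axis=1)).mean()
    gy = np.abs(np.diff(img.astype(float), axis=0)).mean()
    return np.array([img.mean(), gx, gy])

def train_centroids(examples):
    """examples: dict mapping class name -> list of a few example images."""
    return {label: np.mean([features(i) for i in imgs], axis=0)
            for label, imgs in examples.items()}

def classify(img, centroids):
    f = features(img)
    return min(centroids, key=lambda label: np.linalg.norm(f - centroids[label]))

# Five synthetic 'good' and five 'scratched' examples (illustrative only)
rng = np.random.default_rng(1)
good = [np.full((64, 64), 128.0) + rng.normal(0, 2, (64, 64)) for _ in range(5)]
bad = [g.copy() for g in good]
for b in bad:
    b[30:34, :] = 255.0   # bright scratch across the part
centroids = train_centroids({"good": good, "scratched": bad})
print(classify(bad[0], centroids))   # -> "scratched"
```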
MATROX IMAGING
www.matrox.com
New event-logging tool for frame grabbers
Matrox Imaging has developed a new event-logging tool for its Rapixo CXP frame grabbers.
The company says the new utility, called Matrox Gecho, streamlines video acquisition – helping developers to detect and correct performance bottlenecks and errors, optimise image capture and ensure proper performance. It logs acquisition activity, allowing users to troubleshoot capture errors and measure latencies and execution times.
Matrox Gecho streamlines image capture by detecting and correcting performance bottlenecks and errors
Offered with both Matrox Imaging Library (MIL) X and Matrox Design Assistant X vision software, it records events generated by the Rapixo CXP device driver and saves them to a JSON or CSV file. Running concurrently with the device driver, it logs acquisition activity, and the resulting trace files can be loaded into Google Perfetto for viewing on an interactively navigable graphical timeline. The result is a simple acquisition log that helps users set up and streamline video acquisition, detecting and correcting performance bottlenecks or errors so that they do not corrupt or slow down the image-capture process.
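As a rough idea of what can be done with such a trace offline, the sketch below parses a CSV of acquisition events and computes per-frame latency; the file name and column names are hypothetical, not the actual Gecho schema.

```python
import csv
from collections import defaultdict

def frame_latencies(path):
    """Pair 'trigger' and 'frame_done' events per frame id and report latency.

    Assumes a CSV with hypothetical columns: frame_id, event, timestamp_us.
    """
    events = defaultdict(dict)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            events[row["frame_id"]][row["event"]] = float(row["timestamp_us"])
    return {fid: ev["frame_done"] - ev["trigger"]
            for fid, ev in events.items()
            if "trigger" in ev and "frame_done" in ev}

# latencies = frame_latencies("acquisition_trace.csv")          # hypothetical file
# print(max(latencies.values()), "us worst-case trigger-to-frame latency")
```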
ADVERTORIAL
Basler pylon vTools: Custom-Fit Image Processing Modules for the New pylon 7
With the release of pylon 7, Basler is expanding its popular software to include pylon vTools. With these software modules, Basler customers can use a range of intelligent image processing functions for their applications without extensive training.
With the release of pylon 7, Basler AG introduces a series of software modules that enable customers to quickly and easily use complex, powerful image processing functions for their applications – the Basler pylon vTools. With pylon vTools, users can design, test, and flexibly integrate intelligent structure recognition, precise object positioning, or robust code recognition into their own applications in one go, with camera control and image acquisition – always perfectly matched to Basler’s camera portfolio. Companies that use computer vision are confronted with several challenges. Development teams often lack image processing experts, and building up image processing expertise can be costly. In addition, integrating basic visual tools into a company’s own architecture is very complex; large comprehensive image processing software modules must be purchased, although only a few functionalities are really needed.
The pylon vTools address these challenges and provide seamlessly integrated image processing with the proven pylon software, allowing customers to get image acquisition and image processing from a single source. They can be quickly and easily created visually and are simple to integrate into existing architectures. Furthermore, the functions are offered in small, cost-effective modules based on customer needs. With the simple online activation of a demo license, pylon vTools can be tried out free of charge immediately after installation. Tilmann Zuper, Product Manager Software at Basler AG, is convinced that pylon vTools will make work easier for many customers: “Our customers will achieve results quickly and easily, from image acquisition to image analysis. They can be immediately productive without extensive training and benefit from lower manufacturing costs thanks to our flexible licensing model.”
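pylon itself is also scriptable through Basler’s open-source pypylon wrapper; the minimal grab loop below shows the image-acquisition side that the vTools build on (the vTools-specific processing API is not shown here, and any downstream processing step is left as a placeholder).

```python
from pypylon import pylon

# Open the first camera found and grab a handful of frames.
camera = pylon.InstantCamera(pylon.TlFactory.GetInstance().CreateFirstDevice())
camera.Open()
camera.StartGrabbingMax(10)

while camera.IsGrabbing():
    result = camera.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
    if result.GrabSucceeded():
        frame = result.Array                       # numpy array with pixel data
        print(result.Width, result.Height, frame.mean())
        # ...hand the frame to downstream image-processing steps here...
    result.Release()

camera.Close()
```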
Basler pylon vTools are available for Windows and Linux. More information about the vTools features and offered modules is available on Basler’s website www.baslerweb.com www.baslerweb.com/en/products/basler-pylon-vtools/
PRODUCT UPDATES
IDS IMAGING DEVELOPMENT SYSTEMS
www.ids-imaging.com
New camera has ultra-fast transmission speed
IDS has launched a new uEye+ camera family, uEye Warp10, with ultra-fast transmission speed.
Compared to 1GigE cameras, uEye Warp10 achieves up to 10 times the transmission bandwidth. The fast data transfer benefits a wide range of uses – such as applications on the production line with high clock rates, or image processing systems in sports analysis. The camera electronics require a higher cooling capacity, so the housing is equipped with cooling fins. An active cooling plate is offered as an accessory.
When fast-moving scenes need to be captured, monitored and analysed without motion blur, a high-performance transmission interface is as important as the sensor. Thanks to 10GigE, the new camera family transmits data at high frame rates and with very low latencies.
As a 10GigE camera, uEye Warp10 from IDS offers fast data transfer
Initially, evaluation prototypes with Sony’s IMX250 (5 MP), IMX253 (12 MP) and IMX255 (8 MP) sensors will be launched. The cameras will then be available in series, with more sensors being added over the course of this year.
LEUZE
www.leuze.co.uk
Object detection at large operating ranges
The new 36 sensor series from Leuze detects objects even at large operating ranges.
The sensors – which are certified to IP67 – are offered with different operating principles. They are available with background suppression (with an operating range up to 2.5m), as retro-reflective photoelectric sensors (up to 17m) or as through beam photoelectric sensors (up to 80m). The sensor design provides a very high function reserve, ensuring that objects are reliably detected – even if, for example, sources of interference are present. Users can also request retro-reflective photoelectric sensors to detect objects with depolarizing properties, such as for film-wrapped pallets.
The company says that manufacturers and operators of highly automated systems will benefit from their cost-optimised design.
Leuze says its new 36 sensor series are optimised to be more affordable
The sensors are available with either M12 connection sockets or different ready-made connection cables – as well as matching mounting accessories. This makes integration easier, even in existing systems. In addition, all common switching logics – NPN and PNP, as well as light and dark switching – are supported.
KEYENCE
www.keyence.co.uk
Code reader made for challenging operations
Keyence has developed a compact, ultra-high-performance code reader with built-in AI.
The AI and decoding algorithms provide stable reading between processes – tracking changes in codes that occur from one process to the next. It is also possible to link code readers between processes for improved performance. With these connections, the operating status and current settings of readers on the same network can be viewed together in a list. Automatic focus adjustment and fully automatic tuning make setup easy – with the press of a button.
The company says that its SR-X series of code readers has a compact design – being 72% smaller than its conventional models – while still providing high-performance reading for a wide variety of codes.
Keyence’s SR-X code readers combine a compact design with high-performance reading for multiple codes
MICRO-EPSILON
www.micro-epsilon.co.uk
Software for 3D data capture and evaluation
Sensor manufacturer Micro-Epsilon has developed a new software tool, 3DInspect, which enables 3D data capture and evaluation. The software is compatible with all 3D sensors in the company’s portfolio.
It offers a wide range of functions for analysing and measuring captured data. These include parameter setting of the sensors, alignment of point clouds and the selection of relevant objects, filters for smoothing and optimising point clouds, as well as calculation programs for height, radius and determining flatness and distances between points and angles. It can find a target based on its contour and readjust it to allow further inspection to be made. “For end users who do not have the capability to write their own software code, we offer the 3DInspect software solution,” said Glenn Wedgbrow, business development manager at Micro-Epsilon UK.
Micro-Epsilon’s 3DInspect is compatible with all 3D sensors in the company’s portfolio
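To give a flavour of the kind of calculation such software performs, here is a small, self-contained sketch (not Micro-Epsilon’s code) that fits a reference plane to a point cloud by least squares and reports flatness as the spread of residuals.

```python
import numpy as np

def flatness(points):
    """Fit z = a*x + b*y + c to an (N, 3) point cloud and return the
    flatness value: the range of point deviations from that plane."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    residuals = z - A @ coeffs
    return residuals.max() - residuals.min()

# Synthetic scan of a slightly tilted, slightly wavy surface (illustrative only)
rng = np.random.default_rng(2)
xy = rng.uniform(0, 50, (5000, 2))                       # mm
z = 0.02 * xy[:, 0] + 0.05 * np.sin(xy[:, 1] / 5) + rng.normal(0, 0.002, 5000)
print(f"Flatness: {flatness(np.column_stack([xy, z])):.3f} mm")
```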
YOUR INTERNATIONAL PARTNER FOR MACHINE VISION TECHNOLOGY
ONE-STOP SHOP
BEST SOLUTION PARTNER
Optimised solutions due to a broad machine vision portfolio
Harness expertise with extensive development services
Flexibility due to manufacturer independence
Continuous improvement with solution refinement and lifecycle management
Extensive expert knowledge and customer support
Reduce time with hardware and software pre-assembly and configuration
Reduced effort through comprehensive service packages
Aligned project flow with customer-specific handling and operations assistance
VISION. RIGHT. NOW.
WWW.STEMMER-IMAGING.COM