Sensing is key to autonomous operation PAGE 6
The centralized approach to autonomous driving PAGE 32
August 2017
Autonomous & connected vehicles
The danger of treating autonomous vehicle control as a solved problem

A panelist made an interesting comment during the recently concluded TU-Automotive connected-car conference near Detroit. It happened during one of the many sessions devoted to software infrastructure and connected services. "In Silicon Valley, autonomous vehicle control and driving is viewed as a solved problem. The real emphasis is on making use of the associated data for connected services," said a speaker from a software start-up.

What was noteworthy about this comment was the smug tone with which it was delivered. The self-satisfied nature of it seems out of whack given that statistics show a self-driving car failed roughly every three hours in California during 2016. This data comes from the state's Dept. of Motor Vehicles. Carmakers testing self-driving cars in California must file annual "disengagement" reports with the state showing how many times their vehicles malfunctioned. The definition of a disengagement includes every time a human driver has to quickly grab the controls, either because of hardware or software failure or because the driver sees a problem.

The reports cite 2,578 failures among the nine firms that conducted road testing in 2016. The total autonomous miles driven aren't impressive, coming in at a little under 657,000, with the vast majority of those miles accumulated by Google-parent Alphabet's spin-out company Waymo. For comparison, Tesla's Elon Musk has estimated that worldwide regulatory approval of driverless vehicles will require something on the order of six billion miles of data.

It's probably safe to say that most people engaged in autonomous vehicle development see nothing sinister about the number of reported disengagements. None of this year's disengagements resulted in an accident. And it is clear that some of them happened during tests of subsystems. Bosch and Delphi, for example – both Tier One automotive suppliers – were at the top of the disengagement list. It's likely their engineers were using city streets to get real-world performance data for R&D projects, not ironing out the last few bugs in prototypes.
Nevertheless, the results cited in the California report don't inspire confidence about the reliability of autonomous systems. This is particularly so because some autonomous vehicle developers have a loose idea of what constitutes a disengagement. Waymo's definition, for example, doesn't include every time a human driver is forced to take over. Its disengagement count – just 124 in its 60 self-driving cars during 2016 – only includes incidents where something unsafe would have happened if the human driver weren't there. In contrast to what it reports, Waymo admits its drivers have to go hands-on many thousands of times annually.

Critics claim these results indicate the technology isn't ready for prime time and won't be for the foreseeable future. They also note that at least one firm – Uber – stopped a self-driving pilot program in San Francisco in favor of running tests in Arizona and Pennsylvania, states where failure reports aren't required. Such tactics have bloggers questioning whether it makes sense to unleash autonomous technology on public streets at this stage of the game. Says one about autopilot systems, "Drivers should not be guinea pigs and misled into a false sense of confidence."

Which brings us back to the comments made during the TU-Automotive show. People in self-driving vehicle technology do themselves no favors when they come across as cavalier about the shortcomings of current autonomous systems. Nobody needs the perception that a traffic accident involving a self-driving vehicle is just a bug report incident for software engineers.
LEE TESCHLER
EXECUTIVE EDITOR
INSIDE AUTONOMOUS & CONNECTED VEHICLES
02 The danger of treating autonomous vehicle control as a solved problem
06 Sensing is key to autonomous operation
A lot of work in sensor technology is taking place to make robotic vehicles practical.
12 The challenges of HIL testing and ADAS
Engineers running hardware-in-the-loop simulations face difficulties associated with how to devise sensor signals that truly exercise advanced driver assistance systems.
18 Sensors in the driving seat
Advances in sensor technology help make autonomous vehicles safe and reliable.
22 ADAS developers contemplate sensor fusion
Semiconductor makers now field sensor equipment complete with built-in safety systems that specifically target autonomous driving applications.
26 Managing EV batteries squeezed into odd shapes
Automakers are forced to fit EV batteries into nooks and crannies in an effort to maximize driving range. The resulting convoluted layouts call for a wireless approach to managing cells.
32 The centralized approach to autonomous driving
Engineers say the best way to make vehicles fully autonomous is with a centralized processor to analyze all sensor signals and driving situations.
38 Simulations help explain why autonomous vehicles do stupid things
Special test programs hope to help robotic systems make better decisions in short order.
Sensing is key to autonomous operation
A lot of work in sensor technology is taking place to make robotic vehicles practical.
LEE TESCHLER | EXECUTIVE EDITOR
Ford Motor runs tests of its Level 4-capable autonomous vehicle prototypes at the University of Michigan’s Mcity facility. Sensors let the car navigate scenarios like traffic in intersections, pedestrians in crosswalks, different traffic signals and cyclists.
It's becoming clear that autonomous vehicle deployment will depend on the development of smart sensors. Industry analysts see advances coming in traditional areas of radar and proximity sensing. They also think the architecture of vehicle systems will evolve in ways designed to optimize the handling of information coming in from the multiplicity of sensors necessary for autonomous operation.
The dependence on sensing systems becomes clear from a review of how the autonomous vehicle industry classifies levels of autonomous ability:

Level zero – The human driver is in total control.

Level one – This is where most cars on the road today reside. Function-specific automation characterizes this level: the vehicle might carry automation for single functions such as braking or cruise control, but the driver is still completely in charge of vehicle controls.

Level two – This level is characterized by the automation of more than one control into what's called combined function automation.
AUTONOMOUS DRIVING LEVELS: Autonomous vehicle developers divide autonomous abilities into six categories.
The system can take control for some driving conditions, but the driver is still ultimately responsible for driving the vehicle and must be able to take over on short notice. Dynamic cruise control, where braking takes place automatically to handle some events during cruise control, is an example of a level-two function.

Level three – This level defines limited self-driving automation, where the vehicle takes control most of the time. The driver is expected to occasionally take over, with comfortable transition times. Highly autonomous driving on the highway would fall into level three. (Audi's A8 is said to be the first production-ready Level 3 autonomous car, capable of steering, braking and accelerating itself on highways at up to 35 mph.)

Level four – Full self-driving automation. The vehicle is completely in control and the driver isn't expected to take over.

Industry analysts say the transition of control between drivers and automated systems will be critical for vehicles at levels two and three. Autonomous systems operating at these levels will be human-machine-interface intensive. Sensors will make this sort of interface possible. The human driver will be monitored constantly to assess his or her state of mind when the system gets ready to hand the reins back. The vehicle will have to communicate status and status changes to the driver. The current thinking is that such feedback will likely involve a heads-up display and haptic means such as vibrating pedals, steering wheels or seats, or tightening seat belts.
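In software, these categories often end up as a simple enumeration that gates how much attention the system must demand from the driver. Here is a hypothetical sketch; the mapping mirrors the level definitions above, and the attention flag is illustrative, not from any production system:

```python
# A hypothetical encoding of the autonomy levels described above, with an
# illustrative flag for whether the human must stay ready to take over.
from enum import IntEnum

class AutonomyLevel(IntEnum):
    NO_AUTOMATION = 0         # human in total control
    FUNCTION_SPECIFIC = 1     # single automated functions (e.g., cruise)
    COMBINED_FUNCTION = 2     # multiple controls automated together
    LIMITED_SELF_DRIVING = 3  # vehicle drives most of the time
    FULL_SELF_DRIVING = 4     # driver not expected to take over

def driver_must_stay_ready(level: AutonomyLevel) -> bool:
    """Per the definitions above, only level four releases the driver."""
    return level < AutonomyLevel.FULL_SELF_DRIVING

assert driver_must_stay_ready(AutonomyLevel.COMBINED_FUNCTION)
assert not driver_must_stay_ready(AutonomyLevel.FULL_SELF_DRIVING)
```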
THE EYES OF AN AUTONOMOUS VEHICLE

The sensing part of the architectural picture for autonomous vehicles is becoming clear. Designers now divide sensing tasks into two categories: sensors that look outside the vehicle and sensors that look at the human driver to gauge his or her state.

It looks as though driver monitoring will involve cameras and software designed to detect drowsiness and fatigue, distraction, and similar conditions. They might also play a role in security and in metering how the autonomous system should issue warnings. For example, when it comes to issuing a warning, the system might factor in how closely the human driver is paying attention before it decides what the intensity of the warning should be.

Cameras and vision systems are the primary focus for driver sensing simply because cameras are relatively compact, inexpensive and widely available. But there are a lot of issues surrounding their use. One is simple user acceptance; some drivers aren't comfortable with a camera constantly pointed at them. Another problem is that sunglasses, brimmed hats, and other fashion items can obscure the driver's face, making it difficult for software to decide whether a driver is paying attention. The level of ambient light could present problems as well.

Autonomous system software has used classical methods of determining the driver's state of attention that include eye tracking, eyelid estimators, face recognition for the driver's mode, and so forth. However, such systems have more recently begun to implement artificial-intelligence schemes that factor in driver behavior sensed via other means, such as the movement of the steering wheel or posture in the seat.

When it comes to sensing the environment outside the car, autonomous systems generally divide the task into three categories: environmental perception, localization, and communication. The sensing of the environment around the car generally involves the use of both lidar and radar.
Lidar sensors measure the distance to an object by calculating the time it takes a pulse of light to travel to the object and back. The lidar unit usually sits atop the vehicle where it can generate a 360° 3D view of potential obstacles to be avoided. Vehicular lidar systems typically use a 905-nm wavelength that can provide up to 200 m of range in restricted fields of view (FOV). Some manufacturers now make 1,550-nm units that have longer range and are more accurate.

One problem with lidar units is their expense. It's said, for example, that some of the lidar units in the DARPA Autonomous Vehicle Grand Challenge cost more than the vehicles they sat on. However, costs are dropping, partly thanks to the development of solid-state lidar (SSL), which eliminates the scanning mirrors and other moving parts in today's technology. SSLs currently cover smaller FOVs, but their lower cost makes it practical to equip vehicles with multiple units. For example, some systems under development use four to six lidars. Among them will generally be one high-definition lidar, where high-definition typically means using between 64 and 128 lasers to generate pulses, yielding an angular resolution of less than 0.1°. High-definition lidar can generally resolve cars and foliage up to 120 m away.

Autonomous vehicles will also carry between three and five long-range, medium-range and short-range radars on their sides to detect oncoming traffic. Here short-range radar (SRR) generally has a range of 0.2 to 30 m, medium-range radar (MRR) covers the 30 to 80 m range, and long-range radar (LRR) 80 to more than 200 m.
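The ranging math behind these figures is simple time-of-flight. A minimal sketch, using only the distance formula and the radar range classes quoted above (the 800-ns example pulse is an assumption for illustration):

```python
# Lidar ranging as described above: distance = (speed of light x
# round-trip time) / 2. The SRR/MRR/LRR buckets follow the ranges
# quoted in the text; the example pulse time is an assumed value.

C = 299_792_458.0  # speed of light, m/s

def lidar_distance_m(round_trip_s: float) -> float:
    """Distance to a target from a pulse's round-trip time of flight."""
    return C * round_trip_s / 2.0

def radar_class(range_m: float) -> str:
    """Bucket a detection range into SRR/MRR/LRR per the figures above."""
    if 0.2 <= range_m < 30.0:
        return "SRR"
    if 30.0 <= range_m < 80.0:
        return "MRR"
    if 80.0 <= range_m <= 200.0:
        return "LRR"
    return "out of band"

if __name__ == "__main__":
    t = 800e-9                    # an assumed 800-ns round trip
    d = lidar_distance_m(t)
    print(f"{d:.1f} m -> {radar_class(d)}")   # ~119.9 m -> LRR
```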
FUNCTIONAL DIAGRAM – RADAR SENSOR FOR AUTONOMOUS VEHICLES
One example of radar sensors being created for use in autonomous vehicles is the AWR1642 device from Texas Instruments. It operates in the 76-to-81-GHz band with two transmitters and four receivers. The device includes a DSP subsystem and an ARM R4F-based processor subsystem which handles radio configuration, control, and calibration. Programming-model changes let the sensor work for short-, mid-, and long-range uses, with the possibility of dynamic reconfiguration for implementing multimode sensing.
Visible on this Chevy Bolt are some of the numerous sensors that permit autonomous operation. GM is testing autonomous vehicles on public roads in Michigan and is producing autonomous test vehicles. Testing is also underway on public roads in San Francisco and Scottsdale, Ariz. (Photo by Steve Fecht for General Motors)
LRR is the de facto sensor for adaptive cruise control (ACC) and highway automatic emergency braking systems (AEBS). One problem is that systems that depend on LRR may not react correctly in such scenarios as when a car cuts in front of the vehicle, when there are thin-profile vehicles such as motorcycles staggered in a lane, and when a curvature of the road potentially confuses the ACC system about which car to follow. To overcome such limitations, some developers pair radar with cameras to provide additional context. One reason is that camera images can be analyzed for azimuth angles, a measurement not possible with radar. Besides helping to interpret radar returns, cameras are used to illuminate blind spots and to detect a variety of features that include lane markings, lane width and curvature, stop signs, speed limit signs, pedestrians, and buildings. Some prototypes currently carry four to eight cameras aimed forward, back, and toward each side.

PERCEPTION SENSORS FOR AUTONOMOUS VEHICLES: Autonomous vehicle prototypes generally carry between four and six lidars, three to five radars, four to eight cameras, and multiple ultrasonic sensors for sizing up the environment outside the car. Development efforts today aim to reduce the number of sensors needed to handle autonomous functions.

Autonomous systems have in addition traditionally used a lot of ultrasonic sensors because they are inexpensive. It is not uncommon to find 10 to 16 on a prototype because they are easy to integrate into a vehicle. But there is a lot of redundancy and overlap in what all these sensors do. The feeling is that once proof-of-concept work is complete, manufacturers will begin working toward reducing the number of sensors used on production vehicles.

Radar and camera units are strictly for perception tasks. Another set of sensors is used for localization – basically, finding out where the vehicle is on a map. Prototypes use high-grade inertial measurement units, as well as differential GPS receivers, for highly accurate localization.

Inertial measurement units (IMUs) measure linear and angular motion, usually with a triad of gyroscopes plus accelerometers or magnetometers. They generally output angular velocity and acceleration.

Differential GPS units are an improvement on the Global Positioning System that provides better location accuracy, boosting the 15-m nominal GPS accuracy to about 10 cm. The better accuracy comes from the use of networked ground-based reference stations that broadcast the difference between the positions indicated by the GPS satellite systems and their own known positions. Shorter-range transmitters then send out the digital correction signal locally. Differential GPS receivers can make use of the correction signal up to about 200 miles away from the reference station, though the accuracy drops in proportion to the distance.

One significant area of autonomous vehicle research is in how to fuse the various sensing technologies. For example, researchers are interested in fusing a lidar with a camera to both improve performance and enable the use of lidar with lower resolution (and cost) without sacrificing capabilities. Ditto for radar and cameras. Also, indications are that ultrasonic sensors may be phased out of autonomous sensing simply because lidar, radar, and camera technology may make them superfluous.

There is a discussion about the level at which sensor data should be fused – at the object level or at the lower level of raw returns from lidar and radar. Indications are that arguments for lower-level fusion are holding sway. The implication is that the data manipulation necessary for this kind of system necessitates a powerful central processor rather than several smaller units distributed around the vehicle. The centralized processor – dubbed a unified controller by autonomous practitioners – takes care of sensor fusion for a variety of tasks. These tasks include detecting and tracking pedestrians, vehicles, lanes, and vehicle traffic, as well as path prediction, vehicle control, and managing the HMI.
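To make the fusion idea concrete, here is a minimal, hypothetical sketch that combines a lidar and a radar range estimate of the same object by inverse-variance weighting, which is the core of a one-dimensional Kalman update. The sensor variances are made-up numbers, and this is not any particular vendor's algorithm:

```python
# A minimal, hypothetical sketch of sensor fusion: combine a lidar and a
# radar range estimate of the same object by inverse-variance weighting
# (a 1-D Kalman update step). Sensor variances are assumed values.

def fuse_ranges(lidar_m: float, lidar_var: float,
                radar_m: float, radar_var: float) -> tuple[float, float]:
    """Return the fused range estimate and its variance."""
    k = lidar_var / (lidar_var + radar_var)   # weight toward lower variance
    fused = lidar_m + k * (radar_m - lidar_m)
    fused_var = (1.0 - k) * lidar_var
    return fused, fused_var

if __name__ == "__main__":
    # Assumed: lidar is precise (low variance), radar is noisier but robust.
    est, var = fuse_ranges(lidar_m=48.2, lidar_var=0.04,
                           radar_m=49.0, radar_var=0.25)
    print(f"fused range = {est:.2f} m (variance {var:.3f})")
```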
Data coming back from autonomous vehicle sensors positioned on the outside of the car go toward creating a perception of the vehicle’s surroundings. Perceived information is often depicted as a virtual environment for testing purposes, as with this display created by software supplier Wind River Systems.
V2X

Autonomous vehicles have a communication sensor suite that normally includes dedicated short-range communication (DSRC) technology and cellular LTE connectivity. Each has a different purpose. Cellular phone technology is envisioned as a means for downloading and updating maps, providing corrections to the GPS receiver, and similar tasks.

DSRC uses 75 MHz of spectrum around the 5.9-GHz band and is based on the 802.11p standard. Consequently, it can make use of relatively inexpensive Wi-Fi chipsets and has a range of 300 m or more. Its main use now is to warn about hazards around the car. But 802.11p is seen as particularly useful for both vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communications because it can support low-latency, secure transmissions and can handle the rapid and frequent handovers that characterize a vehicle environment. And adverse weather conditions generally don't cause problems.

Expectations are that future versions of DSRC will handle V2P collision warning, V2V platooning through cooperative adaptive cruise control, V2I for weather, and more. The DOT has identified more than 40 V2I ideas, such as the ability to pay for parking and tolls wirelessly; identify when a car approaches a curve too quickly and alert the driver; adjust traffic signals to accommodate first responders in an emergency; and alert drivers to conditions such as road construction.

Of course, it can get complicated anticipating all possible eventualities of those scenarios. That's why autonomous developers are investigating the use of machine learning and implementing sensor perception through neural networks. The appeal of machine learning is that it could potentially bypass the complicated math traditionally used in object and feature detection. It can potentially be faster to implement and perform better than classical methods if given enough training data. The challenges, however, include verifying and validating the machine-learning system. Practitioners point out that when a neural network system fails, it's not always possible to pinpoint why. Thus debugging can be problematic. This is not exactly the warm fuzzy feeling developers might hope for when designing vehicles that could potentially run down people in crosswalks.
REFERENCES
U.S. Dept. of Transportation, What is DSRC?, www.its.dot.gov/factsheets/dsrc_factsheet.htm
AASHTO Subcommittee on Transportation Systems Management and Operations, www.systemoperations.transportation.org
The challenges of HIL testing and ADAS
Engineers running hardware-in-the-loop simulations face difficulties associated with how to devise sensor signals that truly exercise advanced driver assistance systems.
DAVID A. HALL | NATIONAL INSTRUMENTS CORP.
More than 50 years ago, in 1959, the Cadillac Cyclone XP-74 concept car featured two modified aircraft radars designed to alert the driver about oncoming traffic. Today, an automotive radar sensor is smaller than a hockey puck and has transitioned from concept to reality. Primitive techniques for collision and lane-change avoidance are being replaced with advanced driver assistance systems (ADAS). These new systems introduce new design and test challenges.

Nose cones on the front of a Cadillac Cyclone concept car housed modified aircraft radars.

Modern ADAS architectures combine complex sensing, processing, and algorithmic technologies into what will ultimately become the guts of autonomous vehicles. For consumers, the growth in ADAS technology provides comfort and convenience. For engineers, the evolution of ADAS provides a combination of job security mixed with quite different design and test challenges.

As ADAS evolve from simple collision-avoidance systems to fully autonomous vehicles, they demand sensing and computing technologies that are complex. For example, consider the sensing technology on the Tesla Model S, which combines information from eight cameras and 12 ultrasonic sensors as part of its Autopilot technology. Many experts claim the 2018 Audi A8 is the first car to hit Level 3 autonomous operation. At speeds up to 37 mph, the A8 will start, accelerate, steer and brake on roads with a central barrier without help from the driver. The car contains 12 ultrasonic sensors on the front, sides and rear; four 360° cameras on the front, rear and side mirrors; a long-range radar and laser scanner at the front; a front camera at the top of the windscreen; and a mid-range radar at each corner.

As a result, autonomous vehicles employ significantly more complex processing technologies and generate more data than ever before. As an example, the Tesla Model S contains 62 microprocessors – more than three times the number of moving parts in the vehicle. In addition, Intel recently estimated that tomorrow's autonomous vehicles will produce four terabytes of data every second. Making sense of all this data is a significant challenge – and engineers have experimented with everything from simple PID loops to deep neural networks to improve autonomous navigation.
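For readers unfamiliar with the simplest end of that spectrum, here is a minimal, hypothetical PID controller of the kind the text alludes to. The gains, loop rate, and speed setpoint are illustrative assumptions, not values from any production system:

```python
# A minimal, hypothetical PID speed controller - the "simple PID loop" end
# of the spectrum mentioned above. Gains and setpoint are illustrative only.

class PID:
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint: float, measured: float) -> float:
        """One control cycle: return an actuator command from the error."""
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hold 30 m/s: each cycle, feed in the measured speed and apply the
# returned throttle command (clamping and vehicle model not shown).
pid = PID(kp=0.5, ki=0.1, kd=0.05, dt=0.02)   # assumed 50-Hz loop
cmd = pid.update(setpoint=30.0, measured=27.5)
```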
ADAS decision-making uses inputs from multiple types of sensors.
SYSTEM-LEVEL TEST

Increasingly complex ADAS technology makes a lot of demands on test regimes. In particular, hardware-in-the-loop (HIL) test methods, long used in developing engine and vehicle-dynamics controllers, are being adapted to ADAS setups. HIL test systems use mathematical representations of dynamic systems which react with the embedded systems being tested. A HIL simulation may emulate the electrical behavior of sensors and actuators and send these signals to the vehicle electronic control module (ECM). Likewise, an ADAS HIL simulation may use real sensors to stimulate an emulation of the ECM and generate actuator control signals.

For example, a HIL simulation platform for the development of automotive anti-lock braking systems may have mathematical representations for subsystems that include the vehicle dynamics such as suspension, wheels, tires, roll, pitch and yaw; the dynamics of the brake system's hydraulic components; and road qualities. A point to note is that the in-the-loop notation means the reaction of the unit being tested influences the simulation.

HIL methods verify and validate the software in the physical target ECUs. So HIL test benches must provide the ECU with realistic real-time stimuli and simulated loads. Multiple ECUs get tested in network HIL setups that wring out the bus systems, sensors, and actuators. More recently, vehicle-in-the-loop methods use a physical car to replace vehicle simulations and most of the virtual ECUs. This approach comes in handy for testing safety-critical ADAS functions by, say, crashing into virtual obstacles while operating the real vehicle.
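As a rough illustration of the closed-loop idea (not any vendor's API), the skeleton below steps a trivial one-state plant model and lets a stand-in controller react to it each cycle. In a real bench, the controller would be the physical ECU reached through I/O hardware:

```python
# A rough, hypothetical skeleton of a HIL-style closed loop. The "plant"
# is a one-state vehicle-speed model; controller_stub stands in for the
# real ECU, which an actual bench would drive through I/O hardware.

def controller_stub(speed: float) -> float:
    """Toy brake controller: brake harder the farther above 25 m/s we are."""
    return max(0.0, min(1.0, (speed - 25.0) * 0.2))

def run_hil_loop(steps: int = 500, dt: float = 0.001) -> float:
    speed = 40.0                              # initial condition, m/s
    for _ in range(steps):
        brake = controller_stub(speed)        # read the UUT's command
        decel = 9.0 * brake                   # plant: brake -> deceleration
        speed = max(0.0, speed - decel * dt)  # integrate the plant state
        # ...the UUT's reaction feeds back into the next cycle's stimulus.
    return speed

print(f"speed after 0.5 s: {run_hil_loop():.1f} m/s")
```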
Typical radar test patterns.
The way an ADAS typically works is that it first processes the raw sensor data via feature- and object-recognition algorithms. The result is a data set that resembles a grid-based map of the environment or a list of recognized objects (such as trees, pedestrians, vehicles, and so forth). A situation-analysis algorithm combines this processed data and estimates the current traffic situation. This situational analysis gets forwarded to the ADAS application. The application finally decides on actions to take, such as slowing down or triggering an emergency brake.

Many HIL tests of ADAS functions boil down to sending simulated object lists to the UUT (unit under test). Depending on the sensor systems involved, the object lists may include information about object kinematics as well as whether the object is a pedestrian, vehicle, or something else.

Besides actual sensor data, the ADAS application needs supplementary vehicle data from other ECUs. This information usually passes to the UUT via the CAN bus and generally includes details like yaw rate, acceleration, engine speed, steering angle, or GPS data. All ADAS require at least some of this information to check the plausibility of the situation analysis. The ADAS makes safety-critical interventions only if all this information is consistent.
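To show what "sending simulated object lists" can look like at the byte level, here is a hypothetical encoding of one tracked object into an eight-byte CAN payload. The field layout, scale factors, and message ID are invented for illustration; a real bench follows the UUT's actual DBC definition:

```python
# Hypothetical packing of one simulated object-list entry into an 8-byte
# CAN payload. Field layout, scale factors, and the 0x300 ID are invented
# for illustration; real benches follow the UUT's DBC definition.
import struct

OBJECT_CLASSES = {"pedestrian": 0, "vehicle": 1, "other": 2}

def pack_object(obj_id: int, cls: str, range_m: float,
                rel_speed_mps: float) -> bytes:
    """Encode id, class, range (0.1-m units) and relative speed (0.01 m/s)."""
    return struct.pack(
        "<BBhhH",
        obj_id,
        OBJECT_CLASSES[cls],
        int(range_m * 10),          # signed, 0.1-m resolution
        int(rel_speed_mps * 100),   # signed, 0.01-m/s resolution
        0,                          # reserved padding to fill 8 bytes
    )

payload = pack_object(obj_id=7, cls="vehicle",
                      range_m=48.2, rel_speed_mps=-3.5)
assert len(payload) == 8            # fits a classic CAN frame (say, ID 0x300)
```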
When testing ADAS functions within HIL regimes, there can be a variety of UUTs and interfaces. For example, a single ECU might be tested while it runs core emergency functions such as a radar-based emergency brake assist. Or a dedicated ADAS ECU might receive sensor information from other control units. Depending on the setup, multiple real ECUs may be part of the test bench, connected via automotive network systems like CAN, FlexRay, or Automotive Ethernet.

In a HIL test bench there can be several points at which virtual data is added. One option is feeding simulated data to physical sensors. Of course, real ADAS have multiple sensors, so this strategy entails simultaneous generation of multiple signals. In the case of cameras, for example, engineers might show each camera a sequence of images of actual scenery via a screen or projector. For radar returns, the HIL system needn't generate the radar output, just the simulated echoes coming back.

One advantage of using physical sensors in HIL testing is that there's no need to modify the UUT for testing purposes – the simulated signals come via the same physical interfaces as found in real vehicles. Among the biggest challenges is assuring the quality of the injected signals. For example, images projected on a screen might not represent the dynamic range a camera would see in real life. The classic example is that of a car driving straight into a blazing sunset, then descending a hill and plunging into dusk. All in all, the injection of data into a physical sensor involves few modifications of the UUT, but the accurate representation of scenarios can be quite demanding and is not currently possible for all sensors.

Consequently, engineers sometimes inject virtual data after digitization by, say, injecting data after the ADC stage of a radar, or by providing electrical stimulation of a camera's imager. This approach is quite product-specific – there is no such thing as a standard radar or camera hardware/software interface, so establishing the connection may take some work. Additionally, the fact that data may no longer come from the physical sensor can force some software and hardware modifications of the UUT.

TYPICAL HIL ARCHITECTURE: A typical test configuration for an ADAS HIL simulation depicts the different methods of simulating sensor data. Test equipment can inject digital data into the sensor channel to simulate sensor inputs. Alternatively, test equipment can develop sensor inputs in the form of images for cameras or echo patterns for radars to simulate actual sensed data. There are advantages and drawbacks to each approach.

PLATFORM-BASED APPROACH TO HIL

Manufacturers such as National Instruments offer a platform-based approach to HIL testing. Key features of NI's HIL test approach include tight synchronization among numerous PXI instruments to enable simulations of driving scenarios and sensors. Particularly important in ADAS and autonomous driving applications is NI's FPGA technology, which enables engineers to design HIL test systems with extremely fast loop rates for quick decision making.

One recent example of an HIL test system using the platform-based approach to sensor-fusion testing was demonstrated by a consortium called ADAS Innovations in Test. This group is a collaboration between NI alliance partners S.E.T.,
Konrad Technologies, measX, and S.E.A. At the recent NIWeek conference, the group demonstrated an ADAS test setup that can synchronously simulate radar, lidar, communications, and camera signals for an ADAS sensor. In one case, the setup was able to simulate a virtual test drive using IPG CarMaker and NI VeriStand software.

Modular systems such as PXI simulate many vehicular physical signals – effectively recreating the physical environment of the ADAS sensor or ECU. Synchronization is a critical requirement for the PXI modules because all these signals must be precisely simulated in parallel. The ECU needs the radar, V2X and camera signals to arrive simultaneously if it is to process and understand the scenario and act accordingly.

HIL test techniques based on a platform let engineers simulate a virtually unlimited duration of "driving time." The lengthy test time provides more opportunities for finding problems and lets engineers better understand how the embedded software performs in a wide range of situations. As a result of simulating driving conditions in the lab, engineers can identify critical design flaws much earlier in the design process. For example, at Audi AG, the radar team recently adopted a PXI-based radar simulation system. Project lead Niels Koch says radar hardware-in-the-loop simulation enabled them to "simulate ten years of sensor environments within a few weeks."
THE V MODEL OF AUTOMOTIVE DEVELOPMENT

The V model is frequently used in the automotive industry to depict the relationships of vehicle-in-the-loop (ViL), hardware-in-the-loop (HiL), software-in-the-loop (SiL), and model-in-the-loop (MiL) methods. Stages depicted on the left have a corresponding testing counterpart on the right. MiL settings test a model of the functions to be developed; MiL is applied in early stages to verify basic decisions about architecture and design. SiL setups test the functions of the program code compiled for the target ECU but without including real hardware. HiL methods verify and validate the software in the physical target ECUs. ViL tests replace the vehicle simulation and most of the virtual ECUs with a real vehicle; the simulated parts of the environment are injected into the vehicle sensors or ECUs.
Fifty years ago, a front-mounted radar system that notified the driver of oncoming traffic was a gimmick. Five years from now, it will be the difference between life and death for passengers in autonomous vehicles. Given the impending safety and regulatory considerations of this technology, engineers will use HIL test techniques to simulate billions of miles of driving.

Advanced systems like autonomous vehicles are quickly rewriting the rules for how test-and-measurement equipment vendors must design instrumentation. In the past, test software was merely a mechanism to communicate a measurement result or measure a voltage. Going forward, test software is the technology that allows engineers to construct increasingly complex measurement systems capable of characterizing everything from the simplest RF component to a comprehensive autonomous vehicle simulation. As a result, software remains a key investment area for test equipment vendors – and the ability to differentiate products with software will ultimately define the winners and losers in the industry.
REFERENCES
ADAS Innovations in Test, www.adas-iit.com
National Instruments Corp., www.ni.com
Sensors in the driving seat
Advances in sensor technology help make autonomous vehicles safe and reliable.
RALF BORNEFELD | INFINEON TECHNOLOGIES
Many drivers on today's roads are already supported by advanced driver-assistance systems (ADAS) for functions such as parking assist, cruise control and automated lighting. Now, more intelligent and connected environmental sensors are making autonomous vehicles the next logical step for the industry. But as the development of driverless cars steams ahead, the debate continues as to what technology the ideal car of the future should offer and whether it should be completely driverless.

Since the invention of the first engine-driven vehicles in the late 1800s, cars have given people the freedom to go wherever and whenever they want in comfort and style. Part of the reward of owning a car is the thrill of being in complete control of a large, fast-moving machine. Passionate drivers might therefore feel averse to the idea of completely relinquishing this sense of independence to an autonomous vehicle that drives itself.

However, there are times when driving can be more draining than exciting, as when commuting in heavy traffic or during long-distance journeys. Americans were stuck in traffic for a cumulative eight billion hours in 2015. The ability to switch on autonomous mode and let the car take over would not only relieve tired drivers but also reduce accidents.
As more new cars begin to be fitted with intelligent connected sensors, vehicles will be able to pass on data to others through vehicle-to-vehicle (V2V) communications about factors like traffic congestion and dangerous conditions.
THE MISSION FOR ZERO FATALITIES

In 1965, the U.S. recorded 47,089 deaths from car crashes. Fifty years later, despite the population rising from 198.7 million to an estimated 324 million, statistics show fatalities dropped to 35,092 in 2015. This is thanks to the introduction of safety regulations over the years, such as the mandatory use of seat belts, and increasingly advanced safety mechanisms such as anti-lock braking systems (ABS). As the growing number of intelligent, connected sensors designed into new cars is steadily making driving safer, the vehicles of the future will be designed to attain "Vision Zero" – road traffic without any fatalities. It's thought that human error causes some 90% of motor vehicle crashes, and with the implementation of tried-and-tested autonomous driving technology it is hoped this figure could be drastically reduced in the future.

In 2006, cars carried an average of 40 sensors. Currently, this figure sits at around 90, and the self-driving car of 2025 might have double that number. Fully automated vehicles are set to go into mass production within the next decade, and this will no doubt improve our lives, offering greater flexibility, comfort and safety. Technology heavyweights such as Google, Uber, Intel, Apple and even Samsung have partnered with auto manufacturers in the battle to be recognized as the architects of a brave new world of autonomous vehicles.
A combination of both radar and lidar sensors on vehicles will help to reduce accidents caused by human error. Expectations are that sensed information will be conveyed to human vehicle occupants via methods such as heads-up displays.
Although the technology is generally not yet advanced enough to permit the driver's hands to be removed from the wheel or attention to be taken away from the road, soon it will be possible for the driver to completely disconnect while the vehicle is driving itself. Google is considered one of the earliest pioneers, with its self-driving vehicles clocking 1.5 million autonomous miles by March 2016. Google's parent company, Alphabet, subsequently started a new self-driving car spin-off company called Waymo. Tesla Motors rolled out its Autopilot feature in a software over-the-air (SOTA) update in January 2016. Autopilot allows the Tesla Model S to act autonomously on limited-access highways with the full attention of the driver. Ride-sharing company Uber began trialing a fleet of self-driving Ford Fusions in Pittsburgh in September 2016, each vehicle equipped with 20 cameras, seven lasers, GPS, radar and lidar. Uber's main rival, Lyft, recently bolstered by a $500 million investment from General Motors and a technology partnership with Waymo, has also announced a forthcoming self-driving car trial in Boston.
ANATOMY OF A SOLID-STATE LIDAR SENSOR

Lidar systems can be better than human senses in some cases, for example, in detecting potential obstacles on the road. One example of a development in this area is a microelectromechanical system (MEMS) developed by Dutch company Innoluce, acquired by Infineon. The MEMS lidar device measures just 3 x 4 mm and consists of an oval-shaped mirror on a bed of silicon; a reinforcement structure keeps the mirror surface very flat at high oscillation frequencies. Actuators use electrical resonance to make the mirror oscillate and change the direction of the laser beam.
Of course, the trials on public roads have inevitably been marred by some wobbles and even tragedy. Uber's trials in Pittsburgh, San Francisco and Tempe, Ariz. have been plagued with legal issues, and one of its cars in Tempe flipped over in an accident at a yield sign. The first fatality happened in May 2016 in a high-speed collision between a Tesla Model S and a turning 18-wheeler truck. The subsequent inquiry was closed without a recall, revealing that since Autopilot was rolled out, the overall Tesla accident rate had dropped by 40%.

SHRINKING SENSORS

Lidar, radar and optical are three of the most important sensor technologies for the development of autonomous cars. The ultimate goal is to recreate the human power of reliable judgment, with the ability to make split-second decisions based on a combination of information from the sensors and lessons learned from previous experiences. While radar uses radio-frequency electromagnetic waves, lidar sends out laser beams to scan an area and then analyzes the reflections that bounce back. Lidar systems can be better than human senses in some cases, for example, in detecting even small objects on the road. However, up until recently they have not only been expensive – between $10,000 and $50,000 each – but also insufferably bulky because of their reliance on mirrors positioned to direct the laser beams. Technology companies are using different tactics to try to shrink lidar proportions, some by hoping to design a solid-state lidar without any moving parts and others by using flashes of laser light instead of constant beams.
Infineon has taken a slightly different approach by focusing on a microelectromechanical system (MEMS) developed by Dutch company Innoluce. Infineon acquired Innoluce in October 2016 with the aim of being able to offer technological expertise in all three complementary sensor systems required for autonomous driving. The MEMS lidar device, which measures just 3 x 4 mm, consists of an oval-shaped mirror on a bed of silicon. Actuators use electrical resonance to make the mirror oscillate and change the direction of the laser beam. With a range of 250 m and the ability to scan 5,000 data points/sec, the MEMS lidar is small, robust and reliable, and expected to cost automakers less than $250.

Lidar systems will need to be semiconductor-based – thus becoming more compact, cost-effective and robust – to become a standard feature in all car classes, and they are going to be essential for self-driving cars to accurately identify roadside conditions such as traffic signs, road obstacles and pavement markings.

The good news is that the burden of detecting all approaching dangers will soon be shared. As more new cars are fitted with a full array of intelligent sensors connected to the network, road users will be able to pass on data to others about things like traffic congestion and dangerous road conditions. Cars that drive themselves will also require enhanced computing power, and domain computer architectures connected via high-speed buses will need to become an integral part of the vehicle. It is therefore important to create redundant systems and domain architecture that guarantee fail-safe operation and safety.
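As a rough, hypothetical illustration of how a resonant MEMS mirror sweeps a beam, the snippet below samples the mirror angle at a steady pulse rate. The resonant frequency and peak deflection are assumed values for illustration, not Innoluce or Infineon specifications:

```python
# A rough, hypothetical model of a resonant MEMS lidar mirror: the scan
# angle follows a sinusoid at the mirror's resonant frequency. The 2-kHz
# resonance and +/-15 degree deflection are assumed values, not
# Innoluce/Infineon specifications.
import math

RESONANT_HZ = 2_000.0        # assumed mirror resonance
MAX_DEFLECTION_DEG = 15.0    # assumed peak optical scan angle

def mirror_angle_deg(t_s: float) -> float:
    """Instantaneous scan angle of the oscillating mirror at time t."""
    return MAX_DEFLECTION_DEG * math.sin(2 * math.pi * RESONANT_HZ * t_s)

# Fire pulses at a steady rate and record where each one was steered.
PULSE_RATE_HZ = 5_000.0      # matches the 5,000 points/sec quoted above
for n in range(10):
    t = n / PULSE_RATE_HZ
    print(f"t={t * 1e3:5.2f} ms  angle={mirror_angle_deg(t):+6.2f} deg")
```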
As with every new technology, success will be long in the making and hard-won after the inevitable initial failures and setbacks. Absolute reliability is of utmost importance to ensure that drivers can confidently delegate responsibility to the vehicle, without fears about safety. A combination of all the most innovative sensor technologies will provide a segmented safety cocoon to ensure that autonomous vehicles of the future are as safe as possible.

The 2002 science-fiction movie "Minority Report" depicted cars in the year 2054 as sleek self-driving vehicles shuttling through a vast networked transit system. However, the futuristic Lexus 2054 that actor Tom Cruise found himself in was still capable of being driven manually, to facilitate a dramatic car chase. The manual-override compromise on automated vehicles might just be enough to safeguard the driving pleasure of car enthusiasts well into the future.
REFERENCES
Americans were stuck in traffic for 8 billion hours in 2015, CNN Money, March 15, 2016, www.money.cnn.com/2016/03/15/news/us-commutes-traffic-cars/index.html
U.S. Dept. of Transportation, National Highway Traffic Safety Administration, Traffic Safety Facts 2015, www.crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812384
Human Error as a Cause of Vehicle Crashes, The Center for Internet and Society at Stanford Law School, December 18, 2013, www.cyberlaw.stanford.edu/blog/2013/12/human-error-cause-vehicle-crashes
The Verge, Lyft teams up with NuTonomy to put 'thousands' of self-driving cars on the road, June 6, 2017, www.theverge.com/2017/6/6/15742274/lyft-nutonomy-self-driving-car-partnership-boston-pilot
Wall Street Journal, Uber Resumes Self-Driving-Vehicle Program After Arizona Accident, March 27, 2017, www.wsj.com/articles/uber-resumes-self-driving-vehicle-program-after-arizona-accident-1490641844?mod=mktw
The Verge, Fatal Tesla Autopilot accident investigation ends with no recall ordered, January 19, 2017, www.theverge.com/2017/1/19/14323990/tesla-autopilot-fatal-accident-nhtsa-investigation-ends
ADAS developers contemplate sensor fusion
Semiconductor makers now field sensor equipment complete with built-in safety systems that specifically target autonomous driving applications.
LANCE WILLIAMS | ON SEMICONDUCTOR

Increasingly sophisticated advanced driver assistance systems (ADAS) are helping cars move up through defined levels of autonomy toward the day when fully autonomous driving is a reality. ADAS is an automotive megatrend that will enable the biggest change in how the world's population gets from A to B since the inception of commercial flight. In the automotive sector, almost all the discussion is about the fully autonomous or driverless vehicle.

Safety is the key driver in the march to make vehicles more autonomous. WHO 2017 figures show 1.25 million people die in road-traffic accidents each year, and a further 20 to 50 million people are injured or disabled. The cause of these accidents is overwhelmingly human error. The U.S. Dept. of Transportation has found that 94% of the two million accidents happening annually were caused by driver mistakes.

Research began in the 1980s, and today there are about 50 companies actively developing autonomous vehicles, as well as numerous university projects pushing boundaries. Autonomous emergency braking, lane-departure warning systems, active cruise control, and blind-spot monitoring are common features on vehicles produced today, and automated parking is proliferating at pace. These systems provide valuable input to the driver, who ultimately remains in control, for now.

As ADAS evolves, the industry is reaching a tipping point where the vehicle itself provides integrated monitoring. Companies like ON Semiconductor are addressing the trend through both hardware and software innovation and support for other activities in ADAS. Sensor fusion is the key to passing this tipping point. Diverse systems in the vehicle are becoming linked, boosting the ability to make more complex, safety-critical decisions and providing a redundancy that will help prevent errors that could lead to accidents.
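As a toy illustration of the redundancy idea (not any vendor's implementation), the sketch below applies two-out-of-three voting to redundant range measurements, a classic pattern for masking a single faulty sensor. The agreement tolerance is an assumed value:

```python
# A toy illustration of redundancy via two-out-of-three (2oo3) voting
# over redundant range measurements. The 2-m agreement tolerance is an
# assumed value; this is a generic masking pattern, not a vendor design.

def vote_2oo3(a: float, b: float, c: float, tol: float = 2.0):
    """Return the mean of a pair that agrees within tol, else None (fault)."""
    pairs = [(a, b), (a, c), (b, c)]
    agreeing = [(x + y) / 2 for x, y in pairs if abs(x - y) <= tol]
    return agreeing[0] if agreeing else None

print(vote_2oo3(49.8, 50.1, 73.6))  # ~49.95 - the outlier 73.6 is outvoted
print(vote_2oo3(10.0, 55.0, 90.0))  # None - no two sensors agree: flag fault
```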
Here’s how an ADAS might size up a view through a typical windshield. Cameras on the vehicle would feed the view to image processors that would recognize vehicles, lane divider lines, and road signs. Radar sensors would gauge factors such as the relative speeds and trajectories of the recognized vehicles. The ADAS processor would fuse this information with other data such as the distance to a destination to make decisions about how to guide the vehicle.
SENSOR FUSION
Vision is an increasingly important facet of vehicle technology. Vision sensors now support active safety features that include everything from rear-view cameras to forward-looking and in-cabin ADAS. Engineers combine and process the data from multiple types of sensors to ensure correct decisions, responses and adjustments. This approach may include combinations of image sensors, radar, ultrasound and lidar.
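To make the fusion step concrete, here is a minimal sketch in Python of the kind of association a fusion engine performs: camera detections, which provide bearing and object class, get matched with radar tracks, which provide range and closing speed. The detections, field names and matching threshold are illustrative assumptions, not ON Semiconductor's implementation.

# Hypothetical detections; a real ADAS would pull these from sensor drivers.
camera_objects = [   # bearing and object class from the image pipeline
    {"id": "cam-1", "bearing_deg": -2.0, "label": "vehicle"},
    {"id": "cam-2", "bearing_deg": 15.0, "label": "sign"},
]
radar_tracks = [     # bearing, range and closing speed from the radar
    {"id": "rad-7", "bearing_deg": -1.5, "range_m": 42.0, "closing_mps": 6.2},
]

def fuse(cams, tracks, max_bearing_err_deg=3.0):
    # Pair each camera object with the nearest radar track in bearing.
    fused = []
    for cam in cams:
        best, best_err = None, max_bearing_err_deg
        for track in tracks:
            err = abs(cam["bearing_deg"] - track["bearing_deg"])
            if err < best_err:
                best, best_err = track, err
        fused.append({**cam,
                      "range_m": best["range_m"] if best else None,
                      "closing_mps": best["closing_mps"] if best else None})
    return fused

for obj in fuse(camera_objects, radar_tracks):
    print(obj)  # the vehicle gains range and speed; the sign stays camera-only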
On the road to Level Five: Progression through the levels of ADAS has brought the industry to a tipping point.
AN ADAS VIEW OF LATENCY
Latency is a critical parameter in fault detection processes.
There is a safety standard, ISO 26262, that applies to ADAS. ISO 26262 covers the functional safety of vehicle electronic systems; it is the vehicle-specific adaptation of the broader functional safety standard IEC 61508. Functional safety focuses primarily on risks arising from random hardware faults as well as systematic faults in system design, hardware and software development, production, and so forth.

Under this standard, manufacturers identify hazards for each system, then determine a safety goal for each of them. Each safety goal is classified according to one of four safety classes, called Automotive Safety Integrity Levels (ASIL), covering the range from ASIL-A (lowest) to ASIL-D (highest). An ASIL is determined by three factors: the severity of a failure, the probability of the failure happening, and the ability to control the failure's effects.

Functional safety starts at the sensor. Issues such as latency and high-speed fault detection get close attention from automotive OEMs, Tier Ones and sensor makers alike. There can be catastrophic implications if faults in image sensors used for ADAS go undetected, especially in systems such as adaptive cruise control, collision avoidance and pedestrian detection. The process of detecting one of the thousands of potential failure modes is processor-intensive, requiring an algorithm for each fault. In fact, some faults are impossible to detect at the system level.
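The classification logic lends itself to a compact illustration. The Python sketch below reproduces the commonly published ISO 26262-3 risk graph using the observation that the tabulated ASIL equals the sum of the severity, exposure and controllability indices offset by six. Treat it as an illustration of how the three factors combine, not as a substitute for the standard's tables.

def asil(severity, exposure, controllability):
    # Classify a hazard per the ISO 26262-3 risk graph.
    # severity: 1..3 (S1-S3), exposure: 1..4 (E1-E4),
    # controllability: 1..3 (C1-C3).
    # The published table is equivalent to this offset-sum rule.
    assert 1 <= severity <= 3 and 1 <= exposure <= 4 and 1 <= controllability <= 3
    level = severity + exposure + controllability - 6
    return ["QM", "ASIL-A", "ASIL-B", "ASIL-C", "ASIL-D"][max(level, 0)]

print(asil(3, 4, 3))  # hard to control, high exposure, fatal potential -> ASIL-D
print(asil(3, 4, 1))  # same hazard but easily controllable -> ASIL-B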
Latency within fault-tolerant systems is a primary concern for all system designers. Simply put, this is the time between when a fault occurs and when the system returns to a safe state. For safety, the fault must be detected and addressed before it leads to a dangerous event.

Vision sensors are becoming more advanced, and functional safety fault detection is moving from the ADAS processor to the sensor itself. Detection is built in, and faults are identified by design. The benefits of sensor-based detection are better fault coverage and less pressure on the ADAS processing capacity. Even today, many ADAS struggle to meet ASIL-B compliance. In the near term, the number of systems required to meet ASIL-B will rise dramatically, and future ADAS will need to meet ASIL-C and ASIL-D if widespread use is to become reality. ON Semiconductor is already active in this area: It has equipped many of its image sensors with sophisticated built-in safety mechanisms to ensure functional safety.

Driven by the stringent demands of automotive operating environments, image sensors for ADAS are adding features such as light flicker mitigation (LFM), which overcomes misinterpretation of scenes caused by front or rear LED lighting in the sensor's field of view, superior infrared performance, and the ability to work in either extremely bright or low-light conditions.
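ISO 26262 frames the deadline described above as the fault-tolerant time interval (FTTI): detection, signaling and reaction must all complete before the fault can propagate to a hazard. Here is a minimal budget check, with figures invented for a hypothetical imaging pipeline rather than taken from any vendor:

# Illustrative latency budget for one failure mode. All figures are
# assumptions for a hypothetical imaging pipeline, not vendor data.
FTTI_MS = 200.0        # fault-tolerant time interval for the safety goal
detection_ms = 33.0    # on-sensor check runs once per frame at 30 fps
signaling_ms = 5.0     # time to flag the fault to the ADAS processor
reaction_ms = 50.0     # time for the system to reach the safe state

slack_ms = FTTI_MS - (detection_ms + signaling_ms + reaction_ms)
assert slack_ms > 0, "fault handling misses the FTTI deadline"
print(f"budget met with {slack_ms:.0f} ms of margin")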
MARS IMAGING SYSTEM BLOCK DIAGRAM: A typical camera system designed for prototyping aspects of ADAS vision systems might take the form of this Modular Automotive Reference System (MARS) from ON Semiconductor. Most camera systems targeting use in vehicles have a configuration resembling that of MARS: Here a camera module contains the camera and lens, and a serializer for converting camera data to either LVDS (low-voltage differential signaling) signals or Automotive Ethernet. In applications where humans will view the image, the camera module might also contain an image coprocessor for operations such as correcting lens distortion, tone mapping, and other image pipeline functions. Camera data gets sent back to a controller via serial lines because the camera module might lie 10 m or more from the controller, making bus connections impractical. At the controller, data from multiple cameras get aggregated and deserialized.
All in all, advanced and functionally safe sensors will be at the heart of ADAS. Fusing these sensors with other in-vehicle technology, together with assured cyber security, will move the industry past the tipping point where vehicles can become truly autonomous.
REFERENCES
ON Semiconductor, www.onsemi.com
ISO 26262, www.en.wikipedia.org/wiki/ISO_26262
Managing EV batteries squeezed into odd shapes
Automakers are forced to fit EV batteries into nooks and crannies in an effort to maximize driving range. The resulting convoluted layouts call for a wireless approach to managing cells.
GREG ZIMMER | LINEAR TECHNOLOGY CORP.

Lithium-ion batteries require considerable care if they are expected to function reliably over a long period. They cannot be operated to the extreme ends of their state-of-charge (SOC) range. The capacity of lithium-ion cells diminishes and diverges over time and usage, so every cell in a system must be managed to keep it within a constrained SOC.

It takes tens or hundreds of battery cells to provide sufficient power for a vehicle, configured in a long series string generating as much as 1 kV or more. The battery electronics must operate at this high voltage and reject common-mode voltage effects while differentially measuring and controlling each cell in these strings. The electronics must also be able to communicate information from each cell in a battery stack to a central point for processing.
In addition, the operation of a high-voltage battery stack in a vehicle or other high-power applications entails tough conditions, such as the presence of significant electrical noise and wide operating temperatures. Nevertheless, the battery management electronics are expected to maximize operating range, lifetime, safety and reliability, while minimizing cost, size and weight. Steady advances in battery cell monitoring ICs have given battery packs in automobiles high performance, longer life and reliability. The development of the wireless BMS (battery management system) promises to further improve safety and reliability of the full battery system.
The battery module in Volkswagen’s e-Golf occupies space stretching from the rear axle to the front powertrain, wherever engineers could squeeze it in. The distributed nature of the e-Golf’s batteries characterizes most EVs in production today.
MODULAR BATTERY MANAGEMENT SYSTEMS
In 2008, Linear Technology announced the first high-performance multicell battery stack monitor, the LTC6802. Among its features, the LTC6802 measures up to 12 Li-ion cells with 0.25% maximum total measurement error within 13 msec, and many LTC6802 ICs can be connected in series to enable simultaneous monitoring of every cell in long, high-voltage battery strings. Linear Technology has improved upon the LTC6802 many times over the years. All of the devices in Linear's LTC68XX family are intended for precision battery management within hybrid/electric vehicles (HEVs), electric vehicles (EVs) and other high-voltage, high-power battery stacks.

The LTC6811 is Linear Technology's latest multicell battery stack monitor, incorporating an ultrastable voltage reference, high-voltage multiplexers, and dual 16-bit delta-sigma ADCs. An LTC6811 can measure up to 12 series-connected battery cells with better than 0.04% accuracy. In the fastest ADC mode, all cells can be measured within 290 μsec. With eight programmable third-order low-pass filter settings, the LTC6811 provides outstanding noise reduction. The result is excellent cell measurement accuracy, enabling precise battery management for better battery pack capacity, safety and life.

Each LTC6811 includes two built-in 1-MHz serial interfaces: an SPI interface for connecting to a local microprocessor, and the proprietary two-wire isoSPI interface. The isoSPI interface provides two communication options: Multiple devices can connect in a daisy chain to the BMS master (host processor), or multiple devices can connect and be addressed in parallel by the BMS master.
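The daisy-chain option suggests a simple host-side polling pattern: broadcast one conversion command, then clock out one cell-voltage group per device in the chain. The sketch below illustrates only that pattern; the command words, scaling and transfer function are hypothetical stand-ins, not the LTC6811's actual command set (consult the datasheet for that).

# Sketch of polling a daisy chain of stack monitors over SPI.
# CMD_CONVERT and CMD_READ_GROUP are hypothetical stand-ins for the
# real command set; spi_xfer() represents whatever SPI driver the host uses.
CMD_CONVERT = 0x0001      # placeholder: start cell-voltage conversion
CMD_READ_GROUP = 0x0002   # placeholder: read one cell-voltage group
CELLS_PER_DEVICE = 12
DEVICES_IN_CHAIN = 8      # 8 x 12 cells, roughly a 96-cell stack

def spi_xfer(command, read_words=0):
    # Stand-in for the platform SPI transfer; returns dummy data here.
    return [0x8000] * read_words

def read_stack_voltages():
    spi_xfer(CMD_CONVERT)                 # broadcast: all devices convert
    volts = []
    for dev in range(DEVICES_IN_CHAIN):   # responses shift down the chain
        raw = spi_xfer(CMD_READ_GROUP, read_words=CELLS_PER_DEVICE)
        # Assume 100 uV per LSB purely for illustration.
        volts.append([w * 100e-6 for w in raw])
    return volts

print(read_stack_voltages()[0][:3])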
MODULAR BATTERY PACKS
To accommodate the large quantity of cells required for high-power automotive systems, batteries are often divided into modules and distributed throughout available spaces in the vehicle. A typical module holds 10 to 24 cells, and modules can be assembled in different configurations to suit multiple vehicle platforms. A modular design simplifies maintenance and warranty issues, can serve as the basis for large battery stacks, and allows more effective use of space.

It takes a robust communication system to support a distributed, modular topology within the high electromagnetic interference (EMI) environment of an EV/HEV. Both isolated CAN bus and Linear's isoSPI are road-proven ways of connecting modules in this environment.

A modular battery management system configured to use a CAN bus might have this sort of architecture. Note the isolators between the CAN bus connection and the BMS chips.
Here’s how a modular battery management system might be configured using isoSPI communications. The BMS master is an LT6820 which provides bidirectional SPI communications between two isolated devices through a single twisted pair connection. Each LTC6820 encodes logic states into signals that transmit across an isolation barrier to another receiver. Precision window comparators in the receiver detect the differential signals.
Given the success of CAN Bus in automotive applications, it provides a well-established network for interconnecting battery modules but requires several additional components. For example, implementing an isolated CAN Bus via the LTC6811 SPI interface requires the addition of a CAN
eeworldonline.com | designworldonline.com
linear tech article — automotive handbook 8-17 v2.indd 29
A modular battery management system using a SmartMesh network: Here participant nodes cooperate to route packets back to the BMS master. Note that the BMS master can reside remote from the batteries and cellmonitoring electronics. The SmartMesh network is based the 6LoWPAN and 802.15.4e standards. Wireless nodes, known as motes, collect and relay data. The bus master monitors and manages network performance and security and exchanges data with a host processor.
transceiver, a microprocessor, and an isolator. The primary downside of a CAN Bus is the added cost and board space required for these additional elements. An alternative to a CAN Bus interface is Linear Technology’s innovative two-wire isoSPI interface. Integrated into every LTC6811,
the isoSPI interface uses a simple transformer and a single twisted pair, as opposed to the four wires required by CAN bus. The isoSPI interface provides a high RF noise-tolerant interface in which modules can connect in a daisychain over long cables and operate at data rates up to 1 Mbps.
A modular battery management system using a SmartMesh network: Here participant nodes cooperate to route packets back to the BMS master. Note that the BMS master can reside remote from the batteries and cell-monitoring electronics. The SmartMesh network is based on the 6LoWPAN and 802.15.4e standards. Wireless nodes, known as motes, collect and relay data. The bus master monitors and manages network performance and security and exchanges data with a host processor.

One relatively new development is the wireless BMS. Here, each module connects with the BMS via a wireless link instead of a CAN bus cable or an isoSPI twisted pair. Linear Technology has installed a wireless BMS in a concept car: a BMW i3 that combines an LTC6811 battery stack monitor with Linear's SmartMesh wireless mesh networking products. This demonstration of a fully wireless BMS offers the potential for better reliability, lower cost and reduced wiring complexity for large multicell battery stacks in electric and hybrid/electric vehicles.

Automakers are challenged to assure the driving public that electric and hybrid/electric vehicles are both safe and reliable. Linear Technology is now looking beyond the safety and reliability of the battery monitoring IC to address the potential mechanical failure of connectors, cables and wiring harnesses in high-vibration automotive environments. Many have viewed the vehicle environment, characterized by a lot of metal and high EMI, as too harsh for reliable operation of wireless systems. However, SmartMesh networking offers a truly redundant interconnect that overcomes these difficulties. It uses both path and frequency diversity to route messages around obstacles and to mitigate interference. Field-proven in industrial Internet-of-Things applications, SmartMesh embedded
wireless networks deliver >99.999% reliable data transmission in such harsh environments as railcar monitoring, mining, and industrial process plants. By delivering the reliability of wires while eliminating mechanical connector failures, the wireless BMS concept car shows how wireless technology can significantly improve overall system reliability and simplify the design of automotive battery management systems.

WIRED VS. WIRELESS BATTERY MONITORING: The differences in connections among CAN bus, isoSPI, and wireless battery monitoring become clear from a view of how these techniques would appear in a real system. CAN wiring is in a bus configuration and generally includes four wires, two for the data connection and two for power and ground. The isoSPI interconnections are also in a bus configuration but involve a single twisted pair that is galvanically isolated: A transmitter encodes conventional SPI signals, which normally use four single-ended connections, into a differential isoSPI signal that transmits via a simple pulse transformer through the twisted pair. At the other end of the twisted pair, the isoSPI signal can be translated back to SPI. In a system using SmartMesh wireless networking, none of the monitors need be physically connected to each other; communication is via peer-to-peer networking where participating nodes cooperate to route data packets, and the mesh can take in other nodes such as a driver console and temperature-sensor read-outs.

A BMS with a SmartMesh network can potentially deliver new functions that wired systems cannot. The wireless mesh network lets battery modules sit in hard-to-reach locations and makes it possible to install sensors in places where a wiring harness can't go. The BMS master can collect additional data germane to the accuracy of battery SOC calculations, such as current and temperature, simply by adding SmartMesh-enabled sensors. SmartMesh automatically time-synchronizes each node to within a few microseconds and accurately time-stamps measurements at each node. The ability to time-correlate measurements taken at different locations in a vehicle is a powerful feature for accurately calculating the battery SOC and state of health (SOH), as the sketch below illustrates. A SmartMesh node with local processing at each module improves normal BMS operation and also opens the door to smart battery modules whose diagnostics and communication may aid assembly and service.

SmartMesh wireless sensor networking products are chips and pre-certified PCB modules complete with mesh networking software, enabling sensors to communicate in tough industrial Internet of Things (IoT) environments. More than 50,000 of these customer networks are deployed in 120 countries. The chips and modules include security measures such as NIST-certified AES-128 encryption.
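As a rough illustration of why synchronized time-stamps matter, the sketch below pairs each cell-voltage report with the pack-current sample taken closest to the same instant, which is what an accurate SOC calculation needs. The mote reports and sample times are invented for the example.

# Pair each cell-voltage report with the current sample closest in time.
# Reports are invented; SmartMesh time-stamps real ones to within a few
# microseconds, which is what makes this correlation meaningful.
current_samples = [(1.000, -42.1), (1.010, -41.8), (1.020, -41.5)]  # (t, amps)
voltage_reports = [("module-3", 1.0101, 3.642), ("module-7", 1.0099, 3.655)]

def nearest_current(t):
    # Return the current sample whose time-stamp is closest to t.
    return min(current_samples, key=lambda s: abs(s[0] - t))[1]

for module, t, volts in voltage_reports:
    amps = nearest_current(t)
    print(f"{module}: {volts:.3f} V at {amps:.1f} A (t={t:.4f} s)")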
REFERENCES
LTC6811 multicell battery monitors, www.linear.com/product/LTC6811-1
The centralized approach to autonomous driving
Engineers say the best way to make vehicles fully autonomous is to analyze all sensor signals and driving situations with a centralized processor.
It’s
AMIN KASHI, MATTHIAS POLLACH, NIZAR SALLEM MENTOR GRAPHICS, A S I E M E N S B U S I N E S S
no secret that elbow room may be growing scarce among the companies working on self-driving tech. A Comet Labs tally published by Wired magazine identified 263 firms “racing toward autonomous cars,” though surely only a fraction will reach the finish line. Meanwhile, the tech utopian ideal — full Level-Five autonomy (meaning the vehicle can safely drive itself with no human input in any/all conditions) — remains years away. There are numerous thorny engineering problems yet to solve. Among them are nitty-gritty details relating to power consumption, signal latency, system integration, manufacturability and cost. The media sometimes erroneously suggests that the basic approaches to solving these problems are essentially interchangeable. In fact, there are major differences in how to best solve self-driving car challenges. One example is Mentor’s 2017 introduction of its DRS360 Autonomous Driving Platform. Many in the industry appear to be working on scaling-up existing ADAS systems, but Mentor’s approach takes a different and more direct path to Level Five, one that leans heavily on the centralized fusion of raw sensor data. DRS360 applies advanced machine learning technology to narrow slices of fused data which leads to efficient and high-confidence object recognition. Accurate object recognition is key to hands-off driving in all conditions. The approach taken is to broadly acquire data from multiple sensor modalities deployed all around the car, then identify, classify and act upon only those objects and events of relevance to autonomous driving policy. Not incidentally, this approach mimics the sensory perception and decisionmaking of human drivers. And the production-intent nature of the DRS360 platform is relatively open and agnostic to other hardware and software used to implement self-driving. Consequently, developers avoid the black-box, vendor-lockin problem inherent in other solutions.
In short, as autonomous driving moves from hype to the hard work of getting past Level Two autonomy, the centralized, raw-data platform approach may well represent the surest means to get there.

SCALING UP DISTRIBUTED ADAS DOESN'T WORK
In recent years, ADAS (advanced driver assistance systems) have proliferated, even in new economy-class vehicles. For example, for just $1,000, buyers of modest Honda Civic sedans can add the Honda Sensing system, which includes adaptive cruise control, lane keeping assist, forward collision warning and more.

The cameras and sensors that make ADAS possible all come with onboard hardware and software for processing data captured as the car moves down the road. OEMs and Tier Ones implementing complex ADAS must integrate the output of all these devices. Camera and sensor systems typically use microcontrollers to filter and process data before sending it via CAN bus to a central processor. But the precise way this filtering takes place is opaque to carmakers. In addition, the processing of data within the sensor nodes boosts system software complexity and adds cost, especially with the extensive verification processes necessary to ensure the code meets strict automotive standards.

Power, and the consequent thermal challenges, pose related problems. It's not practical to actively solve thermal issues in computing systems (by deploying fans or water cooling) because the additional equipment adds risk of component failure, maintenance complexity, and noise.
Obviously, power consumption will be a major factor in electric autonomous vehicles because of range limitations and other issues. Less obvious is how power is a gating factor for internal-combustion vehicles, too. Power-hungry ADAS may force a re-architecting of cabling for the vehicle electrical distribution system (EDS), exceptionally expensive in both time and cost.

Today, the amount of data sensors generate is a choke point for decision-making algorithms. Indeed, it seems inevitable that the eyes and ears of Level Five vehicles, whenever they arrive, will carry an array of simplified, microcontroller-less "raw data" sensor modules that leave most of the processing and decision making to a central processing engine, much as the central nervous system works in humans.

Unlike traditional, distributed computing approaches, a platform based on raw data uses the entirety of the captured data to generate the most complete view of the vehicle's environment. Driving decisions are made based on full availability and comprehension of all the data at hand: data that's accurate, complete, unbiased and unfiltered. In object recognition, targeting the most relevant data produces better, higher-confidence results. And that can only happen if you preserve all the data until it's time to use it.
ADAS today generally process sensor data by integrating each sensor with a nearby microcontroller that processes the raw sensor data. In the case of a camera, for example, the local microcontroller might partially recognize objects in the scene, then send the locations of those objects to a central controller. The advantage of this approach is that it reduces both the amount of data sent back to the central controller and the central controller's computational load. One problem is that local processing at the sensor introduces delay before the data reaches the central controller. Another difficulty arises because an ADAS sometimes needs data from more than one sensor to distinguish, say, a vehicle from its reflection on a wet road. Engineers today feel the sensor fusion needed to make these judgments is best handled by a central controller that processes raw data from all sensors simultaneously.
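The wet-road reflection example hints at why central access to raw data from several modalities helps: a camera hypothesis can be confirmed or rejected against raw radar returns in one place. The following is a deliberately simplified sketch of that cross-check, not the DRS360 pipeline; every value in it is invented.

# Confirm camera object hypotheses against raw radar returns.
# A mirror image on wet pavement yields no radar return, so it is dropped.
camera_hypotheses = [
    {"bearing_deg": -3.0, "label": "vehicle"},
    {"bearing_deg": -3.2, "label": "vehicle"},  # the reflection
]
radar_returns_deg = [-2.8]                      # one physical target

def confirmed(hypo, returns, tol_deg=0.3):
    # True when some radar return lies within tol_deg of the camera bearing.
    return any(abs(hypo["bearing_deg"] - r) <= tol_deg for r in returns)

real = [h for h in camera_hypotheses if confirmed(h, radar_returns_deg)]
print(real)  # only the physical vehicle survives the cross-check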
Utilizing unfiltered data is the ideal way to build a rich, spatially and temporally synced view of the car's surrounding environment, a goal made feasible by combining FPGA and CPU technologies on the same platform. Autonomous vehicles based on the centralized, raw-data platform approach are fundamentally agnostic in terms of sensor vendor. Nearly any camera, lidar, radar or other sensor type can be adapted to the platform, which is open and transparent. Mentor's sensor fusion and object recognition algorithms are advanced, but carmakers and Tier Ones are free to apply their own algorithms and even choose their own chips and SoCs for the processing. Several major sensor systems only work with particular chipsets and GPUs, another form of vendor lock-in. And the major sensor vendors often require contracts for bulk purchases over relatively long periods, a potential risk given the fast-changing market landscape. The new platform is also a boon to sensor vendors, whose core competencies are usually in hardware rather than software.
Centralized, raw-data platforms provide an efficient way to identify events and regions of interest around a vehicle, and then combine the raw data from various sensor modalities to classify objects and ultimately make decisions about these events. This is important because each sensor comes with specific limitations. Lidar systems aren't useful for detecting objects close to the car, may have difficulties in fog and rain, and tend to be large and expensive, though prices are dropping. Radar does better in inclement weather and is less expensive than lidar, but generally has lower resolution and is still unreliable at very short distances. Cameras are among the most commonly deployed sensors today. However, vision sensors capture both useful and non-useful information without discrimination (the sky is of little interest to a car), and optimal image sensing depends on lighting conditions. Today, most of these sensor types use onboard microcontrollers to preprocess and filter data before sending it downstream. The problem is that independently handling tasks like object tracking or
scene flow segmentation increases power consumption, processing requirements and, most important, latency. Object tracking, for example, requires analyzing a sequence of frame data over time. A better approach is to temporally and spatially sync all data from the car's sensors in a single frame, narrow in on an object in a region of interest, and then examine that slice of fused data to perceive what's going on just in that region. By dynamically processing only the data relevant to the decision at hand, centralized raw-data platforms can make good driving decisions while costing less and using less power and space.

From a latency perspective, centralized raw-data platforms also excel. First, the platform employs sensors unburdened by the power, cost and size penalties of microcontrollers at each sensor node; with no pre-processing at the sensor node, latency drops. Second, the Mentor platform is based on a streamlined data transport architecture that further lowers system latency by minimizing physical bus structures, hardware interfaces and complex, time-triggered Ethernet backbones. Third, the FPGA-centric approach provides the highest possible signal processing performance. And fourth, all processing is handled in centralized fashion, avoiding the latency from data swapping that is inherent in traditional, non-centralized approaches.

Neural networks and machine learning, of course, loom increasingly large in autonomous driving. It's problematic and ultimately unrealistic to dump all sensor data into one deterministic system that must see every situation at least once before learning. One issue is the incredible overhead spent processing utterly useless information. Another is the black-box challenge of never knowing precisely what such a system is learning from the data fed into it, and the impossibility of ever reproducing all the data that went into making a certain machine-generated rule. This sort of opacity is forbidden under standards like ISO 26262. Centralized, raw-data platforms represent the "anti-black-box" approach, which means customers can keep and access the highly valuable sensor data and other data flowing through the platform. The systems are also designed to let OEMs and Tier Ones customize or extend the algorithms used within the platform.

Centralized, raw-data platforms imply that the much better method is to not worry about classifying all vehicle surroundings in detail. Just as human drivers do, an autonomous driving system should invest resources mostly in those tasks that are central and safety-critical to the driving mission (which, for lack of a better description, is safely traveling the roadways from
point A to point B). This means separating out general situation awareness (modest focus), event detection and perception (more serious attention), and object classification (most critical). For example, suppose you're traveling down the road and a car merges into your lane in front of you. What's happening two lanes over or several vehicles behind you matters little compared with how fast the merging vehicle is traveling, whether its brake lights are on, and whether you'll have to slow down. Human drivers focus on such a situation automatically; Level Five autonomous vehicles ultimately will need to do the same. (A toy version of this prioritization appears in the sketch at the end of this article.)

THE SIGNAL AND THE NOISE
As sensors and instrumentation proliferate on vehicles, much is made of the issue of so-called data exhaust. Intel CEO Brian Krzanich made news last year when he predicted that the average self-driving car in 2020 would generate 4,000 gigabytes of data per day, roughly the daily data generated by 3,000 internet users. Skeptics like to ask about the point of collecting raw sensor data when there is no way to process it all in real time. The answer is that there's no need to consume all the raw data in real time; the important point is that all the data is there when it's needed most.

The iterative approach to scaling up ADAS won't work in the end. Far better to build new, highly open and flexible systems from scratch, designed for Level Five but allowing OEMs to work in parallel and field Level Two through Level Four systems along the way. Indeed, automotive engineers mostly agree on the technical superiority of raw-data sensor fusion, though to date Mentor is the only firm that has gone to the trouble of building a platform that implements this approach. By far the biggest technical hurdle for Level Five, one that Mentor's DRS360 platform based on centralized raw-data fusion clears by a mile, is the need for new architectures that emulate a human way of fusing disparate information and focusing attention where it matters. Of course, this is a matter of life and death for drivers and others on the road.
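The merging-car example can be read as a relevance score that concentrates processing where it matters. Here is a toy version of such a score; the weights and objects are invented for illustration and are not drawn from Mentor's platform.

# Toy relevance score: spend processing on objects that are close,
# closing fast, and in or entering the ego lane. Weights are invented.
objects = [
    {"name": "car merging ahead", "range_m": 25, "closing_mps": 4.0, "same_lane": True},
    {"name": "truck two lanes over", "range_m": 30, "closing_mps": 0.2, "same_lane": False},
    {"name": "car far behind", "range_m": 90, "closing_mps": -1.0, "same_lane": True},
]

def relevance(obj):
    score = 100.0 / max(obj["range_m"], 1.0)     # nearer is more urgent
    score += 2.0 * max(obj["closing_mps"], 0.0)  # closing targets matter
    score *= 3.0 if obj["same_lane"] else 1.0    # lane conflicts dominate
    return score

for obj in sorted(objects, key=relevance, reverse=True):
    print(f"{relevance(obj):6.1f}  {obj['name']}")  # merging car ranks first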
REFERENCES
DRS360 Autonomous Driving Platform, www.mentor.com/embedded-software/drs360
ISO 26262, www.en.wikipedia.org/wiki/ISO_26262
Simulations help explain why autonomous vehicles do stupid things
Special test programs aim to help robotic systems make better decisions in short order.
Here’s
LEE TESCHLER | EXECUTIVE EDITOR
a riddle: When is an SUV a bicycle? Answer: When it is a picture of a bicycle that is painted on the back of an SUV, and the thing looking at it is an autonomous vehicle. Cyclists painted on the back of an SUV is what’s known in the autonomous-car industry as an “edge case.” This is a situation where autonomous system software understands an odd-ball scene differently from how humans would. The result of edge case scenarios is generally unpredictable behavior on the part of the robotically guided vehicle.
Edge cases like this one are the reason the Rand Corp. reported in 2016 that autonomous cars would need to be tested over 11 billion miles to prove that they’re better at driving than humans. With a fleet of 100 cars running 24 hours a day, that would take 500 years, Rand researchers say. It’s not just of scenes painted on the back of vehicles that throw autonomous vehicles for a loop. “There are a lot of edge cases,” says Danny Atsmon, the CEO of autonomous vehicle simulation firm Cognata Ltd. “The classic example is that of driving at
An edge case from a Cognata Ltd. simulation. If you were an autonomous vehicle, this would look like three cyclists instead of the back of an SUV.
Cognata, based in Israel, has a lot of experience with edge cases because it builds software simulators in which automakers can test autonomous-driving algorithms. The simulators let developers inject edge cases into driving simulations until the software can work out how to deal with them, all in the lab and without risking an accident. "It can take months to hit an edge-case scenario in real road tests. In a simulation that's not a problem," says Atsmon.

Simulations like those Cognata devises are also helpful because of the way autonomous systems recognize situations unfolding around them. Traditional object recognition techniques such as edge detection may be used to classify features such as lane dividers or road signs. But machine learning is the approach used to make decisions about what the vehicle sees. Here, learning algorithms handle image recognition. The feature detectors are so-called convolutional layers in software that adapt to training data. To handle specific problem scenes, developers collect numerous training examples and choose parameters such as the number of layers in the learning network, the learning rate, the activation functions, and so forth. Eventually the recognition system adapts its features to the problem at hand. This approach works better than handcrafting features that may handle foreseen problems quite well but break for others.

To an autonomous vehicle system running a Cognata simulation, this doesn't look like the back of an RV with a scene painted on. According to the color map above, the autonomous software interprets the back of the RV as a weirdly shaped building dead ahead. Edge cases like this one help developers debug the machine learning algorithms that interpret driving situations.

To help developers of automated vehicle systems, Cognata recreates real cities such as San Francisco in 3D. Then it layers in data such as traffic models from different cities to help gauge how vehicles drive and react. The simulations are detailed enough to factor in differences in the driving habits of people in different cities, Atsmon says. The third layer of Cognata's simulation is the emulation of the 40 or so sensors typically found on autonomous vehicles, including cameras, lidar and GPS. Cognata simulations run on computers that the auto manufacturer or Tier One supplier provides.

Sensor emulation is particularly important because autonomous cars overcome issues such as baffling images by fusing together information gathered from different types of sensing. Just as cameras can be fooled by images, lidar can't sense glass, and radar senses mainly metal, explains Atsmon. Autonomous systems learn to deal with complex situations by gradually figuring out which data can be used to correctly handle particular edge cases.

REFERENCES
Cognata Ltd., www.cognata.com