Glen Hawk, VP of NAND Solutions
Brian Shirley, VP of DRAM Solutions
Electrical Engineering Community
Bringing your concepts to reality is as easy as...
1. Create schematics, technical diagrams, and flowcharts using your browser. • 600+ Symbol Library • Share Schematics Online • Export High Quality Images
   digikey.com/schemeit
2. Free and easy-to-use circuit simulator that runs in your browser. • SPICE Simulator • AC/DC/Transient Sims • Waveform Viewer
   partsim.com
3. Full featured online CAD application for designing and manufacturing electronics hardware. • Schematic Capture • PCB Layout • BOM Integration
   pcbweb.com
Visit: digikey.com/schemeit • partsim.com • pcbweb.com
Copyright ©2013 Aspen Labs LLC.
CONTENTS

4  Glen Hawk & Brian Shirley, Micron
   The VPs of NAND and DRAM solutions at Micron talk about their next-generation technology that is making memory exciting again.

12 Featured Products
   This week's latest products from EEWeb.

16 ON Semiconductor Brings Optimized Application Solutions to the IGBT Market

22 Reactive Power, Pt. 2, by P. Jeffrey Palermo
   A look into specific problems with highly compensated power systems.

28 DSKY
   Working With the Interface of the Apollo Program.

32 Augmented Reality & Wearable Computers, by Alex Toombs
   An overview of the history and future of some revolutionary wearable computer devices.
INTERVIEW
Micron Technology, Inc. is a multinational semiconductor company based in Boise, Idaho. The company's revolutionary DRAM and NAND Flash technologies have solidified its status as one of the world's leading technology companies. However, Micron's reputation is not a thing of the past; the company has some game-changing products in the pipeline that will raise the bar even further for memory capabilities. We spoke with the Vice Presidents of the NAND and DRAM divisions, Glen Hawk and Brian Shirley, about Micron's innovative work culture and why these next-generation products are making the field of memory exciting again.
How did both of you end up in the positions you're in at Micron?

Brian Shirley: I am a native of the Boise area, having spent essentially my whole life here aside from my college years. As I was growing up, and certainly during the time I was getting my electrical engineering degree at Stanford, I kept track of a pretty neat company by the name of Micron that had started up in Boise. After doing a number of summer internships, I joined the company full-time in 1992. If you count my internships, I've been with the company almost 25 years now.

I started in the group we call product engineering and then moved into design, where I worked on a number of 4-megabit and 16-megabit designs. I went on to manage the DRAM design group for about 10 years and from there moved into what we call a business unit role, which includes management, design, marketing, and engineering functions for various pieces of our memory portfolio. Memory is a pretty exciting field—it's a crazy and competitive industry, but what I've found is that for a number of us here at Micron, memory gets into your blood. Over the years, Micron has found a path forward that has allowed us not just to survive, but to prosper. If you look at the history of the industry, you will see that while a number of companies have exited the memory business, Micron has persevered and done very well.

Glen Hawk: I'm a relatively recent addition to the Micron team. I joined back in 2010 as part of the Numonyx acquisition, and prior to that, I was with Intel for 22 years. I was with Intel when we started the Flash business and held various positions in engineering and marketing there. I eventually became the general manager of the NOR Flash products group and spun that off to form a private company called Numonyx around 2008. We ran it for 2 or 3 years, managed to keep it alive through a pretty tough downturn in the semiconductor market, and became an attractive acquisition target for Micron. Lucky for us, we were acquired in 2010. Steve Appleton and Mark Durcan asked me at the time to lead the NAND group, which I have been doing now for about two and a half years.

Glen, could you give us an overview of Micron's current NAND products?

GH: In terms of Micron's NAND offerings, the first thing I'd like to point out is that we are a true silicon systems NAND provider. On the silicon side, we have an incredible range of NAND silicon components. In terms of density, it actually ranges from a 128-megabit chip all the way up to a 128-gigabit device that we've just started production on. It's actually the world's smallest 128-gigabit NAND chip. Just a little bit of trivia here: we can grind them down to about the thickness of a sheet of paper and cram 8 of those into a single package. We actually do deliver 1 terabit worth of NAND Flash memory in a single package.

On the NAND silicon side, it's pretty amazing to see the breadth that we have. Of course, the larger data storage opportunities are interested in the higher-density NAND, but there's still a tremendous market in the embedded and automotive segments for the lower-density NAND at the 128-megabit level, so it's kind of nice that we have that multiple-order-of-magnitude breadth. Above that NAND silicon, the next level of integration is what people call "managed NAND solutions," which is where you have NAND chips with controllers in the same package, making it easier for the system to use the NAND memory. We offer one of the more popular formats, which is called embedded MMC, and we also have a NAND product line—both of those are managed NAND solutions. Of course, if you keep going up the chain, we're very aggressive in solid-state drives, and we've come a long way in the last few years. We've gone from 0% to probably mid-teens in terms of market share over the last two to three years. We now actually offer a full line of Enterprise-class solid-state drives as well. We just announced a couple of new products this year: a SAS interface and a SATA interface Enterprise drive. We are having a lot of fun breaking into that segment as well. Believe it or not, we've actually started working on the next level of system solution above that. A year ago, we acquired a company called Virtensys that makes a virtualized I/O appliance, and we are using that team to develop a product that will include NAND Flash for data centers. We certainly have our eyes on climbing the value chain here.
Have you seen a lot of growth in MCPs on the NAND and low-power DRAM side?

BS: The entire mobile space is growing by leaps and bounds. This is beneficial to Micron both in terms of higher volume of MCPs as well as the fact that more discrete DRAM and discrete NAND products are going into cell phone applications. On the MCP side, MCPs that combine DRAM and NAND components continue to be very popular—there has been a lot of growth in China in particular. One recent trend is that MCPs have been moving from a DRAM standard called LPDDR1 to something called LPDDR2. More and more of our MCPs also have a NAND controller inside, something called eMMC. These MCPs can be fairly complex—there may be two to four DRAMs in a single package as well as multiple NAND devices or a NAND controller, so you can see how the assembly technology needs to keep up. That kind of tight form factor offers high value to the mobile space, where board layout is at a premium.
“Memory is a pretty exciting field—it’s a crazy and competitive industry, but the thing I’ve found is that, for a number of us out here at Micron, memory gets into your blood.”
If you move back to computing for a second and think about some of these server and data center applications, NAND SSDs started out using the SATA bus because most companies were used to the hard disk drive interface. That's still a popular bus for NAND-based SSDs, but now we're pushing hard to get a lot of interest in SSDs that sit on the PCIe bus, which was never really created specifically for NAND-based storage. However, it ends up being a much better interface for NAND-based SSDs because of the significantly faster data rates and lower latencies. It allows us to get the most out of NAND technology. The point is that these technologies are helping memory get to the place in the system where it can do the most good.
Do you see a trend in more and more mobile applications just going to discrete NAND and DRAM over combo packages?

BS: Just as in the computing space, there is a trend to get DRAM closer to the processor in the mobile area. There are other trends as well—it all points back to the fact that, in the memory space, things have gotten really exciting again. This excitement is enabled by the needs of the applications (power, form factor, board space, the requirements of the CPU), but also by better assembly technologies that allow us to combine all these technologies together. The best way to think about it is that both DRAM and NAND are migrating to the place in the system where they can do the most good.

What is the current process node that Micron is using for its NAND?

GH: The major lithography that we have is on a very steep ramp right now. It's certainly going to be in a majority of our wafers by the end of this year. It's what we call our 20nm lithography, which is our leading-edge technology that's in production. That's the one that delivers the world's smallest 128-gigabit monolithic chip. That node is going very well for us right now. I should mention that when I was at Intel, I actually helped negotiate the formation of the Intel/Micron manufacturing venture IMFT as well as the joint technology and NAND product development programs.

There's a lot being written these days about the challenges of NAND scaling. As a figure of merit on that 20nm technology, the number of electrons that separates states in the flash memory cells is only about 20. Electrons are not very well behaved in small numbers, so the next few scaling steps are very challenging. However, we do actually have a pretty solid roadmap in place on the 20nm technology. We made a transition to what's called a planar cell that's very scalable. In addition to that—and it's no secret—we, as well as our competitors, are working on a major transition in the industry to what's called 3D NAND technology, where you actually stack the NAND cells vertically. This has the possibility of being a major breakthrough, so we're pretty excited about the future of NAND despite all of the doom and gloom about Moore's Law running out of gas—we're finding ways to work around those issues just like we have in the past.
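To put that 20-electron figure in perspective, here is a rough back-of-the-envelope check. The storage-node capacitance and threshold separation used below are assumed round numbers chosen only for illustration, not Micron process data; the point is simply that attofarad-scale storage nodes and sub-volt state separations leave only a few tens of electrons between levels.

    # Rough order-of-magnitude sketch: how many electrons separate two
    # adjacent threshold-voltage states in a ~20nm flash cell?
    # The values below are illustrative assumptions, not Micron figures.
    ELEMENTARY_CHARGE = 1.602e-19   # coulombs
    gate_capacitance = 5e-18        # farads (~5 aF storage-node capacitance, assumed)
    state_separation = 0.6          # volts between adjacent threshold states (assumed)

    electrons_per_state = gate_capacitance * state_separation / ELEMENTARY_CHARGE
    print(f"~{electrons_per_state:.0f} electrons between adjacent states")
    # Prints roughly 19, consistent with the "about 20 electrons" quoted above.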
Do you see through-silicon vias helping with the stacking technology in your NAND products?

GH: What's exciting about working at Micron is that you have a lot of different technology units under one roof. The benefits of technology sharing between very different memory technologies—DRAM and NAND—are very obvious at times. The through-silicon vias are being pioneered by our silicon group, and for their products they need both ultra-high speeds and super tiny form factors. In the NAND market, we certainly need those tiny form factors, but we don't necessarily need those same speeds, so for now, there's not a direct need for that. However, it's one of those capabilities that is going into the toolbox that we will definitely pull out in the future. As we look into the future of non-volatile memory, we're certainly looking at new memory hierarchy opportunities in computing solutions where through-silicon vias could offer a big advantage. I think this is something that you will see in the future as pioneered by the DRAM group, but it will at some point be used with NAND.

What is Micron's fab capacity for NAND?

GH: There's a lot of different ways that Micron can add fab capacity, or contribute more bits to the world, so to speak. One is just through the lithography scaling that we were talking about, but at some point you do have to add capacity. Over the last three years, we did ramp a new factory in Singapore, and that was just a very smooth startup and has gone very well for us. It's in full, high-volume production now. The big capacity play that Micron is making right now is not really bringing new capacity to the industry: through our Elpida acquisition, we are adding a substantial amount of capacity to the Micron network. For the most part, the way we look at our fab capacity is that it's really just memory capacity. A lot of the equipment is reusable between NAND and DRAM (although you certainly can't flip a switch and swap the capacity overnight). For the most part, the equipment is reusable. I think right now, our big action that's underway is to complete this Elpida acquisition. We are very conservative about adding capacity to the industry because the last couple of decades have been fraught with these wild swings in over- and under-supply. We think that with the consolidation in the industry over the last five years or so, hopefully things will settle down a little bit. Our customers would appreciate that as well. At this point, we'll be very conservative about adding additional capacity, so we'll just have to keep an eye on the market growth and see how things go.
What are the latest density sizes for DDR3 that Micron is offering?

BS: On the DRAM side of the house, the primary volume product today is DDR3, which is sold into everything from notebooks and desktop machines to Ultrabook branded devices, servers, and data center applications. We are in very high-volume production today with 4-gigabit designs and are using that to make everything from component to module sales, going all the way up to 16-gigabyte and 32-gigabyte modules as well. In DDR3 today, we're spending a lot of time with the server guys figuring out how to get more memory into their machines. That's a great trend for a company like Micron. One theme to note here is that for Micron, the entire memory landscape is going through a Renaissance in terms of memory innovation. That's partially driven by the needs of the applications—the insatiable need for more bandwidth, more memory closer to the processor, and things like that—but it's also being driven by technical trends that enable memory to be closer to the processor, such as better packaging technologies and through-silicon vias. You take those two trends and the net result is that things are pretty fun in memory—both in DRAM and NAND. There's a lot of innovation happening across the entire portfolio. Even in DDR3, which has been in volume production for a number of years, there's progress being made in figuring out how to get more memory into infrastructure applications such as servers and data centers. Micron is taking advantage of that; we have launched what we call load-reduced DIMMs, which allow us to put together modules at 32- and 64-gigabyte densities. We're pushing the speed of these components, and we're selling a lot of DDR3 products into consumer gaming applications that need higher speeds (up to 2133 MHz), so there's a lot of activity across a range of applications.
Does Micron currently have a DDR4 product that is available?

BS: We do. We're ready to go. We have the world's first production-ready, fully functional 4-gigabit DDR4 device. We have been working very actively with CPU enabler companies as well as server and data center customers to get this device into volume production. The initial interest in DDR4 really seems to be on the server side: as these companies look for improvements in power as well as bandwidth, DDR4 offers something there that DDR3 isn't able to deliver. The innovation pipeline continues.
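For a rough feel for what "improvements in power as well as bandwidth" means, the sketch below compares typical JEDEC operating points for the two generations (DDR3-1600 at 1.5 V versus DDR4-2400 at 1.2 V). The chosen speed grades are illustrative; actual parts span a range of bins.

    # Illustrative DDR3 vs. DDR4 comparison at typical JEDEC operating points.
    # Speed grades chosen for illustration; real modules ship in many bins.
    BUS_WIDTH_BYTES = 8                              # standard 64-bit module data bus

    ddr3 = {"transfer_rate_mts": 1600, "vdd": 1.5}   # DDR3-1600
    ddr4 = {"transfer_rate_mts": 2400, "vdd": 1.2}   # DDR4-2400

    def peak_bandwidth_gbs(part):
        # Peak module bandwidth = transfers per second * 8 bytes per transfer.
        return part["transfer_rate_mts"] * 1e6 * BUS_WIDTH_BYTES / 1e9

    bw3, bw4 = peak_bandwidth_gbs(ddr3), peak_bandwidth_gbs(ddr4)
    print(f"DDR3-1600 peak: {bw3:.1f} GB/s, DDR4-2400 peak: {bw4:.1f} GB/s")

    # I/O switching power scales roughly with V^2, so the supply drop alone is
    # worth about a third less I/O power, before any architectural savings.
    print(f"V^2 ratio: {(ddr4['vdd'] / ddr3['vdd']) ** 2:.2f}")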
Can you tell us about the Hybrid Memory Cube? BS: Hybrid Memory Cube is something we’re really excited about. The interest level has taken off like wildfire. It’s a technology that we have worked on for a number of years already and have demonstrated in conjunction with
Intel a few years ago. What Hybrid Memory Cube is all about is combining a stack of several layers of DRAM and putting it on top of a logic controller, then hooking that entire stack together with connections that go through the silicon layers using a technology called through-silicon vias (TSVs). If you think about it, TSVs are deep contacts, which the memory guys have gotten pretty good at making. Those TSVs connect that entire stack of DRAM together with this logic control layer, which allows us to do two things. First, we are able to manage this stack of memory—and when I say we manage it, I'm talking about ways to produce better reliability and better performance out of the memory stack than we would otherwise be able to do. Secondly, we provide an interface back to the particular processor that the memory cube connects to—be it a network processor or a high-performance computer processor. There's also logic inside an HMC to link directly to other HMC stacks, so we can chain them together and create a system with significant amounts of main memory. This system is connected directly to the processor with speeds and latencies that quite simply can't be beat. We are now on the first production generation of Hybrid Memory Cube. Out of one cube, we'll be able to get 160 gigabytes per second of bandwidth, and for the second generation, right behind that, we'll be able to get 320 gigabytes per second out of one cube. As a reference point, from an entire DDR3 module, you typically get about 12 gigabytes per second. So as you can see, with the second generation of HMC, we'll be getting over 20 times the bandwidth compared to a typical DRAM module. It's really a game-changer.
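A quick sanity check on those bandwidth figures, using the standard peak-bandwidth arithmetic for a 64-bit DDR3 module. DDR3-1600 is assumed here as the "typical" module Shirley refers to; the HMC numbers are the per-cube figures quoted in the interview.

    # Peak bandwidth comparison: one HMC versus a typical DDR3 module.
    ddr3_module_gbs = 1600e6 * 8 / 1e9   # 1600 MT/s * 8 bytes = 12.8 GB/s
    hmc_gen1_gbs = 160.0
    hmc_gen2_gbs = 320.0

    print(f"DDR3-1600 module: {ddr3_module_gbs:.1f} GB/s")
    print(f"HMC gen1 advantage: {hmc_gen1_gbs / ddr3_module_gbs:.0f}x")
    print(f"HMC gen2 advantage: {hmc_gen2_gbs / ddr3_module_gbs:.0f}x")
    # gen2 works out to 25x a DDR3-1600 module, in line with "over 20 times".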
Do you see the HMC replacing modules in a system?

BS: It's certainly higher value to the system—when you have this kind of bandwidth, you can do some pretty neat things. It's fair to say that the technology today is not as cost-effective as DDR3 and not running in those kinds of volumes. Over time, the cost and the yields will improve; we really see this as something that already has a ton of activity and a ton of interest from nearly every systems company out there because of the bandwidths that it enables. To your point, it really does give a higher value to the system. All of a sudden, because of the bandwidth and the low power per bit, we're able to do some things that no other kind of memory system would be able to do.
“3D NAND has the possibility of being a major breakthrough, so we’re pretty excited about the future of NAND despite all of the doom and gloom of Moore’s Law and running out of gas— we’re finding ways to work around those issues just like we have in the past.”
MICRON’S BOISE FACILITY
What do you see in the future for Micron?

GH: There's a funny story about this. Not too long ago, I was on a flight back east. A younger lady sat next to me and we had a conversation about what we each did for a living. When she asked me what I did and I told her "Flash memory," to my surprise she said, "That's cool!" That doesn't happen every day, but over the last couple of years, it has happened. I think that Flash has become so ubiquitous for consumers. Of course, it's not just the portable devices, but I think just recently things like tablets and solid-state drives have become top-of-mind for a lot of people. Those two applications in particular—and ultra-thin computing in general—are a major opportunity for NAND and solid-state drives; not just for client devices, but also for the Cloud. The amount of Cloud storage that will be in solid-state drives will just be phenomenal over the next couple of years. That underlying explosive growth has just created a wonderful dynamic in the NAND industry and I think it's going to continue to be strong for a while. People look at things like the attach rate of Flash SSDs to portable computers right now—it's still relatively low. If you look at the
penetration of solid-state drives in data centers in the Cloud, it’s still pretty low. With the rapid advances in the underlying NAND technology and the cost reductions, the attach rates and market penetration are going to accelerate, and that’s pretty exciting. I think over the next couple of years that’s going to be the big story. I do think, however, that there’s another chapter after this that’s starting to be written. If you look at NAND today, and you look at what’s happening, so far, a lot of those interesting devices have come about because people have put NAND behind traditional interfaces. For example, solid-state drives are really nothing more than NAND buried behind a legacy hard drive standard interface. Those interfaces really throttle the performance of that underlying technology. The same is true with embedded MMCs for mobile smart phones or tablets—these interfaces are being used because they exist, not because they are the best for NAND Flash or non-volatile memory. I think that what you are going to see in the next chapter is going to be where the ecosystem fundamentally changes to be able to unleash the true potential of non-volatile memory in the memory hierarchy. I think you are going to see different interfaces and different software that is really now being written for the NAND technology or the non-volatile memory itself. I think that’s going to be really exciting. ■
Online Circuit Simulator
PartSim includes a full SPICE simulation engine, web-based schematic capture tool, and a graphical waveform viewer.
Some features include: • Simulate in a standard Web Browser • AC/DC and Transient Simulations • Schematic Editor • Waveform Viewer • Easily Share Simulations
Try it now!
www.partsim.com
FEATURED PRODUCTS

32-Bit Single-Chip MCUs
The V850E/IA1 is a 32-bit single-chip microcontroller in the V850 family that uses the V850E1 CPU and has on-chip ROM, RAM, a bus interface, a DMA controller, a variety of timers including a 3-phase sine wave PWM timer for motor control, various serial interfaces including FCAN, and peripheral facilities such as A/D converters. Selected features include an operating voltage between 4.5 and 5.5 volts with a maximum frequency of 50 MHz. It also has ROM capacities of 256 KB and RAM capacities of 10 KB. The device also has 256 MB of linear address space that is shared by both program and data...Read More
Fuel Gauge with Direct Battery Connection
The Texas Instruments bq27530-G1 system-side Li-Ion Battery Management Unit is a microcontroller peripheral that provides Impedance Track™ fuel gauging and charging control for single-cell Li-Ion battery packs. The device requires little system microcontroller firmware development. Together with the bq2416x Single Cell Switchmode Charger, the bq27530-G1 manages an embedded battery or a removable battery pack. The bq27530-G1 uses the patented Impedance Track™ algorithm for fuel gauging, and provides information such as remaining battery capacity, state-of-charge, run-time to empty, battery voltage, temperature and state of health...Read More
12/16-Channel AFEs Double Accuracy
High-accuracy AFEs integrate simultaneous cell voltage sampling to improve state-of-charge prediction and maximize battery pack longevity. Maxim Integrated Products, Inc. announced it is sampling the MAX14920/MAX14921, high-accuracy, 12/16-channel cell-measurement analog front-ends (AFEs) that lower battery-management electronics costs by up to 35%. These devices double the accuracy of cell voltage readings through the use of high-accuracy common-mode level shifting and an integrated high-precision amplifier that simplifies ADC data conversion. The MAX14920/MAX14921 enable the industry's highest accuracy cell voltage measurements...Read More
New MCU Educational Platform
Zilog, a wholly-owned subsidiary of IXYS Corporation, is introducing its new Educational Platform, which includes all of the necessary components and software to allow engineers, instructors and students to design with freedom. The Zilog Educational ("ZED") Platform is an electronics development system for learning and teaching at the university level, yet can also serve the needs of students at the high school level. The core of the Zilog Educational Platform is Zilog's Z16F2810 MCU, a 16-bit Flash chip based on Zilog's ZNEO CPU. The Platform can also be configured as a data acquisition and remote control system...Read More
Power Management IC Design Simulation
The Fujitsu Easy DesignSim is a comprehensive online design simulation and support solution for design engineers working with Fujitsu's extensive line of power management ICs (PMICs). The tool simplifies the process of designing circuit diagrams for power supplies. It automatically calculates key power parameters, such as input/output voltages, load currents, efficiency, and switching frequency, and then simulates the operation of the power supply online. After selecting a power management solution, "Easy DesignSim" can generate a BOM from the databases of electronics parts vendors based on the project requirements and circuit diagram...Read More
Fully-Encapsulated Power Module
Get the most power from the smallest space possible with Intersil's ISL8225M, the industry's easiest-to-design-in, thermally enhanced, dual 15A/single 30A power module. Key Features: • Simple: One ISL8225M plus a few components and you are done. • Flexible: 5V-20V input; dual 15A or single 30A output • Scalable: Parallel up to 6 modules for 180A output • Reliable: Full power operation without fans or heat sinks The device is a complete dual output DC-DC regulator all in one package. The patented current-share architecture also reduces layout sensitivity when two or more modules are paralleled with each other...Read More
Mobile App for Databook
Power Integrations launched the latest version of the PI Databook as a mobile app. The PI Databook App is a comprehensive mobile implementation of the complete portfolio of Power Integrations technical documents and enables engineers to access technical data and video from any iOS or Android™ platform-enabled tablet or smart-phone. The PI Databook App allows users to browse product information, annotate, save and share documents. The App also includes a search function and newsfeed. The app makes it easier for them to find the information they need to solve their power-related design challenges without having to click through layers of a traditional website...Read More
Intelligent System Power Solution
The IDT P95020 is the first of a new generation of standardized application-specific controllers that incorporates a general purpose microcontroller, a high fidelity audio CODEC including headphone outputs and a 2.5W Class D audio amplifier, full power management functionality, a touch screen controller, and a real time clock, all of which are required by portable consumer devices such as cellular phone handsets, portable gaming devices, digital media players, portable navigational devices, etc. The device also features quick-turn customization as well as a battery charger for Li-Ion/Li-Polymer batteries...Read More
Touch Sensing with 16-bit MCU
The PICDEM Touch Sense 2 Demonstration Board provides a complete platform introducing Microchip's mTouch Sensing Solutions employing the 16-bit PIC24F MCU. It features the Charge Time Measurement Unit (CTMU) for fast capacitive touch sensing without additional external components. Factory programmed firmware provides immediate access to all the board's features through the use of the accompanying Windows™ based diagnostic tool. The diagnostic tool provides a platform to analyze application critical information in "real time" as it relates to touch sensor behavior...Read More
Monolithic CMOS Audio Device
The STMicroelectronics STA120 Digital Audio Interface Receiver is a monolithic CMOS device that receives and decodes audio data according to the AES/EBU, IEC 958, S/PDIF, and EIAJ CP-340/1201 interface standards. It contains an RS-422 line receiver and a phase-locked loop (PLL) that recovers the clock and synchronization signals and de-multiplexes the audio and digital data. The STA120 de-multiplexes the channel status, user and validity information directly to serial output pins, with dedicated pins for the most important channel status bits...Read More
Automotive 13.56MHz RFID Transceiver
The MLX90132 is a 13.56MHz RFID/NFC reader IC. It has been designed to handle sub-carrier frequencies from 106 to 848 kHz and baud rates up to 848 kbit/s. The robust and flexible receiver part of the MLX90132 enables designers of NFC/RFID reader devices to fit their system requirements and to address the main communication standards with the same device. The MLX90132 has 6 power modes for optimal power consumption. In ready state current consumption is 2.5mA. MLX90132 has both tag and field detection capabilities. The tag detection capability is based on the detection of a variation in the device HF field...Read More
Speed2Design TechTalks at NASA
Littelfuse, Inc. has created a NASA Exploration & Discovery Experience for the engineering community as part of its 2013 Speed2Design™ promotion. Winning design engineers will get an opportunity to go behind the scenes to spend time with NASA engineers at two premiere NASA facilities. Littelfuse will host Speed2Design TechTalk events at NASA's Ames Research Center in Moffett Field, Calif., on August 15 and Johnson Space Center in Houston, Texas, on October 24. Over the next few months, Littelfuse will randomly select 10 winners from those who enter the promotion to participate in each Speed2Design event...Read More
Family of Low-Skew LVCMOS
The AK8180B has two pin-selectable LVCMOS inputs and nine LVCMOS output buffers. It operates with clock inputs up to 350MHz, and features an output-to-output skew of 150ps. The AK8180B is fully compatible with the IDT MPC9447. The AK8180C has two pin-selectable LVCMOS or LVPECL inputs and twelve LVCMOS output buffers. It operates with clock inputs up to 350MHz, and features an output-to-output skew of 150ps. The AK8180C is fully compatible with the IDT MPC9448. The AK8180B and AK8180C feature a clock stop control that is synchronous to the falling edge of the input clock. It allows the start and stop of the output clock signal only in a logic low state, thus eliminating potential output runt pulses...Read More
Technology You Can Trust
Avago Technologies Optocouplers
Safety Certified Protection... Worldwide! IEC 60747-5-5 Certified
Optocouplers are the only isolation devices that meet or exceed the IEC 60747-5-5 International Safety Standard for insulation and isolation. Stringent evaluation tests show Avago’s
optocouplers deliver outstanding performance on essential safety and deliver exceptional High Voltage protection for your equipment. Alternative isolation technologies such as ADI’s magnetic or TI’s capacitive isolators do not deliver anywhere near the high voltage insulation protection or noise isolation capabilities that optocouplers deliver. For more details on this subject, read our white paper at: www.avagoresponsecenter.com/672
ON Semiconductor Brings Optimized Application Solutions to the IGBT Market
ON Semiconductor’s new high performance IGBT for high performance power conversion.
IGBT LEGACY
“ON Semiconductor’s first IGBT portfolio was in the late 1990s, while it was still part of Motorola,” said Mr. Jakwani. “Looking at our product portfolio, we started with AEC-Q qualified, automotive ignition IGBTs and expanded into other industrial applications. Although the company exited the industrial segment in the early 2000s, throughout these years we maintained our automotive ignition IGBTs, providing our customers with high-quality, high-reliability products.”
A NEW PORTFOLIO
In 2010, ON Semiconductor took up a renewed focus on the IGBT market, which was growing tremendously, driven by the need for high energy efficiency in servers, UPS, and industrial motor drives, and by the expansion of solar and wind power.
Because of the strong innovation history in IGBTs from Motorola, ON Semiconductor had the intellectual property, design, and processes needed to provide highly reliable, high-quality devices to the growing market. In 2012, the first-generation portfolio, with 28 orderable part numbers, was introduced, targeted at induction heating, industrial motor control, uninterruptible power systems (UPS), solar inverters, and power factor correction applications. “ON Semiconductor is always looking for growing markets where our core competencies in device, process and packaging can provide customers with innovative and high performance solutions; we saw a need from our customers to develop a high-power, best-in-class IGBT portfolio to meet their requirements,” said Jakwani. “This is a competitive portfolio of devices that we were able to develop very quickly given our background, and it received an extremely good reception when benchmarked against the market leaders in IGBTs.”

Need and competency continue to drive ON Semiconductor’s strategy: as the cloud computing, renewable solar energy, industrial motor drive, and wireless markets continue to grow, so does the need for highly efficient VF drives, inverters and UPS infrastructure. The company’s second-generation IGBT devices, to be released later this year, should put the company in a market-leading position based on system-level performance. “This second generation portfolio of devices will be the company’s second proof point of our capabilities,” said Jakwani. “Our understanding of system-level efficiency requirements gives us the ability to provide these next-generation IGBTs optimized for their end application.”
APPLICATION SPECIFIC OPTIMIZATION
According to Jakwani, an important aspect of ON Semiconductor’s IGBTs is optimization according to application.
“We tell our customers that there is no standard IGBT,” Jakwani said. “Every IGBT is optimized with trade-offs around three key variables: the VCE(sat), which is what the voltage drop is when it is on; the turn-off loss, also known as the energy loss (EOFF); and the third variable is the short-circuit robustness of the device.” ON Semiconductor optimizes IGBTs based on the end application, Jakwani continued; there is always a trade-off between the three variables. On a given technology node, ON develops an application-specific portfolio optimized for the end application, resulting in high-efficiency solutions for its customers. For example, he pointed out, “In high-frequency UPS and solar applications, which typically operate at 40 kHz, the more important parameters are EOFF and short-circuit robustness, so we optimize the device around those two parameters. But if you look at a motor control application in a room air conditioner or an industrial motor, which typically operate at 8 to 12 kHz,” he continued, “the frequency is lower, which means you optimize your devices for low VCE(sat).”

“In every generation of product, the IGBT is optimized for a given application. That’s what we do -- we have different process tweaks that we can dial in, so to speak -- and our design engineers do that by working closely with our systems applications designers.” Before sending parts to customers, ON Semiconductor does its own testing to decide how the part works best for a particular application and what the trade-offs are. All of these best practices can be found in the company’s IGBT Applications Handbook, which outlines IGBTs, their customization, and their applications. ■
[Figure: IGBT optimization triangle. The three corners of the trade-off are VCE(ON) (DC losses, in volts); ruggedness (short-circuit and avalanche energy); and switching behavior (trise and tfall in ns; Eon and Eoff in mJ/A; "fast, but soft" being the intangible).]
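The frequency-dependent trade-off Jakwani describes can be made concrete with a simple loss estimate: conduction loss scales with VCE(sat), while switching loss scales with (Eon + Eoff) times switching frequency. The device parameters and operating point below are assumed round numbers for illustration only, not ON Semiconductor data.

    # Rough IGBT loss split vs. switching frequency.
    # All device values are illustrative assumptions, not ON Semiconductor data.
    I_LOAD = 20.0       # amps, average conducted current (assumed)
    DUTY = 0.5          # conduction duty cycle (assumed)
    VCE_SAT = 1.8       # volts, on-state drop (assumed)
    E_SWITCH = 1.0e-3   # joules per switching event, Eon + Eoff at this current (assumed)

    def losses(f_switch_hz):
        p_conduction = VCE_SAT * I_LOAD * DUTY
        p_switching = E_SWITCH * f_switch_hz
        return p_conduction, p_switching

    for f in (10e3, 40e3):  # motor-drive range vs. high-frequency UPS/solar range
        p_c, p_s = losses(f)
        print(f"{f/1e3:.0f} kHz: conduction {p_c:.0f} W, switching {p_s:.0f} W")
    # At 10 kHz the conduction term dominates (favor low VCE(sat)); at 40 kHz the
    # switching term takes over (favor low Eon/Eoff), matching the article's point.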
Get the Datasheet and Order Samples http://www.intersil.com
Power Factor Correction Controllers: ISL6730A, ISL6730B, ISL6730C, ISL6730D

The ISL6730A, ISL6730B, ISL6730C, and ISL6730D are active power factor correction (PFC) controller ICs that use a boost topology (the ISL6730B, ISL6730C, and ISL6730D are coming soon). The controllers are suitable for AC/DC power systems up to 2kW and over the universal line input.

The ISL6730A, ISL6730B, ISL6730C, and ISL6730D are operated in continuous current mode. Accurate input current shaping is achieved with a current error amplifier. A patent-pending breakthrough negative capacitance technology minimizes zero-crossing distortion and reduces the size of the magnetic components. The small external components result in a low-cost design without sacrificing performance. The internally clamped 12.5V gate driver delivers 1.5A peak current to the external power MOSFET. The ISL6730A, ISL6730B, ISL6730C, and ISL6730D provide a highly reliable system that is fully protected. Protection features include cycle-by-cycle overcurrent, over-power limit, over-temperature, input brownout, and output overvoltage and undervoltage protection.

The ISL6730A and ISL6730B provide excellent power efficiency and transition into a power-saving skip mode during light-load conditions, improving efficiency automatically. The ISL6730A, ISL6730B, ISL6730C, and ISL6730D can be shut down by pulling the FB pin below 0.5V or grounding the BO pin. The ISL6730C and ISL6730D have no skip mode.

Two switching frequency options are provided. The ISL6730B and ISL6730D switch at 62kHz, and the ISL6730A and ISL6730C switch at 124kHz.

Features
• Reduce component size requirements
  - Enables smaller, thinner AC/DC adapters
  - Choke and cap size can be reduced by 66%
  - Lower cost of materials
• Excellent power factor over line and load regulation
  - Internal current compensation
  - CCM mode with patent-pending IP for smaller EMI filter
• Better light load efficiency
  - Automatic pulse skipping
  - Programmable or automatic shutdown
• Highly reliable design
  - Cycle-by-cycle current limit
  - Input average power limit
  - OVP and OTP protection
  - Input brownout protection
• Small 10 Ld MSOP package

Applications
• TV AC/DC power supply
• Desktop computer AC/DC adaptor
• Laptop computer AC/DC adaptor
• AC/DC brick converters
[Figure 1: Typical application. A boost PFC stage from the rectified line input (VLINE, VI) to VOUT, built around the ISL6730 (VCC, ISEN, GATE, ICOMP, GND, VIN, FB, COMP, BO, VREG pins).]

[Figure 2: PFC efficiency. Efficiency (%), plotted from 60% to 100%, versus output power from 0 to 100 W, with curves for the ISL6730A (skip mode) and ISL6730C.]
TABLE 1. KEY DIFFERENCES IN FAMILY OF ISL6730

VERSION               ISL6730A    ISL6730B    ISL6730C    ISL6730D
Switching Frequency   124kHz      62kHz       124kHz      62kHz
Skip Mode             Yes-Fixed   Yes-Fixed   No          No

February 26, 2013, FN8258.0
Intersil (and design) is a registered trademark of Intersil Americas Inc. Copyright Intersil Americas Inc. 2013 All Rights Reserved. All other trademarks mentioned are the property of their respective owners.
Reactive Power Part 2: Problems with highly compensated power systems

P. Jeffrey Palermo, Executive Consultant, DNV KEMA Energy & Sustainability

Major load areas with high levels of shunt capacitors can present special challenges to system planners and operators. Such highly compensated systems are difficult to operate and are subject to extremely rapid voltage collapse, which can occur with little warning, especially during contingencies.
In the event of a widespread collapse, it could take many hours to restore customer service. Reconnecting generators, loads, and transmission to restore order in the system is a painstaking process. Generation and load must be balanced, and there must be enough transmission available at each step of the restoration process. Such a widespread outage could have an enormous impact. For example, it could cripple a major population and commercial center. As a result, highly compensated systems that serve major load centers are rare in the developed world. Reactive compensation, for this discussion, includes transmission-level (>100 kV) fixed and switched shunt capacitors, static var compensators (SVCs) and similar electronic devices, and synchronous condensers. These devices are typically added to the system to provide reactive support. Unused reactive capacity of generators in the affected area would be an additional source of var support.
[Figure 1: Example P-V curves with increasing var compensation. Bus voltage (p.u., from 0.40 to 1.10) versus area imports (MW, from 5,000 to 13,000) for some, more, and a lot of compensation, with the safe voltage, safety margin, critical voltage, and normal voltage range marked.]
Highly compensated systems have four main problems:
1. Rapid voltage collapse
2. High normal operating voltages
3. Additional complexity
4. Limited operating flexibility
All of these relate to risk. When the load at risk is not a major load center, the risk may be acceptable. But when a major load center is at risk, other solutions should be considered.
Rapid Voltage Collapse
Static var compensators (SVCs) can provide stable reactive power to improve power system performance.
The gradual voltage decline of normally compensated systems allows system operators to control the system and allows other conventional voltage control systems to be effective. Declining voltage acts as a warning signal for system operators. If voltages fall below the accepted voltage range, the operators can change the generation dispatch or take other actions to improve system voltage. The figure above shows three typical power-voltage (P-V) curves for a major load area with increasing var compensation using capacitors. The "some compensation" curve shows the common shape of a P-V curve—as the power transfer level increases, the voltage gradually declines. As the power level gets close to the voltage limit, the voltage begins to fall more quickly until it reaches a point where no more power can flow without the voltage collapsing. This point is called the critical voltage, as shown in the figure. It is common to allow for
some uncertainty in these limits by using a safety margin (typically 5%), which establishes a safe voltage level. As the amount of compensation increases, three things happen to the curves:
1. The top portion of the curves becomes flatter and extends farther to the right, which means more power can be imported into the area.
2. The critical voltage rises, which means there is less difference between normal voltages and the critical and safe voltages.
3. The slope of the P-V curve gets steeper near the point of collapse as the system becomes more highly compensated. This means that a voltage collapse can occur much more quickly, with less time for the operators to take corrective action.
Adding capacitors will generally improve the active power margin but will also increase the critical voltage. Therefore, the ability of the system to accommodate capacitors has an upper limit. As the amount of shunt compensation increases, so does the critical voltage. Once the critical voltage reaches the normal operating range, the voltage becomes uncontrollable and small changes in power will result in large changes in voltage, which are unacceptable. In highly compensated systems, a gradual voltage decline cannot be expected. The voltage remains within the normal band until all reactive resources are exhausted; then, the voltage collapses almost instantaneously. Because the voltage stays in the normal range almost until the end, it can provide a false sense of security to system operators. A rapid voltage collapse occurs when all reactive sources are exhausted. SVCs and other controllable reactive sources that act to maintain voltages within the normal range can mask an impending voltage collapse. Once these reactive resources are exhausted, there are no more reserves to support the voltage. One of the contributing factors to the August 2003 major blackout in the United States was fast voltage collapse. This can happen during periods of heavy load, especially when there is a dominance of motor load (e.g., air conditioning and industrial load). Recovery from the voltage sag during these conditions can be very slow.
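The behavior in Figure 1 can be reproduced with the classic two-bus model: a source E behind a transfer reactance X feeding a unity-power-factor load P, with a shunt susceptance B at the load bus. The sketch below is a minimal illustration with assumed per-unit values (E = 1.0, X = 0.2, and three arbitrary compensation levels); it is not based on any particular system studied in the article. It shows both effects described above: added shunt capacitance raises the maximum import level, but it also pushes the critical (nose-point) voltage up toward, and eventually into, the normal operating band.

    import numpy as np

    E = 1.0    # sending-end voltage, per unit (assumed)
    X = 0.2    # transfer reactance, per unit (assumed)

    def pv_curve(b_shunt, p):
        """Stable (upper) branch of the P-V curve for a two-bus system with a
        unity-power-factor load P and shunt susceptance b_shunt at the load bus."""
        a = (1.0 - b_shunt * X) ** 2
        disc = E**4 - 4.0 * a * (p * X) ** 2
        v_squared = np.where(disc >= 0.0,
                             (E**2 + np.sqrt(np.clip(disc, 0.0, None))) / (2.0 * a),
                             np.nan)
        return np.sqrt(v_squared)

    p = np.linspace(0.0, 5.0, 501)      # active power import, per unit
    for b in (0.0, 1.0, 1.75):          # increasing shunt compensation (assumed values)
        v = pv_curve(b, p)
        p_max = E**2 / (2.0 * X * (1.0 - b * X))      # nose-point (maximum) import
        v_crit = E / (np.sqrt(2.0) * (1.0 - b * X))   # critical voltage at the nose
        print(f"B = {b:.2f} pu: max import = {p_max:.2f} pu, "
              f"critical voltage = {v_crit:.2f} pu")

    # Output: the maximum import climbs from 2.50 to 3.85 pu while the critical
    # voltage climbs from 0.71 to 1.09 pu, up into the normal 0.95-1.05 pu band.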
High Normal Operating Voltages
The figure above also shows the normal operating range between approximately 1.05 p.u. and 0.95 p.u. of the nominal voltage. System operators use voltage as an indicator for potential problems. If the critical voltage is in the normal range due to very high compensation, then there will be no such warning. The system could be at risk even when the voltages are within the normal range.
Additional Complexity
Highly compensated systems can also introduce operating problems due to increased complexity. The multiplicity of devices can work against each other and make a serious situation worse. System operators must handle many combinations of outages, which add further complexity. The system operators monitor equipment loading, voltages, and other measures that indicate the health of the system. Examples of indicators include import limits (based on planning studies), generation reserve margins, and voltage margins. System operators also use online computer tools to help identify potential problems. Highly compensated systems can have many combinations of critical events—too many to study in advance. This can create a situation in which operators are unaware that the system is in danger of collapse. Operators and planners do not have a comprehensive computer tool that can handle the complexities and risks associated with highly compensated systems. Added complexity also comes from each reactive device having its own local control logic based on information at the connection point. These controls react to conditions and changes at their location without regard to overall system needs. The devices are set after careful study of likely system conditions; however, there are many more combinations of events than can be studied. Having too many independent voltage controllers can put operation at risk. If the system is critically dependent on these devices, there
will be little room for error in the automatic controls. Should the wrong combination of events occur, there will be no warning or way to prevent a voltage collapse.
Limited Operating Flexibility
While a few switched capacitors and a limited amount of SVCs can provide operating stability, in an extreme situation, these devices actually reduce operating flexibility. The range of acceptable operating conditions becomes narrower. This can be seen in the figure above by comparing the critical voltages. The highly compensated curve has a very high critical voltage. If all the equipment is not operating as expected or if the anticipated reactive power margin is not available, there could be problems. This requires very tight control of generators, synchronous condensers, SVCs, and transformer tap set points. While compensation devices allow a great deal of flexibility in controlling the system voltage, highly compensated systems provide little room for error. An unexpected combination of events can cause voltages to fall below the safe range and result in a surprise voltage collapse.
About the Author
Mr. Palermo has more than 35 years of experience in the power sector, with specific expertise in system planning and sector restructuring. At DNV KEMA, Mr. Palermo is responsible for system planning and operating studies of generation and transmission systems within a range of multi-utility coordination schemes. He advises and assists utilities in developing and evaluating transmission plans, including a wide range of system analyses using a variety of steady-state and dynamic system analysis tools and techniques. Mr. Palermo has appeared as an expert witness before FERC, Congressional committees, state legislatures, state and federal courts, arbitration panels, and state and provincial utility commissions. He has provided numerous lectures on restructuring, system planning and system operation. ■
DSKY: Working With the Interface of the Apollo Program

President John F. Kennedy's 1962 vision of placing a man on the moon within the span of a decade would require not just a powerful spacecraft, but a powerful and efficiently sized guidance computer and interface. The Apollo Guidance Computer, or AGC, was created to realize this vision and helped make manned space flight a reality. Designed for NASA at the MIT Instrumentation Laboratory under Charles Stark Draper, it was one of the world's first computers to use integrated circuits. The simple but sophisticated interface for the AGC was known as the DSKY, standing for "display and keyboard" and pronounced "dis-key." The DSKY numbers were displayed via green high-voltage electroluminescent seven-segment displays, and the calculator-style interface was the first of its kind, generally regarded as the basis for all digital control panel interfaces. To input commands to the AGC using the DSKY, commands were entered as two-digit numbers and classified as either a "verb" or a "noun": Verb entries described the type of action to be performed, and Noun entries specified which data were affected by the action specified by the Verb command.
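As an illustration of that verb/noun pattern (and only an illustration: this is a toy sketch, not AGC flight software), the snippet below dispatches two-digit codes the way a DSKY entry pairs an action with its data. The specific codes follow commonly cited AGC conventions, for example Verb 16 for "monitor decimal" and Noun 36 for the AGC clock time, but they are included here purely as examples.

    # Toy sketch of the DSKY's two-digit verb/noun command pattern.
    # Codes below follow commonly cited AGC conventions but are illustrative only.
    VERBS = {
        6: "display decimal",    # show a value once
        16: "monitor decimal",   # keep updating the displayed value
        37: "change program",    # switch the running major mode
    }

    NOUNS = {
        36: "time of the AGC clock",
    }

    def dsky_entry(verb: int, noun: int) -> str:
        """Describe the action a V<nn> N<nn> key sequence requests."""
        action = VERBS.get(verb, f"verb {verb:02d} (not in this sketch)")
        target = NOUNS.get(noun, f"noun {noun:02d} (not in this sketch)")
        return f"V{verb:02d} N{noun:02d}: {action} of {target}"

    # Example: the often-quoted "what time is it?" query.
    print(dsky_entry(16, 36))    # -> V16 N36: monitor decimal of time of the AGC clock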
Glenn Prow with the DSKY interface
NASA employed many contractors to work in the space program. Glenn Prow retired in 1994 as a Principal Engineer for Lockheed after 35 years in the space industry. In addition to work on the Space Shuttle and International Space Station programs, Glenn worked with the DSKY interface during the Apollo program.

WHAT ARE SOME OF YOUR MEMORIES WORKING WITH THE DSKY DURING THE APOLLO PROGRAM?

It has been almost 45 years since I worked with the DSKY. As compared to today's technology and laptop computers, the Apollo computer and DSKY were very crude. Laptops and computers today, as you know, have a full keyboard of numbers, letters, and symbols, whereas the DSKY had large pushbuttons and you could enter numbers only, and during missions sometimes only with gloved fingers. The DSKY also displayed numbers only, so the user either memorized the codes displayed or read the results for his use, i.e., the attitude of the spacecraft in relation to the stable platform, which was the IMU (Inertial Measurement Unit). The IMU consisted of gyroscopes to detect change in velocity and direction. It might also be noted that the electronics in the computer and other associated equipment consisted of discrete components, i.e., transistors, resistors, capacitors, coils, etc. They were connected by welding their leads together and/or attaching them to a circuit board; integrated circuits were still in the development stage at this time.
The Apollo computer (AGC) consisted of plug-in potted modules. To troubleshoot and/or repair a module, one had to use a soda-blaster (similar to a sand-blaster) to bare and reveal the component for testing and possible replacement. Both the Apollo computer and DSKY were housed in heavy machined casings. At the time, they could only estimate the environment that components would be exposed to: vibration, shock, temperature extremes, radiation—to name a few items. There were actually two DSKYs on board on the Moon flights—one in the Apollo command module and one in the LEM.

DID THE DSKY ALLOW THE CREW OF APOLLO 13 MORE FLEXIBILITY IN CREATING WORKAROUND PROCEDURES TO GET SAFELY HOME?

For the crew of Apollo 13, the DSKY was their only source of communications with the computer and had to be used throughout the flight and during their return. There was also uplink from Mission Control in Houston.

WHAT WERE SOME OF THE POST-APOLLO PROJECTS YOU WORKED ON INVOLVING MICRO-COMPUTERS AND INTERFACES?

In the early stages of the "Internet" (it wasn't called that then), we interfaced many terminals to numerous minicomputers throughout the Johnson Space Center site and later provided connection to other government computers throughout the world. My last task before retiring was to work on the Space Station design for placing various external experiments on the Space Station, having to interface with designers from the USA and foreign countries.

THE APOLLO GUIDANCE COMPUTER and DSKY were later utilized in pioneering fly-by-wire systems in military aircraft. Microcomputing replaced bulky hydraulic control systems with computer-controlled precision, drastically reducing weight; these advancements later benefitted the commercial aviation industry. You can thank the Apollo program next time you enjoy a smooth, affordable airline journey. There is a DSKY on display at the Smithsonian Institution. In 2009, a DSKY was sold at auction for $50,788.
Augmented Reality & Wearable Computers

Alex Toombs, Electrical Engineering Student
Why are these augmented reality devices coming into the public eye now?
Oculus Rift unexpectedly raised $2.5 million last year, ten times their asking price on Kickstarter. Google Glass sold out their developer editions at last year's Google I/O developer conference. The price for even an untested prototype was an immense $1500. Whatever the reason, wearable computing and augmented reality spaces are becoming major markets.
Consumer Electronics in Today's Modern Market

Announced at the annual Google I/O conference in June 2012, Google's Project Glass is a new take on wearable computing, and is heavily influenced by Steve Mann's work (as pictured on Page 34). Google's Android-themed Glass project was shown in a working demo when several people wearing the device, all participating in a live Google+ Hangout with one another and presenters in the auditorium below, skydove out of a plane and into the presentation hall—all live. Google Glass works much like the EyeTap, with an HUD displayed on the lens hanging over one eye off the aluminum frames. The computer and battery are contained in the frames of the device, with a microphone so
that voice commands can control it. Some speculate that a tight integration between Android devices and the proximity of the user will allow the lens to give information about what is around the user, including local restaurants and areas of interest. The device is scheduled to ship at the end of 2013, and developer versions are soon to be available. The impact of these devices could be huge as companies look for new markets to sell to, with the technology just now able to be compressed to a wearable size. Another battlefield several tech companies are reported to be moving into is that of smart watches. Possibly the most famous Kickstarter success story yet is the Pebble watch, which raised its $100,000 goal within two hours and was over five million dollars within the week. These watches display information received from cell phones over Bluetooth, a wireless technology, allowing texts, missed calls, and other notifications to be displayed on the screen. The E-paper screen allows the battery of the device to last close to a week. The success of the Apple- and Android-compatible watch has led to credible rumors about Apple considering entering the market. Some believe such a device would be unprofitable after Sony gambled and largely lost on Android-based smartwatches, but many think Apple, driven by dropping demand in a saturated market, could shake things up. The Pebble watch is shown below as Figure 3.
Alternate Reality Devices

Similar to the rise of Pebble, the Oculus Rift was another Kickstarter success story that has since taken the gaming and technology world by storm. The team behind Oculus Rift was able to raise $2.4 million in order to begin manufacturing, though they had a working prototype when they posted the project online. While virtual reality has existed in various forms for the last few decades, none have successfully made it enticing enough to appeal to a wide segment of consumers.
HISTORY OF WEARABLE COMPUTING

Since 1997, the International Symposium on Wearable Computers has met annually to discuss the newest innovations and technologies around wearable computing. As personal computers became more prevalent toward the end of the 1990s, interest in portable applications inspired by science fiction grew enough to attract the attention of businesses and academic organizations. The first instance of a wearable "computer" dates back to the Qing Dynasty of China, circa the mid-1600s, when an abacus-on-a-ring was first produced. The next real iteration was used by knowledgeable gamblers during the 1960s and 1970s, including mathematicians Edward O. Thorp and Claude Shannon, who used small computers inside of their shoes in order to predict roulette outcomes. By tapping every time the ball passed on the roulette wheel, the computer was able to predict when the ball would land in one of eight "octants," or an eighth of the wheel. Even though the predictions were inaccurate, the resultant computer guesses were a little better than the average, meaning that using the device gave winnings that beat the house spread. Detailed images of the device can be seen on the EyeTap website. The 1980s and 1990s went on to develop further advances in wearable computing fields, including the first head-mounted cameras. Additionally, multiple-function devices were realized with calculator watches during the 1980s. Among the first of these were the Casio Databank series of watches.
[Figure: Steve Mann: Evolution of wearable computing + augmented reality in everyday life. Timeline: 1980; 1995 (passport photo); 1999 (Mann's "EyeTap Digital Eye Glass"); 2004 (with firstborn child); 2012 ("Google Glass"). Mann was recognized as the "Father of AR" and the "Father of Wearable Computing" (IEEE ISSCC 2000). Source: "Through the glass," IEEE Technology and Society, Vol. 31, No. 3, Fall 2012, pp. 10-14.]
During the same time period, noted University of Toronto electrical engineering professor Steve Mann began developing his EyeTap computing system, a wearable computing system that consists of a lens over one eye with a computer-generated image projected in front of it. The resulting image is an example of augmented reality, where a display similar to a video game’s “heads-up display,” or HUD, is shown over real life. The technology behind the EyeTap is impressive, covering several fields. First, the image hits a beam splitter, which sends the image before the user to both their eye and a camera. The camera digitizes the image and sends the information to the onboard computer which processes the information and augments the reality around it. EyeTap devices have even been made in stereo configurations which have the HUD shown in both eyes, as opposed to just one.
Now developers, including those behind the popular Gears of War series, have ported their games onto the system, with the lead game developer evangelizing at technology conferences like CES and SXSW. The Oculus Rift renders an effective 640×800 3D screen for each eye, though higher-resolution screens are being viewed for consumer models. To simulate human vision, each eye's portion of the screen includes some overlap with the other's, allowing the game to project a 360-degree immersive environment around the user. Many important industry veterans, including Gabe Newell and John Carmack, are raving about the hardware; others, meanwhile, look to enter the market themselves.
The Future of Augmented Reality

While Steve Mann has long since earned the right to be called the 'father of wearable computing,' it is just now that many large-scale companies are considering throwing their hats into the ring. Perhaps driven by a desire to overcome laptop, tablet, and phone saturation, Apple and Google are working their respective R&D arms to come up with the next big thing. Platforms like
Kickstarter have allowed those with innovations to fund manufacturing and bring their technology to a wide variety of customers, allowing risky startups to blaze the trail that the tech giants will inevitably follow. With Project Glass and Oculus Rift likely shipping at the end of this year, it will be interesting to see how the consumer electronics landscape changes in the next few years.
About the Author
Alex Toombs is a senior electrical engineering student at the University of Notre Dame, concentrating in semiconductor devices and nanotechnology. His academic, professional, and research experiences have exposed him to a wide variety of fields: from financial analysis to semiconductor device design; from quantum mechanics to Android application development; and from low-cost biology tool design to audio technology. Following his graduation in May 2013, Alex will be joining the cloud startup Apcera as a Software Engineer. ■