
Automatic Defect Classification: A Productivity Improvement Tool
by Tony Esposito, IBM Corporation; Mark Burns, Scott Morell, KLA-Tencor; Eric Wang, Stanford University

Why should a semiconductor fab invest the time to review and classify defects on a wafer after the wafer has been inspected? Adding a classification step to an already lengthy fabrication process runs counter to manufacturing fundamentals unless it can be proven that the additional step positively influences final yield. The extra information about the source of defects is an obvious benefit that defect review and classification provide. However, a method for quantifying the benefit of classification is required.

Traditionally, classification is done manually by a human operator after the wafer is inspected. Manual review and classification of defects offers defect source information but carries with it several less-than-ideal side effects. From a manufacturing standpoint, the extra processing step increases the total time it takes for a lot to work through the process flow. Classification also requires additional employees and the review tools on which to do the review and classification. From an engineering standpoint, the information fed back by classification is only useful if it is accurate and consistent. In practice, a multitude of factors affect the accuracy and cost of manual classification, including operator experience and alertness, operator-to-operator consistency, the cost of operator labor and review stations, and the excessive queue time lots spend waiting for review after in-line inspection. The ideal solution is to place the task of classification with an automated system that reduces or eliminates the majority of these negative side effects.
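To make the trade-off concrete, the short sketch below illustrates one way the per-lot cost of manual review versus inline automated classification might be quantified. The model structure, parameter values, and function name are illustrative assumptions, not figures from the evaluation described in this article.

```python
# Hypothetical per-lot cost model comparing manual defect review with
# inline automated classification (ADC). All parameter values are
# illustrative assumptions, not data from the beta evaluation.

def review_cost_per_lot(defects_reviewed, seconds_per_defect,
                        labor_rate_per_hour, station_cost_per_hour,
                        queue_hours, wip_delay_cost_per_hour=5.0):
    """Rough cost: operator time + review-station time + cost of queue delay."""
    review_hours = defects_reviewed * seconds_per_defect / 3600.0
    labor = review_hours * labor_rate_per_hour
    station = review_hours * station_cost_per_hour
    # Queue time adds no direct labor, but it lengthens lot cycle time;
    # here it is charged at an assumed cost of delayed WIP per hour.
    delay = queue_hours * wip_delay_cost_per_hour
    return labor + station + delay

# Manual review: an operator classifies a sample of defects at a review
# station after the lot has waited in queue.
manual = review_cost_per_lot(defects_reviewed=50, seconds_per_defect=30,
                             labor_rate_per_hour=35.0,
                             station_cost_per_hour=20.0, queue_hours=4.0)

# Inline ADC: classification happens on the inspector itself, so queue
# time and most operator time disappear; a small verification sample remains.
adc = review_cost_per_lot(defects_reviewed=5, seconds_per_defect=30,
                          labor_rate_per_hour=35.0,
                          station_cost_per_hour=20.0, queue_hours=0.0)

print(f"manual review: ${manual:.2f}/lot   inline ADC: ${adc:.2f}/lot")
```

Under a model of this kind, most of the savings come not from the classification step itself but from removing the queue time and operator labor that manual review adds to every lot.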

Beta evaluation of IMPACT/Online™

In a scientific approach to this task, IBM installed a beta version of IMPACT/Online ADC on a KLA-Tencor 2132 defect inspector at IBM Burlington in order to collect data and analyze costs. The system was trained to classify five different production levels as part of the beta tool evaluation. The levels included Trench Isolation, two metal levels, POLY on 64 Mb DRAM, and After-Develop Inspection or ADI (single-layer cluster tool daily monitor wafers using 64 Mb pitch). For each level, a minimum of 10 lots was used to measure the performance of ADC against a pre-defined set of metrics.

ADC performance

The overall ADC performance of the five process levels (figure 1) was measured and recorded. Each of the ADC classifiers performed at or above the expected levels for the beta evaluation.

Process/Level            Accuracy   Purity   Redetection
Defect Standard Wafer       97%       100%       100%
ADI Excursion Monitor       80%        89%        91%*
4 Mb Metal 1                72%        80%       100%
16 Mb Metal 1               77%        87%        99%
64 Mb POLY                  80%        80%        98%

Figure 1. Beta performance of IMPACT/Online ADC. *The lower-than-normal redetection for the ADI monitor is due to nuisance defects. Subsequent to follow-on beta testing, the inspection recipe was modified using Segmented Auto Threshold (SAT), which reduced the nuisance defects and improved redetection to greater than 95 percent.
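Figure 1 reports accuracy, purity, and redetection without spelling out how they are computed. The sketch below shows one common convention, assumed here: redetection is the fraction of inspector-reported defects the ADC system relocates during review, accuracy is the fraction of redetected defects whose ADC class matches the reference (manual) class, and purity is the same comparison restricted to defects the classifier placed in a real class rather than an "unknown" bin. Field names and example data are hypothetical.

```python
# Minimal sketch of the three figure-1 metrics computed from per-defect
# review results. Field names, the "unknown" bin convention, and the
# example data are assumptions for illustration only.

from dataclasses import dataclass
from typing import Optional

@dataclass
class DefectResult:
    redetected: bool              # did ADC relocate the defect during review?
    adc_class: Optional[str]      # class assigned by ADC ("unknown" if unsure)
    expert_class: Optional[str]   # reference class from manual/expert review

def adc_metrics(results):
    total = len(results)
    redetected = [r for r in results if r.redetected]
    # Redetection: fraction of inspector-reported defects ADC relocated.
    redetection = len(redetected) / total if total else 0.0
    # Accuracy: among redetected defects, fraction whose ADC class matches
    # the expert class (defects binned as "unknown" count against accuracy).
    accuracy = (sum(r.adc_class == r.expert_class for r in redetected)
                / len(redetected)) if redetected else 0.0
    # Purity: same comparison, restricted to defects the classifier placed
    # in a real class rather than the "unknown" bin.
    classified = [r for r in redetected if r.adc_class not in (None, "unknown")]
    purity = (sum(r.adc_class == r.expert_class for r in classified)
              / len(classified)) if classified else 0.0
    return redetection, accuracy, purity

# Tiny hypothetical lot: four inspector-reported defects.
results = [
    DefectResult(True, "particle", "particle"),
    DefectResult(True, "scratch", "particle"),
    DefectResult(True, "unknown", "residue"),
    DefectResult(False, None, "particle"),
]
r, a, p = adc_metrics(results)
print(f"redetection={r:.0%}  accuracy={a:.0%}  purity={p:.0%}")
```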

Case study: ADI excursion monitor

At the time of this study, the production classification strategy for the ADI Excursion Monitor was in transition from manual review and classification, to online automated defect classification using IMPACT. Therefore, the data collected for this study includes classification data from both the operators and ADC.
