Single-column e-beam systems are used in production for the detection of electrical defects, but are too slow to detect small physical defects and cannot meet future inspection requirements. This paper presents a multiple-column e-beam technology for high-throughput wafer inspection.
Multibeam has developed all-electrostatic columns for high-resolution imaging. Eliminating magnetic coils allows the columns to be small, and e-beam deflection is faster in the absence of magnetic hysteresis. Multiple miniature columns are assembled in an array; an array of 100 columns covers the entire surface of a 300 mm wafer, affording simultaneous cross-wafer sampling. Column performance simulations and the system architecture are presented, along with examples of high-throughput, more efficient multiple-column wafer inspection.
Most semiconductor manufacturers expect 193nm immersion lithography to remain the dominant patterning
technology through the 32nm technology node. Conventional immersion lithography, however, is unlikely to
take the industry to 32nm half-pitch. Various double patterning techniques have been proposed to address
this limitation. These solutions will combine design for manufacturability (DFM) and advanced process
control (APC) strategies to achieve desired yield. Each strategy requires feeding forward design and process
context and feeding back process metrics. In this work, we discuss some interim solutions for control of
double patterning lithography (DPL), as well as some spacer-etch alternatives. We conclude with focus-exposure
data showing some potential challenges for pitch-splitting strategies implemented in the context of
immersion lithography.
Most semiconductor manufacturers expect 193nm immersion lithography to remain the dominant
patterning technology through the 32nm technology node. If this remains the case, the interaction
of more complex designs with shrinking process windows will severely limit parametric yield.
The industry is responding with strategies based upon design for manufacturability (DFM) and
multi-variate advanced process control (APC). The primary goal of DFM is to enlarge the process
yield window, while the primary goal of APC is to keep the manufacturing process in that yield
window. In this work, we discuss new and innovative process metrics, including simulation-based
virtual metrology, that will be needed for yield at the 32nm technology node.
CD control is one of the main parameters for IC product performance and a major contributor to yield. Traditional SEM metrology can be challenging on particular layers due to normal process variation and has not proven to provide sufficient focus-monitoring capability. This causes false positives that result in unnecessary rework and, more importantly, missed focus excursions that result in yield loss.
Alexander Starikov of Intel Corporation has noted that the focus and exposure "knobs" account for more than 80% of correctable CD variance [1]. Spansion F25 is evaluating an alternative technology that uses an optical method for indirect monitoring of CD at the implant layer. The optical method utilizes a dual-tone line-end-shortening (LES) target measured on a standard optical overlay tool. The dual-tone technology makes it possible to separate the contributions of focus and exposure, yielding a more accurate characterization of the two parameters on standard production wafers. Ultimately, by keeping focus and exposure within acceptable limits, the CD can be assumed to remain within acceptable limits as well, without the unnecessary rework caused by process variation.
Using designed experiments, this paper characterizes the LES technique on the implant layer, demonstrating its ability to separate focus and exposure errors relative to traditional SEM metrology. Actual high-volume production data are used to compare the robustness and sensitivity of the two technologies in a real-life production environment. An overall outline of the production implementation is documented as well.
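The dual-tone separation can be illustrated as a small linear inversion. The sketch below assumes each tone's LES responds linearly to dose error and quadratically to defocus near best focus; the sensitivities and offsets are hypothetical placeholders, not calibrated production data, and would in practice come from a focus-exposure calibration matrix.

```python
import numpy as np

# Assumed linearized response of the two LES tones:
#   LES_clear = a1*dE + b1*F^2 + c1
#   LES_dark  = a2*dE + b2*F^2 + c2
A = np.array([[2.0, -5.0],    # clear tone: nm per unit dose error, nm per F^2
              [1.2,  4.0]])   # dark tone (hypothetical sensitivities)
c = np.array([10.0, 8.0])     # LES at nominal dose and best focus, nm

def invert_les(les_clear, les_dark):
    """Recover (dose error, defocus^2) from one dual-tone LES measurement."""
    rhs = np.array([les_clear, les_dark]) - c
    return np.linalg.solve(A, rhs)

dE, F2 = invert_les(12.8, 13.6)   # dE = 2.8, F2 = 0.56
```

Because the two tones have opposite-signed focus sensitivities here, the 2x2 system is well conditioned and a single target measurement separates the two parameters.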
KEYWORDS: Design for manufacturing, Critical dimension metrology, Process control, Metrology, Overlay metrology, Single crystal X-ray diffraction, Line width roughness, Data modeling, Immersion lithography, Lithography
Immersion lithography at 193nm has emerged as the leading contender for critical patterning through the 32nm technology node. Super-high NA, along with attendant polarization effects, will require re-optimization of virtually every resolution enhancement technology and the implementation of advanced process control at intra-wafer and intra-field levels. Furthermore, interactions of critical dimensions, profiles, roughness, and overlay between layers will impact design margins and become severe yield limiters. In this work, we show how design margins are reduced as a result of hidden process error and how this error can be parsed into unobservable, unsampled, unmodeled, and uncorrectable components. We apply four new process control technologies that use spectroscopic ellipsometry, grating-based overlay metrology, e-beam array imaging, and simulation to reduce hidden systematic error. Feedback of super-accurate process metrics will be critical to the application of conjoint DFM and APC strategies at the 65nm node and beyond. Manufacturing economics will force a trade-off between measurement cost and yield loss that favors greater expenditure on process control.
Due to the continuous shrinking of the design rules and, implicitly, of the lithographic process window, it becomes more and more important to implement dynamic, on-product process monitoring and control based on both dose and focus parameters. The method we present targets lot-to-lot, inter-field, and intra-field dose and focus effect monitoring and control. The advantage of simultaneous dose and focus control over the currently used practice of correcting CD by adjusting exposure dose only is visible in the improvement of the CD distributions at both pre-etch and post-etch phases. The on-product monitoring and compensation is based on the optical measurement of a special compact line-end-shortening target, which provides the unique ability to separate dose from focus on production wafers.
As design rules shrink and process windows become smaller, it is increasingly important to monitor exposure tool focus and exposure in order to maximize device yield. Economic considerations are forcing us to consider nearly all methods to improve yield across the wafer. For example, it is not uncommon in the industry that chips around the edge of the wafer have lower yield or device speed. These effects are typically due to process and exposure tool errors at the edge of the wafer. In order to improve yield and chip performance, we must characterize and correct for changes in the effective focus and exposure at the edge. Monitoring focus and exposure on product wafers is the most effective means for correction, since product wafers provide the most realistic view of exposure tool interactions with the process. In this work, on-product monitoring and correction is based on optical measurement using a compact line end shortening (LES) target that provides a unique separation of exposure and focus on product wafers. Our ultimate objective is indirect CD control, with maximum yield and little or no impact on productivity.
Parametric yield loss is an increasing fraction of total yield loss. Much of this originates in lithography in the form of pattern-limited yield. In particular, the ITRS has identified CD control at the 65nm technology node as a potential roadblock with no known solutions. At 65nm, shrinking design rules and narrowing process windows will become serious yield limiters. In high-volume production, corrections based on lot averages will have diminished correlation to device yield because APC systems will dramatically reduce error at the lot and wafer levels. As a result, cross-wafer and cross-field errors will dominate the systematic variation on 300mm wafers. Much of the yield loss will arise from hidden systematic variation, including intra-wafer dose and focus errors that occur during lithographic exposure. In addition, corollary systematic variation in the profiles of critical high-aspect-ratio structures will drive requirements for vertical process control. In this work, we model some of the potential yield losses and show how sensitive focus-exposure monitors and spectroscopic ellipsometry can be used to reduce the impact of hidden error on pattern limited yield, adding tens of millions of dollars in additional revenue per factory per year.
Process window control enables accelerated design-rule shrinks for both logic and memory manufacturers, but simple microeconomic models that directly link the effects of process window control to maximum profitability are rare. In this work, we derive these links using a simplified model for the maximum rate of profit generated by the semiconductor manufacturing process. We show that the ability of process window control to achieve these economic objectives may be limited by variability in the larger manufacturing context, including measurement delays and process variation at the lot, wafer, x-wafer, x-field, and x-chip levels. We conclude that x-wafer and x-field CD control strategies will be critical enablers of density, performance and optimum profitability at the 90 and 65nm technology nodes. These analyses correlate well with actual factory data and often identify millions of dollars in potential incremental revenue and cost savings. As an example, we show that a scatterometry-based CD Process Window Monitor is an economically justified, enabling technology for the 65nm node.
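A microeconomic link of the kind derived in this work can be sketched minimally. The yield model, dollar figures, and CD sigmas below are hypothetical assumptions for illustration, not the paper's actual model: yield is taken as the fraction of a normal CD distribution inside spec, and profit rate as throughput times margin per wafer.

```python
import math

def yield_from_cd_sigma(cd_sigma_nm, spec_half_width_nm):
    """Fraction of an assumed-normal CD distribution inside +/- spec."""
    z = spec_half_width_nm / cd_sigma_nm
    return math.erf(z / math.sqrt(2.0))

def weekly_profit(wafers_per_week, revenue_per_wafer, cost_per_wafer,
                  cd_sigma_nm, spec_half_width_nm):
    y = yield_from_cd_sigma(cd_sigma_nm, spec_half_width_nm)
    return wafers_per_week * (revenue_per_wafer * y - cost_per_wafer)

# Hypothetical numbers: tightening cross-wafer/cross-field CD sigma
# from 4 nm to 3 nm against an 8 nm half-spec
base = weekly_profit(5000, 4000.0, 2500.0, 4.0, 8.0)
ctrl = weekly_profit(5000, 4000.0, 2500.0, 3.0, 8.0)
```

Even this toy model shows how a modest sigma reduction, annualized across a high-volume factory, can move profit by millions of dollars, consistent with the scale of the savings cited in these analyses.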
Simple microeconomic models that directly link yield learning to profitability in semiconductor manufacturing have been rare or non-existent. In this work, we review such a model and provide links to inspection capability and cost. Using a small number of input parameters, we explain current yield management practices in 200mm factories. The model is then used to extrapolate requirements for 300mm factories, including the impact of technology transitions to 130nm design rules and below. We show that the dramatic increase in value per wafer at the 300mm transition becomes a driver for increasing metrology and inspection capability and sampling. These analyses correlate well with actual factory data and often identify millions of dollars in potential cost savings. We demonstrate this using the example of grating-based overlay metrology for the 65nm node.
As fabs transition from 200 to 300mm wafers with shrinking design rules, the risk and cost associated with overlay excursions become more severe. This significantly impacts the overall litho-cell efficiency. Effective detection, identification, and reduction of overlay excursions are essential for realizing the productivity and cost benefits of the technology shifts. We have developed a comprehensive overlay excursion management method that encompasses baseline variation analysis, statistical separation and characterization of excursion signatures and their frequencies, as well as selection of sampling plans and control methods that minimize material at risk due to excursion. A novel baseline variance estimation method is developed that takes into account the spatial signature and temporal behavior of the litho-cell overlay correction mechanisms. Spatial and temporal excursion signatures are identified and incorporated in a cost model that estimates the material at risk in an excursion cycle. The material at risk associated with various sampling plans, control charts, and cycle times is assessed considering various lot disposition and routing decisions. These results are then used in determining an optimal sampling and control strategy for effective excursion management. In this paper, we describe and demonstrate the effectiveness of the methods using actual 300mm fab overlay data from several critical layers. With a thorough assessment of the actual baseline and excursion distributions, we quantify the amount of wafer-to-wafer and within-wafer sampling necessary for detecting excursions with minimal material at risk. We also evaluate the impact of shorter cycle time and faster response to excursion, which is made possible through automation and alternative metrology configurations.
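The material-at-risk bookkeeping can be illustrated with a deliberately simplified sketch. The sampling model and every number below are hypothetical assumptions, not the statistical method described in the paper: material at risk is approximated as the lots run before the next sampled lot reveals the excursion, plus the lots run during the measurement-and-response delay.

```python
def material_at_risk(sample_every_n_lots, lots_per_hour, response_hours):
    """Mean lots exposed in one excursion cycle (simplified sketch)."""
    mean_wait = sample_every_n_lots / 2.0       # lots until next sampled lot
    delay_lots = lots_per_hour * response_hours # lots run during response
    return mean_wait + delay_lots

risk_sparse = material_at_risk(8, 4.0, 6.0)  # sparse sampling, slow response
risk_dense  = material_at_risk(4, 4.0, 2.0)  # denser sampling plus automation
```

Comparing the two scenarios shows why both denser sampling and shorter cycle time (via automation or alternative metrology configurations) reduce the material at risk per excursion.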
Fundamentally, advanced process control enables accelerated design-rule reduction, but simple microeconomic models that directly link the effects of advanced process control to profitability are rare or non-existent. In this work, we derive these links using a simplified model for the rate of profit generated by the semiconductor manufacturing process. We use it to explain why and how microprocessor manufacturers strive to avoid commoditization by producing only the number of dies required to satisfy the time-varying demand in each performance segment. This strategy is realized using the tactic known as speed binning, the deliberate creation of an unnatural distribution of microprocessor performance that varies according to market demand. We show that the ability of APC to achieve these economic objectives may be limited by variability in the larger manufacturing context, including measurement delays and process window variation.
Simple microeconomic models that directly link metrology, yield, and profitability are rare or non-existent. In this work, we validate and apply such a model. Using a small number of input parameters, we explain current yield management practices in 200 mm factories. The model is then used to extrapolate requirements for 300 mm factories, including the impact of simultaneous technology transitions to 130nm lithography and integrated metrology. To support our conclusions, we use examples relevant to factory-wide photo module control.
As exposure wavelengths decrease from 248 nm to 193, 157, and even 13 nm (EUV), small process defects can cause collapse of the lithographic process window near the limits of resolution, particularly for the gate and contact structures in high-performance devices. Such sensitivity poses a challenge for lithography process module control. In this work, we show that yield loss can be caused by a combination of macro, micro, CD, and overlay defects. A defect is defined as any yield-affecting process variation. Each defect, regardless of cause, is assumed to have a specific 'kill potential.' The accuracy of the lithographic yield model can be improved by identifying those defects with the highest kill potential or, more importantly, those that pose the highest economic risk. Such economic considerations have led us to develop a simple heuristic model for understanding sampling strategies in defect metrology and for linking metrology capability to yield and profitability.
Defectivity in spin-coated but unpatterned ultrathin resist (UTR) films (≤1000 Å) was studied in order to determine whether defectivity will present an issue in EUV (13.4 nm) and 157 nm lithographic technologies, the lithographic regimes where absorption issues mandate the use of ultrathin resists. Four resist samples, formulated from the same Shipley UV6 polymer batch and having the same polymer molecular weight properties but different viscosities, were spin-coated at spin speeds ranging from 1000 to 5000 RPM on a production-grade track in a Class 1 pilot line facility. Defect inspection was carried out with a KLA SP1/TBI tool, while defect review was carried out with a JEOL 7515 SEM and a KLA Ultrapointe Confocal Review Station (CRS) microscope. The results obtained are related to the physical properties of the resist polymers as well as to the spin-coating parameters. In addition, the results of the defect inspection, review, characterization, and Pareto analysis are compared to those obtained on baseline thick resists (≥3500 Å) processed under conditions similar to those of the ultrathin resists. The results show that, for a well-optimized coating process and within the thickness range explored (800–4200 Å), there is no discernible dependence of defectivity on film thickness or spin speed for the particular resists studied. Also assessed is the capability of the current metrology toolset for inspecting, reviewing, and classifying the various types of defects in UTR films.
The 0.13 micrometer semiconductor manufacturing generation, shipping as early as 2001, will have transistor gate structures as small as 100 nm, creating a demand for sub-10 nm gate linewidth control. Linewidth variation consists of cross-chip, cross-wafer, cross-lot, and run-to-run components, so we can expect the individual component requirements to be sub-5 nm. For model-based, run-to-run control systems to achieve this level of performance, stabilization of lithographic focus will be critical. In this work we show promising results based upon a novel phase-shift focus monitor, optical overlay metrology, and robust analysis software. Extensions of this work explore spatial dependencies across the lithographic field due to reticle error and across the wafer due to wafer nanotopography. Both sources of variation can cause collapse of the focus-exposure process window near the limits of lithographic resolution, particularly for gate structures in high-performance microprocessors. Our work supports the contention that photolithography-induced defects may become the primary source of yield loss for the 0.13 micrometer generation and beyond.
The 0.13 micrometer semiconductor manufacturing generation will have transistor gate structures as small as 100 nm, creating a demand for 10 nm gate linewidth control and for measurement precision on the order of 2 nm. This process control requirement is inherently long-term. Therefore, measurement equipment should be able to run days or weeks without significant excursions. The requirement for long-term precision drives both the design and use of measurement equipment. We have found that long-term measurement precision on a single tool may be divided into orthogonal components corresponding to static repeatability, short-term dynamic reproducibility, and long-term stability of the tool. The static component is limited primarily by signal-to-noise ratio, the dynamic component is limited primarily by sample positioning and focusing, and the long-term component is limited primarily by instrument drift and, in the case of monitor wafers, aging of the sample. In this work, we show that each of these components can be reduced to less than 1 nm, 3-sigma, for CD SEM measurements of etched polysilicon gate structures.
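Since the three components are orthogonal, they combine in quadrature. The sketch below uses hypothetical component values consistent with the sub-1 nm, 3-sigma results reported in the text:

```python
import math

def total_precision(static, dynamic, long_term):
    """Combine orthogonal 3-sigma precision components in quadrature."""
    return math.sqrt(static**2 + dynamic**2 + long_term**2)

# Hypothetical values: each component under 1 nm, 3-sigma
p = total_precision(0.8, 0.9, 0.7)
```

With all three components under 1 nm, the combined long-term precision stays below the ~2 nm measurement-precision requirement, which is the point of decomposing the error budget this way.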
KEYWORDS: Scanning electron microscopy, Metrology, Lithography, Semiconducting wafers, Critical dimension metrology, Error analysis, Signal processing, Digital signal processing, Transistors, Deep ultraviolet
Statistical metrology can be defined as a set of procedures to remove systematic and random gauge error from confounded measurement data for the purpose of reducing total uncertainty. We have applied these procedures to the determination of across-chip linewidth variation, a critical statistic in determining the speed binning and average selling price of advanced microprocessors, digital signal processors, and high-performance memory devices. The measurement data were obtained from two sources: a high-throughput CD-SEM and an atomic force microscope. We found that the high throughput of the CD-SEM permitted the additional measurements required for statistical metrology and heterogeneous gauge matching.
Fully automated, multi-mode CD-SEM metrology, utilizing both backscattered electron (BSE) and secondary electron (SE) detection, has been benchmarked to 180 nm critical dimensions using patterns generated by deep-UV lithography. Comparison of pure BSE with conventional SE SEM data used in a study of across-chip linewidth variation (ACLV) revealed that heterogeneous system matching depends on feature orientation as well as an offset between BSE and SE intensity profiles. The corresponding AFM data show that the BSE measurements are more accurate and less sensitive to feature orientation and sample charging. Using the multi-mode system, we found that SE profiles had a higher signal-to-noise ratio while the BSE profiles gave a better representation of the actual line shape. Static and dynamic measurement precision below 2 nm has been achieved with BSE on etched polysilicon. Move-acquire-measure (MAM) times at this precision were under 10 seconds per site. Models for orientation-independent measurement, generic wafer throughput, and overall equipment effectiveness were used to address the issues of system matching, tool productivity, and factory integration, respectively.
In this work, we address the 0.25 micrometer yield-management applications of multimode electron beam imaging using both backscattered and secondary electrons. The prospects for achieving aggressive performance specifications for quarter-micrometer process control in imaging, metrology, and productivity are analyzed, and supporting examples are provided.
The mathematical background underlying the science of critical dimension metrology is presented at a level suitable for the semiconductor process engineer with little direct experience in the field. Concomitant with this purpose, we make few assumptions and derive many of the concepts from first principles. Understanding the fundamentals provides a basis for further learning and a chart for navigating the sometimes murky waters of the literature. The process engineer can use this work profitably on its own or as a companion document to industrial [1] and international [2] standards.
Low-loss electron (LLE) imaging has been shown to have significant advantages over both secondary electron and conventional backscattered electron imaging for the purposes of inspection and critical dimension metrology of integrated circuits. LLE images had high resolution, good atomic contrast, and fewer charging artifacts. Further, they were easily optimized using Monte Carlo simulations, and the optimized LLE images showed excellent precision, accuracy, and linearity in both process control and focus-exposure applications.
KEYWORDS: Silicon, Monte Carlo methods, Scanning electron microscopy, Metrology, Oxides, Signal detection, Sensors, Scattering, Image resolution, Confocal microscopy
In this work we present the results of a radically different approach to imaging of high-aspect-ratio structures such as contact holes. Our approach utilizes two backscattered electron detection subsystems, one optimized for imaging at the top, like most SEM detectors, and another optimized for imaging at the base of submicrometer structures. These detection systems produce signals that can be combined in real time to produce an image which resembles the 'extended focus' images obtained with confocal optical microscopes. Unlike confocal images, however, backscattered electron images have the inherent linearity and resolution characteristic of electron-beam technology. Backscattered electron imaging has been used to solve a number of vexing problems in monitoring semiconductor processes. For example, contact hole measurement with secondary electrons has typically been done with a minimum or zero signal at the base of the structure, so that the measurement value obtained either has poor precision or is the result of extrapolation. In the case of backscattered electrons, the signal can have its maximum at the base of the structure, allowing high-precision measurement with no need for extrapolation. These results are supported by extensive Monte Carlo simulations.
KEYWORDS: Error analysis, Oxides, Metrology, Data modeling, Statistical analysis, Imaging systems, Scanning electron microscopy, Inspection, Process control, Electron beams
Site-to-site LVSEM measurement data on insulating samples are affected in a systematic way by the number of measurements per site. The problem stems from the fact that repeated imaging at the same site does not produce true statistical replicates, since the electron dose is cumulative. Indeed, the measurement values tend to grow or shrink in direct proportion to the total dose applied. The data support a model for linewidth as a function of electron dose that includes a linear term for systematic error and a reciprocal square root term as a scaling parameter for random error. We show that charging samples, such as resist on oxide, where measurements are dominated by site-to-site variation in the systematic error, should be measured at low electron dose. Conversely, conducting samples, such as polysilicon on oxide, where the measurements are dominated by random error, should be measured at relatively high electron dose.
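The dose model described above, a linear systematic term plus a reciprocal-square-root random term, can be sketched directly. All parameter values below are hypothetical illustrations, not fitted values from the study:

```python
import math

def systematic_cd(dose, cd0, k):
    """Systematic part: linewidth grows or shrinks linearly with dose."""
    return cd0 + k * dose

def random_sigma(dose, s0):
    """Random part: scales as the reciprocal square root of dose."""
    return s0 / math.sqrt(dose)

# Charging resist-on-oxide (large |k|): a low dose keeps systematic bias small.
low_dose_bias = systematic_cd(1.0, 100.0, 0.5) - 100.0   # 0.5 nm bias
# Conducting poly-on-oxide (k ~ 0): a high dose shrinks the random error.
high_dose_sig = random_sigma(16.0, 2.0)                  # 0.5 nm sigma
```

The two example evaluations mirror the paper's recommendation: pick the dose that suppresses whichever error term dominates for the sample at hand.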
Different optical metrology systems such as broadband confocal microscopes, broadband coherence probe microscopes, and broadband brightfield microscopes show different linearity characteristics for different layers and linewidths. Linearity of response is dependent not only on the layer specifics but on the optical system parameters as well. These include the type of microscope, the bandwidth of illumination, the numerical aperture, the partial coherence, etc. Algorithm parameters such as focus-offset, threshold, and phase filter strength and placement (coherence probe) also make a difference. Since computer simulation is now able to predict (with good accuracy) the linewidths measured by these technologies and parameters, it is natural to begin a systematic study of the theoretical predictions. For example, the effects of wavelength on linearity for four types of microscopes are shown. All show improved linearity in the shorter wavelength region, extending the linear range of the instruments. The simulations suggest that variation of wavelength is a key to optimizing linearity around a given feature size. Linearity optimization analysis is performed for several microscope types and measurement algorithms for a simple layer geometry. The optimization program varies threshold and focus offset to achieve the best linearity. The combination of simulation with linearity optimization provides a testbench for metrology system design and evaluation. A new complex-difference algorithm is presented which clearly shows better linearity in at least one simulated case (when resist sidewall angles are changing) than the algorithms commonly used in optical metrology.
Decreasing dimensions in the processing and manufacture of integrated circuits (ICs) have stimulated interest in ultra-high resolution measuring technologies. The Atomic Force Microscope (AFM), which is now available commercially, offers three-dimensional surface measurement capability from angstroms to over 100 microns, the ability to image insulators directly without coating, and minimal sample preparation. These features indicate strong potential for applications in IC-related inspection, process engineering, failure analysis, and reliability, particularly as ICs move toward submicron geometries.
Linearity of response is one of the most important features of a measurement system. Linearity implies that accurate linewidths can be obtained from measured values knowing only the slope and offset of the data with respect to reference data taken with another, presumably more accurate, instrument. A first-order linear regression of the data yields the slope, offset, and estimate of the goodness of fit. Ideally, the slope is near unity, so that the magnification scales of the two instruments agree. The offset is considerably less important since, in IC process control, absolute changes in linewidth are often of more concern than the linewidths themselves. This paper demonstrates by simulation and experiment that the linearity (R-squared) of an optical microscope depends not only upon the characteristics of the tool but also upon the characteristics of the object being measured. In virtually all optical microscopes, transparent structures support waveguide resonant eigenmodes which are strongly affected by geometry and contribute substantially to nonlinearities in response. For isolated lines, nonlinearities are found to occur especially at certain widths where the eigenfunctions change rapidly with small change in width. The theory of these singular points is presented. The authors demonstrate that the coherence microscope, which uses both phase and amplitude information, has a potential advantage over brightfield and confocal microscopes in dealing with these problems. The introduction of a 'complex phase filter' in the measurement algorithm greatly reduces unwanted phase noise and its concomitant contribution to nonlinearity. The ability to simulate the optical images and resulting measurement nonlinearities offers promise in improving understanding and accuracy of optical metrology.
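The first-order regression that yields slope, offset, and goodness of fit can be sketched directly. The CD values below are hypothetical illustrations, not data from the paper:

```python
import numpy as np

def linearity(measured, reference):
    """Regress tool readings against reference linewidths.
    Returns slope, offset, and R^2 (goodness of fit)."""
    slope, offset = np.polyfit(reference, measured, 1)
    fit = slope * np.asarray(reference) + offset
    resid = np.asarray(measured, dtype=float) - fit
    ss_res = float(np.sum(resid**2))
    ss_tot = float(np.sum((np.asarray(measured) - np.mean(measured))**2))
    return slope, offset, 1.0 - ss_res / ss_tot

ref = [100, 150, 200, 250, 300]     # reference-instrument CDs, nm
meas = [104, 153, 205, 254, 306]    # optical-tool readings, nm
slope, offset, r2 = linearity(meas, ref)
```

Here the slope is near unity (the magnification scales of the two instruments agree) and R-squared is close to 1, the situation in which a constant offset is of little consequence for process control.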
KEYWORDS: Electrons, Line scan image sensors, Metrology, Scanning electron microscopy, Photoresist materials, Silicon, Distortion, Integrated circuits, Inspection, Process control
A common cause of nonlinearity in lithographic metrology with SEMs is charge accumulation on photoresist structures surrounding the features to be measured. This phenomenon has been observed to produce strikingly different results on three low-voltage (1 kV) SEMs evaluated under different operating conditions. Features examined were isolated lines, lines in gratings, isolated spaces, and contact holes which ranged from 0.5-1.3 micrometers in 0.1 micrometers increments. Critical dimension measurements at the base of photoresist structures were obtained from image linescans using algorithms indigenous to the systems. The linescans exhibited various degrees of intensity blooming, scan asymmetry, and image inversion as a function of operating conditions.
Under low-dose conditions, low-voltage SEMs can measure the bottom width of a resist profile accurately over a wide range of resist sidewall angles, thus allowing accurate, precise, and fast measurement of focus-exposure matrices. It is also shown that the measured data fit a fixed-parameter model well. By means of statistically leveraged experimental design techniques, a robust focus-exposure response surface may be generated with as few as nine measurements. This approach greatly reduces measurement time, resulting in high-speed stepper setup. Results are shown for different experimental designs and different low- and high-voltage SEMs.
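A nine-measurement response-surface fit can be illustrated with a least-squares sketch. The 3x3 grid, response coefficients, and the Bossung-style model form below are hypothetical stand-ins; the paper's actual fixed-parameter model is not specified here.

```python
import numpy as np

# Hypothetical 3x3 focus-exposure matrix: nine measurements in total.
focus = np.tile([-0.2, 0.0, 0.2], 3)        # defocus, um
dose = np.repeat([24.0, 26.0, 28.0], 3)     # exposure dose, mJ/cm^2

# Synthetic "measured" bottom CDs from an assumed ground-truth response:
# linear in dose, quadratic in focus.
cd = 180.0 - 3.0 * (dose - 26.0) + 50.0 * focus**2

# Fit the fixed-parameter model CD = a0 + a1*E + a2*F^2 by least squares.
X = np.column_stack([np.ones_like(focus), dose, focus**2])
coef, *_ = np.linalg.lstsq(X, cd, rcond=None)
```

With only three unknown coefficients, nine well-placed measurements give a heavily over-determined and therefore robust fit, which is why so few points suffice for stepper setup.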
We show that the focal-plane contrast of the confocal scanning laser microscope can be used effectively to measure critical dimensions at the base of submicrometer photoresist structures. The results, when compared with high-voltage SEM measurements, are found to be highly feature dependent; separate threshold optimizations are required for each case. A new criterion, incremental response, was introduced to aid in measurement system evaluation.