Because of the small feature dimensions required for operation at visible wavelengths, photonic crystals have so far been patterned only with costly electron-beam lithography or with low-throughput, R&D- and pilot-production-grade imprint lithography. This paper will focus on results from a high-throughput imprint tool capable of processing over 20 wafers per hour on 50-100 mm sapphire, GaAs, SiC, Ge, and metal substrates. An overview of the process used, as well as the results of printing photonic crystal patterns on sapphire wafers and etching them into a SiO2 hard mask, will be presented. Finally, an analysis of the cost of ownership, which currently stands at ~$20/wafer (<$0.01/mm2 for 50 mm wafers), will be presented and opportunities for improvement discussed.
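As a quick check on the quoted figures, the per-area cost follows from simple arithmetic on the wafer area; the Python sketch below treats the wafer as a full circle with no edge exclusion (our simplifying assumptions) and reproduces the order of magnitude.

```python
import math

def cost_per_mm2(cost_per_wafer_usd: float, wafer_diameter_mm: float) -> float:
    """Cost per mm^2, treating the wafer as a full circle (no edge exclusion)."""
    area_mm2 = math.pi * (wafer_diameter_mm / 2.0) ** 2
    return cost_per_wafer_usd / area_mm2

# ~$20/wafer on a 50 mm wafer (~1963 mm^2) works out to roughly $0.01/mm^2,
# the order of magnitude quoted in the abstract.
print(f"{cost_per_mm2(20.0, 50.0):.4f} USD/mm^2")
```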
KEYWORDS: Single crystal X-ray diffraction, Critical dimension metrology, Semiconducting wafers, Scanning electron microscopy, Process control, Spectroscopy, Spectroscopes, Metrology, Precision measurement, Control systems
Smaller device dimensions and tighter process control windows have created a need for CD metrology tools having higher levels of precision and accuracy. Furthermore, the need to detect and measure changes in feature profiles is becoming critical to in-line process control and stepper evaluation for sub-0.18 micrometer technology. Spectroscopic CD (SCD™) is an optical metrology technique that can address these needs. This work describes the use of a spectroscopic CD metrology tool to measure and characterize the focus and exposure windows for the process. The results include comparison to the established in-line CD-SEM, as well as a cross-section SEM. Repeatability and long-term stability data from a gate-level nominal process are also presented.
The continuing demand for higher frequency microprocessors and larger memory arrays has led to decreasing device dimensions and smaller process control windows. Decreasing process control windows have created a need for higher precision metrology to maintain an acceptable precision-to-tolerance ratio with a reasonable sampling rate. In order to determine and reduce across-chip, across-wafer, and across-lot linewidth variations, higher sampling is required which, in turn, demands faster move-acquire-measure (MAM) times to maintain throughput. Finally, the need to detect and quantify sidewall angle changes in addition to CD measurements is becoming critical. Spectroscopic scatterometry is a metrology technique which offers the potential to meet these requirements. This work explores some of the fundamental technology concerns for implementing scatterometry in a manufacturing environment. These concerns include mark requirements and characterization necessary for library generation. Comparison of scatterometry data to in-line CD SEM, x-section SEM, and AFM results will be presented.
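For readers unfamiliar with the library-based workflow referenced above, a common (though not necessarily the authors') approach is to match each measured spectrum against a precomputed library of simulated spectra by a least-squares criterion; the sketch below is a minimal illustration with hypothetical names and placeholder data.

```python
import numpy as np

def best_library_match(measured: np.ndarray,
                       library_spectra: np.ndarray,
                       library_params: list) -> dict:
    """Return the library entry whose simulated spectrum is closest (least squares)
    to the measured spectrum. `library_spectra` has shape (n_entries, n_wavelengths)."""
    residuals = library_spectra - measured      # broadcast over library entries
    mse = np.mean(residuals ** 2, axis=1)       # goodness of fit per entry
    return library_params[int(np.argmin(mse))]

# Illustrative use: each library entry pairs a simulated spectrum with the
# CD / sidewall-angle values that generated it.
library_params = [{"cd_nm": 175.0, "swa_deg": 87.0},
                  {"cd_nm": 180.0, "swa_deg": 88.0},
                  {"cd_nm": 185.0, "swa_deg": 89.0}]
library_spectra = np.random.rand(3, 256)        # placeholder simulated spectra
measured = library_spectra[1] + 0.01 * np.random.randn(256)
print(best_library_match(measured, library_spectra, library_params))
```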
It is well known that systematic within-chip critical dimension (CD) errors can strongly influence product yield and performance, especially in the case of microprocessors. It has been shown that this across-chip linewidth variation (ACLV) dominates the CD error budget and comprises multiple systematic and random effects, including substrate reflectivity, reticle CD errors, feature proximity, and lens aberrations. These effects have material, equipment, and process dependencies, with the result that significant ACLV differences between nominally identical tools/processes can in some cases be observed. We present here a new analysis approach which allows for optimization of exposure/defocus conditions to minimize overall CD errors for a given process. Emphasis is on control of [mean + 3σ] of CD errors for given exposure/defocus conditions. Input metrology data is obtained from electrical resistance probing, and data is presented for multiple 248 nm DUV processes and tools with CD ground rules ranging from 180 nm to 140 nm.
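A minimal sketch of the selection step implied above, choosing the exposure/defocus condition that minimizes |mean| + 3σ of the measured CD errors, is shown below; the data layout and numbers are illustrative assumptions, not results from the paper.

```python
import numpy as np

def select_condition(cd_errors_by_condition: dict) -> tuple:
    """Pick the (exposure, defocus) condition minimizing |mean| + 3*sigma of CD error.
    Keys are (exposure, defocus) pairs; values are arrays of CD errors (nm) from
    electrical linewidth measurements taken at that condition."""
    def merit(errors: np.ndarray) -> float:
        return abs(errors.mean()) + 3.0 * errors.std(ddof=1)
    return min(cd_errors_by_condition, key=lambda c: merit(cd_errors_by_condition[c]))

# Illustrative data: CD errors (measured - target, nm) at three conditions.
rng = np.random.default_rng(0)
data = {(26.0, 0.0): rng.normal(2.0, 4.0, 200),
        (26.0, 0.2): rng.normal(5.0, 6.0, 200),
        (27.0, 0.0): rng.normal(-1.0, 5.0, 200)}
print(select_condition(data))   # prints the condition with the lowest |mean| + 3*sigma
```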
The traditional approach for CD and overlay control in lithography has been based upon statistical control of the critical inputs to the lithographic process. This SPC approach has the disadvantage that the process equipment must be taken out of manufacturing whenever a parameter goes out of control, so that the root cause may be diagnosed and addressed. In the case of leading-edge lithography, it is often not trivial to determine the cause of such disturbances, and productivity can be greatly increased if output data is used to dynamically tune the system inputs. We have successfully implemented a fully automated, closed-loop CD and overlay control system in manufacturing for both I-line and DUV lithography. This system features automatic metrology data upload, host control of stepper/track clusters, and utilizes tool-based lot data for manipulation of future lot inputs. CD control to within 1 nm of target and less than 20 nm 3σ lot-to-lot variability has been demonstrated. Mean overlay errors of less than 50 nm have been realized as well. Process Cpk values were improved in some cases by more than 50% with implementation of the controller.
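The abstract does not specify the control law, so the sketch below uses a generic EWMA run-to-run update (an assumption on our part, not the paper's implementation) to show how measured CD from completed lots might be fed back into the exposure dose for future lots; the gain and parameter values are purely illustrative.

```python
class EwmaDoseController:
    """EWMA run-to-run controller: adjusts exposure dose from measured CD error.
    The linear gain (nm of CD per mJ/cm^2 of dose) and lambda are assumed values."""

    def __init__(self, target_cd_nm: float, cd_per_dose: float,
                 lam: float = 0.3, initial_dose: float = 26.0):
        self.target = target_cd_nm
        self.gain = cd_per_dose      # dCD/dDose, characterized offline
        self.lam = lam               # EWMA weight on the newest lot
        self.dose = initial_dose     # dose to send with the next lot (mJ/cm^2)
        self.offset = 0.0            # EWMA estimate of the process offset (nm)

    def update(self, measured_cd_nm: float) -> float:
        """Feed back one lot's measured CD; return the dose for the next lot."""
        error = measured_cd_nm - self.target
        self.offset = self.lam * error + (1.0 - self.lam) * self.offset
        self.dose -= self.offset / self.gain
        return self.dose

# Example: CD 3 nm above target with a gain of -2 nm per mJ/cm^2
ctrl = EwmaDoseController(target_cd_nm=180.0, cd_per_dose=-2.0)
print(ctrl.update(183.0))   # dose increases to pull CD back toward target
```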
DUV scanning exposure systems have been steadily gaining market acceptance for the past five years, and soon, all major suppliers will offer 248-nm scanning tools. One of the major reasons for the emergence of this technology has been the purported improvement in critical dimension (CD) uniformity across the scanned field versus what can be realized in a full field stepper. Using high precision electrical resistance CD metrology, we have characterized the across-field CD control capability of several DUV scanning tools and DUV steppers. Analysis is carried out through focus for multiple linetypes representing various orientations and nearest-neighbor proximities. Where possible, different NA/σ combinations are examined as well. Surprisingly good full-field sub-0.20 micrometer CD control is obtained even for 0.50 NA, and higher NA allows for nonzero process latitude at 0.14 micrometer geometries. While it was initially anticipated that 193 nm ArF lithography would be required for 0.18 micrometer technology manufacturing, it has become apparent that 248 nm lithography will be employed for these groundrules, particularly for logic applications with predominantly semi-isolated features.
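For context on the electrical resistance CD metrology used above, a linewidth is conventionally extracted from a bridge resistor's resistance and the sheet resistance of an adjacent van der Pauw structure; the sketch below applies that standard relation with illustrative numbers.

```python
def electrical_linewidth_nm(sheet_resistance_ohm_sq: float,
                            bridge_length_um: float,
                            bridge_resistance_ohm: float) -> float:
    """Electrical linewidth W = Rs * L / R for a bridge resistor, with Rs taken
    from an adjacent van der Pauw structure. Returns W in nanometers."""
    width_um = sheet_resistance_ohm_sq * bridge_length_um / bridge_resistance_ohm
    return width_um * 1000.0

# Illustrative: Rs = 50 ohm/sq, 100 um long bridge measuring 27.8 kohm
print(f"{electrical_linewidth_nm(50.0, 100.0, 27_800.0):.1f} nm")  # ~180 nm
```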
This paper describes a technique for quickly achieving photolithography and etch critical dimension (CD) targets using measurement data from an automated CD SEM and a statistical analysis package. The experimental dataset is created from the measured CD response as a function of process input parameters that have been varied in a controlled fashion across a wafer or on multiple wafers. The resulting model is displayed as an array of "prediction profiles" that allow interactive variation of the model components to simulate the response of CD to input changes. The outputs of the analysis are used to determine optimal processing conditions and to provide an estimate of process latitude.
Keywords: Automated CD SEM Metrology, Lithography Process Characterization
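A minimal illustration of the modeling step described in the preceding abstract, fitting a simple response surface to CD measurements taken across varied focus and exposure and predicting CD at nominal conditions, is sketched below; the model form, names, and data are assumptions for illustration only.

```python
import numpy as np

def fit_cd_model(focus, dose, cd):
    """Fit CD = b0 + b1*F + b2*E + b3*F^2 + b4*F*E by least squares
    (a simple quadratic-in-focus response surface, chosen for illustration)."""
    F, E, cd = map(np.asarray, (focus, dose, cd))
    X = np.column_stack([np.ones_like(F), F, E, F**2, F * E])
    coeffs, *_ = np.linalg.lstsq(X, cd, rcond=None)
    return coeffs

def predict_cd(coeffs, focus, dose):
    """Evaluate the fitted model at one (focus, dose) point."""
    return coeffs @ np.array([1.0, focus, dose, focus**2, focus * dose])

# Example with made-up focus-exposure-matrix measurements (CD in nm).
focus = [-0.2, -0.2, 0.0, 0.0, 0.2, 0.2]
dose  = [25.0, 27.0, 25.0, 27.0, 25.0, 27.0]
cd    = [186., 178., 190., 182., 187., 179.]
coeffs = fit_cd_model(focus, dose, cd)
print(predict_cd(coeffs, 0.0, 26.0))   # predicted CD at nominal conditions
```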
This paper presents the methodology used to perform an evaluation of automated CD metrology SEMs for sub-half micron process control. The paper describes the evaluation strategy, the procedure used to collect and analyze the evaluation data and concludes with recommendations on how the procedure can be improved. The evaluation process was designed to estimate metrology capability and review specific application requirements envisioned for a leading edge semiconductor development and manufacturing facility at Motorola. The evaluation process consisted of a quantitative evaluation of measurement performance specifically examining the reproducibility, linearity, automation success rate, and throughput of the instrument. In addition, capabilities such as user interface, computer integration, job transportability, and technology roadmap were assessed qualitatively. Although a particular evaluation of automated CD SEMs is considered here, the principles used to develop the evaluation procedure can be applied to metrology tools in general. A discussion of the application and desired functionality of CD metrology instrumentation including performance criteria is presented for completeness.
This paper describes a new method for monitoring the performance of metrology systems. The objective of the technique is to both identify deviant performance and estimate the likely cause. The method is driven by the assumption that all variation in a measurement system is systematic until proven random. This assumption in turn guides the choice of sampling plan and analysis to extract the maximum amount of information possible from a given number of measurements. Diagnostic capability is achieved by selecting a sampling plan which yields data for estimates of all known error modes of the instrument. Consequently, somewhat larger sampling plans are required and the technique is generally more appropriate for automated measurement systems. The concept is illustrated by example using automated scanning electron microscopes used for critical dimension measurement in the semiconductor manufacturing process. Experimental results illustrating the response of the monitor to programmed deviation are presented.
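One way to realize the sampling-plan idea described above is to nest repeated measurements within reloads so that within-load and between-load error modes can be estimated separately; the sketch below shows such a decomposition under assumed names and made-up data, not the paper's actual monitor.

```python
import numpy as np

def decompose_error_modes(measurements: np.ndarray) -> dict:
    """Split variation into within-load (static repeatability) and between-load
    (reload/reproducibility) components from a nested sampling plan.
    `measurements` has shape (n_loads, n_repeats_per_load)."""
    within_var = measurements.var(axis=1, ddof=1).mean()
    load_means = measurements.mean(axis=1)
    between_var = max(load_means.var(ddof=1) - within_var / measurements.shape[1], 0.0)
    return {"static_3sigma_nm": 3.0 * np.sqrt(within_var),
            "reload_3sigma_nm": 3.0 * np.sqrt(between_var)}

# Example: 5 loads x 10 repeated measurements of the same feature (nm).
rng = np.random.default_rng(1)
data = rng.normal(180.0, 0.8, size=(5, 10)) + rng.normal(0.0, 1.5, size=(5, 1))
print(decompose_error_modes(data))
```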
KEYWORDS: Edge detection, Detection and tracking algorithms, Scanning electron microscopy, Signal processing, Algorithm development, Electron microscopes, Signal detection, Photomicroscopy, Algorithms, Nonlinear optics
This paper discusses nonlinear behavior in SEM CD measurements stemming from the interaction of the edge detection algorithm and systematic changes in the appearance of the secondary electron signal. A first-order theory is developed which describes the effect of changes in apparent sidewall slope and baseline level on common edge detection algorithms. Specifically, the theory predicts nonlinear behavior for linear approximation and threshold edge detection algorithms. Experimental verification of the effect is presented. Systematic increases in linewidth of 10% are frequently encountered in practice when making measurements in the sub-0.75 micrometer regime.
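To make the described interaction concrete, the sketch below applies a simple fixed-threshold edge detector to a synthetic one-dimensional secondary-electron profile and shows that raising the baseline level changes the measured width; the profile model and magnitudes are illustrative only, and the shift they produce is smaller than the 10% effect reported above.

```python
import numpy as np

def threshold_width(signal: np.ndarray, pixel_nm: float, threshold: float) -> float:
    """Width between the first and last pixel exceeding a fixed threshold,
    the simplest form of threshold edge detection."""
    above = np.flatnonzero(signal >= threshold)
    return (above[-1] - above[0]) * pixel_nm if above.size else 0.0

# Synthetic SE line profile: two edge peaks on a flat baseline (arbitrary units).
x = np.arange(400)
edges = np.exp(-0.5 * ((x - 120) / 12) ** 2) + np.exp(-0.5 * ((x - 280) / 12) ** 2)

pixel_nm = 2.0
for baseline in (0.0, 0.2):                       # raising the baseline mimics a
    profile = baseline + (1.0 - baseline) * edges  # change in apparent substrate signal
    print(baseline, threshold_width(profile, pixel_nm, threshold=0.5))
# The reported width grows with the baseline even though the feature is unchanged.
```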
The considerations which drive an expert system for assisting in measurement system characterization are described. The expert system employs several novel techniques for evaluating the integrity of a characterization analysis by determining the degree to which critical assumptions are satisfied and flagging weak points in the data collection or analysis procedure. The properties of good characterization sampling plans are derived. Methods for formulating reliable characterization studies are described. The paper focuses on short-term studies intended for equipment comparisons and calibrations; however, with minor alterations the approach can be expanded to include longer-term stability studies.
KEYWORDS: Calibration, Error analysis, Statistical analysis, Metrology, Integrated circuits, Inspection, Process control, Data modeling, Information operations, Head
A new figure of merit, the critical dimension capability factor, or CDC, is described. The CDC incorporates measurements taken over a range of linewidths and over a range of process variations which simulate normal and extreme process operating conditions. Under these conditions the CDC uniquely quantifies the capability of the measurement instrument on a given substrate and for a given set of parameter settings. The CDC is calculated by performing a linear regression between measurements generated by the instrument under test (IUT) and a set of reference values (internally generated standard values). The mean square error (MSE) between the regression line and the observed values is then partitioned into components which estimate the contribution to the MSE from various sources based on a rigorous statistical analysis. The final CDC value is defined as the linewidth-to-uncertainty ratio and is a function of the uncertainty introduced in the characterization procedure as well as the uncertainty introduced when the IUT makes a measurement in practice. Since the CDC is a function of the overall uncertainty in the measurements of the IUT relative to the reference values, it can legitimately be compared from one instrument to another and used to evaluate alternative measurement methods and technologies.
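A simplified numerical sketch of the regression step described above is given below: regress IUT readings on reference values, take the residual MSE as an uncertainty estimate, and form a linewidth-to-uncertainty ratio. The full CDC partitions the MSE into separate uncertainty sources; the version here, and all names and numbers, are illustrative only.

```python
import numpy as np

def cdc_like_ratio(reference_nm: np.ndarray, measured_nm: np.ndarray) -> float:
    """Regress instrument-under-test (IUT) readings on reference values, take the
    root of the residual MSE as the uncertainty, and return a simplified
    linewidth-to-uncertainty ratio (not the paper's full partitioned CDC)."""
    slope, intercept = np.polyfit(reference_nm, measured_nm, 1)
    residuals = measured_nm - (slope * reference_nm + intercept)
    rmse = np.sqrt(np.mean(residuals ** 2))
    return float(np.mean(reference_nm) / rmse)

# Example: reference linewidths spanning the range of interest (nm) and
# IUT readings with a small gain error, offset, and noise.
ref = np.array([150.0, 200.0, 250.0, 300.0, 350.0])
iut = 1.02 * ref + 3.0 + np.array([1.1, -0.8, 0.5, -1.2, 0.4])
print(f"linewidth-to-uncertainty ratio ~ {cdc_like_ratio(ref, iut):.0f}")
```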
Methods for obtaining low noise, straight, uniform fringes with a grating interferometer are described. It is shown
that straight, uniform fringes can be obtained exactly, even when the interfering beams are not collimated, but are
diverging.