The laser damage performance of optical components is often limited by the presence of sparse defects rather than by intrinsic material properties. In this regime, it is costly to perform destructive laser damage testing over areas large enough to support high-confidence statements about damage likelihood in untested regions. Instead, one may non-destructively record the sizes and locations of all defects over a much larger area. It is also straightforward to perform selective laser damage testing centered on defects (and on defect-free sites) in a subregion. This latter measurement yields a table that quantifies damage probability as a function of fluence and defect size. Combining the complete defect map with the damage probability table allows laser damage to be predicted at every location over the whole area of interest. In this paper, large-area defect mapping of real-world coated optics is combined with previously established damage probability tables. The defect-driven contribution is shown to enhance the predictive power of the simulations, as judged by standard damage testing.
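As a rough illustration of how a full-surface defect map might be combined with a measured damage probability table, the Python sketch below looks up a per-defect damage probability from (defect size, local fluence) bins and aggregates it into a whole-aperture survival estimate. The bin edges, probability values, and function names are placeholder assumptions for illustration only, not the data or code used in this work, and defects are treated as statistically independent failure sites.

```python
import numpy as np

# Placeholder bins and probabilities (assumed values, not measured data):
size_bins = np.array([0.0, 1.0, 3.0, 10.0])       # defect diameter bin edges, um
fluence_bins = np.array([0.0, 10.0, 20.0, 40.0])  # fluence bin edges, J/cm^2
prob_table = np.array([                           # P(damage) per (size bin, fluence bin)
    [0.00, 0.02, 0.10],
    [0.01, 0.15, 0.60],
    [0.05, 0.50, 0.95],
])

def damage_probability(diameter_um, fluence_j_cm2):
    """Look up the per-defect damage probability for one defect."""
    i = min(np.digitize(diameter_um, size_bins) - 1, prob_table.shape[0] - 1)
    j = min(np.digitize(fluence_j_cm2, fluence_bins) - 1, prob_table.shape[1] - 1)
    return prob_table[max(i, 0), max(j, 0)]

def survival_probability(defect_map, fluence_at):
    """Probability that the whole mapped area survives one exposure,
    treating each mapped defect as an independent failure site."""
    p = 1.0
    for x_mm, y_mm, diameter_um in defect_map:
        p *= 1.0 - damage_probability(diameter_um, fluence_at(x_mm, y_mm))
    return p

# Example: three mapped defects under an assumed uniform 25 J/cm^2 exposure
defects = [(1.2, 0.4, 0.8), (3.5, 2.1, 2.5), (5.0, 5.0, 6.0)]
print(survival_probability(defects, lambda x, y: 25.0))
```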
Standard techniques for characterizing laser damage are ill-suited to the regime in which sparse defects form the dominant damage mechanism. Previous work on this problem using REO's automated laser damage threshold test system has included linking damage events in HfO2/SiO2 high-reflector coatings with visible pre-existing defects, and using a per-defect damage probability based on size and local fluence to predict damage events in subsequent coating runs. However, in all of this work the test sites were laid out in a predefined array, and defects were associated with damage events only after the fact. To make this process both more efficient and less susceptible to uncertainty, we have now developed an adaptive test strategy that puts defect identification and analysis into the loop. A map of defect locations and sizes on a test surface is compiled, and a set of test sites and corresponding fluences is then generated from that map. With the defects of interest now centered on the damaging beam, the problem of higher-order spatial variation in the beam profile is greatly reduced. Test sites in zones with no detectable defects are also included. This technique allows the test regimen to be tailored to the specific surface under consideration. We report on the characterization of a variety of coating materials and designs with this adaptive method.
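The adaptive strategy can be pictured as turning a defect map directly into a test plan. The sketch below is a minimal, hypothetical version: it centers one test site on each mapped defect, cycles the defect sites through a list of candidate fluences, and adds randomly placed sites in zones with no detectable defects. The data format, fluence levels, and separation rule are assumptions for illustration, not REO's actual test-planning software.

```python
import random

def build_test_plan(defect_map, surface_size_mm, fluence_steps,
                    n_clear_sites=10, min_separation_mm=0.5, seed=0):
    """Generate test sites and fluences from a defect map (illustrative sketch)."""
    rng = random.Random(seed)
    plan = []

    # One test site centered on each mapped defect; cycle through the fluence
    # levels so each defect-size population is sampled at several fluences.
    for k, (x, y, diameter_um) in enumerate(defect_map):
        plan.append({"x_mm": x, "y_mm": y,
                     "fluence_J_cm2": fluence_steps[k % len(fluence_steps)],
                     "defect_um": diameter_um})

    # Defect-free sites: accept random locations far enough from every mapped defect.
    width, height = surface_size_mm
    clear_sites = 0
    while clear_sites < n_clear_sites:
        x, y = rng.uniform(0.0, width), rng.uniform(0.0, height)
        if all((x - dx) ** 2 + (y - dy) ** 2 > min_separation_mm ** 2
               for dx, dy, _ in defect_map):
            plan.append({"x_mm": x, "y_mm": y,
                         "fluence_J_cm2": rng.choice(fluence_steps),
                         "defect_um": None})
            clear_sites += 1
    return plan

# Example: three mapped defects on a 25 mm x 25 mm aperture, three fluence levels
defects = [(1.2, 0.4, 0.8), (3.5, 2.1, 2.5), (5.0, 5.0, 6.0)]
plan = build_test_plan(defects, (25.0, 25.0), fluence_steps=[10.0, 20.0, 40.0])
print(len(plan), "test sites")
```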
Laser damage testing is widely utilized by laser and laser system builders to ensure the reliability of their products.
When damage is due primarily to sparse defects, the relatively limited data sets acquired under typical testing protocols
tend to imply that laser damage probabilities go to zero below some reported damage threshold. However, this is rarely
an accurate picture of the actual damage characteristics of the sample set. This study attempts to establish a correlation
between observed coating defects and laser damage (from a 1064 nm laser in the nanosecond regime), utilizing a large
sample size from a single coating run, together with the actual fluence levels present at the defect sites. This correlation
is then used to predict damage for optics coated under different circumstances. Results indicate that it may be possible
to develop an alternative methodology, based on observed defects, for determining damage characteristics that is both
more reliable and less time-consuming than traditional laser damage testing.
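One way to picture the correlation step is as a binned tally: each tested defect contributes one observation at its measured size and the actual local fluence it experienced, and the damage probability in each (size, fluence) bin is simply the damaged fraction. The records, bin edges, and values below are invented for illustration; a real table would be built from the full test data set.

```python
import numpy as np

# Each record: (defect diameter in um, local fluence in J/cm^2, damaged?)
# Values here are made up for illustration only.
records = [
    (0.8, 12.0, False), (0.9, 25.0, False), (2.4, 12.0, False),
    (2.6, 25.0, True),  (5.5, 12.0, True),  (6.0, 25.0, True),
]

size_edges = np.array([0.0, 1.0, 3.0, 10.0])   # defect diameter bin edges, um
fluence_edges = np.array([0.0, 15.0, 30.0])    # local fluence bin edges, J/cm^2

counts = np.zeros((len(size_edges) - 1, len(fluence_edges) - 1))
damaged = np.zeros_like(counts)

for diameter_um, fluence, hit in records:
    i = np.digitize(diameter_um, size_edges) - 1
    j = np.digitize(fluence, fluence_edges) - 1
    counts[i, j] += 1
    damaged[i, j] += float(hit)

# Per-bin damage probability; bins with no observations stay NaN (undefined).
with np.errstate(invalid="ignore", divide="ignore"):
    prob_table = np.where(counts > 0, damaged / counts, np.nan)

print(prob_table)   # rows: size bins, columns: fluence bins
```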
Laser-induced damage of optical components is a concern in many applications in the commercial, scientific and military
market sectors. Numerous component manufacturers supply “high laser damage threshold” (HLDT) optics to meet the
needs of this market, and consumers pay a premium price for these products. While there’s no question that HLDT
optics are manufactured to more rigorous standards (and are therefore inherently more expensive) than conventional
products, it is not clear how this added expense translates directly into better performance. This is because the standard
methods for evaluating laser damage, and the underlying assumptions about the validity of traditional laser damage
testing, are flawed. In particular, the surface and coating defects that generally lead to laser damage (in many laser-parameter
regimes of interest) are sparsely distributed over the component surface, with large spaces between them. As
a result, laser damage testing typically doesn't include enough of these defects to achieve the sample sizes necessary
for statistically meaningful results, and the correlation between defect characteristics and damage
events is correspondingly poor. This paper establishes specifically why this is the case, and indicates what might be done to
remedy the problem.
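The sampling problem can be made concrete with a rough calculation: the number of defects a conventional test actually interrogates is roughly the defect density times the beam spot area times the number of test sites. The figures below are assumed round numbers, not measurements, but they show why a standard test campaign can easily probe only a handful of defects.

```python
import math

# Assumed, illustrative numbers (not measured values):
defect_density_per_cm2 = 10.0   # sparse-defect surface density
beam_diameter_mm = 0.5          # test beam spot diameter on the sample
n_test_sites = 100              # sites in a typical raster-style damage test

spot_area_cm2 = math.pi * (0.1 * beam_diameter_mm / 2.0) ** 2
expected_defects = defect_density_per_cm2 * spot_area_cm2 * n_test_sites

print(f"spot area per site: {spot_area_cm2:.4f} cm^2")
print(f"expected defects interrogated: {expected_defects:.1f}")
# With these numbers only about two defects are probed in the entire test,
# far too few to resolve damage probability versus defect size and fluence.
```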