Charge-coupled devices have been the detector of choice for soft X-ray astronomy missions for many decades due to their excellent energy resolution, noise performance, and longevity in space. Newer CCD-based missions require ever-increasing performance, which is made challenging by the radiation damage inherent to the space environment. Missions such as ESA’s upcoming EUCLID observatory aim to measure tiny changes in the shapes of distant galaxies caused by the presence of dark matter. Such high precision (not specific to EUCLID alone) necessitates significant mitigation of radiation damage effects, one approach being the use of different detector operation modes such as multi-level clocking. Multi-level clocking uses three electrode voltage levels (compared to the standard two) to encourage traps within the damaged silicon to emit their charge so that they do not contribute to charge transfer losses, improving charge transfer efficiency and overall detector performance. However, multi-level clocking requires bespoke hardware to implement, followed by a substantial amount of testing to demonstrate that the benefit is significant.
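To make the emission mechanism concrete, the sketch below treats trap emission as a first-order exponential process: holding the signal charge away from a filled trap (e.g. at the third, intermediate clock level) for a dwell time comparable to or longer than the trap's emission time constant releases most of the captured charge before the next transfer. This is a minimal illustration, not the mission hardware implementation; the function name and the numerical values are invented for this example.

import math

def fraction_emitted(dwell_time_s, tau_e_s):
    # First-order emission: probability that a filled trap has released
    # its charge after being held empty of signal for dwell_time_s.
    return 1.0 - math.exp(-dwell_time_s / tau_e_s)

# Illustrative numbers only: a trap with a 100 us emission time constant,
# held at the intermediate level for 300 us, releases ~95% of its
# captured charge before the next transfer.
print(fraction_emitted(300e-6, 100e-6))   # ~0.95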
A recent CCD optimisation technique, the Active Trap Model, uses knowledge of the radiation-induced defects within a CCD to optimise charge transfer performance across a wide range of variables, including temperature, clocking speed and device operation mode. This paper presents the development of the Active Trap Model to predict the performance of multi-level clocking in CCDs. The model's predictions are compared with the available experimental data, namely from ESA’s PLATO mission, and show good agreement. The results demonstrate the versatility of the Active Trap Model and its potential application to future CCD-based space missions such as HabEx and LUVOIR.

The Coronagraph Instrument (CGI) will be required to operate with low signal flux for long integration times, demanding that all noise sources be kept to a minimum. The Electron Multiplication (EM)-CCD has been baselined for both the imaging and spectrograph cameras due to its ability to operate with sub-electron effective read noise at an appropriate multiplication gain setting. Other noise sources, however, such as thermal dark signal and Clock Induced Charge (CIC), need to be characterized and mitigated. In addition, operation within a space environment will subject the device to radiation damage that will degrade the Charge Transfer Efficiency (CTE) of the device throughout the mission lifetime. Irradiation at the nominal instrument operating temperature has the potential to provide the best estimate of the performance degradation that will be experienced in-flight, since the final population of silicon defects has been shown to depend upon the temperature at which the sensor is irradiated.
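The sub-electron effective read noise of the EM-CCD follows from the standard relation that the output-amplifier noise, referred to the input of the multiplication register, is divided by the avalanche gain. A minimal sketch, with illustrative numbers rather than CGI flight parameters:

def effective_read_noise(read_noise_e, em_gain):
    # Output-amplifier read noise (e- rms), referred back through the
    # multiplication register, is suppressed by the avalanche gain.
    return read_noise_e / em_gain

# Illustrative values only: a 50 e- rms output amplifier with x1000
# multiplication gain gives 0.05 e- rms effective read noise.
print(effective_read_noise(50.0, 1000.0))   # 0.05

Note that the gain register also multiplies dark signal and CIC events (and adds an excess noise factor of roughly sqrt(2) at high gain on the signal shot noise), which is one reason those sources must still be characterized and minimised, as noted above.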
Here we present initial findings from testing of the e2v CCD201-20 BI EMCCD sensor, baselined for the WFIRST coronagraph instrument, before and after cryogenic irradiation. The motivation for irradiation at cryogenic temperatures is discussed with reference to previous investigations of a similar nature. The results are presented in context with those from a previous room-temperature irradiation investigation performed on a CCD201-20 operated under the same conditions. A key conclusion is that, under the conditions of this study, the performance degradation measured for a given proton fluence differs measurably between the cryogenic and room-temperature cases.
Throughout the lifetime of a space-based mission the detector will be bombarded by high-energy particles and gamma rays. As time progresses, the radiation will damage the detectors, causing the Charge Transfer Efficiency (CTE) to decrease due to the creation of defects or “traps” in the silicon lattice of the detector. The defects create additional energy levels between the valence and conduction band in the silicon of the detector. Electrons or holes (for n-channel or p-channel devices respectively) that pass over the defect sites may be trapped. The trapped electrons or holes will later be emitted from the traps, subject to an emission-time constant related to the energy level of the associated defect. The capture and emission of charge from the signal leads to a characteristic trailing or “smearing” of images that must be corrected to enable the science goals of a mission to be met.
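The link between a defect's energy level and its emission time constant is usually expressed in Shockley-Read-Hall form, tau_e = exp(E_t/kT) / (sigma * v_th * N_c), where v_th is the carrier thermal velocity and N_c the effective density of states. The sketch below evaluates this for an electron trap; the effective mass, trap energy and capture cross-section are illustrative assumptions, not fitted values.

import numpy as np

K_B_EV = 8.617e-5           # Boltzmann constant [eV/K]
K_B_J  = 1.381e-23          # Boltzmann constant [J/K]
H      = 6.626e-34          # Planck constant [J s]
M_EFF  = 0.27 * 9.109e-31   # assumed electron effective mass in Si [kg]

def emission_time_constant(E_t_eV, sigma_cm2, T_K):
    # Shockley-Read-Hall emission time constant for an electron trap
    # E_t_eV below the conduction band, capture cross-section sigma_cm2.
    v_th = np.sqrt(3.0 * K_B_J * T_K / M_EFF) * 1e2                     # [cm/s]
    n_c = 2.0 * (2.0 * np.pi * M_EFF * K_B_J * T_K / H**2)**1.5 / 1e6   # [cm^-3]
    return np.exp(E_t_eV / (K_B_EV * T_K)) / (sigma_cm2 * v_th * n_c)

# Illustrative trap: a divacancy-like level ~0.4 eV below the conduction
# band with sigma ~ 5e-15 cm^2; at 180 K the emission time constant is
# of order a second, dropping steeply as the device warms.
print(emission_time_constant(0.40, 5e-15, 180.0))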
Over the past few years, great strides have been made in the development of the pocket-pumping (or, strictly speaking, “trap pumping”) technique. This technique not only allows individual defects (or traps) within a device to be located at the sub-pixel level, but also enables the investigation of trap parameters such as the emission time constant to new levels of accuracy. Recent publications have shown the power of this technique in characterising a variety of defects in both n- and p-channel devices, and its potential for use in correction techniques. However, we are now exploring not only the trap locations and properties but also the life cycle of these traps over time after irradiation. In orbit, most devices operate cold to suppress dark current, and they are therefore cold whilst undergoing damage from the radiation environment. The mobility of defects varies as a function of temperature, such that the mix of defects present following a cryogenic irradiation may differ significantly from that found following a room-temperature irradiation or after annealing. It is therefore essential to study trap formation and migration under orbit-like conditions and over longer timescales.
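One commonly quoted single-trap model relates the dipole intensity produced per pumping cycle to the clock phase time: the trap must hold its captured electron for one phase time but emit it before two have elapsed, so sweeping the phase time and fitting the resulting curve to the measured dipole amplitude recovers the emission time constant. The sketch below implements that standard form under the assumption that the capture probability saturates; it is an illustration, not the exact analysis pipeline, and the function names and the 1 ms example trap are invented.

import numpy as np

def pump_probability(t_ph_s, tau_e_s, p_capture=1.0):
    # Probability per pumping cycle that a trap moves one electron into
    # the neighbouring pixel: held for one phase time, emitted before two.
    return p_capture * (np.exp(-t_ph_s / tau_e_s)
                        - np.exp(-2.0 * t_ph_s / tau_e_s))

# Sweep the phase time; the response peaks at t_ph = tau_e * ln(2),
# which is how a fit to measured dipole intensities recovers tau_e.
t_ph = np.logspace(-6, -1, 500)                # 1 us to 100 ms
dipole = pump_probability(t_ph, tau_e_s=1e-3)  # assumed 1 ms trap
print(t_ph[np.argmax(dipole)])                 # ~6.9e-4 s = 1e-3 * ln(2)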
In this paper we present a selection of the latest methods and results in the trap pumping of n- and p-channel devices and demonstrate how this technique now allows us to map radiation-induced defects in CCDs through both space and time.