Keywords: tdi
Summary
This demo shows how to configure a 2D array as a 1D pushbroom array with multiple stages of time-delayed integration (TDI).
Related Materials
The following demos, manuals and tutorials can provide additional information about the topics at the focus of this demo:
Related Demos
- None.
Related Manuals
- The BasicPlatform plugin manual.
Related Tutorials
- None.
Details
Time-delayed integration (TDI) is a strategy to increase the signal-to-noise ratio (SNR) by effectively increasing the integration time for each pixel. Rather than using a single pixel with a long integration time, TDI uses a set of pixels that image the same location over a period of time. This is usually employed in a pushbroom collection system, where a 1D array is scanned across a scene using platform motion to construct the 2nd dimension of the image. TDI is typically accomplished by operating a 2D array in pushbroom mode and using the 2nd dimension of that array as TDI "stages" that re-image the same location as the array is scanned by the platform in the along-track dimension. The figure below illustrates this concept [1].
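To make the stage-summing idea concrete, below is a minimal numerical sketch (not DIRSIG's actual implementation) of how accumulating N stages grows the signal N-fold while uncorrelated noise grows only as sqrt(N). The signal level is an illustrative value, and the assumption that read noise is added at every stage readout is a modeling choice, not something stated in this demo.

```python
import numpy as np

# Minimal sketch of TDI stage accumulation (illustrative only, not DIRSIG's
# implementation). Each "stage" re-images the same scene line with shot
# noise and read noise; summing stages grows the signal linearly while
# uncorrelated noise grows only as sqrt(N), improving SNR by ~sqrt(N).
rng = np.random.default_rng(7)

signal_e = 100.0     # mean photoelectrons per stage (illustrative value)
read_noise_e = 20.0  # RMS read noise per readout, in electrons
pixels = 250         # cross-track pixels in the line array
lines = 10_000       # scene lines to average statistics over

def scan(n_stages):
    """Sum n_stages noisy exposures of the same scene lines."""
    total = np.zeros((lines, pixels))
    for _ in range(n_stages):
        total += rng.poisson(signal_e, size=(lines, pixels))          # shot noise
        total += rng.normal(0.0, read_noise_e, size=(lines, pixels))  # read noise
    return total

for n in (1, 8):
    img = scan(n)
    print(f"{n} stage(s): SNR = {img.mean() / img.std():.1f}")
# Expect roughly a sqrt(8) ~ 2.8x SNR improvement from 1 to 8 stages.
```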

Important Files
The focus of this demo is to compare different sensor configurations and the impacts of noise in those configurations. The noise is introduced by the built-in detector model available in the DIRSIG5 BasicPlatform plugin. Below is the baseline temporal integration and detector model configuration in the .platform file that is manipulated across the various simulations.
```xml
<focalplane>
  <capturemethod>
    <imagefile areaunits="cm2" fluxunits="photonspersecond">
      <basename>short_int</basename>
      <extension>img</extension>
      <schedule>simulation</schedule>
      <datatype>12</datatype>
    </imagefile>
    ...
    <temporalintegration tdi="false">
      <time>2.0e-05</time>
      <samples>1</samples>
    </temporalintegration>
    <detectormodel>
      <quantumefficiency>1.0</quantumefficiency>
      <readnoise>20</readnoise>
      <darkcurrentdensity>8.0e-07</darkcurrentdensity>
      <minelectrons>0</minelectrons>
      <maxelectrons>10e+02</maxelectrons>
      <bitdepth>12</bitdepth>
    </detectormodel>
    ...
  </capturemethod>
  <detectorarray spatialunits="microns">
    <clock type="independent" temporalunits="hertz">
      <rate>50000</rate>
      <offset>0</offset>
    </clock>
    <xelementcount>250</xelementcount>
    <yelementcount>1</yelementcount>
    <xelementsize>2.00000</xelementsize>
    <yelementsize>2.00000</yelementsize>
    <xelementspacing>2.00000</xelementspacing>
    <yelementspacing>2.00000</yelementspacing>
    <xarrayoffset>0.000000</xarrayoffset>
    <yarrayoffset>0.000000</yarrayoffset>
    <xflipaxis>1</xflipaxis>
    <yflipaxis>0</yflipaxis>
  </detectorarray>
</focalplane>
```
The array clock rate for all the sensors is 50 kHz (see the <clock> element), which means the array is read out every 20 microseconds (1 / 50000 = 2.0e-05 seconds). Since the A/D converter is set up to produce 12-bit data (see the <bitdepth> element), the output image <datatype> has been set to 12, which corresponds to 16-bit unsigned integer data. Note that the tdi attribute on the <temporalintegration> element is what controls whether the Y dimension of the array is used for TDI.
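As a quick sanity check, the derived quantities above can be computed directly from the configuration values. Note that the electrons-per-count figure below assumes a simple linear A/D mapping of [minelectrons, maxelectrons] onto the digital range, which is an assumption about the model rather than something stated in this demo.

```python
# Derived quantities from the baseline .platform configuration above.
clock_rate_hz = 50_000               # <clock><rate>
line_period_s = 1.0 / clock_rate_hz  # readout period: 2.0e-05 seconds
bit_depth = 12                       # <bitdepth>
max_dn = 2**bit_depth - 1            # 4095, stored as uint16 (<datatype> 12)
max_electrons = 1000                 # <maxelectrons> (short integration case)

# Assuming a linear A/D mapping of [minelectrons, maxelectrons] -> [0, max_dn]:
print(f"line period       = {line_period_s:.1e} s")
print(f"max digital count = {max_dn}")
print(f"electrons per DN  ~ {max_electrons / max_dn:.3f}")
```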
The noise sources (shot noise, dark current noise and read noise) are constant across all the simulations. The key parameters that will be manipulated relate to the array dimensions, temporal integration and the enabling of TDI. The table below summarizes the differences between the four simulations included in this demo:
Name | Array Size | Integration Time (s) | Max Electrons | TDI Flag |
---|---|---|---|---|
No Noise | 250 x 1 | N/A | N/A | N/A |
Short Integration | 250 x 1 | 2.0e-05 | 1000 | false |
Long Integration | 250 x 1 | 1.6e-04 | 8000 | false |
TDI | 250 x 8 | 2.0e-05 | 8000 | true |
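The sketch below is a simplified, hypothetical stand-in for the BasicPlatform detector model, intended only to show how the parameters in the table interact; DIRSIG's internal model may differ. The photon rate and the dark-current rate in electrons per second (the configured value is a current density whose unit conversion is omitted here) are made-up values.

```python
import numpy as np

rng = np.random.default_rng(7)

def detect(photons_per_s, t_int_s, *, qe=1.0, read_noise_e=20.0,
           dark_e_per_s=1.0, min_e=0.0, max_e=1000.0, bit_depth=12):
    """Simplified detector: photon rate in -> quantized digital count out."""
    signal_e = rng.poisson(qe * photons_per_s * t_int_s)  # shot noise
    dark_e = rng.poisson(dark_e_per_s * t_int_s)          # dark current noise
    electrons = signal_e + dark_e + rng.normal(0.0, read_noise_e)
    electrons = np.clip(electrons, min_e, max_e)          # full-well clipping
    # Assumed linear A/D mapping of [min_e, max_e] onto [0, 2**bit_depth - 1].
    return int(round((electrons - min_e) / (max_e - min_e) * (2**bit_depth - 1)))

# Hypothetical photon rate; scaling max_e along with the 8x integration time
# keeps the mean digital count the same, as in the scenarios described below.
print("short:", detect(2.5e7, 2.0e-05, max_e=1000.0))
print("long: ", detect(2.5e7, 1.6e-04, max_e=8000.0))
```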
No Noise
The baseline in this demo is the "no noise" simulation, which uses a 1D array without temporal integration (radiance mode) and without the detector model configured. The resulting image features no blurring due to any temporal integration (motion related) effects and has no noise.
Short Integration Time
The detector array in this scenario only has a single row of pixels. The short integration time setup uses a 20 microsecond integration time and the A/D converter has the max electrons set to 1000. This simulation establishes the baseline signal-to-noise ratio (SNR).
Long Integration Time
The detector array in this scenario also has a single row of pixels. The long integration time setup uses a 160 microsecond integration time, which is 8x longer than the short integration time. Since it will integrate 8x more signal, the A/D converter max electrons is set to 8000, which is 8x larger than the short integration scenario. The goal of the longer integration time is to improve on the baseline SNR. However, the longer integration time will impact the image quality by blurring the image in the along-track direction: with a 20 microsecond line period, a 160 microsecond integration smears each pixel across roughly 8 lines.
Time Delayed Integration
The detector array in this scenario has 8 rows of pixels that will be used for the time-delayed integration. The integration time for this setup is the same as the short integration setup, but to achieve an SNR similar to the 8x longer integration time, the sensor uses those 8 rows of pixels as 8 stages of TDI. Since the effective integration time (across all 8 stages) is 8x longer than the short integration time, the A/D max electrons is the same as the long integration time value of 8000.
Note: The number of TDI stages is a user-defined parameter supplied directly via the Y dimension (the default along-track dimension) of the detector array. The 8 stages used in this example is arbitrary. Modern commercial systems (e.g. WorldView-3) can feature as many as 32 stages of TDI in some band arrays.
Setup
To run the four simulations, perform the following steps:
- Run the DIRSIG no_noise.jsim file.
- Run the DIRSIG short_int.jsim file.
- Run the DIRSIG long_int.jsim file.
- Run the DIRSIG tdi.jsim file.
Note: Because these are advanced DIRSIG5 simulations using JSIM configurations, the simulations need to be run from the command line.
Results
The output images from the simulations are shown below (scaled using a simple min/max scaling):
no_noise, short_int, long_int and tdi simulations (left to right).
The no_noise image reflects the pristine (ideal) image that we would expect. There are no temporal integration blurring effects (from either sensor or scene motion); however, the spinning rotor blades on the helicopter are slightly distorted due to the time offset from line-to-line in this pushbroom-style collection. In addition, there is no noise present in the simulation. This is best observed in the uniform, white areas of the target panels, which feature no variation. Note that in the "radiance mode" used by this simulation, even photon arrival noise (shot noise) is not included in the simulation.
The short_int output includes the impacts of both temporal integration and the detector model noise sources. The most obvious impact is due to the detector model noise, which introduces significant variation across the scene. This is most easily observed in the white areas of the target panels.
The long_int output shows the expected improvement in the noise (again, most easily observed in the white areas of the target panels). However, the long integration time is so long that there is a significant amount of blurring in the along-track (vertical) axis of the image.
The tdi output illustrates the advantage of this approach: it features the reduced noise levels of the long integration time while retaining the minimal blurring of the short integration time scenario. Note, however, that TDI will still introduce blurring effects in moving objects like the helicopter rotor blades.
The table below contains the mean and standard deviation for a region of the white part of the tri-bar target from each simulation.
Name | Mean (counts) | Stddev (counts) |
---|---|---|
Short (1x) Integration | 1.292422e+03 | 1.094757e+02 |
Long (8x) Integration | 1.281355e+03 | 7.143720e+01 |
TDI 8 stages | 1.295832e+03 | 2.770331e+01 |
The mean values for all the simulations are effectively the same. Note that without the change in the A/D max electrons for the long integration and TDI simulations, those two simulations would have a mean that is 8x greater than the short integration, since they have an effectively 8x longer integration time. The standard deviation for the short integration is the highest. The long integration reduces the noise by 1.5x and the TDI approach reduces it by 3.9x. If the noise is normally distributed, we would expect the noise to reduce by the square root of the increased integration time (sqrt(8) = 2.828).
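As a quick check of those factors, the ratios can be computed directly from the standard deviations in the table above:

```python
import math

# Standard deviations (counts) from the results table above.
stddev = {"short": 1.094757e+02, "long": 7.143720e+01, "tdi": 2.770331e+01}

print(f"long vs. short: {stddev['short'] / stddev['long']:.2f}x")  # ~1.53x
print(f"tdi vs. short:  {stddev['short'] / stddev['tdi']:.2f}x")   # ~3.95x
print(f"sqrt(8):        {math.sqrt(8):.3f}x")                      # 2.828x
```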