Session 8
S. Arribas (ESA/STScI), P. Jakobsen (ESA/ESTEC), R. Fosbury (ESA/ST-ECF), W. Freudling (ESO/ST-ECF)
We discuss how the pixel scale at the detector, the facet size of the MEMS array, and the spectrograph slit dimensions (width and length) affect the performance of the Near-IR Spectrometer for NGST (NIRSPEC). We consider spectroscopic performance in terms of both single-object 'sensitivity' and 'speed' (i.e. the number of galaxies observed per unit observing time), at both R=100 and R=1000.
We have developed a new NGST Exposure Time Calculator optimized for spectroscopy, which incorporates the following features: i) accurate slit losses based on the simulated PSFs generated by Bely et al. (2001), ii) use of 'optimal extraction' techniques when solving the S/N-exposure time equation, and iii) quantitative assessment of the multiplexing gain of NIRSPEC.
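The S/N-exposure time inversion at the heart of such a calculator can be sketched with the standard point-source noise equation. This is a generic illustration only: the actual calculator described above additionally folds in the simulated slit losses and optimal-extraction weights, and all names and numbers below are ours, not taken from that tool.

```python
import math

def exposure_time(snr, source_rate, sky_rate, dark_rate, read_noise, npix):
    """Solve the standard CCD S/N equation for exposure time t.

        S/N = F*t / sqrt(F*t + npix*(B + D)*t + npix*sigma_R**2)

    F          source count rate (e-/s, summed over the extraction aperture)
    B, D       sky and dark rates per pixel (e-/s/pix)
    sigma_R    read noise per pixel (e-)
    npix       number of pixels in the extraction aperture

    Squaring and rearranging gives a quadratic in t; we return the
    positive root.
    """
    a2 = snr ** 2
    lin = a2 * (source_rate + npix * (sky_rate + dark_rate))  # noise terms linear in t
    const = a2 * npix * read_noise ** 2                       # t-independent read-noise term
    return (lin + math.sqrt(lin ** 2 + 4.0 * source_rate ** 2 * const)) / (
        2.0 * source_rate ** 2
    )
```

Because the read-noise term is independent of t, a smaller pixel scale (fewer contributing pixels per resolution element) shortens the required exposure at fixed S/N, which is one reason the optimal scale depends on spectral resolution.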
We find that the parameters that optimize single-object sensitivity differ somewhat from those that optimize 'speed' in multi-object observations of high-density fields. The optimal parameters also depend on the spectral resolution. The pixel scale that optimizes sensitivity at R=1000 agrees very well with the earlier result of Petro & Stockman (2000), after adjusting for the different assumed telescope size.
The author(s) of this abstract have provided an email address for comments about the abstract: arribas@stsci.edu