You will use the 4 CCD simulations below to "observe" a star. The simulations mimic 4 different kinds of observing conditions; these can represent bright, light-polluted skies, strong atmospheric turbulence, Hubble-like perfect conditions, and so on. You will first determine whether the detector in each of the 4 cases can find the star, based on the requirements listed below, and then you will try to determine the conditions of the sky.
Procedure:
Determine the minimum exposure time required before you believe you can even "see" the star with your virtual detector, using just your eye.
Determine the minimum exposure time at which the difference between the number labelled mean in the green box and the number labelled mean in the red box is equal to or greater than the sky standard deviation (the error) shown in the red box. Remember, the star is centered in the green box, while the detector background (e.g. the brightness of the virtual night sky in each case of the simulation) is registered in the red box. (A sketch of this comparison appears after the procedure.)
What happens to the difference between the green and red mean values when you increase the exposure time for each detector? Is there a significant change?
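As a rough illustration of the comparison in the second step, the sketch below computes the mean of a "green" star box and the mean and standard deviation of a "red" sky box, then applies the detection criterion. The frame, box positions, and count levels are all invented for illustration; in the exercise these numbers are read directly from the simulator.

```python
import numpy as np

# Hypothetical CCD frame (counts per pixel); in the exercise these numbers
# come from the simulator's readouts, not from code.
rng = np.random.default_rng(0)
frame = rng.poisson(lam=100.0, size=(64, 64)).astype(float)
frame[30:34, 30:34] += 25.0          # a faint star added near the center

# Assumed box positions -- adjust to wherever the boxes sit on your frame.
green_box = frame[28:36, 28:36]      # box centered on the star
red_box   = frame[5:13, 5:13]        # box on blank sky (background only)

star_mean = green_box.mean()         # the "mean" readout of the green box
sky_mean  = red_box.mean()           # the "mean" readout of the red box
sky_sigma = red_box.std(ddof=1)      # the sky standard deviation (the error)

difference = star_mean - sky_mean
detected = difference >= sky_sigma   # the criterion from the procedure step

print(f"green - red = {difference:.2f} counts, sky sigma = {sky_sigma:.2f}")
print("star detected" if detected else "star not detected at this exposure")
```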
From this exercise it should be apparent to you that, for any given exposure of the sky with any given telescope plus detector, there will be many stars that are simply too faint to register on the detector, and that different detectors will require different amounts of exposure time to produce data of similar quality.
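One common idealization helps explain this behavior: if the star and the sky each deliver a steady rate of photons, the accumulated counts grow linearly with exposure time while the photon-counting (Poisson) noise grows only as the square root of the counts. The sketch below uses made-up rates, ignores read noise, and is not part of the simulator; it only illustrates why the green-minus-red difference pulls away from the sky error as the exposure lengthens.

```python
import numpy as np

# Illustrative photon rates (counts per pixel per second) -- invented values.
star_rate = 5.0
sky_rate = 100.0

for t in (1, 4, 16, 64):                       # exposure times in seconds
    difference = star_rate * t                 # green-minus-red mean difference
    sky_sigma = np.sqrt(sky_rate * t)          # sky standard deviation
    print(f"t = {t:3d} s: difference = {difference:6.1f}, "
          f"sigma = {sky_sigma:5.1f}, ratio = {difference / sky_sigma:.2f}")
```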
More Questions