Chapter 9: Image Resolution (pages: 272-288)

 

  1. What are the four measures of resolution?
  2. What are mixed pixels, and how might they contribute to the problem that an increase in spatial resolution could lead to a decrease in classification accuracy?
  3. Simonett and Coiner studied the relationship between sensor resolution in the Landsat MSS and landscape detail. What did they conclude when comparing natural vs. man-made landscapes?
  4. How can an increase in spatial resolution decrease radiometric resolution when using traditional photographic emulsions?
  5. Match the following three terms with their definitions: 1) Ground Resolved Distance (GRD), 2) Line pairs per millimeter (LPM), 3) Modulation Transfer Function (MTF).

a)      A system's response to a target array with elements of varying spatial frequency.

b)      The dimensions of the smallest objects recorded on an image.

c)      A measure of image resolution that uses a standardized target positioned on the ground to quantify the minimum distance that can be resolved.
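
For the GRD and LPM terms in question 5, a small numeric sketch may help. It assumes the rule of thumb often used in photogrammetry, GRD = (scale denominator) / (system resolution in line pairs per millimeter), converted to meters; the 1:50,000 scale and 40 lp/mm figures are illustrative values, not taken from the text.

    # Sketch: relating image scale and system resolution (lp/mm) to
    # ground resolved distance (GRD). Assumes the common rule of thumb
    #   GRD (mm on the ground) = scale denominator / resolution (lp/mm),
    # then converts to meters. The numbers below are illustrative only.

    def ground_resolved_distance_m(scale_denominator: float, lp_per_mm: float) -> float:
        """GRD in meters for a 1:scale_denominator image resolved at lp_per_mm."""
        grd_mm = scale_denominator / lp_per_mm  # ground span of one resolvable line-pair width
        return grd_mm / 1000.0                  # mm -> m

    if __name__ == "__main__":
        # e.g. a 1:50,000 photograph with a system resolution of 40 lp/mm
        print(ground_resolved_distance_m(50_000, 40))  # 1.25 m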

 

 

1.      The text specifies four types of resolution applicable to interpreting images from a remote sensing system. What are these four types, and how are they defined?

2.      Image resolution of any sort is a function of interactions among three broad categories: Target Variables, System Variables, and Operating Conditions. While these categories are not mutually exclusive (e.g., systems are often designed with certain operating conditions in mind), they provide a conceptual framework for understanding the resolution of a given image. List as many factors as you can under each category.

 

Target Variables: figure/ground contrast; aspect ratio and object shape; regularity of object shape; extent and number of regular objects; extent and uniformity of the object background; slope/aspect relative to the sensor and/or illumination source.

System Variables: lenses, filters, etc.; film type; angle of view/IFOV; detector design (radiometric or spectral); temporal coverage; look angle; gain.

Operating Conditions: altitude of the sensor platform; speed of the platform; positional drift or perturbation of the platform; solar angle and azimuth; day vs. night in the target area; cloud cover, weather, and atmospheric conditions in general.

3.      What is the conceptual difference between Spatial Resolution and Ground Resolved Distance (GRD)?

4.      All forms of GRD measurement attempt to quantify which one image-resolution category (while holding the other two constant)? (cf. the three categories of Q2 above)

5.      Is it possible for an increase in spatial resolution (smaller pixel size) to give a more confused image (an image with a higher proportion of mixed pixels)?
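
Question 5 can be explored numerically. The sketch below is a toy example, not from the text: it rasterizes a simple two-class scene at a coarse and a fine pixel size, sub-samples each pixel footprint, and counts the pixels that contain both classes (mixed pixels). For this single-boundary scene the fraction of mixed pixels falls as pixels shrink; the point of the question is that a real landscape may reveal new detail, and therefore new boundaries, at the finer pixel size, so the outcome can go the other way.

    # Toy illustration: fraction of "mixed" pixels (footprints covering more
    # than one cover type) for the same scene gridded at two pixel sizes.
    # The scene and all numbers are invented for illustration only.

    def cover_type(x: float, y: float) -> int:
        # 1 km x 1 km scene: class 1 inside a central circular patch, class 0 outside
        return 1 if (x - 500.0) ** 2 + (y - 500.0) ** 2 <= 300.0 ** 2 else 0

    def mixed_pixel_fraction(pixel_size: float, extent: float = 1000.0, sub: int = 4) -> float:
        """Fraction of pixels whose footprint contains more than one cover type."""
        n = int(extent / pixel_size)  # pixels per side
        mixed = 0
        for i in range(n):
            for j in range(n):
                # sample a sub x sub grid of points inside the pixel footprint
                classes = {
                    cover_type(i * pixel_size + (u + 0.5) * pixel_size / sub,
                               j * pixel_size + (v + 0.5) * pixel_size / sub)
                    for u in range(sub) for v in range(sub)
                }
                if len(classes) > 1:  # both classes present -> mixed pixel
                    mixed += 1
        return mixed / (n * n)

    if __name__ == "__main__":
        for size in (100.0, 25.0):  # coarse vs. fine pixels, in meters
            print(f"{size:5.0f} m pixels: mixed fraction = {mixed_pixel_fraction(size):.3f}")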

 

 

1.      What are the four types of resolution as described by the text?

2.      The book explains what target variables are.  Pick one and briefly describe it.

3.      Name two operating conditions that affect remote sensing systems.

4.      What is GRD?

5.      Why are mixed pixels an unavoidable feature of images used to map the Earth?

 

 

 

Chapter 6: Satellite Imaging Systems (pages: 157-203)

 

  1. What advantages do satellite sensors offer over aerial platforms for remote sensing?
  2. What is the difference between a geostationary orbit and a sun-synchronous orbit, and why is a sun-synchronous orbit important to remote sensing? (A short orbital sketch follows this list.)
  3. What are the basic support systems that are common to all Earth observation satellites?
  4. Briefly explain the acts passed by Congress in 1984 and 1992 that were important to remote sensing.
  5. The book offers a framework for understanding three different families or categories of satellite systems. What are they, how do they differ, and what are some examples in each category?
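
A short orbital sketch related to question 2, using Kepler's third law with standard values for the Earth's gravitational parameter, radius, and sidereal day; the 100-minute period used for comparison is simply a typical figure for low, near-polar (sun-synchronous) orbits.

    # Kepler's third law: a = (mu * T^2 / (4 * pi^2)) ** (1/3) for a circular orbit.
    import math

    MU_EARTH = 3.986004418e14  # m^3/s^2, Earth's standard gravitational parameter
    R_EARTH = 6_378_137.0      # m, equatorial radius
    SIDEREAL_DAY = 86_164.1    # s, one rotation of the Earth relative to the stars

    def orbit_radius_m(period_s: float) -> float:
        """Semi-major axis (m) of a circular orbit with the given period."""
        return (MU_EARTH * period_s ** 2 / (4.0 * math.pi ** 2)) ** (1.0 / 3.0)

    if __name__ == "__main__":
        a_geo = orbit_radius_m(SIDEREAL_DAY)
        print(f"geostationary altitude ~ {(a_geo - R_EARTH) / 1000:.0f} km")  # ~35,786 km
        # A sun-synchronous orbit is near-polar and much lower; a ~100-minute
        # period corresponds to an altitude of only a few hundred kilometers.
        a_low = orbit_radius_m(100 * 60)
        print(f"altitude for a 100-minute period ~ {(a_low - R_EARTH) / 1000:.0f} km")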

 

 

  1. Define IFOV (instantaneous field of view). (A ground-footprint sketch follows this list.)
  2. What type of orbit would be most beneficial for a meteorological or communications satellite, and why?
  3. What is an innovative feature of the SPOT satellite, and where was it designed?
  4. What local sun time is considered optimal for imaging, and why is it important to a satellite's orbit?
  5. What ground area, in kilometers from east to west and from north to south, is represented by one MSS scene, and how many scan lines are present in that scene?
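
A ground-footprint sketch related to question 1: converting an angular IFOV into a footprint at nadir using the small-angle relationship (footprint is approximately altitude x IFOV in radians). The 0.086 mrad and 919 km figures are the round numbers often quoted for the Landsat MSS and should be treated as illustrative.

    # Nadir ground footprint of an angular IFOV: exactly 2 * H * tan(beta / 2),
    # approximately H * beta for small beta. Sensor values are illustrative.
    import math

    def ground_ifov_m(altitude_m: float, ifov_rad: float) -> float:
        """Nadir ground footprint (m) of an instantaneous field of view."""
        return 2.0 * altitude_m * math.tan(ifov_rad / 2.0)

    if __name__ == "__main__":
        # roughly the figures quoted for the Landsat MSS: 0.086 mrad from ~919 km
        print(f"{ground_ifov_m(919_000, 0.086e-3):.0f} m")  # ~79 m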

 

 

  1. Sun-synchronous orbits are designed to reduce what important source of variation in illumination?
  2. Pointing (orientation) errors for remote sensing satellites must be much smaller than _______?
  3. SeaWiFS's primary mission is to observe _____ _____.
  4. Information displaying the outline of satellite image coverage plotted on a map is called the image _____.
  5. _____ provided the first satellite imagery of the Earth's surface.