Remote Sensing

Remote Sensing is a generic term for types of satellite or high-altitude aviation imagery that contain bands outside of the visual spectrum, or use a novel method of capturing images that the human eye would otherwise be unable to see.

"Where we're going, we won't need eyes to see" - Sam Neill, Event Horizon

Sources

As with most imagery, remote sensing data tends to fall into one of two categories: freely available but low resolution (and aimed at a fairly technical user), or high quality and easy to use but absurdly expensive. If you have to ask how expensive, odds are you simply don't have the money.[1] This all assumes you can even open the data and use it; the software to do so is either proprietary, clunky, or both.[Citation NOT needed but provided anyways][2]

However, as time goes on, more and more data is slowly becoming available.[3] (Todo: if anyone can figure out the prices for various providers, here would be a good place to put them)

Free

  • NASA/USGS (EarthExplorer, various)
  • ESA (Sentinel-1)

Commercial

  • Capella (SAR)
  • Hawkeye 360
  • ICEYE

Types

SAR (Synthetic Aperture Radar)

A satellite in space shines radar pulses on the Earth and records the echoes; with special processing, fairly high resolution spatial images can be created. However, while the results look similar to images recorded by a normal camera, they are not formed by anything close to the same process, and special care must be taken when interpreting them, because the intuition you have from using your eyes all day does not always apply.
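
One small but common interpretation aid: SAR backscatter spans a huge dynamic range, so it is usually converted to decibels before you look at it. A minimal sketch in Python (rasterio/numpy; the file name and band index are made up for illustration):

  import numpy as np
  import rasterio

  # Convert linear SAR backscatter (sigma0) to decibels for easier viewing.
  # "sar_scene.tif" and band 1 are placeholders - adjust for your data.
  with rasterio.open("sar_scene.tif") as src:
      sigma0 = src.read(1).astype("float64")

  sigma0_db = 10.0 * np.log10(np.clip(sigma0, 1e-6, None))  # clip to avoid log(0)
  print("backscatter range (dB):", sigma0_db.min(), sigma0_db.max())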

Theory of Synthetic Aperture Radar

The only real source of illumination is the satellite itself, so it can take images whenever it wants. Clouds are not an issue.

In principle these cover roughly the range 5 mm–60 mm / 5–60 GHz (I've seen mentions of C- and L-band radars being used on Wikipedia). In practice, though, a given satellite only observes in a narrow range of frequencies around the band it transmits on.
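
For sanity checking numbers like these, the conversion is just wavelength = c / frequency; a tiny sketch:

  # Wavelength <-> frequency conversion: lambda = c / f
  C = 299_792_458.0  # speed of light, m/s

  def wavelength_m(freq_hz):
      return C / freq_hz

  print(wavelength_m(5e9))   # ~0.06 m  (60 mm)
  print(wavelength_m(60e9))  # ~0.005 m (5 mm)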

NIR (Near-Infrared)

The remote sensor passively collects light in this range and forms an image in much the same way as we are used to cameras working in the Optical.

Covers the range 0.75–1.4 μm / 214–400 THz.

The main source of illumination is the sun, so I think these images are only taken during the day, and clouds are an issue.

Many satellite imagery providers (e.g. Planet 0.5m) include NIR bands. NIR puts particular emphasis on plants and vegetation, and can be used with other bands to detect regions of ice, vegetation, fires, etc. A common method of analysis is to simply use the imagery as you would optical, but replace the "red" band (e.g. in QGIS) with the NIR data.
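
A concrete example of that kind of band math is NDVI (the normalized difference vegetation index), which combines the red and NIR bands. A minimal sketch, assuming a GeoTIFF where band 3 is red and band 4 is NIR (band order varies by provider, so check yours):

  import numpy as np
  import rasterio

  # NDVI = (NIR - Red) / (NIR + Red); healthy vegetation comes out near 1.
  # "scene.tif" and the band numbers are assumptions - check your provider's docs.
  with rasterio.open("scene.tif") as src:
      red = src.read(3).astype("float64")
      nir = src.read(4).astype("float64")

  ndvi = (nir - red) / np.clip(nir + red, 1e-9, None)  # clip to avoid division by zero
  print("NDVI range:", ndvi.min(), ndvi.max())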

Optical

The remote sensor passively collects light in this range and forms an image exactly the way the cameras we are used to do.

Covers the range 400 nm–700 nm / 430 THz – 750 THz.

The main source of illumination is the sun, so images are only taken during the day and clouds are an issue.

UV to X-ray

The atmosphere is not transparent at these wavelengths, so there are no Earth-observing UV or X-ray satellites :(

Gamma-rays

The Vela satellites monitored the Earth for gamma-ray flashes emitted by nuclear weapon tests during the 1960s and 1970s.

However, the sun is a weak source of gamma rays, the atmosphere is fairly opaque, and gamma rays are not great at being reflected back towards where they came from, so after atmospheric testing of nuclear weapons stopped there have been no more Earth-observing gamma-ray satellites. :(

GPS Mediated Ionosphere Monitoring

By processing recordings of the GPS signals from satellites, it is possible to identify changes in the electron content of the ionosphere, and, by monitoring that, detect the travelling ionospheric disturbances caused by rocket launches.
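
The usual approach (sketched here in rough form, not tied to any particular toolchain) exploits the fact that the ionospheric delay is frequency dependent: with pseudoranges recorded on two GPS frequencies you can solve for the slant total electron content (TEC). The frequencies below are the standard GPS L1/L2; the input values are purely illustrative.

  # Slant TEC from dual-frequency GPS pseudoranges:
  #   TEC = f1^2 * f2^2 / (40.3 * (f1^2 - f2^2)) * (P2 - P1)
  # TEC comes out in electrons/m^2; divide by 1e16 for TEC units (TECU).
  F1 = 1575.42e6  # GPS L1 frequency, Hz
  F2 = 1227.60e6  # GPS L2 frequency, Hz

  def slant_tec_tecu(p1_m, p2_m):
      """p1_m, p2_m: pseudoranges on L1 and L2 in metres (illustrative inputs)."""
      tec = F1**2 * F2**2 / (40.3 * (F1**2 - F2**2)) * (p2_m - p1_m)
      return tec / 1e16

  # A 3 m L2-minus-L1 difference corresponds to roughly 28-29 TECU.
  print(slant_tec_tecu(20_000_000.0, 20_000_003.0))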


Social Media

If you're new, you wouldn't believe the stuff people post on social media.

Truly this is more of a "near" sensing method, since usually the most interesting material is images and video posted by people on the ground near the events. Nevertheless, this highly valuable data can be collected by people very far from the events of interest.

Other

Todo: talk about how you can detect ice, snow, fire, etc. by doing veeery specific band math in e.g. QGIS. Maybe write a guide on it?
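
Until someone writes that guide, here is a minimal sketch of the kind of band math involved. The indices themselves are standard (NDSI highlights snow and ice, NBR drops over burned areas); the band numbers assume a Landsat-8-style stack and will differ by provider.

  import numpy as np
  import rasterio

  def normalized_difference(a, b):
      """Generic (a - b) / (a + b) band math, guarding against division by zero."""
      return (a - b) / np.clip(a + b, 1e-9, None)

  # Band numbers assume a Landsat-8-like ordering (3 = green, 5 = NIR,
  # 6 = SWIR1, 7 = SWIR2) - check your provider's documentation.
  with rasterio.open("stack.tif") as src:
      green = src.read(3).astype("float64")
      nir = src.read(5).astype("float64")
      swir1 = src.read(6).astype("float64")
      swir2 = src.read(7).astype("float64")

  ndsi = normalized_difference(green, swir1)  # high values suggest snow/ice
  nbr = normalized_difference(nir, swir2)     # low values over freshly burned areas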

References