INTRODUCTION
In this note we discuss imagery intelligence - the analysis of images of the Earth taken from space. This is now a multi-billion-dollar industry in both the civilian and military arenas, served by a large number of satellite constellations in various types of low Earth orbit, geosynchronous orbit and highly elliptical orbit.
REMOTE SENSING
Earth observation from space is accomplished by means of remote sensing from satellites with electromagnetic sensors facing the Earth.
Astronomy is remote sensing of space from the Earth; Earth Observation from Space (EOS) is remote sensing of the Earth from space. Remote sensing lets us gather information about objects we cannot directly touch, and it is made possible by the energy carried by electromagnetic radiation - light, infrared, UV and radio waves.
Most space-based remote sensing systems are passive: they use sunlight reflected from the Earth to sense conditions on the Earth's surface. However, there is growing interest in active systems, in which a satellite transmitter illuminates the Earth, usually with radio waves, and the reflections are received back at the satellite. One subset of these is Synthetic Aperture Radar (SAR), in which the motion of the satellite is used to synthesise a much larger aperture in the direction of motion and thus achieve very high resolution of ground features.
PASSIVE EOS
The surface of the Earth receives both direct and indirect radiation from the Sun, and also radiates thermal energy back into space. The radiation from the Sun is broad-spectrum, with maximum emission around 550 nm in the visible range. The thermal radiation from the Earth is mostly in the infrared, from 10 to 14 microns.

Sunlight is reflected according to the surface material it strikes; each material modifies the light in different ways. The properties of the light that are changed include intensity, colour and polarisation.

The image below shows a selection of the various EOS constellations.

A subset of these are the various global meteorological and climate satellites.

HISTORY
The first civilian Earth Observation Satellite was launched into orbit in 1960 by the USA. It was TIROS - the Television Infrared Observation Satellite.
This satellite was launched into low Earth orbit (LEO) with an orbital height of 520 km and an inclination of 48 degrees. It carried both wide-angle and narrow-beam TV/IR cameras.

SPATIAL RESOLUTION
Spatial resolution refers to the area on the ground represented by each pixel in the image. This limits the smallest features that can be seen in the image.
Spatial resolution is specified by the linear distance L between adjacent pixels on an image. It can range from around 10 km for full-disc images of the Earth down to around 25 cm for the most detailed images available at present. L is related to the angular resolution A of the satellite camera (in radians) and the distance d to the Earth's surface by the formula L = A d.
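As a worked example of the L = A d relation (a sketch only; the 0.5 microradian angular resolution and 500 km altitude are assumed illustrative values, not figures from this note):

```python
def ground_resolution(angular_resolution_rad: float, distance_m: float) -> float:
    """Ground pixel size L = A * d, with A in radians (small-angle approximation)."""
    return angular_resolution_rad * distance_m

# Assumed example: an imager with 0.5 microradian resolution at 500 km altitude
L = ground_resolution(0.5e-6, 500e3)
print(f"{L * 100:.0f} cm per pixel")  # 25 cm per pixel
```

Note that the same sensor at a higher orbit gives a proportionally coarser ground resolution, which is why high-resolution imagers sit in LEO.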
Turbulence in the atmosphere near the ground imposes a practical resolution limit of around 10 cm. Thus it is not possible to read licence plates from low Earth orbit (LEO).
High resolution EOS satellites must be in LEO to achieve this resolution.
There is always a trade-off between spatial resolution and the area covered by a single image.
If an image is 1000 x 1000 pixels, then the area covered by the image is 1000 times the spatial resolution in each direction. So if the spatial resolution is one metre, the coverage will be only one km on each side. Since the total surface area of the Earth is about 500 million square km, it would take 500 million such images to completely cover the Earth's surface. This explains why image analysts can often miss newly constructed features.
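The coverage arithmetic above can be sketched as follows, using the note's rounded figure of 500 million square km for the Earth's surface area:

```python
def images_to_cover_earth(pixels_per_side: int, resolution_m: float,
                          surface_km2: float = 500e6) -> float:
    """Non-overlapping images of side (pixels * resolution) needed to tile the Earth."""
    side_km = pixels_per_side * resolution_m / 1000.0
    return surface_km2 / (side_km ** 2)

# 1000 x 1000 pixels at 1 m resolution covers a 1 km x 1 km square
print(int(images_to_cover_earth(1000, 1.0)))  # 500000000
```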
High resolution is not always the most desirable attribute: spatial resolution needs to be matched to the task in hand, otherwise it is quite possible to 'miss the wood for the trees'. Low-resolution meteorological images, for example, provide the large coverage needed for an overview of an extensive weather system.
The image below is also an example of a feature that might not be appreciated in a series of high resolution low coverage images. The feature here is the Darling river and its catchment system that extends over an extremely large area of the Australian state of New South Wales.

When water actually flows in this river system, it is from east to west (ie inland) and from north to south. It empties into the Murray river.
TEMPORAL RESOLUTION
Temporal (time) resolution refers to the interval between images of a specified area on the ground. For a single satellite this might be several days. If we need a specific sun angle this time could be even longer.
The smaller the area of interest, the less often a given satellite pass will cover it, and thus the longer the temporal resolution.
Because of the limited temporal resolution of a single satellite, there is a move toward constellations of satellites. The best example of this is the Dove constellation (Flock) operated by Planet Labs, which uses many low-cost satellites in low orbit to achieve both high spatial resolution and high temporal resolution.
Temporal resolution is important when monitoring change.
SPECTRAL RESOLUTION
Passive EOS mainly uses the visible and infrared regions of the spectrum; some atmospheric temperature profilers and active radar EOS use the microwave region.
The various bands of the electromagnetic spectrum are shown below with the visible spectrum being expanded in the lower section.

Satellite sensors differ in how they sample and render the electromagnetic spectrum to produce the final image.
One of the simplest approaches is a sensor that covers a wide spectral range but renders the image as a greyscale - that is, overall intensity as perceived by the sensor. This is referred to as a panchromatic sensor/image.
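A crude analogue of a panchromatic rendering is collapsing a three-band image into a single intensity channel. The weights below are the standard Rec. 601 luma coefficients, used here purely as a stand-in for a real sensor's broad spectral response, which would differ:

```python
import numpy as np

def to_panchromatic(rgb: np.ndarray) -> np.ndarray:
    """Collapse an RGB image of shape (H, W, 3) to a single intensity channel.

    Rec. 601 luma weights are an illustrative substitute for an actual
    panchromatic sensor's spectral response curve.
    """
    weights = np.array([0.299, 0.587, 0.114])
    return rgb @ weights

pixel = np.array([[[1.0, 1.0, 1.0]]])  # a single pure-white pixel
print(to_panchromatic(pixel))  # [[1.]]
```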

For a 'natural' colour rendition three bands (RGB) are commonly used. This is the traditional encoding used in domestic television. The RGB image shown below is of the famous beach at Bondi, a suburb of Sydney.

When more than three bands are used to render an image, the result is called multispectral or hyperspectral.
Multispectral imagery generally refers to 3 to 10 bands. Each band has a descriptive title. For example, the channels below include red, green, blue, near-infrared, and short-wave infrared.

Hyperspectral imagery consists of much narrower bands (10-20 nm). A hyperspectral image could have hundreds of bands. In general, they don’t have descriptive channel names.

The problem with multispectral, and particularly hyperspectral, imagery is how to display it. RGB can render essentially all the colours that the human eye can see. Why then do we need 'extra' colours that we cannot see anyway, particularly if they lie outside the visible spectrum?
The value of these 'extra' colours (or of a finer colour resolution) is that we can manipulate them to show features that would not normally be apparent to us. One way to render a particular attribute on an image is the use of 'false' colour: we represent the attribute of interest with a colour that does not normally appear in the RGB representation of the image. An example is the image below, of the area around Perth, the capital city of Western Australia. A combination of spectral bands associated with high-density construction is given the 'false' colour purple in the image.
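A minimal sketch of this kind of false-colour highlighting, assuming we already have a derived band combination (here simply called `band`) that correlates with the attribute of interest; the threshold and the choice of purple are illustrative:

```python
import numpy as np

def false_colour_overlay(rgb: np.ndarray, band: np.ndarray,
                         threshold: float) -> np.ndarray:
    """Paint pixels where `band` exceeds `threshold` purple, on top of `rgb`.

    `band` stands in for whatever spectral-band combination correlates with
    the attribute of interest (e.g. high-density construction).
    """
    out = rgb.copy()
    mask = band > threshold          # boolean mask of 'interesting' pixels
    out[mask] = [0.5, 0.0, 0.5]      # purple: a colour unlikely in the natural scene
    return out

rgb = np.zeros((2, 2, 3))                     # a tiny all-black scene
band = np.array([[0.0, 0.9], [0.0, 0.0]])     # one pixel scores highly
print(false_colour_overlay(rgb, band, 0.5)[0, 1])  # [0.5 0.  0.5]
```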

IMAGE ANALYSIS - EXAMPLES
Not all Earth observations from space image surface features. Some satellites monitor atmospheric features such as the aurora. The following images are of phenomena or features that can be readily seen and measured by eye.
Aurorae can disrupt HF and satellite communications whose signal paths pass through them.

Monitoring global cloud cover, climate patterns and change.

Tracking and predicting the movement of destructive storms and estimating wind speeds (cyclone category 1 to 5). Not all cyclones have a well defined eye and other methods must be used to estimate the storm centre and plot the storm path.

Locating active fire areas and estimating their movement, as well as that of the smoke. This image is an overlay of two layers: red from a thermal IR image, which shows the fires themselves, overlaid on a colour image showing the smoke clouds. The area shown is eastern Victoria and southeastern NSW.

At night the bush fires can be seen using an essentially infrared panchromatic image.

The following image shows the Brisbane river after peak flooding and the brown colour (top left) shows where massive flooding of urban areas remains.

The next image is an oblique wide angle view of Europe at night. This is particularly good at highlighting major cities and industrial areas. Note the cities of Madrid, Paris, London, Rome and Naples and the industrial areas of northern Italy, Benelux and western Germany, and north western England.

The next image shows the volcanic ash plume from a large eruption in Iceland. Such images are particularly useful for redirecting air traffic away from the plume. Ingestion of volcanic dust by a jet engine can cause 'flameout', where the engine stops functioning; the aircraft may then lose substantial altitude, depending on how many engines are affected and whether the pilot can relight them clear of the plume.

Away from the volcano, ash clouds are not so easily recognised in an RGB image. However, a Perth research scientist developed a false-colour algorithm that allows such clouds to be tracked around the world, as shown by this ash cloud from the eruption of a Chilean volcano. The false-colour index is (B2-B1)/(B2+B1), where B1 and B2 are the spectral bands used; it shows up as the yellow-red colours.
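The quoted index is a normalised difference of two bands. A minimal implementation is below; which sensor bands B1 and B2 correspond to is not specified in this note, so the sample values are purely illustrative:

```python
import numpy as np

def normalised_difference(b1: np.ndarray, b2: np.ndarray) -> np.ndarray:
    """Compute the index (B2 - B1) / (B2 + B1) pixel by pixel.

    Values lie in [-1, 1]; pixels where the index is high would be rendered
    in a false colour (the yellow-red range in the ash-cloud example).
    """
    b1 = b1.astype(float)
    b2 = b2.astype(float)
    return (b2 - b1) / (b2 + b1)

print(normalised_difference(np.array([1.0]), np.array([3.0])))  # [0.5]
```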

Some other applications of Earth observation from space are:
Techniques of imagery analysis may include:
Indicators to help with visual imagery analysis are:
The following images are of Australian sites unless otherwise noted. They are presented to test your imagery analysis skills. You should try to identify these features and/or locations, and test your hypotheses by searching the internet for matching labelled satellite images. The correct answers are given in a file specified at the end of this document.
Image 1 Location?
Image 2 Feature & Location?
Image 3 Location?
Image 4 Location?
Image 5 Location (Europe)?
Image 6 Location?
Image 7 Location (International)?
Image 8 Location (international)?
Image 9 Feature & Location?
Image 10 Feature & Location?
MILITARY EARTH OBSERVATION FROM SPACE

The US National Reconnaissance Office (NRO) is the body responsible for the USA's military EOS satellites.

US military reconnaissance satellites from the early years of military space history have been declassified.

Specifically, details of, and images from, the Gambit 3 system have been made publicly available.

Exposed film from this satellite's camera was returned to Earth in capsules that reentered the atmosphere, where they were snatched out of the air by a hook trailed from a C-130 Hercules aircraft.

Modern military imaging satellites use digital cameras and downlink imagery via a satellite relay system. They carry fuel to change orbit and position themselves to image points of interest more frequently, and they can also adjust their attitude to take non-vertical imagery if required.

MILITARY SATELLITE IMAGERY
The Cuban Missile Crisis of 1962: these are the images that President Kennedy used to press the USSR to remove its missiles from Cuba and avert WWIII.

Aircraft carrier under construction in the Soviet Union 1964.

The movement of tanks in Syria in 2012.

In 2021 image analysts discovered over one hundred previously unknown nuclear missile (ICBM) launch sites in China.

Chinese military islands constructed in the South China Sea.

REFERENCES AND RESOURCES
Australian Space Academy