THE ASTRONOMICAL BRIGHTNESS SCALE

Astronomers use a "magnitude" scale to classify objects such as stars and planets according to their perceived brightness. The first such scale we know of was devised by the Greek astronomer Hipparchus around 150 BC.

The faintest stars we can see with the naked eye on a dark night have an astronomical magnitude of +6, whereas Sirius, the brightest star in the night sky, has a magnitude of about -1. The fainter an object, the more positive its magnitude; very bright objects have increasingly negative magnitudes.

The magnitude scale is logarithmic (because this is roughly how the eye perceives light), and each step of one magnitude makes an object about 2.512 times dimmer (or brighter) than the next; 2.512 is the fifth root of 100. This value was adopted to keep the scale consistent with that of Hipparchus, in which first-magnitude stars are about 100 times brighter than sixth-magnitude stars. The formula that relates the magnitudes m1 and m2 of two objects to the ratio of their brightnesses b1 and b2 is:

	m1 - m2 = -2.5 log10( b1 / b2 )
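
As a quick check of this relation, here is a minimal Python sketch (the function names brightness_ratio and magnitude_difference are illustrative, not standard) that converts a magnitude difference into a brightness ratio and back:

import math

def brightness_ratio(m1, m2):
    """How many times brighter an object of magnitude m1 appears
    than an object of magnitude m2 (Pogson's relation)."""
    return 100 ** ((m2 - m1) / 5)

def magnitude_difference(b1, b2):
    """Magnitude difference m1 - m2 for objects of brightness b1 and b2."""
    return -2.5 * math.log10(b1 / b2)

# Sirius (about -1) compared with the naked-eye limit (+6):
print(brightness_ratio(-1, 6))        # roughly 630 times brighter
# A 100:1 brightness ratio corresponds to exactly 5 magnitudes:
print(magnitude_difference(100, 1))   # -5.0
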
The scale is drawn below with appropriate comments.

	| -27	brightness of the Sun
	|
	|
	|
	|
	|
	|
	| -12	brightness of the full Moon
	|
	|
	|
	| - 4	max brightness of Venus
	|
	|
	| - 1	brightest star "Sirius"
	|
	|
	| + 6	faintest star seen with naked eye
	|
	| +10	faintest star seen with 10x50 binoculars
	|
	|
	| +16	faintest star seen with 30cm telescope
	|
	|
	|
	|
	| +26	faintest star imaged with ground telescope
	|
	| +28	faintest object imaged with Hubble telescope
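
To get a feel for the range this scale spans, take its two end points: the Sun at magnitude -27 and the faintest Hubble object at +28 differ by 55 magnitudes, which the relation above converts into a brightness ratio of 100^(55/5) = 10^22. A short Python check, using those two values from the scale:

# Sun (-27) versus the faintest object imaged by Hubble (+28)
ratio = 100 ** ((28 - (-27)) / 5)   # 55 magnitudes apart
print(f"{ratio:.1e}")               # 1.0e+22: the Sun appears about 10^22 times brighter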