Magnitude of an Astronomical Object
- by NASA Marshall Space Flight Center and ScienceIQ.com
'Visual magnitude' is a scale used by astronomers to measure the brightness of a star. The term 'visual' means the brightness is being measured in the visible part of the spectrum, the part you can see with your eye (usually around 5500 angstroms). The first known catalogue of stars was made by the Greek astronomer Hipparchus in about 120 B.C. and contained 1080 stars. It was later edited by Ptolemy into a famous catalogue of 1022 stars known as the 'Almagest'. Hipparchus listed the stars that could be seen in each constellation, described their positions, and rated their brightness on a scale of 1 to 6, the brightest being 1. This method of describing the brightness of a star survives today. Of course, Hipparchus had no telescope, and so could only see stars as dim as 6th magnitude, but today we can see stars with ground-based telescopes down to about 22nd magnitude.
When astronomers began to accurately measure the brightness of stars using instruments, it was found that each magnitude is about 2.5 times brighter than the next greater (fainter) magnitude; the exact factor is the fifth root of 100, about 2.512. This means a difference of 5 magnitudes (from magnitude 1 to magnitude 6, for example) corresponds to a change in brightness of exactly 100 times. With equipment to make more accurate measurements, astronomers were able to assign stars decimal values, like 2.75, rather than rounding off to magnitude 2 or 3. There are also stars brighter than magnitude 1. The star Vega (alpha Lyrae) has a visual magnitude of 0, and the few stars brighter than Vega have negative magnitudes; Sirius, the brightest star in the night sky, is about magnitude -1.5.
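Put as a formula, a difference of Dm magnitudes corresponds to a brightness ratio of 100^(Dm/5). The short Python sketch below illustrates this rule; the function name and the example values are our own choices for illustration, not part of the original article.

```python
# Sketch of the magnitude-to-brightness rule described above:
# a 1-magnitude step is a factor of 100**(1/5) ~ 2.512, so 5 magnitudes is exactly 100x.

def brightness_ratio(mag_faint, mag_bright):
    """How many times brighter the mag_bright object is than the mag_faint object."""
    return 100 ** ((mag_faint - mag_bright) / 5)

if __name__ == "__main__":
    print(brightness_ratio(6, 1))      # 5 magnitudes apart -> 100.0
    print(brightness_ratio(2, 1))      # 1 magnitude apart  -> ~2.512
    print(brightness_ratio(0, -1.46))  # Vega (0) vs. Sirius (~ -1.46) -> ~3.8
```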
Astronomers usually refer to 'apparent magnitudes', that is, how bright a star appears to us here on Earth. Apparent magnitudes are often written with a lower case 'm' (like 3.24m). How bright a star appears depends not only on how luminous it actually is, but also on how far away it is. For example, a street light appears very bright directly underneath it, but not as bright from half a mile down the road. Therefore, astronomers developed the 'absolute' magnitude scale. Absolute magnitude is defined as how bright a star would appear if it were exactly 10 parsecs (about 33 light years) away from Earth. For example, the Sun has an apparent magnitude of -26.7 (because it's very, very close) but an absolute magnitude of only +4.8. Absolute magnitudes are often written with a capital (upper case) 'M'.
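For readers who want the arithmetic, absolute magnitude M follows from apparent magnitude m and distance d (in parsecs) via M = m - 5 log10(d / 10). The sketch below is illustrative only; the function name and the value used for the Sun's distance (1 AU expressed in parsecs) are our assumptions, not taken from the article.

```python
# Convert apparent magnitude to absolute magnitude using M = m - 5*log10(d / 10),
# i.e. the magnitude the object would have at the standard distance of 10 parsecs.

import math

def absolute_magnitude(apparent_mag, distance_parsecs):
    """Absolute magnitude M from apparent magnitude m and distance in parsecs."""
    return apparent_mag - 5 * math.log10(distance_parsecs / 10)

if __name__ == "__main__":
    # The Sun: apparent magnitude -26.7 at 1 AU, which is about 4.848e-6 parsecs.
    print(round(absolute_magnitude(-26.7, 4.848e-6), 1))  # ~ +4.9, close to the +4.8 quoted above
```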