Brightness, luminosity and the magnitude scale

In about 120 B.C., Hipparchus devised the system for quantifying the brightness of stars that is still in use today. He catalogued about 1000 stars visible to the unaided eye and sorted them into six categories according to their apparent brightness: the brightest stars were assigned an apparent magnitude of 1, and the faintest visible stars an apparent magnitude of 6.

In 1856, the British astronomer N.R. Pogson noted that the eye perceives differences in brightness logarithmically, and he formalized the scale so that a difference of 5 magnitudes corresponds to a brightness ratio of exactly 100. Each magnitude class is therefore about 2.5 times brighter than the next fainter one (a factor of 100^(1/5), or about 2.512, per magnitude).
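
As an illustration, here is a minimal Python sketch of this relation (the function name is ours; it simply encodes the standard convention that 5 magnitudes correspond to a factor of 100 in brightness):

    def brightness_ratio(m1, m2):
        # How many times brighter a star of magnitude m1 appears than one
        # of magnitude m2 (smaller magnitude means brighter).
        return 100 ** ((m2 - m1) / 5)

    print(brightness_ratio(1, 2))   # adjacent classes: ~2.512
    print(brightness_ratio(1, 6))   # 1st vs 6th magnitude: ~100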

The luminosity of a star is its intrinsic brightness, that is, the amount of energy that the star radiates per second in all directions. It is also referred to as absolute brightness or, when expressed on the magnitude scale, absolute magnitude.

The apparent brightness of a star is how bright it appears to an observer. Technically, it is the amount of energy from the star that strikes a square centimeter of a detector (the retina of the eye, a photographic plate, a charge-coupled device or CCD) per second. On the magnitude scale it is expressed as the apparent magnitude.
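
The two definitions above imply how luminosity and apparent brightness are related: the energy a star radiates spreads over a sphere whose radius is the star's distance, so the apparent brightness falls off as the square of the distance. A rough Python sketch of that relation (the function name, the use of SI units, and the approximate numbers for the Sun are our own illustrative choices, not taken from the text above):

    import math

    def apparent_brightness(luminosity, distance):
        # Energy per second per unit area reaching a detector at the given
        # distance, assuming the star radiates equally in all directions:
        # b = L / (4 * pi * d**2).
        return luminosity / (4 * math.pi * distance ** 2)

    # Example: the Sun (L ~ 3.8e26 W) seen from 1 AU (~1.5e11 m)
    print(apparent_brightness(3.8e26, 1.5e11))   # ~1.3e3 W per square meter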
