Magnitude
The Greek astronomer Hipparchus (190-120 BC) included comparative brightness in a catalog of 850 stars. The brightest stars were magnitude 1, the next brightest 2, and so on down to the faintest stars just visible to the unaided eye, which were magnitude 6. The development of visual photometers by John Herschel and others in the 1800s allowed astronomers to measure stellar intensities, and an international standard was soon required. It was found in the 1830s that the eye detects differences in intensity in a nonlinear, logarithmic way. In 1856 Norman Pogson suggested that a star of magnitude 1 be defined as appearing 100 times brighter to the average eye than a star of magnitude 6. With this definition, a one-point increase in visual magnitude corresponds to a 2.512-fold increase in intensity (2.512 is the fifth root of 100). Absolute magnitude was then defined as the apparent magnitude a star would have if placed at a distance of 10 parsecs (32.6 light years).
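The two definitions above can be put to work directly. The short sketch below assumes the standard Pogson relation (an intensity ratio of 100^((m2-m1)/5) for a magnitude difference) and the usual distance-modulus form of absolute magnitude, M = m - 5 log10(d / 10 pc); the function names and the Sun example are illustrative, not part of the original text.

    import math

    def intensity_ratio(m1, m2):
        """Intensity ratio of star 1 to star 2 from their apparent magnitudes.
        Each magnitude step is a factor of 100 ** 0.2, about 2.512."""
        return 100 ** ((m2 - m1) / 5)

    def absolute_magnitude(apparent_mag, distance_parsecs):
        """Apparent magnitude the star would have if placed at 10 parsecs."""
        return apparent_mag - 5 * math.log10(distance_parsecs / 10)

    # A magnitude-1 star is 100 times brighter than a magnitude-6 star:
    print(intensity_ratio(1, 6))   # 100.0
    # One magnitude step is roughly a 2.512-fold change in intensity:
    print(intensity_ratio(1, 2))   # 2.5118...
    # Example: the Sun, apparent magnitude about -26.74 at roughly 4.848e-6 parsecs,
    # comes out near the commonly quoted absolute magnitude of +4.83:
    print(absolute_magnitude(-26.74, 4.848e-6))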