AAS 199th meeting, Washington, DC, January 2002
Session 40. HAD III: Some Controversies in the History of Astronomy
Special Session Oral, Monday, January 7, 2002, 2:00-3:30pm, State

[40.02] Should Astronomy Abolish Magnitudes?

K. Brecher (Boston U.)

Astronomy is riddled with anachronistic and counterintuitive practices. Among these are: plotting increasing stellar temperature from right to left in the H-R diagram; giving the distances to remote astronomical objects in parsecs; and reporting the brightness of astronomical objects in magnitudes. Historical accident and observational technique, respectively, are the bases for the first two practices, and they will undoubtedly persist in the future. However, the use of magnitudes is especially egregious when essentially linear optical detectors like CCDs are used to measure brightness, with the results then reported on a logarithmic (base 2.512…!) scale. The use of magnitudes has its origin in three historical artifacts: Ptolemy's method of reporting the brightness of stars in the "Almagest"; the 19th-century need for a photographic photometry scale; and the 19th-century studies by the psychophysicists E. H. Weber and G. T. Fechner on the response of the human eye to light. The latter work sought to uncover the relationship between the subjective response of the human eye and brain and the objective brightness of external optical stimuli. The resulting Weber-Fechner law states that this response is logarithmic: that is, that the eye essentially takes the logarithm of the incoming optical signal. However, after more than a century of perceptual studies, pursued most intensively by S. S. Stevens, it is now well established that this relation is not logarithmic. For naked-eye detection of stars from the first to the sixth magnitude, the response can be reasonably well fit by a power law with an index of about 0.3. These modern experimental studies therefore undermine the physiological basis for the use of magnitudes in astronomy. Should the historical origins of magnitudes alone be reason enough for their continued use? Probably not: astronomical magnitudes are based on outdated studies of human perception; they make little sense in an era of linear optical detection; and they provide a barrier to student and public understanding of astronomy. Perhaps it is time to consign astronomical magnitudes to the dustbin of scientific history, along with caloric, phlogiston, and the ether.
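
For reference, the scales contrasted above can be written out explicitly. The following is a minimal sketch, assuming the standard Pogson definition of the magnitude scale (not stated in the abstract itself) and the power-law index of about 0.3 quoted above; F denotes flux, I the stimulus intensity, S the perceived brightness, and k an empirical constant.

    m_1 - m_2 = -2.5 \log_{10}(F_1 / F_2)   % Pogson scale: 5 magnitudes = a factor of 100 in flux,
                                            % so 1 magnitude = 100^{1/5} \approx 2.512
    S = k \log I                            % Weber-Fechner law: logarithmic response
    S = k\, I^{0.3}                         % Stevens power law: the modern fit for naked-eye stars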

