I'm planning on writing a post soon about my recent work on variable stars near the Galactic center, but I've come to the realization that such a post will invariably detour into a healthy rant about the magnitude system astronomers use to describe how bright objects are. So, in the interest of keeping this forthcoming post more on topic, I'll rant about (er, explain) magnitudes now. (Yes, two science-y posts in rapid succession; please contain your excitement.)
When astronomers talk about how bright something is, we use "magnitudes" (which are just plain old numbers from, say, -30 to 30) instead of fluxes or luminosities, which are ungainly things with units, like 1400 watts/m² or 10³⁹ erg/s or 10³² watts. The main reason magnitudes are nice is that they are logarithmic; small numbers are easier to work with and think about than big numbers. Logarithmic just means "3" instead of "1000" or "17" instead of "100000000000000000." The brightness of astronomical objects varies a whole lot, so logarithmic is nice.

Except that the first weirdness of the magnitude system is that instead of saying magnitude = log(flux), like sensible people would do, astronomers instead say magnitude = -2.5 log(flux) + constant. The constant, whatever, but -2.5?? Seriously? This is the kind of crap that keeps physicists at bay. First of all, the "2.5" means that "an order of magnitude" change in flux (i.e., a factor of 10) is actually a change of 2.5 in magnitudes—or, rather, a change of brightness by one magnitude does not mean that the brightness changed by an order of magnitude (it changed by a factor of about 2.5). I can almost get past that nomenclature mishap (mostly because there are much, much worse problems elsewhere in astronomy) by reasoning that we'd have to use more digits if the 2.5 wasn't there; it's similar to how a "degree" in the Celsius and Fahrenheit systems doesn't refer to the same difference in temperature.
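If you'd rather see the arithmetic spelled out, here's a quick sketch in Python of how the -2.5 log plays out. The zero-point flux of 1.0 is made up purely for illustration; real magnitude systems tie it to a reference like Vega or an AB zero point.

```python
import math

# Hypothetical zero-point flux, chosen as 1.0 just for illustration.
F0 = 1.0

def flux_to_mag(flux, f0=F0):
    """m = -2.5 * log10(flux / f0)"""
    return -2.5 * math.log10(flux / f0)

print(flux_to_mag(1.0))    #  0.0
print(flux_to_mag(10.0))   # -2.5  -> a factor of 10 in flux is 2.5 magnitudes
print(flux_to_mag(100.0))  # -5.0  -> and brighter means *more negative*

# Going the other way: a difference of 1 magnitude is a flux ratio of 10**0.4
print(10 ** 0.4)           # ~2.512
```

Note the signs in that output; more on that in a second.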
But the negative sign. Let's discuss the fact that there is a negative sign in that there equation. This means that a magnitude 10 star is much fainter than a magnitude 5 star. Fainter. As in, less light, because it's got a higher magnitude. ?!@#~!?!?$%#??!! Who the hell ever thought that was a good idea? Actually, it was the Greeks, and if this were a real lesson rather than a thinly-veiled rant, I'd probably explain it to you rather than just handing out a pointer to the Wikipedia article on apparent magnitude. Of course it was the Greeks, what we in the business refer to as a "historical artifact." Historical artifact my ass. When you get into astronomy, they laud it as this amazingly wonderfully rich subject in part because it's the oldest blahblahblah, but what they fail to mention is that this really means we aren't just carrying around the baggage of decades of people naming and classifying things before they knew what the hell they were; some of the stuff in our closet, nay, our very foundation, is the result of a bunch of guys who died thousands of years ago (and probably wore bedsheets when they were alive in the first place).