It's highly doubtful you could see any normal light source on the surface of the earth.
Using $$\text{brightness} = \frac{\text{luminosity}}{4 \pi \times \text{distance}^2}$$
(with brightness in watts per square meter, luminosity in watts, and a distance to the moon of $3.84 \times 10^8$ meters):
Try a hypothetical light source with 100 megawatts of output, all visible light, no heat.
$$\text{brightness} = \frac{100 \times 10^6}{4 \pi \times 1.474 \times 10^{17}}$$
$$\text{brightness} = 5.4 \times 10^{-11} \text{ watts per square meter}$$ at the lunar surface.
That's pretty dim. By contrast, sunlight arriving just above earth's atmosphere runs about 1,360 watts per square meter.
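If you want to plug in your own numbers, here's the same inverse-square estimate as a quick Python sketch (the variable names are just for illustration):

```python
import math

# Hypothetical 100 MW source, all visible light, radiating in all directions.
luminosity = 100e6        # watts
distance = 3.84e8         # mean earth-moon distance, meters

# Inverse-square law: the power spreads over a sphere of radius = distance.
brightness = luminosity / (4 * math.pi * distance**2)
print(f"{brightness:.1e} W/m^2")   # ~5.4e-11
```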
In reality, it'd take about a gigawatt of electrical power to produce 100 megawatts of light in the visible range.
That's about what it takes to power a city of a million homes.
Cities also bounce most of the light they do produce off the ground, which'll have an albedo of somewhere around 0.3. So with ordinary city lights it'll take over 3 gigawatts to reach $5.4 \times 10^{-11}$ watts per square meter on the lunar surface.
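Running the estimate backwards with the rough assumptions above (about 10% luminous efficiency and a 0.3 albedo for the ground bounce), the 3-gigawatt figure checks out:

```python
import math

target = 5.4e-11          # W/m^2 wanted at the lunar surface
distance = 3.84e8         # meters

visible_light = target * 4 * math.pi * distance**2   # ~100 MW of visible light needed
electrical = visible_light / 0.10                     # ~1 GW at roughly 10% efficiency
city_power = electrical / 0.3                         # only ~30% bounces skyward
print(f"{city_power / 1e9:.1f} GW")                   # ~3.3 GW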
You might fare better with a big laser. The Apache Point Observatory Lunar Laser-ranging Operation picks up multi-photon return signals from the Apollo retroreflectors using only a 1 gigawatt laser and a 3.5 meter telescope. As the article states, the laser beam expands to only 9.3 miles in diameter on the way to the moon, so you might see it wink at you.
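For a rough sense of scale, take the quoted figures at face value: 1 gigawatt (presumably peak pulse power rather than continuous output, since the laser is pulsed) spread over a spot 9.3 miles across. The irradiance during a pulse works out to a few watts per square meter, enormously brighter than the city-lights case:

```python
import math

# Figures quoted above: ~1 GW spread over a spot 9.3 miles in diameter
# by the time the beam reaches the moon.
power = 1e9                          # watts
spot_diameter = 9.3 * 1609.34        # 9.3 miles in meters
spot_area = math.pi * (spot_diameter / 2)**2

print(f"{power / spot_area:.1f} W/m^2")   # ~5.7 W/m^2 during a pulse
```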
At 10 parsecs, the sun has a magnitude of 4.83; it'd be visible on an average night. That magnitude corresponds to a brightness of $3 \times 10^{-10}$ watts per square meter, about 5.6-fold brighter than our hypothetical earth-based light source. That puts our light at magnitude 6.5 to 7, while naked-eye visibility runs to about 6.0.
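That last step is just the standard magnitude relation (2.5 magnitudes per factor of 10 in brightness); a quick check with the figures above:

```python
import math

sun_magnitude_10pc = 4.83             # sun's magnitude at 10 parsecs
brightness_ratio = 3e-10 / 5.4e-11    # 10 pc sun vs. our hypothetical source, ~5.6

# Each factor of ~2.512 in brightness is one magnitude step.
magnitude = sun_magnitude_10pc + 2.5 * math.log10(brightness_ratio)
print(f"{magnitude:.1f}")             # ~6.7, just past the ~6.0 naked-eye limit
```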