Suppose we have two stars of apparent magnitudes m_{1} and
m_{2} and absolute magnitudes M_{1} and M_{2}
that are located at distances d_{1} and d_{2} from us.
Remember the two equations that relate their brightnesses, luminosities and
magnitudes:

m_{2} - m_{1} = 2.5 log (b_{1}/b_{2})

M_{2} - M_{1} = 2.5 log (L_{1}/L_{2})
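The first of these relations can be sketched in a few lines of Python (the function name is illustrative, not part of the original notes):

```python
import math

def magnitude_difference(b1, b2):
    """Apparent-magnitude difference m2 - m1 given the brightness ratio b1/b2,
    from m2 - m1 = 2.5 log (b1/b2)."""
    return 2.5 * math.log10(b1 / b2)

# A star 100 times brighter than another has an apparent magnitude
# 5 smaller (i.e., it is 5 magnitudes brighter):
print(magnitude_difference(100, 1))  # 5.0
```

The same function works for the absolute-magnitude equation if the brightness ratio is replaced by the luminosity ratio L_1/L_2.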

The most luminous stars have luminosities of about 100,000 times that of the
Sun. Therefore, we say that their luminosity is 10^{5} L_{sun}.
We can compute their absolute magnitude using:

M_{sun} - M_{brightest} = 2.5 log (10^{5} L_{sun}/1 L_{sun}) = 2.5 log (10^{5}) = 2.5 x 5 = 12.5

M_{brightest} = +4.77 - 12.5 = -7.73

The faintest stars have a luminosity of 0.01 times that of the
Sun. Therefore, we say that their luminosity is 0.01 L_{sun}.
We can compute their absolute magnitude using:

M_{sun} - M_{faintest} = 2.5 log (0.01 L_{sun}/1 L_{sun}) = 2.5 log (0.01) = 2.5 x -2 = -5

M_{faintest} = +4.77 + 5 = +9.77
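Both of these calculations follow the same pattern, so they can be checked with a small Python helper (assuming M_sun = +4.77, as above):

```python
import math

M_SUN = 4.77  # absolute magnitude of the Sun

def absolute_magnitude(L_over_Lsun):
    """Absolute magnitude of a star with luminosity L_over_Lsun (in solar
    units), from M_sun - M_star = 2.5 log (L_star / L_sun)."""
    return M_SUN - 2.5 * math.log10(L_over_Lsun)

print(absolute_magnitude(1e5))   # most luminous stars: -7.73
print(absolute_magnitude(0.01))  # faintest stars: +9.77
```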

How bright would the Sun be if it were at the distance of Alpha Centauri?

The Sun's apparent magnitude is -26.8. Its absolute magnitude is +4.77.

The distance to Alpha Centauri is 1.3 pc.

Therefore, the apparent magnitude that the Sun would have,
*if it were located at 1.3 pc* is:

m_{sun at 1.3 pc} = M_{sun} -5 + 5 log d = 4.77 - 5 + 5 log(1.3)

m_{sun at 1.3 pc} = 4.77 - 5 + 0.57 = +0.34

so that its apparent magnitude would be +0.34, instead of -26.8!
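The distance-modulus step above can be reproduced directly (the function name is illustrative):

```python
import math

def apparent_magnitude(M, d_pc):
    """Apparent magnitude of a star of absolute magnitude M seen from a
    distance of d_pc parsecs, using m = M - 5 + 5 log d."""
    return M - 5 + 5 * math.log10(d_pc)

# The Sun (M = +4.77) placed at Alpha Centauri's distance of 1.3 pc:
print(round(apparent_magnitude(4.77, 1.3), 2))  # +0.34
```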

How far away could we detect a star like the Sun with the Hubble Space Telescope?

With HST, it is possible to see Cepheids in the galaxy M100 which have apparent magnitudes of +26. Let us adopt this as the limit of apparent brightness that HST can "see". Therefore, our problem reduces to: at what distance would the Sun have an apparent magnitude of +26?

M_{sun} = m_{HST limit} + 5 - 5 log d

5 log d = m_{HST limit} + 5 - M_{sun} =
+26 + 5 - 4.77 = 26.23

log d = 26.23/5 = 5.25

d = 10^{5.25} parsecs = 176,000 parsecs = 176 kiloparsecs (kpc).
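Inverting the distance modulus this way is easy to script; a minimal sketch, assuming the +26 HST limit adopted above:

```python
import math

def limiting_distance(M, m_limit):
    """Distance in parsecs at which a star of absolute magnitude M fades to
    apparent magnitude m_limit, from 5 log d = m_limit + 5 - M."""
    return 10 ** ((m_limit + 5 - M) / 5)

d = limiting_distance(4.77, 26)  # a Sun-like star at HST's +26 limit
print(round(d / 1000))  # about 176 kpc
```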

Note that 176 kpc is about 3.5 times the distance to the Large Magellanic Cloud.