The human race has always had a limited view of the sky, and the cosmos beyond. At first we had our own eyesight, then lenses, then telescopes, then sensors beyond the visual spectrum, and now a few satellite platforms carrying the same technologies.

As habitation of the Moon and Mars moves forward, what does this mean for our cosmological viewing capabilities?

From the Moon, there is no atmosphere to obscure our telescopic vision.

Mars' atmosphere is far less dense than Earth's, so the view of the sky from Mars is also less occluded.

So, considering that some of our most interesting observations have been made by ganging up several terrestrial telescopes and using them in unison, what further capabilities might we gain from having permanent viewing platforms on Earth, the Moon, and Mars?

Are the orbital mechanics too difficult? While it is possible to calculate the correct viewing time/location on both Earth and Mars, would the window of opportunity simply be too small to view a single point in the sky from both vantage points simultaneously?

Do we leverage more asynchronous viewing, like they did recently with the black hole photos? (Have all three viewing platforms take images of Location X, then composite the images together later?)
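For a rough sense of what "better" could mean here: in interferometry, the finest angular detail you can resolve scales roughly as λ/B, where λ is the wavelength and B is the separation between the telescopes. A back-of-envelope sketch (numbers are approximate, and this assumes we could actually correlate signals across planets, which is the hard part):

```python
import math

def resolution_microarcsec(wavelength_m, baseline_m):
    """Diffraction-limited angular resolution, theta ~ lambda / B,
    converted from radians to microarcseconds."""
    theta_rad = wavelength_m / baseline_m
    return math.degrees(theta_rad) * 3600 * 1e6

EARTH_DIAMETER_M = 1.274e7   # an Earth-sized baseline, EHT-scale
EARTH_MARS_MIN_M = 5.5e10    # Earth-Mars distance at closest approach
WAVELENGTH_M = 1.3e-3        # 1.3 mm, the band the black hole image used

print(resolution_microarcsec(WAVELENGTH_M, EARTH_DIAMETER_M))  # ~21 microarcsec
print(resolution_microarcsec(WAVELENGTH_M, EARTH_MARS_MIN_M))  # ~0.005 microarcsec
```

So on paper, an Earth-to-Mars baseline buys you thousands of times finer resolution than an Earth-sized array. Whether you can ever synchronize and correlate the recordings across that distance is a separate question.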

It seems like this "tripod for cosmic imaging" idea would allow us "better" views of the galaxy... but I'm not smart enough to work out what "better" means, or what additional capabilities might be opened up when we have viewing platforms on multiple planetary bodies.

Anyone wanna try and think this through with me?

kleinbl00:

francopoli to back me up 'cuz that d00d has forgotten more about telescopes than I'll ever know and am_Unition because he does space observations for a living and stuff but

ganging scopes up optically doesn't gain you that much. I mean it sort of does, but at that point you're basically getting non-visual data at optical wavelengths; it's not like you get a better photograph out of it.

Deformable mirrors and laser guide stars have basically allowed an entire generation of telescopes to stay relevant. You can virtually eliminate atmospheric perturbation with one. I went deep down the rabbit hole on this one because I figured if I was able to cobble my own MEMS array I could spy on spy satellites and shit which would be SUPER COOOOL but then I got to talkin' with the guys at Palomar and when you're talking "guide star" and "atmosphere" you're no longer talking "eBay lasers" you're talking "notify the FAA when you turn it on lasers" and that was about the time I realized that optics is kind of a settled science and that the engineers are applying materials science and state-of-the-art processing and manufacturing techniques faster than you really hear about as an amateur.

I mean obviously big damn mirror in outer space is always going to win. People have been suggesting telescopes on the far side of the moon for 60 years or more. When you say "things are going to be super-great if we can ever put a 20-foot mirror out there at like, Lagrange 2 or some shit so that the thing is completely isolated" the astronomers will say "it already has a Twitter account." And when you say "asynchronous viewing" they say "you mean like when we look at something in December and then again in June so that our light-gathering is 2AU wide?"
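That December/June trick is stellar parallax: the 2 AU shift in vantage point makes nearby stars appear to wobble against the background, and the wobble angle gives you distance directly. A quick illustration (the 0.768 arcsec figure is Proxima Centauri's measured parallax; the 1.52 AU figure is Mars' orbital radius):

```python
def distance_parsecs(parallax_arcsec):
    """Distance from annual parallax: 1 parsec is defined as the distance
    at which a 1 AU baseline subtends 1 arcsecond."""
    return 1.0 / parallax_arcsec

def parallax_arcsec(distance_pc, baseline_au=1.0):
    """Parallax angle for a star at distance_pc, from an orbit of baseline_au."""
    return baseline_au / distance_pc

print(distance_parsecs(0.768))       # ~1.30 pc -- Proxima Centauri
print(parallax_arcsec(100))          # 0.01 arcsec, seen from Earth's orbit
print(parallax_arcsec(100, 1.52))    # ~0.015 arcsec, seen from Mars' orbit
```

One concrete win from a Mars-based observatory: its wider orbit gives a ~1.5x bigger parallax baseline, so the same angular precision reaches correspondingly more distant stars.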

The black hole "photo" wasn't a triumph of imaging, it was a triumph of DSP. The signal they were working with wasn't anything new, really, it was the gob-smacking amount of post-processing necessary to cull the noise. From an imaging standpoint, the JWST is gonna be sumpin' else.
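To make the "triumph of DSP" point concrete, here's a toy version of the core correlation step (nothing like the actual EHT pipeline, just the basic move): two stations record the same sky signal buried in heavy independent noise, and cross-correlation digs out the delay between them, which is the geometric quantity VLBI correlators solve for.

```python
import numpy as np

rng = np.random.default_rng(42)

# A common "sky signal", recorded at two stations with an
# unknown sample delay, each drowned in its own receiver noise.
n = 4096
true_delay = 37
signal = rng.normal(size=n + true_delay)

station_a = signal[:n] + 2.0 * rng.normal(size=n)
station_b = signal[true_delay:true_delay + n] + 2.0 * rng.normal(size=n)

# Circular cross-correlation via FFT; the peak lag recovers the
# station-to-station delay even though each recording looks like noise.
spectrum = np.fft.fft(station_a) * np.conj(np.fft.fft(station_b))
xcorr = np.fft.ifft(spectrum).real
recovered = int(np.argmax(xcorr))
print(recovered)  # 37
```

The per-sample signal-to-noise here is well below 1, yet integrating over a few thousand samples makes the correlation peak unambiguous. Scale that idea up to petabytes of recordings and you get a sense of where the black hole image's processing budget went.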

