I have seen the Milky Way with my naked eye under truly dark skies (middle of the ocean and the like), and that is 90-99% (in terms of data observed) of what I've seen in the very best images: https://capturetheat...lky-way-images/ - these images are stretched, taken with longer exposures and under conditions where the naked eye may not see as much, but I think they're mostly capturing what we can get if we keep staring at the sky for a while in a really, really dark place. I've not seen any picture of the Milky Way more impressive than what I've seen with my naked eye (more detailed, perhaps, but not more impressive). As I illustrate with the macromolecule example below, both have value.
So that's all I'm saying: as we get closer to these objects, we could see with the naked eye what we currently see of the Milky Way under the best conditions, or perhaps with very low power visual aids. I don't think the Milky Way looks faint at all. Even in my rural backyard right now, if I stare at it for an hour or so I can get about 50-70% of those views. I'm always struck by how similar what I see with my naked eye is to the Milky Way shots. It's just a lot to take in at once, and it requires sustained observing (sleeping in the backyard across many seasons), but you can see structure and colour.
I didn't say anything about a telescope bringing an object closer; that's a misunderstanding of what I meant (I meant that what we see in a telescope visually is a fair representation). I have low power magnification scopes for my cell phone and some cheap gift binoculars, in the range of about 3x to 10x. When I take a picture of something from far away, then go near it and look (without the low power aid), the two do look the same. So what I'm saying is: if we need a 500x telescope now, and only a 50x telescope once we get closer, that's fine. I understand how a telescope works, but at the same time, distance does affect the light that travels to your eye, and if you're closer you don't need those low power 3x to 10x visual aids; I can see with my naked eye what I'd otherwise need them for from far away. I've verified this since I can see the city of Toronto from my bedroom window across Lake Ontario, which is ~50 miles wide. I can see it with my naked eye on a clear day, I can see it better with the 3x and 10x scopes, and as I get closer to the city I see a very similar view but in much better detail.
Here's an example of a normal cell phone picture, without any zooming, from the edge of our property: https://photos.app.g...uT2p6BUYDTL5Nk9 (look carefully and you can see Toronto in the far distance; this is what I see with my naked eye on a clear day with my glasses). This is what it looks like using the cell phone camera's zoom: https://photos.app.g...mcS7PFLVbwSrNcA - and as you drive (or boat) to Toronto and get closer, this zoomed view is what you see, but even better.
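The 500x-versus-50x point above is really just small-angle geometry: moving 10x closer multiplies an object's apparent angular size by roughly 10, the same as 10x magnification would. A minimal sketch (the object size and distances here are toy numbers, not measurements):

```python
import math

def angular_size_deg(physical_size, distance):
    """Apparent angular size (degrees) of an object of a given
    physical size viewed from a given distance (same units)."""
    return math.degrees(2 * math.atan(physical_size / (2 * distance)))

# Toy numbers: an object 1 unit across, seen from two distances.
far = angular_size_deg(1, 500)   # from 500 units away
near = angular_size_deg(1, 50)   # from 50 units away (10x closer)

# In the small-angle regime the apparent size scales ~linearly with 1/distance,
# so getting 10x closer is equivalent to 10x magnification.
print(near / far)  # ~10
```

This is why the trade "500x telescope from here, 50x telescope from 10x closer" comes out even in terms of apparent size.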
Anyway, my point was that using visual aids like telescopes isn't stretching the data to the point of distortion. I don't think it's that different from wearing glasses, or wearing solar glasses to view a solar eclipse: in the end, the same photons are going into your eye. That said, this does get to be a slippery slope. Is EAA (Electronically Assisted Astronomy) the same as seeing what we'd see visually? I'm more flexible about what is acceptable when it comes to the purity of the "visual only" argument: I'm saying telescopes used visually are fine, but then people could ask why stop there. In EAA we're seeing processed data, whereas with a telescope we're observing the same photons.
Finally, this visual requirement is a red herring when it comes to the point about science. We don't "see" nanoscopic objects, but we routinely image them and represent them in a computer with pretty pictures. I got into protein structure modelling purely because I consider proteins to be beautiful molecules, but as I said before, what we do with these representations is a lot like what we do with AP (astrophotography) data: graphical viewers for macromolecules have tons of options and can produce pretty pictures in so many ways, and none of these are what the molecules look like visually (they're just balls of atoms), but the visual representations have their utility in addition to being pretty pictures. Go here to see what I mean: https://www.rcsb.org/
Nonetheless, these visual representations are of what we consider to be the "true" underlying data, and both the representations and the underlying data have use in science. Mostly it's the data that is processed further, but the visual representations let humans do science using their own neural networks (i.e., their brains).
How is the data we collect on an AP object different from the data collected in an X-ray diffraction experiment on a protein structure? How is processing it to produce pretty pictures different? The problem is that the data we collect in AP is wasted in terms of science: we don't have a use for it, whereas protein structure data is hard to get and has clear uses (modelling the structure, discovering drugs with it, etc.). If we could find a use for the AP data collected by the average person, like we do with proteins, then we could have the equivalent of citizen science experiments like Folding@home, rice@home (our project), or even the Foldit video game.
SETI@home was one such example, but I'm talking about finding a use for AP data collected with AP cameras of varying quality. Perhaps we could have a game like "spot the supernova" or "spot the asteroid".
Finally, I understand other people make the distinction, but in my own career and my research group, I don't distinguish between art, science, and philosophy. I've had this statement on my web page (ram.org) for decades (and I've had this argument, likewise, for decades).
As an aside, there are many arguments why this is probably not true. One analogy is a wall illuminated by a lamp, or your computer monitor: does it get brighter and brighter as you get closer to it? You can try it right now - it doesn't. As the monitor gets bigger in your field of vision, the portion of the monitor that your eye can see drops in proportion, so the brightness is constant to your eye. By the "I want to represent what we'd see from a spaceship" logic, the monitor should be blindingly bright when viewed from 1 mm away, and it's just not. Similarly, if you pull up to some nebula whose width is measured in light-years, when you get there you'll find there's nothing to see: there just isn't enough stuff making photons in your field of view anymore.
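The monitor analogy can be made quantitative: for an extended source, the received flux falls off as 1/d², but so does the solid angle the source subtends, so surface brightness (flux per unit solid angle) is independent of distance. A minimal sketch, with made-up luminosity and size values and a small-angle approximation:

```python
import math

def surface_brightness(luminosity, radius, distance):
    """Flux per unit solid angle for an extended source (small-angle approx.)."""
    flux = luminosity / (4 * math.pi * distance**2)   # falls off as 1/d^2
    solid_angle = math.pi * (radius / distance)**2    # also falls off as 1/d^2
    return flux / solid_angle                         # distance cancels out

# Same toy source seen from 100x different distances:
b_far = surface_brightness(1.0, 1.0, 1000.0)
b_near = surface_brightness(1.0, 1.0, 10.0)
print(b_far, b_near)  # identical: closing in doesn't brighten it per unit of retina
```

This is why getting closer to a nebula spreads the same per-area glow over more of your visual field rather than making any part of it brighter.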
We are inside the Milky Way, and it is only faintly visible to us. When M31 is right on top of us, it's going to look like the Milky Way does now: only visible under dark skies, and only vaguely, just a ribbon of light without a lot of structure. The brightness of the Milky Way from a dark site is nothing like any stretched image of it.
The telescope comparison does not make sense, because the telescope gathers a whole lot more light than our eyes ever could. If our pupils were somehow as big as the telescope's aperture, we'd see M31 as brightly as in the telescope all the time, right now. To think that the telescope "brings an object closer", or represents what the object would look like if you were close enough that it subtended the same angle in your vision as it does through the telescope, is to miss the point of what a telescope actually does.
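For a sense of scale, the extra light gathered goes as the square of the aperture-to-pupil diameter ratio. Assuming typical (not sourced) numbers of a 7 mm dark-adapted pupil and a 200 mm (8") telescope aperture:

```python
# Light-gathering advantage of a telescope over the naked eye.
# Assumed numbers: 7 mm dark-adapted pupil, 200 mm aperture.
pupil_mm = 7.0
aperture_mm = 200.0

# Collecting area scales with diameter squared.
light_grasp = (aperture_mm / pupil_mm) ** 2
print(f"~{light_grasp:.0f}x more light than the naked eye")  # ~816x
```

That factor of hundreds is what makes faint extended objects visible in the eyepiece, not any change in viewing distance.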