Oh no, that doesn't work. As an example, Google images of M27. You won't get any images of M27; you'll get thousands of examples of what other imagers processed their images of M27 to look like, some with the most imaginative color schemes.
When you take your own images it is a wonderful and enlightening experience to see the source data before it is processed beyond all recognition. For me, imaging proceeds in three distinct steps. First, I acquire, stack, and calibrate my source images. This unprocessed source image actually looks a _lot_ like what I'll see through the telescope, maybe a bit brighter, but with all of the light pollution and low contrast that I get at the eyepiece. If I'm imaging from a dark site I get exactly the same experience, just minus the light pollution scum. Second, I'll go easy on the processing to remove the light pollution, giving an image that looks a lot like what I would see from a dark site. Both of these images make the best darned finder charts ever! They show me exactly where to look and what to look for. Finally, I'll push the edge of processing to pull as much out of the data as I can. This lets me put all of the available data in context, although in reality these objects _never_ look this way. Biologically, our vision just doesn't work that way.
Interestingly, one thing that imaging has taught me is that no image, even those taken with the Hubble Space Telescope, can capture the subtle beauty of the real thing. These images can be truly beautiful in their own way, and I love processing my images to pull up faint details and colors that I will never see, but I have learned to appreciate and dearly love that the beauty of the real thing is in its subtlety, not brash eye-burning colors and impossible contrast. However, beauty is in the eye of the beholder.
To each their own path. There is no One True Way.