By no means do I consider myself an expert on imaging supernovae, but I have hopefully gained enough experience to produce reliable results, having imaged supernovae since 2007 and been part of a supernova search from 2010 to 2015.
Measuring the brightness of a supernova can be tricky. Many are embedded in the host galaxy and require galaxy subtraction to get an accurate result. I've seen many reports that are far too bright because of this issue. The choice of filter, if any, and the choice of comparison stars can also produce a wide range of results. CCDs being red sensitive can create issues as well, so stay away from very red or very blue comparison stars. For unfiltered, luminance, and clear images I prefer to use Johnson V comparison stars; some people prefer R. My catalog of choice is APASS. I try to find stars with B-V between 0.4 and 0.8. I've found that, compared with actual Johnson V exposures, the result is usually not more than 0.2 magnitude off.
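For what it's worth, the B-V cut I describe above can be sketched in a few lines of Python. The star list format here is made up for illustration; real APASS data would come from a catalog query:

```python
# Sketch: pick comparison stars from an APASS-style list by B-V color,
# keeping only stars in the preferred 0.4-0.8 range (avoids very red
# or very blue stars that skew unfiltered/clear measurements).
# The dict fields ("id", "B", "V") are assumptions for this example.

def select_comp_stars(stars, bv_min=0.4, bv_max=0.8):
    """Keep stars whose B-V color falls within [bv_min, bv_max]."""
    return [s for s in stars if bv_min <= (s["B"] - s["V"]) <= bv_max]

# Made-up field stars for illustration:
field = [
    {"id": "A", "B": 13.9, "V": 13.3},  # B-V = 0.6 -> keep
    {"id": "B", "B": 14.8, "V": 13.2},  # B-V = 1.6, very red -> reject
    {"id": "C", "B": 12.5, "V": 12.4},  # B-V = 0.1, very blue -> reject
]
good = select_comp_stars(field)
print([s["id"] for s in good])  # -> ['A']
```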
SN 2019ein in NGC 5353 is somewhat embedded. However, since it is near maximum right now, I don't expect the host contribution to be significant.
einarin, was your magnitude determined from Johnson V comparison stars? Was the image unfiltered or taken with a red filter? Just eyeballing your image, the supernova appears to be between 13.3 and 13.8 based on the stars in the nice little triangle between NGC 5353 and NGC 5371 to the upper left. It definitely appears brighter than the 14.3 star, which is the southern of the two stars between the NGC 5353/4 pair and NGC 5355.
As Redbetter pointed out, some supernovae do appear brighter visually than, say, the Johnson V results would suggest. In my case, part of it is that my right eye is without its lens and as a result is a bit more blue sensitive. However, some supernovae around maximum are nearly equally bright from blue to red. I suspect that being so nearly uniform in brightness across the visual range, whereas most stars are not, may be the reason supernovae appear brighter visually. Anyone else have any ideas on this?