I notice that when I compare my flux-calibrated stellar spectra with reference spectra, the absorption/emission features are usually present and at the correct wavelengths, but there is usually some difference in slope between the continuum in my spectra and that of the reference spectra. I'm doing pre-calibration and background subtraction, and my instrument response curve looks normal.
What are the likely causes of this difference in continuum slopes, and does it have any significance? It seems to occur whether I use a largely automated process like Demetra or a more manual process like RSpec.
I couldn't find a reference library spectrum for an A1m IV star, so I used an A0 IV star instead; I assume the two should be similar at low resolution. I was mainly interested in how the continuum differs between the various processing software tools and the reference spectrum.
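To illustrate what I mean by the slope difference, here is a rough Python sketch of how I'd quantify it outside of Demetra/RSpec: divide the flux-calibrated spectrum by the reference and fit a straight line to the ratio. The file names and two-column text format are just placeholders for whatever your software exports.

```python
import numpy as np

# Hypothetical two-column (wavelength [A], flux) text files -- adjust to your export format.
wl_obs, flux_obs = np.loadtxt("my_star_fluxcal.dat", unpack=True)
wl_ref, flux_ref = np.loadtxt("a0iv_reference.dat", unpack=True)

# Resample the reference spectrum onto the observed wavelength grid.
flux_ref_resampled = np.interp(wl_obs, wl_ref, flux_ref)

# Ratio of the two spectra; a flat ratio means the continuum slopes agree.
ratio = flux_obs / flux_ref_resampled
ratio /= np.median(ratio)  # normalize so the comparison is shape-only

# Fit a straight line to the ratio; the slope quantifies the residual continuum tilt.
slope, intercept = np.polyfit(wl_obs, ratio, 1)
print(f"Residual continuum slope: {slope:.3e} per Angstrom")
```

A non-zero slope here is the tilt I'm describing, even though the line features themselves match up fine.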
Rick