Here is a worked example of the noise behavior when doing a raw stack of RGB versus RGB combined with L. This is about the same as comparing OSC to LRGB - assuming OSC gathers the same data in three 1-minute exposures as three 1-minute exposures of R, G, and B. That isn't exactly right because OSC has twice as many green pixels - but it's close.
A key point in studying noise behavior in different scenarios is to have a well defined signal and a well defined noise model. For RGB vs. LRGB the main driver is total exposure time - and the main reason for that is to reduce the impact of Poisson, or shot, noise. So in this study I assume the only noise is shot noise and that it is well behaved. I also assume the colors are well calibrated and that the L filter has the net response of the sum of R, G, and B. Even if these assumptions aren't exactly true, relaxing them would change the results only slightly - through a different noise model.
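This noise model can be sketched in a few lines. The structure (pure Poisson shot noise, L response equal to the sum of the R, G, B responses) comes from the assumptions above; the specific arrival rates and the helper name are hypothetical, chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def expose(rate_e_per_min, minutes, rng=rng):
    """Simulate one exposure: the detected electron count is a Poisson
    draw with mean (arrival rate) x (exposure time)."""
    return rng.poisson(rate_e_per_min * minutes)

# Hypothetical true arrival rates for one pixel (electrons per minute).
r_rate, g_rate, b_rate = 50.0, 30.0, 10.0

# The L filter is assumed to pass the sum of the R, G, B responses,
# so its arrival rate is the sum of the channel rates.
l_rate = r_rate + g_rate + b_rate

sample = expose(l_rate, 1)  # one simulated 1-minute L exposure
```

With this model the only free choices are the per-channel rates and the exposure times; everything else follows from Poisson statistics.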
The first thing I plot is a single pixel and its values after 1-minute exposures in R, G, and B. I calculate the L value as the sum of the three channel values.
With such a short exposure you can't be sure what the color or luminance is. A pixel may appear red, but with more exposure time you may find it is actually blue.
Also note that the signal here ultimately corresponds to an electron arrival rate in electrons per minute.
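The single-pixel experiment described above can be sketched as follows. The true arrival rates are hypothetical; the point is only the structure: one 1-minute Poisson draw per channel, with L formed as the sum of the three.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical true arrival rates for one pixel, in electrons per minute.
true_rates = {"R": 60.0, "G": 40.0, "B": 20.0}

# One simulated 1-minute exposure per channel: Poisson shot noise only.
counts = {ch: rng.poisson(rate * 1.0) for ch, rate in true_rates.items()}

# L is calculated as the sum of the three channel values.
counts["L"] = counts["R"] + counts["G"] + counts["B"]
```

Re-running this with a different seed gives noticeably different R, G, B values - which is exactly the point about not being able to trust the apparent color of a short exposure.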
The error bars on each bar correspond to the shot noise in that channel - and as you can see the noise is relatively large, so the SNR in each channel is not great. Given the spread in the R, G, B values the pixel looks strong in red and weak in blue - but you can't be sure.
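Those error bars follow directly from Poisson statistics: a channel that detects N electrons has shot noise sqrt(N), so its SNR is N / sqrt(N) = sqrt(N). A quick check, using hypothetical 1-minute counts:

```python
import numpy as np

# Hypothetical detected counts for one pixel after 1 minute per channel.
counts = {"R": 60, "G": 40, "B": 20}

for ch, n in counts.items():
    noise = np.sqrt(n)   # shot-noise error bar on this channel
    snr = n / noise      # equals sqrt(n) for pure Poisson noise
    print(f"{ch}: {n} e-, noise {noise:.1f}, SNR {snr:.1f}")
```

So even the brightest channel here has an SNR below 8, and with scatter of order sqrt(N) on each bar the apparent ordering of the channels can change as more exposure time accumulates.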
This is with a total of 3 minutes of exposure time - 1 minute in each channel.