The lower QE of film is not a fundamental limit of the technology, but a question of ease of manufacture and shelf life.
One could easily imagine film made with more uniformly and evenly distributed crystals, plus one or more of the various kinds of doping and hypering.
The trouble is that the film has to sit on the shelf for months or even years.
Imagine how much thermal and cosmic-ray noise a sensor would pick up from just a week of continuous integration.
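To put rough numbers on that, here is a back-of-the-envelope sketch. The dark-current and full-well figures are illustrative assumptions, not measured values for any particular sensor:

```python
import math

# Assumed dark current for an uncooled consumer sensor at room
# temperature; real values vary wildly with temperature and design.
dark_current_e_per_px_per_s = 0.5   # electrons / pixel / second (assumption)
full_well_e = 30_000                # assumed full-well capacity (electrons)

week_s = 7 * 24 * 3600
accumulated_e = dark_current_e_per_px_per_s * week_s
# Shot noise on the dark signal grows as its square root.
dark_noise_e = math.sqrt(accumulated_e)

print(f"dark signal after a week: {accumulated_e:,.0f} e-/pixel")
print(f"dark shot noise: {dark_noise_e:,.0f} e- (full well: {full_well_e:,} e-)")
```

With these assumed numbers the pixel saturates from dark current alone long before the week is out, which is the point: film has to tolerate months of that kind of integration on the shelf.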
There are various things we can do to approximate the ideal film described above.
Combining H2 hypering, flashing, multiple exposures, compensating development, and impeccable scanning that picks up the faintest shadow detail in the film can bring you several stops up in sensitivity.
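As a rough illustration of how those gains stack, stops add and speed doubles per stop. The per-technique figures below are hypothetical, chosen only to show the arithmetic; real gains depend heavily on the film and the process:

```python
base_iso = 100  # hypothetical box speed

# Hypothetical per-technique gains, in stops (assumptions for illustration).
gains_stops = {
    "H2 hypering": 1.5,
    "pre-flashing": 0.5,
    "compensating development": 1.0,
    "careful shadow-detail scanning": 0.5,
}

total_stops = sum(gains_stops.values())      # stops add
effective_iso = base_iso * 2 ** total_stops  # speed doubles per stop

print(f"total gain: {total_stops} stops")
print(f"effective speed: ISO {effective_iso:.0f}")
```

Even modest per-technique gains compound quickly: the hypothetical 3.5 stops above would take an ISO 100 film to an effective speed of roughly ISO 1100.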
The QE of common camera sensors, on the other hand, is over-advertised.
Often you’ll hear numbers from huge specialty scientific monochrome, cryo-cooled sensors quoted as universal truths for all sensors.
Sensors and infrastructure that are well out of reach of even a very rich private person.
The real QE of most camera sensors is a trade secret.
But it is empirically observable to be far below the often-quoted numbers ranging from 40% to 90%.
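For context, the gap between two QE figures can be expressed in stops as the base-2 log of their ratio. Both QE values below are assumptions for illustration, not measurements:

```python
import math

qe_film = 0.02    # ~2%, a commonly cited ballpark for conventional film
qe_sensor = 0.50  # assumed realistic sensor QE, below the advertised peaks

stops_gap = math.log2(qe_sensor / qe_film)
print(f"QE gap: {stops_gap:.1f} stops")
```

So even a conservative estimate of sensor QE leaves a gap of several stops, which is exactly the gap the stacked film techniques mentioned earlier are trying to close.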
Also, it is worth remembering that film scanning, unless you have caught on to the recent trend of macro camera scanning, has been absolutely terrible for decades when held up against what we know (but most have selectively forgotten) film is capable of with regard to resolution and dynamic range.
Well-lit 135 film will in most regards absolutely kill *any* sensor of the same size.
Scanning and the erosion of printing knowledge are the problem.
Not film as a medium.
Edited by Helge F., 03 December 2020 - 06:24 AM.