"Bjørn, With your very significant training in science and it's methods this discussion must be trying."
Not at all. Entertaining, rather. There is genuine disagreement present, for example about the applicability of the theory or the assumptions invoked, which is nothing unheard of in scientific circles, but perhaps less easy to deal with on a 'net forum.
I have many decades of practical (and scientific) photography experience without ever once thinking along the lines of the suggested theory. I have used all kinds of photographic tools and formats, from the tiniest imaginable to 8x10", and never needed such concepts. One simply does not take the "equivalent" picture with the different gear; it's that simple. A knowledgeable photographer knows the limitations and constraints of her or his tools and works accordingly; sometimes the perceived limits are transgressed, but then mainly for artistic reasons, which is also an integral part of photography, although one less capable of being put into a theoretical framework.
You are basically saying that because you have so much experience, the concept is redundant.
But what if one does not have that background? Does a photographer today have to start with 8x10" and film to get proficient?
Is there a way to compress all that information that you gathered in decades about what to expect from which format, into something that can be learned in a few hours?
Is it so complicated that you need a lifetime of learning, or is it fairly straightforward such that you can quickly learn it, and then focus on other (more important) stuff?
Are there things that were relevant in the analog era, but are no longer relevant today, and therefore don't have to be dragged along?
It is precisely because most photographers of the digital age do not have the background you have that new concepts are needed.
Forgetting all previous baggage can sometimes enable a new look at the situation and newer, simpler descriptions.
Also, the fact that today most images are viewed on digital screens has led to new problems.
One of them is that people look at images at 100% and wonder about noise.
Another is that some people think that all 10MP images (to give an example) are the same, irrespective of the size of the sensor (you sometimes see this among laypeople).
A digital image has lost its physical dimension; it is just a bunch of pixels, which means the secondary magnification to the final viewing size is easily forgotten.
Why would you then not expect the same image quality from a 10MP smartphone sensor vs. a 10MP DSLR on a big print?
This is very different from holding a piece of negative and putting it on the enlarger, thinking about which enlarger lens to use, in which case you are acutely aware of the process that leads to the final result.
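To make that secondary magnification concrete, here is a rough sketch. The sensor widths and print size are assumed example values of my own, not figures from this discussion; the point is only how differently the two sensors must be enlarged to reach the same print.

```python
# Rough illustration of "secondary magnification": how much each sensor
# must be enlarged to fill the same print width.
# Sensor widths in mm are typical assumed values, not from this thread.
SENSOR_WIDTH_MM = {
    'smartphone (1/2.3")': 6.17,
    "full-frame DSLR": 36.0,
}

PRINT_WIDTH_MM = 400  # a "big print", roughly 40 cm wide

for name, width in SENSOR_WIDTH_MM.items():
    magnification = PRINT_WIDTH_MM / width
    print(f"{name}: {magnification:.0f}x enlargement")
# smartphone: ~65x, full-frame DSLR: ~11x
```

The small sensor is enlarged roughly six times more, so any noise or blur on it is magnified correspondingly, even at identical pixel counts.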
New processes require new theoretical concepts, because the old way of doing things will soon be completely forgotten.
For me, the concepts from the analog era are not very natural because I did not grow up using them, even though I can understand them.
I understand that people had to spend hours in the darkroom for what can be achieved today by a few clicks.
At the same time, I do not believe that I always have to think in terms of the old process. I can invent new concepts that more compactly describe what I need to know in order to reach a certain goal.
I tend to think in terms of digital signal processing and try to simplify the imaging chain to the point where things that do not matter do not show up in the analysis. This is what is natural to me and what I spend my whole working day on anyway. So an assumption that introduces invariance (or reduces degrees of freedom, as you put it) is actually something I embrace, because it makes my life simpler and allows me to focus on the other variables.
The fact that the entire development of digital systems was built upon ideas from information theory suggests that those same ideas should also facilitate a simpler understanding of digital systems, even though the whole thing could probably also be analysed using the concepts from the analog era.
If I know how to get roughly the same image from two formats (an equivalent one), then by extension I know how to take a different photograph, or I can predict how the photographs will differ before taking them.
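As a sketch of that last point, the usual crop-factor scaling can be packed into a few lines. The function name and the Micro Four Thirds example numbers are my own illustration, not anything from this thread; the rule itself is the standard one: equal framing, depth of field and total captured light follow from scaling focal length and f-number by the crop ratio and ISO by its square.

```python
def equivalent_settings(focal_mm, f_number, iso, crop_ratio):
    """Map settings on one format to roughly 'equivalent' settings on another.

    crop_ratio = (source format crop factor) / (target format crop factor).
    Focal length and f-number scale linearly with it, ISO quadratically,
    which keeps framing, depth of field and total light roughly the same.
    """
    return focal_mm * crop_ratio, f_number * crop_ratio, iso * crop_ratio ** 2

# Example: 25mm f/1.4 ISO 100 on Micro Four Thirds (crop 2.0)
# translated to full frame (crop 1.0):
print(equivalent_settings(25, 1.4, 100, 2.0 / 1.0))  # -> (50.0, 2.8, 400.0)
```

Whether the two results are then "the same photograph" is of course still up to the photographer; the mapping only compresses the bookkeeping.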