Author Topic: Pixel-Shifting Vs. Larger Sensors  (Read 29057 times)

Les Olson

  • NG Member
  • *
  • Posts: 502
  • You ARE NikonGear
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #30 on: October 30, 2017, 10:08:55 »
The pixel-shifted colors, however you spell it, are IMO (which is all that I have) quite a bit better than the standard Bayer interpolation scheme we are used to. My original question had to do with the value of going after ever-larger sensors as opposed to getting better color/resolution of the existing sized sensors as in the A7R3. And Pixel-Shifting, to my eyes, seems to offer this.

The fact that colour is all in your head does not mean it is not real, to paraphrase Professor Dumbledore.  It just means that when you say that the colours you get from a camera with pixel shift are "better", you are responding to something about the colours.  The colours you get from a camera are no more a "true" representation of the real world than the colours you see.  The light reflected from a flower is not made up of RGB, so the colour the camera "sees" is all in its head - its image processing engine - in exactly the same way as the colour you see is all in your head. It is just not true that the colours a pixel shift camera sees are purer in the sense of being less a result of the choices made by the people who designed the image processing engine.

The colours in the camera (or a monitor or a printer) are made up of saturation, luminance and hue.  The hue is the RGB values.  That is all there is.  So when you say colours are "better" it can only be because one (or more) of those things - saturation, luminance or hue, is very slightly different.  The saturation and the luminance and the RGB values can be anything you like, so any camera can reproduce, perfectly, the output of any other camera.  So when you say your choice is between larger sensors and pixel-shift you are missing a third option: adjust the output of your current camera to match what it is you like about the colours you get from pixel shift cameras.

Of course, it may be less work just to buy the camera whose output you prefer, but what happens if you don't like the same colours for every type of scene, or if a client's taste differs from yours?  You still need to know which of the colour parameters is driving the preference so you can adjust the output to match it.  You need to work out whether you are David Attenborough or Robert Mapplethorpe, artistically speaking, because colour tastes are driven by that choice.

The same applies to resolution. You may want more resolution for the same reason as astronomers and microscopists want more resolution, but if you want more resolution to give the impression of greater sharpness you are going down the same rabbit hole as with colour accuracy. "Sharpness" is like "better colour" - it is all in your head.  However, just as the perception of better colour has a basis in particular values of luminance and saturation and so on, perceived sharpness has a basis in measureable image parameters: resolution, but also acutance, contrast and colour (red areas are perceived as less sharp, eg).  To get the result you want you have to know what is driving perceived sharpness in that image, and if you do you can get the result you want with any camera.

Michael Erlewine

  • Close-Up Photographer
  • NG Supporter
  • **
  • Posts: 2067
  • Close-Up with APO
    • Spirit Grooves
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #31 on: October 30, 2017, 10:29:49 »
The fact that colour is all in your head does not mean it is not real, to paraphrase Professor Dumbledore.  It just means that when you say that the colours you get from a camera with pixel shift are "better", you are responding to something about the colours.  The colours you get from a camera are no more a "true" representation of the real world than the colours you see.  The light reflected from a flower is not made up of RGB, so the colour the camera "sees" is all in its head - its image processing engine - in exactly the same way as the colour you see is all in your head. It is just not true that the colours a pixel shift camera sees are purer in the sense of being less a result of the choices made by the people who designed the image processing engine.

The colours in the camera (or a monitor or a printer) are made up of saturation, luminance and hue.  The hue is the RGB values.  That is all there is.  So when you say colours are "better" it can only be because one (or more) of those things - saturation, luminance or hue, is very slightly different.  The saturation and the luminance and the RGB values can be anything you like, so any camera can reproduce, perfectly, the output of any other camera.  So when you say your choice is between larger sensors and pixel-shift you are missing a third option: adjust the output of your current camera to match what it is you like about the colours you get from pixel shift cameras.

Of course, it may be less work just to buy the camera whose output you prefer, but what happens if you don't like the same colours for every type of scene, or if a client's taste differs from yours?  You still need to know which of the colour parameters is driving the preference so you can adjust the output to match it.  You need to work out whether you are David Attenborough or Robert Mapplethorpe, artistically speaking, because colour tastes are driven by that choice.

The same applies to resolution. You may want more resolution for the same reason as astronomers and microscopists want more resolution, but if you want more resolution to give the impression of greater sharpness you are going down the same rabbit hole as with colour accuracy. "Sharpness" is like "better colour" - it is all in your head.  However, just as the perception of better colour has a basis in particular values of luminance and saturation and so on, perceived sharpness has a basis in measureable image parameters: resolution, but also acutance, contrast and colour (red areas are perceived as less sharp, eg).  To get the result you want you have to know what is driving perceived sharpness in that image, and if you do you can get the result you want with any camera.

I'm not going to repeat myself. Thanks for the explanation, but I get nothing from it but a lot of words. Sorry. You are telling me stuff I already know, but still not saying anything I feel is useful. That's just me. I have been singing the song that sharpness is dependent on color for a decade at least, etc. Perhaps there are others out there who can dialog with what I wrote to better communicate with me. You want me to hear you, but I feel you never heard what i said, but just used my question as a way to go on and on.

I am having the same discussion on LuLa, but with reasonable answers.
MichaelErlewine.smugmug.com, Daily Blog at https://www.facebook.com/MichaelErlewine. main site: SpiritGrooves.net, https://www.youtube.com/user/merlewine, Founder: MacroStop.com, All-Music Guide, All-Movie Guide, Classic Posters.com, Matrix Software, DharmaGrooves.com

BW

  • NG Supporter
  • **
  • Posts: 864
  • You ARE NikonGear
    • Børge Wahl-Photography
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #32 on: October 30, 2017, 11:19:18 »
Have you tried shouting your questions into the woods? Seem like thats your only remaining option?

Ilkka Nissilä

  • NG Member
  • *
  • Posts: 1714
  • You ARE NikonGear
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #33 on: October 30, 2017, 13:30:28 »
If there are fine details which vary in colour at high spatial frequencies in the subject then the Bayer method in itself may not reproduce the color variations at the pixel level as accurately as the pixel-shift technique does.

Also simply white text on black background shows how a Bayer sensor with no AA filter creates a lot of artifacts which pixel shift seems to be able to eliminate (assuming conditions are static enough). Here is a section of a screen grab from dpreview's studio comparison.

Michael Erlewine

  • Close-Up Photographer
  • NG Supporter
  • **
  • Posts: 2067
  • Close-Up with APO
    • Spirit Grooves
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #34 on: October 30, 2017, 13:41:58 »
If there are fine details which vary in colour at high spatial frequencies in the subject then the Bayer method in itself may not reproduce the color variations at the pixel level as accurately as the pixel-shift technique does.

Also simply white text on black background shows how a Bayer sensor with no AA filter creates a lot of artifacts which pixel shift seems to be able to eliminate (assuming conditions are static enough). Here is a section of a screen grab from dpreview's studio comparison.

Thank you! This, then, is what I thought we would be talking about by now. My own eyes proved to me that the pixel-shifting in the Pentax K3 and K1 were a step finer than I could achieve with my Nikon D810. I could see the difference, but the implementation of other Pentax features (like the convenience of non-native lenses) were just not there and many of the Pentax native lenses were, well, not so good IMO. I even went to great extent to find lenses that I could use on the K1, including the Cosina Voigtlander 90mm APO Macro and ever a rare copy of the Voigtlander 125mm APO-Lanthar in Pentax mount, and others. It was easy to see that there was a future in pixel-shifting. I just needed to wait for a better implementation of it. The Sony A7R3 may be it; if not I will return it and sell my Voigtlander 65mm Macro in E-mount.

I also have a special rig for the Cambo Actus that will let me use the A7R3 with some of my cherished lenses like the APO El Nikkor 105mm, and many others.
MichaelErlewine.smugmug.com, Daily Blog at https://www.facebook.com/MichaelErlewine. main site: SpiritGrooves.net, https://www.youtube.com/user/merlewine, Founder: MacroStop.com, All-Music Guide, All-Movie Guide, Classic Posters.com, Matrix Software, DharmaGrooves.com

simsurace

  • NG Member
  • *
  • Posts: 835
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #35 on: October 30, 2017, 14:59:06 »
If there are fine details which vary in colour at high spatial frequencies in the subject then the Bayer method in itself may not reproduce the color variations at the pixel level as accurately as the pixel-shift technique does.

Also simply white text on black background shows how a Bayer sensor with no AA filter creates a lot of artifacts which pixel shift seems to be able to eliminate (assuming conditions are static enough). Here is a section of a screen grab from dpreview's studio comparison.

Thanks Ilkka, this is a pretty impressive demonstration!
Simone Carlo Surace
suracephoto.com

Les Olson

  • NG Member
  • *
  • Posts: 502
  • You ARE NikonGear
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #36 on: October 30, 2017, 20:47:37 »
If there are fine details which vary in colour at high spatial frequencies in the subject then the Bayer method in itself may not reproduce the color variations at the pixel level as accurately as the pixel-shift technique does.

I am done trying to explain this, so I will just ask a question: we measure resolution of high spatial frequencies by MTF, so why can't you show us the MTFs for pixel shift vs no pixel shift for a coloured target - instead of black and white lines, red/green, red/blue, green/blue?

Michael Erlewine

  • Close-Up Photographer
  • NG Supporter
  • **
  • Posts: 2067
  • Close-Up with APO
    • Spirit Grooves
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #37 on: October 30, 2017, 21:42:18 »
I am done trying to explain this, so I will just ask a question: we measure resolution of high spatial frequencies by MTF, so why can't you show us the MTFs for pixel shift vs no pixel shift for a coloured target - instead of black and white lines, red/green, red/blue, green/blue?

I'm done responding to you too and your telling folks what they have to do. My suggestion is that you actually try pixel-shift and compare it to other DSLR quality images and thus see for yourself. That's what this thread was supposed to be about. Then, I would like to hear what you have found.
MichaelErlewine.smugmug.com, Daily Blog at https://www.facebook.com/MichaelErlewine. main site: SpiritGrooves.net, https://www.youtube.com/user/merlewine, Founder: MacroStop.com, All-Music Guide, All-Movie Guide, Classic Posters.com, Matrix Software, DharmaGrooves.com

simsurace

  • NG Member
  • *
  • Posts: 835
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #38 on: October 31, 2017, 13:20:34 »
I am done trying to explain this, so I will just ask a question: we measure resolution of high spatial frequencies by MTF, so why can't you show us the MTFs for pixel shift vs no pixel shift for a coloured target - instead of black and white lines, red/green, red/blue, green/blue?
Why are you asking for this?
What is your prediction?
Does it change the interpretation of the obvious occurrence of massive artifacts in the Bayer shot that are absent from the pixel-shifted shot?
Are you familiar with Nyquist's theorem and its relationship with Bayer sensors?
Simone Carlo Surace
suracephoto.com

Les Olson

  • NG Member
  • *
  • Posts: 502
  • You ARE NikonGear
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #39 on: October 31, 2017, 15:39:57 »
Why are you asking for this?
What is your prediction?
Does it change the interpretation of the obvious occurrence of massive artifacts in the Bayer shot that are absent from the pixel-shifted shot?
Are you familiar with Nyquist's theorem and its relationship with Bayer sensors?

The claim is that pixel shift will give better resolution of fine coloured detail. Isn't an MTF curve or MTF50 for a coloured target the obvious way to show that?   

Ilkka Nissilä

  • NG Member
  • *
  • Posts: 1714
  • You ARE NikonGear
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #40 on: October 31, 2017, 16:54:18 »
Measuring line pairs at different distances is not a good way to characterize the 2D imaging performance of a system when the real-world objects to be delineated are not lines but 2D random patterns of small variations in colour.

I think the best way to judge the accuracy of fine detail is to drop high contrast dots at random locations of the test target, photograph it and take a sum-of-squared-errors between RGB values of the target and the image and then use that as a measure of the quality of the imaging. Distortion would be heavily punished in this metric though so perhaps that's not what people want to use (if the spatial accuracy of the image is not a priority).
However, distortion can be measured and corrected prior to testing.

Perhaps a metric of false color can be devised that is less sensitive to distortion than MSE.

Line pairs of homogeneous colour allow interpolation to patch gaps in the matrix. There should be some contrast loss though (which reduces MTF values). I think coloured resolution targets are not commonly used because it is harder to make high-contrast high-resolution targets in colour. However, in my opinion a different metric should be used for the accuracy of colour of pixel-level detail. I'll think about it.

Ilkka Nissilä

  • NG Member
  • *
  • Posts: 1714
  • You ARE NikonGear
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #41 on: October 31, 2017, 17:25:05 »
False color is a kind of cross-talk between the colors. A system with false color isn’t properly characterized by images taken of targets of a specific color  since the result is garbling of the channel not being meausured as well. I guess the whole k-space analysis fails because the RGB MTF changes if the target is shifted by one pixel. It’s just not the right tool for characterizing this kind of imaging artefacts.

simsurace

  • NG Member
  • *
  • Posts: 835
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #42 on: October 31, 2017, 18:01:57 »
The claim is that pixel shift will give better resolution of fine coloured detail. Isn't an MTF curve or MTF50 for a coloured target the obvious way to show that?
We are talking about detail at frequencies very close to the Nyquist limit. MTF is probably less than 50% at that frequency.
Anyway, with differences so obvious to the naked eye, what more do you expect to learn from an MTF curve?
The Bayer color artifacts are not nice, but currently there is a steep price to pay for avoiding them (either light loss for stacked sensors or steadyness for the pixel shift feature).
If there is a lot of light and the pixel shift is fast enough, maybe you can capture the four images so quickly that the problem of movement is minimized...
Simone Carlo Surace
suracephoto.com

Bruno Schroder

  • NG Supporter
  • **
  • Posts: 1665
  • Future is the only way forward
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #43 on: October 31, 2017, 19:22:29 »
The claim is that pixel shift will give better resolution of fine coloured detail. Isn't an MTF curve or MTF50 for a coloured target the obvious way to show that?
Does this means that you do not agree that a direct measure is better than interpolation?
Bruno Schröder

Reality is frequently inaccurate. (Douglas Adams)

Michael Erlewine

  • Close-Up Photographer
  • NG Supporter
  • **
  • Posts: 2067
  • Close-Up with APO
    • Spirit Grooves
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #44 on: October 31, 2017, 19:30:05 »
Does this means that you do not agree that a direct measure is better than interpolation?

My point exactly, but Les wants to put them on the same level. I don't get it.
MichaelErlewine.smugmug.com, Daily Blog at https://www.facebook.com/MichaelErlewine. main site: SpiritGrooves.net, https://www.youtube.com/user/merlewine, Founder: MacroStop.com, All-Music Guide, All-Movie Guide, Classic Posters.com, Matrix Software, DharmaGrooves.com