Author Topic: Pixel-Shifting Vs. Larger Sensors  (Read 27392 times)

Bruno Schroder

  • NG Supporter
  • **
  • Posts: 1665
  • Future is the only way forward
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #45 on: October 31, 2017, 19:49:04 »
Not exactly about the same topic but the theory is applicable here as well: https://www.lensrentals.com/blog/2017/10/the-8k-conundrum-when-bad-lenses-mount-good-sensors/

From the conclusion: For an exceptionally good lens stopped down a bit, then the system MTF is almost entirely dependent on the camera. Raise the resolution of the camera and the image improves dramatically.

Michael uses only the finest lenses available. He not only works with sharp lenses, but also with the best APO-corrected ones. In his case, and for his style of photography, any increase in sensor resolution will have a positive impact on the system MTF (lens + sensor).

If you have not looked at his work, he photographs static subjects in the studio under static lighting. Movement and the duration of the process have no impact here.

Bruno Schröder

Reality is frequently inaccurate. (Douglas Adams)

Michael Erlewine

  • Close-Up Photographer
  • NG Supporter
  • **
  • Posts: 2067
  • Close-Up with APO
    • Spirit Grooves
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #46 on: October 31, 2017, 20:01:29 »
Not exactly about the same topic but the theory is applicable here as well: https://www.lensrentals.com/blog/2017/10/the-8k-conundrum-when-bad-lenses-mount-good-sensors/

From the conclusion: For an exceptionally good lens stopped down a bit, then the system MTF is almost entirely dependent on the camera. Raise the resolution of the camera and the image improves dramatically.

Michael uses only the finest lenses available. He not only works with sharp lenses, but also with the best APO-corrected ones. In his case, and for his style of photography, any increase in sensor resolution will have a positive impact on the system MTF (lens + sensor).

If you have not looked at his work, he photographs static subjects in the studio under static lighting. Movement and the duration of the process have no impact here.

Here is an example of what I do. Many are taken outside in summer, but inside in winter, because we are in northern Michigan.

Taken with the Nikon D850 and the APO El Nikkor 105mm lens.

MichaelErlewine.smugmug.com, Daily Blog at https://www.facebook.com/MichaelErlewine. main site: SpiritGrooves.net, https://www.youtube.com/user/merlewine, Founder: MacroStop.com, All-Music Guide, All-Movie Guide, Classic Posters.com, Matrix Software, DharmaGrooves.com

Les Olson

  • NG Member
  • *
  • Posts: 502
  • You ARE NikonGear
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #47 on: November 01, 2017, 08:57:36 »
Does this mean that you do not agree that a direct measurement is better than interpolation?

But pixel shift is not a direct measure: it does not measure RGB values for each pixel. It is, just like the Bayer array, assigning each pixel the RGB values it did not measure, based on the RGB values of its neighbours.

It is not true that a direct measurement is always better than an interpolation: an interpolation between accurate measurements can be much more accurate than an inaccurate direct measurement.  In the case of pixel shift, the question is how accurate the shift is.  There must be both systematic and random error in how far the sensor moves.  How big is it? There is no variation in the distance between pixels in the Bayer array, so it is perfectly possible that pixel shift will be worse than Bayer interpolation.  (Plus, of course, since we are talking about microns, it really is not good enough to say "Oh, Michael's in the studio so he doesn't have to worry about subject movement".)
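
To put a number on that trade-off, here is a toy 1-D sketch in Python. It is an illustration, not a sensor model, and the 0.25-pixel shift error is an assumed value; whichever error is larger wins.

    # Toy 1-D comparison: interpolating between exact samples (Bayer-like)
    # vs. direct samples taken at jittered positions (pixel shift with an
    # imperfect mechanical shift). The 0.25-pixel error is an assumption.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.arange(100.0)                     # pixel positions
    signal = lambda t: np.sin(0.35 * t)      # fine detail in the scene

    # (a) measure every other pixel exactly, interpolate the rest
    interp = np.interp(x, x[::2], signal(x[::2]))

    # (b) measure every pixel directly, but the shift lands off-target
    jittered = signal(x + rng.normal(0.0, 0.25, x.size))

    truth = signal(x)
    print("interpolation RMS error: ", np.sqrt(np.mean((interp - truth) ** 2)))
    print("jittered-shift RMS error:", np.sqrt(np.mean((jittered - truth) ** 2)))

In this toy setup the two errors come out the same order of magnitude, which is exactly why the size of the mechanical error is the question that matters.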

Michael Erlewine

  • Close-Up Photographer
  • NG Supporter
  • **
  • Posts: 2067
  • Close-Up with APO
    • Spirit Grooves
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #48 on: November 01, 2017, 09:24:35 »
But pixel shift is not a direct measure: it does not measure RGB values for each pixel. It is, just like the Bayer array, assigning each pixel the RGB values it did not measure, based on the RGB values of its neighbours.

It is not true that a direct measurement is always better than an interpolation: an interpolation between accurate measurements can be much more accurate than an inaccurate direct measurement.  In the case of pixel shift, the question is how accurate the shift is.  There must be both systematic and random error in how far the sensor moves.  How big is it? There is no variation in the distance between pixels in the Bayer array, so it is perfectly possible that pixel shift will be worse than Bayer interpolation.  (Plus, of course, since we are talking about microns, it really is not good enough to say "Oh, Michael's in the studio so he doesn't have to worry about subject movement".)

Les Olson: Have you ever actually used pixel-shift? If so, which camera and on what?
MichaelErlewine.smugmug.com, Daily Blog at https://www.facebook.com/MichaelErlewine. main site: SpiritGrooves.net, https://www.youtube.com/user/merlewine, Founder: MacroStop.com, All-Music Guide, All-Movie Guide, Classic Posters.com, Matrix Software, DharmaGrooves.com

simsurace

  • NG Member
  • *
  • Posts: 835
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #49 on: November 01, 2017, 09:38:27 »
But pixel shift is not a direct measure: it does not measure RGB values for each pixel. It is, just like the Bayer array, assigning each pixel the RGB values it did not measure, based on the RGB values of its neighbours.
My understanding is that it assigns each pixel RGGB values. One of them comes from the unshifted position, the other three from shifts in a U-shaped pattern. It is the same as if you left the photosites stationary and moved the color filter array on top of them: you would make four measurements, one in each channel.
I'm assuming that the camera and subject are stationary and that the sensor movement error is significantly smaller than a pixel.
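
A little sketch of that equivalence (Python, purely illustrative):

    # Sketch of the four-shot pixel-shift sampling pattern described above.
    # Moving the sensor one pixel in a U-shaped path is equivalent to
    # sliding the colour filter array over stationary photosites, so each
    # photosite is read once under R, twice under G, and once under B.
    import numpy as np

    cfa = np.array([["R", "G"],
                    ["G", "B"]])                 # 2x2 Bayer tile

    shifts = [(0, 0), (0, 1), (1, 1), (1, 0)]    # U-shaped, one pixel each

    for i in range(2):                           # any photosite (mod 2)
        for j in range(2):
            seen = [cfa[(i + di) % 2, (j + dj) % 2] for di, dj in shifts]
            print(f"photosite ({i},{j}) measures: {seen}")
    # every photosite collects the full R,G,G,B set - no interpolation needed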

It is not true that a direct measurement is always better than an interpolation: an interpolation between accurate measurements can be much more accurate than an inaccurate direct measurement.  In the case of pixel shift, the question is how accurate the shift is.  There must be both systematic and random error in how far the sensor moves.  How big is it? There is no variation in the distance between pixels in the Bayer array, so it is perfectly possible that pixel shift will be worse than Bayer interpolation.  (Plus, of course, since we are talking about microns, it really is not good enough to say "Oh, Michael's in the studio so he doesn't have to worry about subject movement".)

Your argument would certainly have merit a priori (i.e. before having seen any images from pixel-shifted sensors), and these are certainly among the challenges that the engineers implementing a pixel-shift feature were facing.
I would assume that in Michael's studio it is possible to achieve the same degree of stability as in dpreview's studio.
I do not know whether this fully accounts for what Michael is seeing, as there might be other differences between the cameras and their processing pipeline that contribute to the perception that color rendition has improved.
Nevertheless, my guess is that artifacts like those seen in Ilkka's example are especially problematic for stacking, because they make it more difficult for the software to match features with pixel-level precision. So while the artifacts themselves may be somewhat smeared out and hidden by the stacking process, the resulting contrast may also be lower than in the case where very few artifacts are present and very precise matching is possible. This could have a perceptible effect on sharpness, but to be really sure one would need to do quite elaborate testing.
Simone Carlo Surace
suracephoto.com

Les Olson

  • NG Member
  • *
  • Posts: 502
  • You ARE NikonGear
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #50 on: November 01, 2017, 09:48:03 »
Raise the resolution of the camera and the image improves dramatically.

Michael uses only the finest lenses available. He not only works with sharp lenses, but also with the best APO-corrected ones. In his case, and for his style of photography, any increase in sensor resolution will have a positive impact on the system MTF (lens + sensor).

Actually, the conclusion was "system MTF changes significantly with different sensors, but at higher resolutions a diminishing return is seen." [Emphasis added]

The perceived sharpness of a print is heavily influenced by detail at about 10 lp/mm.  For an FX sensor, ignoring output resolution, and to make the graphs easy to read, say 75 lp/mm at the sensor.  So, if you want to know how your prints will look, MTF at 75 lp/mm is a better metric than extinction MTF.  On those graphs, 6K (roughly 24MP) gives MTF at 75 lp/mm of 0.65, 8K (roughly 40MP) gives just under 0.7, and the lens-only value, which you can never quite reach, is 0.75.

If you prefer, you can look at the resolution defined by MTF50.  In that case, going from 24MP to 40MP takes you from 100 lp/mm to 110 lp/mm.  On a back-of-the-envelope calculation, you need well over 100MP to get close to the lens-only MTF50.
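
For anyone who wants to play with the back-of-the-envelope part, here is a rough Python sketch. The sinc sampling model is the usual textbook approximation, the 0.75 lens-only figure is the one quoted above, and the result is far cruder than the measured lensrentals curves; it only shows the same diminishing return.

    # Back-of-envelope system MTF, assuming the common approximation
    # MTF_system = MTF_lens x MTF_sensor, with the sensor's sampling MTF
    # modelled as sinc(f / f_pixel). Everything here is an assumption
    # except the 0.75 lens-only value taken from the post above.
    import numpy as np

    f = 75.0                               # lp/mm of interest for prints
    mtf_lens = 0.75                        # lens-only MTF at 75 lp/mm

    for name, px_wide in [("6K (~24MP)", 6000), ("8K (~40MP)", 7680)]:
        f_pixel = px_wide / 36.0           # pixels per mm on an FX sensor
        mtf_sensor = np.sinc(f / f_pixel)  # np.sinc(x) = sin(pi x)/(pi x)
        print(f"{name}: system MTF at 75 lp/mm ~ {mtf_lens * mtf_sensor:.2f}")
    # the absolute numbers differ from the measured graphs, but the
    # diminishing return as pixel pitch shrinks shows up the same way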

Of course, those are contrast MTF curves, and you might want to think about what MTF50 means in terms of colour resolution.

Les Olson

  • NG Member
  • *
  • Posts: 502
  • You ARE NikonGear
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #51 on: November 01, 2017, 11:32:49 »
My understanding is that it assigns each pixel RGGB values. One of them comes from the unshifted position, the other three from shifts in a U-shaped pattern. It is the same as if you left the photosites stationary and moved the color filter array on top of them: you would make four measurements, one in each channel.


No, it's not the same.  When you move the micro-lenses - Sony's Active Pixel Colour Sensor - the pixel sees the same bit of the scene each time, but through a different coloured micro-lens.  In pixel shift the pixel sees a different bit of the scene each time through the same coloured micro-lens.  Then in software you "shift" the pixels back so they all line up.  In the APCS small inaccuracies don't matter, but in pixel shift, they do. Plus, it may not be the case that the software manipulation of pixel "position" is free, informationally.
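
A 1-D illustration of that software "shift back" step and the residual it leaves (hypothetical Python; the 1.1-pixel actual shift is an assumed mechanical error):

    # A subframe exposed after a nominal 1-pixel move is realigned by
    # rolling it back one pixel. If the mechanical shift was really
    # 1.1 pixels, a residual 0.1-pixel misregistration remains and shows
    # up as error against the reference (unshifted) frame.
    import numpy as np

    x = np.arange(200.0)
    reference = np.sin(0.3 * x)                     # unshifted exposure

    actual_shift = 1.1                              # assumed shift error
    subframe = np.sin(0.3 * (x + actual_shift))     # what the sensor saw

    realigned = np.roll(subframe, 1)                # undo the nominal 1 px
    residual = realigned[1:] - reference[1:]        # skip wrapped sample
    print("residual RMS after realignment:", np.sqrt(np.mean(residual ** 2)))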


Les Olson

  • NG Member
  • *
  • Posts: 502
  • You ARE NikonGear
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #52 on: November 01, 2017, 13:22:19 »

So while the artifacts themselves may be somewhat smeared out and hidden by the stacking process, the resulting contrast may also be lower than in the case where very few artifacts are present and very precise matching is possible. This could have a perceptible effect on sharpness, but to be really sure one would need to do quite elaborate testing.

Or one could just move the little square around the DPR test shot, since why anyone would think white-on-black text was the way to test for resolution of fine colour detail is beyond me.  Just to the left of the B&W text is a tiny embroidery of the Beatles.  There is a lot of very fine detail with abrupt colour transitions. There may be a tiny bit more detail in the K1 images with pixel shift on compared to off, but the D850 is better than both (look especially at the trousers).  If you go up to the reels of thread immediately above, where there is detail within larger areas of the same colour, you see the same thing: pixel shift may improve the K1 images a tiny bit, but the D850 is clearly better.

As for the artifacts, comparing the white-on-black text with the black-on-white and the lower-contrast versions is illuminating.


Frank Fremerey

  • engineering art
  • NG Supporter
  • **
  • Posts: 12620
  • Bonn, Germany
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #53 on: November 01, 2017, 13:30:37 »
But pixel shift is not a direct measure: it does not measure RGB values for each pixel. It is, just like the Bayer array, assigning each pixel the RGB values it did not measure, based on the RGB values of its neighbours.

It is not true that a direct measurement is always better than an interpolation: an interpolation between accurate measurements can be much more accurate than an inaccurate direct measurement.  In the case of pixel shift, the question is how accurate the shift is.  There must be both systematic and random error in how far the sensor moves.  How big is it? There is no variation in the distance between pixels in the Bayer array, so it is perfectly possible that pixel shift will be worse than Bayer interpolation.  (Plus, of course, since we are talking about microns, it really is not good enough to say "Oh, Michael's in the studio so he doesn't have to worry about subject movement".)

For color accuracy, massive oversampling of static subjects effectively deepens the full-well capacity (FWC) of each virtual pixel, which gets far more direct measurements of the actual color at each shifted position.

Color accuracy is what Michael is longing for, and a 16-shot shift would mean that the final picture is made from 16 times as many photon events as a single shot.

In my imagination that means we have a 16-fold full-well capacity per final pixel and a much higher proportion of direct measurement, meaning a real 16-bit measurement.
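
The arithmetic behind that, as a sketch. The 50,000 e- full-well figure is an assumption, and whether it adds up to a "real 16-bit" measurement depends on the starting bit depth and the read-noise floor:

    # Back-of-envelope for the 16-shot argument above, assuming the shots
    # add independently and shot noise dominates: 16x the photon events
    # gives sqrt(16) = 4x the SNR, i.e. about log2(16) = 4 extra bits of
    # usable depth per final pixel.
    import math

    shots = 16
    fwc_single = 50_000                   # assumed full-well capacity, e-
    snr_gain = math.sqrt(shots)           # shot-noise-limited SNR gain
    extra_bits = math.log2(shots)

    print(f"effective well depth: {shots * fwc_single:,} e-")
    print(f"SNR gain: {snr_gain:.0f}x, extra bits: {extra_bits:.0f}")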

You are out there. You and your camera. You can shoot or not shoot as you please. Discover the world, Your world. Show it to us. Or we might never see it.

Me: https://youpic.com/photographer/frankfremerey/

Michael Erlewine

  • Close-Up Photographer
  • NG Supporter
  • **
  • Posts: 2067
  • Close-Up with APO
    • Spirit Grooves
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #54 on: November 01, 2017, 13:42:46 »
The Nikon D850 does not offer pixel-shift, so I fail to understand why it is being referenced. The new Sony A7R3 has pixel-shift, and early reports seem to agree that the new A7R3 has greater DR than the older A7R2. If so, then pixel-shift on the A7R3 may well be very useful for more accurate color (to my eyes) than Bayer interpolation. I'm not a betting person, but I would bet that the A7R3 will make a greater splash in this regard than we expect, given the right lenses.

I sold my A7R2 (and a host of batteries) the day the A7R3 was announced with pixel-shift. I hope Sony does not mess it up; I have used Sony video cameras for decades and trust them a lot.
MichaelErlewine.smugmug.com, Daily Blog at https://www.facebook.com/MichaelErlewine. main site: SpiritGrooves.net, https://www.youtube.com/user/merlewine, Founder: MacroStop.com, All-Music Guide, All-Movie Guide, Classic Posters.com, Matrix Software, DharmaGrooves.com

Frank Fremerey

  • engineering art
  • NG Supporter
  • **
  • Posts: 12620
  • Bonn, Germany
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #55 on: November 01, 2017, 15:22:30 »
The Nikon D850 does not offer pixel-shift, so I fail to understand why it is being referenced. The new Sony A7R3 has pixel-shift, and early reports seem to agree that the new A7R3 has greater DR than the older A7R2. If so, then pixel-shift on the A7R3 may well be very useful for more accurate color (to my eyes) than Bayer interpolation. I'm not a betting person, but I would bet that the A7R3 will make a greater splash in this regard than we expect, given the right lenses.

I sold my A7R2 (and a host of batteries) the day the A7R3 was announced with pixel-shift. I hope Sony does not mess it up; I have used Sony video cameras for decades and trust them a lot.

The A7R3 has a 4-way pixel shift, so the final image is made from four times as many photon events as a single shot with the same sensor.

Is the shifting done with the aim of measuring all three basic colors (green twice) at the same positions (a 1-pixel shift)?
Or is it done mainly to increase spatial resolution (a 1/2-pixel shift)?
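
Written out as offsets, the two strategies look like this (illustrative Python; for what it is worth, Sony describes the A7R3's pattern as the one-pixel kind):

    # The two shift strategies, as (row, col) sensor offsets in pixels.
    # A one-pixel U-pattern re-samples colour at the same scene positions;
    # a half-pixel pattern samples new scene positions to raise spatial
    # resolution instead.
    colour_pattern = [(0, 0), (0, 1), (1, 1), (1, 0)]              # 1-px
    resolution_pattern = [(0, 0), (0, 0.5), (0.5, 0.5), (0.5, 0)]  # 1/2-px

    for name, p in [("colour", colour_pattern),
                    ("resolution", resolution_pattern)]:
        print(f"{name} pattern offsets: {p}")
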
You are out there. You and your camera. You can shoot or not shoot as you please. Discover the world, Your world. Show it to us. Or we might never see it.

Me: https://youpic.com/photographer/frankfremerey/

Les Olson

  • NG Member
  • *
  • Posts: 502
  • You ARE NikonGear
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #56 on: November 01, 2017, 15:37:42 »
For color accuracy, massive oversampling of static subjects effectively deepens the full-well capacity (FWC) of each virtual pixel, which gets far more direct measurements of the actual color at each shifted position.

Color accuracy is what Michael is longing for, and a 16-shot shift would mean that the final picture is made from 16 times as many photon events as a single shot.

Anyone who cares could just go back to the DPR test shot, download the RAW files for the K1 with and without pixel shift, process them identically, zoom in on the color-checker card, and get the RGB values for each colour patch with and without pixel shift.

"Colour accuracy" is still meaningless, but the RGB values must be different - otherwise the pixel shift is doing nothing to colour at all. 

Michael Erlewine

  • Close-Up Photographer
  • NG Supporter
  • **
  • Posts: 2067
  • Close-Up with APO
    • Spirit Grooves
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #57 on: November 01, 2017, 16:14:02 »
Having done many pixel-shift images with the Pentax K3 and K1, not to mention retouching endless thousands of images from the various modes of focus-stacking, I don’t care how you describe it: there is a significant difference between shifted and unshifted images, IMO (and experience), in favor of shifted images.

Questions about “APO,” which has no standard definition, and terms like acutance, resolution, and the other words we use to describe the interplay of color with “sharpness” (another vague term) are not at issue here. That conversation will go on perhaps forever.

All the tests I have seen, performed, and read about point out that there is a difference between pixel-shift and traditional Bayer images. It is that DIFFERENCE I have tried to refer to, not to stir up all the armchair philosophers or theoreticians out there, but to contact those who actually have used pixel-shift. They are the ones I would like to talk with, folks who have actually experimented with pixel-shifting, where the rubber meets the road, so to speak. Perhaps there are none on this forum!

And my inquiry was as to how far up the road of ever-larger sensors we go until we have “enough” of whatever we are looking for in “resolution,” etc., to be satisfied. Perhaps never. And I am not interested in printing out images, either.

However, my hunch is that there is a point of diminishing returns between what I need to get in the way of color that pleases me (along with sharpness, correction, etc.) and sensor size. Pixel-shift has, IMO, helped to limit (for me) the size of sensor I need. In other words, I believe I can get more with less, compared to the sensor size I used to imagine I needed.
MichaelErlewine.smugmug.com, Daily Blog at https://www.facebook.com/MichaelErlewine. main site: SpiritGrooves.net, https://www.youtube.com/user/merlewine, Founder: MacroStop.com, All-Music Guide, All-Movie Guide, Classic Posters.com, Matrix Software, DharmaGrooves.com

David H. Hartman

  • NG Member
  • *
  • Posts: 2790
  • I Doctor Photographs... :)
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #58 on: November 02, 2017, 02:35:04 »
I may be sticking my foot deep into my mouth, but please indulge me...

If one is down sampling to 2000 pixels on the long side isn't the process used to down sample more important than the pixel or near pixel level performance of the camera, its sensor and its image processor?
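
A quick way to see the size of this effect, assuming Python with Pillow (the file name is hypothetical):

    # The same frame downsampled to 2000 px on the long side with two
    # different filters; at that output size the choice of resampling
    # filter often changes the result more than pixel-level sensor
    # differences do.
    from PIL import Image

    img = Image.open("full_res_frame.tif")          # hypothetical file
    scale = 2000 / max(img.size)
    new_size = tuple(round(d * scale) for d in img.size)

    good = img.resize(new_size, Image.Resampling.LANCZOS)   # windowed sinc
    crude = img.resize(new_size, Image.Resampling.NEAREST)  # point sampling

    good.save("down_lanczos.png")
    crude.save("down_nearest.png")                  # compare side by side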

Dave the ignorant
Beatniks are out to make it rich
Oh no, must be the season of the witch!

Ethan

  • NG Supporter
  • **
  • Posts: 208
  • You ARE NikonGear
Re: Pixel-Shifting Vs. Larger Sensors
« Reply #59 on: November 02, 2017, 06:08:41 »
I may be sticking my foot deep into my mouth, but please indulge me...

If one is down sampling to 2000 pixels on the long side isn't the process used to down sample more important than the pixel or near pixel level performance of the camera, its sensor and its image processor?

Dave the ignorant

As another "ignorant", this is what I tried to underline to Michael.

He is starting with a super-duper scanned file, processing it in the ProPhoto colour space, and dropping it down to a compressed sRGB JPEG at a max of 2000 px on the long side!

What would be the point of having the best resolution if you are dropping down to a compressed JPEG file?

But hey what do we know, we are all ignorant.