Author Topic: Nikon 180-600mm First Look Review: A Wildlife Photographer's Field Report  (Read 5890 times)

Bent Hjarbo

  • NG Supporter
  • **
  • Posts: 2289
  • Hvidovre, Denmark
    • Hjarbos hjemmeside
Quote from: MEPER
Yet another boring handheld image, this time at 180mm, f/5.6, ISO 200, 1/1600.
Also good at 180mm, so I think the lens is OK, even though I can't test it on an FX body (I don't have one).
...the file was too large at 1900 pixels on the long edge (2.77 MB)... but I can assure you the image looks very good :-)
Reduced it to 1800 pixels... still too large...
Then 1700 pixels...
I have an FX body; if time allows we could try. I have the old 200-500 and would like to get the 180-600 instead.

MEPER

  • NG Supporter
  • **
  • Posts: 1179
  • You ARE NikonGear
Yes, we could try that when I have gotten a bit more experience with the lens.
I like that the lens has internal focus and zoom. It has a lot of function buttons on the barrel; I have yet to read up on what I can do with those...

Yes, reducing the JPEG quality (just a little bit) reduces the file size...
I use NX Studio's Export function to convert the image to JPEG.
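(If anyone would rather script that resize-and-export step than do it in NX Studio, a minimal sketch with Pillow; the filenames here are made up:)

[code]
from PIL import Image

im = Image.open("dsc_export.jpg")   # hypothetical filename
im.thumbnail((1700, 1700))          # cap the long edge at 1700 px, keeping aspect
# Lowering the quality setting slightly usually shrinks the file far
# more than shaving off another hundred pixels does.
im.save("forum_upload.jpg", quality=85, optimize=True)
[/code]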

It is funny... in the eighties I was on the development team that invented the JPEG standard... or rather, the mathematical algorithm we analyzed ended up being the one selected for the standard.
I was the guy who implemented the algorithm in software (assembly code at that time). Some other very clever guys did the mathematical work (cosine transform / Huffman coding).
Today more advanced algorithms exist that compress better, but JPEG still seems to be the one in use...

Erik Lund

  • Global Moderator
  • **
  • Posts: 6529
  • Copenhagen
    • ErikLund.com
Quote from: MEPER
It is funny... in the eighties I was on the development team that invented the JPEG standard... or rather, the mathematical algorithm we analyzed ended up being the one selected for the standard.
I was the guy who implemented the algorithm in software (assembly code at that time). Some other very clever guys did the mathematical work (cosine transform / Huffman coding).
Today more advanced algorithms exist that compress better, but JPEG still seems to be the one in use...
That is quite a substantial contribution to the imaging world! Thanks for that ;)
Please feel free to elaborate on the early thinking behind the ideas that made JPEG so successful :)
PS Huge congratulations on the new lens! I can only assist with my infrared-converted Z6 for testing, although I fear it will show a hotspot...
Let me know if we should meet up in Copenhagen with Bent
 
Erik Lund

MEPER

  • NG Supporter
  • **
  • Posts: 1179
  • You ARE NikonGear
The real "brains" behind the standard at the company I worked for at the time (KTAS, a telephone company) were Birger Niss and Jørgen Vaaben:
https://jpeg.org/items/20170813_press.html
It seems JPEG was first approved as a standard in 1992, but it was developed some years before that in an ESPRIT project.
Telecommunication lines were slow at that time (ISDN = 64 kbit/s). The goal was to be able to search picture databases while fetching pictures at that speed, and my goal was to write a JPEG decoder that could do it in real time on an IBM XT or a similar machine. That is why the JPEG standard has an option called progressive mode: you first get a very rough image, which then gradually gets finer and finer, so you can quickly decide whether it is the image you wanted or you want to look at the next one.
The JPEG standard works on 8×8 pixel blocks whose quantized coefficients are run-length coded, so the result is one long bit-stream, as if you were receiving a Morse code message. In progressive mode the first scan gives each 8×8 block a single uniform value, and later scans make it finer and finer. A picture is coded as one luminance component plus two chrominance components. The luminance component carries nearly all the information; when I think back, I was surprised at how little information was needed for the chrominance components :-)
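(To make the "one long bit-stream" idea concrete, here is a small Python sketch of the zig-zag scan and the zero run-length step applied to one block of quantized coefficients; the function names are made up for illustration:)

[code]
import numpy as np

def zigzag_order(n=8):
    # Visit coefficients diagonal by diagonal, low frequencies first,
    # so the zeros produced by quantization cluster at the end.
    return sorted(((i, j) for i in range(n) for j in range(n)),
                  key=lambda p: (p[0] + p[1],
                                 p[0] if (p[0] + p[1]) % 2 else -p[0]))

def run_length_encode(block):
    # Turn an 8x8 block of quantized DCT coefficients into
    # (zero_run, value) pairs -- the form that is then Huffman coded.
    flat = [block[i, j] for i, j in zigzag_order(len(block))]
    pairs, run = [], 0
    for value in flat:
        if value == 0:
            run += 1
        else:
            pairs.append((run, value))
            run = 0
    pairs.append((0, 0))  # end-of-block marker
    return pairs

coeffs = np.zeros((8, 8), dtype=int)
coeffs[0, 0], coeffs[0, 1], coeffs[1, 0] = 50, -3, 2   # a typical sparse block
print(run_length_encode(coeffs))   # [(0, 50), (0, -3), (0, 2), (0, 0)]
[/code]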
So it was an advantage to go from RGB to luminance + chrominance: the image could be compressed better that way.
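(A minimal NumPy sketch of that colour step, using the standard JFIF conversion coefficients; the 4:2:0 subsampling helper is my own illustration of why the chrominance planes are so cheap to store:)

[code]
import numpy as np

def rgb_to_ycbcr(rgb):
    # JFIF colour transform: Y carries nearly all of the visible
    # detail, while Cb and Cr can be stored much more coarsely.
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299  * r + 0.587  * g + 0.114  * b
    cb = -0.1687 * r - 0.3313 * g + 0.5    * b + 128.0
    cr =  0.5    * r - 0.4187 * g - 0.0813 * b + 128.0
    return y, cb, cr

def subsample_420(chroma):
    # 4:2:0 subsampling: average each 2x2 block of a chroma plane,
    # quartering its size with little visible effect.
    h, w = chroma.shape
    h, w = h - h % 2, w - w % 2          # trim to even dimensions
    return chroma[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
[/code]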

I think this explains it quite well (I had to dig deep into my memory):
https://yasoob.me/posts/understanding-and-writing-jpeg-decoder-in-python/

Yes, we could meet in Copenhagen one day and do some tests...

MEPER

  • NG Supporter
  • **
  • Posts: 1179
  • You ARE NikonGear
When coding a JPEG image, it is an advantage to gather statistics on the actual image, optimize a Huffman table for that specific image, and send the table along with the image instead of using a standard Huffman table; the standard allows this as an option. It is similar to making a custom Morse table for one specific message (Morse-style table coding is lossless, of course): the message can then be coded more efficiently, and the two symbols with the highest counts get the shortest codes (. and -). In the standard Morse table it is e = . and t = -, as far as I remember; it is based on which letters have the highest probability in English.
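(A small sketch of that idea, building a code from the actual symbol counts of one "message"; this is generic Huffman construction for illustration, not the JPEG table format itself:)

[code]
import heapq
from collections import Counter

def build_huffman_codes(message):
    # Build a Huffman code fitted to the symbol counts of one specific
    # message -- the "custom Morse table" idea described above.
    freq = Counter(message)
    if len(freq) == 1:                   # degenerate single-symbol case
        return {sym: "0" for sym in freq}
    # Heap entries: (count, tie_breaker, {symbol: code_so_far}).
    heap = [(n, i, {sym: ""}) for i, (sym, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        n1, _, first = heapq.heappop(heap)     # the two rarest subtrees...
        n2, _, second = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in first.items()}
        merged.update({s: "1" + c for s, c in second.items()})
        heapq.heappush(heap, (n1 + n2, tie, merged))  # ...merge and repeat
        tie += 1
    return heap[0][2]

codes = build_huffman_codes("an example message to fit the table to")
for sym, code in sorted(codes.items(), key=lambda kv: len(kv[1])):
    print(repr(sym), code)               # most frequent symbols: shortest codes
[/code]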

MEPER

  • NG Supporter
  • **
  • Posts: 1179
  • You ARE NikonGear
You can say that the "core" of the JPEG algorithm is that, via the discrete cosine transform, you go from the pixel domain to the frequency domain, and higher compression rates filter out more and more of the high frequencies in the image.
A 100%-quality JPEG is almost lossless; to the eye I think it is lossless. But it has to be a photographic image, not graphics.
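(As an illustration of that filtering, a sketch of the per-block DCT and quantization step in NumPy/SciPy, not the actual KTAS code of course. Q50 is the example luminance quantization table from Annex K of the JPEG standard, and the scale factor stands in for the quality slider:)

[code]
import numpy as np
from scipy.fftpack import dct, idct

# Example luminance quantization table from Annex K of the JPEG standard.
Q50 = np.array([
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99]], dtype=np.float64)

def dct2(block):
    # 2-D type-II DCT, applied separably along rows and columns.
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

def idct2(coeffs):
    return idct(idct(coeffs, axis=0, norm='ortho'), axis=1, norm='ortho')

def quantize_roundtrip(block, scale=1.0):
    # Encode and decode one 8x8 pixel block. A larger `scale` means
    # coarser quantization: more of the high-frequency coefficients
    # round to zero, which is exactly the "filtering" described above.
    coeffs = dct2(block.astype(np.float64) - 128.0)   # level shift, then DCT
    quantized = np.round(coeffs / (Q50 * scale))      # the lossy step
    return idct2(quantized * (Q50 * scale)) + 128.0   # dequantize and invert
[/code]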


Erik Lund

  • Global Moderator
  • **
  • Posts: 6529
  • Copenhagen
    • ErikLund.com
Thank you!
I recall investigating this a long time ago, when I learned that it was possible to recover the edges of the image from the first Nikon D1 files.
The firmware would crop the image to leave out any artifacts or potential faults...  :o
Erik Lund

golunvolo

  • NG Supporter
  • **
  • Posts: 7150
  • You ARE NikonGear
This is very interesting reading. Thank you!

Ilkka Nissilä

  • NG Member
  • *
  • Posts: 1712
  • You ARE NikonGear
Cameralabs' test shows pretty amazing sharpness for a lens of this type.

https://www.cameralabs.com/nikon-z-180-600mm-f5-6-6-3-vr-review/

I hope it's not only a really good sample but representative of the typical performance of the lens.

ChengZhou Hong

  • NG Member
  • *
  • Posts: 11
  • You ARE NikonGear
Quote from: MEPER
You can say that the "core" of the JPEG algorithm is that, via the discrete cosine transform, you go from the pixel domain to the frequency domain, and higher compression rates filter out more and more of the high frequencies in the image.
A 100%-quality JPEG is almost lossless; to the eye I think it is lossless. But it has to be a photographic image, not graphics.

This seems to be relatively recent information... Very interesting, thank you for sharing!
Personally, I had wanted to know more about digital imaging technology, but as I am not computer-inclined, I had given up.



Quote from: Ilkka Nissilä
Cameralabs' test shows pretty amazing sharpness for a lens of this type.

https://www.cameralabs.com/nikon-z-180-600mm-f5-6-6-3-vr-review/

I hope it's not only a really good sample but representative of the typical performance of the lens.

Ah, glad to see a review of this lens popping up already! Some of the recent super-telephoto zooms like this one (and the 150-600mm lenses from Sigma and Tamron) almost make me regret purchasing an AI-S 300mm f/2.8 with the TC-301... For the time being I have no wish to "upgrade" to mirrorless, but this lens would be one of the temptations to do so. I suppose I'll enjoy the old tanker for the time being...

Eddie Draaisma

  • NG Member
  • *
  • Posts: 419
A follow-up to the first report from Steve Perry with some comparisons to other Nikkors and the Sony 200-600:


https://www.youtube.com/watch?v=7loeXXUP4Ic