Author Topic: Z9 Release Thread  (Read 54759 times)

chambeshi

  • Guest
Re: Z9 Release Thread
« Reply #120 on: November 01, 2021, 17:19:17 »
Nikon applied for an interesting patent in mid-2021, commented on at asobinet.com. Thom Hogan has commented a couple of times over the past year or more on a noticeable increase in Nikon patents covering sensor technology.

https://asobinet.com/info-patent-nikon-global-shutter-for-af/

This is intriguing, considering the timing of submission (March 2021): https://www.j-platpat.inpit.go.jp/s0100
JP,2021-100287,A

https://www.zsystemuser.com/nikon-z-system-news-and/did-nikon-just-provide-a-z9.html

https://www.fredmiranda.com/forum/topic/1707731/0#15639102

Akira

  • Homo jezoensis
  • NG Supporter
  • **
  • Posts: 12550
  • Tokyo, Japan
Re: Z9 Release Thread
« Reply #121 on: November 01, 2021, 19:25:35 »
Thank you all for the kind comments.  I'm glad if my quick-and-dirty translation was of any help.

Akira! What should it mean: unbalanced performance caused by the unbalanced level of development of the technology for each function? Does it mean underdevelopment of some parts, boards, or functions? Poor assembly of the final product? Is the camera made of raw, poorly engineered parts? Would you be so kind as to explain what on Earth he means by that? Or is "underdeveloping" just my English? Thanks in advance!  LZ

  It may mean different levels of development in sensor read-out, viewfinder, data streaming, autofocus AI, and especially card-writing specs. He can explain it much better.

   Akira, thanks a lot for taking the time to share it!

Paco, regarding your interpretation of my rough translation: you understood me correctly.

LZ, for example, the frame rate of the EVF, according to the initial review by DPReview TV, seems to be kept lower than that of the higher-end models from the competitors.  I guess it is also a kind of compromise for the limited processing power of the Expeed 7, and that might be criticized by sports shooters.  The engineers would have wanted it to be much faster, say 120fps, if the processing power had allowed.

Also, some may detect the slight remaining jello effect inevitably caused by the rolling shutter.
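
For a rough sense of scale of that effect, here is a back-of-the-envelope Python sketch; the subject speed and read-out times below are purely illustrative numbers, not measured Z9 figures:

```python
def rolling_shutter_skew_px(subject_speed_px_per_s: float, readout_time_s: float) -> float:
    """Horizontal skew, in pixels, between the first and last sensor row:
    while the rows are read out one after another, the subject keeps moving."""
    return subject_speed_px_per_s * readout_time_s

# Purely illustrative numbers: a subject crossing the frame at 4000 px/s,
# with a slow (~1/30 s) vs. a fast stacked-sensor (~1/250 s) read-out.
print(round(rolling_shutter_skew_px(4000, 1 / 30)))    # ~133 px of skew
print(round(rolling_shutter_skew_px(4000, 1 / 250)))   # ~16 px: far less visible, but not zero
```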
"The eye is blind if the mind is absent." - Confucius

"Limitation is inspiration." - Akira

Akira

  • Homo jezoensis
  • NG Supporter
  • **
  • Posts: 12550
  • Tokyo, Japan
Re: Z9 Release Thread
« Reply #122 on: November 01, 2021, 19:32:38 »
Nikon applied for an interesting patent in mid-2021, commented on at asobinet.com. Thom Hogan has commented a couple of times over the past year or more on a noticeable increase in Nikon patents covering sensor technology.

https://asobinet.com/info-patent-nikon-global-shutter-for-af/

This is intriguing, considering the timing of submission (March 2021): https://www.j-platpat.inpit.go.jp/s0100
JP,2021-100287,A

https://www.zsystemuser.com/nikon-z-system-news-and/did-nikon-just-provide-a-z9.html

https://www.fredmiranda.com/forum/topic/1707731/0#15639102

Woody, thank you for sharing this interesting patent.  I hadn't been able to find the reason why a global shutter hasn't been employed in still cameras despite the essential advantage of the absence of the jello effect.  This is the first explanation I have ever come across.

To realize the patented technology, an even more advanced image processor aided by AI and deep-learning technology would be required.
"The eye is blind if the mind is absent." - Confucius

"Limitation is inspiration." - Akira

Jack Dahlgren

  • NG Member
  • *
  • Posts: 1528
  • You ARE NikonGear
Re: Z9 Release Thread
« Reply #123 on: November 01, 2021, 20:06:44 »
Woody, thank you for sharing this interesting patent.  I hadn't been able to find the reason why a global shutter hasn't been employed in still cameras despite the essential advantage of the absence of the jello effect.  This is the first explanation I have ever come across.

To realize the patented technology, an even more advanced image processor aided by AI and deep-learning technology would be required.

It seems like very large, high-speed global shutters may never actually happen, because rolling shutters, as has been the case for about 100 years, will continue to improve until they are “good enough”.

Akira

  • Homo jezoensis
  • NG Supporter
  • **
  • Posts: 12550
  • Tokyo, Japan
Re: Z9 Release Thread
« Reply #124 on: November 01, 2021, 20:24:07 »
It seems like very large, high-speed global shutters may never actually happen, because rolling shutters, as has been the case for about 100 years, will continue to improve until they are “good enough”.

Possibly, just like the Bayer-filter sensor.  It was initially criticized for false coloring and moiré artifacts, but now it seems to be considered good enough.

The patent Woody linked may rather be a strategic one, meant to keep rivals from employing a technology that could potentially beat their own cameras badly.
"The eye is blind if the mind is absent." - Confucius

"Limitation is inspiration." - Akira

chambeshi

  • Guest
Re: Z9 Release Thread
« Reply #125 on: November 01, 2021, 20:39:59 »
Very interesting discussion about the Z9 sensor / shutter, and more. The sensor patent [JP,2021-100287,A] emphasizes an independent AF function. As I understand the example of the flying raptor in the technical explanation of the Z9 sensor scan rates, the Nikon representative says the very fast AF scan rate can still track a challenging subject while shooting at 120 fps.
Nikon Canada: https://www.youtube.com/watch?v=cmfnT62zaVM

This is one aspect of the detailed explanation by Chris Ogonek, a Nikon Canada tech expert.

Autofocus scanning - 37:00, culminating in a series of 400-plus images of a flying raptor in C120 fps mode. The live audience then chooses a frame by random number. All of these images are in sharp focus!
47:00 > Dual dedicated data channels from the CPU: 1. to the sensor and 2. to the EVF. This is a new industry standard. No other MILCs do it this way, apparently.

[Interesting information overall, including Michelle Valberg's report that the 500 PF works well for her wildlife subjects with the TC2 on the Z6 II and now the Z9. In the live comments Brad Hill confirms the same for the Z6 II]

golunvolo

  • NG Supporter
  • **
  • Posts: 6776
  • You ARE NikonGear
Re: Z9 Release Thread
« Reply #126 on: November 01, 2021, 21:37:36 »
Very interesting discussion about the Z9 sensor / shutter, and more. The sensor patent [JP,2021-100287,A] emphasizes an independent AF function. As I understand the example of the flying raptor in the technical explanation of the Z9 sensor scan rates, the Nikon representative says the very fast AF scan rate can still track a challenging subject while shooting at 120 fps.
Nikon Canada: https://www.youtube.com/watch?v=cmfnT62zaVM

This is one aspect of the detailed explanation by Chris Ogonek, a Nikon Canada tech expert.

Autofocus scanning - 37:00, culminating in a series of 400-plus images of a flying raptor in C120 fps mode. The live audience then chooses a frame by random number. All of these images are in sharp focus!
47:00 > Dual dedicated data channels from the CPU: 1. to the sensor and 2. to the EVF. This is a new industry standard. No other MILCs do it this way, apparently.

[Interesting information overall, including Michelle Valberg's report that the 500 PF works well for her wildlife subjects with the TC2 on the Z6 II and now the Z9. In the live comments Brad Hill confirms the same for the Z6 II]

49'30" > you can have a slight blackout, lines around the viewfinder, the shutter sound, or sound only in the headphones! Even though I'll probably try the lines around the viewfinder, the headphones option sounds like fun  :)

chambeshi

  • Guest
Re: Z9 Release Thread
« Reply #127 on: November 03, 2021, 10:58:27 »
I was up at the tender hour of 01:30 this morning - GMT+1 - to listen to a Zoom discussion about the Z9 by Thom Hogan and Mark Comon (Paul's Photo, Torrance, CA).

Much information was covered, with some useful insights new to me, so it was well worth the nocturnal event. A particular new feature of the Nikon Z autofocus system stands out, and we are likely to learn much more about its importance in the wild. TH reiterated at several points in the discussion why the Z9 is a 'D1 moment'. He also underscored an aspect of the Z9 autofocus which, IME (having screened a lot of material over the past few days), no one else had mentioned.

The AI subject recognition is built into 3D Tracking as a hierarchy that culminates in drilling down to the smallest object - namely the eye. From the preliminary evidence of images and EVF footage, it appears that if 3D Tracking cannot find eyes it reverts to heads, then bodies/torsos... It is always tracking shapes when this mode is activated by the photographer. As TH concluded, the deeper details of how Nikon has actually implemented this hierarchical shape recognition are not at all clear.

It is very clear this AI-empowered 3D object recognition works across a diverse range of active subjects: humans, animals (including fishes), birds and vehicles!

Vehicle tracking similarly works by recognizing a hierarchy of vehicles > cockpits > front-of-vehicle, down to the smallest high-contrast objects, the headlights.
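
To make the fallback TH describes concrete, here is a minimal, purely hypothetical Python sketch; the class, labels and confidence values are my own illustration of the idea, not Nikon's implementation:

```python
from dataclasses import dataclass
from typing import Optional, Sequence, Tuple

Box = Tuple[int, int, int, int]  # x, y, width, height in frame pixels

@dataclass
class Detection:
    label: str          # "eye", "head", "body", ... (or "headlight", "cockpit", "vehicle")
    box: Box
    confidence: float

def pick_af_target(detections: Sequence[Detection],
                   hierarchy: Sequence[str] = ("eye", "head", "body"),
                   min_confidence: float = 0.5) -> Optional[Detection]:
    """Return the most specific detection available, falling back down the hierarchy."""
    for label in hierarchy:                                  # most specific level first
        candidates = [d for d in detections
                      if d.label == label and d.confidence >= min_confidence]
        if candidates:
            return max(candidates, key=lambda d: d.confidence)
    return None                                              # nothing recognized: plain shape/area tracking takes over

# Example: an eye is visible, so it wins over the head and body boxes.
frame = [Detection("body", (400, 300, 900, 700), 0.93),
         Detection("head", (700, 320, 250, 240), 0.88),
         Detection("eye",  (760, 380,  40,  40), 0.71)]
print(pick_af_target(frame).label)                                   # -> "eye"
print(pick_af_target([d for d in frame if d.label != "eye"]).label)  # -> "head" (fallback)
# Vehicles would simply use their own list, e.g. hierarchy=("headlight", "cockpit", "vehicle").
```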

One point of interest flagged in this discussion is among the first Z9 feedback from Michelle Valberg; she says the 100-400mm f/4.5-5.6 S gives excellent quality with the Z TC-2.0x at f/11, using 3D Tracking + eye detection on the Z9. Some of these examples also include the 500mm f/5.6E PF + TC-20E III, which is no less encouraging (reiterated in her recent video https://www.youtube.com/watch?v=StM5lcT52jI). She also raves about the scope, speed and reliability of 3D Tracking + eye detection on a range of mammals and birds, and also amphibians.

She will probably have much more to say about these aspects this weekend. Part of the discussion day is online (and free to registrants):
https://creativephotoacademy.com/event/revealing-the-soul-within-with-michelle-valberg-1/2021-11-07/

FYI, a section of this can be watched for free, and she may present information additional to her video.

Jack Dahlgren

  • NG Member
  • *
  • Posts: 1528
  • You ARE NikonGear
Re: Z9 Release Thread
« Reply #128 on: November 03, 2021, 15:59:43 »
The AI subject recognition is built into 3D Tracking as a hierarchy that culminates in drilling down to the smallest object - namely the eye. From the preliminary evidence of images and EVF footage, it appears that if 3D Tracking cannot find eyes it reverts to heads, then bodies/torsos... It is always tracking shapes when this mode is activated by the photographer. As TH concluded, the deeper details of how Nikon has actually implemented this hierarchical shape recognition are not at all clear.

It is very clear this AI-empowered 3D object recognition works across a diverse range of active subjects: humans, animals (including fishes), birds and vehicles!

Vehicle tracking similarly works by recognizing a hierarchy of vehicles > cockpits > front-of-vehicle, down to the smallest high-contrast objects, the headlights.

The object focus is probably run as a two-step process. First the entire image is likely scaled down (this is probably done on the image that is presented to the viewfinder, as it is downsampled already), then an image segmentation routine is run to separate/identify the objects in the scene.
Here is an overview of what segmentation is and how it works.
https://catalog.ngc.nvidia.com/orgs/nvidia/collections/imagesegmentation
Once the objects are detected, I'd guess a second pass is made within those areas of interest to see if an eye can be detected within that region of the image. If so, the sub-region pixels are used to adjust focus further.
They may be doing it differently. There are one-shot methods available, but from the videos they have shown with bounding boxes it seems like this is probably the way it is being done, since they have a continuous stream of information, which means they can operate on differences or changes, adding stability to the recognition. One-shot detection may jump around if the scene changes.
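
As a sanity check of that two-pass idea, here is a small, self-contained Python sketch; the two stand-in functions just return hard-coded values where a real segmentation network and a real eye detector (e.g. from the NGC collection linked above) would go:

```python
import numpy as np

def downscale(frame: np.ndarray, factor: int = 4) -> np.ndarray:
    """Cheap downsample by striding - roughly what a viewfinder-resolution feed already gives you."""
    return frame[::factor, ::factor]

def segment_subjects(small_frame: np.ndarray) -> list:
    """Stand-in for the coarse segmentation/detection pass; returns subject boxes
    (x, y, w, h) in downscaled coordinates. Here: one hard-coded, purely illustrative box."""
    return [(20, 10, 40, 30)]

def find_eye(crop: np.ndarray):
    """Stand-in for the fine second pass over one full-resolution subject crop.
    Returns a (row, col) eye position inside the crop, or None. Hard-coded here."""
    return (8, 12)

def af_points(frame: np.ndarray, factor: int = 4) -> list:
    """Two-pass sketch: coarse boxes on the small image, then an eye search in the
    corresponding full-resolution crops; fall back to the subject centre if no eye is found."""
    targets = []
    for (x, y, w, h) in segment_subjects(downscale(frame, factor)):
        # Map the coarse box back to full-resolution coordinates and crop it out.
        crop = frame[y * factor:(y + h) * factor, x * factor:(x + w) * factor]
        eye = find_eye(crop)
        if eye is not None:
            row, col = eye
            targets.append((y * factor + row, x * factor + col))      # drive AF at the eye
        else:
            targets.append((y * factor + h * factor // 2, x * factor + w * factor // 2))  # subject centre
    return targets

print(af_points(np.zeros((480, 640))))   # -> [(48, 92)] with the hard-coded stand-ins
```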

If you have image segmentation you can start to understand object movement - for example you can tell if it is moving laterally frame to frame or getting bigger because it is coming closer.
If you know what you are tracking, it is easier to figure out how to focus on it. I expect that this capability will be used to improve metering in the future, if it is not being used already.
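
And a tiny illustration of that point: with two consecutive bounding boxes for the same subject (coordinates normalized to the frame; the thresholds below are arbitrary), you can already make a rough guess about its motion.

```python
def classify_motion(prev_box, cur_box, scale_tol=0.05, shift_tol=0.02):
    """Compare one subject's (x, y, w, h) box in two consecutive frames and return a
    rough label: 'approaching', 'receding', 'lateral', or 'static'."""
    px, py, pw, ph = prev_box
    cx, cy, cw, ch = cur_box
    area_ratio = (cw * ch) / (pw * ph)          # growing box => subject is getting closer
    centre_shift = abs((cx + cw / 2) - (px + pw / 2)) + abs((cy + ch / 2) - (py + ph / 2))
    if area_ratio > 1 + scale_tol:
        return "approaching"                    # bias the AF prediction toward shorter distances
    if area_ratio < 1 - scale_tol:
        return "receding"
    if centre_shift > shift_tol:
        return "lateral"                        # hold distance, follow the subject across the frame
    return "static"

print(classify_motion((0.40, 0.40, 0.10, 0.10), (0.38, 0.40, 0.12, 0.12)))  # -> "approaching"
```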

This is the real reason that this is a D1 moment. Once you have a camera which can really recognize what you are shooting, it can begin to act as an assistant. In this case the smart focus is your own personal focus puller. It will be a long time before it replaces craft services though...

Next steps I can imagine are using things like pose estimation to enable maintaining focus on other parts of the body (sometimes we want to focus on a hand) or on other objects.
An example of how object pose estimation works is here:
https://docs.nvidia.com/isaac/isaac/packages/object_pose_estimation/doc/pose_cnn_decoder.html
Here is how it plays out with humans
https://youtu.be/BZId1M-uehA
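
As a purely hypothetical sketch of that idea: given the keypoints a 2D pose model outputs (the names below follow the common COCO convention; the coordinates and confidences are made up), picking a hand instead of an eye is just a different preference list.

```python
# Hypothetical output of a 2D pose-estimation model for one person:
# name -> (x, y, confidence), with x/y normalized to the frame.
keypoints = {
    "nose":        (0.52, 0.30, 0.95),
    "left_eye":    (0.50, 0.28, 0.93),
    "right_eye":   (0.54, 0.28, 0.10),   # occluded in this frame
    "left_wrist":  (0.35, 0.55, 0.88),
    "right_wrist": (0.70, 0.60, 0.84),
}

def focus_point(kps, preference, min_confidence=0.5):
    """Pick the first sufficiently confident keypoint in the photographer's preference order."""
    for name in preference:
        if name in kps:
            x, y, conf = kps[name]
            if conf >= min_confidence:
                return name, (x, y)
    return None

# "Focus on a hand if one is visible, otherwise fall back to eyes, then the nose."
print(focus_point(keypoints, ("left_wrist", "right_wrist", "left_eye", "right_eye", "nose")))
# -> ('left_wrist', (0.35, 0.55))
```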

chambeshi

  • Guest
Re: Z9 Release Thread
« Reply #129 on: November 03, 2021, 19:08:16 »
Excellent explanation, with authoritative links [the added bold emphasis below is mine]

We continue to live in the most interesting of times....

thank you very much  :) :)

The object focus is probably run as a two-step process. First the entire image is likely scaled down (this is probably done on the image that is presented to the viewfinder, as it is downsampled already), then an image segmentation routine is run to separate/identify the objects in the scene.
Here is an overview of what segmentation is and how it works.
https://catalog.ngc.nvidia.com/orgs/nvidia/collections/imagesegmentation
Once the objects are detected, I'd guess a second pass is made within those areas of interest to see if an eye can be detected within that region of the image. If so, the sub-region pixels are used to adjust focus further.
They may be doing it differently. There are one-shot methods available, but from the videos they have shown with bounding boxes it seems like this is probably the way it is being done, since they have a continuous stream of information, which means they can operate on differences or changes, adding stability to the recognition. One-shot detection may jump around if the scene changes.

If you have image segmentation you can start to understand object movement - for example you can tell if it is moving laterally frame to frame or getting bigger because it is coming closer.
If you know what you are tracking, it is easier to figure out how to focus on it. I expect that this capability will be used to improve metering in the future, if it is not being used already.

This is the real reason that this is a D1 moment. Once you have a camera which can really recognize what you are shooting, it can begin to act as an assistant. In this case the smart focus is your own personal focus puller. It will be a long time before it replaces craft services though...

Next steps I can imagine are using things like pose estimation to enable maintaining focus on other parts of the body (sometimes we want to focus on a hand) or on other objects.
An example of how object pose estimation works is here:
https://docs.nvidia.com/isaac/isaac/packages/object_pose_estimation/doc/pose_cnn_decoder.html
Here is how it plays out with humans
https://youtu.be/BZId1M-uehA

chambeshi

  • Guest
Re: Z9 Release Thread
« Reply #130 on: November 03, 2021, 19:09:36 »
Video of yesterday's discussion is now publicly available - free, thanks to Paul's Photo, CA

https://www.youtube.com/watch?v=R5CO7S2XiTU

I was up at the tender hour of 01:30 this morning - GMT+1 - to listen to a Zoom discussion about the Z9 by Thom Hogan and Mark Comon (Paul's Photo, Torrance, CA).

Much information was covered, with some useful insights new to me, so it was well worth the nocturnal event. A particular new feature of the Nikon Z autofocus system stands out, and we are likely to learn much more about its importance in the wild. TH reiterated at several points in the discussion why the Z9 is a 'D1 moment'. He also underscored an aspect of the Z9 autofocus which, IME (having screened a lot of material over the past few days), no one else had mentioned.

The AI subject recognition is built into 3D Tracking as a hierarchy that culminates in drilling down to the smallest object - namely the eye. From the preliminary evidence of images and EVF footage, it appears that if 3D Tracking cannot find eyes it reverts to heads, then bodies/torsos... It is always tracking shapes when this mode is activated by the photographer. As TH concluded, the deeper details of how Nikon has actually implemented this hierarchical shape recognition are not at all clear.

It is very clear this AI-empowered 3D object recognition works across a diverse range of active subjects: humans, animals (including fishes), birds and vehicles!

Vehicle tracking similarly works by recognizing a hierarchy of vehicles > cockpits > front-of-vehicle, down to the smallest high-contrast objects, the headlights.

One point of interest flagged in this discussion is among the first Z9 feedback from Michelle Valberg; she says the 100-400mm f/4.5-5.6 S gives excellent quality with the Z TC-2.0x at f/11, using 3D Tracking + eye detection on the Z9. Some of these examples also include the 500mm f/5.6E PF + TC-20E III, which is no less encouraging (reiterated in her recent video https://www.youtube.com/watch?v=StM5lcT52jI). She also raves about the scope, speed and reliability of 3D Tracking + eye detection on a range of mammals and birds, and also amphibians.

She will probably have much more to say about these aspects this weekend. Part of the discussion day is online (and free to registrants):
https://creativephotoacademy.com/event/revealing-the-soul-within-with-michelle-valberg-1/2021-11-07/

FYI, a section of this can be watched for free, and she may present information additional to her video.

Jack Dahlgren

  • NG Member
  • *
  • Posts: 1528
  • You ARE NikonGear
Re: Z9 Release Thread
« Reply #131 on: November 03, 2021, 20:10:52 »
Excellent explanation, with authoritative links [the added bold emphasis below is mine]

We continue to live in the most interesting of times....

thank you very much  :) :)

Thanks. I'm still a manual focuser for the most part, but I am excited to see these advancements as my own ability to focus precisely heads in the wrong direction. These are truly interesting times.

fish_shooter

  • NG Member
  • *
  • Posts: 95
  • You ARE NikonGear
    • Salmonography.com
Re: Z9 Release Thread
« Reply #132 on: November 04, 2021, 00:05:55 »
Hogan mentioned a number in the hundreds that could indeed refer to these picture elements, and said that this number was greater than in the current cameras. One thing not pointed out is that all these animals are vertebrates and thus have a common body plan. It will be interesting when it comes to invertebrates - how well will the AF work with them? Many have bilateral symmetry with an anterior head and a pair of eyes, so these may be OK with animal AF. Another thing will be fake eyes, such as a large spot (one on each side) found on some fishes...

Jack Dahlgren

  • NG Member
  • *
  • Posts: 1528
  • You ARE NikonGear
Re: Z9 Release Thread
« Reply #133 on: November 04, 2021, 02:14:06 »
Hogan mentioned a number in the hundreds that could indeed refer to these picture elements, and said that this number was greater than in the current cameras. One thing not pointed out is that all these animals are vertebrates and thus have a common body plan. It will be interesting when it comes to invertebrates - how well will the AF work with them? Many have bilateral symmetry with an anterior head and a pair of eyes, so these may be OK with animal AF. Another thing will be fake eyes, such as a large spot (one on each side) found on some fishes...

Neural networks are weird. I don't think they have a concept of a body plan, but instead develop a set of "features" which are tested to be useful in distinguishing between the things they see. Sometimes these are recognizable by humans, but some are just strange shapes or gradients. The beauty of this approach is that you don't need to know what those features are; the machine figures it out. Training the model can take a lot of time and data, but once you have the trained model you can apply it to multiple situations, or modify it to handle additional classes, etc. One of the links I gave earlier was to our NGC page where pre-trained models are available as a starting point for people to build applications on top of. I'm constantly amazed by how fast things are evolving and how people find better and better techniques to make machine learning faster and more accurate.
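
As a concrete (desktop-side, not in-camera) illustration of starting from a pre-trained model, here is a minimal Python sketch using torchvision's off-the-shelf COCO detector as a stand-in; the NGC models linked above are used in the same spirit, though their exact APIs differ:

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Load a COCO-pretrained detector (older torchvision versions use pretrained=True instead).
model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = torch.rand(3, 480, 640)          # dummy RGB image tensor, values in [0, 1]
with torch.no_grad():
    result = model([image])[0]           # dict with 'boxes', 'labels' and 'scores'

# Keep only confident detections; in a camera-style pipeline these boxes would seed
# the region-of-interest passes discussed earlier in the thread.
keep = result["scores"] > 0.7
print(result["boxes"][keep])
print(result["labels"][keep])
```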

Øivind Tøien

  • NG Supporter
  • **
  • Posts: 1736
  • Fairbanks, Alaska
Re: Z9 Release Thread
« Reply #134 on: November 04, 2021, 04:11:05 »

I did not see anyone referring to this one: Morten Hilmer, BIRD PHOTOGRAPHY | NIKON Z9, on Svalbard
https://www.youtube.com/watch?v=yjRBEw-NLlI&ab_channel=MortenHilmer
Apparently there will be more coming in the same series.
Øivind Tøien