It is interesting to note that for the first 100 years or so of cinema production, there was no way to preview the image being recorded. The big silent-running studio production cameras didn't even have reflex finders until the end of the 1960s! The cinematographer had to trust that the focus marks and T-stop scales were dead accurate, and that, to the greatest extent possible, the lens would deliver an image unspoiled by optical defects.
When digital cinema hit big in the 2000s, lens companies rushed to create entirely new lines of lenses, because digital sensors could "see" many of the aberrations in existing lenses, chiefly chromatic aberrations, that had been far easier to ignore when film was the recording medium.
Then, toward the 2010s, the old, aberration-rich lenses were rediscovered, seemingly as a reaction against the super-clean "perfect engineering" look that lens companies always seem to be pushing.
'New and Improved' gave way to 'Old and Not Improved'.
I think that if the audience is busy watching the bokeh or the chromatic aberration, the movie must not have a very good story!