There is some arbitrariness to the ISO ratings for digital cameras. Still, a lower sensitivity can only be achieved by either losing data or increasing the full-well capacity. That's why going below base ISO reduces the dynamic range.
Imagine that you have set your exposure such that at base ISO the brightest pixels are just fully saturated. Now you would like to double your exposure time (say, to blur something). You are now sending twice the light to the sensor, and the brightest pixels receive double the number of photons needed to saturate them. By choosing one half of base ISO you scale the signal down, so the fully saturated wells are now output at a raw level one stop below saturation. Doing the same RAW conversion as with the first exposure at base ISO, you get an image with the same overall brightness. But since you clipped the top stop of light (all saturated pixels are mapped to one value, irrespective of the actual amount of light that hit them), you have lost one stop of information.
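A small numerical sketch of this (made-up raw levels, with saturation normalized to 1.0, not modeled on any real camera):

```python
import numpy as np

# Hypothetical sensor: full-well saturation at raw level 1.0.
saturation = 1.0

# Three pixels in the base-ISO exposure, relative to saturation:
# a midtone, a pixel exactly at clipping, and one just below it.
base_exposure = np.array([0.25, 1.0, 0.9])

# Doubling the exposure time doubles the photon count...
doubled = 2 * base_exposure          # [0.5, 2.0, 1.8]

# ...but the wells clip at saturation regardless of the ISO setting.
captured = np.minimum(doubled, saturation)   # [0.5, 1.0, 1.0]

# Half base ISO then scales the signal down one stop before output.
raw_out = captured / 2               # [0.25, 0.5, 0.5]

# The two bright pixels, which differed by ~0.15 stop of real light,
# now share the same raw value: the top stop of information is gone.
print(raw_out)
```

The key point is in the `np.minimum` line: the clipping happens in the wells before the half-ISO scaling, so no amount of downscaling can recover the lost highlight detail.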
So going below the base ISO is not really recommended unless you either don't care about the clipped highlights or you are well under saturation at base ISO. But in the latter case you could just try to expose to the right at base ISO and lower the brightness during RAW conversion, which should give equal or better results.
A lower base ISO correlates with a larger dynamic range, all else being equal. For example, going from the D800 with base ISO 100 to the D810 with base ISO 64, we measure a 0.43-stop increase in DR at their respective base ISOs, which is slightly less than the 0.64-stop difference between ISO 100 and ISO 64.
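Where the 0.64 figure comes from: the difference in stops between two ISO settings is the base-2 logarithm of their ratio.

```python
from math import log2

# Stop difference between two ISO settings = log2 of their ratio.
iso_stops = log2(100 / 64)
print(f"{iso_stops:.2f} stops")  # 0.64 stops
```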
Since there are ways to increase DR without decreasing base ISO, not all cameras with, say, a base ISO of 100 have the same DR. This is obvious if we contemplate the multiple-stop increases in DR over the last decade while base ISO has at most decreased from ISO 200 to ISO 100 or 64. The lower noise floor of newer sensors is responsible for that feat.
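To make that concrete, here is a highly simplified "engineering" DR model: stops between full-well capacity and the read-noise floor. The numbers are invented for illustration, not measurements of any real camera, and the model ignores shot noise and downstream electronics.

```python
from math import log2

def engineering_dr_stops(full_well_e, read_noise_e):
    """Simplified engineering dynamic range in stops: the ratio of
    full-well capacity to the read-noise floor (both in electrons)."""
    return log2(full_well_e / read_noise_e)

# Same full well (hence roughly the same base ISO), lower noise floor:
old_sensor = engineering_dr_stops(full_well_e=40_000, read_noise_e=10)
new_sensor = engineering_dr_stops(full_well_e=40_000, read_noise_e=3)
print(f"{old_sensor:.1f} vs {new_sensor:.1f} stops")
```

In this toy model the newer sensor gains about 1.7 stops of DR purely from the lower read noise, with no change at the saturation end at all.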
Hopefully, some day we will get sensors which do not have a saturation limit. You could theoretically count the number of times saturation has occurred for each pixel, the limit being the speed at which you can read out data from individual pixels. I believe I once read about a patent for such a technology. Given such a sensor, the biggest challenge would still be making this dynamic range visible to the eye in some way.
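The counting idea above can be sketched as follows. This is purely conceptual, with invented numbers: the pixel resets whenever the well fills, a counter records the resets, and the effective signal is reconstructed from the count plus the residual charge.

```python
def read_counting_pixel(photons, full_well):
    """Conceptual 'count the saturations' pixel: each time the well
    fills it resets and a counter increments. The readout is the
    reset count plus the residual charge left in the well."""
    resets, residual = divmod(photons, full_well)
    return resets, residual

# A pixel hit by far more photons than its well can hold:
resets, residual = read_counting_pixel(photons=250_000, full_well=40_000)
total = resets * 40_000 + residual
print(resets, residual, total)  # 6 10000 250000
```

The reconstructed `total` equals the true photon count, so in principle nothing clips; the practical limit is how fast the counter can be read out and reset, as noted above.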