There is so much innovation these days that it has transcended the separation-by-OS that used to handily signal where and what kind of changes you could expect. As an example, if you're looking for an experimental graphical terminal emulator, it turns out you can use it on Windows and OS X but not on Linux. The point is that it isn't tied to one particular OS, and being cross-platform is even considered a virtue now.
There's so much new tech out there, and it all happens across a huge variety of platforms. So trying out new tech is just a matter of picking an area to focus on (for example: system software, graphics software, hardware support, kernel-level new stuff, software in embedded systems, hardware sensors, etc.) and then deciding what resources you need to dive in at that specific level: which OS or OSes would be best, what packages you should install, and so on.
Going back to your examples, 3D/VR desktop work has been going on since the 80s at least, and AI before that, and "drastically better performance" has always been on people's minds. The GUI mashups even ring a bell, though everything is so scriptable these days that anyone doing a GUI mashup would probably be less frustrated just typing it into a reusable script. These aren't new topics; they change incrementally over time, and the only advice I can give is to make sure you are _really_ looking at the high-end tech you think you are. If you are frustrated with a slow system, did it cost less than $10K? Because that's commodity-level pricing. If you are frustrated with the 3D effects you just enabled on your desktop, did you really research the state of the art? And so on.
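
To make that scripting point concrete, here's a minimal sketch of what a reusable "GUI mashup" script could look like: launch two apps and tile them side by side instead of dragging windows around by hand. This assumes a Linux desktop with wmctrl installed and uses Python's subprocess module; the app commands, window titles, and screen geometry are placeholders for your own setup, not anything from the original post.

    #!/usr/bin/env python3
    # Hypothetical "reusable GUI mashup" script: launch two apps and tile
    # them side by side instead of dragging windows around by hand.
    # Assumes a Linux desktop with wmctrl on the PATH; the commands, window
    # titles, and geometry below are placeholders for your own setup.
    import subprocess
    import time

    APPS = [
        # (launch command, window-title substring for wmctrl, x offset)
        (["gnome-calculator"], "Calculator", 0),
        (["gedit"], "gedit", 960),
    ]

    for cmd, _title, _x in APPS:
        subprocess.Popen(cmd)      # fire and forget; each app owns its window

    time.sleep(3)                  # crude wait for the windows to show up

    for _cmd, title, x in APPS:
        # wmctrl -r <title> -e gravity,x,y,width,height (move/resize by title match)
        subprocess.run(["wmctrl", "-r", title, "-e", f"0,{x},0,960,1080"],
                       check=False)

Run it once and you get the same layout every time, which is the whole appeal over redoing the mashup by hand.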
Also, just to nitpick: you say Ubuntu is dumbed down in its "default configuration", but Windows and OS X are dumbed down by default too, aren't they? That's why you have package managers, Ninite, the App Store, etc. Restore your purchases or download a set of things and you're out of the dumbzone.