Yes, research computing has continued to move more and more into datacenters and cloud providers. That includes access to GPUs, including CUDA-capable ones, for performing computations on that hardware. The desktop or laptop is largely a dumb terminal, and there is no computer Apple or any other manufacturer can build that would slow that trend. You’re right that scientific visualization is one category where there’s still some value in having some “oomph” in the computer in front of you.
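For anyone who hasn’t seen it, the sort of CUDA work being pushed off to those datacenter GPUs looks roughly like this toy SAXPY kernel (just a sketch, not tied to any particular research code):

    #include <cstdio>
    #include <cuda_runtime.h>

    // SAXPY: y[i] = a * x[i] + y[i], one thread per element
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        float *x, *y;
        // Unified memory keeps the example short; real HPC code usually
        // manages host/device copies explicitly.
        cudaMallocManaged(&x, n * sizeof(float));
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f\n", y[0]);  // expect 4.0
        cudaFree(x);
        cudaFree(y);
        return 0;
    }

Nothing about that cares whether the machine in front of you is a Mac or a PC once the job is running on a remote node.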
Given all that, the threat to Macs in this sector is Windows. Windows Subsystem for Linux (WSL) keeps getting better and is probably a better choice than macOS for people who want the same tools they find on their servers; macOS tends to ship dated and/or different versions (BSD rather than GNU) of open source software, while WSL lets you install software from standard Linux package managers. Obviously, lots of people still prefer macOS to Windows for everything outside the terminal window.
I know Nvidia blames Apple for the poor (and, since Mojave, non-existent) support for their cards on Macs, but Nvidia is responsible for making the drivers for their own hardware. I think they decided there wasn’t enough value in putting in the work to write drivers for Apple’s APIs, particularly the “Metal” API. I saw a mention on an Nvidia forum that maybe the upcoming DriverKit could make a difference, but Apple says the framework is “to create drivers for USB, Serial, NIC, and HID devices”; video cards aren’t on that list.