Just wondering how progress is going on eGPU docks? I want to upgrade my GPU at some point and thought it might be cool to put it in a dock and connect it to a laptop. The laptop has a couple of Thunderbolt ports.
I have extensively used an eGPU (Razer Core X) with an Nvidia RTX 3050 for gaming under Wayland. Using X11 gave me nothing but problems, but Wayland allows for full hotplug capabilities (as long as no monitors are ever connected to the GPU).
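For what it's worth, the Thunderbolt authorization side is handled by bolt on most distros. A quick sketch of checking and trusting the enclosure, assuming boltctl is installed (the UUID below is a placeholder; copy the real one from the list output):

    # show Thunderbolt devices and whether they're authorized
    boltctl list

    # permanently enroll the enclosure so it's trusted on every plug-in
    # (placeholder UUID; use the one from boltctl list)
    boltctl enroll d0030000-0070-8f8e-a3c5-d10d2e2a1234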
Of course, performance is fairly bad with the official Nvidia drivers + Wayland, but it's good enough to play The Outer Worlds and a few other single player games, which is good enough for me! I have been entirely unable to get external monitors to work with the Nvidia driver (any help would be much appreciated), although they did work (coldplug) with the Nouveau driver.
When I was using Windows, I was able to hotplug/unplug the eGPU with monitors attached, effectively turning the GPU into an external docking station--I am closely following driver improvements, as this would be great to have on Linux to get around the 2-monitor limitation of the Intel iGPU.
I'm using a Surface Laptop Studio with EndeavourOS (basically Arch, so I have all the latest packages). The performance issues stem from Nvidia's drivers, so AMD shouldn't suffer from the same problems, although I don't have any AMD cards to test whether hotplug with monitors attached works.
I used one with Fedora for a while. The problem I had was that whenever it randomly disconnected, Fedora couldn't handle it gracefully. It would lock up the system and require a hard reboot. Windows has been a bit more graceful about things. I'm hoping the next generation, or maybe OCuLink, will be better.
Can confirm, I'm using a dock (from Razer) daily without problems. Hot switching doesn't work, though; you need to restart X/your display manager to connect or disconnect the eGPU. I'd recommend the gswitch utility to configure which graphics card is used (on X11). I haven't tested much on Wayland, but I know that at least Gnome (Wayland only) has trouble mixing the eGPU and the internal display, if that matters to you.
I don't think hot switching is an issue. It would stay set up and not be disconnected unless I'm traveling. Does it use the eGPU for everything when it's connected? Or can you set it up like hybrid graphics, where it only uses it for games etc.?
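For the hybrid-style case, PRIME render offload usually covers it: everything renders on the iGPU by default and you opt individual apps onto the eGPU. A minimal sketch, assuming the proprietary Nvidia driver (on AMD/Nouveau the equivalent is DRI_PRIME=1):

    # render a single app on the Nvidia eGPU; everything else stays on the iGPU
    __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"

    # same idea as a Steam launch option:
    # __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia %command%

Whether offload to an eGPU behaves well over Thunderbolt is a separate question, but that's the knob for per-app use.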
If you use X and need to restart it, you can probably preemptively use xpra to proxy your X clients and move them to the new X server, except maybe those that need low latency or DRM (e.g. games).
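A minimal sketch of that idea, assuming xpra is installed (the display number and xterm are just placeholders):

    # run clients against an xpra proxy display instead of the real X server
    xpra start :100 --start-child=xterm

    # attach the proxy to whatever X server is currently running
    xpra attach :100

    # after restarting X for the eGPU, re-attach and the clients come back
    xpra attach :100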
It worked... OK. The lack of a USB dock really hurt the "desktop and laptop in one" concept I was shooting for. I had to plug/unplug three things to get into "desktop mode", which was a hassle for how often I switched between modes. It ran things like Valheim really well but utterly failed at FPS games like Apex (<15 FPS, horrible stuttering, totally unplayable).
If you already have a laptop, a GPU, a desk, and a decent monitor, and you typically play low-requirement games and just want to play them on high settings -- then by all means, it'll be great for that! It can also make sense if you play around with CUDA and need a compatible GPU on a budget.
That being said, don't convince yourself that you'll get full use out of something like a 4070. If that's what you want then, as of now, a desktop is almost certainly your best option.
For sure. It's something I've considered for a while simply because I don't need that extra heat/noise created by the GPU when I'm only doing my day job.
I went down this path, but mini-ITX NUCs with a GPU slot seemed better as long as you're not using the eGPU on multiple devices; if you are, then it might be worth considering just making a PC the host and running Sunshine/Moonlight. While I haven't tried connecting to my host from the Steam Deck, I have from my laptop, and it felt like it could be used for gaming.
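If anyone wants to try the streaming route, here's a rough sketch using moonlight-embedded's CLI (the host IP and app name are placeholders; Sunshine runs on the host PC, and moonlight-qt works just as well if you'd rather have a GUI):

    # one-time pairing with the Sunshine host (enter the PIN it prints)
    moonlight pair 192.168.1.50

    # stream a specific app/session from the host
    moonlight stream -app Steam 192.168.1.50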
I have an AKiTiO Node Titan eGPU enclosure with a GTX 1070 hooked up to an Ubuntu 22.04 laptop and it's working pretty well. I'm doing PCI passthrough to an Arch Linux VM, since my company mandated that all Linux users must use Ubuntu. To stave off comments about this, I'll say that it's not just that I dislike Ubuntu. They're requiring me to lock down so much stuff that I can't do my job. Plus, the endpoint security sensor on the host plays absolute hell with anything that uses heavy multiprocessing. The GPU (with external monitors), second NVMe drive, mouse, keyboard, audio interface, microphone, webcam, 30 gigs of RAM, and 11 CPU cores are passed to the VM, and the host OS gets the laptop GPU + monitor and my continuing disdain.
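For anyone curious what the passthrough looks like on the libvirt side, each passed-through device is roughly one <hostdev> entry. A minimal sketch (the PCI address and domain name are placeholders; take the real address from lspci on the host):

    # hypothetical example: hand the eGPU at 0000:05:00.0 to the guest
    cat > egpu-hostdev.xml <<'EOF'
    <hostdev mode='subsystem' type='pci' managed='yes'>
      <source>
        <address domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
      </source>
    </hostdev>
    EOF
    virsh attach-device my-arch-vm egpu-hostdev.xml --persistent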
I've been using this setup for a month. My experience thus far has been positive. I start the computer up with or without the GPU connected, connect the GPU if I haven't yet, launch my VM via libvirt, and things just work. I really thought I'd have more problems with the GPU, but the USB passthrough stuff has been the truly problematic part (I can't just pass the whole PCI USB controller for IOMMU reasons). It's important to note that the GPU displays directly to external monitors. I think it's possible to like, send the data back to your laptop screen? But I really didn't want that.
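On the IOMMU point, the usual first step is listing the groups to see exactly what shares a group with the USB controller, something like this sketch:

    # print every device by IOMMU group; anything sharing a group with the
    # USB controller generally has to be passed through (or unbound) together
    for d in /sys/kernel/iommu_groups/*/devices/*; do
        g=${d%/devices/*}; g=${g##*/}
        printf 'IOMMU group %s: ' "$g"
        lspci -nns "${d##*/}"
    done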
(As an aside, the security people at my company have no problems with VMs lol. They know what I've done and they don't seem to care).