I don't really disagree, but I think that was the original intent of the meme: to show Crowder as a complete chode by having him assert really stupid, deeply unpopular ideas.
The meme's use has become too soft on Crowder lately, though, I think.
I've noticed lately that many memes' origins are worse than I'd guessed from the context they're used in. Racist, homophobic, and dishonest people aren't something I usually accept as entertainment, but they sneak their way unnoticed into my (non-news) feed through memes. I guess most people don't know a meme's origins and just use it according to the meaning they formed on their own. Other memes, like the distracted boyfriend meme, are meaningless stock photos, so I understand why many people use memes without thinking about where they came from.
Anyway, thanks for pointing out who the person in the picture actually is.
I'm tired of people taking sides like companies give a shit about us. I wouldn't be surprised to see five comments saying something like "you shouldn't buy Nvidia AMD is open source" or "you should sell your card and get an amd card."
I'd say whatever you have is fine; it's better for the environment if you keep it longer anyway. There are so many people who parrot things without giving much thought to an individual's situation or the complexity of a company's behavior. Every company's job is to maximize profit while minimizing loss.
Basically, if everyone blindly chose AMD over Nvidia, the roles would flip: AMD would start doing the things Nvidia does now to maintain dominance, increase profit, and reduce cost, and Nvidia would start trying to win market share back from AMD by opening up, becoming more consumer-friendly, and pricing competitively.
For individuals, selling your old card and buying a new AMD card for the same price will generally net you a slower card, and if you go used there's a good chance it doesn't work properly and the seller ghosts you. I should know; I tried to get a used AMD card and it died every time I ran a GPU-intensive game.
I also went the other way, upgrading my mother's Nvidia card to a new AMD card that cost three times what her Nvidia card (~$50) would go for on eBay, and it runs a bit slower than her Nvidia card did. She was happy about the upgrade anyway, because I put that Nvidia card in her movie server, where it handles live video transcoding better than a cheap AMD card would.
Not a criticism of you, but a little fun fact about him for others: he has a bunch of friends who "aren't" Nazis but who call themselves, or have friends who like to call themselves, stuff like "race realists".
What distro are you using? Been looking for an excuse to strain my 6900XT.
I started looking at getting it running on Void and it seemed like (at the time) there were a lot of specific version dependencies that made it awkward.
I suspect the right answer is to spin up a container, but I resent Docker's licensing BS too much for that. Surely by now there'd be a purpose-built live image: write it to a flash drive, reboot, and boom, anime vampire princess hot girls.
I set mine up on Arch. There's an AUR package, but it didn't work for me.
After some failed attempts, I ended up having success following this guide.
Some parts are out of date, though, so if it fails to install something you'll need to point it at a newer available package. The main example is in the webui-user.sh file: the guide tells you to replace an existing line with export TORCH_COMMAND="pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/rocm5.1.1". That will fail because that version of PyTorch is no longer available, so you instead need to swap the download URL for an up-to-date one from the PyTorch website. They've also slightly changed the layout of the file; right now the correct edit is to find the # install command for torch line and change the command under it to:
pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm5.7
You may also need to swap pip for pip3 if you get a pip error. Overall it takes some troubleshooting: look at any errors you get and see whether it's asking for a package you don't have, or anything like that.
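For reference, here's roughly what that part of webui-user.sh should look like once edited (just a sketch; the commented-out defaults around it and the exact ROCm version in the URL will differ depending on your webui version and card):

# install command for torch
export TORCH_COMMAND="pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm5.7"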
I can confirm that it works just fine for me. In my case I'm on Arch Linux (btw) with a 7900XTX, but it needed a few tweaks:
- Having xformers installed at all would sometimes break startup of stable-diffusion depending on the fork
- I had an internal and an external GPU, so I wanted to set HIP_VISIBLE_DEVICES so that it only sees the correct one
- I had to update torch/torchvision and set HSA_OVERRIDE_GFX_VERSION (rough sketch of the environment variables below)
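To make the last two concrete, this is roughly what gets exported before launching webui.sh (a sketch, not a universal recipe: the device index assumes the discrete card shows up as GPU 0, and the 11.0.0 override assumes an RDNA3 card like the 7900XTX; other cards need other values):

# only expose the discrete GPU to ROCm/HIP (the index depends on your machine)
export HIP_VISIBLE_DEVICES=0
# report the card as gfx 11.0.0 (RDNA3) so the prebuilt ROCm kernels match
export HSA_OVERRIDE_GFX_VERSION=11.0.0
./webui.sh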
Earlier in my career, I compiled TensorFlow with CUDA/cuDNN (NVIDIA) in one container, and compiled it with ROCm (AMD) in another container on another machine, for cancerous tissue detection in computer vision tasks. GPU acceleration when training the model was significantly more performant with the NVIDIA libraries.
It's not like you can't train deep neural networks without NVIDIA, but their deep learning libraries combined with tensor cores in Turing-era GPUs and later make things much faster.
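These days you don't even have to compile it yourself to see that split; as a rough sketch (the image names and tags here are just what I'd try first, check Docker Hub for current ones), the prebuilt containers expose the same CUDA-vs-ROCm difference:

# NVIDIA build (CUDA/cuDNN); needs the NVIDIA Container Toolkit on the host
docker run --rm --gpus all tensorflow/tensorflow:latest-gpu \
  python3 -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"

# AMD build (ROCm); passes through the kernel driver and DRM devices instead
# (some distros also need --group-add video for device access)
docker run --rm --device=/dev/kfd --device=/dev/dri rocm/tensorflow:latest \
  python3 -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"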
Not gonna lie, raytracing is cooler on older games than it is on newer ones. Newer games use a lot of smoke and mirrors to approximate what raytracing does, which means raytracing isn't as obvious an upgrade, or can even be a downgrade depending on the scene. Older games don't have as much smoke and mirrors, so raytracing can offer more of an improvement.
Also, stylized games with raytracing are 10/10. Idk why, but applying RTX to highly stylized games always looks way cooler than on games with realistic graphics.
Playing old games with ray tracing is just as amazing as playing new games with ray tracing. I know Quake RT gets too dark to play halfway through; they should have added light sources in those areas.
Then again, I played through Cyberpunk 2077 at 27fps before the 2.0 update. Control was pretty good at 50fps, and I couldn't recommend Portal enough at about 40fps on my 2070 Super. I don't know if Teardown leveraged RT cores, but Digital Foundry said it ran better on Nvidia, and I played through that game at 70fps.
I love playing with new technologies. I wish graphics card prices stayed down, because RT is too heavy nowadays for my first-gen RT card. I play newer games with RT off and most settings turned down because of it.
I wish they'd stayed down because VR has the potential to bring back CrossFire/SLI. Nvidia's GameWorks already has support for using two GPUs to render different eyes, and supposedly, when properly implemented, it results in a nearly 2x increase in fps. However, GPUs are way too expensive right now for people to buy two of them, so afaik there aren't any VR games that support splitting rendering between two GPUs.
VR games could be a hell of a lot cooler if having 2 GPUs was widely affordable and developers developed for them, but instead it's being held back by single-gpu performance.
I'm holding off on building a new gaming rig until AMD sorts out better ray-tracing and CUDA support. I'm playing on a Deck now, so I have plenty of time to work through my old backlog.
My experience with the Deck outside of CS2 has been nothing short of mind-boggling. I don't even REALLY have a problem with CS2 but I cannot play online for VAC reasons I can't sort out. I have a ticket open with Steam Support. 🤷
It's not my problem either. I don't give a shit what Nvidia or AMD does, I just want to be able to run AI stuff on my rig in as open-source a manner as possible.
Last I heard, AMD is working on getting CUDA working on their GPUs, and I saw a post saying it was pretty complete by now (although I don't really keep up with that sort of stuff).
NVIDIA finally being the whole bitch, it seems; not unexpected when it comes to tech monopolies.
In the words of our lord and savior Linus Torvalds "NVIDIA, fuck you! 🖕", amen.
In all reality, a lot of individuals aren't gonna care when it comes to EULA BS unless they absolutely depend on it, and this whole move has me wanting an AMD GPU even more.
That sucks if it's true. It also, ironically, wouldn't be the first time AMD got denied something after they already had an implementation ready for it, lol.
Man, I just built a new rig last November and went with Nvidia specifically to run some niche scientific computing software that only targets CUDA. It took a bit of effort to get it to play nice, but it at least runs pretty well. Unfortunately, now I'm trying to update to KDE 6 and play games, and boy howdy are there graphics glitches. I really wish HPC academics would ditch CUDA for GPU acceleration, and maybe ifort + MKL while they're at it.
Yes. Haha. Amusing. But really... Blender, DaVinci Resolve, and a host of others. It's not a hobby; it's quite literally a (albeit small) portion of my income.
And it has nothing to do with anime nudes.
Anime/waifu is literally for pedos wanting a loophole. Sorry, not sorry.