Is Orca that resource intensive? I'm running it in a container with KasmVNC and have never really checked the resource usage. Admittedly it's on one of my local servers in another room. I guess it depends on how large your projects are, too.
That's how I got a free netbook. The netbook had 32GB of flash, with Windows and Office occupying 27+GB. Then Windows wanted to do an update, with an 8+GB file. Spot the problem. And Windows can get quite annoying with updates. As the netbook could not be expanded, and attempts to redirect the update to a USB stick did not work, a newer netbook was bought, and I got the old one. Linux plus LibreOffice plus a bunch of extras happily sat in 4GB...
Gives a lot of space for running virtual machines.
Also, browsers can chew that up fast if you have a lot of tabs; Firefox has managed to do it a few times. At least until I started limiting its RAM to 8GB (best decision ever).
(To use it with other apps like Chrome or Electron apps, just replace the command at the end, and the startup class, with the ones from the program you'd like to run. Icon and Name changes are optional but might be desirable so you remember which app it is for.)
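(For anyone who missed the original example, a minimal sketch of such a launcher, assuming systemd; the 8G figure and the names are just illustrative:)

```
# ~/.local/share/applications/firefox-limited.desktop
[Desktop Entry]
Type=Application
Name=Firefox (8G limit)
Icon=firefox
# The command to swap out for other apps:
Exec=systemd-run --user --scope -p MemoryMax=8G firefox %u
# ...and the startup class:
StartupWMClass=firefox
```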
Something I didn't consider when answering earlier is that even if Firefox did have good RAM usage limiting built in, I probably still wouldn't use it or recommend it, because one of Firefox's biggest problems is that it leaks. Memory leaks will not be negated by Firefox's built-in RAM limiter, but they will be by systemd's (or anything else you might be using instead). Firefox would still crash in the event of a leak, but that's still better than it taking GNOME or other apps with it, or freezing your system entirely.
That's good to know, though I don't know how well it would work. I feel like I enabled something about closing background tabs to reduce memory load (it might have been what you said, it might have been something else; I don't really remember), and it helped a little, but Firefox still ended up chewing up a lot of memory.
Setting the limit, though, did help immediately and stopped the overconsumption problems. Occasionally a couple of tabs crash here and there, but it doesn't freeze or, worse, cause other apps to slow down and freeze, which did happen before.
It might be harder for them but there are similar tools that they could use to limit it. One I've seen people use is firejail, a tool designed for sandboxing processes and applications.
I've never tried it myself though, so I can't attest to how well it works, either for this purpose or for sandboxing in general.
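(If anyone wants to try it, a minimal sketch per the firejail man page; note that --rlimit-as caps each process's virtual address space rather than total RAM across all of Firefox's processes, so it behaves a bit differently from a cgroup limit:)

```
# Cap address space at 8 GiB (value is in bytes: 8 * 1024^3)
firejail --rlimit-as=8589934592 firefox
```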
Does it kill Firefox if it tries to go over the limit? I think I tried this once, and if there's a memory leak it just closes itself (which is better than hogging the whole system, but still).
No, it just limits the amount of RAM that Firefox (or whatever other application you launch with these parameters) will see.
A few Firefox tabs may crash occasionally as a side effect. And obviously, if Firefox eats up all of the 8GB it's allocated, it may crash itself, though usually it doesn't; tabs will crash before the browser does.
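(Mechanically, MemoryMax is a hard cgroup limit: when the whole group hits it, the kernel's OOM killer picks a victim inside the cgroup, and that's usually a big content process, i.e. a tab, rather than the main process. A one-off test run, no launcher needed:)

```
# Launch Firefox in a transient scope capped at 8G
systemd-run --user --scope -p MemoryMax=8G firefox

# In another terminal: per-cgroup memory usage, ordered by memory
systemd-cgtop -m
```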
Tiny Core Linux is a minimal Linux kernel based operating system focusing on providing a base system using BusyBox and FLTK. It was developed by Robert Shingledecker, who was previously the lead developer of Damn Small Linux.
Hm? Do you mean a link to builds that are this small? My midrange Intel i5-12600K (I'm a working man, doc...) has an L3 cache of 20,971,520 bytes. My Linux Mint (basically Ubuntu kernel) vmlinuz right now is only 14,952,840 bytes. Sure, that's a compressed kernel image, not uncompressed, but consider that this is a generic kernel built to run most desktop applications very comfortably and with wide hardware support. It's not too hard to imagine fitting an uncompressed kernel into the same amount of space. Does that help to show they're roughly on the same order of magnitude?
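(If anyone wants to check the same numbers on their own box, something like this works; paths assume a typical Mint/Ubuntu install:)

```
# Size of the compressed kernel image, in bytes
stat -c %s /boot/vmlinuz-$(uname -r)

# L3 cache size as reported by the CPU
lscpu | grep 'L3'
```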
My ARM board from 2010 has 256MB of memory. It runs an old 3.1 kernel (not attached to the internet); newer kernels won't fit/load. But on that I have OpenMediaVault running Samba shares and minidlna to serve music. It isn't even using 50% of the 256MB.
I mean, some games (cough cough Factorio cough cough) manage to use up about 25GB of RAM on my system, so it's nice to have a buffer. Now, my 64GB may be considered a bit overkill, but I call it future-proofing.
Can't relate, just upgraded my laptop from 32GB to 64GB since VS Code would keep closing due to OOM. What? Oh, no, it's not VS Code's fault... I keep like 5 Firefox windows with 30+ tabs open, like a fucking maniac... Close them? What do you mean "close" them?
I had around 1500 open tabs in Firefox. It was fine. I figured enough was enough and closed them all. Now I close all tabs at the end of the day before shutting down.
When I started hitting OOMs I just downloaded free ram.
(Modifying my zram-generator config to use 1.5x my RAM size instead of the measly 4GB (uncompressed) default, that is. Seriously, it's worth looking into, though the default depends on your distro.)
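(For reference, on distros that ship zram-generator the whole change is a couple of lines; file location and expression syntax per its docs, but double-check against your distro's defaults:)

```
# /etc/systemd/zram-generator.conf
[zram0]
# Uncompressed size of the zram device: 1.5x physical RAM
zram-size = ram * 1.5
# zstd is a common choice; this line is optional
compression-algorithm = zstd
```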
No need to convince me. I will always believe people complaining about garbage electron apps.
That being said, I use vscodium myself and actually like it. Does not mean I won't complain tho
Firefox puts inactive tabs to sleep, effectively turning them into bookmarks that reload when you switch back to them. I regularly right-click close-tabs-to-the-right over 200 tabs.
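(If your Firefox doesn't seem to be doing this, the behaviour is controlled by a pref, and about:unloads lets you unload tabs by hand. Pref name as of recent releases, but these do move around between versions:)

```
// In about:config, or via a user.js sketch:
user_pref("browser.tabs.unloadOnLowMemory", true);
```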
I think there's still something wrong with your setup....
You should be able to have as many Firefox windows and tabs as you'd like without using too much RAM, since they should be "suspended".
I regularly have hundreds of tabs running fine, on 32GB of RAM.
Most likely it's a vscode extension that's leaking memory, and this problem will still happen after your upgrade; it'll just take longer.
Your use case is obviously different, but I've gone years between system upgrades. I mostly do OSS coding, or work stuff; not gaming. The only case I can imagine needing to upgrade my little Ryzen with 16 cores - a laptop CPU - is if it becomes absolutely imperative that I run AI models on my desktop. Or if Rust really does become pervasive; compiling Rust programs is almost as bad as compiling Haskell, and will take over my computer for minutes at a time.
When I got this little micro, the first thing I did was upgrade it to 64GB of RAM, because that's the one thing I think you can never have too much of; especially with the modern web and all the shit that brings with it; Electron apps, and so on, absolutely chew up memory. The one good thing about the Rust trend is better memory use, so the crappy compile times are somewhat forgivable.
Somewhere around 2017 I bought an old Dell Precision from 2011 for $25, put a Radeon RX 570 in it a few years later, and used it as my main computer until last year, when I finally got around to building a replacement.
My case was purely that I had upgraded the GPU in my classic Mac Pro and thought an SFX PC build could be done with the old GPU plus a power supply and mobo.
It started out with a cheap mobo to hold only an old i7 from an iMac that was parted out, 8GB of RAM (2x4GB sticks I had spare), and the Vega 56.
I found it such a capable system that the only issue was RAM: when I forgot about the dozen tabs open in a browser, the game I'd just launched would hang the system. Before I would 'waste' money on the max 16GB this board could hold, I started collecting the parts for its current setup: an A520I board, a Ryzen 5 5600X, 64GB, an NVMe SSD, and the GPUs I've swapped between it and the cMP, so now it's an RX 5700 XT.
Use is purely as a spare; I don't want a Windows machine, and I've got the Mac as a server/media machine, so it's all-purpose and games on the Linux box.
Although I have got dual-boot capability set up on both, just because I could; maybe something really offside would need W10. One example: VCDS car diagnostic software, which doesn't support anything but Windows.
At work I regularly kiss 32GB with everything open and a VM. When I got my latest machine I made sure to get 64, so I think I'll be good for a while. 32 gigs lasted me from 2017 to 2024. And if I need more, this machine takes 2 SODIMMs, so I can install at least 96 gigs.
Fixing an SSAO bug where indices overflowed the 32-bit int on the GPU, I had to use 64GB. Since then I have never needed more than 32GB, and at home 24 is way more than I need.
Well, I just remembered: I actually did need more once, for a fftv bug (same story, 32-bit overflow), but I borrowed a 192GB PC for that.
Am I the only one who still has no problems with 8GB? Not that I wouldn't be happy with more, but I can't remember the last time I've even thought about RAM usage.
At my last WFH job my daily setup was Firefox, Sublime Text, Slack (an Electron app), GitHub Desktop (also Electron), and 3 terminals, one running a local dev server. It all ran fine.
This; every time the uBlock Origin absolutists insist that everyone must use Firefox or die, I just wonder if they never open more than one or two tabs anyway. Hell, a sufficiently complex web app running in a single tab can make FF choke.
It was also supposed to be an all-in-one recording/streaming computer for university events, and they had to use the budget for something. It ended up being used as a proxmox host for a while, then it was handed off to me. Now the most resource-intensive thing it runs is a Windows 11 VM that I ~~torture mercilessly~~ use for experiments. It rarely gets to 10% memory utilization.
True story. I remember back in the bad old days when Firefox had notorious memory leaks, so when building my latest PC, I put in 32GB. The monitor app on my desktop has only ever topped out at showing 30% of memory allocated.
And hitting high memory pressure is really not fun on Linux (on Fedora at least); it simply locks up, slows to a crawl, and does nothing for minutes until the OOM killer finally kills the bad program. I've kind of solved this by installing a better OOM killer on my laptop, but my desktop was easy: buy 32GB of additional RAM for like $90. Problem solved.
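(The comment doesn't say which OOM killer, so as a hedged suggestion rather than a report of what they used: earlyoom is a popular choice and is in the Fedora repos; it kills the biggest offender before the system starts thrashing:)

```
sudo dnf install earlyoom
sudo systemctl enable --now earlyoom
```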
I like to have a 50GB+ swap file. Though Fedora is a bit weird with swap, as by default it's stored in RAM (yes, extra space for RAM is stored in RAM; I... admit I don't understand the detail).
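(The "RAM stored in RAM" thing is zram: Fedora's default swap is a compressed block device backed by RAM, so swapped-out pages get compressed and the same physical memory effectively holds more. You can inspect it with:)

```
# List active swap devices; stock Fedora shows /dev/zram0
swapon --show

# Size, compression algorithm, and actual usage of the zram device
zramctl
```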
Hmm, it's been a few years since I ran Fedora, but that experience is also still stuck in my head from that time.
I always figured Linux had just gotten better at that, because I switched to a more up-to-date distro afterwards. But in retrospect, it's not like Fedora is terribly out of date, so maybe that's just a weird configuration on Fedora...
Multiple Firefox windows, at least one JetBrains IDE, and some other apps, and I fill 20-30GB easily. Sometimes on the lower end, sometimes on the higher end.
Y'all need to point me towards one of those tiny Linux systems. I have an old no-longer-bricked Toshiba Satellite that somebody gave me, and I got it to boot again, so I slapped Mint on it to see how I liked it, since I've never messed with that distro before. The only problem is this sucker is a dog: it's only got 2 gigs of RAM and a pokey 5400 RPM platter drive in it. The thing sits there and thrashes swap constantly even when it's doing nothing, and when Mint is creating one of its automated system image rollback things it's completely unusable. I'm surprised the laptop platters don't escape their casing and bore into the Earth like a drill bit.
I found that it will... eventually... load and run the latest FreeCAD build, and once it's going it's actually not bad (awful screen resolution and single-touch-only trackpad notwithstanding). But getting to that point takes about 20 minutes altogether...
Be careful with disabling swap if you don't have a very large amount of RAM, as many apps rely on memory overcommitment and a large virtual address space, and they can behave erratically without swap.
You'd be better off keeping swap enabled and instead setting vm.swappiness = 0 in sysctl.conf.
Swappiness is a value between 0 and 100, where 0 means never swap unless absolutely necessary (only if you completely run out of RAM), and 100 means the kernel swaps very aggressively. Think of it, roughly, as how eager the kernel is to push idle pages out to keep RAM available. The default is usually 60, which is fine for a low-RAM system but swaps way too often on a system with more RAM.
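(Concretely, something like this; the file name under /etc/sysctl.d/ is arbitrary:)

```
# Apply immediately (does not survive reboot)
sudo sysctl vm.swappiness=0

# Persist across reboots
echo 'vm.swappiness = 0' | sudo tee /etc/sysctl.d/99-swappiness.conf
```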
I use BunsenLabs Helium on my old Vaio A-series laptop. I use a 32-bit non-PAE build because it's a Pentium M that might not support PAE (there's a quick way to check; see below). It uses a window manager instead of a desktop environment.
I'd recommend using a 32-bit distro, as they tend to take up a little less RAM.
Also, I'm on a 4200 RPM PATA HDD. It has 2GB of DDR RAM; it's slightly too old for DDR2, which is unfortunate.
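(The quick PAE check mentioned above: if the flag shows up in /proc/cpuinfo, the CPU advertises PAE. Note that many Pentium Ms support PAE without setting the flag, which is what the forcepae boot option exists for:)

```
# Prints "pae" once if the CPU advertises PAE support
grep -wom1 pae /proc/cpuinfo
```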
You absolutely need swap on a low RAM system. It's the only way the system will actually be usable. You'll hit OOMs (out of memory errors) that take down the whole GUI if you turn off swap on a system with only 2GB RAM. You can only really turn off swap if you have a very large amount of RAM, and even then, it's safer to keep it enabled and set swappiness to 0 instead.
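(Setting one up takes a minute; the size here is illustrative, and on btrfs a swap file needs extra steps:)

```
# Create and enable a 4G swap file (ext4 assumed)
sudo fallocate -l 4G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile

# Make it permanent
echo '/swapfile none swap defaults 0 0' | sudo tee -a /etc/fstab
```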
A long time ago I had Linux on a 3.5" HD floppy disk, with a graphical user interface, Ethernet, and some programs on it. But 64-bit and the increased kernel size probably make this difficult nowadays.