Do you want to supplement your usenet sources? If so, what type of content/language are you looking for? And do you have any proof left from the old days, like screenshots?
These are things that should be answered before anyone would consider giving away an invite, because an invite is also a liability for the giver.
I guess your OPNsense rule from Edit3 is not working because the source is not your mailu instance: connections are initiated from the outside, and mailu only answers (TCP ACK). So you have asymmetric routing.
You may get this working if you set the "reply-to" option to the wg gateway on the firewall rule that allows VPS -> wg -> mailu traffic.
However, there is a much cleaner solution using the PROXY protocol, which mailu seems to support: https://mailu.io/master/reverse.html
They are using traefik, but nginx also supports the PROXY protocol.
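On the VPS side, something like this could work as a starting point (a rough sketch only; the wg-internal IP 10.0.0.2 is a placeholder, and you'd repeat the server block for the other mail ports like 465/587/993):

```
# nginx stream config on the VPS (sketch, IPs are placeholders)
stream {
    server {
        listen 25;                # accept SMTP on the public interface
        proxy_pass 10.0.0.2:25;   # forward through the wg tunnel to mailu
        proxy_protocol on;        # send a PROXY header so mailu sees the real client IP
    }
}
```

If I read the linked page correctly, mailu then has to be told to expect the PROXY header on those ports.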
I ran into the same problem some months ago when my cloud backups stopped being financially viable and I decided to recycle my old drives. For offline backups mergerfs will not work, as far as I understand. Creating tar archives of 130TB+ also doesn't sound like a good option. Some of the tape backup solutions looked like possible options, but are often complex and use special archive formats...
I ended up writing my own solution in Python using JSON state files. It's complete enough to run the backup, but otherwise very much a work in progress with no restore at all, so I do not want to publish it. The rough idea is sketched below, though.
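Heavily simplified (this is not the actual code; all names are made up, and there is no error handling or free-space check):

```python
import json, hashlib, shutil
from pathlib import Path

STATE_FILE = Path("backup_state.json")  # tracks which files already went to which offline disk

def load_state() -> dict:
    return json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}

def file_id(path: Path) -> str:
    # identify files by path + size + mtime instead of hashing 130TB+ on every run
    st = path.stat()
    return hashlib.sha1(f"{path}|{st.st_size}|{st.st_mtime}".encode()).hexdigest()

def backup(src: Path, dst: Path) -> None:
    state = load_state()
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        fid = file_id(f)
        if fid in state:
            continue  # already stored on some offline disk, skip
        target = dst / f.relative_to(src)
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)
        state[fid] = {"path": str(f), "disk": dst.name}
        STATE_FILE.write_text(json.dumps(state, indent=2))  # persist state after each file
```

The real thing additionally needs free-space checks, disk rotation and a restore path, which is exactly the work-in-progress part.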
If you find a suitable solution I am also very interested 😅
If this fits your budget (you still need the actual disks..) it's not a bad choice. Speed should be sufficient for HDDs, as it's USB 3.
As the other poster suggested, don't use its hardware RAID. Use it as a JBOD and configure the RAID in Linux with ZFS or similar.
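For example, with four disks in the enclosure it could look like this (device names are placeholders; for USB enclosures, /dev/disk/by-id/... paths are more robust since sdX names can change between reboots):

```
# create a raidz1 pool across four disks; one disk can fail without data loss
zpool create tank raidz1 /dev/sda /dev/sdb /dev/sdc /dev/sdd
```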
And never forget: RAID is not a backup! You still need to do regular backups, at least for important data.
Honestly, unless you can spend more $, one or two USB disks for the mini PC are probably your only choice.
From your description I would guess that the affected trackers have some rate or connection limits, and your qBittorrent announces are exceeding them. Try setting a higher announce interval, like 1+ hours.
I'm surprised no one mentioned ansible yet. It's meant for this (and more).
By SSH keys I assume you're talking about authorized_keys, not private keys. I agree with the other posters that private keys should not be synced; just generate new ones and add them to the relevant servers' authorized_keys with ansible.
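A minimal sketch of a playbook for that, assuming the ansible.posix collection is installed (user name and key path are placeholders):

```yaml
# distribute a local public key to all hosts in the inventory
- hosts: all
  tasks:
    - name: Add my public key to authorized_keys
      ansible.posix.authorized_key:
        user: myuser                                          # placeholder login user
        state: present
        key: "{{ lookup('file', '~/.ssh/id_ed25519.pub') }}"  # local public key to push
```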
Insurance is privatized with many providers, but the 'base insurance' covers the same things at all providers and is mandatory for all residents. Base insurance prices still vary between providers every year, so people are encouraged to switch providers regularly. That has become a business of its own, with brokers / call centers getting bonuses for every 'sign up', which means a lot of wasted money:
https://www.emolument.com/salary-reports/jobs/insurance-broker/45892
No, it's personal preference.
I'm using Organizr; it embeds your apps in tabs/iframes and lets you configure them in the UI.
same here, lol
I don't think you can force anything in this regard. There is the official forum, reddit and here. People will just naturally ask wherever they are comfortable.
Disclaimer: I haven't had access to an AMT setup yet, but answering based on common sense:
- Like most KVM options, it is most likely LAN-only. Unless you do some port forwarding on your router, it should not be possible to access it from outside your LAN.
- Yes, always factory reset sensitive settings on a second-hand machine first.
It would not be for me, but they just sent me this chat message, which is concerning:
"We are currently seeing unexpected growth across Dropbox Advanced, and as a result are currently only able to grant 1 TB per month per team. We understand this may be frustrating and are working to resolve this for our customers."
Dropbox 8TB weekly upload limit
Info for anyone switching from gdrive to dropbox like me: know that there may be a weekly upload limit of 8TB.
750 GB upload, 5 TB download per day. However, this seems to be another limit, maybe something file-based like a max sharing limit.
You guys are not downloading enough 😅
User statistics
All-time upload: 143.678 TiB
All-time download: 112.403 TiB
All-time share ratio: 1.27
you need to improve your search skills:
It matters only if "the docker host's external IP" your DNS resolves to is a public IP. In that case packets travel to the router, which needs to map/send them back to the docker host's LAN IP (NAT reflection). With CGNAT this would need to be enabled on the carrier side, where you set up the port forwarding. If that's not possible, split-DNS may be an alternative.
If "the docker hosts external IP" is actually your docker hosts LAN IP, all of that is irrelevant. Split-DNS would accomplish that.
Are you hosting behind NAT / at home? If so, you may need to enable NAT reflection on your router.