Pictrs Migration - Between NOW and 1700 UTC, Image Uploads Disabled

Hi all,

We will be attempting to migrate Pict-rs tomorrow starting at 1700 UTC. Between now and 1700, image uploads will be disabled to try and avoid any data loss. You will still be able to like, comment, and post until then. During the upgrade window, Pict-rs will be offline, so viewing images might not work. In theory Cloudflare's CDN cache should cover us during the window, but YMMV. Hopefully everything will go smoothly, but we've done our best to make sure we have backups!
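If you're curious whether a given image is being served from Cloudflare's edge cache rather than our origin, the CF-Cache-Status response header tells you (HIT means the edge answered without touching Pict-rs). A minimal sketch; the image URL is a made-up placeholder, not a real upload:

```python
# Check whether Cloudflare's edge cache is serving an image while
# Pict-rs is offline: a CF-Cache-Status of HIT means the edge answered
# without hitting the origin. The URL below is a placeholder.
from urllib.request import Request, urlopen

url = "https://lemmy.world/pictrs/image/example.webp"  # placeholder
req = Request(url, method="HEAD")
with urlopen(req) as resp:
    print(resp.status, resp.headers.get("CF-Cache-Status"))
```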

You can monitor our site health at https://dash.lemmy.world and https://status.lemmy.world

We will post updates to https://mastodon.world/@LemmyWorld as well.

Cheers, LW Infra Team 💗

Update: This change is done. Images are now hosted on S3 storage and uploads are re-enabled.


Brief outage for Pictrs upgrade today (Oct 2nd, 18:30 CEST) is finished

Edit: OK, so the outage was less brief than expected. The upgrade of the 3 GB pictrs database took over 1 hour, and the version 0.4 database is now 14.5 GB... But anyway, it seems to be working alright now!

We will upgrade pictrs today, from 0.3.1 to 0.4.4. This will enable us to switch to S3 storage later on, and is needed for more anti-CSAM tools. The outage should only affect picture uploads, and should be brief.


Lemmy.world was upgraded to 0.18.3 today (2023-07-30)

Update: The upgrade is done; DB migrations took around 5 minutes. We'll keep an eye out for (new) issues, but for now it seems to be OK.

Original message: We will upgrade lemmy.world to 0.18.3 today at 20:00 UTC+2 (check what this is in your timezone). Expect the site to be down for a few minutes. Edit: I was warned it could be more than a few minutes; the database update might even take 30 minutes or longer.

Release notes for 0.18.3 can be found here: https://github.com/LemmyNet/lemmy/blob/main/RELEASES.md

(This is unrelated to the downtimes we experienced lately; those were caused by attacks that we're still working to mitigate. Sorry about those.)


New try at upgrading to 0.18.1 on July 1st, 20:00 CET

We'll give the upgrade a new try tomorrow. I've had some good input from admins of other instances, who are also gonna help troubleshoot during/after the upgrade.

Also, there are newer RC versions that fix some known issues.

Be aware that if we need to roll back again, posts made between the upgrade and the rollback will be lost.

We're seeing a huge rise in new user signups (duh... it's July 1st), which also stresses the server. Let's hope the improvements in 0.18.1 will help with that too.


Server will be migrated (More power!)

So after we've extended the virtual cloud server twice, we're at the max for the current configuration. And with this crazy growth (almost 12k users!!), even now the server is getting closer and closer to capacity.

Therefore I decided to order a dedicated server. Same one as used for mastodon.world.

So the bad news... we will need some downtime. Hopefully not too much. I will prepare the new server, copy (rsync) everything over, stop Lemmy, do a last rsync, and change the DNS. If all goes well, it should take maybe 10 minutes of downtime, 30 at most. (With mastodon.world it took 20 minutes, mainly because of a typo :-) )
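For those interested in the mechanics, here's a rough sketch of those steps, assuming SSH access to the new machine and a docker compose deployment (host name and paths are placeholders, not our actual setup):

```python
# Rough sketch of the migration: warm rsync while the site is up,
# stop Lemmy, final (fast) rsync of the delta, then DNS switch and
# start on the new host. Host and paths are placeholders.
import subprocess

NEW_HOST = "root@new-server.example"  # placeholder
DATA_DIR = "/srv/lemmy/"              # placeholder

def run(*cmd: str) -> None:
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Warm copy while the site is still up (safe to repeat).
run("rsync", "-aH", "--delete", DATA_DIR, f"{NEW_HOST}:{DATA_DIR}")

# 2. Stop Lemmy so the data stops changing.
run("docker", "compose", "-f", f"{DATA_DIR}docker-compose.yml", "down")

# 3. The final rsync only transfers what changed since step 1, so it's fast.
run("rsync", "-aH", "--delete", DATA_DIR, f"{NEW_HOST}:{DATA_DIR}")

# 4. After pointing DNS at the new machine, start Lemmy there.
run("ssh", NEW_HOST, f"cd {DATA_DIR} && docker compose up -d")
```

The point of the double rsync is that the first pass does the bulk transfer with zero downtime, so the site only has to be down for the small delta and the DNS change.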

For those who would like to donate to cover server costs, you can do so at our OpenCollective or Patreon.

Thanks!

Update: The server was migrated, with around 4 minutes of downtime. For those who asked, it now runs on a dedicated server with an AMD EPYC 7502P 32-core "Rome" CPU and 128 GB RAM. Should be enough for now.

I will be tuning the database a bit, which may add a few extra seconds of downtime; just refresh and it's back. After that, I'll investigate the cause of the slow posting further. Thanks @[email protected] for assisting with that.


Restarting Lemmy.world at 17:00 CET for server upgrade

Apparently in some parts of the world it's already Monday June 12th. I see registrations going up.

This results in the server CPU staying around 70% and even spiking over 90%. So I will upgrade the VPS from 4cpu/16GB to 8cpu/32GB.

Expect a few minutes of downtime at 17:00 CET.


I will briefly restart the server to scale up

I'm not happy with the performance; in particular, clicking 'Post' takes too long.

First I will upgrade this VPS from 2cpu/4GB to 4cpu/16GB.

Then I will tune the database (more shared_buffers to start with; see the sketch below).

After that I'll see what more I can tune, together with the Lemmy admin channel.
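For context, the usual PostgreSQL guidance is to start shared_buffers at roughly 25% of system RAM (the stock default of 128MB is far too small for a box this size). A back-of-the-envelope sketch based on that rule of thumb, not on anything measured on our server:

```python
# Rule-of-thumb starting point for shared_buffers after the resize to
# 16 GB RAM: ~25% of system memory is the common PostgreSQL guidance.
ram_gb = 16
shared_buffers_gb = int(ram_gb * 0.25)  # -> 4

# ALTER SYSTEM persists the setting to postgresql.auto.conf; shared_buffers
# only takes effect after a PostgreSQL restart, hence a brief blip.
print(f"ALTER SYSTEM SET shared_buffers = '{shared_buffers_gb}GB';")
```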
