submitted 7 months ago* (last edited 7 months ago) by Dark_Arc@lemmy.world to c/selfhosted@lemmy.world

My main account is dark_arc@social.packetloss.gg. However, as of roughly ~~24 hours ago~~ (it seems this has actually been going on since March 10th and has gotten worse since), the server has stopped properly retrieving content from lemmy.world.

It's been running smoothly for well over 9 months and (I think) is still working fine for content coming in from other instances. So I'm curious: has anyone else experienced anything strange with lemmy.world federation recently?

Setup Description

The server flow in my case is as follows:

[Public Internet] <-> [Digital Ocean Droplet] <-> [ZeroTier] <-> [Physical Machine in my Basement (HW Info)]

The Digital Ocean droplet is a virtual host that forwards requests via nginx to the physical machine, where a second nginx server (running the standard lemmy nginx config) forwards the request on to the lemmy server software itself.
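For anyone picturing the setup, a rough sketch of the two nginx hops is below. This is illustrative only: the ZeroTier address and port are placeholders, not my actual config, and the basement side is just the standard lemmy nginx config from the docker setup.

    # Droplet nginx (public side): terminate TLS and forward everything over
    # ZeroTier to the basement machine. 10.147.17.2:8080 is a placeholder.
    server {
        listen 443 ssl;
        server_name social.packetloss.gg;

        location / {
            proxy_pass http://10.147.17.2:8080;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto https;
        }
    }

    # Basement nginx: the standard lemmy nginx config then decides whether the
    # request goes to the lemmy backend (port 8536 in the default docker setup)
    # or to lemmy-ui (port 1234).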

Current Status

Lemmy Internal Error

I've found this in my lemmy logs:

2024-03-24T00:42:10.062274Z  WARN lemmy_utils: error in spawn: Unknown: Request limit was reached during fetch
   0: lemmy_apub::objects::community::from_json
             at crates/apub/src/objects/community.rs:126
   1: lemmy_apub::fetcher::user_or_community::from_json
             at crates/apub/src/fetcher/user_or_community.rs:87
   2: lemmy_server::root_span_builder::HTTP request
           with http.method=POST http.scheme="http" http.host=social.packetloss.gg http.target=/inbox otel.kind="server" request_id=688ad030-f892-4925-9ce9-fc4f3070a967
             at src/root_span_builder.rs:16

I'm thinking this could be the cause... though I'm not sure how to raise the limit (it seems to be hard-coded). I opened an issue with the Lemmy devs, but I've since closed it while I gather more information and make sure this is truly an issue with the Lemmy server software.
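My rough mental model of what that error means (this is just an illustrative sketch, not Lemmy's actual code): processing an incoming activity appears to get a fixed budget of outgoing fetches for dereferencing nested objects (the community, its moderators, and so on), and once that budget is spent the whole job fails with "Request limit was reached during fetch".

    // Illustrative Rust sketch only -- NOT Lemmy's actual implementation.
    // It shows the kind of hard-coded fetch budget the error message suggests.
    struct FetchContext {
        remaining: u32,
    }

    impl FetchContext {
        fn new(limit: u32) -> Self {
            Self { remaining: limit }
        }

        // Dereferencing a remote object spends one unit of the budget;
        // exhausting the budget is a hard error for the whole job.
        fn fetch(&mut self, url: &str) -> Result<(), String> {
            if self.remaining == 0 {
                return Err("Request limit was reached during fetch".into());
            }
            self.remaining -= 1;
            println!("fetched {url} ({} left)", self.remaining);
            Ok(())
        }
    }

    fn main() {
        // Hypothetical limit of 3, just to show the failure mode.
        let mut ctx = FetchContext::new(3);
        let urls = [
            "https://lemmy.world/c/selfhosted",
            "https://lemmy.world/u/a",
            "https://lemmy.world/u/b",
            "https://lemmy.world/u/c",
        ];
        for url in urls {
            if let Err(e) = ctx.fetch(url) {
                eprintln!("error: {e}");
            }
        }
    }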

Nginx 408 and 499s

I'm seeing the Digital Ocean nginx server report 499s on various "/inbox" requests, and the nginx instance on the physical machine that talks directly to lemmy report 408s on the same "/inbox" route.

There are some examples in this comment: https://lemmy.world/comment/8728858
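For context on the codes themselves: 499 is nginx's way of logging that the client closed the connection before a response was sent, and 408 means nginx timed out waiting for the request to arrive. If the slow hop is between the droplet and the basement machine, the timeouts I'd be looking at are something like the below; the values are examples, not what I'm actually running.

    # Inner nginx (basement machine), in the server/location block handling /inbox.
    # Example values only; the defaults for these directives are 60s.
    client_body_timeout 60s;     # exceeded -> 408 Request Timeout
    proxy_read_timeout  300s;    # how long to wait on lemmy before giving up
    proxy_send_timeout  300s;

    # Outer nginx (droplet): give the hop to the basement machine more time;
    # if lemmy is slow to answer, the remote instance may hang up first,
    # which is what a 499 in the droplet logs records.
    proxy_connect_timeout 10s;
    proxy_read_timeout    300s;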

seang96@spgrn.com 1 points 7 months ago* (last edited 7 months ago)

How much RAM is postgres using? I'm running on an 11th Gen i7 with 16GB of RAM, so your hardware is fine. This is the resource usage of my setup: the first six entries are Lemmy itself, split up for scaling and redundancy (I have a node down right now; normally there are three of the http ones). The http pods receive federation from other instances, and the lemmy-instance2 pods are the databases, which are also set up to be highly available.

    NAME                          CPU(cores)   MEMORY(bytes)
    lemmy-6b9fbd75f6-4q4n8        1m           5Mi
    lemmy-fed-0                   2m           18Mi
    lemmy-fed-1                   1m           18Mi
    lemmy-fed-2                   6m           17Mi
    lemmy-http-57886f784f-5x9vb   4m           73Mi
    lemmy-http-57886f784f-tpbtb   13m          121Mi
    lemmy-instance2-czc6-0        47m          3362Mi
    lemmy-instance2-jpw7-0        13m          1017Mi

Dark_Arc@lemmy.world 1 points 7 months ago* (last edited 7 months ago)

Not sure what you're using to generate that list; the formatting is a bit difficult to read.

I don't have a cluster, since it's effectively a single-user instance (me plus @Auto_Post_Bot@social.packetloss.gg; in theory a few other people have access, but they're not active). It's a single machine: more or less the out-of-the-box docker setup running on bare metal in my basement, plus a Digital Ocean droplet.

The droplet is what I'm using to get a static IP and avoid dynamic DNS nonsense, and it provides some level of protection against the kind of naive DDoS attack random fediverse servers see (since, in the worst case, I can get on my phone and sever the ZeroTier connection that links the droplet to my basement server).
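(In case anyone's wondering, severing it would just be something like the following on the droplet, with a placeholder network ID:)

    # Placeholder ZeroTier network ID -- drops the droplet off the network,
    # cutting the path from the public internet to the basement machine.
    sudo zerotier-cli leave 8056c2e21c000001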

I'm pretty confident whatever is going on is payload related at this point.

    PID USER      PR  NI    VIRT    RES    SHR S  %CPU  %MEM     TIME+ COMMAND
  50622 70        20   0  330264 240200 201512 S   0.0   0.7   0:25.21 postgres
  50636 70        20   0  327804 239520 201296 S   0.0   0.7   0:26.55 postgres
  50627 70        20   0  327204 239152 201592 S   0.0   0.7   0:24.75 postgres
  50454 70        20   0  328932 238720 200872 S   0.0   0.7   0:26.61 postgres
  50639 70        20   0  313528 217800 193792 S   0.0   0.7   0:03.13 postgres
  50641 70        20   0  313284 217336 194204 S   0.0   0.7   0:03.15 postgres
  50626 70        20   0  313592 216604 193636 S   0.0   0.7   0:05.07 postgres
  50632 70        20   0  313236 216460 193968 S   0.0   0.7   0:04.52 postgres
  50638 70        20   0  310368 216084 193856 S   0.0   0.7   0:04.20 postgres
  50614 70        20   0  310520 216072 193840 S   0.0   0.7   0:02.88 postgres
  50642 70        20   0  312200 215920 194068 S   0.0   0.7   0:04.46 postgres
  50640 70        20   0  312584 215724 193676 S   0.0   0.7   0:03.32 postgres
  50635 70        20   0  309744 215404 193764 S   0.0   0.7   0:02.72 postgres
  50630 70        20   0  312168 215224 193488 S   0.0   0.7   0:02.67 postgres
  50621 70        20   0  309560 215096 193772 S   0.0   0.7   0:02.97 postgres
  50646 70        20   0  309492 215008 193560 S   0.0   0.7   0:04.66 postgres
  50625 70        20   0  309760 215004 193368 S   0.0   0.7   0:03.08 postgres
  50637 70        20   0  309296 214992 193848 S   0.0   0.7   0:02.87 postgres
  50616 70        20   0  310596 214984 192700 S   0.0   0.7   0:04.17 postgres
  50643 70        20   0  310392 214940 194008 S   0.0   0.7   0:04.14 postgres
  50624 70        20   0  310128 214880 192928 S   0.0   0.7   0:04.15 postgres
  50631 70        20   0  310220 214596 192576 S   0.0   0.7   0:02.71 postgres
  50613 70        20   0  309364 213880 192520 S   0.0   0.7   0:04.06 postgres
  50628 70        20   0  309852 213236 191504 S   0.0   0.7   0:03.04 postgres
  50634 70        20   0  187772 163388 149428 S   0.0   0.5   0:02.87 postgres
  50644 70        20   0  189684 162892 148508 S   0.0   0.5   0:04.11 postgres
  50633 70        20   0  186096 162544 149324 S   0.0   0.5   0:03.20 postgres
  50629 70        20   0  185644 162112 149296 S   0.0   0.5   0:04.62 postgres
  50618 70        20   0  186264 160576 147928 S   0.0   0.5   0:04.10 postgres
  50582 70        20   0  185708 160236 147592 S   0.0   0.5   0:04.10 postgres
   3108 70        20   0  172072 144092 142256 S   0.0   0.4   0:04.46 postgres
   3109 70        20   0  172024 142404 140632 S   0.0   0.4   0:02.24 postgres
   2408 70        20   0  171856  23660  22020 S   0.0   0.1   0:00.76 postgres
   3113 70        20   0  173536   9472   7436 S   0.0   0.0   0:00.15 postgres
   3112 70        20   0  171936   8732   7020 S   0.0   0.0   0:01.54 postgres
   3114 70        20   0  173472   5624   3684 S   0.0   0.0   0:00.00 postgres

I've got quite a bit of experience with postgres; I don't see any indication it's the problem.
