this post was submitted on 13 Sep 2025
76 points (88.8% liked)

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


Links are almost always base64-encoded now, and the online URL decoders always produce garbage. Is there a project out there that would let me self-host this type of tool?

I'd probably network this container through gluetun because, yanno, privacy.

Edit to add: It doesn't have to be specifically base64-focused. Any link decoder that I can use in a privacy-respecting way would be welcome.

Edit 2: See if your solution will decode this link (the one in the image): https://link.sfchronicle.com/external/41488169.38548/aHR0cHM6Ly93d3cuaG90ZG9nYmlsbHMuY29tL2hhbWJ1cmdlci1tb2xkcy9idXJnZXItZG9nLW1vbGQ_c2lkPTY4MTNkMTljYzM0ZWJjZTE4NDA1ZGVjYSZzcz1QJnN0X3JpZD1udWxsJnV0bV9zb3VyY2U9bmV3c2xldHRlciZ1dG1fbWVkaXVtPWVtYWlsJnV0bV90ZXJtPWJyaWVmaW5nJnV0bV9jYW1wYWlnbj1zZmNfYml0ZWN1cmlvdXM/6813d19cc34ebce18405decaB7ef84e41 (it should decode to this page: https://www.hotdogbills.com/hamburger-molds)

[–] hendrik@palaver.p3x.de 47 points 22 hours ago (1 children)

There's base64 -d on the command line.
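For anyone who hasn't used it, a quick round trip with plain coreutils (nothing else assumed) looks like this:

```shell
# Encode and decode a string with coreutils base64
printf 'hello' | base64        # prints "aGVsbG8="
printf 'aGVsbG8=' | base64 -d  # prints "hello"
```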

[–] ReedReads@lemmy.zip 4 points 22 hours ago (4 children)

base64 -d

Right, but the / in the URL trips it up, and I'd like to just copy/paste the full URL and have it spit out the proper, decoded link.

[–] ExFed@programming.dev 29 points 21 hours ago* (last edited 21 hours ago) (1 children)

The / characters here aren't part of the base64 payload; in fact, only one segment of the URL looks like base64. No plain base64 tool (whether via CLI, self-hosted, or otherwise) will be able to decode an entire URL like that. You'll first need to parse the URL to isolate the base64 part. That's literally a single line of bash:

echo "https://link.sfchronicle.com/external/41488169.38548/aHR0cHM6Ly93d3cuaG90ZG9nYmlsbHMuY29tL2hhbWJ1cmdlci1tb2xkcy9idXJnZXItZG9nLW1vbGQ_c2lkPTY4MTNkMTljYzM0ZWJjZTE4NDA1ZGVjYSZzcz1QJnN0X3JpZD1udWxsJnV0bV9zb3VyY2U9bmV3c2xldHRlciZ1dG1fbWVkaXVtPWVtYWlsJnV0bV90ZXJtPWJyaWVmaW5nJnV0bV9jYW1wYWlnbj1zZmNfYml0ZWN1cmlvdXM/6813d19cc34ebce18405decaB7ef84e41" | cut -d/ -f6 | base64 -d

See TIO for example.
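One wrinkle: the blob in this link uses the URL-safe base64 alphabet (RFC 4648), where + and / are replaced by - and _, which is why a plain base64 -d can choke on the underscore. A sketch that normalizes the alphabet and restores padding before decoding:

```shell
#!/bin/sh
# Sketch: take the 6th /-separated field, map the URL-safe base64
# alphabet (-, _) back to the standard one (+, /), restore "=" padding,
# then decode. The underscore is part of the payload, not a delimiter.
url='https://link.sfchronicle.com/external/41488169.38548/aHR0cHM6Ly93d3cuaG90ZG9nYmlsbHMuY29tL2hhbWJ1cmdlci1tb2xkcy9idXJnZXItZG9nLW1vbGQ_c2lkPTY4MTNkMTljYzM0ZWJjZTE4NDA1ZGVjYSZzcz1QJnN0X3JpZD1udWxsJnV0bV9zb3VyY2U9bmV3c2xldHRlciZ1dG1fbWVkaXVtPWVtYWlsJnV0bV90ZXJtPWJyaWVmaW5nJnV0bV9jYW1wYWlnbj1zZmNfYml0ZWN1cmlvdXM/6813d19cc34ebce18405decaB7ef84e41'
b64=$(printf '%s' "$url" | cut -d/ -f6 | tr '_-' '/+')
while [ $(( ${#b64} % 4 )) -ne 0 ]; do b64="$b64="; done  # pad to a multiple of 4
printf '%s' "$b64" | base64 -d
```

This prints the full target URL, query string and all.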

edit: add TIO link

[–] ReedReads@lemmy.zip 11 points 21 hours ago (3 children)
  1. Thank you for this.
  2. You know more than I do re: bash. Where can I learn what | cut -d/ -f6 | means? I assume the cut is the parsing part, but maybe that's wrong. I'd love to learn how to learn this.
[–] krnl386@lemmy.ca 8 points 14 hours ago* (last edited 14 hours ago) (1 children)

Try explainshell.com: you can paste in any one-liner and the site will parse it and explain each part.

Here’s the link

[–] Enoril@jlai.lu 3 points 6 hours ago

Really nice! Thanks for sharing this

[–] 30p87@feddit.org 10 points 18 hours ago

cut --help and man cut can teach you more than anyone here.

But: "|" takes the output of the command before it and feeds it in as input to the command after it. It's like copying the output of "echo [...]", running "cut -d '/' -f 6", and pasting the copy in as its input; then copying the output of "cut", running "base64 -d", and pasting again. The pipe ("|") just automates all of that on one line.

And yes, cut takes a string (a list of characters, for example the URL), splits it at whatever -d specifies (e.g. cut -d '/' splits at "/"), so it now internally has a list of strings: "https:", "", "link.sfchronicle.com", "external", "41488169.38548", "aHR0cHM6Ly93d3cuaG90ZG9nYmlsbHMuY29tL2hhbWJ1cmdlci1tb2xkcy9idXJnZXItZG9nLW1vbGQ_c2lkPTY4MTNkMTljYzM0ZWJjZTE4NDA1ZGVjYSZzcz1QJnN0X3JpZD1udWxsJnV0bV9zb3VyY2U9bmV3c2xldHRlciZ1dG1fbWVkaXVtPWVtYWlsJnV0bV90ZXJtPWJyaWVmaW5nJnV0bV9jYW1wYWlnbj1zZmNfYml0ZWN1cmlvdXM" and "6813d19cc34ebce18405decaB7ef84e41". From that list it outputs whatever -f specifies: -f 6 means the 6th of those strings, -f 2-3 means the 2nd through 3rd, -f -5 means everything up to and including the 5th, and -f 3- means everything from the 3rd on.

But all of that is explained better in the manpage (man cut). And the best way to learn is to just fuck around. So echo "t es t str i n g, 1" | cut ... and try various arguments.
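The flag forms described above can be tried on a toy string (nothing from the thread, just a sandbox):

```shell
# -d sets the delimiter, -f picks fields from the split result
printf 'a/b/c/d/e\n' | cut -d/ -f3    # prints "c"
printf 'a/b/c/d/e\n' | cut -d/ -f2-3  # prints "b/c"
printf 'a/b/c/d/e\n' | cut -d/ -f-2   # prints "a/b"
printf 'a/b/c/d/e\n' | cut -d/ -f4-   # prints "d/e"
```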

[–] ccryx@discuss.tchncs.de 1 points 16 hours ago* (last edited 16 hours ago)

You can use man <command> (in this case man cut) to read a program's manual page. Appending --help (without any other arguments) will often print at least a short description of the program and a list of the available options.

[–] hendrik@palaver.p3x.de 5 points 21 hours ago* (last edited 21 hours ago) (1 children)

~~Well, the URL is a bit weird.~~

echo "aHR0cHM6Ly93d3cuaG90ZG9nYmlsbHMuY29tL2hhbWJ1cmdlci1tb2xkcy9idXJnZXItZG9nLW1vbGQ" | base64 -d

gives me "https://www.hotdogbills.com/hamburger-molds/burger-dog-mold". (Without the 's'.) And then there are about 176 characters left. I suppose the underscore is some delimiter. The rest is:

echo "c2lkPTY4MTNkMTljYzM0ZWJjZTE4NDA1ZGVjYSZzcz1QJnN0X3JpZD1udWxsJnV0bV9zb3VyY2U9bmV3c2xldHRlciZ1dG1fbWVkaXVtPWVtYWlsJnV0bV90ZXJtPWJyaWVmaW5nJnV0bV9jYW1wYWlnbj1zZmNfYml0ZWN1cmlvdXM" | base64 -d

"sid=6813d19cc34ebce18405deca&ss=P&st_rid=null&utm_source=newsletter&utm_medium=email&utm_term=briefing&utm_campaign=sfc_bitecurious"

And I suppose the stuff after the last slash is there for some other reason, tracking or some hash or whatever. But the things before that are the URL and the parameters.

But the question remains whether we have some kind of tool to do this automatically and make it a bit easier...

[–] ReedReads@lemmy.zip 3 points 21 hours ago (1 children)

I really appreciate all of the time and effort you spent on this url. You're right, the url is weird, which is why I thought it was a good example.

But the question remains whether we have some kind of tool to do this automatically and make it a bit easier…

But you nailed it with this last sentence. Especially when one is on mobile.

Thanks for replying again.

[–] hendrik@palaver.p3x.de 2 points 20 hours ago* (last edited 20 hours ago)

I know. I guess I mainly wanted to say that the given solution isn't the entire story: the potential tool should decode the parameters as well, since they might or might not be important. I'm often at the computer and regularly do one-off tasks this way, but I'm aware it might not be a one-off task for you, and you might not have a Linux terminal open 24/7 either 😉 Hope some of the other people have what you need. And by the way, since I clicked on a few of the suggestions: I think the thing called URL encoding is something different; that's the one with all the percent signs, not base64 like here.
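To see the difference between the two schemes mentioned here, compare how the same query fragment comes out under each (a toy example; the percent-encoded form is written out by hand):

```shell
# base64 re-expresses the whole byte string in a different alphabet:
printf 'a=1&b=2' | base64   # prints "YT0xJmI9Mg=="
# percent-encoding (URL encoding) only escapes reserved characters:
# 'a=1&b=2' becomes 'a%3D1%26b%3D2'
```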

[–] carl_dungeon@lemmy.world 2 points 22 hours ago

Just put it in quotes?