
This webpage provides instructions for using the acme-dns DNS challenge method with various ACME clients to obtain HTTPS certificates for private networks. Caddy, Traefik, cert-manager, acme.sh, LEGO, and Certify The Web are listed as ACME clients that support acme-dns. For each client, configuration examples show how to set API credentials and other settings so the client uses the acme-dns service at https://api.getlocalcert.net/api/v1/acme-dns-compat to obtain certificates. It's notable that so many ACME clients support acme-dns, making it an easy way to obtain HTTPS certificates for private networks.
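As a sketch of what that configuration looks like, here is how acme.sh's acme-dns hook can be pointed at the compat endpoint. The credentials and domain name are placeholders; the getlocalcert docs describe the actual registration step that produces them.

```shell
# Point acme.sh's acme-dns hook at the getlocalcert compat endpoint.
# Credentials below are placeholders from your getlocalcert registration.
export ACMEDNS_BASE_URL="https://api.getlocalcert.net/api/v1/acme-dns-compat"
export ACMEDNS_USERNAME="<username-from-registration>"
export ACMEDNS_PASSWORD="<password-from-registration>"
export ACMEDNS_SUBDOMAIN="<subdomain-from-registration>"

# Issue a certificate via the DNS-01 challenge (domain is illustrative).
acme.sh --issue --dns dns_acmedns -d yourname.localhostcert.net
```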

HN https://news.ycombinator.com/item?id=36674224

seiferteric: Proposes an idea for automatically creating trusted certificates for new devices on a private network.

hartmel: Mentions SCEP which allows automatic certificate enrollment for network devices.

mananaysiempre: Thinks using EJBCA for this, as hartmel suggested, adds unnecessary complexity.

8organicbits: Describes a solution using getlocalcert which issues certificates for anonymous domain names.

austin-cheney: Has a solution using TypeScript that checks for existing certificates and creates them if needed, installing them in the OS and browser.

bruce511: Says automating the process is possible.

lolinder: Mentions Caddy will automatically create and manage certificates for local domains.

frfl: Uses Lego to get a Let's Encrypt certificate for a local network website using the DNS challenge.
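A lego invocation along the lines frfl describes might look like this; Cloudflare is an assumed DNS provider here (lego supports many others), and the token and domain are placeholders.

```shell
# DNS-01 challenge with lego; the provider-specific API token is read
# from an environment variable (Cloudflare shown as an example).
export CLOUDFLARE_DNS_API_TOKEN="<api-token>"

lego --email you@example.com \
     --dns cloudflare \
     --domains internal.example.com \
     run
```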

donselaar: Recommends DANE which works well for private networks without a public CA, but lacks browser support.
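For context on the DANE suggestion: DANE pins certificate data in DNS via TLSA records. A common "3 1 1" record carries the SHA-256 digest of the server's public key, which can be computed with openssl (cert.pem is an assumed input file):

```shell
# SHA-256 digest of the certificate's public key, as used in a
# DANE-EE (3 1 1) TLSA record.
openssl x509 -in cert.pem -noout -pubkey \
  | openssl pkey -pubin -outform DER \
  | openssl dgst -sha256 -hex

# The corresponding zone record would look like:
# _443._tcp.internal.example.com. IN TLSA 3 1 1 <hex-digest>
```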

[-] thedaly@reseed.it 7 points 1 year ago

Big fan of letsencrypt’s certbot with the nginx and cloudflare (or other dns providers) plugins.

Is there any reason to use caddy or traefik over nginx?
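The certbot-with-Cloudflare-plugin setup mentioned above can be sketched like this; paths, token, and domain are illustrative.

```shell
# Credentials file for the certbot-dns-cloudflare plugin.
cat > ~/.secrets/cloudflare.ini <<'EOF'
dns_cloudflare_api_token = <api-token>
EOF
chmod 600 ~/.secrets/cloudflare.ini

# Obtain a certificate via the DNS-01 challenge; no open ports needed.
certbot certonly \
  --dns-cloudflare \
  --dns-cloudflare-credentials ~/.secrets/cloudflare.ini \
  -d internal.example.com
```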

[-] lchapman@programming.dev 6 points 1 year ago

Caddy takes almost all of the nginx boilerplate and handles it for you.

If you’re doing something simple in nginx, it’s far simpler with Caddy.
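To illustrate the point, a complete Caddy reverse proxy with automatic HTTPS is essentially this (domain and backend port are placeholders):

```shell
# Minimal Caddyfile: Caddy obtains and renews the certificate itself,
# no TLS boilerplate required.
cat > Caddyfile <<'EOF'
example.com {
    reverse_proxy localhost:8080
}
EOF

caddy run --config Caddyfile
```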

[-] robotrash@lemmy.robotra.sh 3 points 1 year ago

What if I'm using NGINX Proxy Manager which gives me a GUI for my dumbness?

[-] lchapman@programming.dev 2 points 1 year ago

Stick with it, sounds like you’ve got a system that works for you

[-] LedgeDrop@lemm.ee 4 points 1 year ago

I found traefik to be a more feature-rich load balancer when used in Kubernetes environments. Outside of Kubernetes, I'd say if you're happy with nginx, keep using nginx :)

[-] steltek@lemm.ee 1 points 1 year ago

I haven't tried it yet but I vaguely recall traefik had a better proxy-auth setup while nginx locked it away behind their freemium plan.

[-] dan@upvote.au 12 points 1 year ago

Because you might want to use HTTPS on a server that's not accessible externally. Some browser features only work over HTTPS.

Sounds like a bad browser.

[-] jarfil@beehaw.org 11 points 1 year ago* (last edited 1 year ago)

Good browsers don't let random unauthenticated content do whatever it wants on either the local machine or the network.

HTTPS is also the only way to use client-side certificates for strong two-way authentication and zero-trust setups.
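One way to realize that two-way authentication is requiring client certificates at the reverse proxy. A hedged nginx sketch (all file paths are assumptions):

```shell
# nginx mutual-TLS sketch: the server presents its own certificate and
# rejects clients whose certificate isn't signed by the internal CA.
cat > /etc/nginx/conf.d/mtls.conf <<'EOF'
server {
    listen 443 ssl;
    server_name internal.example.com;

    ssl_certificate         /etc/nginx/tls/server.crt;
    ssl_certificate_key     /etc/nginx/tls/server.key;

    # Require a client certificate signed by this CA.
    ssl_client_certificate  /etc/nginx/tls/client-ca.crt;
    ssl_verify_client       on;
}
EOF
```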

Good browsers don't let random unauthenticated content do whatever it wants on either the local machine or the network.

So, lynx?

zero-trust setups.

private networks

[-] jarfil@beehaw.org 2 points 1 year ago

lynx, NoScript... it's all fine until some site requires JavaScript no matter what, which nowadays seems to be most of them; then it's a game of whom to trust.

Private networks are usually an oxymoron: they're only as private as the WiFi router, or whoever clicks the wrong malicious link, keeps them. Zero-trust mitigates that, instead of blindly relying on perimeter defenses and trusting anyone who manages to bypass them.

[-] dan@upvote.au 1 points 1 year ago

Every browser implements these limitations, as they're part of the web platform. Some examples are service workers, web crypto, HTTP/2, webcam, microphone, geolocation, and more. There's a list here: https://developer.mozilla.org/en-US/docs/Web/Security/Secure_Contexts/features_restricted_to_secure_contexts

Sounds like a bad browser.

[-] dan@upvote.au 2 points 1 year ago

Every browser does this. It's intentional to push people towards using encrypted connections, especially for PII like geolocation.

Sounds dystopian. I still won't feel bad for normies.

[-] xthexder@l.sw0.com 4 points 1 year ago

Personally I use dnsrobocert with my own domains. I've got a few subdomains that point to a Wireguard subnet IP for private network apps (so it resolves to nothing if you're not on VPN). Having a real valid SSL cert is really nice vs self signing, and it keeps my browser with HTTPS-Everywhere happy.
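The public-DNS-plus-VPN trick described here boils down to a record like the following; the name and the 10.8.0.0/24 address are hypothetical.

```shell
# A public DNS record can point at a WireGuard-only address: anyone can
# resolve the name, but only VPN peers can reach the address.
#
#   app.internal.example.com. 300 IN A 10.8.0.2
#
dig +short app.internal.example.com
```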

[-] pe1uca@lemmy.pe1uca.dev 4 points 1 year ago

Yep, Caddy was as easy as using xcaddy with my DNS provider's module, configuring the API key, and running Caddy; that's it xD.

For what lolinder mentioned in the news link, you need to have port 80 open.
If you don't want that, you could configure Caddy's local certificate authority, but that'll give a self-signed certificate warning.
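The xcaddy route might look roughly like this, assuming the Cloudflare DNS module; swap in the module and token for your own provider.

```shell
# Build Caddy with a DNS provider module for the DNS-01 challenge.
xcaddy build --with github.com/caddy-dns/cloudflare

cat > Caddyfile <<'EOF'
internal.example.com {
    tls {
        dns cloudflare {env.CF_API_TOKEN}
    }
    reverse_proxy localhost:8080
}
EOF

# No open inbound ports needed; the challenge goes through DNS.
CF_API_TOKEN="<api-token>" ./caddy run --config Caddyfile
```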

[-] GameGod@beehaw.org 1 points 1 year ago

IMHO all these approaches are convoluted and introduce way too many components (SPOFs) to solve the problem. They're "free" but they come at the cost of maintaining all this extra infrastructure and don't forget that certificate transparency logs mean all your internal DNS records that you request a LetsEncrypt certificate for will be published publicly. (!)
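The certificate-transparency exposure is easy to verify for yourself: CT log search engines such as crt.sh list every publicly issued name (example.com is illustrative).

```shell
# List certificates logged for a domain and all its subdomains.
curl -s 'https://crt.sh/?q=%25.example.com&output=json' | head
```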

An alternative approach is to set up your own internal certificate authority (CA), which you can do in a couple minutes with step-ca. You then just deploy your CA root cert to all the machines on your network and can get certs whenever you need. If you want to go the extra mile and set up automatic renewal, you can do that too, but it's overkill for internal use IMHO.

Using your own CA introduces only a single new software component, and it doesn't require high availability to be useful.
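A rough outline of the step-ca approach described above; prompts and exact flags may differ between versions, so treat this as a sketch rather than a recipe.

```shell
# Create and run the internal CA (step ca init is interactive).
step ca init
step-ca "$(step path)/config/ca.json"

# On each machine: trust the root once, then request leaf certificates
# whenever needed.
step certificate install "$(step path)/certs/root_ca.crt"
step ca certificate internal.example.com srv.crt srv.key
```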

[-] abhibeckert@beehaw.org 0 points 1 year ago

Unfortunately these days internal CAs aren't always trusted. We have one where I work, and hundreds of times a day people have to click through "I understand the risks, proceed anyway" alert prompts.

Which makes me really uncomfortable - I fear one day someone will blindly click past a warning about an actual malicious certificate.

[-] TemporalSoup@beehaw.org 1 points 1 year ago

It kills me that companies seem to willingly train their users to ignore warnings and signs that something is amiss.

"Yeah, all our emails from that vendor come with the external email warning, just ignore it"

this post was submitted on 13 Jul 2023
48 points (100.0% liked)

Technology
