That's the name we use to designate software like GitHub, GitLab and similar, which provide repository hosting and tooling like issue trackers. It's supposedly named that way after SourceForge, the oldest of such tools, although I didn't hear the term "forge" until the last 5 years or so, long after SourceForge's demise, so I imagine there is a bit of nostalgia in this name (not sure who is nostalgic for SourceForge, though 😂). The Wikipedia page: https://en.wikipedia.org/wiki/Forge_(software)
There hasn’t been a new Git repo launch in almost a decade
Am I the only person annoyed that they seem to mistake repositories for forges? It's already annoying when casual users say "git" when they mean "GitHub", but these folks actually want to build a forge, explaining they're going to do better than anyone else. Maybe start by using the terms properly?
So basically, they are abusing customers by making them pay more for a shorter distance just because they know they can, and when customers try to work around those abuses, the airlines sue them. If this is not a clear-cut case for regulation, I don't know what is (then again, I'm a European :P ).
GitHub is a great platform, which has championed open source for decades now. I don't think anybody has much to blame them for (except people who don't like the idea of AI being trained on their code, as the sibling comment mentioned); it's more about fears that it may go bad. Because it's where most of the world's code is hosted, it's a single point of failure. People have also questioned the wisdom of having all open-source code hosted on a proprietary platform. And the acquisition by Microsoft had a chilling effect on those of us who remember Internet Explorer 6's Microsoft more than VSCode's Microsoft.
For those reasons, it is desirable for those who love the idea of decentralization to look for alternatives. But even then, it's perfectly fine to stay on GitHub; "decentralizing" doesn't require everybody to leave. :) Plus, even when using another forge, it's still good to keep publishing mirrors on GitHub for visibility and discoverability, for now.
Counter-opinion: this is a bad article written about a great video game and a great tabletop game.
So, the main point of the author is that they don't like DnD. Well, maybe don't play a DnD game? 😂 #firstWorldProblems
"That was my favorite spear, 300 years ago!"
as in my experience, most regular users do not have a Matrix client installed
I understand your point, but by that logic, we should use Reddit rather than Lemmy, as most users are there. It's not only about ease of use, it's about being sure users won't be abused. Discord is still in its acquisition phase, but you can be sure enshittification will come next.
Obligatory check: are you sure you really need a forge? (That's the name we use to designate tools like GitHub/GitLab/Gitea/etc.) You can do a lot with git alone: you can host repositories on your server, clone them through ssh (or even http with git http-backend, although that requires a bit of setup), push, pull, create branches, create notes, etc. And the best of it: you can even have CI/CD scripts as post-receive hooks that will run your tests, deploy your app, or reject the changes if something is not right.
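As a sketch of what such a post-receive hook can look like (the branch name and the CI action are placeholders, adapt them to your project):

```shell
#!/usr/bin/env bash
# Sketch of hooks/post-receive in a bare repo. Git feeds the hook one
# "<old-sha> <new-sha> <refname>" line per updated ref on stdin.
# The branch name and the CI action are placeholders.

handle_ref() {
    local new="$2" ref="$3"
    if [ "$ref" = "refs/heads/main" ]; then
        echo "running CI for $new"
        # real hook: check out $new into a temporary work tree, run the
        # test suite, deploy on success (or exit non-zero to report failure)
    else
        echo "skipping $ref"
    fi
}

# simulate the lines git would send on stdin, to keep the example self-contained:
printf 'o1 n1 refs/heads/main\no2 n2 refs/heads/feature\n' |
while read -r old new ref; do
    handle_ref "$old" "$new" "$ref"
done
```

In a real hook the while read loop consumes git's stdin directly; the printf is only there for the demo.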
The only thing you have to do is create the repos on your server with the --bare flag, as in git init --bare. This creates a repo that is basically only what you usually have in the .git directory, and it avoids errors caused by pushing to the branch that is currently checked out. It will also keep the repo clean, without artifacts (provided you run your build tasks elsewhere, obviously), so all your sources stay really easy to back up.
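Concretely, creating and cloning such a repo looks like this (the paths are just for the demo; in real life the clone goes over ssh):

```shell
# on the "server": create a bare repo; it has no working tree, so
# pushing to it never conflicts with a checked-out branch
rm -rf /tmp/git-demo && mkdir -p /tmp/git-demo && cd /tmp/git-demo
git init --bare myproject.git
ls myproject.git                 # HEAD, config, hooks/, objects/, refs/...

# from your machine, you would clone it over ssh:
#   git clone user@server:/path/to/myproject.git
# here we clone locally just to show it works
git clone myproject.git work
```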
And to discuss issues and changes, there is always email. :) There is also this, a code review tool that just popped up on HN.
And it works with GitHub! :) Just add a git remote pointing to GitHub, and you can push to it or fetch from it. You can even set up hooks to sync with it. I publish my FOSS projects on both GitHub and GitLab, and the only thing I do to propagate changes is push to my local bare repos that I use for easy backups; each of them has a post-update hook which propagates the change everywhere it needs to go (to GitHub, GitLab, and various machines on my local network, which then have their own post-update hooks to deploy the app/lib). The final touch: having this ~/git/ directory containing all my bare repos (which total only a few hundred MB, so they fit perfectly in my backups) allowed me to create a git_grep_all script to do code search in all my repos at once (who needs elasticsearch anyway :D ):
#!/usr/bin/env bash
# grep recursively through all bare repos under the current directory
for dir in $(find . -name HEAD -exec dirname '{}' \;); do
    pushd "$dir" > /dev/null
    # print the repo path and the matches, but only if there are any
    if git grep "$*" HEAD > /dev/null; then
        pwd
        git grep "$*" HEAD
        echo
    fi
    popd > /dev/null
done
(note that it uses pushd and popd, which are bash builtins; other shells will need another way to change directories)
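For the curious, the propagation hook I mentioned can be tiny; something like this (the remote names github and gitlab are just examples, they need to be configured beforehand with git remote add):

```shell
#!/bin/sh
# hooks/post-update in the bare repo: after every accepted push,
# mirror everything to the other remotes
# (remote names are examples; set them up with "git remote add" first)
git push --mirror github
git push --mirror gitlab
```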
The reason why you may still want a forge is if you have non-technical people who need to work on issues/epics/documentation/etc.
I don't think it's a Mastodon problem. It's a generalist social network problem. Facebook, Twitter, Mastodon: why are we using those? For some, it's to keep in touch with friends and family, and they're happy seeing any activity, preferably things that make them smile (that's more Facebook). For others, it's a means to build street cred in their industry by publicly saying on-topic things that sound smart (that was Twitter). But if you're looking for interesting discussions about things you like, in order to learn something, they're terrible. That's where specialized communities, discussing only one topic, shine. It used to be forums, then Reddit, now Lemmy. RSS is also a very good way to get that kick.
Not to sound too pessimistic, but we live in a time when we can watch Twitter collapse, despite it being one of those "too big to fail" websites. My bet is that none of them will stand the test of time; the web is ephemeral (and archive.org is an underappreciated wonder of the world). I would rather say that what you really need is a backup routine.
Solving it the UNIX way:
ls -1 | sort -R | sxiv -f -s f -S 5 -
So it's ls -1 to list the contents of the current directory (presumably where your pictures are), with one file per line, so we can pipe it to sort, with the -R option to sort randomly, then pipe the result to sxiv, a lightweight image viewer available on most distros (I just checked, it's available on Debian). For its options: -f means fullscreen, -s f scales the image to fit the screen as well as possible, -S 5 tells it to start in slideshow mode and change picture every 5 seconds, and - tells it to take the file list from stdin (thus from the ls and sort commands).
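One caveat: filenames containing spaces or newlines can trip the ls/sort pipeline. If your distro has GNU shuf and xargs (most do), a NUL-delimited variant is more robust:

```shell
# same idea, but safe with any filename: NUL-delimited list,
# shuffled by shuf -z instead of sort -R
find . -maxdepth 1 -type f -print0 | shuf -z | xargs -0 sxiv -f -s f -S 5
```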
This won't work for videos, though, only pictures.
My favorite cost-cutting tip is to avoid big webapps running on Docker, and instead make do with small UNIX utilities (cron instead of a calendar, text files instead of a note-taking app, rsync instead of a Dropbox-like file-hosting app, a simple static webserver for file sharing, etc.). This allows me to run my server on a simple Raspberry Pi, with less than 500 MB of RAM used on average, and minimal energy consumption. So, total cost of the setup:
With that, I run all services I need on a single machine, and I have a backup plan for recovery of both hardware and software.
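For the backup part, a single crontab line is often enough (the hostname and paths here are placeholders):

```shell
# crontab entry: every night at 03:00, mirror the bare repos to a
# second machine ("backup-host" and the paths are placeholders)
0 3 * * * rsync -a --delete ~/git/ backup-host:backups/git/
```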
Getting used to a UNIX shell and to the UNIX philosophy can take some time, but it's very rewarding in making everything simpler (and thus more efficient).