Linux during the mid to late 90s (Windows 95 and 98 era)
(lemmy.world)
Slackware and Red Hat were the two main distros in use in the mid 90s.
My local city used proper UNIX, and my university had ~~IRIX workstations~~ SPARCstations and SunOS servers. We used Linux at my ISP to handle modem pools and web/mail/news servers. In the early 2000s we had Linux labs and Linux clusters to work on.
Linux on the desktop was a bit painful. There were no loadable kernel modules yet, and the whole kernel had to fit into main memory, so you'd roll your own kernel with just the drivers you needed. XFree86 was tricky to configure: you entered the timings for your CRT monitor by hand, and if you got them wrong, you could break your monitor.
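Hand-tuning those timings meant writing "modelines" in XF86Config. A sketch of what that looked like (the modeline below is the standard VESA 1024x768 @ 70 Hz mode; the sync ranges are illustrative and had to come from your monitor's manual):

```
# XF86Config excerpt (illustrative). A Modeline gives the pixel clock in MHz
# followed by horizontal and vertical timings; requesting sync rates outside
# what the tube supported is what could damage a fixed-frequency CRT.
Section "Monitor"
    Identifier  "My CRT"
    HorizSync   31.5-57.0     # kHz, from the monitor's manual
    VertRefresh 50-90         # Hz, from the monitor's manual
    #        "name"     clock  hdisp hss  hse  htot  vdisp vss vse vtot
    Modeline "1024x768"  75.0  1024 1048 1184 1328   768  771 777 806
EndSection
```

The server would only use modes whose computed sync rates fell inside the declared HorizSync/VertRefresh ranges, which is part of why later multisync monitors were much safer to experiment on.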
I used FVWM2 and Enlightenment for many years. I miss Enlightenment.
Me too! Has E17 come out yet? 😆
Enlightenment is on version 26
Guess you missed the joke that it was 13 years between E16 and E17 🙂
E16 was better
That was the last time I contributed; I created an LCARS port and now there are hundreds of them everywhere.
LCARS interface.... that is something I haven't seen in a loooooooong time
I used Enlightenment on Arch Linux for a year, in 2020-21. The PC had 4 GB of RAM and an HDD, and Enlightenment was blazing fast: I could type enlightenment_start at a tty and reach a Wayland desktop in under a second, with 250 MB of RAM used in total. E is still alive and kicking.
SGI workstations had the best GUI. That shit looked straight out of Hollywood
How wrong did you have to be to break your monitor? Because I'm positive I got it very wrong a whole lot of times and never managed that.
By the late 90s, most monitors were smart enough to detect when the sync frequency was too far off and refuse to display an image.
It was the old monitors that only supported a single or fixed set of scan rates that you had to worry about damaging. Some were very picky, and others were more tolerant.
Thank goodness I had a newer monitor then, because I would definitely have toasted several.
I managed to make mine do some very worrying noises, but none of my monitors broke either, even though the bandwidth I based my calculations on was often kinda made up.
You mean your graphic drivers, right? not your actual hardware?
(edit: oh no)
No. The wrong timing parameters could definitively break your hardware.
@andrewth09 I bricked a monitor when I tried to fiddle with the graphics settings in Linux back in the late 90s (tried to get it to run at 1280×1024, which was considered "hi resolution" back then). I had to buy a new monitor, then installed Windows, and only returned to Linux a long time after that.
Oh yeah. I remember all the warnings plastered all over the X11 config file about how dangerous the settings were if you got them wrong.
The lessons we learn...