
Terminal Trove showcases the best of the terminal. Discover a collection of CLI, TUI, and other developer tools at Terminal Trove.


On Monday, Mistral AI announced a new AI language model called Mixtral 8x7B, a "mixture of experts" (MoE) model with open weights that reportedly matches OpenAI's GPT-3.5 in performance—an achievement claimed by others in the past but now being taken seriously by AI heavyweights such as OpenAI's Andrej Karpathy and Jim Fan. That means we're closer to having a ChatGPT-3.5-level AI assistant that can run freely and locally on our devices, given the right implementation.

Mistral, based in Paris and founded by Arthur Mensch, Guillaume Lample, and Timothée Lacroix, has seen a rapid rise in the AI space recently. It has been quickly raising venture capital to become a sort of French anti-OpenAI, championing smaller models with eye-catching performance. Most notably, Mistral's models run locally with open weights that can be downloaded and used with fewer restrictions than closed AI models from OpenAI, Anthropic, or Google. (In this context "weights" are the computer files that represent a trained neural network.)

Mixtral 8x7B can process a 32K token context window and works in French, German, Spanish, Italian, and English. It works much like ChatGPT in that it can assist with compositional tasks, analyze data, troubleshoot software, and write programs. Mistral claims that it outperforms Meta's much larger LLaMA 2 70B (70 billion parameter) large language model and that it matches or exceeds OpenAI's GPT-3.5 on certain benchmarks, as seen in the chart below.
A chart of Mixtral 8x7B performance vs. LLaMA 2 70B and GPT-3.5, provided by Mistral.

The speed at which open-weights AI models have caught up with what was OpenAI's top offering a year ago has taken many by surprise. Pietro Schirano, the founder of EverArt, wrote on X, "Just incredible. I am running Mistral 8x7B instruct at 27 tokens per second, completely locally thanks to @LMStudioAI. A model that scores better than GPT-3.5, locally. Imagine where we will be 1 year from now."

LexicaArt founder Sharif Shameem tweeted, "The Mixtral MoE model genuinely feels like an inflection point — a true GPT-3.5 level model that can run at 30 tokens/sec on an M1. Imagine all the products now possible when inference is 100% free and your data stays on your device." To which Andrej Karpathy replied, "Agree. It feels like the capability / reasoning power has made major strides, lagging behind is more the UI/UX of the whole thing, maybe some tool use finetuning, maybe some RAG databases, etc."

Mixture of experts

So what does mixture of experts mean? As this excellent Hugging Face guide explains, it refers to a machine-learning model architecture where a gate network routes input data to different specialized neural network components, known as "experts," for processing. The advantage of this is that it enables more efficient and scalable model training and inference, as only a subset of experts are activated for each input, reducing the computational load compared to monolithic models with equivalent parameter counts.

In layperson's terms, a MoE is like having a team of specialized workers (the "experts") in a factory, where a smart system (the "gate network") decides which worker is best suited to handle each specific task. This setup makes the whole process more efficient and faster, as each task is done by an expert in that area, and not every worker needs to be involved in every task, unlike in a traditional factory where every worker might have to do a bit of everything.
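For the technically inclined, here is a minimal sketch of that routing idea in PyTorch. The dimensions, expert count, and top-2 routing below are illustrative assumptions for clarity, not Mixtral's actual configuration:

```python
# A minimal mixture-of-experts layer in PyTorch. Dimensions, expert count,
# and top-2 routing are illustrative assumptions, not Mixtral's config.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, dim=512, n_experts=8, top_k=2):
        super().__init__()
        # The "experts": independent feed-forward networks.
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
             for _ in range(n_experts)]
        )
        self.gate = nn.Linear(dim, n_experts)  # the "gate network" (router)
        self.top_k = top_k

    def forward(self, x):  # x: (n_tokens, dim)
        weights, chosen = self.gate(x).topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # normalize the chosen experts' scores
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            token_idx, slot = (chosen == i).nonzero(as_tuple=True)
            if token_idx.numel():              # run expert i only on its tokens
                out[token_idx] += weights[token_idx, slot].unsqueeze(1) * expert(x[token_idx])
        return out  # each token was processed by only top_k of the n_experts experts
```

The efficiency win is visible in the loop: every expert runs only on the tokens routed to it, so per-token compute scales with top_k rather than with the total number of experts.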

OpenAI has been rumored to use a MoE system with GPT-4, accounting for some of its performance. In the case of Mixtral 8x7B, the name implies that the model is a mixture of eight 7 billion-parameter neural networks, but as Karpathy pointed out in a tweet, the name is slightly misleading because, "it is not all 7B params that are being 8x'd, only the FeedForward blocks in the Transformer are 8x'd, everything else stays the same. Hence also why total number of params is not 56B but only 46.7B."
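Karpathy's figure is easy to sanity-check with back-of-the-envelope arithmetic. The configuration values below are Mixtral's reported ones (hidden size 4096, FFN size 14336, 32 layers, 8 experts with 2 active per token, 32K vocabulary, grouped-query attention); treat them as assumptions:

```python
# Rough parameter count for Mixtral 8x7B. Config values are Mixtral's
# reported ones (assumed here); small terms (norms, router gates) ignored.
dim, ffn, layers, n_experts, active = 4096, 14336, 32, 8, 2
vocab, kv_dim = 32000, 8 * 128  # 8 KV heads x 128 head dim

ffn_per_expert = 3 * dim * ffn                       # gate/up/down projections
expert_params = layers * n_experts * ffn_per_expert  # the only 8x'd part
attn_params = layers * (2 * dim * dim + 2 * dim * kv_dim)  # q,o and k,v
embed_params = 2 * vocab * dim                       # embeddings + output head

total = expert_params + attn_params + embed_params
active_params = layers * active * ffn_per_expert + attn_params + embed_params
print(f"total: ~{total / 1e9:.1f}B")                      # ~46.7B, not 8 x 7B = 56B
print(f"active per token: ~{active_params / 1e9:.1f}B")   # ~12.9B
```

Because attention, embeddings, and other shared components are not duplicated, the total lands at roughly 46.7 billion parameters rather than 56 billion, and only about 13 billion of them are active for any given token.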

Mixtral is not the first "open" mixture of experts model, but it is notable for its relatively small size in parameter count and performance. It's out now, available on Hugging Face and BitTorrent under the Apache 2.0 license. People have been running it locally using an app called LM Studio. Also, Mistral began offering beta access to an API for three levels of Mistral models on Monday.
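For those curious about the API side, a request presumably looks something like the following sketch. The endpoint, tier names (mistral-tiny/small/medium), and response shape are assumptions based on the launch announcement, so check Mistral's documentation before relying on them:

```python
# Hedged sketch of calling Mistral's beta API. Endpoint, model names, and
# response shape are assumptions from the announcement, not verified docs.
import os
import requests

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "mistral-small",  # the tier reported to be backed by Mixtral 8x7B
        "messages": [{"role": "user", "content": "Summarize mixture of experts."}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```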

[-] daredevil@kbin.social 17 points 11 months ago

I'd imagine this will also be very problematic for non-celebrities from all sorts of backgrounds as well. The harassment potential is very concerning.

[-] daredevil@kbin.social 12 points 11 months ago* (last edited 11 months ago)

kbin has this -- the feature is called collections. https://kbin.social/magazines/collections

you can make public ones that others can follow, or private ones to make curated feeds for yourself.

[-] daredevil@kbin.social 32 points 11 months ago

Pandora's Box is already opened, unfortunately. The Streisand Effect is only going to make this worse.

[-] daredevil@kbin.social 50 points 11 months ago

What scares me more than the fact that this guy exists is how many people chose to, and continue to, support him.

[-] daredevil@kbin.social 11 points 1 year ago

This is awesome, OP. I've crossposted this to @linguistics to give you a little more visibility. Cheers.

[-] daredevil@kbin.social 26 points 1 year ago* (last edited 1 year ago)

One of my favorite things about /kbin is that it utilizes threads and microblogs. In my experience thus far, users here seem rather shy. I don't hold it against anyone though, because I totally understand.

Federating content from the likes of Mastodon is very helpful for having discussions trickle in from the fediverse. I think it's also really helpful for establishing an ongoing daily discussion space so the thread feed isn't as cluttered. IMO, there's more potential beyond that, too (think: drawing every day for a month, photography-based posting/challenges using tags for content organization, language-learning exercises, the list goes on...).

The combination of threads with microblogs has shown me the power that lies behind content federation. As a result, /kbin is by far my favorite of the fediverse platforms so far.

I still have some minor issues with how it currently works. I believe a magazine's name causes hashtags with the exact same string to federate content to that magazine. The magazine that matches the desired hashtag also takes priority, even when the hashtag isn't assigned in the magazine's settings. The problem is that any subsequent magazine that tries to federate content using that hashtag won't be able to do so.

It seems as though microblogs can only federate content to either the magazine that matches the hashtag in question, or the magazine that used the hashtag first. There's also an issue where a microblog that uses multiple hashtags will only federate content to the magazine with the first available tag. E.g., if someone writes an unused tag first, #kbinmeta second, and #fediverse third, the post would only go to the kbinmeta microblog section. It would be lovely for microblogs to be federated, or even mirrored across magazines (as in child comments/replies) that implement the same tag. Hopefully, this could be done without adding excessive overhead for Ernest/the server. Perhaps magazines could even be offered the ability to refuse federating tags that match the magazine's name.
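To illustrate what I mean, here's a toy sketch of how the routing seems to behave (my reading of the behavior, not kbin's actual code):

```python
# Toy illustration of the first-matching-tag routing described above --
# an assumption about the logic as observed, not kbin's implementation.
def route_microblog(post_tags, magazines):
    """A microblog federates only to the first tag matching an existing magazine."""
    for tag in post_tags:
        if tag in magazines:
            return tag  # later matching tags (e.g. #fediverse) are ignored
    return None  # no match: the post doesn't federate to any magazine

magazines = {"kbinmeta", "fediverse"}
print(route_microblog(["unusedtag", "kbinmeta", "fediverse"], magazines))  # kbinmeta
```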

There are also some minor issues with moderation federation, but I don't exactly want to specify here, because I'm worried it could be used maliciously.

That being said, I can't wait to see how /kbin will mature.

[-] daredevil@kbin.social 21 points 1 year ago* (last edited 1 year ago)

I've taken care of it. 🙂


The first one that comes to mind is having to travel with an NPC when our walk/run speeds don't match.


@Ernest has pushed an update which allows users to request ownership/moderation of abandoned magazines. Ghost/abandoned magazines were fairly prevalent after the initial wave of hype due to users either squatting magazine names or becoming inactive for other reasons. Now is your chance to get involved, if you were waiting to do so.

To request ownership/moderator privileges, scroll down to where it says "MODERATORS" in the sidebar. There will be an icon of a hand pointing upwards that you can click on, then make the request. Cheers, and thank you for your hard work Ernest, as well as future mods.

[-] daredevil@kbin.social 9 points 1 year ago

I hope things get better; looking forward to the infrastructure and devlog updates as well!

submitted 1 year ago* (last edited 1 year ago) by daredevil@kbin.social to c/genshin_impact@lemmy.world

"...Euphrasie, three days ago, one of your journalists secretly followed a suspect all the way from the Court of Fontaine to Romaritime Harbor, and almost ended up being tied up and thrown into the sea by a gang of criminals. Whether or not there's any truth in the notion that 'nearer to the action is closer to the truth,' surely Miss Charlotte doesn't value her reports more than she does her own life?"
— Yet another exasperated exchange between Captain Chevreuse of the Special Security and Surveillance Patrol and Euphrasie, Editor-in-Chief of The Steambird

◆ Name: Charlotte
◆ Title: Lens of Verity
◆ Reporter of The Steambird
◆ Vision: Cryo
◆ Constellation: Hualina Veritas

Fontaine's famous newspaper The Steambird has a veritable legion of reporters it can call upon, each with their own area of expertise. Some specialize in celebrity gossip, others follow the word on the street, while others still focus on political affairs...

But among them all, there is one that stands head and shoulders above the rest thanks to her seemingly boundless reserve of energy and perseverance — the inimitable Charlotte.

Unswervingly committed to the principle that "nearer to the action is closer to the truth," Charlotte has a habit of popping up literally anywhere and everywhere in Fontaine — from its widest avenues to its narrowest back alleys, its highest vantage points to its lowest subterranean vaults, even its tallest mountains to its deepest undersea caverns. She captures the "truth" with her Kamera, records it in her articles, and finally unveils it for all to see.

And when the "truth" comes out, she's met with a variety of different reactions ranging from applause, to embarrassment, to outright fury. There are even some who would resort to any means necessary to make a particular article connected to themselves disappear. Or alternatively, just make Charlotte disappear.

For this reason, the newspaper's Editor-in-Chief Euphrasie has on numerous occasions felt the need to distance Charlotte from the Court of Fontaine by sending her off on faraway "field reporting" jobs, only recalling her once the Maison Gardiennage or Special Security and Surveillance Patrol had finally managed to clear things up.

But despite all this, neither the toil of the job itself nor the pressure of external denunciations and threats has ever fazed Charlotte in the slightest.

With her trusty companion Monsieur Verite by her side, she invariably carries out her journalistic duties with unfaltering fervor, rushing about in pursuit of all the "truths" out there just waiting to be discovered.

submitted 1 year ago* (last edited 1 year ago) by daredevil@kbin.social to c/genshin_impact@lemmy.world

One lie always follows another, and so "justice" awaits inescapably at the end. The ignorant see this as some kind of farce. But if they trace back to the source, they inevitably realize that they began by deceiving themselves.
— A disordered fable left in someone's dream by Mage "N"

◆ Name: Furina
◆ Title: Endless Solo of Solitude
◆ Regina of All Waters, Kindreds, Peoples and Laws
◆ Gnosis: Hydro
◆ Constellation: Animula Choragi

Undoubtedly, Furina has been much loved by the people of Fontaine from the moment she became the Hydro Archon.
Her charismatic parlance, lively wit, and elegant bearing — all bear witness to her godly charms.

But perhaps the thing that she is most revered for is her unrivaled sense of drama.
As the protagonist of a famous play at the Opera Epiclese once put it,
"Life is like the theater — you never can tell when the twist will come."

Furina is as inscrutable as the most cunning of stage characters, her course of action defying all prediction.
In fact it's precisely for this reason that the god of Justice and Judgment, unapproachable in her divine majesty, has such a bewitching influence.

But when the curtain falls, a hollow feeling invariably starts to creep in.
There are those who wonder whether there are moments in the dead of night when even a god like Furina feels the sharp pangs of loneliness.

No, surely not. People couldn't possibly imagine, let alone believe, that such a scene might play out.

And that's indeed the way it should be.

That is, were it not for the fact that Furina's tears had already been silently washed away by the Fountain of Lucine.


It could be something from today, the past week, or whatever. All things big or small are welcome too. I'm sitting outside today as a part of my daily routine--it's nice and sunny out, and there's a gentle breeze that feels very relaxing. It's a nice break from the time I spend at the computer.

submitted 1 year ago* (last edited 1 year ago) by daredevil@kbin.social to c/AskKbin@kbin.social

I like that kbin is smaller compared to some lemmy instances. I also prefer the UI. Bigger communities tend to feel a bit overwhelming for me. I also appreciate how transparent Ernest has been regarding kbin's development. That said, it's been a bit challenging to figure out how to utilize some of the federation features that kbin has to offer--microblogging in particular. From what I've seen, people don't generally seem too interested in this feature, but I think it's nice to have.

submitted 1 year ago* (last edited 1 year ago) by daredevil@kbin.social to c/linux@kbin.social

Hi, sorry if this isn't the right place for this question. I've been using Linux Mint Cinnamon for about 9 months now and have also been experimenting with an Ubuntu GNOME Wayland session for the past month or so. I don't really like distro-hopping, but using X11 isn't cutting it for me. After giving GNOME an honest shot, I don't think it's for me. However, Wayland has been stellar. I would prefer to keep using LM Cinnamon, but I have a dual-monitor setup with different refresh rates, which has been causing issues.

I'm interested in Arch, but I'm slightly concerned about the frequent comments regarding things breaking during updates. Also, is maintaining an Arch install very time-consuming? I'm not opposed to reading the wiki and spending time here and there to keep things working. However, I'm a bit hesitant in case I run into an issue more complicated than I'm prepared for. That said, I generally do like the higher-skill-ceiling options, if that makes sense in this context.

Tumbleweed seems more beginner-friendly from what I've read so far. While I do generally enjoy challenges, having a smoother day-to-day experience certainly has its own appeal.

I would primarily be doing some gaming (a mix of recent AAA titles along with less demanding ones) and programming, along with the usual stuff you'd expect on a desktop setup. I have a Ryzen 5 3600 processor, an AMD 6650 XT GPU, and 16 GB of RAM, if that information helps. Thanks in advance; if this isn't the right place, I'll delete the post.

Update: I have installed EndeavourOS and things have been smooth so far. The installer was very straightforward, and setup was extremely quick. I have started reinstalling the various programs that were part of my original workflow with very minimal issues; the issues primarily came from adjusting to pacman syntax. I also keep a series of notes on what I have installed and how. Cheers, and thanks for your input, everyone. I will be sticking with GNOME for the time being.

[-] daredevil@kbin.social 14 points 1 year ago* (last edited 1 year ago)

A lot of social engagement through social media is driven by impressions such as upvotes, favorites, likes, etc. Unfortunately, an easy way to promote engagement lies in rage bait, likely because of the visceral emotional response it generates. I would also extend this issue to how ubiquitous instant gratification is on the internet and social media. People tend to acquire clout by reacting to something quickly, which isn't always well thought out. Add in mob mentality, and you have a recipe for the rapid, exponential propagation of negative words, thoughts, and emotions. People also tend not to have productive ways of channeling their frustrations and issues, so they often see other entities on the Internet as just a name, sometimes less than that.

There's also a heavy amount of tribalism across a variety of domains, which lets people take refuge from this rage baiting by finding like-minded individuals to identify with. In some cases, the stress of everyday life removes a sense of agency or power from one's life, and people cope by developing a sense of superiority through whichever group or ideal they identify with. This cycle repeats itself until there is a constant battle between any given groups, where people attempt to elevate their self-worth by putting down those they don't agree with while emphasizing the superiority of their own ideal, IMO. I could be totally wrong, of course. I'm hardly perfect.

It's been a pretty exhausting experience. I'm tired of it as well; my fondness for engaging with people has diminished greatly.

[-] daredevil@kbin.social 12 points 1 year ago

AMD has served me well since I've started actively using Linux.

[-] daredevil@kbin.social 26 points 1 year ago

Even though this is a nice development, I'm pretty disappointed in the resources directed at Linux support. I'm considering dropping Proton soon.

[-] daredevil@kbin.social 23 points 1 year ago

Are you shore?

