this post was submitted on 18 Mar 2025
884 points (97.6% liked)
Buy European
Pretty much all of the AI tools available now have been shown to hallucinate, even when they start from an internet search.
I've had AI tools spit out real-looking URLs that led to 404 pages, because the links were hallucinated. It's a place to start your research, maybe to refine your questions, but I wouldn't trust it much with the actual research.
An LLM (large language model), which is what an AI tool like Mistral is, doesn't really use knowledge: it predicts what the next text is likely to be, based on the data it was trained on. It doesn't think, it doesn't reason, it just predicts which words are likely to come next.
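To make "just predicting the next words" concrete, here's a toy sketch (my own illustration, nothing like a real LLM's internals): a bigram model that picks the next word purely from frequency counts, with zero understanding of what the words mean.

```python
from collections import defaultdict, Counter

# Tiny "training corpus" (made up for illustration).
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    # Pick the most frequent follower; no reasoning, just statistics.
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" - it follows "the" most often in the corpus
```

Real LLMs use neural networks over billions of parameters instead of raw counts, but the core loop is the same: given the text so far, output the statistically likely continuation.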
It doesn't even see text as text: models operate on tokens (chunks of characters), which is why so many of them claimed there were just 2 Rs in "strawberry".
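For contrast, counting the letters directly is trivial when you actually see the characters, as ordinary code does:

```python
# Unlike an LLM, which sees opaque tokens, this loop sees every character.
word = "strawberry"
r_count = word.count("r")
positions = [i for i, ch in enumerate(word) if ch == "r"]
print(r_count, positions)  # 3 [2, 7, 8]
```

The model never gets this character-level view; it only sees token IDs, so "how many Rs" is something it has to guess from training data rather than count.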
You can use it to rewrite a text for you, or perhaps to summarize one (though hallucinations are still possible there), but I wouldn't ask it to do research for you.
This is really really helpful, thank you. I appreciate the explanation.