submitted 3 months ago by yogthos@lemmy.ml to c/technology@lemmy.ml
[-] YurkshireLad@lemmy.ca 66 points 3 months ago

350,000 servers? Jesus, what a waste of resources.

[-] yogthos@lemmy.ml 53 points 3 months ago

just capitalist markets allocating resources efficiently where they're needed

[-] AlexWIWA@lemmy.ml 19 points 3 months ago

Sounds like we're going to get some killer deals on used hardware in a year or so

[-] queermunist@lemmy.ml 56 points 3 months ago

Totally not a bubble though.

[-] MajorHavoc@programming.dev 23 points 3 months ago* (last edited 3 months ago)

Yeah. It's a legitimate business, where the funders at the top of the pyramid are paid by those that join at the bottom!

[-] riskable@programming.dev 33 points 3 months ago

Now's the time to start saving for a discount GPU in approximately 12 months.

[-] FaceDeer@fedia.io 17 points 3 months ago

They don't use GPUs; they use more specialized devices like the H100.

[-] tyler@programming.dev 8 points 3 months ago

Everyone who doesn't have access to those is using GPUs, though.

[-] FaceDeer@fedia.io 8 points 3 months ago

We are talking specifically about OpenAI, though.

[-] porous_grey_matter@lemmy.ml 7 points 3 months ago

People who were previously at the high end of the GPU market can now afford used H100s -> they sell their GPUs -> we can maybe afford their old cards

[-] Aabbcc@lemm.ee 3 points 3 months ago

Can I use an H100 to run Helldivers 2?

[-] Ephera@lemmy.ml 20 points 3 months ago

I do expect them to receive more funding, but I also expect that to be tied to pricing increases. And I feel like that could be what breaks them.

In my team, we're doing lots of GenAI use-cases and far too often, it's a matter of slapping a chatbot interface onto a normal SQL database query, just so we can tell our customers and their bosses that we did something with GenAI, because that's what they're receiving funding for. Apart from these user interfaces, we're hardly solving problems with GenAI.
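
To be concrete, the pattern is usually just: have the model turn the user's question into SQL and run it. A rough sketch of the idea (the `orders` table, the prompt wording, and the `call_llm` helper are all made up here; swap in whatever hosted API the project actually uses):

```python
import sqlite3

def call_llm(prompt: str) -> str:
    """Placeholder for the hosted GenAI API call."""
    raise NotImplementedError

def answer_question(question: str, db_path: str = "app.db") -> list[tuple]:
    # Ask the model to translate the question into a SQL query.
    prompt = (
        "Translate this question into one SQLite SELECT statement against "
        "a table orders(id, customer, total, created_at). "
        f"Question: {question}\nReturn only the SQL."
    )
    sql = call_llm(prompt).strip()

    # The GenAI part ends here; everything below is an ordinary DB query.
    if not sql.lower().startswith("select"):
        raise ValueError(f"Refusing to run non-SELECT statement: {sql}")
    with sqlite3.connect(db_path) as conn:
        return conn.execute(sql).fetchall()
```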

If the operation costs go up and management starts asking what the pricing for a non-GenAI solution would be like, I expect the answer to be rather devastating for most use-cases.

Like, there's maybe still a decent niche in that developing a chatbot interface is likely cheaper than a traditional interface, so maybe new projects might start out with a chatbot interface and later get a regular GUI to reduce operation costs. And of course, there is the niche of actual language processing, for which LLMs are genuinely a good tool. But yeah, going to be interesting how many real-world use-cases remain once the hype dies down.

[-] yogthos@lemmy.ml 5 points 3 months ago

It's also worth noting that smaller models work fine for these types of use cases, so it might just make sense to run a local model at that point.
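
For example, a small model served locally through something like Ollama is one HTTP call away (a minimal sketch; the endpoint is Ollama's default local API, and `llama3` is just an example model name):

```python
import requests

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    # Ollama serves a local HTTP API on port 11434 by default.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

print(ask_local_model("Summarize this ticket: the printer on floor 3 is offline."))
```

That keeps the data on your own hardware, and the per-request cost is effectively zero once the box is paid for.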

[-] chemicalwonka@discuss.tchncs.de 17 points 3 months ago
[-] Travelator@thelemmy.club 16 points 3 months ago

Good. It's fake crap tech that no one needs.

[-] curiousaur@reddthat.com 9 points 3 months ago

It's actually really awesome and truly helps with my work.

[-] flambonkscious@sh.itjust.works 13 points 3 months ago

The start(-up?)[sic] generates up to $2 billion annually from ChatGPT and an additional $ 1 billion from LLM access fees, translating to an approximate total revenue of between $3.5 billion and $4.5 billion annually.

I hope their reporting is better than their math...

[-] Hector_McG@programming.dev 10 points 3 months ago

Probably used ChatGPT….

[-] twei@discuss.tchncs.de 8 points 3 months ago

Maybe they also added 500M for stuff like Dall-E?

[-] flambonkscious@sh.itjust.works 3 points 3 months ago

Good point - I guess it could have easily fallen out while being edited, too

[-] PanArab@lemmy.ml 13 points 3 months ago

I hope so! I am so sick and tired of AI this and AI that at work.

[-] PeepinGoodArgs@reddthat.com 13 points 3 months ago

I will be in a perfect position to snatch a discount H100 in 12 months

[-] delirious_owl@discuss.online 12 points 3 months ago

Bubble. Meet pop.

[-] kjaeselrek@lemmy.ml 9 points 3 months ago
[-] NigelFrobisher@aussie.zone 8 points 3 months ago
[-] ryan213@lemmy.ca 12 points 3 months ago
[-] geneva_convenience@lemmy.ml 7 points 3 months ago

AI stands for artificial income.

[-] Tangentism@lemmy.ml 7 points 3 months ago
[-] strawberry@kbin.run 7 points 3 months ago
[-] Aurenkin@sh.itjust.works 6 points 3 months ago

Last time a batch of these popped up it was saying they'd be bankrupt in 2024 so I guess they've made it to 2025 now. I wonder if we'll see similar articles again next year.

[-] coffee_with_cream@sh.itjust.works 6 points 3 months ago

For anyone doing a serious project, it's much more cost effective to rent a node and run your own models on it. You can spin them up and down as needed, cache often-used queries, etc.
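
The caching part is the easy bit; a minimal sketch (assuming some `run_model` function that calls whatever model you've deployed on the rented node):

```python
from functools import lru_cache

def run_model(prompt: str) -> str:
    """Placeholder for the model running on the rented node."""
    raise NotImplementedError

@lru_cache(maxsize=4096)
def _cached(prompt: str) -> str:
    return run_model(prompt)

def complete(prompt: str) -> str:
    # Normalize whitespace so trivially different prompts share a cache entry.
    return _cached(" ".join(prompt.split()))
```

Swap the lru_cache for Redis or a database table if the node gets spun down between uses.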

[-] yogthos@lemmy.ml 6 points 3 months ago

For sure, and in a lot of use cases you don't even need a really big model. There are a few niche scenarios where you require a large context that's not practical to run on your own infrastructure, but in most cases I agree.

[-] arran4@aussie.zone 3 points 3 months ago

This sounds like FUD to me. If OpenAI were really going under, it would be acquired pretty quickly.

[-] jackyalcine@lemmy.ml 6 points 3 months ago

They're wholly owned by Microsoft so it'd probably be mothballed at worst.

[-] arran4@aussie.zone 4 points 3 months ago

For another conversation I need some evidence of that; where did you find it?
