submitted on 20 Jan 2024 (8 months ago) by ylai@lemmy.ml to c/fosai@lemmy.world
[-] MaxPow3r11@lemmy.world 54 points 8 months ago

He should build a sub to go to the bottom of the ocean.

[-] finthechat@kbin.social 20 points 8 months ago

And take Elon Musk with him

[-] lurch@sh.itjust.works 13 points 8 months ago

they'll never do that, but maybe a spaceship shaped like a giant penis could do the trick

[-] ArtVandelay@lemmy.world 5 points 8 months ago

Give it exactly 420.69 lb of rocket fuel and fire it into the sun

[-] lurch@sh.itjust.works 3 points 8 months ago

If they hesitate, we just tell them there are neither poor people, nor taxes in space 😜

[-] chaogomu@kbin.social 5 points 8 months ago

And skimp on materials, build it out of carbon fiber instead of titanium or steel.

[-] kool_newt@lemm.ee 1 points 8 months ago

He should test out those new nitrogen chambers.

[-] RagingSnarkasm@lemmy.world 50 points 8 months ago

Building a metaverse that people want to actually engage with was too hard, so he's decided to scale back his ambitions and tackle something less difficult: AGI.

[-] fhein@lemmy.world 5 points 8 months ago

He just wants some virtual friends to hang out with in the metaverse, since humans weren't interested.

[-] wooki@lemmynsfw.com 1 points 8 months ago

That whole laughable expedition stinks of financial fraud

[-] Buffalox@lemmy.world 15 points 8 months ago* (last edited 8 months ago)

Is Zuckerberg an idiot? Or does he have an actual plan with this?
Seems to me it's completely useless, like the Metaverse.
If the LLM is so stupid it can't figure out that the sides of an equals sign can be reversed, as simple as 2+2=4 <=> 4=2+2, he will never achieve general intelligence by just throwing more compute power at it.
As powerful as LLMs are, they're still astoundingly stupid when they hit their limitations.

[-] lauha@lemmy.one 12 points 8 months ago

Humans are astoundingly stupid when they hit their limitations.

[-] art101@lemmy.world 4 points 8 months ago

Spoken like a true AI!!!! ;-)

[-] Sylver@lemmy.world 1 points 8 months ago

The difference is that we can go beyond that limitation. Even self-coding AI will either solve a problem, or compound its own inefficiencies before asking an operator to help out.

[-] Neil@lemmy.ml 6 points 8 months ago

Your post sounds almost as dense as:

"everything that can be invented has been invented." - Duell 1899.

[-] Sylver@lemmy.world 2 points 8 months ago

The difference here is that Zuck is not planning on inventing or revolutionizing anything. He’s just throwing more computation power at an already inefficient method of modeling AI.

[-] lauha@lemmy.one 3 points 8 months ago

"or compound its own inefficiencies before asking an operator to help out"

Some people do. Some people refuse to ask for help.

[-] deafboy@lemmy.world 6 points 8 months ago

I don't know much, but from what I know, we still haven't reached a point of diminishing returns, so more power = more better.

[-] fidodo@lemmy.world 2 points 8 months ago

Not necessarily since you also need better techniques. A competitor could easily surpass you with less by being smarter about how the AI is trained.
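For a rough sense of what "diminishing returns" in that exchange would mean, here's a small sketch in Python. The power-law form and the constants are illustrative assumptions loosely modeled on published scaling-law fits, not anything claimed in this thread:

```python
# Sketch of a Chinchilla-style power-law loss curve, loss(C) ~ E + k * C**(-alpha).
# The constants are made up for illustration, not fitted values. The point is
# that each 10x of compute buys a smaller absolute drop in loss, yet the curve
# never flattens completely -- "more power = more better", with shrinking gains.
E, k, alpha = 1.7, 40.0, 0.05

def loss(compute_flops: float) -> float:
    return E + k * compute_flops ** -alpha

prev = None
for c in [1e21, 1e22, 1e23, 1e24, 1e25]:
    cur = loss(c)
    delta = "" if prev is None else f"  (improvement: {prev - cur:.3f})"
    print(f"compute {c:.0e} FLOPs: loss {cur:.3f}{delta}")
    prev = cur
```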

[-] fidodo@lemmy.world 5 points 8 months ago* (last edited 8 months ago)

Trying to achieve AGI by throwing more compute at LLMs is like trying to reach the moon by building a more powerful hot air balloon.

[-] Buffalox@lemmy.world 2 points 8 months ago

Assuming that "not compute" should be "more compute", I totally agree. That's a very apt analogy.

[-] fidodo@lemmy.world 2 points 8 months ago* (last edited 8 months ago)

Yes, thanks, swipe typing picked up "not" instead of "more". Maybe someone can throw some more compute at the swipe typing algorithm to better pick up on the context of the sentence when picking words.

[-] Boozilla@lemmy.world 10 points 8 months ago

Zuckerberg: Why are my pupils vertical slits? Why am I always cold? Why do people find me so repellent?

AI: Sir, I can answer all three with one response, but you won't like it.

[-] iAvicenna@lemmy.world 8 points 8 months ago

I wonder if he really thinks that AGI is just AI with more parameters and GPUs thrown into the mix.

[-] jcolag@lemmy.sdf.org 8 points 8 months ago

Sure, we could point to thousands of years of really smart people trying and utterly failing to build mathematical models for innovation and thought, but it also does make a certain amount of sense that, if you pile up enough transistors and wish really hard, your investment will Frosty the Snowman itself into being your friend, right...?

[-] Dkarma@lemmy.world 5 points 8 months ago

IIRC the H100s are $30k per GPU at this time.

[-] Squeak@lemmy.world 7 points 8 months ago

I’m sure he’d get a hefty discount on 350k of them

[-] thallamabond@lemmy.world 6 points 8 months ago

Retail price of $10,500,000,000. That's nuts.
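Those numbers do line up (a quick back-of-the-envelope in Python; the ~$30k unit price and the 350k GPU count are just the figures quoted in the comments above, not official pricing):

```python
# Back-of-the-envelope check of the $10.5B figure from the quoted numbers.
unit_price_usd = 30_000   # rough H100 list price mentioned upthread
gpu_count = 350_000       # reported target count

total = unit_price_usd * gpu_count
print(f"${total:,}")      # -> $10,500,000,000, i.e. $10.5 billion at retail
```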

[-] tburkhol@lemmy.world 5 points 8 months ago

Still less than Musk paid for Twitter. It's totally reasonable as a biggest-billionaire-toy entry.

[-] Fermion@feddit.nl 2 points 8 months ago

This is why they can't stop harvesting your data.
