188
submitted 3 months ago by obbeel@lemmy.eco.br to c/technology@lemmy.world
[-] Lettuceeatlettuce@lemmy.ml 89 points 3 months ago

Capitalists: So you're telling me I can build 1000x more AI data center infrastructure now?

[-] werefreeatlast@lemmy.world 5 points 3 months ago

Today, I started it like any other day... A good big boob gangbang video. Then to the serious stuff, the stuff AI is really here for!.... Hello Mr Chat GPT! Could you please write me a python program to predict the next lotto ticket numbers and winning locations?

Oh and I do like that new EV. I see you would like me to purchase it since every page has it. But how could I tell AI that in the state of Washington we cannot have e-bikes?! Some idiot put a regulation that limits the power to 750w and the speed to 20mph. That makes going to the nearest store almost practical if I want to spend my day riding.

AI could you please cause accidents that create the right environment for new politicians to increase that power limit to 3kw or something more practical? ....eh okay Mr Chat, my cat just died on the window, could you please re-write the script so that my cat doesn't die on the window and politicians decide to increase the power to 3kw?.... Ok Mr Chat, my cat now died in our microwave and the microwave is now 3kw. That's very close to what I need but could you please not kill my cat? .... Introducing the new KitchenAid 3kw horse discombobulation machine! Invented by the smartest persons in the entire planet! Bike? What bike?

[-] xthexder@l.sw0.com 4 points 3 months ago* (last edited 3 months ago)

I don't really see the problem with restricting e-bike power. You can still go faster than 20mph if you pedal. I think what you really want is a motorcycle. They make those in electric form too.

[-] BlackLaZoR@kbin.run 53 points 3 months ago

This constant shuttling of information back and forth is responsible for consuming as much as 200 times the energy used in the computation, according to this research.

Press X to doubt. I know moving data costs more energy than the computation itself, but that sounds like pure BS.
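For what it's worth, the ~200x figure isn't crazy as an order of magnitude. A rough sanity check, using the approximate per-operation energy numbers often cited from Mark Horowitz's ISSCC 2014 keynote (45 nm-era figures; real values vary a lot by process node and architecture):

```python
# Rough per-operation energy costs (~45 nm, Horowitz ISSCC 2014 keynote;
# approximate and heavily process-dependent -- illustrative only).
DRAM_READ_PJ = 640.0   # fetching one 32-bit word from off-chip DRAM
FP_MULT_PJ = 3.7       # one 32-bit floating-point multiply

ratio = DRAM_READ_PJ / FP_MULT_PJ
print(f"One DRAM access costs roughly {ratio:.0f}x one FP multiply")
```

So for a workload that streams operands from DRAM, a movement-to-compute energy ratio in the low hundreds is at least in the right ballpark.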

[-] oakey66@lemmy.world 37 points 3 months ago

Everyone is trying to cash in before it all collapses. Big tech has no good ideas left, so it's turned into what I'd call hype hopping.

[-] Sanctus@lemmy.world 23 points 3 months ago

Where the fuck are the automatic dishwashers that put the dishes away? Dryers that fold your clothes for you? We've got a long fucken way to go and we haven't even half pillaged the Jetsons. Let's get on with the progress already! Fuck chasing profit!

[-] db2@lemmy.world 23 points 3 months ago

It was probably written by AI.

[-] A_A@lemmy.world 39 points 3 months ago* (last edited 3 months ago)

Experimental demonstration of magnetic tunnel junction-based computational random-access memory
"In this work, a CRAM array based on magnetic tunnel junctions (MTJs) is experimentally demonstrated. First, basic memory operations, as well as 2-, 3-, and 5-input logic operations, are studied. Then, a 1-bit full adder with two different designs is demonstrated."
https://www.nature.com/articles/s44335-024-00003-3
So this has been experimentally demonstrated, but only at small scale.
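For anyone wondering what a "1-bit full adder" amounts to here: CRAM designs like this build logic out of majority-style gates inside the MTJ array. A minimal Python sketch of just the Boolean logic involved (the device physics is the hard part; the function names below are purely illustrative):

```python
def majority(a: int, b: int, c: int) -> int:
    """3-input majority vote -- a natural logic primitive in
    MTJ-based CRAM designs."""
    return (a & b) | (b & c) | (a & c)

def full_adder(a: int, b: int, cin: int) -> tuple[int, int]:
    """1-bit full adder from majority/XOR logic.
    Returns (sum_bit, carry_out)."""
    carry = majority(a, b, cin)
    s = a ^ b ^ cin
    return s, carry

# Exhaustive check against ordinary binary addition:
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = full_adder(a, b, cin)
            assert 2 * cout + s == a + b + cin
```

Chaining n of these gives an n-bit ripple-carry adder, which is presumably the scaling path the authors have in mind.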

[-] SkybreakerEngineer@lemmy.world 2 points 3 months ago

Everyone knows that Big Data just means lots of 1-bit adders

[-] FaceDeer@fedia.io 19 points 3 months ago

It probably doesn't matter from a popular perception standpoint. The talking point that AI burns massive amounts of coal for each deepfake generated is now deeply ingrained; it'll be brought up regularly for years after it's no longer true.

[-] interdimensionalmeme@lemmy.ml 2 points 3 months ago

Stochastic parrot ! Stochastic parrot ! Stochastic parrot !

[-] GBU_28@lemm.ee 1 point 3 months ago* (last edited 3 months ago)

Casual consumers don't care one bit about that. Companies would, because this would save money.

[-] recklessengagement@lemmy.world 8 points 3 months ago

Possible solution to the Von Neumann bottleneck? Or does this address a different issue?

[-] palordrolap@kbin.run 12 points 3 months ago* (last edited 3 months ago)

To stick with the analogy, this is like putting a small CPU inside the bottle, so the main CPU<->RAM bottleneck isn't used as often. That said, any CPU, within RAM silicon or not, is still going to have to shift data around, so there will still be choke points, they'll just be quicker. Theoretically.

Thinking about it, this is kind of the counterpart to CPUs having an on-chip cache of memory.

Edit: counterpoint to counterpart
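The bottle analogy can be put in rough numbers. A toy sketch, assuming a hypothetical near-memory design in which each memory bank reduces its own slice of the data and ships only a partial result across the bus (real PIM/CRAM hardware is far more complicated than this):

```python
def bus_words_conventional(n: int) -> int:
    """Summing n values on a normal CPU: every operand crosses
    the CPU<->RAM bus."""
    return n

def bus_words_near_memory(n: int, banks: int) -> int:
    """Hypothetical near-memory design: each bank reduces its slice
    locally, so only one partial sum per bank crosses the bus."""
    return banks

n, banks = 1_000_000, 16
reduction = bus_words_conventional(n) // bus_words_near_memory(n, banks)
print(f"Data movement cut by a factor of {reduction}")
```

The choke point doesn't disappear; far less traffic just has to squeeze through it.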

[-] HubertManne@moist.catsweat.com 2 points 3 months ago

I hope this is true. AI has its uses, but it can't stay this inefficient. It would be great if answering a query used no more energy than a standard web search.

[-] DumbAceDragon@sh.itjust.works 1 point 3 months ago

"Hey Tim, could you maybe add a couple hundred buzzwords to that title for me? Thanks!"

[-] Ilovethebomb@lemm.ee 0 points 3 months ago

Arm’s CEO recently suggested that by 2030, AI may consume a quarter of all energy produced in the U.S.

No way does AI produce enough value that they could afford this.

this post was submitted on 29 Jul 2024
188 points (91.2% liked)
