this post was submitted on 17 Mar 2024
462 points (95.5% liked)

Technology

[–] 9488fcea02a9@sh.itjust.works 82 points 2 years ago (3 children)

I'm not a developer, but I use AI tools at work (mostly LLMs).

You need to treat AI like a junior intern: you give it a task, but you still need to check the output and apply critical thinking. You can't take work from an intern, blindly incorporate it into your presentation, and then blame the intern if the work is shoddy.

AI should be a time saver for certain tasks. It cannot (currently) replace a good worker.

[–] Lmaydev@programming.dev 34 points 2 years ago* (last edited 2 years ago) (1 children)

As a developer I use it mainly for learning.

What used to be a Google search followed by skimming a few articles or docs pages is now a single question.

It pulls the specific info I need, cites its sources, and allows follow-up questions.

I've noticed the new juniors can get up to speed on new tech very quickly nowadays.

As for code I don't trust it beyond snippets I can use as a base.

[–] FiniteBanjo@lemmy.today 0 points 2 years ago* (last edited 2 years ago) (1 children)

JFC they've certainly got the unethical shills out in full force today. Language Models do not and will never amount to proper human work. It's almost always a net negative everywhere it is used, final products considered.

[–] Lmaydev@programming.dev 1 points 2 years ago (1 children)
[–] FiniteBanjo@lemmy.today 1 points 2 years ago (1 children)

Its intended use is to replace human work in exchange for lower accuracy. There is no ethical use case scenario.

[–] Lmaydev@programming.dev 1 points 2 years ago (1 children)

It's intended to showcase its ability to generate text. How people use it is up to them.

As I said, it's great for learning because it's very accurate when summarising articles and docs. It even cites its sources so you can read more if needed.

[–] FiniteBanjo@lemmy.today 0 points 2 years ago (1 children)

It's been known to claim commands and documentation exist when they don't. It very commonly gets simple addition wrong.

[–] Lmaydev@programming.dev 1 points 2 years ago (1 children)

That's because it's a language processor, not a calculator. As I said, you're using it wrong.
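The division of labor this comment gestures at, a language model for language and ordinary code for arithmetic, is the idea behind "tool use": instead of letting the model guess at digits, you have it emit a plain arithmetic expression and evaluate that deterministically. A minimal sketch of the evaluator side (the function name and scope are my own, not from the thread):

```python
import ast
import operator

# Map AST operator nodes to real arithmetic, so the calculation
# is done by Python, not by a language model predicting tokens.
OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
}

def safe_eval(expr: str) -> float:
    """Evaluate a plain arithmetic expression like '1234 + 5678'."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.USub):
            return -walk(node.operand)
        raise ValueError(f"unsupported expression: {expr!r}")
    return walk(ast.parse(expr, mode="eval"))

print(safe_eval("1234 + 5678"))  # 6912
```

Walking the AST rather than calling `eval` keeps the evaluator restricted to the four arithmetic operators, so model output can't execute arbitrary code.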

[–] FiniteBanjo@lemmy.today 1 points 2 years ago (1 children)

So the correct usage is to have documents incorrectly explained to you? I fail to see how that does any good.

[–] Lmaydev@programming.dev 1 points 2 years ago

I know you do buddy.

[–] Gradually_Adjusting@lemmy.ca 15 points 2 years ago (1 children)

It's clutch for boring emails and tedious document summaries. Sometimes I get a day's work done in 4 hours.

Automation can be great, when it comes from the bottom-up.

[–] isles@lemmy.world 2 points 2 years ago

Honestly, that's been my favorite - bringing in automation tech to help me in low-tech industries (almost all corporate-type office jobs). When I started my current role, I was working consistently 50 hours a week. I slowly automated almost all the processes and now usually work about 2-3 hours a day with the same outputs. The trick is to not increase outputs or that becomes the new baseline expectation.
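The bottom-up office automation described above often starts as small as a script replacing a copy-paste routine. A hypothetical example of the flavor of task involved; the column names and report format are invented, not from the comment:

```python
import csv
import io

def summarise_sales(csv_text: str) -> dict:
    """Roll up a 'region,amount' CSV into per-region totals --
    the sort of tally otherwise done by hand in a spreadsheet."""
    totals: dict = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])
    return totals

report = "region,amount\nnorth,100\nsouth,250\nnorth,50\n"
print(summarise_sales(report))  # {'north': 150.0, 'south': 250.0}
```

A script like this runs in seconds, which is exactly how a 50-hour week quietly becomes a 2-3-hour day with the same outputs.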

[–] fidodo@lemmy.world 8 points 2 years ago

I am a developer and that's exactly how I see it too. I think AI will be able to write PRs for simple stories, but it will need a human to review those PRs and give approval or feedback for it to act on, or to manually intervene and tweak the output.