I actually asked my locally running LLM(s) to rework my resume, specifically to add any common skills or tools for the roles that I hadn't listed (after 8 years as a generalist you touch a LOT of stuff, and I hadn't remembered quite a few of them) and to remove any that weren't applicable.
I've been getting a decent number of interviews (3 this week, 2 last).
Honestly, this isn't just an AI issue; it's also a recruiter issue. The hiring manager gives the recruiter a role description and a list of skills or other keywords for the posting, but the recruiter doesn't know what half of them are. An actual human may not realize that "Cisco" + "network engineer" means the candidate has configured routers. Hell, I've had people ask me whether Cisco (which I actually did work for, though not as a network engineer) is the food company, thinking of Sysco.
Sorry if this is a stupid question, but is there a good place to figure out how to run LLMs locally? It seems safer than entering personal data into a server somewhere.
As a start, you could take a look at Ollama, which seems to be available in many package managers if you use one. I've done some experimenting with mistral-nemo, but you should pick a model size appropriate to your hardware and use case. I believe there are GUIs and extensions for Ollama, but as someone with a low interest in LLMs, I've only used the bare-bones features through my terminal (something like the sketch below), and I haven't used it for any projects or tasks.
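If it helps to see it concretely, here's a minimal sketch of talking to a locally running Ollama server from Python. It assumes you've already installed Ollama, that it's running (by default it listens on localhost:11434), and that you've pulled a model such as mistral-nemo; the prompt and model name are just placeholders to swap for your own.

```python
# Minimal sketch: ask a locally running Ollama server to review resume text.
# Assumes Ollama is installed and running (default REST endpoint: localhost:11434)
# and that a model such as "mistral-nemo" has already been pulled.
import json
import urllib.request

prompt = (
    "Here is my resume for a generalist IT role. "
    "Suggest common skills or tools for this kind of role that I may have left out:\n\n"
    "<paste resume text here>"
)

request = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "mistral-nemo",  # swap in whatever model fits your hardware
        "prompt": prompt,
        "stream": False,          # ask for one complete response instead of chunks
    }).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    # With stream disabled, the reply is a single JSON object whose
    # "response" field holds the generated text.
    print(json.loads(response.read())["response"])
```

You can get the same effect interactively with just the command line (`ollama run <model>`), so treat the script above as one option, not the required way in.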
You definitely shouldn't trust it to teach you anything (I've seen some highly concerning errors in my tests), but it might be useful to you if you can verify the outputs.
Also check out the PrivacyGuides page on LLMs.
Thank you for the information! Yeah, I don't really trust them. They feel flimsy and unreliable for most things, though they have their moments where they seem genuinely helpful.
I hate using them overall; I just figure that if I need one to help me land a job at some point, I should have some extra options ready.
Cisco? Oh, the vegetable oil?
That's shortening!
No, it's that guy from Star Wars.