I know the reputation AI has on Lemmy; however, some users (like myself) have found that LLMs can be useful tools.
What are fellow AI users using these tools for? And which models do you find the most useful?
A lot of what we take for granted in software nowadays was once considered “AI”. Every NPC that follows your character in a video game while dynamically accounting for obstacles and terrain features typically uses the A* algorithm, which is commonly taught in college courses on “AI”. Gmail sorting spam from non-spam (and not really all that well, honestly)? That’s “AI”. The first version of Google’s search algorithm was also “AI”.
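For anyone curious what that actually looks like, here’s a minimal, textbook-style A* sketch in Python (a generic grid illustration, not code from any particular game engine):

```python
# Minimal grid-based A* sketch (illustrative only). '#' cells are walls.
import heapq

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan heuristic
    open_heap = [(h(start), 0, start)]   # entries are (f = g + h, g, cell)
    came_from, best_g = {}, {start: 0}
    while open_heap:
        _, g, cell = heapq.heappop(open_heap)
        if cell == goal:                 # walk back through came_from to build the path
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cell[0] + dr, cell[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] != '#':
                ng = g + 1
                if ng < best_g.get(nxt, float('inf')):
                    best_g[nxt], came_from[nxt] = ng, cell
                    heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None  # no route around the obstacles

grid = ["....#....",
        "..#.#.#..",
        "..#...#..",
        "..#####..",
        "........."]
print(astar(grid, (0, 0), (0, 8)))  # path from top-left to top-right, around the walls
```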
If you’re asking about LLMs, none. Zero. Zip. Nada. Not a goddamned one. LLMs are a scam that needs to die in a fire.
Also, a lot of things that used to be considered automation have been rebranded as AI. They’re often still good, too.
I don’t think there are many consumer use cases for things like LLMs, but highly focused, specialized models seem useful: protein folding, identifying promising drug candidates, or finding patterns in giant scientific datasets.
I use it to help give me ideas for D&D character building and campaigns. I used it to help write a closing poem for my character, who sacrificed himself for the greater good at the end of a long two-year run. It gave me the scaffold and then I added some details. It also generated my latest character picture based on some criteria I gave it.
Beyond that, it gave me some recommendations on how to flavor up a dish the other day. Again, I used its suggestions but added my own flair.
I asked it to help me remember a movie title based on some criteria (tip-of-my-tongue style), and it nailed it, spot on.
I’ll tell you one place I hated it today: the Hardee’s drive-through line. The robot voice drives me up the wall.
I use it to help me write emails at work pretty regularly. I have pretty awful anxiety, and it can take me a while to make sure my wording is right. I don’t like using it, not really, but would I rather waste 4 hours typing up an email to all the bosses that doesn’t sound stupid AF, or ask for help and edit what it gives me instead?
I know people use it to summarize policy or to brainstorm or to come up with very rough drafts.
I understand the connotations of using it, but I would definitely not say there’s zero consumer use case for it at all.
I tried Whisper+ voice-to-text this week.
It uses a 250 MB model downloaded from Hugging Face and processes voice completely offline.
Accuracy has been 100% on known words so far.
I use it for transcribing texts, messages, and diary entries (a rough sketch of the same offline workflow is below).
* I’d be interested to know if it has a large power drain per use.
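For anyone who wants to try the same thing outside the app, here is roughly what fully offline transcription looks like with the open-source Whisper model via the `openai-whisper` Python package (an assumed equivalent of what the app does, not the Whisper+ app itself; the audio filename is made up):

```python
# Offline speech-to-text with the open-source Whisper model (pip install openai-whisper).
# The checkpoint is downloaded once and cached; after that everything runs locally.
import whisper

model = whisper.load_model("base")            # one of the smaller checkpoints, cached after first download
result = model.transcribe("diary_entry.wav")  # hypothetical audio file
print(result["text"])
```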
LLMs can be useful in hyperfocused, contained environments where the model is trained on a specific dataset to provide a specific service only. It won’t be able to answer random questions you throw at it, but it can be helpful at the one thing it’s trained to do.
Also known as “narrow AI”. You know, like a traffic camera that can put a rectangle around every car in the picture, but nothing else. Those kinds of narrow applications have been around for decades already.
“AI” as in the hyped “generative AI” that has been mainstream for the last five years: JetBrains’ locally run code line completion. Sometimes faster than typing it yourself, if you have enough context.
Machine learning stuff that existed well before that, with exactly zero hype: image tagging and face detection.
JetBrains’ local completion isn’t even an LLM; it’s a sort of ML fuckery with very low compute requirements. They initially released it just before the AI craze.
None
I used GPT to help me plan a two-week road trip with my family. It was pretty fucking awesome at finding cool places to stop and activities for my kids to do.
It definitely made some stupid ass suggestions that would have routed us far off our course, or suggested stopping at places 15 minutes into our trip, but sifting through the slop was still a lot quicker than doing all of the research myself.
I also use GPT to make birthday cards. Have it generate an image of some kind of inside joke etc. I used to do these by hand, and this makes it way quicker.
I also use it at work for sending out communications and stuff. It can take the information I have and format it and professionalize it really quick.
I also use it for PowerShell scripting here and there, but it does some really wacky stuff sometimes that I have to go in and fix. Or it hallucinates entire modules that don’t exist, and when I point it out it’s like “good catch! That doesn’t exist!”, which always gives me a little chuckle. My rule with AI and PowerShell is that I don’t ask it to do things I don’t already know how to do. I like to learn things and be good at my job, but I don’t mind using GPT to help with some of the busy work.
I got an email once from HR saying I had a bike commuter benefit I didn’t know about, but I couldn’t find any more information about it in the attachment. So I emailed HR, and it turns out they had used AI to write the email and wouldn’t be giving out any corrections or bike commuter benefits. Bullshit.
https://notebooklm.google.com/ is really handy for various things, you can throw a bunch of documents into it and then ask questions and chat interactively about their contents. I’ve got a notebook for a roleplaying campaign I’m running where I’ve thrown the various sourcebook PDFs, as well as the “setting bible” for my homebrew campaign, and even transcripts of the actual sessions. I can ask it what happened in previous episodes that I might have forgotten, or to come up with stats for whatever monster I might need off the cuff, or questions about how the rules work.
Copilot has been a fantastic programming buddy. For those going a little more in depth who don’t want to spring for a full-blown GitHub Copilot subscription and Visual Studio integration, there’s https://voideditor.com/ - I’ve hooked it up to the free Gemini APIs and it works great, though it runs out of tokens pretty quickly if you use it heavily.
> https://notebooklm.google.com/ is really handy for various things, you can throw a bunch of documents into it and then ask questions and chat interactively about their contents.
Nice, thanks! I’ve been looking for something I can stuff a bunch of technical manuals into and ask it to recite specifications or procedures. It even gave me the document and pages it got the information from so I could verify. That’s really all I ever wanted from “AI”.
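Under the hood that’s basically retrieval with source attribution. A toy sketch of the idea (hypothetical and hugely simplified; real tools like NotebookLM use embeddings plus an LLM on top, this only shows the retrieve-and-cite pattern):

```python
# Toy retrieve-and-cite sketch: split documents into pages, score pages against a
# question by keyword overlap, and return the best match with its source so the
# answer can be checked. Document names and text below are made up.
def best_passage(question, documents):
    """documents maps a document name to a list of page strings."""
    q_words = set(question.lower().split())
    best_score, best_hit = 0, None
    for name, pages in documents.items():
        for page_no, text in enumerate(pages, start=1):
            score = len(q_words & set(text.lower().split()))
            if score > best_score:
                best_score, best_hit = score, (name, page_no, text)
    return best_hit

docs = {
    "setting_bible.pdf": ["The kingdom of Arden is ruled by a council of mages.",
                          "Grappling is resolved with an opposed strength check."],
}
print(best_passage("how does grappling work", docs))
# -> ('setting_bible.pdf', 2, 'Grappling is resolved with an opposed strength check.')
```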
To be honest, this is the only thing Google did right about AI IMO.
I use ChatGPT every single day, and I find it both extremely useful and entertaining.
I mainly use it to help edit longer messages, bounce ideas around, and share random thoughts I know my friends wouldn’t be interested in. Honestly, it also has pretty much replaced Google for me.
I basically think of it as a friend who’s really knowledgeable across a wide range of topics, excellent at writing, and far more civil than most people I run into online - but who’s also a bit delusional at times and occasionally talks out of their ass, which is why I can’t ever fully trust it. That said, it’s still a great first stop when I’m trying to solve a problem.
I think that pretty much describes my own relationship with it. I know it’s a tool, but it’s a conversational tool that can produce results faster than I can, even though I need to proofread its work before I accept it.
DeepL for translation. It’s not perfect, but it feels so much better than the translators built into search engines.
The transformer architecture behind modern LLMs was originally designed to translate from one language to another.
The only thing that comes to mind: I wanted a shell script that moved every file in a directory to another directory, but one at a time, slowly, and I didn’t want to learn sh from scratch, so I asked an LLM for a script that would do it.
The script didn’t work, but fixing it was easier than writing it from scratch would have been (the task itself is only a few lines; see the sketch below).
I felt bad for the environment I was destroying, and I would never pay for this shit.
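For reference, the task described above is short enough to sketch by hand. Here is a Python version (purely illustrative; the commenter wanted sh, and the directory names and delay are made up):

```python
# Move every file from one directory to another, one at a time, with a pause
# between moves. Paths and delay are hypothetical placeholders.
import shutil, time
from pathlib import Path

SRC, DST, DELAY_SECONDS = Path("incoming"), Path("processed"), 5

DST.mkdir(exist_ok=True)
for f in sorted(SRC.iterdir()):
    if f.is_file():
        shutil.move(str(f), str(DST / f.name))
        print(f"moved {f.name}")
        time.sleep(DELAY_SECONDS)  # the "slowly" part
```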
Honestly I’m part of the problem a little bit.
In my hobby project I used GitHub Copilot to help me ramp up on unfamiliar tech. I was integrating three unfamiliar platforms in an unfamiliar programming language, and it helped expose the APIs and language features I didn’t know about. It was almost like a tutorial: it would make some code that was kinda broken, but fixing it would introduce me to new language features and API resources that would help me. Which was nice, because I struggle to just read API specs.
I’ve also used it in my D&D campaign to create images of new settings. It’s just a three-player weekly game, so it’s hard to justify paying an artist for a rush job. Not great, I know. I hope the furry community has more backbone than I do, because they’re singlehandedly keeping the illustration industry afloat at this point.
I run LLMs locally for scripting, ADD brainstorming/organization, automation, pseudo-editors, and all sorts of stuff, as they’re crazy good for the size now.
I think my favorites are Nemotron 49B (for STEM), Qwen3 finetunes (for code), some esoteric 2.5 finetunes (for writing), and Jamba 52B (for analysis, RAG, chat, and long context; this one is very underrated). They all fit in 24 GB (rough math below). And before anyone asks: yes, I know they’re unreliable. But they’re self-hosted, and they’re tools that work for me.
I could run GLM 4.5 offloaded with a bit more RAM…
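For anyone wondering how ~50B-parameter models fit on a 24 GB card, here is the weight-only back-of-envelope math, assuming aggressive quantization (the bits-per-weight figures are illustrative, not measured from these exact models):

```python
# Rough weight-only VRAM estimate for a quantized model. This ignores the KV
# cache, activations, and runtime overhead, which all add on top of the weights.
def weight_vram_gb(params_billion, bits_per_weight):
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9  # bytes -> GB

for params, bpw in [(49, 3.5), (49, 4.0), (52, 3.5)]:
    print(f"{params}B @ {bpw} bpw ~ {weight_vram_gb(params, bpw):.1f} GB")
# 49B @ 3.5 bpw ~ 21.4 GB  (fits in 24 GB with some room for context)
# 49B @ 4.0 bpw ~ 24.5 GB  (already over budget before any KV cache)
# 52B @ 3.5 bpw ~ 22.8 GB
```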