Ruletanic (lemmy.blahaj.zone)
[-] princessnorah@lemmy.blahaj.zone 2 points 11 months ago

Hope you like 40 second response times unless you use a GPU model.

[-] JDubbleu@programming.dev 10 points 11 months ago

I've hosted one on a Raspberry Pi and it took at most a second to process and act on commands. Basic speech-to-text doesn't require massive models, and it has become much less compute-intensive in the past decade.

[-] princessnorah@lemmy.blahaj.zone 2 points 11 months ago

Okay, well, I was running faster-whisper through Home Assistant.
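(For context: Home Assistant typically talks to faster-whisper over the Wyoming protocol. One common way to run it on CPU-only hardware is the rhasspy/wyoming-whisper container with a small quantized model; the model choice and port below are illustrative defaults, not necessarily what the commenter used.)

```shell
# Sketch: run faster-whisper as a Wyoming speech-to-text service on CPU.
# "tiny-int8" is a small quantized model suited to low-power hardware like
# a Raspberry Pi; larger models improve accuracy but raise latency on CPU.
# Port 10300 is the conventional Wyoming STT port.
docker run -d --name wyoming-whisper \
  -p 10300:10300 \
  -v whisper-data:/data \
  rhasspy/wyoming-whisper \
  --model tiny-int8 --language en
```

Home Assistant would then pick this up via the Wyoming Protocol integration, pointed at the host and port above.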

this post was submitted on 22 Nov 2023
176 points (100.0% liked)
