14 points

In fact, I’d assume a bot would be less likely to make a phonetic mistake than a person.

5 points

I started to say this in my previous comment, but on things like YouTube Shorts, I’ve noticed the baked-in subtitles they always have tend to be hilariously inaccurate, even when the video is using a text-to-speech program to read aloud something written on Tumblr or Reddit, so they had the text in the first place… They run text-to-speech on the post, then the subtitles come from speech-to-text run on that audio, so the original text never even enters the loop.

LLMs are trained on written text, and I don’t think they would invent that kind of misspelling on their own. Someone else mentioned the “should of” mistake, which I can see an LLM making, because it’s a common mistake humans make in writing. “Cost” instead of “caused” isn’t a mistake humans commonly make in text, so I don’t think an LLM would just come up with it. STT software has been pulling that shit for 30 years now though.

2 points

Absolutely. STT is still hit and miss on YouTube.

3 points

Likely. I was thinking that too, but it’s still sort of the same outcome. Journalism is dying a very public death.

