if AI ever gets good enough to do this accurately and in real time, we’d be looking at an actual babelfish
i would finally, at long last, have to hand it to them
Sentence structure means it kind of can’t happen in real time as such, because an accurate and natural-ish translation may need to wait until the end of the sentence before it can render words that appear early in it. If “20 seconds later” counts as real time (barring run-on sentences, which are much more common in speech than in writing), then I guess.
I think most people are okay with a reasonable delay if the live interpretation is accurate.
Excited to hear Hell of Presidents in the original Mandarin, as it was intended.
the series has been so good and it makes me miss Matt Christman even more :(
wild. any other mandarin speakers get a bizarre almost synaesthetic sensation from this?
Here’s the funny part: their American accent totally made it believable.
It’s very clear that even with the AI-generated voice, they are not native Mandarin speakers. They sound like your typical foreigners who learned Chinese for a number of years lol. I don’t know if it’s the dataset the models were trained on or just how the algorithm works, but it’s very interesting.
Makes me think about what it would be like if Chinese ever becomes an international language, in the way English has and Latin did before it. It makes me giggle to think about Mandarin with a backwoods Tennessee drawl.
The best comparison for me is Montreal French. Deadass sounds like your uncle from up north getting a little parlez-vous on.
Even for phonemes of two language varieties that are considered to be “the same sound”, there are going to be differences in the average pronunciation, so I assume that’s a lot of what’s going on here. The other thing is that English and Chinese have a lot of phonemes whose possible pronunciations barely overlap, or don’t overlap at all, so the algorithm picks the closest match.
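That “closest match” idea can be sketched very simply: represent each phoneme as a small articulatory feature vector and substitute the nearest phoneme from the speaker’s inventory. This is a toy illustration only; the feature values and inventories below are made up, and a real TTS/translation system would not work this literally.

```python
def nearest_phoneme(features, inventory):
    """Pick the symbol in `inventory` whose feature vector is closest
    to `features` by Manhattan distance."""
    def dist(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    return min(inventory, key=lambda sym: dist(features, inventory[sym]))

# Toy feature space: (place of articulation, retroflexion), scaled 0..1.
# These numbers are invented for illustration, not real phonological data.
english_inventory = {
    "s":  (0.4, 0.0),   # alveolar, no retroflexion
    "sh": (0.6, 0.2),   # postalveolar, slight tongue curl
}

mandarin_sh = (0.6, 0.8)  # retroflex "sh" as in pinyin "shi"
print(nearest_phoneme(mandarin_sh, english_inventory))  # picks "sh"
```

The substituted “sh” is close but loses the retroflexion, which is roughly why the output sounds like a learner’s accent rather than a native one.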
finally a good use for AI