Glorious Alpha Two Testers!
Phase I of Alpha Two testing will occur on weekends. Each weekend is scheduled to start on Fridays at 10 AM PT and end on Sundays at 10 PM PT. Find out more here.
Check out Alpha Two Announcements here to see the latest Alpha Two news and update notes.
Our quickest Alpha Two updates are in Discord. Testers with Alpha Two access can chat in Alpha Two channels by connecting your Discord and Intrepid accounts here.
The AI voicework feels off
Halae
Member, Alpha Two
I can't be the only one who feels that the few lines of machine VA we've gotten sound weird and wrong. Like, I don't have any inherent problem with language models and such as long as everything is sourced ethically, but all the characters that have voices right now sound... drunk. Or maybe high. Their speech is slow and plodding, has the wrong cadence, mispronounces colloquialisms, occasionally mispronounces actual words, and just in general feels wrong.
I'd actually prefer no VA to what we have now. Every time I hear it I try to blast through it as fast as possible so that I don't have to listen to it, because it falls into a kind of auditory uncanny valley. I don't know whether there's a way to fix that with the language models that are currently available on the market.
Comments
'ere' turning into 'air' half the time, the awkwardly long pauses, emphasis consistently in the wrong places
I'd rather the team just read the lines themselves. It'd turn out better than this.
Surely generating new voice lines every time isn't good for performance.
It's not generating them fresh, judging by the various mismatches between the text and the audio.
Intrepid needs to keep developing the tech and it will be great. On a side note, let them keep developing instead of demanding, three days into Alpha Two, that features be removed because they aren't perfect.
But Tradesmanager Hernet needs work. I re-read/re-listened to every conversation with him at least four times due to how horribly brainfudging his AI voice was. I'd prefer he not lose the flavour of the text, but rather that the AI voice get better at delivering the lines.
Example clip: https://www.twitch.tv/nratv/clip/AcceptablePlumpRabbitAllenHuhu-wSSiy8LP0zUdFhNl
They're using their own form of AI voices, which they produce in-house. And in the Alpha streams it was already mentioned that they'll still be fine-tuning the voices, so what we've heard so far is only the 'raw' version.
There are going to be some growing pains for now, but it will improve drastically by the time the game releases.
Does anyone know how much machine learning/"AI" is used in this game? I don't typically buy or support games that use generative content or art.