
Alexa and Google Home are no threat to regional accents – here's why



Credit: Shutterstock

"Hello, Google. Can you understand my accent?"


"Of course. You have an incredibly comfortable dialect."

So began my Saturday morning chat with my Google Home Mini. And while I was happy to receive a compliment about my speech, that was not why I asked the question. Our human-to-robot chat was prompted by recent news reports claiming that voice assistants are likely to "stamp out", "kill off" and make us "lose" our regional accents, which some described as "under threat".

Another apocalyptic story about language, and another call for linguists to do some myth-busting.

These alarming headlines relate to the findings of a recent study by the Life Science Centre in Newcastle-upon-Tyne, which claims that 79% of visitors to its Robots – Then and Now exhibition reported changing the way they speak in order to be understood by voice assistants such as Alexa (Amazon), Google Assistant, Siri (Apple) and Cortana (Microsoft). Crucially, these visitors also reported having regional British accents, and their perceived need to shift their accents highlights two issues.

One issue is practical: people with non-standard accents are more likely to have trouble being understood when interacting with their smart speakers. The second is social: speech technology seems to privilege standard accents, such as Southern Standard British English (also known as Received Pronunciation). But it is important to note that our virtual assistants are not inherently biased.

"Hi, Google. Do you like how I speak?"

"Sorry, I can not judge. But I can play some music on you."

Language-based discrimination is all around us, and when we ask our devices to play our favorite song and end up with something else, or with a "Sorry, I don't understand", it can leave us wondering why our voices are not being heard and why these machines seem to work against those of us with regional accents.

The internet abounds with clips and complaints. Scottish accents prove particularly difficult for speech-recognition technology, as shown by research and popularized by the comedy sketch show Burnistoun. But we should bear in mind that recognizing the wide range of English accents that exist is no small feat (consider the different accents spoken in your own region, then think about the number of accents spoken in English-speaking countries around the world and by other speakers of English).

Research suggests that we are generally good at recognizing where people are from based on their accents, but that we sometimes struggle, usually because of a lack of prior exposure. So it makes sense that our virtual assistants also need to become acquainted with regional accents, through exposure to recorded speech, in order to recognize, or even use, those accents when interacting with us.

"Hello, Google. Are smart speakers stamping out regional accents?"

" Sorry, I do not understand. "

Okay, I probably stumped my Google Assistant with that question, but I was curious to see whether it would foretell some kind of linguistic doomsday. And I was prepared to counter the end-times pessimism with the reassuring message that voice assistants will not kill off regional accents. Here are just a few reasons why.

Calm down

Human-to-robot interactions are far less frequent than human-to-human interactions. The vast majority of our daily talk is directed at other people. Changing how we speak a few times a day to interact with smart speakers will not affect our accents more generally.

Our "smart speaker accents" look like accents we already use regularly. We do not tend to change our accents drastically when talking to smart speakers. We can increase our volume, talk slowly and say more. We can pronounce things differently unhealthy, but only a little. And we adapt our speech in similar ways in many other contexts: calling, making job interviews, talking with people from other cities and countries.

Accents, and the language we use more generally, are fundamental to who we are. We create and convey our identities through language and other social practices. Through our speech we can emphasize our professionalism in some settings, and our street cred in others. Our accents are often a source of regional pride, tied to a sense of local loyalty and belonging. We are not robots working from a pre-recorded model of language. We speak in certain ways because it is meaningful to do so.

So do not fear: the end of rich and varied British accents is not nigh. If anything, voice-activated artificial intelligence is making regional speech an increasingly relevant part of everyday interactions. Rather than seeing voice assistants as a threat to regional accents, we might do better to embrace the challenge of teaching them our lingo.

"Hello, Google. Do you like speaking in a particular accent?"

" I'm afraid you have to put up with this for now. "

In terms of their ability to produce language, smart speakers have undoubtedly come a long way. My Google Assistant, for example, now speaks German, French, Japanese, Italian and Spanish, as well as several varieties within each of these languages. It can answer me in an Australian, Canadian, British, American, Indian, Singaporean or Irish English accent (albeit mainly in standard varieties at this stage, and often with glitches). In this sense, voice assistants are more representative and inclusive than ever before, but of course there is still a way to go.



Provided by The Conversation

