


Why you should always THANK Alexa: Study warns children who grow up barking orders at smart assistants could end up with bad manners

  • Experts surveyed 274 adults and studied how nice they were to virtual assistants
  • They found that ordering around Siri or Alexa has no effect on your manners
  • This is because adults are set in their ways and don’t see the assistants as people
  • Children may be more influenced as they are more likely to personify them

Barking orders at Alexa and Siri without so much as a please or thank you likely isn’t going to become a habit you carry over into the rest of your life.

This is because adults have already formed their behaviours for interacting with others — and, in their current form, we don’t see smart assistants as people. 

Researchers came to this conclusion after talking with 274 people and observing how they interacted with digital assistants like Alexa, Google Assistant and Siri.

However, children may be more susceptible to forming impolite habits from talking to smart assistants — partly because they are more likely to personify them.  

However, adults may begin to be more influenced by their interactions with smart machines as their designs become more human-like and relatable, the researchers added.



Information systems researchers James Gaskin and Nathan Burton of Brigham Young University surveyed 274 people and observed them interacting with both other people and their digital assistants to assess how polite they were in each case.

‘Worried parents and news outlets alike have fretted about how the personification of digital assistants affects our politeness,’ said Professor Gaskin.

‘Yet we have found little reason to worry about adults becoming ruder as a result of ordering around Siri or Alexa.’

‘In other words, there is no need for adults to say “please” and “thank you” when using a digital assistant.’

The duo had actually been expecting to find the opposite — that the way people treat Siri and Alexa might carry over into how they conduct themselves with other people.

The reason this is not the case, they suggest, is that — in their current form — these virtual assistants are not personified enough by the young adults whom the researchers involved in their experiment.

However, the same may not be true for children, who are both less set in their behavioural habits and may be more inclined to regard digital assistants as if they were real people.


In fact, Amazon and Google have already responded to similar concerns from parents, having programmed their smart assistants with the capacity to thank and compliment children who make polite requests.

In addition, the researchers say that they expect the impacts of talking to smart assistants and artificial intelligences to increase — and affect even adults — as these become more anthropomorphic in their form.

For example, devices such as Anki’s new Vector Robot — which has expressive eyes as well as a head and an arm-like attachment that both move — are more likely to be seen as having, and being able to understand, emotions.


In addition, the researchers say that they expect the impacts of talking to smart assistants and artificial intelligences to increase — and affect even adults — as these become more anthropomorphic in their form. Pictured, Anki’s expressive robot, Vector


‘The Vector Robot appears to do a good job of embodying a digital assistant in a way that is easily personifiable,’ Mr Burton said.

‘If we did the same type of study using a Vector Robot, I believe we would have found a much stronger effect on human interactions.’

The full findings of the study will be presented on August 17 during the Americas Conference on Information Systems held in Cancún, Mexico.

WHAT DO EXPERTS SAY ON GIVING ROBOTS STATUS AS PERSONS UNDER THE LAW?

The question of whether robots are people has European lawmakers and other experts at loggerheads. 

The issue first arose in January 2017, thanks to a paragraph of text buried deep in a European Parliament report, that advised creating a ‘legal status for robots’.

A group of 156 AI specialists from 14 nations has written an open letter to the European Commission in Brussels denouncing the move. 

Writing in the statement, they said: ‘We, artificial intelligence and robotics experts, industry leaders, law, medical and ethics experts, confirm that establishing EU-wide rules for robotics and artificial intelligence is pertinent to guarantee a high level of safety and security to the European Union citizens while fostering innovation.

‘As human-robot interactions become commonplace, the European Union needs to offer the appropriate framework to reinforce Democracy and European Union values.

‘In fact, the artificial intelligence and robotics framework must be explored not only through economic and legal aspects, but also through its societal, psychological and ethical impacts.

‘In this context, we are concerned by the European Parliament resolution on civil law rules of robotics, and its recommendation to the European Commission.’ 

They say that the creation of a legal status of an ‘electronic person’ for self-learning robots is a bad idea, for a whole host of reasons. 

This includes the fact that companies manufacturing the machines may be absolved of any legal liability for damage inflicted by their creations.

They added: ‘Legal status for a robot can’t derive from the Natural Person model, since the robot would then hold human rights, such as the right to dignity, the right to remuneration or the right to citizenship. 

‘The legal status for a robot can’t derive from the Legal Entity model,’ as afforded to businesses, ‘since it implies the existence of human persons behind the legal person to represent and direct it. This is not the case for a robot.’ 

‘Consequently, we affirm that the European Union must prompt the development of the AI and robotics industry insofar as to limit health and safety risks to human beings.

‘The protection of robots’ users and third parties must be at the heart of all EU legal provisions.’ 

 
