While workplace sexual harassment is common enough that it hardly raises eyebrows any longer, abuse directed at an anthropomorphized or even personified computer program is still a new phenomenon. Following reports that Apple's Siri and Amazon's Alexa routinely receive sexually harassing messages from users, Microsoft's Cortana, the virtual assistant that ships with Windows 10, is next: the female-voiced assistant has been found to receive a significant number of sexually offensive questions from users.

While Microsoft's earlier talkback services offered a male voice option, experts believe it is important for software such as Cortana to speak with a female voice because of the soothing effect it has on users. However, because the software is adaptive and learns from the words and questions the public puts to it, queries about the assistant's sex life, which fall well outside the range of questions a user would normally ask a computer, complicate matters for its developers.

Microsoft has stated that it will not take the abuse of its AI software lightly and intends to push back as forcefully as it can. At the moment, the corporation's AI team is working to help Cortana handle inappropriate questions on its own. How exactly the firm will deal with the problem remains unclear, since there are currently no laws that criminalize speaking inappropriately to software. There are hurdles to overcome, but one hopes a solution to this unusual problem is found soon.
