
Assigning female genders to digital assistants such as Apple’s Siri and Amazon’s Alexa is helping entrench harmful gender biases, according to a UN agency.

...and

“Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’,” the report said.

Being nice and helpful isn't "harmful". In fact, when I meet someone who needs directions or is having issues, I try to be nice and helpful.

This seems like the usual Gender Studies mumbo jumbo, since it doesn't indicate exactly or quantitatively how this impacts people; it just goes off on a wild exaggeration spiral.

...also:

“This harassment is not, it bears noting, uncommon. A writer for Microsoft’s Cortana assistant said that ‘a good chunk of the volume of early-on enquiries’ probe the assistant’s sex life.”

I don't tell my Cortana to go fuck herself; I just often google how to kill her, because I hate the stupid apps and settings Microsoft forces onto my laptop that slow it to a crawl. And when I finally manage to remove or disable a bunch of them, an update (which I can no longer block) comes along and re-enables them.

It cited research by a firm that develops digital assistants that suggested at least 5% of interactions were “unambiguously sexually explicit” and noted the company’s belief that the actual number was likely to be “much higher due to difficulties detecting sexually suggestive speech”.

1- They're not people.

2- This is making a big deal out of nothing. What next? Are they going to consider inflatable rubber dolls rape victims because they don't provide consent?