-Signs, Portents, and the Weather-
AI programs exhibit racial and gender biases, research reveals
2017-04-20
[THEGUARDIAN] Machine learning algorithms are picking up deeply ingrained race and gender prejudices concealed within the patterns of language use, scientists say.

An artificial intelligence tool that has revolutionised the ability of computers to interpret everyday language has been shown to exhibit striking gender and racial biases.

The findings raise the spectre of existing social inequalities and prejudices being reinforced in new and unpredictable ways as an increasing number of decisions affecting our everyday lives are ceded to automatons.

In the past few years, the ability of programs such as Google Translate to interpret language has improved dramatically. These gains have been thanks to new machine learning techniques and the availability of vast amounts of online text data, on which the algorithms can be trained.

However, as machines are getting closer to acquiring human-like language abilities, they are also absorbing the deeply ingrained biases concealed within the patterns of language use, the latest research reveals.
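The mechanism is easy to illustrate: word embeddings place words in a vector space, and biased associations show up as differences in cosine similarity between word vectors. Below is a minimal sketch of that idea in Python, using made-up toy vectors as stand-ins for the real pretrained embeddings the researchers analysed; the words and numbers are purely illustrative and not taken from the study.

import numpy as np

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Toy 4-dimensional "embeddings" -- illustrative stand-ins for real
# pretrained vectors (e.g. GloVe or word2vec), not data from the study.
vectors = {
    "woman":    np.array([0.9, 0.1, 0.3, 0.2]),
    "man":      np.array([0.1, 0.9, 0.3, 0.2]),
    "nurse":    np.array([0.8, 0.2, 0.4, 0.1]),
    "engineer": np.array([0.2, 0.8, 0.4, 0.1]),
}

# If the training text carries a bias, occupation words end up measurably
# closer to one gendered word than the other in the learned space.
for occupation in ("nurse", "engineer"):
    to_woman = cosine(vectors[occupation], vectors["woman"])
    to_man   = cosine(vectors[occupation], vectors["man"])
    print(f"{occupation}: sim(woman)={to_woman:.3f}  sim(man)={to_man:.3f}")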
Posted by: Fred

#5  Sad to say, you could not make a movie like Airplane! today. Write a line like "I speak jive" and the PC brigade would string you up from the lamp post.

Not only that, but the folks on the internet would slice and dice you for calling it jive and not Ebonics, for Ms Cleaver's cultural appropriation of their language, and for the plot hole of the black fellow eating the tainted fish instead of the chicken.
Posted by: rjschwarz   2017-04-20 16:38  

#4  The words “female” and “woman” were more closely associated with arts and humanities occupations and with the home, while “male” and “man” were closer to maths and engineering professions.

One could make a similar observation wandering around a college campus. How odd that our language mirrors the world we see.

Sandra Wachter, a researcher in data ethics and algorithms at the University of Oxford, said: "The world is biased, the historical data is biased, hence it is not surprising that we receive biased results."

tl;dr: We don't like the world and demand it conform to our prejudices about what is right and proper. (sound of tiny feet stamping)
Posted by: SteveS   2017-04-20 09:27  

#3  "Sir, my first job was programming binary loadlifters—very similar to your vaporators in most respects."
Posted by: Procopius2k   2017-04-20 09:14  

#2  Sad to say, you could not make a movie like Airplane! today. Write a line like "I speak jive" and the PC brigade would string you up from the lamp post.

As to machine speech: does this just mean English, or all languages?
Posted by: Cheaderhead   2017-04-20 06:15  

#1  And for those of you really interested (professionally or personally) in word embeddings, sentiment analysis, SVMs, and sparse word matrices... don't miss NLP Day Texas 2017
Posted by: Skidmark   2017-04-20 00:57  