Facebook 'Gaydar' Shows How Creepy Algorithms Can Get

Imagine what an oppressive government could do with it.

by Cathy O'Neil

Watch out. (Photographer: Jin Lee/Bloomberg)

Artificial intelligence keeps getting creepier. In one controversial study, researchers at Stanford University have demonstrated that facial recognition technology can identify gay people with surprising precision, although many caveats apply. Imagine how that could be used in the many countries where homosexuality is a criminal offense.


The lead author of the “gaydar” study, Michal Kosinski, argues that he’s merely showing people what’s possible, so they can take appropriate action to prevent abuse. I’m not convinced.

When people hear about algorithms recognizing people through masks, finding terrorists and identifying criminals, they tend to think of dystopian movies like “Minority Report,” in which Tom Cruise prevented murders with the help of “precogs” -- human beings with supernatural, albeit fatally flawed, foresight caused by a childhood neurological disease.

Reality is much worse. We don’t have precognition. We have algorithms that, although better than random guessing and sometimes more accurate than human judgment, are very far from perfect. Yet they’re being represented and marketed as if they’re scientific tools with mathematical precision, often by people who should know better.

This is an abuse of the public’s trust in science and in mathematics. Data scientists have an ethical duty to alert the public to the mistakes these algorithms inevitably make -- and the tragedies they can entail.

That’s the point I made in a recent conversation with Kosinski, who is also known for creating the “magic sauce” psycho-profiling algorithm that Cambridge Analytica later adapted to campaign for both Brexit and Donald Trump.

His response was that we’re both trying to warn the world about the potential dangers of big data, but with different methods. He’s showing the world the “toy versions” of algorithms that can and surely are being built with bigger and better data elsewhere -- and he doesn’t derive any income from the commercial applications. Academic prototyping, if you will.

I don’t buy that. It’s like complaining about the dangers of war while building bombs. Even “toy versions” can be very destructive when people put too much faith in them. And they do, which is why companies like Cambridge Analytica can make money peddling their secret sauce.

Consider the gaydar algorithm. A government could use it to target civilians, declaring certain people “gender atypical” and “criminally gay” because the black box says so -- with no appeals process, because it’s “just math.” We’ve already seen this very scenario play out in other contexts, for example with algorithmic assessments of public school teachers. The difference is that instead of losing their jobs, people could lose their freedom -- or worse.

People who work with big data must guard against this. Of course, oppressive regimes don’t need algorithms to be oppressive. But we shouldn’t allow them to appeal to the authority of math and science in doing so. We should make them do their nasty things in full view. We should expose their political nature, because political fights at least might look winnable in the long run.

When I asked Kosinski about this, he seemed more worried that the algorithm would work really well than about the false pretense of scientific authority. Maybe he thinks everyone understands the flaws. Maybe he believes it’s just a matter of time before the algorithms finding criminals and terrorists become much more accurate -- although he did acknowledge that there’s little reason to think his gaydar results would translate to other countries.

I don’t think I convinced him to stop building creepy models for the sake of demonstrating how creepy things might get. So watch out.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

To contact the author of this story:
Cathy O'Neil at cathy.oneil@gmail.com

To contact the editor responsible for this story:
Mark Whitehouse at mwhitehouse1@bloomberg.net




