This is a guest post by Derek Osborne, converted from a letter to me. Derek is a father of four and an active participant in his community, with a strong belief that real change happens at the local level. He is a data scientist at Intel, where he works on a team that uses machine learning techniques to optimize the company's workforce. Before joining Intel, he earned his Ph.D. in Biophysics from the University of Michigan.
I moved to Hillsboro, Oregon four years ago with my wife and three kids after finishing my Ph.D. at the University of Michigan. Like many parents choosing a home, I checked the ratings of the nearby elementary schools, and there was a large variance in the Zillow school scores, which are taken from greatschools.org.
I’ve got a new Bloomberg View column out: A Mathematician’s Secret: We’re Not All Geniuses. See all my Bloomberg View columns here.

For each certified genius, there are at least a hundred great people who helped achieve such outstanding results. You don’t have to be a genius to become a mathematician. If you find this statement at all surprising, you’re an example of what’s wrong with the way our society identifies, encourages, and rewards talent.
What if Fox News decided to address its gender and racial discrimination issues by entrusting personnel decisions to an algorithm? It’s a fascinating thought experiment -- and one that helps illustrate the dangers of putting too much trust in big data.
Israeli historian Yuval Harari is getting a lot of attention with a dramatic vision of the future, in which humans merge with technology to evolve beyond themselves and ultimately colonize outer space -- potentially making our generation one of the last of Homo sapiens.
I'm hoping this is more of a cautionary tale than a roadmap for the development of our species.
Harari is a book-writing rock star, whose volume "Sapiens" won praise from the likes of Barack Obama and Bill Gates. In his follow-up, "Homo Deus," he makes a lot of speculative statements in a way...
Maybe it’s not so bad to have algorithmic overlords -- at least when they are pressured into protecting people rather than exploiting them.
Earlier this week, Facebook declared that it will no longer let certain kinds of advertisers engage in racial profiling. Specifically, it will prohibit credit, housing, and employment advertisers from using “ethnic affinity” categories -- marketing profiles that correlate strongly with race -- in deciding whom to exclude from their audience.
The reform didn’t happen out of the blue. Pressure came first from a ProPublica investigative experiment last October, in which the news organization successfully submitted to Facebook a housing advertisement...
Yesterday I wrote a post about the unsurprising discriminatory nature of recidivism models. Today I want to add to that post with an important goal in mind: we should fix recidivism models, not trash them altogether.
The truth is, the current justice system is fundamentally unfair, so