Every coin has two sides, heads and tails; and as Bret Michaels of Poison reminds us, “Every rose has its thorn.”
Such is the case with both social media and big data, which, like any new technology, discovery, or application, may be used for good and for less good.
Summary and commentary by Tim McElligott in FierceBigData; original article by Prof. Nancy Rothbard in Knowledge@Wharton.
Emphasis in red added by me.
Brian Wood, VP Marketing
———-
Big data helping social media become anti-social
I have the courage of my great-grandparents to thank for my living in a country where I am free to express my thoughts without fear of serious reprisal. I have no clue who they were or what circumstances led them to emigrate, but no matter the circumstances, it takes courage to uproot oneself and travel someplace from where one is unlikely to return. So thanks great-grandmas and great-grandpas, whoever you were.
The freedom to speak one’s mind has never been without consequence. The reprisals for doing so in the United States are nothing compared to those in other, less free countries, but thanks in part to big data and our own misguided assumptions about privacy, particularly with regard to social media, they are becoming more widespread and prejudicial. However, it is neither the government nor the police from whom we have to fear reprisal today; it is prospective employers, insurers, lenders and others.
I don’t claim to have been the victim of such reprisals, yet, but I adhere to some schools of thought that are quite unpopular in many circles, even among my family and friends, and they would undeservedly raise a red flag in the minds of those picking through my social networks. Being in the minority on these views, I have tended to bring them up socially only when provoked. That’s why the early days of social media were so welcome to me and others who find themselves isolated by their ideas. Social media helped people of like minds find each other, for better or worse, as it helped dangerous whack jobs find each other, too. It helped people explore their ideas openly and privately, and sometimes even helped them find their public voice.
I say the “early days” of social media because it seems that no matter what social format arises, private citizens leverage it for social purposes and drive its growth, and then the business community comes along and perverts it into a sales channel or an investigative tool. This isn’t all bad, as it ultimately pays the bills, allowing for the free use of these social media platforms.
But as an article this week from the Wharton School at the University of Pennsylvania reminds us, we should not be so free with our social presence. Social media may give us a platform for expressing our personal ideas and engaging with groups of like-minded, yet far-flung friends, but it also creates an environment where the consequences of doing so may be more insidious than anyone involved in the creation of the First Amendment could have imagined.
The Wharton article talks mostly in a positive way about the new era of “algorithmic hiring,” a practice recruiters use to find qualified candidates for open positions. It shows how companies use big data techniques to find the best candidates based on more than just their resumes or academic pedigrees, like the whiz-bang, self-taught programmer or the genius entrepreneurial dropout. Algorithmic hiring looks at the web sites a person frequents, the language a person uses to describe technology, the skills reported on LinkedIn, the projects worked on through Meetup groups, and other ancillary information not found on a resume.
Big data in this case is employed to spot “wasted talent” and eliminate human bias in filtering candidates. But algorithms also can be a great way to hide intended biases and allow people to get away with purposeful discrimination. Whatever criteria can be selected for bringing on talent can also be used to exclude. And detecting purposeful exclusion would be harder in a big data, algorithmic hiring environment. All that recruiters, lenders or anyone else have to say when the civil rights lawyers show up is, “We’re not discriminating. We let the computer decide.”
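To make that concern concrete, here is a minimal, entirely hypothetical sketch of the kind of weighted candidate-scoring function such a tool might use. The signal names and weights are invented for illustration and reflect no real vendor’s model; the point is that an exclusionary criterion would be just another weight in the table, indistinguishable at a glance from a legitimate one.

```python
# Hypothetical sketch of weighted candidate scoring. All signal names
# and weights are invented; nothing here describes a real hiring model.

def score_candidate(signals, weights):
    """Return a 0-1 score: the weighted average of per-signal scores."""
    total = sum(weights.values())
    # Missing signals count as 0.0, so absence of data quietly lowers a score.
    return sum(signals.get(name, 0.0) * w for name, w in weights.items()) / total

# A seemingly neutral model built from the kinds of sources the article
# mentions. Any one of these weights could act as a proxy for something
# else entirely; the numbers alone reveal nothing about intent.
weights = {
    "sites_frequented": 0.2,
    "language_use": 0.2,
    "linkedin_skills": 0.3,
    "project_activity": 0.3,
}
candidate = {"language_use": 0.6, "linkedin_skills": 0.9, "project_activity": 0.8}

print(round(score_candidate(candidate, weights), 2))
```

Because the output is a single opaque number, an auditor would have to reconstruct both the weights and the provenance of each signal to show that any exclusion was deliberate.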
Lawyers are smart, but they are not likely to start picking apart algorithms to uncover hidden but intended biases. The lack of privacy on social media sites is well known at this point. But as they say, a digital footprint never dies, and many a footprint was laid down when people still thought they had a modicum of privacy on these sites.
Every incremental benefit businesses find in engaging with social media seems to chip away at the incentive for people to use these platforms, and it deprives people of the primary social benefit of the Internet: making the world your neighborhood, where you can exchange ideas without fear of reprisal.
http://www.fiercebigdata.com/story/big-data-helping-social-media-become-anti-social/2013-05-09
———
Mind Your ‘Social’ Presence: Big-data Recruiting Has Arrived
Peter Steiner’s famed 1993 New Yorker cartoon of two dogs at a computer — with its caption, “On the Internet, nobody knows you’re a dog” — alluded to the difficulty of determining people’s online identities, including when it comes to recruiting workers. Now, a recent New York Times story on “algorithmic hiring” — which uses big-data analytics in place of traditional “talent markers” such as academic degrees — points to a new way of finding employees in a sea of online information.
According to the Times, big-data analytics promises to find hidden jewels like 26-year-old Jade Dominguez, an “average” high school student who did not attend college but taught himself computer programming. Dominguez was discovered by Gild, a San Francisco-based startup that is among a growing number of companies harnessing multiple data sources, including social networks, to automate parts of the hiring process and spot the right candidates. Gild claims it has developed a “technology that finds the people out there [and] tells you who’s good (and at what) and how to engage them.”
Gild crunches thousands of bits of information in calculating around 300 larger variables, including the web sites a person frequents, the language he or she uses to describe technology, the skills reported on LinkedIn and the projects he or she has worked on, among other criteria, according to the Times article. Other firms that have developed technology to spot talent through social networks include TalentBin and Entelo of San Francisco, and RemarkableHire of McLean, Va.
Algorithmic hiring may be an innovative way to help companies hire faster and better for certain positions, but is it more than a passing technological fad? “This kind of hiring may be on the rise in jobs where the skills needed are clearly tied to performance,” says Wharton management professor Nancy Rothbard. “This means that the performance metrics also need to be clear and tangible. However, [algorithmic hiring] may be difficult to get right in the many jobs where there is less clarity” regarding metrics.
With such hiring techniques, companies could end up shifting the emphasis disproportionately towards social traits and away from the tried-and-tested attributes of good resumes — strong academic careers, demonstrated skill sets, etc. Rothbard suggests ways to get around those obstacles. “At the end of the day, getting the algorithm to reflect the desired attributes of the company is going to matter a lot with this technique,” she says. “Having checks and balances in the system is important. Moreover, with this type of an approach, experimentation, testing and follow up measurement are likely to matter a lot.”
In the Times article, Vivienne Ming, chief scientist at Gild, noted that talented people are ignored, misjudged or fall through the cracks all the time. The “traditional markers” people use for hiring “can be wrong, profoundly wrong,” and big-data technology can spot “wasted talent” and eliminate human bias, she said. Ming should know: Born male, she noticed how people began treating her differently after her gender change, in both good and bad ways. According to Ming, using algorithmic hiring is a way to “let the data speak for itself.”