Algorithms are like invisible judges that decide our fate
http://www.thehindu.com/sci-tech/science/algorithms-are-like-invisible-judges-that-decide-our-fates/article7147576.ece
Companies now use ‘voice analysis’ software to determine whether to hire us. And, once we’re employed, to predict if we’ll stay
Imagine that you’re a contestant in an audition round of The Voice, where you belt out your best I Will Always Love You. A minute passes. No reaction from the celebrity judges. You keep singing. Another minute, still no encouraging smile or nod.
You strain to hit your highest note, pleading with your performance: “Please, please accept me! I am doing my best!” The song ends. No one wants you. Your family bow their heads in shame. Your mom cries. You stand on the stage, alone in the spotlight, heartbroken. A trap door opens beneath your feet and you slide screaming into Adam Levine’s basement torture maze.
Dehumanising
Think that’s bad? In the real world, science has come up with something worse. A company called Jobaline offers “voice profiling” to predict job success based on how candidates sound; its algorithm identifies and analyses over one thousand vocal characteristics by which it categorises job applicants on suitability.
It’s horrible and dehumanising, like all our other profiling (the racial kind is always a big hit!) and reliant on born-in, luck-of-the-genetic-draw factors that we can neither avoid nor control.
This is not the only creepy algorithmic system HR departments have been employing to help the company's bottom line. Companies like Wal-Mart and Credit Suisse have been crunching data to predict which employees are “flight risks” likely to quit (easily remedied with a simple anklet attaching the worker to his or her cash register or cubicle) and which are “sticky,” meaning in it for the long haul. The information lets bosses either improve morale or get a head start on the search for a replacement.
The inventors of such programs often enjoy the unimpeachable, amoral cloak of scientific legitimacy. When it comes to voice profiling, the computers are not judging the speakers themselves, only the reactions the speaker’s voice provokes in other (presumably human) listeners.
‘Mechanical judge’
“The algorithm functions as a mechanical judge in a voice-based beauty contest,” wrote Chamorro-Premuzic and Adler in the Harvard Business Review. “Desirable voices are invited to the next round, where they are judged by humans, while undesirable voices are eliminated from the contest.”
The makers of voice profiling programs tout this as a moral achievement. Human beings bring loads of biases into any evaluation; computers are blissfully unaware of differences in race, gender, sexual preference or age. “That’s the beauty of math!” Jobaline CEO Luis Salazar told NPR. “It’s blind.”
The problem is, when applied in a capitalist system already plagued by unfairness and inhumanity, this blindness sounds really, really dangerous. An impersonal computer program gets first say as to who gets to earn money to buy food and who doesn’t, based on an application of a binary code too subtle and complex for us to understand. — © Guardian Newspapers Limited, 2015