Welcome to Algorithmic Prison
From Critiques Of Libertarianism
<!-- you can have any number of categories here --> [[Category:Bill Davidow]] [[Category:Algorithmic Prison|100]] <!-- 1 URL must be followed by >= 0 Other URL and Old URL and 1 End URL.--> {{URL | url = http://www.theatlantic.com/technology/archive/2014/02/welcome-to-algorithmic-prison/283985/}} <!-- {{Other URL | url = }} --> <!-- {{Old URL | url = }} --> {{End URL}} {{DES | des = The algorithmic prison idea is that big data allows business and government to deny us loans, jobs, right to travel, etc. without our knowing why or being able to contest and change the data. This also makes us very vulnerable to dirty tricks. | show=}} <!-- insert wiki page text here --> <!-- DPL has problems with categories that have a single quote in them. Use these explicit workarounds. --> <!-- otherwise, we would use {{Links}} and {{Quotes}} --> {{List|title=Welcome to Algorithmic Prison|links=true}} {{Quotations|title=Welcome to Algorithmic Prison|quotes=true}} {{Text | Corporations and governments are using information about us in a new—and newly insidious—way. Employing massive data files, much of the information taken from the Internet, they profile us, predict our good or bad character, creditworthiness, behavior, tastes, and spending habits—and take actions accordingly. As a result, millions of Americans are now virtually incarcerated in algorithmic prisons. Some can no longer get loans or cash checks. Others are being offered only usurious credit-card interest rates. Many have trouble finding employment because of their Internet profiles. Others may have trouble purchasing property, life, and automobile insurance because of algorithmic predictions. Algorithms may select some people for government audits, while others find themselves undergoing gratuitous and degrading airport screening. An estimated 500 Americans have their names on no-fly lists. 
Thousands more are targeted for enhanced screening by the Automated Targeting System algorithm used by the Transportation Security Administration. By using data including "tax identification number, past travel itineraries, property records, physical characteristics, and law enforcement or intelligence information," the algorithm is expected to predict how likely a passenger is to be dangerous. Algorithms also constrain our lives in virtual space. They determine what products we will be exposed to. They analyze our interests and play an active role in selecting the things we see when we go to a particular website. Eli Pariser argues in The Filter Bubble, "You click on a link, which signals your interest in something, which means you are more likely to see articles about that topic" and then "you become trapped in a loop." The danger is that you emerge with a very distorted view of the world. If you’re having trouble finding a job as a software engineer, it may be because you got a low score from Gild, a company that predicts the skill of programmers by evaluating the open source code they have written, the language they use on LinkedIn, and how they answer questions on software social forums. Algorithmic prisons are not new. Even before the Internet, credit reporting and rating agencies were a power in our economy. Fitch, Moody’s, and Standard & Poor’s have been rating business credit for decades. Equifax, the oldest credit rating agency, was founded in 1899. When algorithms get it right (and in general they do a pretty good job), they provide extremely valuable services to the economy. They make our lives safer. They make it easier to find the products and services we want. Amazon constantly alerts me to books it correctly predicts I will want to read. They increase the efficiency of businesses. But when algorithms get it wrong, real suffering follows. 
Most of us would not be concerned if 10 or 100 times too many people ended up on the TSA’s enhanced airport screening list as long as an airplane hijacking was avoided. In times when jobs are scarce and applicants many, most employers would opt for tighter algorithmic screening. There are lots of candidates to hire, and more harm may be done by hiring a bad apple than by missing a potentially good new employee. And avoiding bad loans is key to the success of banks. Missing out on a few good ones in return for avoiding a big loss is a decent trade-off. But we’ve reached the point where, in many cases, private companies and public institutions stand to gain more than they will lose if a lot of innocent people end up in algorithmic prison. A related concern is this: Surveillance has become automated through Internet tools that capture data from cellular phones and low-cost cameras, and through the ability to analyze big databases economically. As a result, it has become much easier—and a lot less costly—to construct algorithmic prisons. Not only can we expect to see a great increase in the number of algorithmic prisons, but thanks to cheaper and more efficient tools the value derived from establishing them will increase. A number of services already facilitate the creation of algorithmic prisons. Acxiom, for instance, a marketing services company, monitors 50 trillion transactions annually and maintains about 1,500 data points on 500 million consumers worldwide. That same database can serve as a key component in the construction of an algorithmic prison. There are other features of algorithmic prisons that a latter-day antagonist in a tale by Kafka might have dreamed up. A consumer or job seeker might know only that he has trouble getting credit or a job interview. What he may not know is that the bars of an invisible prison are keeping him from reaching his goal. The federal Consumer Financial Protection Bureau lists more than 40 consumer-reporting companies. 
These are services that provide reports for banks, check cashers, payday lenders, auto and property insurers, utilities, gambling establishments, rental companies, medical insurers, and companies wanting to check out employment history. The good news is that the Fair Credit Reporting Act requires those companies to give consumers annual access to their reports and allows a consumer to complain to the Consumer Financial Protection Bureau if he is being treated unfairly. Good luck with that. Even if an algorithmic prisoner knows he is in a prison, he may not know who his jailer is. Is he unable to get a loan because of a corrupted file at Experian or Equifax? Or could it be TransUnion? His bank could even have its own algorithms to determine a consumer’s creditworthiness. Just think of the needle-in-a-haystack effort consumers must undertake if they are forced to investigate dozens of consumer-reporting companies, looking for the one that threw them behind algorithmic bars. Now imagine a future that contains hundreds of such companies. A prisoner might not have any idea as to what type of behavior got him sentenced to a jail term. Is he on an enhanced screening list at an airport because of a trip he made to an unstable country, a post on his Facebook page, or a phone call to a friend who has a suspected terrorist friend? Finally, how does one get his name off an enhanced screening list or correct a credit report? Each case is different. The appeal and pardon process may be very difficult—if there is one. It is impossible to fathom all the implications of algorithmic prisons. Yet a few things are certain: Even if they do have great economic value for businesses, and even if they do make our country a safer place, as they continue to proliferate, many of us will be injured, seriously inconvenienced, or greatly frustrated as a result. 
Even if we all believed algorithmic prisons present a serious threat to individual freedom, it would be difficult to come up with a reasonable solution to the problems they create. Personally speaking, I'd favor requiring all companies to destroy all data collected about me within, say, 48 hours, unless I have given explicit permission otherwise. I would also prohibit the sale of my personal information or its use for advertising. Well, that is a nice idea, but it is fraught with problems. Under those rules, accurate credit reports would be impossible. And I would want law enforcement agencies to have access to all that information subject to the right restrictions and oversight. If the data is destroyed, that would be impossible. What is clear is that the consumer protections in place at the moment do not suffice. An additional set of carefully constructed restrictions is required. Being held in any number of algorithmic prisons is a scenario I, for one, do not want to be caught up in. And I doubt I am alone. }}
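The opaque scoring the article describes — a weighted profile score, a hard threshold, and a verdict delivered without explanation — can be sketched in a few lines. This is a toy illustration only: the feature names, weights, and threshold below are all hypothetical, invented for this example, and bear no relation to any real screening or credit system.

```python
# Toy sketch of an "algorithmic prison": a weighted risk score computed
# from profile data, with a hard cutoff that denies service.
# All features, weights, and the threshold are hypothetical.

WEIGHTS = {
    "missed_payments": 0.5,
    "travel_to_flagged_region": 0.3,
    "thin_credit_file": 0.2,
}
THRESHOLD = 0.4  # arbitrary cutoff, chosen only for this example

def risk_score(profile: dict) -> float:
    """Weighted sum of binary risk features (1 = present, 0 = absent)."""
    return sum(WEIGHTS[f] * profile.get(f, 0) for f in WEIGHTS)

def decide(profile: dict) -> str:
    # The applicant sees only the verdict, never the score or the
    # features that produced it -- the "invisible bars" of the article.
    return "denied" if risk_score(profile) >= THRESHOLD else "approved"

applicant = {"missed_payments": 0, "travel_to_flagged_region": 1, "thin_credit_file": 1}
print(decide(applicant))  # denied, with no reason given to the applicant
```

Note that nothing in `decide` explains *why* the applicant was denied, and two of the three features have nothing to do with repayment history — which is exactly the opacity and contestability problem the article raises.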