By now we are well aware that the government has been compiling a massive database of our communications for quite some time. The evidence provided by Edward Snowden presents the NSA more as a surveillance agency, rather than a security agency.
Most people would be surprised to learn that a significant number of people in the ruling class do not believe the current level of surveillance and intrusive behavior by the State goes far enough! They want to collect even more private data from individuals. This might be hard to believe, but there are software developers working feverishly to develop and test software that could identify you as a criminal before you ever break a law!
Jim Adler, the former chief privacy officer at a company called Intelius, is responsible for developing one type of predictive policing software. The software uses arbitrary characteristics or traits, such as gender, eye and skin color, and number of tattoos, to determine whether someone has committed a felony. Bloomberg reports on the groundbreaking software:
While working as the chief privacy officer at Intelius, an online provider of background checks, Jim Adler created software that demonstrates how just a few details about a person could be used to estimate the chances of someone committing a felony. Accurately, he says.
If that sounds like the stuff of science fiction, similar to Minority Report, the Tom Cruise movie in which people are arrested before their crimes happen, early forms of this type of predictive policing can be attempted today because of the enormous amounts of digital data on individuals being collected and analyzed.
To test his computer program, Adler began with tens of thousands of criminal records owned by Intelius and focused only on a few details about each person, including gender, eye and skin color, the number of traffic tickets and minor offenses, and whether the individual has tattoos. Based on that data, and excluding any information about a felony conviction, he said his algorithm determined with reasonable accuracy whether a person had committed a serious crime.
While Adler acknowledges there was “sample bias” in the data and that his program is “not ready for prime time,” he said a bigger sample with more historical information about individuals could be used to create a felon predictor — software that gives the statistical likelihood of someone committing a serious crime in the future. Scores could even be assigned to individuals.
Adler, who has testified before Congress and the Federal Trade Commission on big data and privacy issues, created the program to show both the potential benefits of using big data to stop trouble before it happens, as well as the possible dangers of going too far with using predictive technologies.
“It’s important that geeks and suits and wonks get together and talk about these things,” said Adler, who recently left Intelius on good terms and is now a vice president at Metanautix, a data analytics startup. “Because geeks like me can do stuff like this, we can make stuff work – it’s not our job to figure out if it’s right or not. We often don’t know.”
The applicability and implementation of this software should not be left up to “geeks and suits and wonks.” In the private sector, companies could use this type of software to improve their processes or to gain a competitive edge, and the market would be left to regulate and penalize bad actors.
For example, companies could use the software to improve their process for selecting prospective employees. Identifying candidates whose traits or characteristics match those of previous strong performers could be a beneficial hiring tool. Companies that use the tool in a discriminatory fashion would only be hurting themselves by eliminating candidates based on superficial attributes.
In contrast, this software placed in the hands of an entity that enjoys a monopoly in its industry could be extremely dangerous. Without market signals to regulate an abusive force, such as an agency of the State, the State would be free to discriminate without repercussions from the market. There are many people in government today who would love to have this power.
If you don’t think there is a risk that the US government would start using predictive policing software against its own people, then you obviously have not been paying attention to the revelations uncovered by NSA leaker Edward Snowden. All it has to do is find a way to sell it to the gullible masses as increased security.
Software like that described in the article above is only dangerous when placed in the unregulated hands of an agency of the State. The only force able to regulate such a potentially dangerous, but also useful, technology is the good ole market.