Martial Law of the Mind: Predictive Technologies to Bring in Wicked Times


[Embedded video: predictive technology]

Predictive technologies are the future police force. The biotech industry is ushering in a time when thoughts are treated as crimes, freedom does not exist, and humans are wirelessly connected to the internet. This plan, concocted by the NWO, will soon be a reality. This is not compromise or conspiracy theory: DARPA is actively working on brain-computer interfaces (BCIs) that will radically change the battlefield. Many misunderstood the dangers of Jade Helm 15 because the purpose of the martial law drill was covert and hidden. The true purpose behind Jade Helm 15 was militarized predictive technology. The testing phases are coming to an end, and predictive technology is being deployed on battlefields across the globe.

These 'futuristic' scripts are already in use today by private-sector corporations across the globe. Their purpose is to predict the enemy's moves before the enemy makes them. Predictive technology is a truly dangerous weapon, and a radical ideology.

This technology is already in public use in familiar ways: weather forecasting, Google predicting what you want to click before you click it, and Hulu surfacing relevant TV shows to the unsuspecting viewer. It made its entrance in these forms for a reason: to sway the populace into accepting its presence, because it appears useful to the computer user. However, there is an agenda behind this technology, and it is playing out today. Running on high-end computers, these dangerously accurate 'scripts' can predict thoughts before they are formed. How, one might ask? Remember the NSA's data collection and Facebook's 'profile'? That is your online presence, and through emotional captures, i.e., your camera, technicians can judge your responses and the moves you are most likely to make. Data mining was never just for shoppers; governments, private-sector firms, and other criminal corporations harvest the data and use it in experiments. After all, you didn't read the fine print stating that your data might go to third-party sources.

Just as these scripts are presented as 'useful' to the user today, tomorrow they will be presented as being for the 'good' of society.

Now a team of scientists has demonstrated that a computer can outperform human judges in predicting who will commit a violent crime. In a paper published last month, they described how they built a system that started with people already arrested for domestic violence, then figured out which of them would be most likely to commit the same crime again.

The technology could potentially spare victims from being injured, or even killed. It could also keep the least dangerous offenders from going to jail unnecessarily. And yet, there’s something unnerving about using machines to decide what should happen to people. If targeted advertising misfires, nobody’s liberty is at stake.

For two decades, police departments have used computers to identify times and places where crimes are more likely to occur, guiding the deployment of officers and detectives. Now they’re going another step: using vast data sets to identify individuals who are criminally inclined. They’re doing this with varying levels of transparency and scientific testing. A system called Beware, for example, is capable of rating citizens of Fresno, California, as posing a high, medium or low level of threat. Press accounts say the system amasses data not only on past crimes but on web searches, property records and social networking posts.
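Because Beware's actual algorithm is a trade secret (as noted below), any concrete rendering is necessarily guesswork. The following Python sketch is purely hypothetical: the field names, weights, and thresholds are invented, and it is shown only to illustrate the general pattern press accounts describe, folding several data sources into a single score and bucketing it into high, medium, or low.

```python
# Purely illustrative sketch: the real Beware algorithm is a trade
# secret, so every field, weight, and threshold here is hypothetical.
# It shows only the general shape press accounts describe: aggregate
# several data sources into one score, then bucket the score.
from dataclasses import dataclass

@dataclass
class CitizenRecord:                 # hypothetical fields per press accounts
    prior_arrests: int
    flagged_web_searches: int
    flagged_social_posts: int
    property_incidents: int

def threat_level(r: CitizenRecord) -> str:
    # Hypothetical linear weighting; any real system would differ.
    score = (3 * r.prior_arrests
             + 2 * r.flagged_social_posts
             + 1 * r.flagged_web_searches
             + 1 * r.property_incidents)
    if score >= 10:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

print(threat_level(CitizenRecord(2, 1, 2, 0)))  # -> "high"
```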

Critics are warning that the new technology has been rushed into use without enough public discussion. One question is precisely how the software works: it's the manufacturer's trade secret. Another is whether there's scientific evidence that such technology works as advertised.

By contrast, the recent paper on the system that forecasts domestic violence lays out what it can do and how well it can do it.

One of the creators of that system, University of Pennsylvania statistician Richard Berk, said he only works with publicly available data on people who have already been arrested. The system isn’t scooping up and crunching data on ordinary citizens, he said, but is making the same forecasts that judges or police officers previously had to make when it came time to decide whether to detain or release a suspect.

He started working on crime forecasting more than a decade ago, and by 2008 had created a computerized system that beat the experts in picking which parolees were most likely to reoffend. He used a machine learning system – feeding a computer lots of different kinds of data until it discovered patterns that it could use to make predictions, which then can be tested against known data.
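In plain terms, a minimal sketch of that loop might look like the following Python, with synthetic data standing in for real case records. The feature names, model choice (a random forest), and numbers are illustrative assumptions, not Berk's actual system or inputs.

```python
# A minimal sketch of the loop described above: feed a learner many
# labeled historical cases, let it discover patterns, then test its
# predictions against outcomes it has never seen. All data and feature
# names below are synthetic and illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
n = 2000

# Invented parolee features: [age, prior_convictions, months_served]
X = np.column_stack([
    rng.integers(18, 65, n),
    rng.integers(0, 15, n),
    rng.integers(1, 120, n),
])
# Invented outcome to learn: 1 = reoffended, 0 = did not.
y = (X[:, 1] * 2 - X[:, 0] * 0.2 + rng.normal(0, 3, n) > 0).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X[:1500], y[:1500])        # "feed the computer data" until it finds patterns

preds = model.predict(X[1500:])      # forecast cases the model never saw
print("held-out accuracy:", (preds == y[1500:]).mean())
```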

Machine learning doesn’t necessarily yield an algorithm that people can understand. Users know which parameters get considered but not how the machine uses them to get its answers.

In the domestic violence paper, published in February in the Journal of Empirical Legal Studies, Berk and Penn psychologist Susan Sorenson looked at data from about 100,000 cases, all occurring between 2009 and 2013. Here, too, they used a machine learning system, feeding a computer data on age, sex, zip code, age at first arrest, and a long list of possible previous charges for such things as drunk driving, animal mistreatment, and firearms crimes. They did not use race, though Berk said the system isn’t completely race blind because some inferences about race can be drawn from a person’s zip code.

The researchers used about two-thirds of the data to “train” the system, giving the machine access to the input data as well as the outcome – whether or not these people were arrested a second time for domestic violence. The other third of the data they used to test the system, giving the computer only the information that a judge could know at arraignment, and seeing how well the system predicted who would be arrested for domestic violence again.
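Under the same caveat, the two-thirds/one-third protocol described above can be sketched as follows. The feature list mirrors the paper's description (age, sex, zip code, age at first arrest, counts of prior charges), but the data, model choice, and thresholds are invented for illustration; this is not a reproduction of the study.

```python
# Sketch of the train/test protocol described above, on synthetic data:
# roughly two-thirds of cases "train" the model, the remaining third
# tests it using only information available at arraignment.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(7)
n = 100_000  # roughly the number of cases the paper examined

age           = rng.integers(18, 80, n)
sex           = rng.integers(0, 2, n)       # encoded 0/1
zip_code      = rng.integers(0, 50, n)      # encoded as categories
age_first     = rng.integers(12, 60, n)     # age at first arrest
prior_charges = rng.integers(0, 20, n)      # e.g. DUI, firearms, etc.

X = np.column_stack([age, sex, zip_code, age_first, prior_charges])
# Synthetic outcome: re-arrested for domestic violence (1) or not (0).
y = (prior_charges - (age_first - 12) * 0.3 + rng.normal(0, 3, n) > 2).astype(int)

# Two-thirds to "train" the system, one-third held out to test it.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1/3, random_state=0)

model = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
model.fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))
```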

It is vital to understand that what is stated above is not the whole truth. Data mining is being used by corporations for far more than marketing; it is experimentation in service of futuristic thought crimes.

Algorithms to kill freedom, scripts to destroy lives, and Martial Law of the mind. Wicked times lie ahead.

Works Cited

Faye Flam. “The Crime You Have Not Yet Committed.” Bloomberg View, 2016. http://www.bloombergview.com/articles/2016-03-08/the-crime-you-have-not-yet-committed

Duke Health. “Monkeys drive wheelchairs using only their thoughts.” Science Daily, 2016. http://www.sciencedaily.com/releases/2016/03/160303094320.htm