What if we could have predicted the ISIS insurgency in Iraq? Promising innovations in engineering and data science already exist, suggesting that we may be able to predict such conflicts in the future. The Satellite Sentinel Project effectively predicted that the Sudanese Armed Forces would invade Abyei in 2011. And through Big Data analysis of hundreds of news reports, Georgetown University fellow Kalev Leetaru was able to retroactively pinpoint the location of Osama bin Laden within a 124-mile radius of Abbottabad, Pakistan. Imagine how much shorter the war in Afghanistan could have been if we had just used the right algorithm.
How do we get it wrong? Predictions about the future fail in one of two ways: false positives, which erroneously assert that some event will happen (e.g., the existence of weapons of mass destruction in Iraq), and false negatives, which fail to foresee events that do happen (e.g., the failure of many policymakers to anticipate the relapse of violence in South Sudan). While the fallibility of policymakers depends on political context, all such failures are indicative of how difficult and complicated it is to foretell tomorrow. A lot could happen, and in the end, there is a good amount of guesswork.
According to The Naked Future: What Happens in a World That Anticipates Your Every Move by Patrick Tucker, two technological trends may help us become better guessers: increasing telemetry and connectivity. The author makes a bold—though not airtight—claim that we are leaving the Big Data age and entering the telemetric age.
Quoting the Oxford English Dictionary, Tucker defines telemetry as “the process or practice of obtaining measurements in one place and relaying them for recording or display to a point at a distance.” The unprecedented ability to measure billions upon billions of bits of data in real time will be facilitated by technological advancements in sensors, cameras, and microphones, together with improvements in today’s Big Data techniques of mining, analysis, and visualization. While it would be fair to assume that Tucker is considering the potential of what may be popularly referred to as intelligent machines or an “Internet of Things,” it is not the objects themselves that will define a singular change in humanity. Rather, it is the connectivity of the digital, the physical, and the human through telemetry that will reveal a naked future, one that, through streams of data, however known, minute, or non-intuitive, will expose to us a much more predictable world.
It is uncertain whether these new technologies will come to full fruition, achieving the level of determinism, perfection, and integrated completeness so often described throughout the book. Even so, people and policymakers should take heed: individuals, industries, and governments are working toward knowing more about a future that might affect them, and you.
Tucker, technology editor at Defense One and deputy editor of The Futurist, amplifies his claims of a new era through his many interviews with a range of technologists and futurists at Google, Stanford, MIT, Facebook, and Twitter, as well as his conversations with hackers, entrepreneurs, scientists, police officers, and US government officials. The author surveys a broad range of topics, including personal improvement through a quantified self, biometric healthcare, far-off forecasts of the weather, algorithmic movie recommendations and storytelling, personalized advertising, individualized online learning, online dating, and predictive policing. In the book’s trade-off of depth for breadth, the author threads a recurring theme: The future will become more predictable, whether we like it or not.
Yet the future will not be the only thing revealed. The chief sacrifice will be personal privacy, which we will all have to grow more comfortable relinquishing. Even if we close our Facebook and Twitter accounts, delete our browser cookies, and cancel our cellphones, our friends, family, and neighbors will continue to reveal things about us, and sensors, cameras, and microphones will be so ubiquitous and omniscient that there will be plenty of data with which to target us with ads, medicines, and movies.
While such sacrifices of privacy may seem innocuous for most of us in terms of, say, which television shows we might want others to know we watch, or whom we want to fall in love with, it is the realm of crime and predictive policing that may give us the most pause. Punishment and persecution are far graver consequences than humiliation. And it may be on this subject of policing that the increased connectivity of the digital, the physical, and the human could tip an uncertain balance toward automation, machines, and software divorced from the freedoms we take for granted in everyday life, such as the freedom to be anonymous, unwatched, and undisturbed on the train, or the freedom to express oneself fully when text messaging a friend.
In the future, predictive policing techniques will be increasingly applied across the world, from Baltimore to Beirut to Beijing. Advanced statistical techniques, machine-learning algorithms, and more responsive analysis will grant police officers more insight into the where, when, and who of crime prediction. By 2020, police officers will be able to capture potential and actual criminals with video feeds from satellites scanning from space, drones crisscrossing the skies, and closed-circuit television cameras embedded in buildings and streetlights. Mobile computing in the form of tablets, laptops, and phones will offer cops on the beat mobile dashboards connected to a diffuse and more responsive network of command and control centers. Knowing where to place (and not place) police officers, and better knowing from data which suspects are actually criminals, will lead more efficiently to more peaceful neighborhoods: local, regional, and global.
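To make the “where” of crime prediction concrete: at its simplest, a predictive policing model ranks locations by risk inferred from historical incident data. The sketch below is purely illustrative and not drawn from the book or any deployed system; the function name, grid coordinates, and data are all hypothetical, and real systems use far richer models than raw counts.

```python
from collections import Counter

def predict_hotspots(incidents, top_n=3):
    """Rank city grid cells by historical incident counts.

    incidents: list of (x, y) grid-cell coordinates of past crimes.
    Returns the top_n cells with the most past incidents, a naive
    proxy for where future incidents are most likely.
    """
    counts = Counter(incidents)
    return [cell for cell, _ in counts.most_common(top_n)]

# Hypothetical past incident locations on a city grid.
past = [(1, 2), (1, 2), (4, 4), (1, 2), (4, 4), (0, 0)]
print(predict_hotspots(past, top_n=2))  # → [(1, 2), (4, 4)]
```

Even this toy version hints at the policy questions the book raises: the model only ever redirects patrols toward places already heavily policed, so the data feeding it shapes, and can bias, where officers go next.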
According to Tucker, “We grow more accustomed to surveillance in general, especially when submitting to extra surveillance has a voluntary component, one that makes submission convenient and resistance incredibly inconvenient.” Submission to such surveillance may have clear benefits in terms of conflict prevention, but resistance, despite its inconvenience, may have clear merit, as the examples of Abyei and bin Laden suggest. On the other hand, what would the former Egyptian, Tunisian, and Libyan regimes have done with Leetaru’s methodology to predict the Arab Spring? The use and abuse of telemetry, Big Data, and an Internet of Things will likely vary across political contexts. Hence, governance will still matter greatly in the future, despite innovation.
Concerns over privacy should not lead us to live as hermits, far from it. The Luddites continue to lose, mostly because there is so much potential for greater peace, liberty, and justice to be gained from machine learning, intelligent things, and new technologies. The challenge is in balancing device- and data-driven improvements in these values against how much of our privacy we will actually need to surrender.
Thong Nguyen is Data Lab Program Administrator at the International Peace Institute and Carnegie Council non-resident fellow of the Future Worlds Project.