Bits of Books - Books by Title


The Naked Future

What Happens in a World That Anticipates Your Every Move?

By Patrick Tucker



In 2009 a woman named Carol Kasyjanski became the second human to receive a Bluetooth-enabled pacemaker that allows her heart to dialogue directly with her doctor. The first was former VP Dick Cheney, who got one in 2007 but never activated its broadcasting capability for fear of hackers.

Quantified Self movement: devices such as Fitbit which publicly broadcast the sort of measurements which previously were only available in a hospital.

Bird flu: the H5N1 strain turns deadly to mammals when one of its proteins mutates to carry lysine at position 627. This enables the virus to reproduce at 33 degrees C (ie in mammal lungs, as opposed to the 43 degrees in bird lungs).

Using tweets to predict the spread of illness in real time. Researchers had a database of 2 billion tweets to work from. But first they had to teach the program to distinguish health-related tweets from non-health ones ("this exam work is giving me a headache"). To do that, they hired humans via Mechanical Turk to label a training set of 11 million tweets - each tweet was looked at by 3 people, and if at least 2 of them said it was illness-related, it counted as a good example.
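The labeling scheme described above - three annotators per tweet, with agreement from at least two deciding the label - is a standard majority-vote setup. A minimal sketch (illustrative only; the function name, labels, and sample tweets are invented, not the researchers' actual pipeline):

```python
from collections import Counter

def majority_label(votes, threshold=2):
    """Return the most common label if at least `threshold` annotators agree, else None."""
    label, count = Counter(votes).most_common(1)[0]
    return label if count >= threshold else None

# Each tweet in the training set is shown to 3 Mechanical Turk workers.
annotations = {
    "woke up with a fever and chills": ["sick", "sick", "sick"],
    "this exam work is giving me a headache": ["sick", "not_sick", "not_sick"],
}

training_labels = {tweet: majority_label(votes) for tweet, votes in annotations.items()}
```

With three binary votes there is always a 2-of-3 majority, so every tweet gets a label; the threshold only matters if more labels or more annotators are used.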

Combining the tweet data with Fitbit data on a group of 6000 people in NY allowed researchers to work out who had a cold, who they got it from, and the likelihood that they would pass it to you.

People reading books on Kindle provide a huge new data set for authors and publishers, who can see which passages get highlighted and where a reader stops reading a book. Readers today may be uncomfortable that their habits are so closely monitored, but history suggests they quickly get used to it: Amazon's recommendation system was seen as creepy at first, but is now accepted.

Harrah's Casino spends $100 million a year administering its Total Rewards system, but gets its money's worth. If a member is losing heavily on the floor, a 'luck ambassador' will bring him a free drink, show tickets, or whatever his records show he values most. And if you visit on your birthday, on vacation, or even just the first weekend of every month, Harrah's will reward that pattern to encourage you to keep it up.

The data arms race is asymmetrical at present - the big companies have all the information, and the analytics to interpret it. But consumer apps are appearing that give you a handle on the consequences of your purchasing decisions, and QS records help you understand what factors are influencing them.

The problem with all dating sites: the most eligible prospects get an overwhelming number of requests, and so quickly tire of the effort of sorting through them. Conversely, the poor prospects send out lots of requests but get no replies, so they lose interest as well. It's the 'prettiest girl in the room' problem. In a bar you can see the girl with a dozen hopeful men around her; on a dating site you get no indication of who's getting all the attention.

All amateur poker players go wooden when they get a good hand - so busy trying to suppress their excitement that they can't act naturally.

The present system of screening all airport travellers is clumsy and ineffective, and in particular it doesn't make us any safer. A better system would single out the few high-risk people and leave the rest alone - but that needs Big Data, and we would have to give up more of our private information.

(NY Times)

In On What We Can Not Do, a short and pungent essay published a few years ago, the Italian philosopher Giorgio Agamben outlined two ways in which power operates today. There's the conventional type that seeks to limit our potential for self-development by restricting material resources and banning certain behaviors. But there's also a subtler, more insidious type, which limits not what we can do but what we can not do. What's at stake here is not so much our ability to do things but our capacity not to make use of that very ability.

While each of us can still choose not to be on Facebook, have a credit history or build a presence online, can we really afford not to do any of those things today? It was acceptable not to have a cellphone when most people didn't have them; today, when almost everybody does and when our phone habits can even be used to assess whether we qualify for a loan, such acts of refusal border on the impossible.

For Agamben, it's this double power 'to be and to not be, to do and to not do' that makes us human. This active necessity to choose (and err) contributes to the development of individual faculties that shape our subjectivity. The tragedy of modern man, then, is that 'he has become blind not to his capacities but to his incapacities, not to what he can do but to what he cannot, or can not, do.'

This blindness to the question of incapacities mars most popular books on recent advances in our ability to store, analyze and profit from vast amounts of data generated by our gadgets. (Our wherewithal not to call this phenomenon by the ugly, jargony name of Big Data seems itself to be under threat.) The two books under review, alas, are no exception.

In The Naked Future, Patrick Tucker, an editor at large for The Futurist magazine, surveys how this influx of readily available data will transform every domain of our existence, from improving our ability to predict earthquakes (thanks to the proliferation of sensors) to producing highly customized education courses that would tailor their content and teaching style, in real time, to the needs of individual students. His verdict: It's all for the better.

Since most of us lead rather structured, regular lives - work, home, weekend - even a handful of data points (our location, how often we call our friends) proves useful in predicting what we may be doing a day or a year from now. 'A flat tire on a Monday at 10 a.m. isn't actually random. . . . We just don't yet know how to model it,' Tucker writes.

Seeking to integrate data streams from multiple sources - our inboxes, our phones, our cars and, with its recent acquisition of a company that makes thermostats and smoke detectors, our bedrooms - a company like Google is well positioned not just to predict our future but also to detect just how much risk we take on every day, be it fire, a flat tire or a default on a loan. (Banks and insurance companies beware: You will be disrupted next!)

With so much predictive power, we may soon know the exact price of 'preferring not to,' as a modern-day Bartleby might put it. Would you skip the gym tonight if your smartphone told you this would (a) increase your risk of heart attack by 5 percent or (b) result in higher health insurance payments? Tucker doesn't appear too concerned. To his own prediction that there will come a day when his gadgets will send him a similar note, he can only complain that for this to work, we need data from other people, not just him - and that 'our outmoded ideas of privacy begin to get in the way of progress and better health.'

The predictive models Tucker celebrates are good at telling us what could happen, but they cannot tell us why. As Tucker himself acknowledges, we can learn that some people are more prone to having flat tires and, by analyzing heaps of data, we can even identify who they are - which might be enough to prevent an accident - but the exact reasons defy us. Such aversion to understanding causality has a political cost. To apply such logic to more consequential problems - health, education, crime - could bias us into thinking that our problems stem from our own poor choices. This is not very surprising, given that the self-tracking gadget in our hands can only nudge us to change our behavior, not reform society at large. But surely many of the problems that plague our health and educational systems stem from the failures of institutions, not just individuals.

In his new book, Social Physics, Alex Pentland, a prominent data scientist at M.I.T., shows as much uncritical enthusiasm for prediction as Tucker, while making a case that we need a new science - social physics - that can make sense of all the digital bread crumbs, from call records to credit card transactions, that we leave as we navigate our daily life. (That the idea of social physics was once promoted by the positivist Auguste Comte, one scholar who would have warmed to the idea of Big Data, goes unmentioned.)

What is social physics good for? It would allow us to detect and improve 'idea flow' - the way ideas and behaviors travel through social networks. For example, Pentland wants to arm employers with sophisticated gadgets that would allow them to monitor the communicative activities of their employees and coax them toward more productive behaviors so their cognitive activity isn't wasted on trifles. That this might lead to a new form of intellectual Taylorism, with managers optimizing the efficiency of the brainstorming session (rather than the time spent at the conveyor belt), seems of little concern to Pentland, who dryly remarks, 'What isn't measured can't be managed.' Employers would certainly love this, but why should employees acquiesce to ubiquitous surveillance? Pentland rarely pauses to discuss the political implications of his agenda, arguing that we must make our social systems more dynamic, automated and data-dependent, as if data, by itself, can settle all political conflicts once and for all.

Both books reveal - mostly through their flaws - that the Big Data debate needs grounding in philosophy. When Big Data allows us to automate decision-making, or at least contextualize every decision with a trove of data about its likely consequences, we need to grapple with the question of just how much we want to leave to chance and to those simple, low-tech, unautomated options of democratic contestation and deliberation.

As we gain the capacity to predict and even pre-empt crises, we risk eliminating the very kinds of experimental behaviors that have been conducive to social innovation. Occasionally, someone needs to break the law, engage in an act of civil disobedience or simply refuse to do something the rest of us find useful. The temptation of Big Data lies precisely in allowing us to identify and make such loopholes unavailable to deviants, who might actually be dissidents in disguise.

It may be that the first kind of power identified by Agamben is actually less pernicious, for, in barring us from doing certain things, it at least preserves, even nurtures, our capacity to resist. But as we lose our ability not to do - here Agamben is absolutely right - our capacity to resist goes away with it. Perhaps it's easier to resist the power that bars us from using our smartphones than the one that bars us from not using them. Big Data does not a free society make, at least not without basic political judgment.
