Horizons of Uncertainty by Scott Smith

Sign at CERN, Cessy, France.

Cross-published from Medium

Last week, the Pew Research Internet Project published the top-line findings of a survey on Americans’ attitudes about technology and the future. Titled “U.S. Views of Technology and the Future: Science in the next 50 years,” the study took a fairly broad-brush approach to gauging how hot or cold my compatriots feel about “the future” as it might be shaped by both real and imagined technologies. While most media coverage focused on the individual findings (with broad disparity in how the data was interpreted for headlines), two overarching themes jumped out at me that I think say a lot about Where We Are Now.

How you feel about the future as represented by scientific and technological change depends a great deal on which end of the change you’re on.

Taken in aggregate, respondents answered the headline question optimistically by a margin of almost two to one: “Over the long term, you think that technological changes will lead to a future where people’s lives are mostly better or to a future where people’s lives are mostly worse?” The question offered no specific prompts about particular technologies, but was* asked on the basis of how the future has been portrayed, both positively and negatively, in books and movies.

When the data is broken down by gender, income and education, the responses are a little more telling. College-educated men with a good income are far more likely to face the future enthusiastically than their female, less educated or less well-off counterparts. Men were more optimistic than women about technology’s impact on people’s lives by a margin of 16%, a significant gap.

This gap also plays out markedly in later questions about wearable or implantable technology that could provide information to users. 59% of women think these devices would have a negative impact on people’s lives, vs. 46% of men. Likewise, only 18% of women believe personal drones will be beneficial, compared with 27% of men. In both cases, women have greater misgivings about technologies that already exist in some form and that have been shown to have applications in surveillance, or have been demonstrated in ways that invade privacy and exert power.

Lab-grown meat is another area where the publicly released data shows a gender gap. 27% of men surveyed said they would try lab-grown meat, while only 14% of women said they would. Similar factors to those shaping women's attitudes toward wearables and UAVs may also be at play here, according to Dr. Debbie Chachra, Associate Professor of Materials Science at Olin College, who also works on issues of gender and technology. “The gender differences in meat may also relate to trust and control, as with drones and surveillance,” she wrote in an e-mail, “to eat lab-grown meat (or raw meat) is to trust that the systems that produced them are safe and well-regulated, and that the consumer will not be placed at risk. Women are more sensitive to these issues than men in other contexts too, and generally for good reason: they are (as a group) more likely to be worse off if there is less oversight.”

While age didn’t seem to be a significant factor in optimism vs. pessimism, education did: among college graduates, the margin viewing the future positively rather than negatively was 44%, compared with only 21% among those with a high school education or less. Again, says Chachra, “the visibility afforded to the better educated and the accompanying higher risk tolerance may explain some of the difference here.” Using the example of lab-grown meat, she wrote: “College graduates are likely to be more secure generally (and possibly to have more understanding of the process of making lab-grown meat) and therefore be more likely to take risks, whether real or perceived, with their food.”

Responses by income differed as well: people with incomes greater than $75,000 a year were more likely to be optimistic than those making less than $30,000. While this isn’t earth-shattering news, it again says something about risk tolerances, as well as about access to technology as a lever for success. The latter point also surfaces when the survey asks about the desirability of DNA manipulation to “produce smarter, healthier, or more athletic offspring,” in Pew’s words. Almost one-third of those below the $30,000 mark would be interested in this technology, while only 18% of the $50,000-plus cohort would be. In this instance, DNA manipulation is perhaps seen by those with lower incomes as a way to level the playing field of success.

The drivers for these perception gaps around gender, education and wealth have a lot to do with where technology sits vis-à-vis society and the economy at the moment. While in the recent past technology was viewed in American society as both a driver of growth and prosperity and something of a social leveler, much of the technology zeitgeist at the moment revolves around its role as a growing economic and social wedge, displacing jobs, driving significant wealth for a small group, and fueling gender conflict. These issues are not hidden behind a research paywall or part of an esoteric debate; they are being argued and analyzed out in the open as part of the mainstream economic and political debate. Thus, attitudes toward the futures that technology might bring are today as much political and social attitudes as positions for or against particular innovations.

While the far “future” is a somewhat comfortable idea, as it gets nearer, it becomes less so.

Without past datasets to compare with, it’s difficult to say firmly whether this is a dynamic that changes over time—whether the future is elastic, feeling closer or further away depending on momentary risk tolerances shaped in part by the perceived density and frequency of high-profile innovations. Nonetheless, Pew’s data suggests that, as with consumer products or political candidates, we tend to respond differently when we have evidence or experience of something than if it’s an abstraction in our mind—a hypothetical only.

In this case, we see a generalized optimism about the benefits the future may bring—recall that 59% of respondents were optimistic that the next 50 years would bring positive things to humanity—but more negative responses when queried about technological changes that already exist in some form today. Pew cited the following points of pushback among respondents:

  • 66% think it would be a change for the worse if prospective parents could alter the DNA of their children to produce smarter, healthier, or more athletic offspring.

  • 65% think it would be a change for the worse if lifelike robots become the primary caregivers for the elderly and people in poor health.

  • 63% think it would be a change for the worse if personal and commercial drones are given permission to fly through most U.S. airspace.

  • 53% of Americans think it would be a change for the worse if most people wear implants or other devices that constantly show them information about the world around them.

— Pew Research

Additionally, half of respondents said they wouldn’t want to ride in a self-driving car (which already exists), and almost three-quarters said they wouldn’t want a brain implant that would improve memory or cognitive capabilities, nor would they try lab-grown meat (also here now). Asked what they would like, respondents opted to play the long game again, favoring flying cars, time travel and extended longevity, according to the released data. More than one-third couldn’t even say what they would want, if given a choice.

The takeaway here seems to be that people judge possible futures more critically when they have even a little experience of those futures, or when some of their implications are already playing out around them. In terms of timescales, DNA manipulation and robotic assistants are much closer in time to the average person surveyed than hoverboards, jetpacks or ESP, all of which were asked about in the survey. It may be easier to say yes to the abstraction of a technology that sounds interesting but may never materialize; a change you have already begun to sense, or whose implications you can already see in the present world, may seem less appealing.

Projections, Not Predictions

Forecasting futures is a tricky game, even for professionals. While these opinions have been pitched by some headline writers as “predictions” made by Americans about the future, they are hardly that. They seem more like a Rorschach test, eliciting responses based more on deeper attitudes of hope and fear than on detached analysis. Given the technological and economic milieu we are currently in, where innovations are coming thick and fast and many markets and social structures are being impacted and even reshaped, unease with or confidence in the types of technologies the future may bring is tightly connected to the circumstances and context in which a particular respondent finds him or herself.

I believe that technology isn’t neutral, and that society often quickly embeds its values into technologies based on how they are used, and by whom. For wearables and drones, very different values are struggling to win out at the moment (invasive vs. assistive, for example), and prominent use cases, as well as who has the power to influence those use cases, play a strong role in shaping the mood of the public and of specific segments of society alike. My reading of these results, superficial as it may be, reinforces this notion, as well as the idea that we should listen not just to the wealthiest, loudest, or most pleased segments of society regarding particular advances, but to the views of all who might be impacted by them, at both ends of the transaction, so to speak.


*I have some issues with the methodology of the Pew survey itself, particularly regarding its mixture of actual, existing innovations with topics drawn from science fiction and pop culture. By asking specifically about tropes from sci-fi, in my view, the survey jumps tracks from potential scientific and technological development into fantasy territory.

While experts continue to debate whether a relationship exists between the future science and technology portrayed in media and the innovation undertaken in the past and today, asking survey respondents about the unknown future is generally problematic. Mixing (currently) fictional futures with questions about innovations for which early prototypes already exist seems methodologically muddy at best, and it potentially leads respondents to make judgments about near-term advances informed by their feelings about speculative concepts that carry not insignificant cultural baggage.

For your own reference, the main part of the study questionnaire can be found here.

 

Secret Systems by Scott Smith

Flights cross SE Asia in the early morning hours.

Flights cross SE Asia in the early morning hours.

Visibility of unseen systems has been a recurring theme in the talks and research of my fellow travelers recently. And for good reason. Black-boxing of critical systems that support daily life creates significant long-term vulnerabilities for individuals, organizations and states. Two weeks ago I added a small piece to this conversation [now also available on Gizmodo], spurred by the uncertainty surrounding the disappearance of MH370, and the way the search for it has revealed systems and interconnections about which many average people are unaware. While experts sometimes have the tools to proactively probe and reveal the shape of these hidden systems, it often takes a crisis or wild card event to cast light on them for a public to whom they are effectively invisible.

The timing of the piece was serendipitous. Shortly after publishing (quietly), I was asked last week to chair the "Secret Lives of Systems and Services" panel at the end of FutureEverything in Manchester in Laurie Penny's absence. Filling in was a privilege, considering the panel was formed by Ella Saitta, James Bridle and Adam Harvey. Their talks focused not just on reaching into technological layers to point to the existence and potential abuse of the "architectures of control" Anab Jain invoked in her keynote, but also on the criticality of understanding and using political and cultural levers to push back and contest these systems when abuses take place, and of making them visible even when there aren't abuses, in order to return power to the individual.

I'm hopeful that this ongoing focus on systems and infrastructure—and the assumptions and intentions embodied within them—will help steer a more fair and humane trajectory for the necessary elements of these systems, and I look forward to a great deal more discussion and action in this space.