Written by
Rachel M. Murray
Art direction by
Manoel do Amaral
“So the truth is, I never would have questioned this religion, I never would have looked deeply at this belief system because it gave me so much pleasure if it hadn’t been for the pictures.” (Daisey 2014) (Source)
I once had a friend whose twelve-year-old daughter designed movies on her tablet, informing everyone she’d be either a fashion designer or an app designer. The feminist in me applauds her interest in technology. Still, the friend in me is overwhelmed, drowning in a deluge of data and screens from dawn until sleep. I wonder how much a twelve-year-old should engage with technology, knowing it may overtake her life just as it has mine. I wonder about her experiences as a woman in tech — by the time she’s my age, will she be worn out by the culture, the constant drumbeats of speed, the rush to consume, the need to be always on and never not connected? Many of us are already living a life of screens — TVs, cellphones, and even GPS devices account for about 8.5 hours a day for adults in the U.S., according to research by the Council for Research Excellence (Stelter 2009). From the shadows on the wall in Plato’s allegory of the cave to Neil Postman’s warning that we are ‘amusing ourselves to death’ (Postman 2006) with entertainment, concerns over media, distraction, and attention linger in our public discourse.
What is the effect of constant exposure to information on our attention? Who benefits from a dramatic change in our attention, and when and why has this happened? One hypothesis is that companies have created an ecosystem to encourage continual consumption of information, and that they profit from our addiction to information consumption and from the resulting change in values and behaviors around attention and engagement. This ecosystem includes ‘consumer friendly’ products and services, predictive analytics, intelligent agents, artificial intelligence, sensor networks, the Internet of Things (IoT), personalized content, and platforms — all of which work together with notifications to continually engage us. Alvin Abraham has called this a ‘ubicomp system’ (a ubiquitous computing system). Designers contribute to this system by tacitly, and sometimes deceptively, encouraging people to accept the wholesale convenience of connectivity. We have long lived in what Herbert Simon called the ‘Attention Economy’ (according to the Berkeley Economic Review). That economy has been reinforced by a complex technical infrastructure and by tech solutionism that keeps the infrastructure growing.
This technological ecosystem is a feedback loop, or Ouroboros, a snake that eats its tail: content is delivered via well-designed devices that act as a lure. These devices create data fed back into the content distribution systems that use predictive analytics to shape the design and delivery of more content. The platform ecosystem of devices additionally encourages the design, development, and pricing of the supporting technology infrastructure of apps to keep it growing.
This is further compounded by much of the workforce working in sedentary ‘knowledge economy’ professions that prioritize engagement with data. As content and devices multiply, we will continue to be pushed towards more engagement and more consumption of goods — what design theorist Tony Fry calls “the ongoing creation of ever more things” at the cost of the planet. These messages keep us distracted from identifying and protesting against the “Weapons of Mass Distraction”:
Capitalist culture organizes people as buyers of commodities and services (and) …transform(s) information and knowledge into commodities…The corporate conglomerates of the culture industry have created a global public sphere that does not offer any scope for discussing the social and cultural consequences of the ‘free flow of information’ they organized. The fusion of trade, politics, and communication has brought about the sophisticated one-dimensional character of our symbolic environment, which is at least as menacing as the pollution of the natural environment (van Toorn as cited in Dinot 2009:180)
Are we aware this is happening? Is this desirable? What can and should be done to address it if it isn’t?
These are questions for designers, too. If this system diverts attention and creates engagement to profit those in power, how can designers prevent that engagement when our work is centered on creating engagement? We know that ‘addictive design’ is hugely problematic — Actionable Design Magazine notes that ‘design[ing] for addiction and excessive use raises concerns about the impact on mental health and well-being.’ Addictive design is inevitable when we choose to use Deceptive Patterns (also known as Dark Patterns) in our design work.
Yet designers should consciously challenge the idea that we design and treat attention in the UI layer alone. We need to look at the larger ecosystems that shape products and services — the data, features, and distribution systems far beyond a user interface, even if they’re not on our immediate product roadmaps. Here we are going to explore how a larger ecosystem around attention works, using three product case studies. Apple’s Watch, Humanyze’s workforce optimization software, and SeeClickFix’s civic technology software all manage attention differently. For the Watch, the individual is a consumer, centered on ‘buy’ as the verb; for Humanyze, the individual is an employee, centered on ‘perform’; and for SeeClickFix, the individual is a citizen, centered on ‘report.’ Wendy Brown and other thinkers cited in this essay will illuminate how attention ultimately shapes our actions (labor), our values (politics), and the way these technologies are created (design). In this last arena, we as designers must also be critical. As designers, we can become distracted by the lure of incremental improvement — arguing, say, that Android is better because it is an open system. Instead, we must adopt a more realist perspective to understand the omnipresent nature of this Ouroboros technological ecosystem rather than be distracted by shiny lures, ultimately allowing us to challenge the system itself.
The challenge of design isn’t simply to design better for consumers, employees, and citizens — but to acknowledge this complex system, how it shapes attention, and the limitations of the idealist perspective for better design. These companies purposely design engagement for consumers, and the same techniques carry over to citizens and employees — the Quantified Self moves into the Quantified Employee and the Quantified Citizen, and a blending of these roles continues. There is a need for balance in our Attention Economy: not to seek a return to a mythical, nostalgic era before technology, but rather to understand our relationship to these technologies and this blending, case by case.
The individual as a consumer can be seen in the case study of the Apple Watch. Behind the happy ‘bing’ sound of a notification, the Apple Watch is a celebration of action — active actions (a user consciously opening an app to engage in ‘info snacking,’ petite bites of data) or passive actions (notifications pushed from applications). Because the screen is small, it is limited in what it can display. But brilliantly, Apple has created the perfect plate: the consumption of notifications is facilitated by their simple, practical design, which encourages repeated information-checking behavior and continual information grazing.
Image of Apple Watch courtesy Apple Web site
The Watch can be the conductor of a symphony of instruments — bells, tweets, alarms, on-screen boxes, a cacophony demanding your attention. Pulling out a phone at a dinner table is considered rude, but the ability to interact discreetly with your Watch shows that Apple recognizes that rudeness and plays with the social convention, betting that users will graze nonetheless. Alyssa Bereznak describes a change in her behavior that arose because of notifications and the need to respond:
When I first got my Apple Watch last month, that’s what I was most looking forward to: a tool that would keep me connected yet help me break from the magnetic pull of my cellphone — that thing I kept glancing at whenever there was a pause in my life, whether I was at an intimate dinner or on a productive conference call. I wanted to be less distracted, less obsessed with notifications. I wanted a gadget to save me from gadgets. Soon after I snapped the Watch (steel case, black classic buckle) to my wrist, however, I felt exactly the opposite effect. The notifications poured in, and with them, a new feeling of organization and efficiency. But with that productivity came a new sense of conflict between digital life and real life. I was becoming a more adept person, but also a more horrible one. I credit this change to the Watch’s clever-yet-distracting notification system.
Every time something you’ve deemed worthy of interrupting your life occurs, you feel a subtle, human-like tap on your wrist. Being summoned with these little nudges throughout the day is the most intimate experience I’ve had with a computer. The buzz of a phone, though attention grabbing, is easy to physically ignore. I can silently acknowledge that I got a text and make a note to check my messages later without flinching, or breaking eye contact with the person I’m talking to. But my reflexive reaction to a robot flicking my arm is almost always to look down at it. And look down, I did.
The Watch plays into the insecurity created by observing and comparing the self to others on social networks — the ‘fear of missing out’ (FOMO) on a life others are assumed to be experiencing and documenting on social media. Tapping into a stream of notifications, one can feel they are participating by consuming data. The Watch symbolizes a worldview that celebrates hyper-connectivity between person, data, objects that create the data, and the world. For Apple, constant connectivity and engagement with data are critical to a productive life, and the Watch facilitates this always-on, always-connected hyper-connectivity. Like a guard dog watching over a house, the Watch silently waits to notify you of content to consume.
The role on offer is Master and Commander of the Data Seas, sailing the waves of data from the Internet of Things (IoT). The worldview is a celebration of the data deluge and of a user’s supposed mastery over data. The Watch speaks a narrative of control as one uses it to curate a digital lifestyle aggregation of online consumption delivered by an endless array of potential new apps. Apple conveys that there is always another app to download, another call, email, message, video, reel… and more to consume.
Creating infrastructural hooks like APIs (application programming interfaces) increases demand for applications with notifications. Services like If This Then That promise to connect devices via the Web and the IoT and create more possible notifications to pay attention to. The Watch and the IoT then create expectations about time, content, and the objects that deliver them. We now ask why my x (product, service, experience) doesn’t connect with my y (input device, product, service, experience) as we become aware these connections are possible. Integrating with other technology companies benefits multiple masters and makes the ‘business case’ for connecting more technology. Humanyze and SeeClickFix also speak of integration with existing systems, “whether it’s sales KPIs from Salesforce, call data from Communicator, or email and chat data,” so that engagement can continue uninterrupted.
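The mechanics of these hooks are simple, which is precisely the point. A minimal sketch of the ‘if this then that’ pattern (all service names and events here are hypothetical illustrations, not IFTTT’s actual API) shows how cheaply new claims on our attention are manufactured once services expose integration hooks:

```python
# A toy "if this then that" rule engine: each rule pairs a trigger
# predicate with an action, and every newly connected service
# multiplies the notifications competing for attention.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    trigger: Callable[[dict], bool]   # inspects an incoming event
    action: Callable[[dict], str]     # produces a notification message

def run_rules(rules: list[Rule], event: dict) -> list[str]:
    """Return the notifications fired by a single incoming event."""
    return [r.action(event) for r in rules if r.trigger(event)]

# Two hypothetical integrations: a smart doorbell and a fitness tracker.
rules = [
    Rule("doorbell",
         lambda e: e.get("source") == "doorbell",
         lambda e: "Someone is at the door"),
    Rule("steps",
         lambda e: e.get("steps", 0) >= 10_000,
         lambda e: f"Goal reached: {e['steps']} steps"),
]

print(run_rules(rules, {"source": "doorbell"}))
# Adding a third service means a third rule — and a third stream of taps
# on the wrist, with no upper bound built into the pattern itself.
```

The design choice worth noticing is that nothing in this pattern limits volume: each integration is individually reasonable, and the aggregate demand on attention is nobody’s responsibility.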
So, why do we keep on engaging? Biochemistry is a complicated milieu, but the relationship between the neurotransmitter dopamine and technologically motivated behavior is well understood. According to Psychology Today, dopamine “helps control the brain’s reward and pleasure centers […] emotional responses, enabling us not only to see rewards but to take action to move toward them.” Dopamine, as Susan Weinschenk notes, “makes us curious about ideas and fuels our searching for information.” We get a spike in dopamine after performing information-seeking behavior like checking social media accounts for new activity. Repeated dopamine-seeking behavior highlights the downside of seemingly innocent actions like notification consumption. We might search for one data point, only to fall down a well of multiple searches that cause a mild dissociative hyper-focus in which we lose sense of how much time we’ve spent.
That information-seeking behavior cycle can be triggered even when we receive a notification but are not engaged with technology. Knowing we have a notification starts the cycle of dopamine-seeking behavior again.
Information-seeking behavior starts to appear in everyday life without our awareness. We bite at the apple in the Apple logo but return for another bite because we are never satiated, always needing the next bite/app/content and dopamine hit.
This is more significant than one device. It is a love affair with ubiquitous personal computing devices and the digital lifestyle aggregation that lives on them. The Watch offers a way to manage this life and maintain control over time, information, and our attention through notifications. We attempt to control our data via notifications — as if to say, ‘Acceleration is here; time can’t be controlled, but here is a notification about it.’
The Watch positions users as being in control of their data when, in reality, ownership is less clear. The customer chooses that worldview when purchasing Apple’s brand and technical apparatus (Apple ID as a billing system and iTunes as a content distribution platform), along with the political and economic tenets of that worldview. One may argue they don’t buy into the Cupertino worldview — they’re not an ‘Apple Person.’ Still, one cannot use any Apple product without the accompanying infrastructure. As Barclay Sloan describes it, you’re in a walled garden of an operating system, inside the “walls of dependency that Apple has intelligently built around those living inside this Apple ecosystem to keep you inside — products and services not being compatible outside the ecosystem.”
From Dick Tracy’s watch to our smart objects, technology companies communicate a message that a data deluge is coming and that data will be ubiquitous and unavoidable. In an echo of Madame de Pompadour’s ‘après nous, le déluge,’ Apple and other technology companies assert there is no end to the deluge — we can only use objects to consume it rather than build dams to stop it. This data also bleeds into the infrastructures of our cities. As we move towards the surveillance opportunities within smart cities, continually connected smart devices and consumerist narratives like the ‘Quantified Self’ and the Internet of Things feed the Ouroboros technological ecosystem, encouraging us to stay engaged rather than analyze the engagement itself.
Whether the Watch concerns action at a personal level is not the question — though there is little to tie the message that we control our data to actual political action as citizens. Instead, the question is whether we are controlling our constant information-seeking behavior or whether it is controlling us. This example differs from the others in that the individual becomes both self-monitor and master: there is little need for a group to monitor us when we install a Panopticon of shiny steel on our wrists to monitor ourselves.
What is the state of attention and engagement for the individual as an employee? In the U.S., workplace surveillance allows for managing and optimizing employees within the company. Humanyze’s “people analytics and workforce optimization” software is one such surveillance apparatus, a platform of interconnected parts that monitors and analyzes sociometric data (sociometrics being the science of social relationships).
Ben Waber, CEO of Humanyze, wearing one of the company's sensors. (Robin Lubbock/WBUR)
The badges have a microphone and sensors to track data such as “real-time voice processing to uncover variables like tone of voice, volume […] and how frequently you contribute to meetings.” This data is analyzed and measured against productivity and other key performance indicators. Most workplaces in the U.S. already have workplace surveillance across computers, Wi-Fi, and employee access badges. Humanyze specifically monitors voice data and, among other signals, can track elevated heart rate to “measure your body language.” For the entire Humanyze system to work, employees must continually wear the badges as, according to the company, “continuous wear is a commitment by a company to continuous improvement […] it is at its most powerful when adopted company-wide.”
Action for employees is wearing badges; for employers, action is paying attention to the data tracked from employees. Humanyze positions managers as surveillance masters who use attention to yield benefits in the name of productivity. Humanyze encourages managers to “proactively understand disruptions to their teams […] and be warned of potential project failures based on communication gaps” (Humanyze 2016), such as flagging employees who do not speak during meetings even when they feel they have nothing valuable to contribute. This system then rewards ‘false participation’ and can create a ‘gap’ where one may not exist. Humanyze also notes employees can benefit from monitoring ‘goals and benchmarks’ and self-tracking of attention, similar to Apple’s emphasis on self-tracking.
Tracking progress is a form of self-mastery as well as competition with others (“See how you compare to your team or other roles in your organization” — Humanyze 2016), as well as the creation of free content (Berardi’s Semiocapital) for the company to monitor. Humanyze reduces human behavior to data points — in contrast, human-centered design uses mixed qualitative and quantitative methods to understand human behavior and capture a full range of human experiences. When Wodiczko spoke of Interrogative Design, he spoke of designers working with people worldwide. Still, his words could also speak to the dangers of reducing people to data points without the context needed to explain subtleties of human behavior that data alone can’t capture.
When Humanyze speaks of actions (“Get a look into what everyday actions contribute the most to your creativity, collaboration, and productivity” — Humanyze 2016), users become analysts and self-monitors, similar to the Watch and SeeClickFix. For Apple, optimizing the self is the critical action, and for SeeClickFix, ultimately, citizens optimize the city; with Humanyze, we have an optimization of self and others that benefits both the company buying the service and Humanyze. Berardi would view optimization as a natural goal of “the capture of work inside the network […] the dissemination of the labor process into a multitude of productive islands formally autonomous, but actually coordinated and ultimately dependent” (Berardi 2009:88). When he speaks of “human terminals […] all connected to the universal machine of elaboration and communication” (Berardi 2009:76), those ‘terminals’ are part of a system which uses attention and monitoring via surveillance to optimize.
CEO Ben Waber sees Humanyze’s ‘smart badge’ and platform as desirable, and sees it as inevitable that this technology becomes ‘commonplace’ — “within about four years, every single company ID badge is going to have these sensors, whether it’s ours or somebody else’s, right?” (Humanyze 2016). Humanyze profits from the change in how we value attention and uses surveillance and the lure of optimization to make this shift seem natural.
How do attention and engagement fare for citizens? SeeClickFix is an app that aims to “connect citizens with local government, fixing problems and building trust in your community” (SeeClickFix 2016) for non-emergency service requests about broken public infrastructure. The company construes action as citizens using their attention and engagement to become surveillance agents who report, track, and monitor that infrastructure. Many similar apps exist today, such as NextDoor and CitizenApp, which take different approaches but share similar goals and, consequently, similar issues.
SeeClickFix (source)
Surveillance for SeeClickFix is turned outward towards the world to monitor infrastructure. It is also turned inward towards public sector employees, who are ‘monitored’ by the system. Government officials are notified via a private, Web-based SeeClickNow request management system (for which they pay SeeClickFix). They can embed a SeeClickFix map ‘widget’ on municipal Web sites for citizens to pay attention to. Whereas the Apple Watch’s notifications are managed by individuals who opt into use, SeeClickFix sends notifications to civil servants who do not opt in — only their employer does. The employer can then use the completion of tickets as another metric to measure employee performance. Not only are employees monitored for completing their primary job, but they can also be monitored for completing tickets in the SeeClickFix system. The citizen’s attention then becomes another form of employee monitoring. Does this become a new kind of citizenship, ad hoc participation without a governmental body’s organization of the commons? Contrast this with more extensive, sustained efforts of community engagement via community boards, participatory budgeting, and participatory design in combination:
Public engagement initiatives such as San Francisco’s Urban Forest Map, Chicago’s GO TO 2040, New York’s PlaNYC 2030, Los Angeles’ SurveyLA, and Washington D.C.’s Capital Bikeshare Survey have been successful in bringing the public into a participatory role in data collection, city planning, and vision setting activities. Such projects are examples of undertakings that would most likely be too big and cost-prohibitive for a city to carry out and continue on its own. Yet by asking citizens to participate, not only does it keep costs low but empowers citizens, brings together new ideas, and increases mass collaboration (Bradford 2010).
This new lifecycle of production, monitoring, and analysis of data also shifts responsibility onto citizens to provide services as they become monitoring agents in Smart Cities. SeeClickFix’s site says, “A better neighborhood starts with you. Let’s get to work.” (SeeClickFix 2016). When Wark speaks of a language of commanding, he speaks of how recycling becomes an extension of the consumption cycle with OBEY!/BUY!/RECYCLE! (Wark 2013:3). The next part of this cycle will be for citizens to REPORT — or else fail to contribute to an invisible metric of civic involvement. We also find an ideology that promotes data and finance over people, similar to the narrative of Smart Cities prioritizing data over qualitative metrics of happiness. Will what has happened with our consumer data, marketing, and privacy be carried over to our governments? Do analytics — and citizen attention — become a new kind of monitoring of workers, now measured by SeeClickFix tickets answered? Do some infrastructures get fixed sooner because of vocal citizens, where ‘this pothole got the most likes,’ and what are the repercussions for civic engagement in areas whose potholes are not watched by SeeClickFix?
Are policies and processes simplified in this solution, or does it add a bandage layer to existing technologies rather than a systemic solution? SeeClickFix has made citizen reporting and ticket management a part of everyday life — fixing a pothole “or something like it — happen[ing] all the time in the disintegrating spectacle [falls] short of any project of transforming it” (Wark 2013:8). By focusing on simplicity rather than complexity, SeeClickFix misses a chance to address complex systemic issues. Fry notes that “the actual organizational means to engage problems of defuturing with some chance of success must come from a broader and more informed understanding of causality and a sense of the relational complexity. Such means need to have the ability to undermine bio-political and technological inscribed networks of power” (Fry 2011:32). This oversimplification of the system then shapes expectations of how quickly problems should be solved — on-demand problem solving and the prioritization of speed assume one can ‘SeeClickFix’: see something, click on it, fix it.
This is part of a larger narrative of neoliberal technosolutionism, which views new technology as a solution to political and social issues by privatizing public services. Designing solutions for the public sector is complicated. Yet the civic technology space continues to grow as more private sector companies enter to provide services. Free market forces have worked to erode the public safety net of services traditionally offered by the government, a “dramatic worsening of social protections, determined by thirty years of deregulation and the elimination of public structures of assistance” (Berardi 2009:80). This is what Wendy Brown identifies as a “financialization of everything […] to enable a radical reduction in welfare state provisions and protections for the vulnerable; privatized and outsourced public goods [so that] democracy can be undone, hollowed out from within, not only overthrown or stymied by anti-democrats” (Brown 2015:17).
Those free market forces that advocate eliminating public services often seek entry into the public sector by using data and analytics to ‘optimize’ and ‘improve’ the city, often with the goal of a Smart City that gathers data to be monetized and used for surveillance. SeeClickFix views the government as its competition: inefficient, antiquated, and in need of fixing.
“For me this whole website started because I was trying to report graffiti on a neighbor’s building,” Berkowitz said. “It wasn’t attractive graffiti, and it was in a place that was not a public space.” He reported the graffiti to the New Haven government, but he said nothing happened. “I got the idea that my neighbors were reporting similar things, but there was no accountability and no collaborative discussion,” said Berkowitz (Harless 2013).
Berkowitz notes, “With many of the things we want government to help us with, it really makes sense to try to do it on our own […] at first, we thought of calling it Little Brother, like ‘Little Brother is Watching,’ but then we realized we needed to be a bit more kind to government” (Business Innovation Factory 2016). For SeeClickFix, technology expresses the ideology and power to fix a broken government and to profit from the destruction. ‘On-demand government’ is created and materialized, and this idea becomes “not merely a concept but a social reality” (Geuss 2008:46) that can be yours for a $10,000 license and a monthly subscription fee. Municipalities pay SeeClickFix and have citizens become field agents under an indentured Servitude As a Service business model — and, as Berardi notes, everyone with a cell phone becomes a perpetual worker: “labor is the cellular activity where the network activates an endless recombination. Cellular phones are the instruments making this recombination possible.
Every info-worker can elaborate a specific semiotic segment that must meet and match innumerable other semiotic fragments to compose the frame of a combinatory entity that is info-commodity Semiocapital” (Berardi 2009:89). SeeClickFix effectively outsources the maintenance of governance to us via our attention and engagement, while profiting from our work for them.
So que faire — what is to be done to protect our attention? How can designers save agency, build awareness of the issues, and create resistance? This is our work, both for those who design and those who consume: to make together a realist response of resistance, a “rupture that declares the situation and creates practical possibilities in that declaration” (Badiou 2012:8). Will we still be ‘patzers’ in Larry Holmann’s concept of realist politics (Holmann 2012:295), simply designing more human-centered ethical Band-aids? Perhaps. But not to attempt a response is not to think; if, as Hannah Arendt asserted, the only reliable source of light in dark times is found in the activity of thinking (Berkowitz 2010:5), then we must think, because thinking is the start of a realist political response about attention and any arena where our agency is threatened, the only reliable “safety net against the increasingly totalitarian or even bureaucratic temptations to evil that threaten the modern world” (Berkowitz 2010:5).
We can take action by becoming conscious, holding people accountable, taking responsibility, and creating alternatives. We must hold multinational corporations responsible for the products they create and for byproducts like data. A company may want to use tools like Humanyze to improve — the answer must be to advocate for consent, transparency, security, and privacy before such tools are used.
We can support Trebor Scholz and others in the Platform Cooperativism movement as they scale up their work. There have been Open Source mobile technologies, but none have gained traction, primarily because of collusion between mobile companies and telecommunications carriers — which is why alternative systems like Platform Cooperativism are vital. We can involve citizens in monitoring broken infrastructure and encourage meaningful engagement and dialogue between citizens and public officials. ‘Working with, not for’ is the battle cry of Cuán McCann, whose analysis of the 5 Modes of Civic Engagement in Civic Tech (McCann 2015) points to a sophisticated understanding of where civic technology can increase civic engagement, echoing Ezio Manzini’s emphasis on co-design as a dialogic conversation in which actors are “willing and able to listen to each other, to change their minds, and to converge towards a common view; in this way, some practical outcomes can be collaborative[ly] obtained” (Manzini 2016:58).
Governments can use tools without contributing to the privatization of the commons. One such tool is Open311, an open-source technology that gives back to other public sector agencies without payments to a private sector company. We can also learn to design in a way that puts people before feature sets and technological capacities.
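Part of Open311’s appeal is how little it asks of a city: its GeoReport convention is a plain REST interface any agency can serve or consume. As a hedged sketch (the base URL, service code, and response payload below are invented for illustration, though the field names follow the published GeoReport v2 shape), a civic app might build and parse a service-request query like this:

```python
# Sketch of an Open311 GeoReport v2 client: build a query URL and
# reduce a JSON response to human-readable lines. The endpoint and
# payload here are illustrative, not a real city's data.
import json
from urllib.parse import urlencode

def requests_url(base: str, service_code: str, status: str = "open") -> str:
    """Build a GeoReport v2 service-request query URL."""
    return f"{base}/requests.json?" + urlencode(
        {"service_code": service_code, "status": status})

def summarize(payload: str) -> list[str]:
    """Reduce an Open311 response to 'status: description' lines."""
    return [f"{r['status']}: {r['description']}" for r in json.loads(payload)]

# Hypothetical endpoint and a fabricated sample response.
url = requests_url("https://city.example.gov/open311/v2", "POTHOLE")
sample = json.dumps([
    {"service_request_id": "638344", "status": "open",
     "description": "Pothole on Elm St"},
])
print(url)
print(summarize(sample))
```

Because the interface is an open specification rather than a proprietary dashboard, any agency, researcher, or neighboring city can consume the same data without a license fee — the design choice that distinguishes it from the private platforms discussed above.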
We can also consciously design to protect our attention and rethink the idea of engagement. While we want products that allow us to complete tasks and experience delight, we can explore what Jean-Marc Buchert describes as a more “sober, uncluttered, and optimal experience to help users dive into their flow” — choosing to remove any extra content and UI elements that may serve to distract users. The most prominent example might be the Light Phone, which dramatically pares down components of the experience by substantially simplifying the user interface altogether.
We must question why and how products change our behavior. It is up to me to teach younger designers like twelve-year-old Maddie that she has the power to step away from the screens, too. Technologies can be redesigned, but we can also rethink our use and respect each other’s attention, time, and agency. As consumers, we can question our relationship with these companies and bring conscious contemplation to our use. Still, it also lies with designers to take responsibility for our role, heed the call of Victor Papanek and others, and take responsibility for how we design attention and engagement. Ironically, anti-pattern designs can be a great wake-up call, showing that citizens and designers can unite in a response that respects consciousness and presence, designing for a world that values mindfulness. As Carl Jung noted, “One does not become enlightened by imagining figures of light, but by making the darkness conscious” (Jung 2014:265).