Thank you to The Inquiry, a radio show and podcast from the BBC World Service, for its insightful broadcast this week about WhatsApp and its influence on vigilantism. Natalie Dixon, co-founder of Affect Lab, was interviewed for a segment of the show to give perspective and context on WhatsApp groups. Natalie’s focus was Affect Lab’s research into neighbourhood WhatsApp groups in South Africa and the Netherlands, and how this technology creates a sense of neighbourly intimacy but also facilitates acts of alienation. Listen to the full podcast here. Read the full paper about WhatsApp groups here.
“strange bodies are produced through tactile encounters with other bodies: differences are not marked on the stranger’s body, but come to materialise in the relationship of touch between bodies…. it is the very acts and gestures whereby subjects differentiate between others that constitutes the permeability of both social and bodily space”
– Sara Ahmed, Strange Encounters (2000, p. 15)
Captured in a searingly real and beautiful photo essay, photographer Grey Hutton (vice.com) shows how memory, migration and mobiles are entangled. In the context of the migration crisis in Europe, mobile phones embody mobility on a massive scale – across oceans, not just cities. And with mobility comes memory, travelling as mobile background images.
Embedded in the captions are significant narratives about cultural integration.
“Hello Barbie” was released on 14 February at a toy fair in America – she’s Wi-Fi enabled and records kids’ conversations to develop authentic, real-time responses to them. While the tech press and others are dubbing her “eavesdropping Barbie” and “creepy”, she’s not the first doll to be internet-enabled. See Cayla, a talking doll that uses speech recognition and Google’s translation tools, and which was subsequently hacked. Besides the obvious questions around privacy and safety, what does this mean for the future of play?
I didn’t go looking for grief on Christmas Eve, but it found me anyway, and I have designers and programmers to thank for it. In this case, the designers and programmers are somewhere at Facebook.
I know they’re probably very proud of the work that went into the “Year in Review” app they designed and developed, and deservedly so — a lot of people have used it to share their highlights of 2014. I kept seeing them pop up in my feed, created by various friends, almost all of them with the default caption, “It’s been a great year! Thanks for being a part of it.” Which was, by itself, a little bit unsettling, but I didn’t begrudge my friends who’d had a good year. It was just a weird bit of copy to see, over and over, when I felt so differently.
Still, it was easy enough to avoid making my own Year in Review, and so I did. After all, I knew what kind of year I’d had. But then, the day before Christmas, I went to Facebook and there, in my timeline, was what looked like a post or an ad, exhorting me to create a Year in Review of my own, complete with a preview of what that might look like.
Clip art partiers danced around a picture of my middle daughter, Rebecca, who is dead. Who died this year on her sixth birthday, less than 10 months after we first discovered she had aggressive brain cancer.
Yes, my year looked like that. True enough. My year looked like the now-absent face of my Little Spark. It was still unkind to remind me so tactlessly, and without any consent on my part.
I know, of course, that this is not a deliberate assault. This inadvertent algorithmic cruelty is the result of code that works in the overwhelming majority of cases, reminding people of the awesomeness of their years, showing them a selfie at a party or whale spouts from sailing boats or the marina outside their vacation house.
But for those of us who lived through the death of loved ones, or spent extended time in the hospital, or were hit by divorce or foreclosure or job loss or any one of a hundred possible crises, we might not want another look at this past year.
To show me Rebecca’s face surrounded by partygoers and say “Here’s what your year looked like!” is jarring. It feels wrong, and coming from an actual person, it would be wrong. Coming from code, it’s just unfortunate. These are hard, hard problems. It isn’t easy to programmatically figure out if a picture has a ton of Likes because it’s hilarious, astounding, or heartbreaking.
Algorithms are essentially thoughtless. They model certain decision flows, but once you run them, no more thought occurs. To call a person “thoughtless” is usually considered a slight, or an outright insult; and yet, we unleash so many literally thoughtless processes on our users, on our lives, on ourselves.
Where the human aspect fell short, in this case, was in pushing the preview image into my Facebook timeline without first making sure I wanted to see it. I assume Facebook only showed the ad to users who hadn’t already created a Year in Review, in an attempt to drive more adoption. So the Year in Review ad kept coming up in my feed, rotating through different fun-and-fabulous backgrounds but always showing Rebecca, as if celebrating her death, until I finally clicked the drop-down arrow and said I didn’t want to see it any more. It’s nice that I can do that, but how many people don’t know about the “hide this” option? Way more than you think.
This whole situation illuminates one aspect of designing for crisis, or maybe a better term is empathetic design. In creating this Year in Review ad, there wasn’t enough thought given to cases like mine, or friends of Chloe, or really anyone who had a bad year. The ad’s design was built around the ideal user—the happy, upbeat, good-life user.
It didn’t take other use cases into account. It may not be possible to reliably predetect whether a person wants to see their year in review, but it’s not at all hard to ask politely—empathetically—if it’s something they want. That’s an easily solvable problem. Had the ad been designed with worst-case scenarios in mind, it probably would have done something like that.
To describe two simple fixes: First, don’t prefill a picture into the preview until you’re sure the user actually wants to see pictures from their year. And second, instead of pushing a preview image into the timeline, maybe ask people if they’d like to try a preview—just a simple yes or no. If they say no, ask if they want to be asked again later, or never again. And then, of course, honor their choices.
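The two fixes above amount to keeping a small piece of consent state: ask first, remember the answer, and never show a prefilled preview until the user has said yes. As a hedged illustration only (all names here are hypothetical, and this is in no way Facebook's actual code), a minimal Python sketch of that ask-first flow might look like:

```python
# Hypothetical sketch of a consent-gated "Year in Review" prompt.
# Every name below is illustrative, not a real Facebook API.

from dataclasses import dataclass
from enum import Enum


class PromptChoice(Enum):
    YES = "yes"            # show the preview now
    NOT_NOW = "not_now"    # ask again in a later session
    NEVER = "never"        # never ask again


@dataclass
class ReviewPromptState:
    opted_out_forever: bool = False
    asked_this_session: bool = False

    def should_ask(self) -> bool:
        # Ask politely, at most once per session, and never after a "never".
        return not self.opted_out_forever and not self.asked_this_session

    def record_choice(self, choice: PromptChoice) -> bool:
        """Honor the user's answer; return True only if a preview may be shown."""
        self.asked_this_session = True
        if choice is PromptChoice.NEVER:
            self.opted_out_forever = True
        return choice is PromptChoice.YES


state = ReviewPromptState()
assert state.should_ask()                     # first visit: OK to ask
shown = state.record_choice(PromptChoice.NOT_NOW)
assert not shown and not state.should_ask()   # don't nag again this session
```

The point of the sketch is only that the preview image is gated behind an explicit yes, and that “not now” and “never” are distinct, honored states – exactly the two-question flow described above.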
As a Web designer and developer myself, I decided to blog about all this on my personal Web site, figuring that my colleagues would read it and hopefully have some thoughts of their own. Against all expectations, it became an actual news story. Well before the story had gone viral, the product manager of Facebook’s Year in Review emailed me to say how sorry he and his team were for what had happened, and that they would take my observations on board for future projects. In turn, I apologized for dropping the Internet on his head for Christmas. My only intent in writing the post had been to share some thoughts with colleagues, not to make his or anyone’s life harder.
And to be clear, a failure to consider edge cases is not a problem unique to Facebook. Year in Review wasn’t an aberration or a rare instance. This happens all the time, all over the Web, in every imaginable context. Taking worst-case scenarios into account is something that Web design does poorly, and usually not at all. If this incident prompts even one Web designer out there to make edge cases a part of every project he or she takes on, it will have been worth it. I hope it prompts far more than that.
We have all sorts of ways to communicate with one another: text messages, emails, Gchats and, according to my sources, even phone calls. We live in a word-heavy world, and why not? A sentence, whether spoken or written, is a highly efficient way to transmit a lot of information with very little time and effort. But words aren’t necessarily the best way to express every idea. Logistically speaking, verbal and written languages have cultural barriers that are sometimes insurmountable. Emotionally speaking, sometimes words just don’t do justice to what we’re trying to convey.
Full-sensory correspondence is still a long way off. We’re just now beginning to explore how powerful virtual touch could be in connecting with each other. But there’s one sense that’s been notoriously missing from the landscape: smell. “When you think about how important the olfactive is in almost every type of communication, its absence in global communication is sort of astounding,” says David Edwards.
Edwards is the always-buzzing mind behind Le Laboratoire, the Paris innovation tank and research facility that brought us Wikipearls and Le Whaf. The group’s most recent invention, the oPhone, is aiming to make olfactory communication commonplace by transmitting odors much in the same way you send text messages.
It’s a basic idea. Humans have long bonded over smells, both good and bad (there’s nothing like a smelly subway car to force intimacy). It’s strange, then, that no one has been able to channel scents into a more digestible form of communication.
There’s one big problem when it comes to doing this, says Edwards: “Odor transmission to date is not smart,” he explains. “If I give you the odor of a pizza, I have a difficult time immediately after giving you the odor of the sea and then giving you the odor of a cactus.” Basically what Edwards is saying, and what we already know from letting trash sit in our apartments a day too long, is that odors linger. Which makes it hard to craft any sort of cohesive and decipherable olfactive narrative.
The oPhone solves this problem with its main innovation: the oChip. This little cartridge, about the size of a fingernail, contains olfactive information that can produce hundreds (and soon thousands, says Edwards) of odor signals. The idea is that these chips can be installed in the oPhone, and via a bluetooth-connected app called oTracks, scents can be sent to yourself or an oPhone-carrying friend with the push of a button.
Edwards and his small team have been prototyping the oPhone for the better part of a year. The most current version, unveiled at the WIRED UK conference, is a system of sorts that uses four cylindrical oPhones that can each be loaded with up to eight scent chips. This allows for what Edwards calls an “odor symphony,” or the ability to craft a multi-odiferous message with actual context. “These are pretty subtle odor signals that allow me to create sentences, paragraphs and essays, if you will, of odor messages,” he says.
The final product, due out later this year, will come with two oPhones, a choice that Edwards says is a compromise to ensure people can experience more than one smell simultaneously. “You can have these great coffees on one side and breads on the other side,” he explains. “There will be some oTracks that use two oPhones and some that use one.”
He’s quick to say that the initial consumer product is less about catering to a mobile, urban user and more about creating a sensory experience around food or media consumption. An immediate application will be a coffee experience, which allows oPhone holders to smell various coffee scents. Edwards is also working with Paris Vapors to integrate oPhone technology into media like books, movies and TV shows.
It might seem a little clunky, but most new technology is. More interesting is thinking about future applications, when the oPhone functions more like a cell phone. Edwards is plugging away at creating the universal oChip, a customizable version of the oChip that can be programmed with whatever smell you can think of. You can imagine this being applied in healthcare to stimulate memories and relieve stress. Or, more personally, someday while visiting your grandparents’ house, you could send your brother a text message embedded with smell emoticons that conjure up the cookies your grandmother used to make.
That inherent emotional connection to smell is what Edwards is looking to exploit. And in his opinion, it’s only a matter of time before it’s commonplace. “If Twitter had this enormous impact with very limited information content exchange, you can imagine a complete aroma equivalent of that,” he says. “It’s fascinating how powerful that could be.”
Watching people eat on trains? Okay. Hearing loud techno on trains? Okay. Overhearing someone’s mobile conversation? Please, anything but that.
On a recent train trip to Schiphol airport I was seated within earshot of an American traveller who’d left his laptop at his hotel in Amsterdam. I know this because he was using his mobile phone to discuss, in loud, breathless tones, the logistics of how it would be returned to him. He went through an astonishing amount of detail in the conversation: the arranged time for a courier to collect the laptop, how it would be labelled and packaged, and most importantly, what would happen on the off chance that no one was at reception when the courier arrived. I could tell he was panicked from the way his requests almost clung to the person on the other side, as if this call were his last and only link to his MacBook.
I’ve long since made my peace with train etiquette in Holland. For one, the Dutch are completely at ease wolfing down a three-course meal directly in front of you on a train. Once I watched a woman dismantle an entire club sandwich, removing the onion and then flinging it with sweet abandon into the open rubbish bin next to our conjoined seats. Some trains include “silent” carriages – ones in which no music or mobile calls are allowed. But most times you’ll find yourself in the “anything goes” carriage, which means you run the full gauntlet of techno, house or my favourite, Dutch hip hop. It helps to close your eyes when you’re on the train. The sociologist Erving Goffman might have called this an extreme version of “civil inattention.” But these are extreme times. And our new companion species isn’t furry and doesn’t bark. Mobiles are lively critters, going everywhere we do.
While music and food shenanigans are par for the course in commuting terms, it seems that we reserve a special tension for conversations we have to endure in shared space these days. The mobile phone has become a powerful actor in the erosion of the boundary line between our notions of private and public. This has been the subject of academic debate for over 10 years. What new meanings are created by having intimate conversations in public? What does this mean for our defence of space? How does it shape concepts of individualism and the collective? Turning to new age framings: what does this mean for our sense of presence and being mindful?
Getting back to the guy who forgot his laptop, I wondered what exactly bothered me about overhearing his conversation. I narrowed it down to three options, not mutually exclusive. 1.) I was trapped into hearing his conversation, thanks to my proximity and his booming voice. It stripped me of any choice, and I had no way of defending my personal space. 2.) The conversation was one-sided. In other instances, like when I overhear two people chatting in the seat behind me, it somehow feels more incidental and appropriate; it kind of melts into the background. But when someone is having a mobile conversation, it’s annoyingly lop-sided and somehow becomes intrusive.
It’s not a new thought that our bodies aren’t just solitary, insulated items that merrily trot along in life. We exist as porous bodies, constantly exchanging information with other bodies (human and non-human). Which leads me to my most plausible explanation of the three. Like Mr America, I also became anxious listening to this conversation; it affected me. I had planned to decompress a little on the trip to Schiphol, but instead I was now part of someone else’s drama. More than that, the conversation was achingly banal. Being in inescapable earshot of it reminded me of a part of life I’d much prefer not to think about – the dimension that has nothing to do with love, or art, or fantasy, or intrigue. It reminded me of the flat, quotidian side of life that is doggedly banal.
By Mark Kingwell, found in The Chronicle of Higher Education, via Jona.
We are no longer owners and workers, in short; we are, instead, voracious and mostly quite happy producers and consumers of images. Nowadays, the images are mostly of ourselves, circulated in an apparently endless frenzy of narcissistic exhibitionism and equally narcissistic voyeurism: my looking at your online images and personal details, consuming them, is somehow still about me. [Guy] Debord was prescient about the role that technology would play in this general social movement. “Just when the mass of commodities slides toward puerility, the puerile itself becomes a special commodity; this is epitomized by the gadget. … Reified man advertises the proof of his intimacy with the commodity. The fetishism of commodities reaches moments of fervent exaltation similar to the ecstasies of the convulsions and miracles of the old religious fetishism. The only use which remains here is the fundamental use of submission.”
It strikes me that this passage, with the possible exception of the last sentence, could have been plausibly recited by Steve Jobs at an Apple product unveiling. For Debord, the gadget, like the commodity more generally, is not a thing; it is a relation. As with all the technologies associated with the spectacle, it closes down human possibility under the guise of expanding it; it makes us less able to form real connections, to go off the grid of produced and consumed leisure time, and to find the drifting, endlessly recombining idler that might still lie within us.
I discovered the beautiful work of Noortje De Keijzer recently in Amsterdam. Noortje knitted herself a boyfriend for a master’s project at the Design Academy Eindhoven. I thought it was the saddest thing I’d ever seen. But I convinced myself it was about irony, or a clever parody of our desperate and messed up relationships in 2012. But no. Noortje told me she was actually just lonely. She said:
“I felt very lonely at times, and I’m sure everybody feels lonely from time to time. The strange part is that it seems like it’s an emotion nobody really talks about. As if it’s something to be ashamed of, a bad feeling that should always be avoided. By creating Arthur and Steve, I wanted to show the subject in a very light, humorous, positive way. I created this story about a girl so lonely that she decided to just knit a man who could accompany her”.
Apart from the beautiful commentary My Knitted Boyfriend offers on everything synthetic (even relationships) it also made me think of how touch and intimacy have been reinvented in ways we hardly ever reflect on. Haptic technology and gestural interfaces usher in new practices that are remodelling our sensual experiences. And while it might seem easy to snigger at the quirky knitted equivalent of a blow up doll, it’s no sadder than our current digital lives: cradling our beloved iPhone, pawing our iPad surface.
I keep coming back to Next Nature when I think about emotion and technology. Next Nature explores the boundary of where the born and made meet. So maybe it’s our next nature: to manifest our own solutions to loneliness, to knit the blues away. Soon we’ll access the memories (or entire consciousness) of our dream guy or girl.
I continue to be fascinated by the future of intimacy.
– Natalie Dixon