A gender moment in mobile advertising, taken from Wired magazine in 2003.
I didn’t go looking for grief on Christmas Eve, but it found me anyway, and I have designers and programmers to thank for it. In this case, the designers and programmers are somewhere at Facebook.
I know they’re probably very proud of the work that went into the “Year in Review” app they designed and developed, and deservedly so — a lot of people have used it to share their highlights of 2014. I kept seeing them pop up in my feed, created by various friends, almost all of them with the default caption, “It’s been a great year! Thanks for being a part of it.” Which was, by itself, a little bit unsettling, but I didn’t begrudge my friends who’d had a good year. It was just a weird bit of copy to see, over and over, when I felt so differently.
Still, it was easy enough to avoid making my own Year in Review, and so I did. After all, I knew what kind of year I’d had. But then, the day before Christmas, I went to Facebook and there, in my timeline, was what looked like a post or an ad, exhorting me to create a Year in Review of my own, complete with a preview of what that might look like.
Clip art partiers danced around a picture of my middle daughter, Rebecca, who is dead. Who died this year on her sixth birthday, less than 10 months after we first discovered she had aggressive brain cancer.
Yes, my year looked like that. True enough. My year looked like the now-absent face of my Little Spark. It was still unkind to remind me so tactlessly, and without any consent on my part.
I know, of course, that this is not a deliberate assault. This inadvertent algorithmic cruelty is the result of code that works in the overwhelming majority of cases, reminding people of the awesomeness of their years, showing them a selfie at a party or whale spouts from sailing boats or the marina outside their vacation house.
But those of us who lived through the death of loved ones, or spent extended time in the hospital, or were hit by divorce or foreclosure or job loss or any one of a hundred possible crises, might not want another look at this past year.
To show me Rebecca’s face surrounded by partygoers and say “Here’s what your year looked like!” is jarring. It feels wrong, and coming from an actual person, it would be wrong. Coming from code, it’s just unfortunate. These are hard, hard problems. It isn’t easy to programmatically figure out if a picture has a ton of Likes because it’s hilarious, astounding, or heartbreaking.
Algorithms are essentially thoughtless. They model certain decision flows, but once you run them, no more thought occurs. To call a person “thoughtless” is usually considered a slight, or an outright insult; and yet, we unleash so many literally thoughtless processes on our users, on our lives, on ourselves.
Where the human aspect fell short, in this case, was in pushing the preview image into my Facebook timeline without first making sure I wanted to see it. I assume Facebook only showed the ad to users who hadn’t already created a Year in Review, in an attempt to drive more adoption. So the Year in Review ad kept coming up in my feed, rotating through different fun-and-fabulous backgrounds but always showing Rebecca, as if celebrating her death, until I finally clicked the drop-down arrow and said I didn’t want to see it any more. It’s nice that I can do that, but how many people don’t know about the “hide this” option? Way more than you think.
This whole situation illuminates one aspect of designing for crisis, or maybe a better term is empathetic design. In creating this Year in Review ad, there wasn’t enough thought given to cases like mine, or friends of Chloe, or really anyone who had a bad year. The ad’s design was built around the ideal user—the happy, upbeat, good-life user.
It didn’t take other use cases into account. It may not be possible to reliably predetect whether a person wants to see their year in review, but it’s not at all hard to ask politely—empathetically—if it’s something they want. That’s an easily solvable problem. Had the ad been designed with worst-case scenarios in mind, it probably would have done something like that.
To describe two simple fixes: First, don’t prefill a picture into the preview until you’re sure the user actually wants to see pictures from their year. And second, instead of pushing a preview image into the timeline, maybe ask people if they’d like to try a preview—just a simple yes or no. If they say no, ask if they want to be asked again later, or never again. And then, of course, honor their choices.
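The second fix is simple enough to sketch. What follows is a minimal illustration in Python, not anything Facebook actually built — the function, state names, and callbacks are all hypothetical, invented here just to show the shape of a consent-first flow: ask before showing anything, remember the answer, and never pre-fill a photo into the prompt itself.

```python
from enum import Enum

class PreviewChoice(Enum):
    YES = "yes"          # user wants to see a preview
    NOT_NOW = "not_now"  # fine to ask again another time
    NEVER = "never"      # never ask again

def offer_year_in_review(user_id, ask, choices, build_preview):
    """Consent-first flow: nothing drawn from the user's photos is
    shown until they have explicitly said yes."""
    if choices.get(user_id) == PreviewChoice.NEVER:
        return None                    # honor an earlier "never"
    choice = ask(user_id)              # a plain yes/no prompt, no imagery
    choices[user_id] = choice          # remember the answer for next time
    if choice == PreviewChoice.YES:
        return build_preview(user_id)  # photos are touched only after consent
    return None
```

The essential point the sketch encodes is ordering: the question comes first, the picture second, and a "never" answer short-circuits everything thereafter.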
As a Web designer and developer myself, I decided to blog about all this on my personal Web site, figuring that my colleagues would read it and hopefully have some thoughts of their own. Against all expectations, it became an actual news story. Well before the story had gone viral, the product manager of Facebook’s Year in Review emailed me to say how sorry he and his team were for what had happened, and that they would take my observations on board for future projects. In turn, I apologized for dropping the Internet on his head for Christmas. My only intent in writing the post had been to share some thoughts with colleagues, not to make his or anyone’s life harder.
And to be clear, a failure to consider edge cases is not a problem unique to Facebook. Year in Review wasn’t an aberration or a rare instance. This happens all the time, all over the Web, in every imaginable context. Taking worst-case scenarios into account is something that Web design does poorly, and usually not at all. If this incident prompts even one Web designer out there to make edge cases a part of every project he or she takes on, it will have been worth it. I hope that it prompts far more than that.
In a double loop-de-loop of weirdness, you heard it right: a Ugandan pop star, Desire, has to contend with naked pictures of herself floating around the interweb machine (thanks to her ex) and the State Minister for Ethics & Integrity in Uganda wants to arrest her. Yes, Uganda has a State Minister for Ethics & Integrity (Simon Lokodo) in case you thought that was a joke – this stuff’s for realz mafreendz!
A sweet-looking young boy sits in the park unpacking a picnic basket and waiting for someone to arrive. Along comes a burly older man, fresh from his outdoor gym routine, who stops in front of the boy and bends down on one knee. For a moment the man checks his phone, and then asks for the boy by name. Then, in an awkward and surprisingly emotional performance, the man breaks up with the boy. Well, to be more accurate, the man breaks up with the boy on behalf of his girlfriend.
Scene 1 of actress/artist Miranda July’s quirky new film project, Somebody. Part social experiment, part commentary, the tongue-in-cheek tagline says it all: “When You Can’t be There, Somebody Can”. Think of it like the high-tech version of a singing telegram. The project contains a short film (beautifully shot) and a mobile application for free download.
I love Somebody’s witty parody of “routine” digital experiences. Specifically I love the extreme scenarios of mediation. In a sweet, musing kind of way, it makes you wonder how far you would go. Take this scene: A woman is seated in a restaurant, a waitress walks over with her phone, asks for her by name and then proposes marriage on behalf of an errant partner – brilliantly performed by July herself.
Another scene: two women have a catfight. It looks irresolvable until a passing pensioner pedestrian “messenger” delivers a reconciliatory note.
Beyond the cute parody (it’s light), this is a very nicely packaged (beautiful wardrobe courtesy of the film’s commissioner Miu Miu) commentary on our posthuman selves. In the best scene of the film – one with clever cybernetic overtones – a pot plant asks its owner for water via a proxy: the man’s lover. Things get nicely twisted when the pot plant’s missives turn into a sort of techno erotica (much funnier than it sounds). Pot plant: “Test my soil, deeper!”
Running with the posthuman theme, feminist technoscience writer Donna Haraway might call this notion distributed consciousness: living intimately ‘as’ and ‘in’ a biological world. Haraway’s work interrogates the divisions of nature/culture and human/machine. She refutes these so-called divisions, preferring “enmeshment” instead. She coined the term “natureculture” to explain the concept. The point is: drawing a dividing line between us, our mobiles (and even thirsty pot plants) is irrelevant anyway. July’s film delivers the same message in a good-humoured parody.
Somebody is a brilliant hack of technology; in the project’s own words: “The antithesis of the utilitarian efficiency that tech promises, here, finally, is an app that makes us nervous, giddy, and alert to the people around us.” In visualizing affect – nearly all the scenes are real heart-wrenchers – the film tries to humanise technology. It introduces us humans back into our own technological exchanges in a speculative way. Naïve? Probably. Damn entertaining? You bet.
The Somebody app is available for download via iTunes. No proxies needed.
Parody, playful, tongue in cheek: Invisible Girl/Boyfriend.
Created during a StartUp Weekend, according to the site: “Finally. A girlfriend your family can believe in. Invisible Girlfriend gives you real-world and social proof that you’re in a relationship – even if you’re not – so you can get back to living life on your own terms.”
Seems the invisible boyfriend was an afterthought, click far right in the corner for that option. No guessing the gender of the project initiators then :)
Thirty-five statues across London and Manchester have begun telling tales of the past through the voices of recognisable British actors and personalities, with words from our best writers.
To hear the statues, who first clear their throats today, visitors need to swipe their smartphones over signs near the statues. The phone will then ring and the monologues begin.
Playing some of our most notable characters from history – along with Dick Whittington’s Cat – will be Patrick Stewart, as the haunting voice of the unknown soldier at Paddington Station; Jeremy Paxman, who will defend free speech as John Wilkes in Fetter Lane; and Prunella Scales, as Queen Victoria, both on Blackfriars Bridge and in Manchester’s Piccadilly Gardens.
The statues will be brought to life as part of a project by Sing London (singlondon.org), a non-profit arts organisation, with the intention of lifting the nation’s spirits.
We have all sorts of ways to communicate with one another: text messages, emails, Gchats and, according to my sources, even phone calls. We live in a word-heavy world, and why not? A sentence, whether spoken or written, is a highly efficient way to transmit a lot of information with very little time and effort. But words aren’t necessarily the best way to express every idea. Logistically speaking, verbal and written languages have cultural barriers that are sometimes insurmountable. Emotionally speaking, sometimes words just don’t do justice to what we’re trying to convey.
Full-sensory correspondence is still a long way off. We’re just now beginning to explore how powerful virtual touch could be in connecting with each other. But there’s one sense that’s been notoriously missing from the landscape: smell. “When you think about how important the olfactive is in almost every type of communication, its absence in global communication is sort of astounding,” says David Edwards.
Edwards is the always-buzzing mind behind Le Laboratoire, the Paris innovation tank and research facility that brought us Wikipearls and Le Whaf. The group’s most recent invention, the oPhone, is aiming to make olfactory communication commonplace by transmitting odors much in the same way you send text messages.
It’s a basic idea. Humans have long bonded over smells, both good and bad (there’s nothing like a smelly subway car to force intimacy). It’s strange, then, that no one has been able to channel scents into a more digestible form of communication.
There’s one big problem when it comes to doing this, says Edwards: “Odor transmission to date is not smart,” he explains. “If I give you the odor of a pizza, I have a difficult time immediately after giving you the odor of the sea and then giving you the odor of a cactus.” Basically what Edwards is saying, and what we already know from letting trash sit in our apartments a day too long, is that odors linger. Which makes it hard to craft any sort of cohesive and decipherable olfactive narrative.
The oPhone solves this problem with its main innovation: the oChip. This little cartridge, about the size of a fingernail, contains olfactive information that can produce hundreds (and soon thousands, says Edwards) of odor signals. The idea is that these chips can be installed in the oPhone, and via a Bluetooth-connected app called oTracks, scents can be sent to yourself or an oPhone-carrying friend with the push of a button.
Edwards and his small team have been prototyping the oPhone for the better part of a year. The most current version, unveiled at the WIRED UK conference, is a system of sorts that uses four cylindrical oPhones that can each be loaded with up to eight scent chips. This allows for what Edwards calls an “odor symphony,” or the ability to craft a multi-odiferous message with actual context. “These are pretty subtle odor signals that allow me to create sentences, paragraphs and essays, if you will, of odor messages,” he says.
The final product, due out later this year, will come with two oPhones, a choice that Edwards says is a compromise to ensure people can experience more than one smell simultaneously. “You can have these great coffees on one side and breads on the other side,” he explains. “There will be some oTracks that use two oPhones and some that use one.”
He’s quick to say that the initial consumer product is less about catering to a mobile, urban user and more about creating a sensory experience around food or media consumption. One immediate application will be a coffee experience, which allows oPhone holders to smell various coffee scents. Edwards is also working with Paris Vapors to integrate oPhone technology into media like books, movies and TV shows.
It might seem a little clunky, but most new technology is. More interesting is thinking about future applications, when the oPhone functions more like a cell phone. Edwards is plugging away on creating the universal oChip, a customizable version of the oChip that can be programmed with whatever smell you can think of. You can imagine that this could be applied in healthcare to stimulate memories and relieve stress. Or, more personally, someday while visiting your grandparents’ house, you could send your brother a text message embedded with smell emoticons that conjure up the cookies your grandmother used to make.
That inherent emotional connection to smell is what Edwards is looking to exploit. And in his opinion, it’s only a matter of time before it’s commonplace. “If Twitter had this enormous impact with very limited information content exchange, you can imagine a complete aroma equivalent of that,” he says. “It’s fascinating how powerful that could be.”
The central question: Can a mobile application change the way we think about strangers?
Aim: The mobile app aims to create an intimate and anonymous connection between you and another person – a total stranger. Details – like name, age and address – will never be revealed. For 20 days, both strangers are meant to continuously update each other about where they are, what they are doing, and eventually how they are feeling.
The rationale: In a world mediated through computing, our everyday lives are increasingly affected by complex and invisible systems. Some of these are algorithmic trades on the stock market, others are search results for information, movies, or a date. These systems often aspire to transparency, usability, and efficiency. Playful systems take a different approach, bringing the systems to the foreground as games, stories, narratives, and visualizations. Playful systems embrace complexity rather than conceal it, and seek to delight, not disappear.