Carbon-neutral design and robot eyes
This week, in honor of Friday’s Climate Strike, we review how the promise of unlimited electricity may have contributed to tech’s worst habits, think about how we look at each other and how robots look at us (or just seem to), and discover how small decisions made decades ago might be contributing to sexist perceptions of our voices.
1: Carbon-neutral experience design
Gauthier Roussilhe has been working with web designers to explore how you design websites with a limited energy budget. The results are eye-opening, revealing how many modern internet products are built on the assumption of abundant energy. In this interview with We Make Money Not Art, Roussilhe discusses the example of embedding a map on a website. Embedding a Google Map is expensive in an energy-constrained environment, so it requires deeper thought about actually providing users with what they need in that context rather than “reproducing a non-choice”. He points out that “some of the issues we commonly have with web design (addiction, privacy, etc.) are completely fading away once you rely on limited energy. Which shows that a lot of the problematic issues we associate with big tech companies are linked to abundant energy.”
Separately, this Fortune piece goes deeper into the statistics, covering the impact of everything from streaming services to bitcoin mining on energy usage and carbon dioxide output. Looking at these numbers, it’s clear that a reckoning is due. So much innovation around AI, video streaming, 5G networks, and more is predicated on unlimited energy availability. How might our thinking and creativity change once we bring in new constraints around resource usage? What kinds of experiences and technologies are aligned with a carbon-neutral future?
Can you design a website on a limited energy budget?
An interview with Gauthier Roussilhe.
we-make-money-not-art.com
2: Windows to the robot soul
Designers have long explored how to help people understand the intent, purpose, and accessibility of robots. Fiction points to steps along the uncanny valley, from robots that are clearly robots to ones that are indistinguishable from humans. In some cases, it might just mean sticking some googly eyes on them to make them more welcoming.
This piece from OneZero starts on the simple end, describing a 2005 study that found that adding eyes encouraged “prosocial behavior” in game participants, leading to more generosity and sharing. Designers are also exploring how to make a robot’s eyes detect when a person is making eye contact, and react accordingly to hold that contact and express emotion or other status information.
These interventions may also go beyond eyes, recognizing our inherent instinct to anthropomorphize these objects. This talk from Leila Takayama in 2012 discusses several such techniques; my favorite is having a robot scratch its head while it’s processing input, so people around it recognize that it’s thinking and not just inactive.
Eyes are the window to a robot’s soul
The design of a robot’s eyes is key to helping it better interact with human beings.
3: I'm not shrill, your microphone is
This New Yorker essay on bias in audio technology is a perfect illustration of how small decisions made decades ago still shape us today. It lays out the ways in which a number of regulatory and technical decisions in the early days of radio have had lasting consequences for how gendered voices are heard and perceived. In the 1920s, in an attempt to reduce signal interference, Congress limited the bandwidth available to each radio station, which led most broadcasters and equipment manufacturers to limit their signals to “voiceband”, a range between 300 and 3400 Hz. But in determining this range, researchers and regulators looked primarily at male voices, which tend to be lower. The result? “Voiceband frequencies reduced the intelligibility of female speech by cutting out the higher frequency components necessary for the perception of certain consonants.”
In addition, reduced intelligibility due to the clipping of women’s upper range was misunderstood as women speaking too softly, so engineers tended to turn up the overall volume for women, which made their voices sound more piercing. Even now, with newer technologies, many data-compression algorithms “disproportionately affect high frequencies and consonants, and women’s voices lose definition, sounding thin and tinny”. All of these detailed choices have led to audio technology making women sound, on the whole, more shrill, harsh, and unintelligible than men. If you want to go deeper into these kinds of histories, check out Caroline Criado Perez’s book Invisible Women: Data Bias in a World Designed for Men.
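The voiceband clipping the essay describes is easy to see in miniature. Here’s a minimal sketch (assuming NumPy is available; the specific tones are illustrative choices on our part, not figures from the essay): a synthetic “vowel” component at 700 Hz sits inside the 300–3400 Hz voiceband and survives a brick-wall filter untouched, while a 6 kHz “consonant” component, standing in for the high-frequency energy of a sound like /s/, is cut entirely.

```python
import numpy as np

fs = 16000                          # sample rate in Hz
t = np.arange(fs) / fs              # one second of audio -> 1 Hz FFT resolution
# Synthetic "speech": a 700 Hz vowel-like tone plus a quieter 6 kHz
# consonant-like component (illustrative frequencies, not real speech).
signal = np.sin(2 * np.pi * 700 * t) + 0.5 * np.sin(2 * np.pi * 6000 * t)

def voiceband_filter(x, low=300.0, high=3400.0):
    """Brick-wall band-pass: zero every FFT bin outside [low, high] Hz."""
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    spectrum[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spectrum, n=len(x))

def tone_amplitude(x, freq):
    """Amplitude of the sinusoidal component nearest `freq`, from its FFT bin."""
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    return 2 * np.abs(spectrum[np.argmin(np.abs(freqs - freq))]) / len(x)

filtered = voiceband_filter(signal)
print(round(tone_amplitude(signal, 6000), 3))    # 0.5 before filtering
print(round(tone_amplitude(filtered, 6000), 3))  # 0.0 after: the "consonant" is gone
print(round(tone_amplitude(filtered, 700), 3))   # 1.0: the "vowel" passes intact
```

Real telephone channels used analog filters with gradual roll-off rather than this idealized brick wall, but the effect on intelligibility is the same: everything above the cutoff, where much consonant energy lives, simply never reaches the listener.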
A century of “shrill”: How bias in technology has hurt women’s voices
How gendered bias and failures in the design of audio technology have affected how women’s voices are perceived in society.
4: A pledge to make tech a force for good
Techfestival, held in Copenhagen for the third time this September, has released a “Tech Pledge” describing the work technologists must do to create positive products, consider unintended consequences, and fight for human rights in their work. It follows the “Copenhagen Letter” from 2017 and last year’s catalog of 150 Principles, each of which tried to get at similar considerations. The pledge takes these ideas further: while it follows much the same form as the 2017 letter, its principles are more direct, wide-ranging, and urgently expressed, and its call to action to change the way we build technology is far more insistent.
While no single effort like this will end dark patterns in design, lax data-privacy policies, algorithmic discrimination, or any other harm enabled by technology, every attempt to document better principles and encourage better behavior is beneficial. Hopefully pledges like these can be used to call out actors who violate these principles and to remind us all how to design and develop more humane technology.
Join 1251 people who have taken The Tech Pledge
Take the Tech Pledge and join the movement.
5: Algorithms as plausible deniability
Amazon is the latest tech company to be accused of manipulating its search results to privilege its own products and maximize its profits, even as it purports to be a platform for everyone. Engineers at Amazon told The Wall Street Journal last week that they were pressured to emphasize Amazon’s private-label products (including Amazon Basics and others) in search. This follows recent revelations that Apple and Google have also tweaked their search algorithms to privilege their own profit over other factors. As Sidney Fussell aptly points out in The Atlantic, the lack of legibility or visibility into the “secret sauce” of tech platforms’ algorithms allows these companies a great deal of plausible deniability:
“Algorithms interpret potentially millions of data points, and the exact path from input to conclusion can be difficult to make plain. But the effects are clear. This is a very powerful asymmetry: Anyone can notice a change in search results, but it’s extremely difficult to prove what caused it. That gives algorithm designers immense deniability.”
The secret sauce of search engines gives tech companies an abundance of plausible deniability.
6: Facebook Portal and honest disclosure
Facebook introduced new hardware in its Portal line of telecommunications products this week, including a “clip-on smart camera” called Portal TV. It connects to your television, turning the TV into the interface for your Facebook calls, and adds the ability to co-watch videos with friends (only on Facebook Watch at the moment, which… limits its utility).
What’s fascinating about this is that Facebook has added a clear and concise description of how recordings may be captured and reviewed by people. As we discussed in our last issue, much of the distrust of so-called “virtual assistant” devices stems from the opacity with which data is collected, stored, and evaluated. By describing this process up front, Facebook has attempted to explain its human interventions and why they’re necessary. While we would have preferred a “No, don’t send to people” option alongside the big “OK” button, it’s a start.
Facebook launches Portal TV, a $149 video chat set-top box
Facebook wants to take over your television with a clip-on camera for video calling, AR gaming and content co-watching.
One 3D thing: depth-mapping abstract art
Check out this whole thread of neural networks turning abstract artworks into trippy 3D video art. We’re particularly digging the Gerhard Richter ones.
Trying depth-mapping on abstract art. I don't know what you would call 'working properly' here, it's just fascinating to see a neural network try to interpret a Jackson Pollock painting as a 3D landscape. (Convergence, 1952) https://t.co/RaII2RHatA
Six Signals: Emerging futures, 6 links at a time.