The planned, the weird, and the unexpected
This week, we look at a number of unexpected collisions — between public and private, between divergent fields, and between skin and your phone. We see that unintended consequences are rampant, and that technology needs to be shaped by thoughtful choices rather than excused as inevitable progress.
1: Technology isn't inevitable
This piece by Rose Eveleth encapsulates so much of what we care about at Ethical Futures Lab. Specifically, she argues against the way in which technologists excuse themselves from the ethical repercussions of their products by framing all applications as an inevitable state of evolution. But nothing about technological progress is natural — it reflects a series of decisions made by humans, and we have the power to make those choices for good or ill. Eveleth further ties the idea of inevitable progress to fundamental threads in American culture: the idea of manifest destiny, settler colonialism, and the march forward at any cost.
The argument that technology is inevitable and uncontrollable is often used to excuse bad behavior or to explain away a negative side effect. Just this week Mark Zuckerberg deployed a version of it, saying that combating misinformation can’t be the responsibility of companies like his, and that it may be “something we have to live with.”
The article also touches on an idea we discussed a few weeks ago, namely that the introduction of a new choice is not neutral. There are costs to opting out of surveilling your children, tracking your productivity, or whatever the latest innovation offers.
The myth of inevitable technological progress
Tech moguls see facial recognition, smart diapers, and surveillance devices as inevitable evolutions. They’re not.
2: Tickle Me iPhone
Researchers at Telecom Paris have developed a material that feels like skin and can be stroked, pinched, or tapped to create a digital interaction. Inspired by a simple premise (“I want to pinch my phone”), the interface has been programmed to interpret a variety of signals and apply human understanding to them:
Sudden hard pressure on the skin is associated with anger and tapping is a means of seeking attention, while sustained contact and stroking are associated with providing comfort.
The technology makes use of copper wire threaded between two layers of silicone, allowing the material to bend, compress, and pinch. Future versions may include hair (!!) and temperature effects, which could be used to subtly communicate various statuses to its user.
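As a rough illustration of the kind of signal-to-meaning mapping the researchers describe, here’s a minimal sketch in Python. The thresholds, function names, and values are our own inventions for this sketch, not anything from the Telecom Paris work:

```python
# Illustrative only: a toy classifier for the touch gestures described above.
# The pressure/duration thresholds are invented for this sketch.

def interpret_touch(pressure: float, duration_s: float, taps: int) -> str:
    """Map raw touch signals to the human meanings the researchers describe."""
    if pressure > 0.8 and duration_s < 0.5:
        return "anger"              # sudden hard pressure
    if taps >= 2:
        return "seeking attention"  # repeated tapping
    if duration_s > 2.0 and pressure < 0.3:
        return "comfort"            # sustained contact / stroking
    return "neutral"

print(interpret_touch(pressure=0.9, duration_s=0.2, taps=0))  # -> "anger"
```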
Aside from the pretty satisfying wow/ick reaction this brings, making digital interfaces feel more life-like could change how we think about our devices. Studies have shown that simple reactions to environmental stimuli (e.g. a Furby held upside down cries out “Me scared”) make kids, and even adults, treat a device in a more “alive” way. What would it mean if your phone blushed? How would you respond?
Creepy human-like skin makes your phone ticklish and pinchable
A smartphone case made from artificial human-like skin responds to being pinched, tickled and stroked to add an extra layer of interactivity to the device
3: Privately owned public spaces
This week, Yahoo! Groups announced that it is shutting down and will be deleting all of its content, taking down years’ worth of data and conversations from its 115 million users and 10 million groups. Among the many organizations affected by this move is Ofcom, the UK’s communications regulator. It turns out that Ofcom has been using Yahoo! Groups to manage all of the UK’s phone number assignments (who knew?). It’s moments like these when we realize how much public institutions, from governments to community groups to NGOs, depend on privately-owned infrastructure and services. Those private companies have no formal obligation to serve the public good, and yet they are being relied on by groups who absolutely do. This is a problem in obvious ways, like when a service shuts down, but also in much more subtle ways, such as when a platform’s privacy practices are at odds with a public institution’s mission and regulations.
Today it was announced that Yahoo! Groups is shutting down, and taking with it a piece of critical national infrastructure: the Oftel Yahoo Group which is used for managing UK phone number assignments.
Yes, really: See Ofcom's website https://t.co/zuT1fPIzka https://t.co/RsPyGlwfxs
4: Are you allowed to say that? Am I?
How much does it matter who says a particular sentence in determining whether it’s offensive?
Researchers at the University of Washington and Carnegie Mellon University found that messages written in “African American English” dialects are more likely to be classified as “toxic” or hate speech when compared to other messages. They initially discovered this unexpected correlation while reviewing several widely-used hate speech datasets, and confirmed that models trained on those datasets, like Perspective API, further reify and propagate those biases. From there, they conducted an experiment to see how these biases might be mitigated, and found that when the individuals rating a message are either (a) told the race of the person who wrote it or (b) asked to consider the dialect and context of the message, they are far less likely to incorrectly rate it as offensive. In other words, priming people to consider the connection between race and language made them more thoughtful and aware in their perception of that language.
From this analysis, we can conclude that the models currently in use to identify the tone, content, and context of written and spoken words are flawed, and biased against non-White dialects. We can also see a path to improving those models, simply by acknowledging that not everyone speaks with the same syntax and acting accordingly.
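To make that kind of audit concrete, here’s a minimal sketch of how one might compare a toxicity model’s scores across dialects. It uses Perspective API’s public REST endpoint as we understand it; the paired example sentences are our own illustration, and you’d need your own API key:

```python
# A minimal dialect-bias audit sketch (our illustration, not the
# researchers' code). Scores paired messages with the same intent,
# phrased in different dialects, against the Perspective API.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder: bring your own key
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       f"comments:analyze?key={API_KEY}")

def toxicity(text: str) -> float:
    """Return Perspective's TOXICITY score (0-1) for a piece of text."""
    body = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    resp = requests.post(URL, json=body)
    resp.raise_for_status()
    return resp.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

# Paired sentences with the same intent; the second uses an AAE feature
# (habitual "be"). The pairs are invented for this sketch.
pairs = [
    ("I am always so tired of this.", "I be so tired of this."),
]
for standard, aae in pairs:
    print(f"{toxicity(standard):.2f} vs {toxicity(aae):.2f}")
```

A systematic audit would use many such pairs and look for a consistent gap between the two columns of scores.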
The risk of racial bias in hate speech detection
We investigate how annotators’ insensitivity to differences in dialect can lead to racial bias in automatic hate speech detection models, potentially amplifying harm against minority populations.
homes.cs.washington.edu
5: Nip, tuck, and smile for the camera
This is definitely the first time we’ve included a link from the Journal of the American Society of Plastic Surgeons, and it shows just how unexpected the reach of new technologies can be. The journal provides a primer for plastic surgeons to talk to their patients about how plastic surgery may impact patient interactions with facial recognition systems. This kind of guide is just one illustration of how two completely unrelated fields suddenly begin to collide in unintended ways.
Facial recognition technology: A primer for plastic surgeons
Plastic surgeons should be prepared to answer questions from patients about the potential effects of plastic surgery on facial recognition technology performance.
6: This message will self-destruct in 3 months
This small addition to Google’s data settings panels is simple and obvious, and yet it marks a revolutionary shift in how large tech companies think about their users’ data.
Earlier this year Google introduced controls that allow you to set your browser and search histories to automatically delete after 18, or even just 3, months. (By default this data is stored in perpetuity.) In early October they added YouTube watch histories to the list of activities that can be forgotten, and soon will introduce similar tools for Google Maps.
The benefits of this move, for both users and companies, can’t be overstated. For the individual, it means only a short window of online activity can be discovered, hacked, or subpoenaed, letting what’s in the past stay in the past. Further, recommendations based on that data may improve, as they could more accurately reflect recent changes in your taste, location, or preferences. For companies, a far smaller data footprint means lower storage costs, lower liability in case of a breach, and more accurate (and therefore more effective) recommendations of all kinds.
We hope that tools like this become more common, even in the absence of regulation like GDPR and CCPA, to help individuals better manage their digital histories.
How to set your Google data to self-destruct
Google has now given us an option to set search and location data to automatically disappear after a certain time. We should all use it.
One archaic thing: Explaining scrolling
The first computers often came with thick manuals and tutorial programs that explained the basic interactions people needed to understand in order to operate these new machines. This graphic perfectly introduced the concept of scrolling to an audience still unfamiliar with computing.
This illustration in the Apple IIe manual is always so effective for helping students think about how seemingly natural interfaces were once new & unnatural, & had to be explained. Thanks to @elikaortega for introducing me to it. https://t.co/QhtwuiezU2
Six Signals: Emerging futures, 6 links at a time.