Retail arbitrage and mediated reality
This week we focus on the impact of computing and technology on our physical spaces. We continually find that privacy, intimacy, and even reality are not fixed concepts but malleable ones. As technology becomes increasingly able to both read and write data in our physical environments, we find ourselves confronting new ethical dilemmas: How much data should we collect? How mediated do we want our physical realities to become? And how might the hacks and workarounds these new realities inspire further change our experiences?
1: Changing the conversation on privacy
The public discussion about data privacy has tended to focus on the ways in which companies can be better stewards of the massive amounts of personal data they are collecting and storing. But what if, instead, we were discussing whether those companies have a right to be recording that data at all? In a recent essay, Maciej Cegłowski argues that we need to fundamentally reframe the privacy conversation to talk about “‘ambient privacy’—the understanding that there is value in having our everyday interactions with one another remain outside the reach of monitoring, and that the small details of our daily lives should pass by unremembered.”
When the amount of data being collected about us is so massive and the systems doing the collecting are so complex, it is useless to talk about individual agency, and relying on corporations to treat our data ethically only gets us so far, especially when those ethics conflict with their profit motive. Cegłowski makes an analogy to a dragon and its hoard of gold: “The problem with the dragon, after all, is not its stockpile stewardship, but its appetite.”
As he writes: “When all discussion takes place under the eye of software, in a for-profit medium working to shape the participants’ behavior, it may not be possible to create the consensus and shared sense of reality that is a prerequisite for self-government.”
2: A real-life block list
A convenience store in Tacoma, WA, has been testing a facial-recognition system designed to deter repeat theft and other crimes. As designed, the system requires patrons who arrive between 8 p.m. and 6 a.m. to look into a camera to gain entry to the store. If the person is wearing a mask, or matches a “flagged” photo in the store’s database of suspects, the door stays locked.
Store staff have to mark particular patrons as suspects themselves, since the system is not (yet?) connected to any law-enforcement database. That means reformed criminals won’t be blocked from service, but it also means employees can put people on the list for any reason they see fit. And given that facial-recognition tech tends to misidentify women and people of color more often than white men, the system could end up blocking people who shouldn’t be, though it’s unclear whether staff have a way to override its decisions. Also left unaddressed are how a patron would be told why their entry was denied, and how a person could appeal being placed in the store’s database of suspects.
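To make those failure modes concrete, here is a minimal sketch of the kind of entry-control logic the article describes. Everything in it is an assumption for illustration: the function names, the similarity threshold, and the flagged_faces list are stand-ins, not the store’s actual (proprietary) system.

```python
from datetime import time

# Hypothetical sketch of the entry-control logic described above.
# detect_mask(), match_score(), and flagged_faces are illustrative stand-ins.

MATCH_THRESHOLD = 0.6                      # assumed similarity cutoff
CURFEW_START, CURFEW_END = time(20, 0), time(6, 0)

def in_curfew_window(now: time) -> bool:
    """True between 8 p.m. and 6 a.m., when the camera check applies."""
    return now >= CURFEW_START or now <= CURFEW_END

def should_unlock(face_image, now, flagged_faces, detect_mask, match_score) -> bool:
    if not in_curfew_window(now):
        return True                        # outside the overnight window: door opens normally
    if detect_mask(face_image):
        return False                       # masked faces are refused outright
    for suspect_photo in flagged_faces:
        if match_score(face_image, suspect_photo) >= MATCH_THRESHOLD:
            return False                   # any match above threshold keeps the door locked
    return True

# Toy usage with stub detectors, just to show the control flow:
print(should_unlock(
    face_image=None,
    now=time(23, 30),
    flagged_faces=["photo_of_suspect"],
    detect_mask=lambda img: False,
    match_score=lambda img, photo: 0.7,
))  # False: the 0.7 "match" exceeds the threshold, so the door stays shut
```

Even this toy version surfaces the concerns raised above: the contents of flagged_faces and the threshold are set entirely at the staff’s discretion, a false positive is indistinguishable from a real match, and nothing in the flow explains to the person outside why the door stayed shut.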
When convenience meets surveillance: AI at the corner store
At some convenience stores, an A.I. system may bar the door if you look like the suspected criminals in its database, or if you’re wearing a mask. That’s part of…
3: Eyewear that can help preserve your privacy
Designer Scott Urban has developed a pair of glasses intended to defeat some forms of facial recognition and to confuse IR-based security cameras. The “Phantom” glasses are currently (until July 24) listed on Kickstarter with a projected ship date of April 2020. They reflect IR light back at its source, foiling Apple’s Face ID technology and tricking security cameras into thinking your face is a light source.
His earlier product in this line, called Ghost, reflected back both visible and IR light in order to foil flash photos and more traditional security tech that operates in the visible spectrum. Urban designed the Phantom glasses to help people take back their personal privacy without drawing attention to themselves: because only infrared light is affected, the effect is unlikely to be noticed by bystanders.
Phantom glasses block facial-recognition tech and look good
Phantom glasses are designed to thwart various forms of facial-recognition tech.
4: Retail arbitrage and Amazon nomads
One of the things we find fascinating is the surprising hacks, workarounds, and emergent behaviors that evolve in response to digital ecosystems. These behaviors often reveal unintended consequences of technology, ranging from the sublime to the truly dystopian. One recent example is this piece from The Verge, which describes a subculture of people who have taken up nomadic lifestyles in order to maximize their profits as Amazon merchants.
If you weren’t already aware, most of what you buy on Amazon is sold not by Amazon itself but by millions of third-party merchants who use Amazon as a storefront and fulfillment mechanism. A number of these merchants “have figured out that the best way to find lucrative products is to be mobile, scouring remote stores and chasing hot-selling items from coast to coast.” Thus the largest digital retailer has brought about the inverse of the 19th-century traveling salesman: a class of nomadic merchants who rove around the country in RVs, collecting goods to stock Amazon’s shelves.
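For readers unfamiliar with the term, the “arbitrage” here is just a margin calculation: buy low in a physical store, sell higher on Amazon, and keep what’s left after fees. The sketch below is purely illustrative; the fee rates are made-up placeholders, not Amazon’s actual schedule, which varies by category, weight, and size.

```python
# Illustrative only: the fee rates below are assumptions for the sake of example,
# not Amazon's real fee schedule.

REFERRAL_FEE_RATE = 0.15      # assumed cut Amazon takes of the sale price
FULFILLMENT_FEE = 3.50        # assumed flat per-unit pick/pack/ship fee (USD)

def flip_profit(store_price: float, amazon_price: float,
                inbound_shipping: float = 0.50) -> float:
    """Per-unit profit from buying at a retail store and reselling on Amazon."""
    fees = amazon_price * REFERRAL_FEE_RATE + FULFILLMENT_FEE
    return amazon_price - store_price - fees - inbound_shipping

# A toy found in a clearance bin for $7 that sells for $25 online:
print(round(flip_profit(store_price=7.00, amazon_price=25.00), 2))  # ≈ 10.25
```

The margins only work at scale and at speed, which is exactly why the merchants in the piece keep moving: the clearance bins empty out, and the listings crowd up.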
Nomads travel to America’s Walmarts to stock Amazon’s shelves
To stock Amazon’s shelves, nomadic merchants travel the backroads of America in search of rare soap and coveted toys. Some live in RVs and vans, driving to Walmarts and Targets and sending products to Amazon as they go.
5: The slippery slope of altering shared reality
In the recent beta release of Apple’s iOS 13, there is a feature called FaceTime Attention Correction, which silently “corrects” your gaze to simulate direct eye contact between the participants on a call. (Because of the distance between the camera and the screen, the uncorrected image shows you looking slightly away from your interlocutor.) Matt Webb wrote up some of the concerns this feature raises, including objections from autistic and neuro-atypical people that its design reifies a single notion of what “correct” social behavior looks like. More broadly, he touches on the expectation that a video call is an unmediated view of another person, an expectation now being called into question by features like Apple’s, as well as Zoom’s “touch up my appearance” option.
These features may seem fairly innocuous, but fast-forward to a near future where heads-up augmented-reality displays are the norm: how heavily mediated might our shared reality become? Think ad-blocking for physical spaces, Snapchat-style face filters for real life, or editing out “unpleasant” aspects of your surroundings (homeless people, perhaps?). We need to start discussing, soon, ethical guidelines for what can and cannot be mediated, who gets to control our shared reality, and what consent or notification should look like in these situations.
A lengthy ramble through responses to that FaceTime Attention Correction tweet
I had mentally categorised video calls as a whole as “unmediated” and Attention Correction is reminding me that they are very much mediated and we will have to develop personal skills and social norms to tell authentic and inauthentic apart.
6: Culinary arbitrage and antisocial dining
Uber Eats has begun testing a new “dine-in” feature in Austin, TX, which lets users order their favorite foods ahead of time and then eat them at the restaurant. While the stated purpose is to shorten wait times and simplify payment, one side effect is that a diner could get through an entire meal without having to speak to anyone.
For counter-service chains like Starbucks or McDonald’s, this doesn’t seem too foreign, or very different from existing order-ahead models. Expanding the idea to full-service, sit-down restaurants, though, could shift our expectations of interactions with restaurant staff, alter how we use common spaces, or even redefine what constitutes a “restaurant” altogether.
Uber Eats is quietly testing a 'dine-in' option for customers who want to eat in restaurants
Customers in Austin, Texas, can now choose to “dine-in” in addition to ordering delivery or pick-up when using the Uber Eats app.
One trippy thing: “Latent Cinema” by neural networks
The video below is a generative piece that shows the landscape of Stockholm transforming and morphing before our eyes. The film was created by a GAN and is a “narration” through 300,000 photographs of the city.
Dear friends, I’m extremely excited for our current research for “Latent Cinema” idea. This piece is narrated from ~300k collective photographic memories of Stockholm. Our GAN browser practically opens a camera into the space in the mind of a machine. https://t.co/M6RhBibnYg
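Mechanically, the “camera into the space in the mind of a machine” is a walk through the GAN’s latent space: pick a series of latent vectors and render frames at points interpolated between them, so each image morphs smoothly into the next. The sketch below is a generic illustration of that idea, not the creators’ code; the generator argument is a placeholder for whatever model they trained on the ~300k Stockholm photographs.

```python
import numpy as np

# Generic latent-space walk, the basic mechanism behind "latent cinema" pieces.
# `generator` is a placeholder for a trained GAN generator; this is not the artists' code.

LATENT_DIM = 512
FRAMES_PER_SEGMENT = 60          # at 30 fps, two seconds per transition

def slerp(a, b, t):
    """Spherical interpolation between two latent vectors."""
    omega = np.arccos(np.clip(np.dot(a / np.linalg.norm(a),
                                     b / np.linalg.norm(b)), -1.0, 1.0))
    if np.isclose(omega, 0.0):
        return a
    return (np.sin((1 - t) * omega) * a + np.sin(t * omega) * b) / np.sin(omega)

def latent_walk(generator, rng, n_keyframes=10):
    """Yield frames by rendering points along a path between random latent vectors."""
    keys = [rng.standard_normal(LATENT_DIM) for _ in range(n_keyframes)]
    for a, b in zip(keys, keys[1:]):
        for i in range(FRAMES_PER_SEGMENT):
            z = slerp(a, b, i / FRAMES_PER_SEGMENT)
            yield generator(z)   # each frame morphs smoothly into the next
```

The interpolation is the easy part; choosing keyframes that correspond to recognizable views of the city, and rendering them at video quality, is where the artistry comes in.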
Six Signals: Emerging futures, 6 links at a time.