We here at Ethical Futures Lab really like robots. This issue is chock full of them: some that help their human partners, and some that replace human labor. We take a deeper look at the designs and considerations that spring up depending on which of these approaches their manufacturers take. We also touch on the unseen risks within AI and the possibility of mining social capital.
—Matt & Alexis
1: Social currency, literally
Social tokens — digital currency that is backed by the reputation of a person or brand — seem to be the new hotness in the cryptocurrency world. Fans can buy a person’s tokens in exchange for access to perks and exclusive content, like a private chat with an influencer, access to a music video, or special merchandise. We’re generally crypto-skeptics around here, but this movement could lead to interesting possibilities for independent artists and creators.
Devin Mancuso, a friend of EFL, has been playing in this space and shared his thoughts with us:
“The social money scene is really fascinating to me right now. Basically individuals or brands are launching their own cryptocurrencies (typically an Ethereum-based token), and using them as an alternative monetization source for their work and communities. Many of these tokens have their own digital communities with Discord servers, newsletters or websites that can only be accessed if you own a certain amount of the token (anywhere from $5 - $150+), so the tokens encourage loyalty among fans, and can also be gifted, traded or used to buy goods or services within the community. The social currency is valued by the network, with price seemingly impacted by a mixture of hype, reputation, the value of the experience and the rewards on offer by the token owners. It’s honestly a deep, deep rabbit hole. Things start off innocently enough, you buy a few tokens to check things out, join a few Discord servers, but before you know it you’ve got wallets full of all these colorful tokens and you’re trading them with folks from all over the world. Roll seems to be the go-to platform at the moment for social money, maybe I should launch a $DEVIN token…?”
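The token-gated access Devin describes boils down to a simple balance check. Below is a minimal sketch of that logic; the wallet addresses, balances, and threshold are all hypothetical, and a real community would query the token's ERC-20 `balanceOf()` on Ethereum (e.g. via web3.py) rather than a local lookup table.

```python
# Hypothetical wallet balances in a creator's token. In practice this
# data lives on-chain and is read via an ERC-20 balanceOf() call.
balances = {
    "0xfan1": 40,
    "0xfan2": 5,
}

# Hypothetical threshold the creator sets for their private Discord.
MIN_TOKENS_FOR_ACCESS = 25

def can_access(wallet: str, threshold: int = MIN_TOKENS_FOR_ACCESS) -> bool:
    """A holder gets in only if they hold at least the threshold amount."""
    return balances.get(wallet, 0) >= threshold

print(can_access("0xfan1"))  # True: 40 tokens clears the 25-token bar
print(can_access("0xfan2"))  # False: 5 tokens is not enough
```

Because the gate is just a balance threshold, the same tokens can be gifted or traded and access follows the tokens, which is what makes the loyalty dynamics Devin mentions possible.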
Creators and influencers have a new way to monetize their efforts and reward their loyal followers.
2: The risks of large language models
If you’re interested in the topics we often talk about at Ethical Futures Lab, then you likely saw the controversy unfold at Google after Timnit Gebru was dismissed (Google’s language was that she “resigned”, a claim Dr. Gebru disputes) over her co-authorship of a paper that described four risks of AI models built on large stores of language. One risk, which we and others have covered extensively in the past, is bias and the difficulty of moving these models toward anti-racist or anti-sexist language. The other three are stark, somewhat unexpected, and bear repeating.
She and her co-authors described our tendency to ascribe meaning to these models’ outputs — a fallacy, since the model doesn’t understand language per se but is good at manipulating it in believable ways. Another risk is the opportunity cost: the research effort and funding that go into these models starve out other methods that may be more nascent or experimental, but could ultimately bring better outcomes.
The risk we found most alarming is one we’ve pointed toward in the past: the massive power requirements for models like these. The paper studied the Transformer, the neural network architecture underlying GPT-2, GPT-3, and the BERT model that powers Google Search. For one version of this model with 213 million parameters, augmented with a technique called neural architecture search, training it just once would produce as much CO2 as five gas-powered automobiles emit over their entire useful lives. Even in smaller cases, like BERT, training a model once creates roughly as much CO2 as one passenger’s flight from New York to San Francisco — and as the paper reminds us, these models are trained and retrained many times throughout their development.
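The comparisons come from emissions figures reported by Strubell et al. (2019), the study this analysis draws on. A back-of-the-envelope check of the ratios, using that study's numbers (all in pounds of CO2):

```python
# Approximate CO2 emissions in pounds, as reported by Strubell et al. (2019).
CAR_LIFETIME_LBS = 126_000      # average US car over its lifetime, incl. fuel
NY_SF_FLIGHT_LBS = 1_984        # one passenger, New York -> San Francisco
TRANSFORMER_NAS_LBS = 626_155   # 213M-param Transformer + neural architecture search
BERT_BASE_LBS = 1_438           # a single training run of BERT base

print(TRANSFORMER_NAS_LBS / CAR_LIFETIME_LBS)  # ~5 car lifetimes
print(BERT_BASE_LBS / NY_SF_FLIGHT_LBS)        # ~0.7 of a cross-country flight
```

And these are single-run figures: multiply by every retraining pass during development and the footprint grows accordingly.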
We certainly don’t have the full picture of what occurred between Dr. Gebru and her former colleagues, but we do applaud the focus she brings to studying the ethics underlying artificial intelligence. Side effects and externalities must be understood to grasp the full impact of these nascent technologies.
The company’s star ethics researcher highlighted the risks of large language models, which are key to Google’s business.
3: What happens when we make people cheaper than robots?
Uber, Lyft, DoorDash, and other gig economy companies poured over $200 million into lobbying and marketing efforts supporting Proposition 22, a California ballot initiative that classifies drivers and delivery workers as contractors rather than employees. This allows Uber and others to benefit from the labor of their drivers without providing the protections and benefits afforded to full-time employees.
Uber’s support of this initiative was unsurprising, as it directly affects the company’s cost structure and ability to profit. What was surprising occurred shortly after the law passed: Uber announced it was giving its self-driving tech, along with $400 million, to a company called Aurora and ending its research into driverless cars. After Uber spent over $1 billion on driverless R&D over the last few years, it isn’t hard to do the math here. Now that the company can reliably employ drivers at low cost and with few labor protections, human drivers are suddenly far cheaper than developing reliable driverless cars. The move also highlights that, for Uber, one of the main attractions of a driverless ecosystem was the massive profit margins it could realize if it could get the tech to work properly.
20 months ago: Uber touts how it's invested a billion dollars in driverless tech
11 months ago: Uber hypes its driverless unit as core to its future profitability
1 month ago: Proposition 22 passes in California
0 months ago: Uber shutters its driverless R&D unit. https://t.co/mUEPO3FDnm https://t.co/yNb5VsK0XU
4: The veneer of compliance
Back in July, Walt Disney World put in place a rule that visitors who took their masks off while on rides would not be given access to their on-ride PhotoPass photos. This week, visitors discovered that Disney was testing a change to that policy, where maskless guests could receive their photos, but those photos were digitally altered to place virtual masks over their faces. As soon as this news became public, Disney announced a reversal: “In response to guest requests, we tested modifying some ride photos. We are no longer doing this and continue to expect guests to wear face coverings except when actively eating or drinking while stationary.”
Our guess is this was a weird PR move: an attempt to appease guests’ desire for pictures without having photos out in the world that showed people violating the rules, or that implied the parks might not be enforcing safety regulations. But as digital media alteration gets better and synthetic media becomes more accessible, we wonder where else tactics like this might lead. When might it be easier to change or fake the record of an experience? Where else could corporations use media to fake compliance? Might our personal photos and videos become less a recording of memory and more a creative rendering of a moment?
Walt Disney World Photoshopped masks on unmasked guests in commemorative ride photos but has since stopped the practice.
5: Hey, I'm walkin' here!
Kudos to Christopher Kent for pointing us to signals 5 and 6 this week!
Pennsylvania’s newest pedestrians don’t walk, and they travel much faster than the average person. Pennsylvania is the latest state to legalize autonomous delivery robots, and it has classified them as pedestrians, which means they can use sidewalks and other spaces that have historically been the purview of people — even though the robots are allowed to travel at up to 12 MPH. Delivery robots have typically been championed by companies like Amazon and FedEx, and opposed by pedestrian advocates (not to mention labor unions). In a country where cars already dominate the use of public space, allowing these robots into the few places that still operate at the scale and speed of humans feels like an encroachment.
A Scientific American piece from last year delves into the issues and concerns around the deployment of these delivery robots, which range from their ability to navigate humans to humans’ desire to antagonize the robots: “A cautionary tale comes from the Canadian hitchBOT social experiment, in which a simple humanoid robot traveled across Canada and Europe—before ending up decapitated in Philadelphia at the start of its U.S. journey.” Good luck with your deliveries, Philly!
Last month, Pennsylvania legalized autonomous delivery robots, which can weigh up to 550 pounds without cargo…
Gita (pronounced with a soft g — don’t get us started about that) is a new device manufactured by Piaggio, the company behind the iconic Vespa scooter. While its form and function seem quite close to the delivery robots we talked about above, the gita has been designed in ways that put it much more on a pedestrian scale.
The device “pairs” with its owner via machine vision, not through any Bluetooth or other wireless token. It feels personal, like a pet, and is programmed to follow behind you, carrying your belongings (shopping, sporting equipment, picnic supplies, or whatever you like), which means that it would rarely move faster than a typical person would walk. It’s also been specifically programmed to behave like a pedestrian, anticipating the movements of people around it and acting accordingly.
At this scale, a device like this could be extremely useful, especially for people who have trouble carrying much weight for long stretches. As a shopping aid, however, we do have to wonder how it would handle the escalators at the mall.
The gita following robot carries your stuff and promotes an active lifestyle. Designed and built by Piaggio Fast Forward, a member of the Piaggio Group.
Six Signals: Emerging futures, 6 links at a time.