MASA presence: Diverse and thought-provoking reactions to delivery robots

[Although she doesn’t use the term, the author of this story from HuffPost explores her own and others’ medium-as-social-actor presence responses to the increasingly common delivery robots in our public spaces. The story features quotes from two of our colleagues and friends; see the original version for three more images. –Matthew]

[Image: Josephine Rios, of Service Employees International Union — United Healthcare Workers West, asks a delivery robot if it wants to sign a petition to tax billionaires. She said since so many jobs are going to AI, why not ask a robot to sign. Credit: Myung J. Chun via Getty Images]

You’re Not Alone. Many People Feel Sympathy For Delivery Robots — And The Reason Why Is Fascinating.

Over 30 years of research show that we treat computers as if they are social entities as long as they meet three criteria, an expert says.

By Brittany Wong
March 26, 2026

Is there a word in any language to describe the strange secondhand embarrassment you feel toward a robot making a fool of itself?

There really should be. It’s how I felt when I watched this snazzily dressed dancing robot eating dirt at a tech expo. Or, more recently, when I watched another dancing robot have to be restrained at a hot pot restaurant after going rogue, smashing dishes and sending chopsticks flying.

I might use the same word to describe the worry-slash-sympathy I feel when I see a food delivery bot barreling down the sidewalk on the way to deliver a $16 burrito to someone. Maybe my concern is warranted; there have been a number of cases in cities across the United States where people have vandalized those delivery bots, kicking them and tipping them over.

But it’s just a bot! I tell myself. There are real issues to be concerned about ― and I am concerned about them! ― but for some reason, I can’t help but feel for these little guys, too.

Is that weird? I’m not the only stupid human to feel this way, right?

“No, not at all,” S. Shyam Sundar, a professor and director of the Penn State Center for Socially Responsible Artificial Intelligence, recently told me.

According to Sundar, 30 years of research has shown that we treat computers as if they are social entities as long as they meet three criteria: They’re interactive, they use natural language, and they perform roles that were hitherto filled by humans.

Delivery robots (and dancing humanoid bots at hot pot restaurants) meet those criteria, so it’s no surprise that humans show social responses to them (especially when we see or hear about a bot being abused or mistreated).

“There’s an automatic social response that we have when we see someone being bullied,” Sundar said. “It’s a script that we rely on without thinking too much; we don’t pause to say, this is a machine and therefore undeserving of such social responses from me.”

When asked, people will claim they don’t apply politeness norms to machines, but Sundar said study after study has shown that humans do indeed display polite behavior toward computers, perceiving human qualities machines don’t actually have and worrying about hurting their feelings.

“It’s not a conscious act but an automatic response that we are hardwired to show as humans,” the professor said.

Robots are designed to look cute so you won’t beat them up.

Then there’s the matter of design: We’re supposed to be wary of our robot overlords, but it’s hard to be when they have LED eyes and human names like Sergio and Jamie.

Fleets of delivery robots built by AI-driven companies like Avride and Coco Robotics are purposely designed to look cute. Those anthropomorphic and zoomorphic features are meant to help the bots stand a better chance of survival out on the mean streets of American cities.

“It’s very important to us to design our robots in such a way that people connect with them and feel comfortable,” said Felipe Chávez, the co-founder of Kiwibot (now rebranded as robot.com), in a 2020 interview with The Bold Italic.

Designers of social technologies are never designing function alone; they’re also designing feeling, explained Kwan Min Lee, a professor of new media who specializes in human–computer interaction at Nanyang Technological University in Singapore.

“The rounded edges, diminutive scale, gentle movements and almost childlike demeanor of many bots are not incidental,” he told HuffPost. “They make the machines appear approachable, harmless, even deserving of protection.”

The public’s affectionate response is at least partly a testament to how skillfully some robots have been designed to fit into human emotional life, he said.

Some of the bots are even rainbow washed. (Of course they are!) In 2023, Serve Robotics introduced rainbow-painted robot Marsha, named for trans rights activist Marsha P. Johnson.

But people hate them, too. Here’s why someone might want to kick a bot.

There are advantages to delivery via robot: Compared to a car, a robot delivering food has a smaller environmental footprint and a positive effect on congestion. The machines arguably make restaurant delivery faster and more efficient.

But there’s plenty to be critical about when it comes to delivery bots: They’re marketed as “autonomous,” but they still need real people keeping an eye on them, and many companies offshore these jobs to cut costs. Cars sometimes have to swerve to avoid hitting them, and there have been instances where they’ve impeded the path of wheelchairs.

A bot may not be very trustworthy, either.

“People think they are your friends, but they’re actually cameras and microphones of corporations,” Joanna Bryson, a longtime AI scholar and professor of ethics and technology at the Hertie School in Berlin, told CNN. “You’re right to be nervous.”

For every person who finds the bots endearing, there’s another who finds them irritating, uncanny or emblematic of something larger and more troubling, Lee said.

“A delivery bot can become a proxy for anxieties about automation, inequality, surveillance or the impersonality of the platform economy,” he said. “So the impulse to lash out at the robot is often not really about the machine itself; it’s about the economic and social order the machine has come to represent.”

Others may dwell on the “sheer eeriness” of a robot performing the duties of a delivery worker, Sundar said.

“There is even some research that says making them very human-like can backfire because people are sometimes repulsed by the uncanniness of the resemblance,” he said. “This is called the ‘uncanny valley’ effect.”

Then, of course, there are the people who just want to break stuff, in the immortal words of Limp Bizkit.

“For some, psychologically, there’s something provocative about an object that seems socially present but remains defenseless,” Lee said.

Whether you hate them or love them, humans will have to get used to bots.

Serve Robotics ― the creator of a lot of the sidewalk delivery bots you see around Los Angeles and other cities ― predicts the shift from humans to robots in the last-mile logistics industry will create a $450 billion opportunity by 2030.

As the market grows and robotics and AI become more integrated into our lives, the subject of human-bot interactions is going to be rife with these kinds of new, weird feelings I’ve experienced lately: Some will feel weirdly sympathetic toward the plight of the working bot or a dancing robot crashing out. Others will call them clankers and kick them out of spite. (Or simply because they’re bored teenagers out for the night, wanting to kick a robot.)

Sundar, who has spent years studying human-computer interactions, hopes there’s less of the latter behavior.

“For every person who vandalizes a self-driving car, let’s hope there are many folks who will undo the damage and help clean it up as an act of social responsibility,” he said.

Ultimately, these robots may be teaching us more about ourselves than the machines, Lee said. Borrowing sociologist Sherry Turkle’s phrase, he thinks delivery bots can be considered “evocative objects” ― things that invite reflection, projection and emotional response. In this case, the object shows how readily human beings extend social and moral concern beyond biological life.

“So the deeper question is not simply whether humans will care about robots,” he said. “It is what it means when companies intentionally design machines to elicit attachment, sympathy, and protective feelings. That is not just a technical matter; it is also a cultural and ethical one.”

