NSF-funded Quori: A standardized robot to help accelerate research around human-robot interactions

[In a 14-minute segment from the Oregon Public Broadcasting (OPB) program Think Out Loud, robotics researcher Bill Smart describes an ambitious project to design, create and distribute 50 ‘standardized’ robots to academic researchers to, as he says, “try and figure out what parts of human-human psychology and social interaction carry across to human-robot interaction” – i.e., advance CASA/MASA (Computers Are Social Actors/Media Are Social Actors) research. Follow the link above to the original version of the story to listen to the broadcast, and the link below to the project’s website for more information. –Matthew]

[Image: OSU researchers Bill Smart and Naomi Fitter interact with a Quori robot. The two are leading a project to build and distribute 50 of the standardized robots throughout the research community. Credit: Shivani Jinger/Oregon State University]

How a standardized robot could help accelerate research around human-robot interactions

By Gemma DiCarlo (OPB)
June 21, 2023 (Broadcast June 28, 2023)

Repeating experiments and replicating results are key parts of successful scientific research. But in the field of robotics, working with different software platforms on different machines means that replication can be difficult. A $5 million National Science Foundation project led by Oregon State University aims to help with this challenge by building and distributing 50 standardized robots throughout the research community. The robot’s expressive face and gesturing arms are meant to help researchers study how humans and robots should interact in the workplace and other social environments.

Bill Smart is a professor in OSU’s robotics program and one of the leaders of the project. He joins us to explain how a standardized robot could help accelerate research.

The following transcript was created by a computer and edited by a volunteer:

[snip]

Dave Miller:  What’s the big idea behind Quori?

Smart: This is a project that builds on a previous project from the University of Southern California and the University of Pennsylvania where they designed and built a small number of these robots that are being used to investigate human-robot interaction, how robots and humans can work together effectively.

Our work is to take that initial set of results and build more robots, distribute them to a whole bunch of research groups in the US, and then have that common hardware platform so just as you said, I can replicate someone else’s experiments, make sure the results hold, and try and move a little bit more from engineering, just building a thing and doing an experiment once, to a science, where we can replicate and maybe falsify experiments.

Miller: Quori is the name for this robot model, how has it changed since it was first envisioned and built?

Smart: The first iteration of it was interesting in that the project that designed it actually reached out to human-robot interaction researchers across the country and asked them if you had a standard robot, what would you want on it? Would you want it to look sort of humanoid? Would you want arms? Would you want the head? And it used input from that in the original design. What that ended up with is sort of a humanoid looking robot about 4.5 ft tall. It’s got a head, it’s got two very simple arms, it’s got wheels, it doesn’t have legs. And that hasn’t changed much since that initial design.

But what we’re going to do with this follow-on project is take some of the lessons learned. The arms right now don’t have elbows, so maybe we’ll put elbows in and maybe make some other modifications, based on the feedback that we get from the small number of people who have the robot right now, so that it can be more useful for that larger group.

Miller: My understanding is that the Quoris can bow at the waist. Why include that as a functionality?

Smart: So there’s always a tension between how simple your robot is and how expensive it is. If you were to build a robot with a lot of what we call degrees of freedom, a lot of joints, then it gets pretty expensive, because building those joints, installing those joints is expensive. So you try to pick the smallest number of joints that you can where they’re useful. And it turns out that when you’re interacting with people, having a joint at your waist is pretty useful. Certainly in Asian cultures where bowing is a big part of social interaction, you can imagine that a robot that’s able to do that is important. But even in the US, you change the height of your head as you’re leaning into people and listening to them, or do those little physical motions that help grease a conversation. It turns out that the waist joint was something that a lot of people wanted, and wanted to see if it was effective in helping these interactions.

Miller: I was really intrigued by the way that Quori has been designed to allow for a whole variety of facial expressions without, say, eyebrows on motors. Can you describe the way the projection is gonna work?

Smart: You can think of Quori’s head sort of as a fish bowl with a projector inside it. We can project facial features on the inside of this ball-shaped head, and you can see them from the outside. And the advantage of that is if you were to put a mouth on the robot mechanically and then have it open and close, then first of all that’s quite expensive, but it’s also quite slow. A lot of robot motors don’t operate as fast as your joints operate. And so we could move things around on the face, eyebrows or the mouth, but it wouldn’t be as fast, as responsive.

By projecting it on, you get most of the feeling of a face and facial expressions, but you can do it much faster. And you can also change the shape of the mouth, change the shape of the eyes, change the shape of the eyebrows much more simply if it’s a projection than if it’s real physical hardware.

Miller: Another, you can call this a limitation or just a design decision, is Quori is not engineered to hold things. The point is not to have it bring you a cup of coffee. What will it do? What are possible scenarios where this kind of robot could interact with a human?

Smart: The main purpose of Quori is to try and figure out what parts of human-human psychology and social interaction carry across to human-robot interaction. You could imagine, when you first meet someone, there’s sort of a comfortable distance you stand from them. You don’t quite face them directly, you make a certain amount of eye contact. A lot of the stuff that we do naturally as humans, there’s an open question of if a robot were to do those behaviors, will it get the same effect as another human doing it with you? Humans have a comfortable distance at which they stand; how far away from you should a robot stand when it’s interacting with you?

Miller: How much do you or other scientists who have already been studying human-robot interactions, how much can you already answer that particular question? What do you already know?

Smart: There’s a lot of work in this. I think a lot of the human-human social interaction rules kind of apply to humanlike robots. So if the robot’s about the size of a human, it should stand about as far away from you as a human should stand to make you feel comfortable. And I think part of that is because our response to people standing at a certain distance or acting in a certain way is very baked into our brains. Humans are very social animals, and so we’ve evolved to expect certain interactions from the things we interact with because we’ve only ever interacted with humans before. It kind of makes sense that the robots will have a similar set of rules.

Miller: It seems like you said an important piece there, “if it’s a humanoid robot.” The less something is like a human, is it fair to say the less we would expect the existing social rules to necessarily apply?

Smart: I think that’s an interesting question because we have machinery in our brains to recognize these social signals, when someone looks at you or when someone stands a certain distance from you. And I think it’s an open question of how much of that carries across to robots that are very much not humanoid. There’s some work that says if you have a robot arm, just a disembodied arm that hands off a package to you, that hand off is going to be more effective if you put a pair of eyes on that arm.

Miller: Eyes on the arm? Not eyes on a head next to a torso that the arm is attached to, eyes on the arm?

Smart: Well, it kind of works both ways. If there’s some sort of visible eyes that are connected with the robot, it could be on the arm, it could be next to the arm or part of a larger system. But when we’re handing stuff between humans, there’s this interaction that we do. You make eye contact, your hand does a certain set of motions, and you’re kind of evolved to expect that. And so if you can replicate that on an arm, a robot arm that’s giving you stuff, you’ll drop the package less, you’ll fumble the handoff less. And so I think there’s an interesting set of questions there of how much of that expected human social stuff is baked into our brains, and we should be taking advantage of it even for robots that aren’t human.

Miller: You mentioned that in some Asian cultures, bowing is much more embedded in daily life than in many Western cultures. How much do cultural differences make their way into our interactions with robots?

Smart: I think right now there isn’t a lot of cultural specificity in the robots that are out there. But one of the things that’s really exciting about having a common platform like this is we can do an experiment, we can start doing those cross-cultural experiments to see if people stand at different distances in different cultural settings. Or if people on the east coast walk faster towards a robot than on the west coast.

Miller: Greet their robot overlords in a faster way.

Smart: Right. There’s this feeling in the field that there are going to be cultural differences, both within the US and across different countries. And I think having a common hardware platform where you can do the same experiment as much as you can in different places really lets us dig into doing science around that, and actually figuring it out for real.

Miller: We’re talking at a time when a lot of for-profit companies are all racing to figure out the best AI so they can make more money for their shareholders. Is there a tension between for-profit robotics companies and this open-source scientific or academic side, where the whole point of this is let’s all work on this together so we can all learn more together?

Smart: I’m not sure I would call it a tension. If you’re a for-profit company, you’re obviously trying to develop a product and sell it. Your goals are different. I think the open-source research community is more interested in answering these very fundamental questions of interaction rather than driving towards a specific product or a specific end use.

Miller: What are you most excited to be able to learn as these Quoris are produced, 50 of them going all over the country in the coming years, with robotics scientists studying human-robot interactions?

Smart: I’m most excited to see what people will do with them, beyond the simple experiments that we talked about right now. Maybe more than the science, what really motivates me about this project is the prospect of bringing together scientists and students from all over the US and building a community of people who are thinking about this. And then taking work from one lab and then replicating it in another lab and building on the work of others. I think when you have these networks of people working on the same set of problems on the same hardware where they can really collaborate on a meaningful level, you see a lot of progress a lot more quickly than individual labs working on single things. So I think it’s the people part of it that really motivates me, the network part of it.

Miller: Bill Smart, thanks very much for joining us. I appreciate it.

Smart: Thanks very much.

Miller: Bill Smart is a professor in the robotics program at Oregon State University and he’s one of the leaders of a project to build and distribute standardized robots that will help researchers better understand the interactions between humans and robots.

CONTACT “THINK OUT LOUD®”

If you’d like to comment on any of the topics in this show or suggest a topic of your own, please get in touch with us on Facebook, send an email to thinkoutloud@opb.org, or you can leave a voicemail for us at 503-293-1983.

