Mercer U. professor and undergrad create control system that seamlessly integrates human motion with virtual and real environments

[A professor and an undergraduate student in the School of Engineering at Mercer University in Macon, Georgia, in the US are using motion tracking sensors, virtual reality, drones, a robot, and artificial intelligence to build what sounds like an impressive system that creates spatial presence and enables more effective teleoperation in a remote, real-world location. The story below from Mercer’s The Den website provides details; see the original version for two more images. –Matthew]

[Image: Dr. Hunmin Kim and student Shrey Patel in the lab where they are developing a state-of-the-art VR control system that seamlessly integrates human motion with virtual and real-world environments. Patel is wearing a glove that lets him use his hand to control a quadcopter drone, held here by Dr. Kim. Credit: Photo by Jennifer Falk]

Engineering research project combines virtual reality and the real world

By Jennifer Falk
August 11, 2025

Imagine a search-and-rescue mission where a victim is trapped somewhere dangerous or inaccessible to rescuers. Perhaps he’s injured and stuck in a building threatening to collapse, or maybe he was swept away by a raging river. Either way, rescuers don’t know the victim’s exact location.

Now, imagine a rescuer putting on two specially made gloves and a virtual reality headset. While wearing the gloves, the rescuer controls two drones with his hands. The drones, equipped with cameras, send video back to the VR headset, creating a virtual world that mimics the real world for the rescuer. The rescuer navigates the drones with ease and precision. Serving as the rescuer’s eyes, the drones help locate the victim, so the rescuers can plan a safe extraction.

This is the future that Dr. Hunmin Kim envisions for the technology he is developing with Mercer University student Shrey Patel. Dr. Kim, assistant professor of electrical and computer engineering in the School of Engineering, is pioneering a state-of-the-art VR control system that seamlessly integrates human motion with virtual and real-world environments. There are two parts to the project.

“The first one is by using a hand gesture, we try to control a multi-drone system,” he said. “Secondly, using a voice, we can also control a ground robot.”

Patel, a rising junior majoring in computer engineering, has been working on the project with Dr. Kim since November. He’s continuing to work on the project this summer through the Mercer Undergraduate Research Scholar Training Initiative, also known as MURS. MURS is a 10-week summer program that provides Mercer undergraduate students the opportunity to participate in cutting-edge research with faculty mentors.

“We want to innovate in the field of interactivity, more human-to-machine connections,” Patel said. “Just simply like moving, let’s say a drone, with the gestures of your hands, these types of technologies already exist, but we’re trying to make it more accessible and easier to work with for the average person.”

For the drone project, Dr. Kim and Patel have developed a glove that a user can wear to control a quadcopter drone. The glove is equipped with probes that are detected by cameras set up in the lab. The cameras read the hand motions and send them to a computer program that controls the drone. Patel wrote the program using artificial intelligence to generate the base code, which he then debugged, modified and improved.
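The article doesn’t specify how the tracked hand motions are turned into flight commands. A common approach in systems like this is a proportional mapping: the program measures how far the hand has moved from a neutral pose and commands the drone with a velocity proportional to that displacement. The Python sketch below is a minimal, hypothetical illustration of that idea; `read_marker_position` and `send_velocity_command` are stand-ins for the lab’s motion-capture and drone interfaces, which the article does not describe.

```python
import time

# Hypothetical stand-ins for the lab's motion-capture and drone interfaces.
def read_marker_position():
    """Return the glove marker's (x, y, z) position in meters."""
    return (0.0, 0.0, 0.0)  # placeholder; a real system would query the cameras

def send_velocity_command(vx, vy, vz):
    """Send a velocity setpoint (m/s) to the quadcopter."""
    print(f"vel: ({vx:+.2f}, {vy:+.2f}, {vz:+.2f}) m/s")

GAIN = 1.5       # proportional gain: how aggressively the drone follows the hand
DEADBAND = 0.03  # ignore hand jitter smaller than 3 cm
MAX_SPEED = 0.5  # clamp commands to a safe speed (m/s)

def clamp(v):
    return max(-MAX_SPEED, min(MAX_SPEED, v))

# Capture a neutral "home" pose, then steer the drone relative to it.
home = read_marker_position()
for _ in range(500):  # ~10 seconds at a 50 Hz control rate
    x, y, z = read_marker_position()
    dx, dy, dz = x - home[0], y - home[1], z - home[2]
    # Apply the deadband so small hand tremors don't move the drone.
    vx = clamp(GAIN * dx) if abs(dx) > DEADBAND else 0.0
    vy = clamp(GAIN * dy) if abs(dy) > DEADBAND else 0.0
    vz = clamp(GAIN * dz) if abs(dz) > DEADBAND else 0.0
    send_velocity_command(vx, vy, vz)
    time.sleep(0.02)
```

The deadband and speed clamp are the kind of safeguards such a controller typically needs, since raw motion-capture data is noisy and a bare proportional gain would pass every tremor straight to the drone.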

“There’s only so much you can do with a remote control,” Patel said. “If you have more of a feel for the technology you’re using, then you might be able to get more output out of the drone or the car or whatever technology you’re trying to control.”

A remote or a joystick typically allows a person to control only a single drone, rather than multiple drones at once, Dr. Kim said.

“For example, using this motion, the drones are spreading,” said Dr. Kim, moving his hands apart. He then moves his hands closer together. “And then using this motion, they are focusing on a specific area.”
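The article doesn’t detail how the gesture translates into drone behavior. One plausible scheme, offered here only as an illustration and not as the lab’s published method, ties the distance between the user’s two hands to the radius of a circular formation the drones hold: hands together clusters the drones over one spot, hands apart spreads them to cover a wider area. A sketch under those assumptions:

```python
import math

NUM_DRONES = 3     # assumed squad size; the article doesn't give a count
MIN_RADIUS = 0.5   # meters: formation radius when the hands are together
MAX_RADIUS = 3.0   # meters: formation radius when the hands are fully apart
ARM_SPAN = 1.6     # meters of hand separation treated as "fully apart"

def formation_offsets(hand_separation):
    """Map inter-hand distance to per-drone positions on a circle.

    Hands close together -> drones cluster over one spot;
    hands far apart -> drones spread out to cover a wider area.
    """
    t = max(0.0, min(1.0, hand_separation / ARM_SPAN))
    radius = MIN_RADIUS + t * (MAX_RADIUS - MIN_RADIUS)
    return [
        (radius * math.cos(2 * math.pi * i / NUM_DRONES),
         radius * math.sin(2 * math.pi * i / NUM_DRONES))
        for i in range(NUM_DRONES)
    ]

# Hands 0.2 m apart -> tight cluster; 1.5 m apart -> wide spread.
print(formation_offsets(0.2))
print(formation_offsets(1.5))
```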

The ground robot Dr. Kim and Patel are working with is a robotic car.

“The ground robot is very different from the drone because we’ve attached a camera with a microphone onto it, so it can detect your voice, and we have an AI model so that it can understand what you’re saying,” Patel said. “If you say, ‘forward,’ the car is able to move forward just by your voice, and if you say something like, ‘follow me,’ the camera of the ground robot turns on, and it can detect a specific color, and once you bring that color near the camera it can follow that color.”
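Patel describes two behaviors: simple spoken commands like “forward,” and a “follow me” mode that tracks a color. The article doesn’t name the vision software, but color following of this kind is commonly implemented with OpenCV: threshold each camera frame in HSV color space, find the centroid of the matching pixels, and steer toward it. Below is a minimal sketch of the color-following part under those assumptions; the target color (red) and the `drive(turn, speed)` motor interface are hypothetical, since neither is specified in the article.

```python
import cv2
import numpy as np

# Hypothetical motor interface: turn in [-1, 1], speed in [0, 1].
def drive(turn, speed):
    print(f"drive(turn={turn:+.2f}, speed={speed:.2f})")

# HSV bounds for the target color (a red object is assumed here;
# the article only says the robot follows "a specific color").
LOWER = np.array([0, 120, 120])
UPPER = np.array([10, 255, 255])

cap = cv2.VideoCapture(0)  # the robot's onboard camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)   # 255 where the color matches
    m = cv2.moments(mask)
    if m["m00"] > 5000:                     # enough matching pixels in view
        cx = m["m10"] / m["m00"]            # centroid x, in pixels
        turn = (cx / frame.shape[1]) * 2 - 1  # -1 (far left) .. +1 (far right)
        drive(turn, 0.3)                    # steer toward the color, creep forward
    else:
        drive(0.0, 0.0)                     # color lost: stop and wait
cap.release()
```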

Both projects have the potential to transform how humans interact with virtual spaces and real environments. They allow for precise, intuitive control and have potential uses far beyond the search-and-rescue scenario imagined at the beginning of this article. Other applications may range from inspecting bridges and buildings to facilitating more advanced remote exploration and surveillance operations.

Patel will continue to work with Dr. Kim on the project in the fall. He plans to pursue software engineering after graduation but doesn’t want to sit at a desk all day. This project has shown him that he can work in the field and still interact with the physical environment.

“This project is a good way to develop different algorithms and different software to supplement in more physical environments,” he said. “I can program this car. I can program this drone and move it how I want to. Writing code and implementing software is really beneficial in these two projects.”

