
Q&A with MSTI Students About Their Robotics Stockroom System

Date: February 07, 2022

The market for new consumer-facing robots is growing rapidly, with the service robotics market expected to grow 300% by 2026. The robotics industry is responding in kind, shifting focus from industrial automation to assistive and service robots that support everyday health, efficiency, and wellbeing. MS in Technology Innovation robotics students Daniel Azua, Yifei Fang, and James Muir are applying the robotics programming, navigation, computer vision, and machine learning skills gained during their 15-month degree program in their culminating Launch Project. They are combining a mobile Fetch robot with a Kinova manipulator arm to create an autonomous pick-and-place system that will enable fellow students to remotely retrieve gear from a storage room.

We sat down with Daniel to learn more about the team’s work:

Tell me about the problem that your team worked to solve.
Right now in the Prototyping Labs, staff have to go to the Cheqroom storage to get items for students, and that's a manual, time-consuming task you also see in libraries, warehouses, and more. Our challenge was to build an automated robotic system that would deliver items to students in the Labs, all while navigating through a complicated and busy environment.

What sort of user testing did you do?
First, it was about how students would check out items from Cheqroom. We had to create our own web app so users could request an item from the robot, and we had to do user testing on the web app to make sure it was clear and understandable. We added a status indicator so people knew whether the robot was on its way or had arrived. Users need feedback.
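As a rough illustration of the request-and-status flow Daniel describes, here is a minimal sketch of a web app backend in Python with Flask. The endpoint names, fields, and status values are our assumptions, not the team's actual implementation.

```python
# Hypothetical sketch of an item-request API with status feedback.
# Routes, fields, and statuses are illustrative, not the team's actual web app.
from flask import Flask, jsonify, request

app = Flask(__name__)
requests_db = {}  # request_id -> {"item": ..., "status": ...}

@app.route("/requests", methods=["POST"])
def create_request():
    item = request.get_json()["item"]
    request_id = str(len(requests_db) + 1)
    requests_db[request_id] = {"item": item, "status": "queued"}
    return jsonify({"request_id": request_id}), 201

@app.route("/requests/<request_id>", methods=["GET"])
def get_status(request_id):
    # The robot side would update this to "on its way" or "arrived",
    # which is the feedback users said they needed.
    return jsonify(requests_db.get(request_id, {"status": "unknown"}))
```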
Next, we focused on how people interact with the robot.

We ran a test by having the robot navigate through the crowded Labs, but we didn’t tell people there was a test. We got a lot of data about how people naturally interact with robots in the real world.

We noticed that, in the beginning, people would look right at it. After a while, they learned they could ignore it. They felt safe and trusted that the robot wasn't going to hit them. They grew more comfortable that they could do whatever they wanted and that the robot would stop or turn around.

Talk to me about the different roles on your team.
Our solution has a few major parts: the web app, the Kinova robotic arm station, and the Fetch mobile robotic base, as well as the communication between all three of these. We divided the responsibilities up between ourselves, and we each focused on a specific area. We all work collaboratively on everything, but we each have an area that we're specifically responsible for keeping working.
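Since Fetch robots run on ROS 1, one plausible way for the web app to hand a request to the mobile base is to publish it on a ROS topic. The sketch below illustrates that pattern; the node name, topic, and message contents are hypothetical, not the team's actual interface.

```python
# Sketch of one way a web app backend could hand a pickup request to the
# robot over ROS 1 (which Fetch uses). Topic and message format are
# hypothetical, not the team's actual interface.
import rospy
from std_msgs.msg import String

rospy.init_node("stockroom_dispatcher")
pub = rospy.Publisher("/stockroom/pickup_request", String, queue_size=10)
rospy.sleep(1.0)  # give the publisher time to connect to subscribers
pub.publish(String(data="item_id:cheqroom-1234"))
```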

What assistance have you received from your industry partners Fetch and Kinova?
We have a meeting with them every week, as well as an open Slack channel where we can ask questions. For example, we had a hardware question about our Fetch robot, so we opened a ticket and they were able to help us. Fetch is an open-source research platform, so they were able to help us easily. And we can help them too; Yifei posted a suggested change to the existing GitHub repository to teach others how to solve a problem. There's a big community around both Fetch and Kinova.

Tell me your career goals and what you’re looking to do right out of school.
Right now, I'm interested in robotics hardware – I would like to be able to build sensors and the hardware of a mobile robot. Through this project I learned that there aren't many low-cost robots for students, and I want to build those platforms. Long term, I want to have my own company that uses robotics to measure the performance of athletes. Athletes have very specific requirements.

What did you learn about working with robotics?
I learned there are so many challenges. We'll have an idea like "The robot will place the item on the table," but then we see, "Okay, the robot isn't tall enough," and then you realize you have to tell the robot which plane is the floor and which is the table. You can't make assumptions. The arms also have a specific range within which they can grasp, so we have to account for that. There's a lot of work to even just pick up a pencil.
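The floor-versus-table problem Daniel mentions is commonly handled by segmenting planes out of the depth camera's point cloud. Below is a minimal sketch using RANSAC plane fitting in Open3D; it is a generic illustration of the technique, not necessarily the team's pipeline, and the file name is a placeholder.

```python
# Illustrative plane segmentation: fit the dominant plane (e.g. the floor) in a
# point cloud with RANSAC, then fit the next plane (e.g. the tabletop) in what
# remains. Generic approach, not necessarily the team's pipeline.
import open3d as o3d

cloud = o3d.io.read_point_cloud("scene.pcd")  # snapshot from the depth camera

floor_model, floor_idx = cloud.segment_plane(
    distance_threshold=0.02, ransac_n=3, num_iterations=1000)
remaining = cloud.select_by_index(floor_idx, invert=True)

table_model, table_idx = remaining.segment_plane(
    distance_threshold=0.02, ransac_n=3, num_iterations=1000)
# floor_model and table_model are plane coefficients [a, b, c, d];
# their heights tell the planner which surface is which and where to place items.
```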
We also learned that people aren't used to interacting with robots. They don't know how the robot will behave, or that it will automatically avoid them. They were nervous and tried to get out of its way. After our testing, we decided to put voice alerts on our mobile robot so that people knew it was there and that it would navigate around them.
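For voice alerts, one common option on a ROS-based robot like Fetch is the sound_play package. The snippet below sketches that approach; the node name and spoken phrase are illustrative assumptions, not the team's actual alert.

```python
# Sketch of a voice alert using ROS's sound_play package, one common option
# on Fetch-class robots. The phrase and trigger are illustrative assumptions.
import rospy
from sound_play.libsoundplay import SoundClient

rospy.init_node("voice_alerts")
sound = SoundClient()
rospy.sleep(1.0)  # wait for the sound_play node connection
sound.say("Robot passing through. I will navigate around you.")
```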

What would you like to say to prospective robotics students in the MSTI program?
Get involved with the robotics community, especially the Robot Operating System (ROS). All you need to do is install it and start. There is a lot of information available, and also a lot of work in that area. If you get interested and join the MSTI, you'll get to work with real robots, which is hard to do elsewhere. That's why we decided to tackle a problem in the Prototyping Labs: it's a specific application, and we're trying to solve a problem that actually exists and improve the experience for students. Here, you'll have a lot of resources and access to the actual hardware.

Every year, ambitious students like Daniel, Yifei, and James join the MSTI to gain expertise and explore new fields of study to meet their professional goals. This experience prepares them to pursue careers as project managers, software developers, data scientists, and technical leaders. Meanwhile, our alumni around the world are contributing to fields ranging from cloud computing and robotics to augmented reality and biotechnology.
Are you ready to advance your career with us? Learn more about the MSTI or sign up to attend one of our upcoming virtual info sessions.