Here’s how researchers are using IoT to learn how you react to autonomous vehicle technology

Researchers at Mcity are investigating how humans react to autonomous vehicle technology.

Zoey Ren · January 29, 2019

Michigan has been leading innovation in the automotive industry for more than a century, and it continues to innovate with autonomous vehicle technology. In June 2018, Mcity at the University of Michigan launched the first driverless shuttle project in the U.S. to focus on how people react to driverless vehicles, as a way to gauge consumer acceptance of the technology.

Mcity is a public-private research center working to advance connected and automated vehicles. The center funds research, operates the Mcity Test Facility, and works with partners to deploy connected and automated vehicles on public roads.

With two shuttles transporting students, faculty, and staff on the University of Michigan campus, the Mcity Driverless Shuttle research project collects data to understand vehicle performance, roadway interactions, and passenger attitudes.

After browsing their website and shuttle case study, curiosity led me to call Tyler Worman, Data Architect at Mcity, to find out more about how the team is using Particle products in its research.

Note: The interview has been condensed and edited for clarity. The questions are shown below in bold italics, and Tyler’s responses are in plain text.

***What do you do at Mcity as Data Architect?***

I am responsible for the data collection systems at the Mcity Test Facility and on-road labs like the shuttle. I lead our software development team to build out internal web applications that aggregate data and make it available and usable for researchers. For example, with the shuttle dataset, say we have 5,000 hours of video, and a client says, “I’d like to see every time it encountered a pedestrian.” It is not enough to just hand over 5,000 hours of video. Instead, we would use machine learning to pull out video frames with people in the shots.  
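
To make that concrete, here is a minimal, hypothetical sketch of that kind of frame filtering in C++, using OpenCV’s stock HOG pedestrian detector. The interview doesn’t specify Mcity’s actual tooling, so the library, detector, and file names here are assumptions for illustration only.

```cpp
// Hypothetical sketch: scan a video file and keep only the frames that
// contain people, using OpenCV's stock HOG pedestrian detector. Mcity's
// actual pipeline isn't described in the interview; this just shows the idea.
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <string>
#include <vector>

int main(int argc, char** argv) {
    if (argc < 2) {
        std::printf("usage: %s <video-file>\n", argv[0]);
        return 1;
    }
    cv::VideoCapture video(argv[1]);
    cv::HOGDescriptor hog;
    hog.setSVMDetector(cv::HOGDescriptor::getDefaultPeopleDetector());

    cv::Mat frame;
    int index = 0, saved = 0;
    while (video.read(frame)) {
        std::vector<cv::Rect> people;
        hog.detectMultiScale(frame, people);   // pedestrian bounding boxes
        if (!people.empty()) {                 // keep frames with people in the shot
            cv::imwrite("person_frame_" + std::to_string(index) + ".jpg", frame);
            ++saved;
        }
        ++index;
    }
    std::printf("scanned %d frames, kept %d with people\n", index, saved);
    return 0;
}
```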

***The project focuses on user-behavior research and data collection, so what data are you collecting, and toward what research goals?***

There is not much data available about human interactions with autonomous vehicles. Generally, the driverless vehicles on the road today look distinctly different from normal cars. Do people trust them? Does consumer trust grow over time? Can the shuttle improve campus transportation? In what situations can the shuttle be used? These are the questions our team and our member companies are interested in.

We have interior cameras mounted on the shuttle to collect data for internal use. Instead of looking at the actual words spoken in the audio, we’re trying to derive emotion from tone or volume. We will also identify whether people wear seatbelts, where they sit, and how they react when they board with someone else.

We also have exterior cameras, and the shuttle is outfitted with different sensors to record things like the state of the doors or the speed of the shuttle. The GPS unit onboard gives us accuracy down to two centimeters, so we know exactly where the vehicle is and what motions it is making.

***Did you notice anything interesting during road testing?***

I’ve seen cases where we pull up to a stop sign and someone waves the shuttle through. They know it is a driverless shuttle, and maybe they don’t trust it yet, so they wave it through, like, “No, you go first.” The fact is, the shuttle follows the rules of the road, so if you pull up to the stop first, you should really go. When we first launched, some people noticed the shuttle because it looks different, but it is an electric vehicle, and it’s quiet. If pedestrians are focused on their phones, they don’t react any differently.

***What Particle products are you using for the project, and how do they work?***

We use the Particle Electron, along with the GPS unit and cellular modem in the Asset Tracker Kit, for location tracking. We added an indicator-light button to it, and it’s installed inside the shuttle. Students, faculty, or staff can check their phones to see where the shuttle is. A service like this is usually set up to be run by a central dispatcher, which is how traditional bus systems work. Since we are a small group with only 10 employees, we don’t have a central dispatcher. We use the Particle device to give control to the conductors, so they can hit the button inside and put the shuttle on its route.
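
Based on Tyler’s description, firmware for this setup might look roughly like the sketch below. It uses the AssetTracker library that ships with Particle’s Asset Tracker Kit; the event names, publish interval, and button pin are illustrative assumptions, not Mcity’s actual code.

```cpp
// Hypothetical sketch of the shuttle tracker: publish GPS fixes from the
// Asset Tracker Kit, and let the conductor press a button to put the
// shuttle "on route". Event names and pin choice are assumptions.
#include "AssetTracker.h"

AssetTracker tracker;
const int ROUTE_BUTTON = D2;          // assumed wiring for the conductor button
unsigned long lastPublish = 0;

void setup() {
    tracker.begin();
    tracker.gpsOn();                  // power up the GPS module
    pinMode(ROUTE_BUTTON, INPUT_PULLUP);
}

void loop() {
    tracker.updateGPS();              // keep parsing data from the GPS

    // Publish a position roughly every 10 seconds once we have a fix;
    // a webhook can forward these to the map riders see on their phones.
    if (tracker.gpsFix() && millis() - lastPublish > 10000) {
        Particle.publish("shuttle/location", tracker.readLatLon(), PRIVATE);
        lastPublish = millis();
    }

    // Conductor presses the button to mark the shuttle as on its route.
    if (digitalRead(ROUTE_BUTTON) == LOW) {
        Particle.publish("shuttle/on-route", "true", PRIVATE);
        delay(500);                   // crude debounce
    }
}
```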

In addition, we use the Particle device to monitor the whole data pipeline and verify that the data collection systems are working. If the data collection systems are down, we are just paying to shuttle passengers around without gaining any additional data points.
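
A pipeline watchdog along those lines could be sketched as follows; the heartbeat-and-acknowledgment scheme and the event names are assumptions made for illustration.

```cpp
// Hypothetical watchdog: the device publishes a heartbeat that the data
// pipeline is expected to answer (e.g. via a webhook response event). If
// no acknowledgment comes back, we know collection is down while the
// shuttle is still running. Event names are assumptions.
unsigned long lastHeartbeat = 0;
unsigned long lastAck = 0;

void pipelineAck(const char* event, const char* data) {
    lastAck = millis();                           // the pipeline answered
}

void setup() {
    Particle.subscribe("pipeline/ack", pipelineAck, MY_DEVICES);
    lastAck = millis();                           // grace period after boot
}

void loop() {
    if (millis() - lastHeartbeat > 60000) {       // heartbeat once a minute
        Particle.publish("pipeline/heartbeat", PRIVATE);
        lastHeartbeat = millis();
    }
    if (millis() - lastAck > 180000) {            // ~3 missed acks -> alert
        Particle.publish("pipeline/alert", "data collection down", PRIVATE);
        lastAck = millis();                       // avoid spamming the alert
    }
}
```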

***Do you use Particle for other projects as well? How has your overall experience been?***

We used the Particle Photon in the Mcity Test Facility. My director picked Particle because, at the time, Particle products were the easiest to connect to the network and control from the cloud. I can remotely push firmware changes to the device. If we find a bug in a device mounted up high, we don’t have to roll a truck to get the board down; I can flash it over the air.

When it came to the driverless shuttle project, I knew Particle would be the easiest way for us to connect our physical device to many external services through webhooks. We couldn’t do the processing we wanted on the Particle board, but we could do it up in the cloud, and Particle already had a cloud interface and webhook infrastructure built in. That’s what we leverage to trigger Amazon Web Services to do the work we need. It makes it really easy for us to click these projects together like Legos.
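
As a hypothetical illustration of that pattern, the firmware side can stay tiny: publish an event and let a webhook configured in the Particle console forward it to an AWS endpoint that does the heavy lifting. The event name and payload below are invented for the example.

```cpp
// Hypothetical example of the webhook pattern Tyler describes: the board
// publishes a small event, and a Particle webhook (configured in the
// console) POSTs it onward, e.g. to an AWS API Gateway URL that triggers
// a Lambda function. Event name and payload are invented.
void notifyDoorChange(bool open) {
    // Keep the on-device payload tiny; the cloud side enriches and stores it.
    String payload = String::format("{\"door\":\"%s\",\"ts\":%lu}",
                                    open ? "open" : "closed", millis());
    Particle.publish("shuttle/door", payload, PRIVATE);
}
```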

The new Particle Mesh devices are also of interest to me. We have Wi-Fi available at most of our facility, but in the areas that don’t, rather than installing Wi-Fi, we can go with an affordable mesh device and have it communicate with everything else out there. That is something I would like to explore as well.
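
A minimal sketch of that idea, using the Mesh.publish()/Mesh.subscribe() API that shipped with Particle’s mesh devices, might look like this; the event name and sensor are placeholders.

```cpp
// Hypothetical sketch: a mesh node in a Wi-Fi dead zone publishes sensor
// readings over the local mesh network only. A gateway node elsewhere
// would use Mesh.subscribe() on the same event name and republish to the
// cloud with Particle.publish(). Event name and sensor are placeholders.
void setup() {
}

void loop() {
    int reading = analogRead(A0);                  // some local sensor
    // Mesh.publish stays on the local mesh; no Wi-Fi needed on this node.
    Mesh.publish("facility/sensor", String(reading).c_str());
    delay(30000);                                  // one reading every 30 s
}
```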

***I heard Mcity is planning a new rail crossing project. Can you tell me more about it?***

We have two rail crossings at the Mcity Test Facility, just like what you would see on the road, with flashing lights and gate arms. The actual control systems for those arms are quite expensive and complicated. We don’t have a real train running through the test facility, so we broadcast the train’s approach as a message to connected vehicles. With a Particle device, I can control the lights, lower the gates, and sound the chimes to replicate a real-world rail crossing within the facility.
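
A hypothetical controller for such a crossing could expose a single cloud function, so a connected-vehicle broadcast (or a test script) can trigger the whole sequence remotely. The pin assignments and function name below are invented for illustration.

```cpp
// Hypothetical rail-crossing controller: one cloud function flashes the
// lights, lowers the gates, and sounds the chimes when a simulated
// "train" message is broadcast. Pins and names are invented.
const int LIGHTS = D0;
const int GATE   = D1;   // e.g. drives a relay or motor controller
const int CHIME  = D2;

int activateCrossing(String command) {
    bool on = (command == "on");
    digitalWrite(LIGHTS, on ? HIGH : LOW);
    digitalWrite(CHIME,  on ? HIGH : LOW);
    digitalWrite(GATE,   on ? HIGH : LOW);   // lower or raise the arms
    return on ? 1 : 0;
}

void setup() {
    pinMode(LIGHTS, OUTPUT);
    pinMode(GATE,   OUTPUT);
    pinMode(CHIME,  OUTPUT);
    // Exposed to the cloud so the crossing can be triggered remotely.
    Particle.function("crossing", activateCrossing);
}

void loop() {
}
```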

***What do you see as the future of driverless vehicles? Who do you think will benefit the most?***

I think it is a long way off, but the idea of using driverless vehicles to reduce congestion in crowded cities is fascinating. Uber and Lyft are essentially fleet management services, but there is still a driver. What if the only thing you need is the fleet, and you are not reliant on drivers being there to take you to your destination? That would change your whole experience of mobility.

We are not perfect as humans when we drive. I might make mistakes that I don’t even realize, but when cars communicate with each other, they can put an end to some of those mistakes. Some advanced driver-assistance systems can tell you when you’re going too fast or haven’t hit the brakes, but there are still gaps, for example, blind corners. Connected vehicles can pick up information and stop you when your view is obscured. If we advance these technologies to where they can prevent accidents and save more lives on the road, that would be a huge win for everyone.

***What are some challenges you are facing? And what’s next in 2019?***

Our plan for next year is to research how to simulate different weather conditions at our test facility. It is easy to stage a rain scene, but how do you generate actual rain, other than waiting for it to fall? If we can develop reproducible weather, I can see Particle products helping make that happen.

We will also need to research how to better utilize sensors in those situations, because they are quite expensive. People say it would be nice to have an autonomous car for $30,000, but sometimes a single sensor costs that much. So we need to figure out what combination of sensors, at what cost, gets us the best performance and the best driving features.