
Three Approaches to Making Self-Driving RC Cars

By Wenchao Liu

There are numerous technologies used in a real self-driving car. When it comes to self-driving RC cars, however, people normally use only a subset of them. Each approach relies on different sensors and different algorithms. Here, I will go through three popular ones.

The simplest approach involves no sensors whatsoever. How is that possible? Well, it's possible if you can manually drive through the course once, record the steering and throttle inputs, and replay them from the same starting point. The drawback of this approach is drift: the car deviates more and more from the original trajectory the farther it goes. That said, this simple approach can tackle any autonomous RC car challenge that satisfies a few conditions. First, you have to have access to the course before the race. Second, the course does not change. Finally, you can place the car exactly where you originally put it when you recorded the data. It also helps if the rules only allow one car per race, so no other car can bump into yours.
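
To make this concrete, here's a minimal sketch of the record-and-replay idea in Python. The rc_interface module and its read_inputs/set_outputs helpers are made up for illustration; your actual hardware interface (PWM pins, a serial link to the receiver, etc.) will look different.

```python
import json
import time

# Hypothetical interface to the RC car's receiver and servos; swap in
# whatever your hardware actually exposes (PWM pins, a serial link, etc.).
from rc_interface import read_inputs, set_outputs  # assumed helpers

RATE_HZ = 50  # how often we sample and replay commands


def record(filename, duration_s):
    """Sample the driver's steering/throttle at a fixed rate and save them."""
    log = []
    start = time.time()
    while time.time() - start < duration_s:
        steering, throttle = read_inputs()  # e.g. normalized to [-1, 1]
        log.append({"t": time.time() - start,
                    "steering": steering,
                    "throttle": throttle})
        time.sleep(1.0 / RATE_HZ)
    with open(filename, "w") as f:
        json.dump(log, f)


def replay(filename):
    """Play the recorded commands back from the same starting pose."""
    with open(filename) as f:
        log = json.load(f)
    start = time.time()
    for entry in log:
        # Wait until the original timestamp of this command comes up.
        while time.time() - start < entry["t"]:
            time.sleep(0.001)
        set_outputs(entry["steering"], entry["throttle"])
    set_outputs(0.0, 0.0)  # stop at the end


if __name__ == "__main__":
    record("lap.json", duration_s=30)  # drive one manual lap...
    replay("lap.json")                 # ...then place the car back and replay
```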

The second approach involves a camera and a neural network. The flagship product of this approach is the Donkey Car, which uses only one camera and one Raspberry Pi. You first have to drive through the course a couple of times to collect training data for the neural network. Because of the computational constraints of the Pi, you have to upload the data to a more powerful computer, train the neural network there, and transfer the trained model back to the Pi. I have no personal experience with this approach, because cameras, computer vision, and neural networks are too much for me! That said, I know for a fact that this approach doesn't work in total darkness and might not work well if the lighting changes a lot.
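
For a rough idea of what the neural network part looks like, here's a small Keras sketch in the same spirit (this is not the actual Donkey Car code): a tiny CNN that maps camera frames to steering and throttle, trained on a desktop and then copied to the Pi. The image size, layer sizes, and file names are just assumptions.

```python
import numpy as np
from tensorflow.keras import layers, models

# A small CNN that maps 120x160 RGB camera frames to [steering, throttle].
def build_model():
    inputs = layers.Input(shape=(120, 160, 3))
    x = layers.Conv2D(24, 5, strides=2, activation="relu")(inputs)
    x = layers.Conv2D(32, 5, strides=2, activation="relu")(x)
    x = layers.Conv2D(64, 3, strides=2, activation="relu")(x)
    x = layers.Flatten()(x)
    x = layers.Dense(100, activation="relu")(x)
    x = layers.Dense(50, activation="relu")(x)
    outputs = layers.Dense(2)(x)  # [steering, throttle]
    return models.Model(inputs, outputs)

model = build_model()
model.compile(optimizer="adam", loss="mse")

# images: frames recorded while driving manually; labels: the matching
# steering/throttle inputs. Random placeholders stand in for real data here.
images = np.random.rand(100, 120, 160, 3).astype("float32")
labels = np.random.rand(100, 2).astype("float32")

# Train on a desktop or laptop, then copy the saved model to the Raspberry Pi.
model.fit(images, labels, epochs=5, batch_size=16)
model.save("pilot.h5")
```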

The third approach is lidar-based, and it is my favorite. The pipeline is to run SLAM, collect waypoints by manually driving through the course, and then use motion planning and trajectory tracking. SLAM stands for simultaneous localization and mapping, which means the car localizes itself and maps the environment at the same time. Once the car has a map and knows where it is, you can manually drive it around and collect waypoints. Once you have the waypoints the car should hit, you use motion planning to plan a trajectory through them and trajectory tracking to make the car follow that trajectory. This approach is the most powerful, because it can handle dynamic environments. For instance, if a car stops in front of you, your motion planner will produce another path to go around it.
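
Trajectory tracking can be done in many ways; pure pursuit is one common choice, so here's a minimal sketch of it. The wheelbase and lookahead distance are made-up numbers, and a real implementation would also handle waypoint indexing and speed control.

```python
import math

# Minimal pure-pursuit sketch: given the car's pose and a list of waypoints,
# pick a lookahead point and return a steering angle that curves toward it.

WHEELBASE = 0.3   # meters, roughly a 1/10-scale RC car (assumed)
LOOKAHEAD = 1.0   # meters (assumed)


def pure_pursuit(x, y, yaw, waypoints):
    # Find the first waypoint at least LOOKAHEAD away from the car.
    target = None
    for wx, wy in waypoints:
        if math.hypot(wx - x, wy - y) >= LOOKAHEAD:
            target = (wx, wy)
            break
    if target is None:  # past the last waypoint
        return 0.0

    # Transform the target point into the car's frame.
    dx = target[0] - x
    dy = target[1] - y
    local_y = -math.sin(yaw) * dx + math.cos(yaw) * dy

    # Pure pursuit: curvature = 2*y / L^2, then bicycle-model steering angle.
    curvature = 2.0 * local_y / (LOOKAHEAD ** 2)
    return math.atan(WHEELBASE * curvature)


# Example: follow a straight line of waypoints from a slightly offset pose.
waypoints = [(i * 0.5, 0.0) for i in range(20)]
steer = pure_pursuit(x=0.0, y=0.2, yaw=0.0, waypoints=waypoints)
print(f"steering angle: {steer:.3f} rad")  # small negative angle, back toward the line
```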

There you have it: three approaches for making an RC car drive itself around a course. Real self-driving cars are definitely more sophisticated, but some of the ideas are very similar. For instance, Tesla's approach relies mostly on cameras and no lidars, while other companies such as Waymo and GM Cruise use both cameras and lidars. Only time will tell which one will prevail!

Sensors for Self-Driving Cars

By Wenchao Liu

Just like how humans have various senses, self-driving cars use various sensors as well. Roughly speaking, they have six types of sensors: radars, lidars, cameras, IMUs, GPS, and ultrasonic sensors. They differ in range, cost, and many other things, but I will try to cover the basics in this article.

Before we learn anything about specific sensors, it's good to be able to categorize them. For instance, you can categorize sensors as passive or active, depending on whether they actively send out signals or just passively receive what's in the environment. You can also categorize them by the type of signal they work with: cameras and lidars work with light, ultrasonic sensors work with sound, and radars work with radio waves. In addition, it helps to have some physics knowledge, so you can understand the pros and cons of each sensor.

Let's talk about the simplest sensor first: the ultrasonic sensor. As the name implies, it emits and receives ultrasound to estimate distance. It's called ultrasound because its frequency is above 20,000 Hz, the upper limit of the sound frequency that humans can hear. As a result, the sensor isn't constantly making audible noise, because we can't hear it! Another good thing about ultrasound is that it diffracts less, which I won't get into. If you want to learn more about sound, take PHYS 107: Physics of Music!
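
The math behind ultrasonic ranging is just time of flight: the sensor measures how long the echo takes to come back, and the distance is the speed of sound times that time, divided by two (because the pulse travels out and back).

```python
# Ultrasonic ranging boils down to time of flight:
# distance = speed_of_sound * round_trip_time / 2

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 C


def distance_from_echo(round_trip_s):
    """Convert a round-trip echo time (seconds) to a one-way distance (meters)."""
    return SPEED_OF_SOUND * round_trip_s / 2.0


# Example: an echo that returns after 5.8 ms corresponds to roughly 1 meter.
print(distance_from_echo(0.0058))  # ~0.99 m
```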

A more complex, though more common, sensor is the camera. Essentially, it uses photosensitive devices to capture light. I won't get into the physics of those devices, because it's just too complex. However, I will say that as a sensor, cameras give us a lot of information. A lot of information is not necessarily a good thing, because sometimes we just don't know how to make sense of it. Computer vision is the field where people try to make sense of that information, and to them I say good luck!
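
To get a feel for how much data "a lot of information" really is, here's a tiny OpenCV snippet that grabs one frame from a webcam and counts the numbers in it. The 640x480 figure in the comment is just a typical webcam resolution, not anything specific to self-driving cameras.

```python
import cv2  # OpenCV; install with `pip install opencv-python`

# Grab one frame from a webcam just to see how much raw data a camera produces.
cap = cv2.VideoCapture(0)  # device 0; adjust for your setup
ok, frame = cap.read()
cap.release()

if ok:
    h, w, channels = frame.shape  # e.g. 480 x 640 x 3 (BGR)
    print(f"{w}x{h} frame, {channels} channels -> {frame.size} values")
    # At 640x480x3 that is ~921,600 numbers per frame; at 30 fps, roughly
    # 27 million numbers per second that computer vision code has to digest.
else:
    print("No camera found")
```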

Now it's time to present the most exciting sensor of all: the lidar! I am biased, but out of all the sensors, lidars are the best. Lidar is a fairly recent addition to cars, although the same technology has been used in barcode scanners and light shows. Lidars work like ultrasonic sensors, but they use light rather than sound. As a result, you can get much more frequent measurements. There's a wonderful video about them from SparkFun. Watch it, because it will tell you more than I can in a thousand words! Another good video is from Velodyne, the leading lidar supplier for self-driving cars. (Again, lidars are the best.)
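
If you're curious what lidar data actually looks like to software: a 2D scan is just a list of ranges at known angles, and converting them into x/y points in the car's frame is the first step toward building a map. Here's a small sketch; the angular resolution and maximum range are made-up values.

```python
import math

# Convert a 2D lidar scan (ranges at evenly spaced angles) into x/y points.
def scan_to_points(ranges, angle_min, angle_increment, max_range=10.0):
    points = []
    for i, r in enumerate(ranges):
        if 0.0 < r < max_range:  # drop invalid or out-of-range returns
            angle = angle_min + i * angle_increment
            points.append((r * math.cos(angle), r * math.sin(angle)))
    return points


# Example: a fake 1-degree-resolution scan of a flat wall 2 m ahead of the car.
fake_ranges = [2.0 / math.cos(math.radians(a)) for a in range(-45, 46)]
points = scan_to_points(fake_ranges,
                        angle_min=math.radians(-45),
                        angle_increment=math.radians(1))
print(points[:3])  # every point sits at x = 2.0, as expected for a wall
```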

As for the rest of the sensors, I will just mention them briefly, because I either don't know much about them or they are hard to write about. We have radars, which have been used in the automotive industry for a long time; they are cheaper than lidars but don't give you as much detailed information. Then there's the IMU, which stands for inertial measurement unit; it uses accelerometers and gyroscopes, and sometimes magnetometers. Last but not least, we have all used GPS, as it's embedded in most phones.

There you have it: the six different sensors used in self-driving cars. Be aware that each sensor is a sophisticated piece of technology well beyond the scope of this article. Just to illustrate how sophisticated a sensor can get: rockets need very precise IMUs to go to space, and some self-driving car companies use the same type of IMU. How incredible!

Build Your Own Self-Driving Car

By Wenchao Liu

Well, I meant a self-driving "RC" car, not a real car. However, if you are as good as George Hotz, who made a real car drive itself, please give it a try. When I was a junior, I knew I wasn't George Hotz, so I decided to build a self-driving RC car. Well, a wall-following RC car, at least.

The first step was to find out what was already on the Internet. If you do a quick search, you will find a lot of different resources. When I searched "self-driving RC car," the first result was a self-driving RC car that uses one camera and one ultrasonic sensor. Another popular one is the Donkey Car, which is bigger and has more instructions. They actually assembled one live in Denver during the Autonomous Vehicle Competition in 2017, which I was also part of. Well, why didn't I get some camera time? The one I chose, however, was from f1tenth.org at the University of Pennsylvania, because it has the most detailed instructions, uses the biggest car, and has the most powerful computing platform. In addition, JetsonHacks, a blog dedicated to the NVIDIA Jetson platform, has a lot of good resources for it as well.

I didn't know it back then, but as I learned more about the robotics industry, I realized I had made a good choice. The Raspberry Pi is cheap, but it comes with serious computational constraints; as a result, you can't really run a standard Ubuntu operating system on it. The NVIDIA Jetson platform, however, can be almost as small as a Pi and comes with Ubuntu pre-installed. In addition, f1tenth.org uses ROS, the Robot Operating System, which is actually used by robotics companies, including those working on self-driving cars, such as Cruise Automation and Baidu.
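
To give you a taste of what ROS code looks like, here's a bare-bones wall-following node in the spirit of what I built. The topic names, message types, gains, and index math are assumptions; your car's setup will differ.

```python
#!/usr/bin/env python
# A bare-bones ROS wall-following node (a sketch, not the f1tenth reference code).
import rospy
from sensor_msgs.msg import LaserScan
from ackermann_msgs.msg import AckermannDriveStamped

DESIRED_DIST = 0.8  # desired gap to the right wall, meters (assumed)
KP = 1.0            # proportional gain (assumed)
SPEED = 1.0         # m/s (assumed)


def scan_callback(scan):
    # Take the range straight out to the car's right (-90 degrees). This assumes
    # the scan covers that angle; the index math depends on your lidar's angle_min.
    right_index = int((-1.5708 - scan.angle_min) / scan.angle_increment)
    right_dist = scan.ranges[right_index]

    drive = AckermannDriveStamped()
    drive.header.stamp = rospy.Time.now()
    # Simple P controller: too far from the wall -> negative angle steers right
    # (ROS convention: positive steering angle turns left).
    drive.drive.steering_angle = KP * (DESIRED_DIST - right_dist)
    drive.drive.speed = SPEED
    drive_pub.publish(drive)


if __name__ == "__main__":
    rospy.init_node("wall_follower")
    drive_pub = rospy.Publisher("/drive", AckermannDriveStamped, queue_size=1)
    rospy.Subscriber("/scan", LaserScan, scan_callback)
    rospy.spin()
```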

After I picked the project, I immediately applied for funding. As it was my senior experience project, I was able to get funding through Lawrence University. Without the funding, I wouldn't have been able to buy the parts for the project. If I were just some guy working in my garage, I'd probably have picked the cheapest option. But since Lawrence could pay for it, why not get the best parts? If you don't have the money and still want to build something with more than a Pi and a camera, check out HyphaROS and Linorobot.

After I had the parts, I buried myself in the project. No matter what project you are working on, you will encounter problems, and solving them will be time-consuming. I will skip over all the pain I went through, because Angela knew it all; I worked in the library and complained to her all the time. That's something important as well: make sure you have someone to complain to!

As self-driving cars get more and more attention, more universities are teaching courses about the technology with RC cars. I recently discovered two useful websites, one from UC Berkeley and one from UC San Diego.

Robotics Competitions

By Wenchao Liu

In 2004, an agency in the Defense Department decided to sponsor a competition in which self-driving cars would race each other across a desert. The Defense Advanced Research Projects Agency, or DARPA for short, was interested in the technology because it wanted to put it in military vehicles. Many institutions participated in the competition, and none completed the course. In 2005, DARPA sponsored the competition again, and this time more teams completed the course. In 2006, DARPA took a break, then came back in 2007 with teams competing in an urban environment. This series of events eventually jump-started the self-driving car industry, and many of the participants are still working on the technology today.

Fast forward a decade to 2017: I was a happy college student at Lawrence with a newly built wall-following RC car. During the course of my project, I often bought electronics from SparkFun. One day I noticed that they were hosting an Autonomous Vehicle Competition, AVC for short. I decided to enter, and did not do well. Well, I did so badly that I didn't even participate, because I knew my car wouldn't go far. Some participants' cars didn't even spin at the start line, and I wondered whether they had anticipated that. If they did, did they just want to show everyone that they had a car?

In the summer of 2018, I was working at an internship and took a day off to participate in the AVC again. It was their 10th year, and I definitely won the participation award. Well, I didn't even win the participation award, because, again, I didn't participate. Many teams, again, just wanted to show that they had a car, even though it wasn't spinning. I saw some new faces and some familiar faces, and I told myself that I would keep coming back.

The DARPA challenges gave birth to the self-driving car industry, and the AVC inspired me to keep working on my RC car. One of the reasons those competitions are so much fun for me is that you get to know people. There are software competitions too, but they don't require you to be physically present anywhere. Robotics competitions do!

There are many regional and national robotics competitions. If you want to find out what is happening in your area, just search the Internet, especially Meetup. Those events mostly happen weekly or monthly; for instance, there's a monthly robot RC car competition in Oakland. There are also national ones that happen annually. I mentioned the AVC earlier, which is in Denver, and the University of Pennsylvania sponsors a similar one. If you want a wider variety of competitions, there is the National Robotics Challenge in Ohio. Whatever robot you are building, you should definitely try to find a competition, because you will meet interesting people and win at least the participation award!