Makerspaces & Pedagogy: How (and Why) to Integrate the Makerspace into Your Courses

Interested in adding 3D printing and other makerspace tools to your courses, but not sure how? Below is a presentation delivered to Lawrence University faculty about some of the whys and hows of using the LU makerspace with coursework.

View the full presentation (with notes) in Google Slides

Here’s a general outline of the presentation:

  • What is a makerspace and what’s in our makerspace?
  • Why makerspaces?
    • Hands-On, Kinesthetic, Active Learning
    • Problem Solving Process
    • Differentiation of Learning
    • Prepare for Work
    • Wellness
    • Engaged Learning at Lawrence University
  • Challenges of educational makerspaces
  • Examples of uses from projects at LU and elsewhere by discipline/general subject area
    • Studio Art
    • Art History
    • Theatre Arts
    • Film Studies
    • Math & Computer Science
    • Music
    • Humanities
    • Anthropology
    • Psychology & Neuroscience
    • Sciences
    • Innovation & Entrepreneurship
  • Things made by students outside of classwork
  • Things made by student organizations and campus departments
  • Where to find this stuff?
    • 3D print search engines & general repositories
    • Lesson plans
  • Designing
  • How to go about adding this stuff to your classes
  • Discussion

Since we presented this, we’ve also worked on a couple more ways to help faculty add the makerspace tools and equipment to their courses and research:

  • Makerspace Assignment Request Form: When faculty tell us about their intended learning outcomes and the equipment they’d like to use, we can do some research and set up a time to meet to discuss assignment ideas.
  • Faculty 3D Printing Request Form: We’re happy to print objects that faculty may need for their teaching or scholarly/creative work. While faculty are welcome to come over and do their own printing, we know that sometimes this isn’t possible.
One slide from the presentation. Image links to the Google Slide of the full presentation.

More Virtual Reality at Lawrence University!

“One of the reasons I did that project was not only to explore my interest in it but also to give Lawrence the chance to pioneer in an art medium and form that not many schools are doing yet.”
Christopher Gore-Gammon ’17, on creating with virtual reality

There have been lots of exciting new projects and class assignments happening with VR (virtual reality) at Lawrence University! Take a look at the recent Lawrence University News post, “Use of VR tech now reality in classrooms; FaCE grant to ramp up pace” to learn all about them!

Sensors for Self-Driving Cars

By Wenchao Liu

Just as humans have various senses, self-driving cars use a variety of sensors. Roughly speaking, they have six types: radars, lidars, cameras, IMUs, GPS, and ultrasonic sensors. They differ in range, cost, and many other ways, but I will try to cover the basics in this article.

Before we learn anything about specific sensors, it’s good to be able to categorize them. For instance, you can categorize sensors as passive or active, depending on whether they actively send out signals or just passively receive what’s already in the environment. You can also categorize them by the type of signal they work with: cameras and lidars work with light, while ultrasonic sensors work with sound and radars with radio waves. It also helps to have some physics knowledge, so you can understand the pros and cons of each sensor.
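The two-way categorization above (active vs. passive, and by signal type) can be sketched as a small lookup table. This is just an illustrative sketch; the “signal” labels are my own shorthand, and treating GPS and IMUs as passive is a simplification:

```python
# The six sensor types from the article, categorized along two axes:
# whether the sensor is active (emits its own signal) or passive,
# and what kind of signal it works with. Labels are illustrative.
SENSORS = {
    "radar":      {"active": True,  "signal": "radio waves"},
    "lidar":      {"active": True,  "signal": "light"},
    "camera":     {"active": False, "signal": "light"},
    "ultrasonic": {"active": True,  "signal": "sound"},
    "IMU":        {"active": False, "signal": "inertia"},
    "GPS":        {"active": False, "signal": "radio waves"},
}

def active_sensors():
    """Return the names of sensors that emit their own signal."""
    return sorted(name for name, info in SENSORS.items() if info["active"])

print(active_sensors())  # ['lidar', 'radar', 'ultrasonic']
```
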

Let’s talk about the simplest sensor first: ultrasonic sensors. As the name implies, they emit and receive ultrasound to estimate distance. It’s called ultrasound because its frequency is above 20,000 Hz, the upper limit of what humans can hear. As a result, the sensor isn’t constantly making audible noise, because we can’t hear it! Another good thing about ultrasound is that it diffracts less, which I won’t get into. If you want to learn more about sound, take PHYS 107: Physics of Music!
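The distance estimate itself is simple time-of-flight math: the sensor emits a pulse, times how long the echo takes to return, and halves the round-trip distance. A minimal sketch (343 m/s is the speed of sound in room-temperature air; the function name is my own):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def echo_to_distance(echo_time_s):
    """Convert a round-trip echo time (seconds) to a one-way distance in meters.

    The pulse travels to the obstacle and back, so we halve the total
    distance covered during the echo time.
    """
    return SPEED_OF_SOUND * echo_time_s / 2

# An echo that returns after 10 ms means the obstacle is about 1.715 m away.
print(echo_to_distance(0.010))
```
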

A more complex sensor, although a more common one, is the camera. Essentially, it uses photosensitive devices to capture light. I won’t get into the physics of those devices, because it’s just too complex. However, I will say that as a sensor, cameras give us a lot of information. A lot of information is not necessarily always good, because sometimes we just don’t know how to make sense of it. Computer vision is the field where people are trying to make sense of that information, and to them, I say good luck!

Now it’s time to present the most exciting sensor of all: lidars! I am biased, but out of all the sensors, lidars are the best. Lidars are a fairly new invention, although the same technology has been used in barcode scanners and light shows. They work like ultrasonic sensors, but they use light rather than sound. As a result, you can get more frequent measurements. There’s a wonderful video about them from SparkFun. Watch it, because it will tell you more than what I can write in a thousand words! Another good video is from Velodyne, the leading lidar supplier for self-driving cars. (Again, lidars are the best.)
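A planar lidar typically reports a list of range readings taken at evenly spaced angles as it sweeps. Turning those readings into Cartesian points is just trigonometry; here is a sketch under that assumption (the function and parameter names are mine, not from any particular lidar driver):

```python
import math

def scan_to_points(ranges, angle_min, angle_increment):
    """Convert a planar lidar scan into Cartesian (x, y) points.

    ranges: list of distance readings in meters
    angle_min: angle (radians) of the first reading
    angle_increment: angular spacing (radians) between readings
    """
    points = []
    for i, r in enumerate(ranges):
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Three readings, 90 degrees apart, starting straight ahead:
# 1 m ahead, 2 m to the left, 1 m behind.
pts = scan_to_points([1.0, 2.0, 1.0], 0.0, math.pi / 2)
```
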

For the rest of the sensors, I will just mention them briefly, because I either don’t know much about them or they are hard to write about. Radars have been used in the automotive industry for a long time; they are cheaper than lidars but don’t give you as much detailed information. Then there’s the IMU, which stands for Inertial Measurement Unit; it uses accelerometers and gyroscopes, and sometimes magnetometers. Last but not least, we have all used GPS, as it’s embedded in most phones.

There you have it: the six different sensors used in self-driving cars. Be aware that each sensor is a sophisticated piece of technology beyond the scope of this article. Just to illustrate how sophisticated a sensor can get: rockets need very precise IMUs to go to space, and some self-driving car companies use the same type of IMU! How incredible!

Build Your Own Self-Driving Car

By Wenchao Liu

Well, I mean a self-driving “RC” car, not a real car. However, if you are as good as George Hotz, who made a real car drive itself, please give it a try. When I was a junior, I knew I wasn’t George Hotz, so I decided to build a self-driving RC car. Well, a wall-following RC car, at least.
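Wall-following is conceptually simple: measure the distance to the wall with a side-facing sensor and steer in proportion to how far off you are. Here is a minimal proportional-control sketch, with a made-up gain, setpoint, and steering convention (this is an illustration of the idea, not the code from my project):

```python
def wall_follow_steering(side_distance_m, target_m=0.5, gain=2.0):
    """Proportional steering command for following a wall on the right.

    Positive output steers left (away from the wall), negative steers
    right (toward it). The output is clamped to [-1, 1], the normalized
    steering range assumed here.
    """
    error = target_m - side_distance_m  # too close -> positive error
    command = gain * error
    return max(-1.0, min(1.0, command))

print(wall_follow_steering(0.3))  # too close: steer away (about 0.4)
print(wall_follow_steering(0.5))  # on target: go straight (0.0)
```

In practice you would call this in a loop, feeding it fresh sensor readings and sending the result to the steering servo; a real controller usually adds derivative or integral terms to avoid oscillating along the wall.
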

The first step was to find out what was already on the Internet. If you do a quick search, you will find a lot of different resources. When I searched “self-driving RC car,” the first result was a self-driving RC car that uses one camera and one ultrasonic sensor. Another popular one is the Donkey Car, which is bigger and has more instructions. They actually assembled it live in Denver during the Autonomous Vehicle Competition in 2017, which I was also part of. (Well, why didn’t I get some camera time?) The one I chose, however, was from the University of Pennsylvania, because it has the most detailed instructions, uses the biggest car, and has the most powerful computing platform. In addition, JetsonHacks, a blog dedicated to the NVIDIA Jetson platform, has a lot of good resources for it as well.

I didn’t know it back then, but as I learned more about the robotics industry, I realized I had made a good choice. The Raspberry Pi is cheap, but it comes with serious computation constraints; as a result, you can’t really run a standard Ubuntu operating system on it. The NVIDIA Jetson platform, however, can be almost as small as a Pi and comes with Ubuntu pre-installed. In addition, the project uses ROS, the Robot Operating System, which is actually used by robotics companies, including those working on self-driving cars, such as Cruise Automation and Baidu.

After I picked the project, I immediately applied for funding. As it was my senior experience project, I was able to get funding through Lawrence University. Without the funding, I wouldn’t have been able to buy the parts for the project. If I were just some guy working in my garage, I’d probably have picked the cheapest option, but since Lawrence could pay for it, why not get the best parts? If you don’t have the money and still want to build something with more than a Pi and a camera, check out HyphaROS and Linorobot.

After I had the parts, I buried myself in the project. No matter what project you are working on, you will encounter problems, and problem-solving will be time-consuming. I will skip over all the pain I went through, because Angela knew it all: I worked in the library and complained to her all the time. That’s something important as well: make sure you have someone to complain to!

As self-driving cars get more and more attention, more universities are teaching courses about the technology with RC cars. I recently discovered two useful websites, one from UC Berkeley and one from UC San Diego.