Category Archives: Assignments

Making sound with film

By Kelvin Maestre

Laser cut film

This winter, I took a course in Artisanal Animation. For my final, I was tasked with making an animation using any of the mediums we had studied. I was personally drawn to direct-on-film animation. It wasn’t the images that I was after, but the sound. The biggest inspiration for the project was Norman McLaren, an animator for the National Film Board of Canada who specialized in direct-on-film animation. One of his most impressive feats in the medium was creating his own hand-drawn sound. I had made previous attempts to emulate McLaren’s process with little success. This time around, I decided to tackle the project in my own way. My initial thought was to use the laser cutter in the Makerspace, but the test ended in nothing but burnt film.

I had to consider another option. The only other machine I could use was the Silhouette Cameo. I was hesitant at first because I had never used the machine extensively. To my surprise, it was very easy to use, which made my overall process faster. Now that I had the tools to etch the film, it was time for the sound.

Sadly, you cannot just plop an audio file into the Silhouette Cameo’s software. The Cameo works best with vector-based graphics, so to make our sound cuttable, we first need to turn it into an image file. A quick Google search for “Sound to waveform graphic” yielded a website that does just that (link will be below). Once I had an image of my sound file, I ported it into the Cameo’s software, resized the audio image, lined it up, and hit send.
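If you’re curious what a converter like that does under the hood, here’s a rough sketch of the idea in Python: it takes a list of audio samples (normalized to -1..1) and traces them as an SVG polyline, which is exactly the kind of vector shape a cutter’s software can work with. The function name and parameters are my own; the website I used likely does something more sophisticated.

```python
import math

def waveform_to_svg(samples, width=960, height=100):
    """Render a list of audio samples (-1.0..1.0) as an SVG polyline."""
    mid = height / 2
    step = max(1, len(samples) // width)  # downsample to one point per pixel column
    points = []
    for x in range(min(width, len(samples))):
        s = samples[x * step]
        # Flip the sign so positive samples draw upward, as on a waveform display
        points.append(f"{x},{mid - s * mid:.1f}")
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{width}" height="{height}">'
            f'<polyline fill="none" stroke="black" points="{" ".join(points)}"/>'
            f'</svg>')

# Example: one second of a 440 Hz tone at a 44,100 Hz sample rate
tone = [math.sin(2 * math.pi * 440 * t / 44100) for t in range(44100)]
svg = waveform_to_svg(tone)
```

The resulting SVG can be saved to a file and imported into vector-friendly cutting software.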

One of the images from my stop motion animation

I wanted to see how far I could push the technology, so I tried to etch a stop-motion animation I had made of my hand. The animation was shot in black and white (easier for the software to recognize); each image was then combined into rows of 24, plopped into the Cameo software, and cut. The result was imagery that didn’t reflect the source it came from. The machine had added a layer of abstraction (you can see the result at the end!)
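Combining frames into rows of 24 like this can be scripted. Here’s a hypothetical sketch using the Pillow imaging library (the function name and frame dimensions are my own, not the exact tool I used):

```python
from PIL import Image

FRAMES_PER_ROW = 24  # 24 frames = one second of film

def tile_frames(frames, frame_w=40, frame_h=30):
    """Combine equally sized black-and-white frames into a sheet,
    24 frames per row, reading left to right, top to bottom."""
    rows = -(-len(frames) // FRAMES_PER_ROW)  # ceiling division
    sheet = Image.new("1", (FRAMES_PER_ROW * frame_w, rows * frame_h), 1)
    for i, frame in enumerate(frames):
        col, row = i % FRAMES_PER_ROW, i // FRAMES_PER_ROW
        sheet.paste(frame.resize((frame_w, frame_h)),
                    (col * frame_w, row * frame_h))
    return sheet
```

The “1” image mode keeps everything pure black and white, which matches why high-contrast frames are easier for the cutter software to recognize.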

Here is a link to a video of the film being run through a projector!

Below is a list of links I used to do this project, including a link to an in-depth guide on how to do this yourself:

Makerspaces & Pedagogy: How (and Why) to Integrate the Makerspace into Your Courses

Interested in adding 3D printing and other makerspace tools to your courses, but not sure how? Below is a presentation delivered to Lawrence University faculty about some of the whys and hows of using the LU makerspace with coursework.

View the full presentation (with notes) in Google Slides

Here’s a general outline of the presentation:

  • What is a makerspace and what’s in our makerspace?
  • Why makerspaces?
    • Hands-On, Kinesthetic, Active Learning
    • Problem Solving Process
    • Differentiation of Learning
    • Prepare for Work
    • Wellness
    • Engaged Learning at Lawrence University
  • Challenges of educational makerspaces
  • Examples of uses from projects at LU and elsewhere by discipline/general subject area
    • Studio Art
    • Art History
    • Theatre Arts
    • Film Studies
    • Math & Computer Science
    • Music
    • Humanities
    • Anthropology
    • Psychology & Neuroscience
    • Sciences
    • Innovation & Entrepreneurship
  • Things made by students outside of classwork
  • Things made by student organizations and campus departments
  • Where to find this stuff?
    • 3D print search engines & general repositories
    • Lesson plans
  • Designing
  • How to go about adding this stuff to your classes
  • Discussion

Since we presented this, we’ve also worked on a couple more ways to help faculty add the makerspace tools and equipment to their courses and research:

  • Makerspace Assignment Request Form: By letting us know about the intended learning outcomes and equipment they’d like to use, we can do some research and set up a time to meet to discuss assignment ideas.
  • Faculty 3D Printing Request Form: We’re happy to print objects that faculty may need for their teaching or scholarly/creative work. While faculty are welcome to come over and do their own printing, we know that sometimes this isn’t possible.
One slide from the presentation. Image links to the Google Slide of the full presentation.

More Virtual Reality at Lawrence University!

“One of the reasons I did that project was not only to explore my interest in it but also to give Lawrence the chance to pioneer in an art medium and form that not many schools are doing yet.”
Christopher Gore-Gammon ’17, on creating with virtual reality

There have been lots of exciting new projects and class assignments happening with VR (virtual reality) at Lawrence University! Take a look at the recent Lawrence University News post, “Use of VR tech now reality in classrooms; FaCE grant to ramp up pace” to learn all about them!

Sensors for Self-Driving Cars

By Wenchao Liu

Just as humans have various senses, self-driving cars use various sensors. Roughly speaking, they have six types: radars, lidars, cameras, IMUs, GPS, and ultrasonic sensors. They differ in range, cost, and many other things, but I will try to cover the basics in this article.

Before we learn anything about sensors, it’s good to be able to categorize them. For instance, you can categorize sensors as passive or active, depending on whether they actively send out signals or just passively receive what’s in the environment. You can also categorize them by the type of signal they work with: cameras and lidars work with light, ultrasonic sensors with sound, and radars with radio waves. In addition, it helps to have some physics knowledge, so you can better understand the pros and cons of each sensor.
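The categorization above can be summed up in a small lookup table, which is roughly how you might organize it in code (a sketch; “active” here means the sensor emits its own signal):

```python
# Categorizing the signal-based sensors two ways: active vs. passive,
# and by the type of signal each one works with.
SENSORS = {
    "camera":     {"active": False, "signal": "light"},
    "lidar":      {"active": True,  "signal": "light"},
    "radar":      {"active": True,  "signal": "radio waves"},
    "ultrasonic": {"active": True,  "signal": "sound"},
}

# Example query: which sensors send out their own signal?
active_sensors = [name for name, info in SENSORS.items() if info["active"]]
```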

Let’s talk about the simplest sensor first: ultrasonic sensors. As the name implies, they emit and receive ultrasound to estimate distance. It’s called ultrasound because its frequency is over 20,000 Hz, the upper limit of the sound frequency that humans can hear. As a result, the sensor won’t seem to be constantly making noise, because we can’t hear it! Another good thing about ultrasound is that it has less diffraction, which I won’t get into. If you want to learn more about sound, take PHYS 107: Physics of Music!
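The distance estimate itself is simple time-of-flight arithmetic: the sensor measures how long the pulse takes to come back, and since the pulse travels to the obstacle and back again, you halve that round-trip path. A minimal sketch (assuming room-temperature air; the function name is my own):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 °C

def ultrasonic_distance(round_trip_s):
    """Estimate distance from the echo's round-trip time.
    The pulse travels out and back, so we halve the total path."""
    return SPEED_OF_SOUND * round_trip_s / 2

# A 10 ms echo corresponds to roughly 1.7 m
d = ultrasonic_distance(0.010)
```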

A more complex, though more common, sensor is the camera. Essentially, it uses photosensitive devices to capture light. I won’t get into the physics of such devices, because it’s just too complex. However, I will say that as a sensor, cameras give us a lot of information. A lot of information is not necessarily always good, because sometimes we just don’t know how to make sense of it. Computer vision is the field where people try to make sense of such information, and to them, I say good luck!

Now it’s time to present the most exciting sensor of all: lidars! I am biased, but out of all the sensors, lidars are the best. Lidars are a very new invention, although the same technology has been used in barcode scanners and light shows. They work like ultrasonic sensors, but they use light rather than sound. As a result, you can get more frequent measurements. There’s a wonderful video about them from Sparkfun. Watch it, because it will tell you more than what I can write in a thousand words! Another good video is from Velodyne, the leading lidar supplier for self-driving cars. (Again, lidars are the best.)
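The ranging math is the same time-of-flight idea as the ultrasonic sensor, just with the speed of light, which is why the round trips (and therefore the measurements) can be so much more frequent. A spinning lidar also reports an angle with each range, so you can turn beams into 2D points. A sketch under those assumptions (function names are my own):

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def lidar_range(round_trip_s):
    """Same halved round-trip formula as ultrasound, but with light."""
    return SPEED_OF_LIGHT * round_trip_s / 2

def to_cartesian(angle_deg, range_m):
    """Convert one beam of a spinning lidar (angle + range) into an (x, y) point."""
    a = math.radians(angle_deg)
    return (range_m * math.cos(a), range_m * math.sin(a))

# A 200-nanosecond round trip corresponds to an obstacle about 30 m away
r = lidar_range(2e-7)
point = to_cartesian(90, r)  # that obstacle, straight "up" in sensor coordinates
```

Collecting thousands of such points per rotation is what produces the point clouds you see in the Sparkfun and Velodyne videos.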

For the rest of the sensors, I will just briefly mention them, because I either don’t know much about them or they are hard to write about. We have radars, which have been used in the automotive industry for a long time. They are cheaper than lidars, but don’t give you as much detailed information. Then there’s the IMU, which stands for Inertial Measurement Unit. It uses accelerometers and gyroscopes, and sometimes magnetometers. Last but not least, we have all used GPS, as it’s embedded in most phones.

There you have it: the six different sensors used in self-driving cars. Be aware that each sensor is a sophisticated piece of technology beyond the scope of this article. Just to illustrate how sophisticated each sensor can get: rockets need very precise IMUs to go to space, and some self-driving car companies use the same type of IMU! How incredible!