PLAY MAKE LEARN: Making and Gaming in the Liberal Arts

Poster with a Venn diagram in the center: one circle each for the library, the makerspace, and game studies, with access to resources, critical media literacy, and technology literacy in the overlaps. "Tools to understand the world" sits in the middle where all three circles overlap.
Click on the image to zoom in on Flickr, or click here to view a high-resolution version.

I (Angela) recently had the opportunity to attend PLAY MAKE LEARN on the UW-Madison campus. It is an excellent annual conference that represents the intersection of much of what I do here at Lawrence University: library instruction, teaching in the makerspace, and teaching game studies. This prompted me to submit a poster visualizing how these areas intersect and share common themes that are crucial skills for today's learners. The idea of seemingly different areas of study coming together reminded me of the goal of a liberal arts education, so I named my poster "Making and Gaming in the Liberal Arts." It was wonderful to talk with so many technologists, librarians, K-12 educators, professors, game designers, and graduate students during the poster session (and throughout the conference). One librarian looking at the poster pointed out that the library often plays a large role in technology literacy; while those are not connected on the diagram, I certainly agreed with her. Another librarian commented that perhaps the library at the top could be seen as an umbrella, which I decided was intentional. 🙂

PLAY MAKE LEARN was a rewarding and engaging conference, and I look forward to returning next year!

Makerspaces & Pedagogy: How (and Why) to Integrate the Makerspace into Your Courses

Interested in adding 3D printing and other makerspace tools to your courses, but not sure how? Below is a presentation delivered to Lawrence University faculty about some of the whys and hows of using the LU makerspace with coursework.

View the full presentation (with notes) in Google Slides

Here’s a general outline of the presentation:

  • What is a makerspace and what’s in our makerspace?
  • Why makerspaces?
    • Hands-On, Kinesthetic, Active Learning
    • Problem Solving Process
    • Differentiation of Learning
    • Prepare for Work
    • Wellness
    • Engaged Learning at Lawrence University
  • Challenges of educational makerspaces
  • Examples of uses from projects at LU and elsewhere by discipline/general subject area
    • Studio Art
    • Art History
    • Theatre Arts
    • Film Studies
    • Math & Computer Science
    • Music
    • Humanities
    • Anthropology
    • Psychology & Neuroscience
    • Sciences
    • Innovation & Entrepreneurship
  • Things made by students outside of classwork
  • Things made by student organizations and campus departments
  • Where to find this stuff?
    • 3D print search engines & general repositories
    • Lesson plans
  • Designing
  • How to go about adding this stuff to your classes
  • Discussion

Since we presented this, we’ve also worked on a couple more ways to help faculty add the makerspace tools and equipment to their courses and research:

  • Makerspace Assignment Request Form: Faculty can tell us about their intended learning outcomes and the equipment they'd like to use, and we'll do some research and set up a time to meet to discuss assignment ideas.
  • Faculty 3D Printing Request Form: We’re happy to print objects that faculty may need for their teaching or scholarly/creative work. While faculty are welcome to come over and do their own printing, we know that sometimes this isn’t possible.
One slide from the presentation. The image links to the full presentation in Google Slides.

More Virtual Reality at Lawrence University!


“One of the reasons I did that project was not only to explore my interest in it but also to give Lawrence the chance to pioneer in an art medium and form that not many schools are doing yet.”
Christopher Gore-Gammon ’17, on creating with virtual reality

There have been lots of exciting new projects and class assignments happening with VR (virtual reality) at Lawrence University! Take a look at the recent Lawrence University News post, “Use of VR tech now reality in classrooms; FaCE grant to ramp up pace” to learn all about them!

Sensors for Self-Driving Cars

By Wenchao Liu

Just as humans have various senses, self-driving cars use various sensors. Roughly speaking, they have six types: radars, lidars, cameras, IMUs, GPS, and ultrasonic sensors. They differ in range, cost, and many other ways, but I will try to cover the basics in this article.

Before we learn anything about sensors, it's good to be able to categorize them. For instance, you can categorize sensors as passive or active, depending on whether they actively send out signals or just passively receive what's already in the environment. You can also categorize them by the type of signal they work with: cameras and lidars work with light, while ultrasonic sensors work with sound and radars with radio waves. In addition, it helps to have some physics knowledge, so you can better understand the pros and cons of each sensor.
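
To make the categories concrete, here's a minimal Python sketch (the names and groupings are my own, just for illustration, not from any sensor library):

```python
from dataclasses import dataclass
from enum import Enum

class SignalType(Enum):
    LIGHT = "light"
    SOUND = "sound"
    RADIO = "radio"

@dataclass
class Sensor:
    name: str
    active: bool        # True if it emits its own signal, False if it only receives
    signal: SignalType

# The rough categorization described above
sensors = [
    Sensor("camera", active=False, signal=SignalType.LIGHT),     # passively receives light
    Sensor("lidar", active=True, signal=SignalType.LIGHT),       # emits laser pulses
    Sensor("ultrasonic", active=True, signal=SignalType.SOUND),  # emits ultrasound pulses
    Sensor("radar", active=True, signal=SignalType.RADIO),       # emits radio waves
]

for s in sensors:
    kind = "active" if s.active else "passive"
    print(f"{s.name}: {kind}, works with {s.signal.value}")
```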

Let's talk about the simplest sensor first: the ultrasonic sensor. As the name implies, it emits and receives ultrasound to estimate distance. It's called ultrasound because its frequency is above 20,000 Hz, the upper limit of the sound frequency that humans can hear. As a result, the sensor isn't constantly making audible noise, because we can't hear it! Another good thing about ultrasound is that it diffracts less, which I won't get into. If you want to learn more about sound, take PHYS 107: Physics of Music!
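
To show how the distance estimate works, here's a small sketch of the time-of-flight math (the speed of sound and the example echo time are illustrative values, not from any particular sensor's datasheet):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 °C

def ultrasonic_distance(echo_time_s: float) -> float:
    """Distance to an obstacle from the time between sending a pulse
    and hearing its echo; the pulse travels out and back, hence the /2."""
    return SPEED_OF_SOUND * echo_time_s / 2

# An echo that returns after 6 milliseconds corresponds to roughly 1 meter
print(f"{ultrasonic_distance(0.006):.2f} m")  # ~1.03 m
```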

A more complex, though more common, sensor is the camera. Essentially, it uses photosensitive devices to capture light. I won't get into the physics of those devices, because it's just too complex. However, I will say that as a sensor, a camera gives us a lot of information. A lot of information is not necessarily always good, because sometimes we just don't know how to make sense of it. Computer vision is the area where people try to make sense of that information, and to them, I say good luck!
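
As a back-of-the-envelope illustration of "a lot of information," here's a quick calculation with example numbers (the resolution and frame rate are just common values, not those of any specific automotive camera):

```python
# A single color frame is already millions of numbers,
# and cameras produce many frames every second.
width, height, channels = 1920, 1080, 3  # example RGB resolution
fps = 30                                 # example frame rate

values_per_frame = width * height * channels
values_per_second = values_per_frame * fps

print(f"{values_per_frame:,} values per frame")    # 6,220,800
print(f"{values_per_second:,} values per second")  # 186,624,000
```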

Now it's time to present the most exciting sensor of all: lidars! I am biased, but out of all the sensors, lidars are the best. Lidars are a relatively new invention, although the same technology has been used in barcode scanners and light shows. They work like ultrasonic sensors, but they use light rather than sound. As a result, you can get more frequent measurements. There's a wonderful video about them from SparkFun. Watch it, because it will tell you more than I can write in a thousand words! Another good video is from Velodyne, the leading lidar supplier for self-driving cars. (Again, lidars are the best.)
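
The distance math is the same time-of-flight idea as with ultrasound, just using the speed of light instead of the speed of sound. Here's a sketch (not any particular lidar's processing pipeline):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def lidar_distance(round_trip_time_s: float) -> float:
    """Distance from the round-trip time of a laser pulse."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2

# A pulse that returns after about 667 nanoseconds has traveled roughly 100 m each way
print(f"{lidar_distance(667e-9):.1f} m")  # ~100.0 m
```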

For the rest of the sensors, I will just mention them briefly, because I either don't know much about them or they are hard to write about. We have radars, which have been used in the automotive industry for a long time. They are cheaper than lidars but don't give you as much detailed information. Then there's the IMU, which stands for Inertial Measurement Unit; it uses accelerometers and gyroscopes, and sometimes magnetometers. Last but not least, we have all used GPS, as it's embedded in most phones.
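
To give a rough sense of what an IMU reports, here's a hypothetical reading structure (the field names and example numbers are mine, purely for illustration):

```python
from dataclasses import dataclass

@dataclass
class ImuReading:
    accel: tuple[float, float, float]  # m/s^2 along the x, y, z axes (accelerometers)
    gyro: tuple[float, float, float]   # rad/s about the x, y, z axes (gyroscopes)
    # A magnetometer, when present, would add a heading estimate.

# A car at rest mostly measures gravity on the z axis
reading = ImuReading(accel=(0.1, 0.0, 9.81), gyro=(0.0, 0.0, 0.02))
print(reading)
```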

There you have it: the six different types of sensors used in self-driving cars. Keep in mind that each sensor is a sophisticated piece of technology beyond the scope of this article. Just to illustrate how sophisticated each sensor can get: rockets need very precise IMUs to go to space, and some self-driving car companies use the same type of IMU. How incredible!