Previously, I collaborated with Prof. Alyssa Hakes of the Biology department on a project that highlights 3D printing’s versatility and interdisciplinary potential: helping protect a federally threatened plant species known as the Pitcher’s Thistle (Cirsium pitcheri). The intersection of ecology and 3D printing isn’t intuitive at first, and it’s one the scientific community has only recently begun to explore.
Prof. Hakes has a wonderful page on Experiment.com (https://experiment.com/projects/can-we-trap-invasive-weevils-and-protect-the-federally-threatened-pitcher-s-thistle) which describes the project in depth. In short, the goal was to fabricate
decoys of the Pitcher’s Thistle (PT) to attract weevils away from the real and
vulnerable plant. We wanted to make the decoys as high-fidelity as possible, considering shape, size, color, and reflectivity. We also wanted to optimize the decoys so that they were easy to print, easy to work with, and easy to deploy in the field.
During the initial design phase, one of the biggest challenges was replicating the surface geometry of the PT. The small, pineapple-like protrusions on the curved surface of the bud proved difficult to design, and we anticipated they might also be challenging to print. In a stroke of genius, Angela Vanden Elzen had the idea to modify a design she’d happened to come across on Thingiverse. The file was of a lamp shade, which Angela modified by placing two inside one another, adding a sphere to the middle, and inserting a hole through the base (so the decoy could be placed onto a dowel acting as the plant stem).
This ultimately resulted in a decoy which looked something like this:
A snapshot of the decoy design Angela made
Interestingly, we discovered that the “spiky” parts of this design didn’t print exactly as they appear in the .stl file. Instead, because of printing limitations (e.g. the angles of these edges), we ended up with decoys displaying intricate, thin, somewhat “frilly” lengthwise fibers that surrounded the bud. These fibers actually helped make the decoys even more realistic in terms of texture. They also helped meet some of our feasibility constraints: a design with no supports is quick to scale up for printing, and the protrusions may make applying and maintaining adhesive easier.
As we were printing, we used several different shades of green (including an algal-based filament that was surprisingly . . . aromatic). We initially relied on Prof. Hakes’ previous field experience to determine which colors best matched the PT. Later, we decided we could use images of the PT (taken by Prof. Hakes in the field) to obtain a hex code (sketched below) and, from that, a custom-colored filament. But where could we order custom-colored filament? As it turns out, about 10 minutes away from the Makerspace is a local business called Coex, which supplies several different types of filament. We then began collaborating with them to create this custom filament.
A few prototypes printed with different filaments.
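For the curious, here is a minimal sketch of how a hex code might be pulled from a photo using Python and the Pillow library. The file name is a placeholder, and averaging the whole image is a simplification; in practice you would crop to just the bud so the background doesn’t skew the color.

```python
# Minimal sketch: average the pixels of a photo and report the result as a hex code.
# Assumes the Pillow library; "pitcher_thistle_bud.jpg" is a placeholder file name.
from PIL import Image

def average_hex(path):
    """Return the average RGB color of an image as a hex string like '#6a8b4c'."""
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())
    n = len(pixels)
    r = sum(p[0] for p in pixels) // n
    g = sum(p[1] for p in pixels) // n
    b = sum(p[2] for p in pixels) // n
    return "#{:02x}{:02x}{:02x}".format(r, g, b)

print(average_hex("pitcher_thistle_bud.jpg"))
```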
This winter, I took a course in Artisanal Animation. For my final, I was tasked with making an animation using any of the mediums we had studied. I was personally drawn to direct-on-film animation. It wasn’t the images I was after, but the sound. The biggest inspiration for the project was Norman McLaren, an animator for the National Film Board of Canada who specialized in direct-on-film animation. One of his most impressive feats in the medium was creating his own hand-drawn sound. I had made previous attempts to emulate McLaren’s process with little success. This time around, I decided to tackle the project in my own way. My initial thought was to use the laser cutter in the Makerspace, but the test ended in nothing but burnt film.
I had to consider another option. The only other machine I could use was the Silhouette Cameo. I was hesitant at first because I had never used the machine extensively. To my surprise, it was very easy to use, which made my overall process faster. Now that I had the tools to etch the film, it was time for the sound.
Sadly, you can’t just plop an audio file into the Silhouette Cameo’s software. The Cameo works best with vector-based graphics, so to make our sound cuttable we first needed to turn it into an image file. A quick Google search for “Sound to waveform graphic” yielded a website that does just that (link will be below). Once I had an image of my sound file, I imported it into the Cameo’s software, resized the audio image, lined it up, and hit send.
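The website handled that conversion for us, but here is a rough sketch of the same idea in Python for anyone curious what happens under the hood: read a WAV file, thin out the samples, and write the waveform as an SVG outline that vector-cutting software can trace. The file names and dimensions are placeholders, and it assumes 16-bit audio.

```python
# Rough sketch: turn a 16-bit WAV file into a simple SVG waveform outline.
# "voice.wav", the output size, and the sample step are all placeholder values.
import struct
import wave

def wav_to_svg(wav_path, svg_path, width=2400, height=200, step=100):
    with wave.open(wav_path, "rb") as w:
        frames = w.readframes(w.getnframes())
    # Unpack 16-bit little-endian samples (stereo channels come out interleaved).
    samples = struct.unpack("<{}h".format(len(frames) // 2), frames)

    samples = samples[::step]  # keep every `step`-th sample so the path stays small
    x_scale = width / max(len(samples), 1)
    mid = height / 2
    points = []
    for i, s in enumerate(samples):
        x = i * x_scale
        y = mid - (s / 32768.0) * mid  # map the 16-bit range onto the SVG height
        points.append("{:.1f},{:.1f}".format(x, y))

    with open(svg_path, "w") as f:
        f.write('<svg xmlns="http://www.w3.org/2000/svg" '
                'width="{}" height="{}">\n'.format(width, height))
        f.write('<polyline fill="none" stroke="black" points="{}"/>\n'
                .format(" ".join(points)))
        f.write('</svg>\n')

wav_to_svg("voice.wav", "waveform.svg")
```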
One of the images from my stop motion animation
I wanted to see how far I could push the technology, so I tried to etch a stop motion animation I had made of my hand. The animation was shot in black and white (easier for the software to recognize); each image was then combined into rows of 24, dropped into the Cameo software, and cut. The result was imagery that didn’t reflect the source it came from. The machine had added a layer of abstraction (you can see the result at the end!).
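For anyone who wants to try the rows-of-24 layout themselves, here is a small Python sketch of the tiling step using Pillow. The folder and frame names are placeholders; the actual frames came from my stop motion shots.

```python
# Sketch of the tiling step: lay black-and-white frames out in rows of 24
# so the whole animation can be traced and cut in one pass.
# Assumes Pillow and frames named frames/frame_000.png, frames/frame_001.png, ...
import glob
from PIL import Image

FRAMES_PER_ROW = 24

paths = sorted(glob.glob("frames/frame_*.png"))
frames = [Image.open(p).convert("1") for p in paths]  # mode "1" = pure black and white
fw, fh = frames[0].size

rows = (len(frames) + FRAMES_PER_ROW - 1) // FRAMES_PER_ROW
sheet = Image.new("1", (fw * FRAMES_PER_ROW, fh * rows), 1)  # 1 = white background

for i, frame in enumerate(frames):
    x = (i % FRAMES_PER_ROW) * fw
    y = (i // FRAMES_PER_ROW) * fh
    sheet.paste(frame, (x, y))

sheet.save("contact_sheet.png")  # import this image into the Cameo software to trace
```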
Here is a link to a video of the film being run through a projector!
Below is a list of links I used to do this project, including a link to an in-depth guide on how to do this yourself:
Our awesome Communications department has been putting together some great content about the Makerspace!
Video: This is Lawrence- Makerspace
Blog Post: 2 Minutes With… Kelvin Maestre
Kelvin Maestre ’21 watches as a laser cutter starts its work on a piece of wood in the Makerspace on the first floor of the Seeley G. Mudd Library. (Photo by Danny Damiani)
Thanks to our Communications friends for helping us spread the news about the Lawrence University Makerspace!