Lovecats

In this project I use a Leap Motion in conjunction with openFrameworks and C++ to create a playful experience where disembodied cat heads float around a forest and are followed by hearts.

Music: Three Trees by Tanlines off the 2010 album Volume On
Concept

This term we explored ways of interfacing with computers beyond more traditional inputs such as the keyboard and mouse. We were introduced to interaction through computer vision (including depth cameras), Open Sound Control, and voice control with machine learning. I became interested in exploring a touchless, or invisible, interface. An invisible interface allows for less rigid physical interaction than traditional inputs and has a greater propensity for discovery, which is ultimately more engaging for the user.

The inimitable science fiction writer Arthur C. Clarke wrote that “any sufficiently advanced technology is indistinguishable from magic.” Being able to write code, understand the tools used, and not be limited by a lack of ability allows one to be the magician: to craft the spectacle and the experience. Inputs such as a Leap Motion or small, imperceptible sensors allow interaction to feel and look magical.

Beyond using an invisible interface, I was interested in creating a pleasing, enjoyable, fun experience, perhaps something relaxing in the vein of calm technology. I knew I would have to do heavy research on the technical side, so having the end product be lighthearted made the project fun to work on.

Technical Research and Process

I began by researching how to use a Leap Motion with openFrameworks. I found two addons: ofxLeapMotion by Theo Watson and ofxLeapMotion2 by Gene Kogan. After obtaining a Leap Motion developer license, downloading the most recent SDK, and installing the addons, I opted to use Kogan’s ofxLeapMotion2, as it supports the second version of the Leap Motion SDK and includes skeletal tracking with individual finger recognition. I didn’t realize until I was quite invested that ofxLeapMotion2 does not support gesture recognition like Theo Watson’s original addon does. I tried combining both addons into one project but wasn’t surprised when it wouldn’t compile. Fingertips are recognized in ofxLeapMotion2, so I decided to use the index fingers of both hands as attraction points. Palms could also be used, but I felt that the fingertips were more accurately tracked.
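For context, reading the index fingertips as attraction points looks roughly like the sketch below. It assumes ofxLeapMotion2’s simple-hand interface; the member names (`getSimpleHands()`, `fingers[INDEX].pos`) and the mapping ranges are from memory and may differ slightly in the current version of the addon.

```cpp
// ofApp sketch -- assumes ofxLeapMotion2 is installed in the addons folder
#include "ofMain.h"
#include "ofxLeapMotion2.h"

class ofApp : public ofBaseApp {
public:
    void setup();
    void update();

    ofxLeapMotion leap;               // device wrapper from the addon
    vector<ofPoint> attractionPoints; // one point per tracked index fingertip
};

void ofApp::setup(){
    leap.open();
    // map the Leap's millimetre coordinate space onto the window
    leap.setMappingX(-230, 230, 0, ofGetWidth());
    leap.setMappingY(90, 490, ofGetHeight(), 0);
}

void ofApp::update(){
    if(leap.isFrameNew()){
        attractionPoints.clear();
        vector<ofxLeapMotionSimpleHand> hands = leap.getSimpleHands();
        for(auto & hand : hands){
            // INDEX is the addon's finger-type key; .pos is the fingertip position
            attractionPoints.push_back(hand.fingers[INDEX].pos);
        }
        leap.markFrameAsOld();
    }
}
```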

Once I had a working Leap Motion model in a project, I turned my attention to creating a particle system. Daniel Shiffman’s Nature of Code has an excellent chapter on particle systems, as does chapter 3 of the Mastering openFrameworks book. I also found several projects and addons on GitHub. I considered using the very sleek ofxGpuParticles but thought it was too polished for the lighthearted interaction I wished to create.
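As a baseline from that research, a Shiffman-style particle boils down to a position and velocity updated by accumulated forces each frame. This is my own minimal openFrameworks sketch of the idea, not code from Nature of Code or any of the addons:

```cpp
#include "ofMain.h"

// Minimal Euler-integrated particle in the spirit of Nature of Code.
class Particle {
public:
    ofPoint position, velocity, acceleration;

    Particle(ofPoint start) : position(start) {}

    // accumulate a force for this frame (mass assumed to be 1)
    void applyForce(const ofPoint & force){
        acceleration += force;
    }

    void update(){
        velocity += acceleration;
        position += velocity;
        acceleration.set(0, 0, 0);   // clear forces for the next frame
    }

    void draw(){
        ofDrawCircle(position, 4);
    }
};
```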

Danielle Beecham, a student at Parsons, posted her code for a flocking particle system, and rather than recreating one from scratch I used her code as a starting point and modified it to create the interaction I wanted. Her particle system uses the mouse X and Y as attraction points, and a mouse press triggers sound and loads a different particle system. I removed these details, added the Leap Motion attraction points, and modified the velocity, drag, force, and position to create what I was looking for. I also swapped the ellipses for a custom heart particle and added a lerp to change the color of the particles based on their distance from the attraction point.
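The attraction and colour behaviour described above can be sketched roughly as follows. This is illustrative only: the helper name, constants, and colours are my own guesses, not Beecham’s code or my exact project code, and `Particle` is the minimal class sketched earlier.

```cpp
// Steer a particle toward the nearest Leap attraction point and tint its heart.
void attractAndTint(Particle & p, const vector<ofPoint> & attractionPoints,
                    ofImage & heartImage){
    if(attractionPoints.empty()) return;

    // find the closest attraction point (one per index fingertip)
    ofPoint target = attractionPoints[0];
    for(auto & a : attractionPoints){
        if(p.position.distance(a) < p.position.distance(target)) target = a;
    }

    // attraction force proportional to distance, plus a simple drag term
    ofPoint force = (target - p.position) * 0.002;   // pull strength, tuned by eye
    force += -p.velocity * 0.05;                     // drag
    p.applyForce(force);

    // lerp the tint from pink to red as the particle nears the fingertip
    float d = p.position.distance(target);
    float t = ofMap(d, 0, 300, 1.0, 0.0, true);
    ofColor tint = ofColor::pink.getLerped(ofColor::red, t);

    ofSetColor(tint);
    heartImage.draw(p.position.x, p.position.y);
}
```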

Once I had my bearings on the organization, I focused on the interaction and overall aesthetic. I used Illustrator to create the images, exported 1x PNGs with transparency, and used ofImage to draw the cat heads at the index-finger attraction points. I debated using SVGs, but previous explorations with SVGs had proved unsuccessful. Even though I made sure the alpha channel was included in the PNGs, I had some issues with the transparency, seen in the test GIF below, where the particles weren’t wrapping around the trees correctly. Adding two lines of code fixed it.
[Figure: test GIF showing the alpha issues]
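The original post doesn’t reproduce those two lines, so as a hedged guess at a typical openFrameworks transparency fix: enabling alpha blending (and disabling depth testing so sprites layer in draw order) usually looks like this. `catHead` and `attractionPoints` are assumed members of the app, as in the earlier sketches.

```cpp
// draw() sketch -- catHead is an ofImage loaded from a PNG with an alpha channel
void ofApp::draw(){
    ofEnableAlphaBlending();   // respect the PNG's alpha channel
    ofDisableDepthTest();      // layer sprites in draw order, not by depth

    ofSetColor(255);
    for(auto & p : attractionPoints){
        catHead.setAnchorPercent(0.5, 0.5); // centre the head on the fingertip
        catHead.draw(p.x, p.y);
    }
}
```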
Future Development

I would like to gamify the experience and create more of a narrative. I’d also love to add an interaction when the cat heads touch, and ideally the heart stealing would be gradual instead of instantaneous. I absolutely love the invisible interface that a Leap Motion allows. For future development, or for another project, I will look into marrying the addons, or using the patch I was sent, to enable both gesture support and skeletal tracking.

Self Evaluation

I’m proud of myself for using a device we did not cover in class or in any of our oF assignments. I worked independently: researching addons, cloning them from GitHub, and trying version control. I reached out for help organizing the project into classes but was able to do most of the coding myself. I built on previous knowledge of vectors (!) to handle the particle colors and interaction with a varying number of hands. I also figured out how to add a Run Script build phase to facilitate the use of the ofxLeapMotion addons. It took me a while to wrap my head around the technical side of the Leap Motion, and I feel I could take it further in the future, perhaps over the summer.
References

Most references are included in the hyperlinks above, but here are some others:

ofxLeapMotion by Theo Watson
ofxLeapMotion2 by Gene Kogan

On using GitHub:

ofBook
freecodecamp
Cloning Addons

Have a Leap Motion and want to play? Download the binary here.
Vector Illustrations