I.C.U. is an eye-tracking game that enables the user to explode objects on a screen simply by looking at them.
Here’s how it works: the user dons rad, red eye-tracking glasses, stands directly in front of a projection screen, and looks at 12 calibration points projected on the screen, one by one, while a brief calibration is performed. Then the game begins. An object, perhaps a banana, appears on the screen, and the user must stare at it until it explodes into yellow particles that fly all over the screen. Then another object, perhaps a strawberry, appears, and the user shifts their gaze to the strawberry, which blows up in a cloud of red strawberry particles. A satisfying “splat!” sound accompanies each explosion.
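The stare-until-it-pops interaction boils down to a dwell timer: the gaze point has to rest inside the object's bounds for a full dwell period before the explosion triggers. Here's a sketch of that idea in plain Java (the language underneath Processing); `DwellSelector` and everything else here are my own illustrative names, not the project's actual code:

```java
// Sketch of dwell-to-explode selection: the gaze point must rest inside an
// object's bounding box for a full dwell period before the object pops.
// All names here are hypothetical, not taken from the I.C.U. source.
public class DwellSelector {
    private final double dwellMillis;   // how long the gaze must rest on the target
    private double accumulated = 0;     // time spent on the target so far

    public DwellSelector(double dwellMillis) {
        this.dwellMillis = dwellMillis;
    }

    /** Call once per frame; returns true on the frame the object should explode. */
    public boolean update(double gazeX, double gazeY,
                          double objX, double objY, double objW, double objH,
                          double frameMillis) {
        boolean onTarget = gazeX >= objX && gazeX <= objX + objW
                        && gazeY >= objY && gazeY <= objY + objH;
        if (onTarget) {
            accumulated += frameMillis;
        } else {
            accumulated = 0;            // glancing away resets the timer
        }
        if (accumulated >= dwellMillis) {
            accumulated = 0;            // reset for the next object
            return true;
        }
        return false;
    }
}
```

Resetting the timer whenever the gaze leaves the target keeps noisy eye data from triggering accidental explosions.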
I.C.U. was created entirely in Processing. I loved the idea of blowing up objects by looking at them – but being a chill, peaceful type of person, I wanted to make the explosions as comical and kid-friendly as possible. Instead of the epic explosions I originally intended, I decided to work again with Box2D (through its Processing library), an open-source physics engine written with game designers in mind, to create explosions of colorful particles with weight, velocity, simulated gravity and collision detection.
When the user’s eye rests on an object, the object jiggles, then quickly disappears, replaced by an explosion of particles. The initial explosive force sends a burst of particles flying in all directions; they then decelerate and fall downwards. I created boundaries on the sides of the screen for the particles to bounce off, creating a confetti effect as they fall to the ground.
I chose Box2D so that the particles could pile up, letting the user build a big fruit salad that fills the entire screen. I’ve discovered, however, that Box2D was probably not the most efficient choice for this particular project. The video tracking involved in the eye-tracking component takes up a huge amount of processing power, and when combined with Box2D, the frame rate drops significantly (from 60 to 15 fps in some cases) once a few hundred particles are on screen. Rewriting the code around a single lightweight particle system will likely be a significant improvement.
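As a sketch of what that lighter rewrite might look like, here is a minimal single particle system in plain Java: Euler integration with gravity, side-wall bounces and a floor, no full physics engine. The constants and class names are illustrative choices of mine, not the project's actual code, and the units are pixels tuned by eye:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// A minimal single-system alternative to Box2D for the confetti explosions:
// plain Euler integration with gravity, wall bounces and a floor.
// Hypothetical sketch, not I.C.U.'s actual code.
public class ParticleSystem {
    static final double GRAVITY = 900;      // px/s^2, tuned by eye
    static final double RESTITUTION = 0.6;  // fraction of speed kept on a bounce

    static class Particle {
        double x, y, vx, vy;
        Particle(double x, double y, double vx, double vy) {
            this.x = x; this.y = y; this.vx = vx; this.vy = vy;
        }
    }

    final List<Particle> particles = new ArrayList<>();
    final double width, height;             // screen bounds in pixels
    final Random rng = new Random();

    ParticleSystem(double width, double height) {
        this.width = width; this.height = height;
    }

    /** Burst of n particles flying out in all directions from (cx, cy). */
    void explode(double cx, double cy, int n, double speed) {
        for (int i = 0; i < n; i++) {
            double angle = rng.nextDouble() * 2 * Math.PI;
            double s = speed * (0.5 + 0.5 * rng.nextDouble());
            particles.add(new Particle(cx, cy,
                    s * Math.cos(angle), s * Math.sin(angle)));
        }
    }

    /** Advance every particle by dt seconds. */
    void step(double dt) {
        for (Particle p : particles) {
            p.vy += GRAVITY * dt;           // gravity pulls particles down
            p.x += p.vx * dt;
            p.y += p.vy * dt;
            // bounce off the side walls and the floor (screen y grows downward)
            if (p.x < 0)      { p.x = 0;      p.vx = -p.vx * RESTITUTION; }
            if (p.x > width)  { p.x = width;  p.vx = -p.vx * RESTITUTION; }
            if (p.y > height) { p.y = height; p.vy = -p.vy * RESTITUTION; }
        }
    }
}
```

Because each particle is just four doubles and there is no collision detection between particles, a few hundred of them cost almost nothing per frame; the trade-off is that they can't pile up into a fruit salad the way Box2D bodies do.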
The eye-tracking component of the project can be challenging and finicky at times, but when it works, it works really well. The eye-tracking Processing code relies on the positions of the pupil and the glint to determine where on the screen the user’s eye is focused. It’s important to get a good image of the eye, especially during the calibration stage – the LED (only one LED, mind you, or there will be multiple glints, which will not do) needs to be positioned so that it lights up the whole eye, and must be kept off to the side rather than in the camera’s direct view.
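For illustration, one common way to turn pupil and glint positions into a screen coordinate is to fit an affine map from the pupil-minus-glint vector to the known positions of the calibration points, via least squares. This is a hypothetical sketch under that assumption, not the project's actual mapping, and all names here are mine:

```java
// Hedged sketch: map the pupil-glint vector (dx, dy) to screen coordinates
// with an affine least-squares fit over the calibration samples.
// Illustrative only; the actual EyeWriter/Processing code may map differently.
public class GazeCalibration {
    // screenX = ax[0]*dx + ax[1]*dy + ax[2];  screenY likewise with ay.
    double[] ax, ay;

    /** Fit from paired samples: eye[i] = {dx, dy}, screen[i] = {sx, sy}. */
    void calibrate(double[][] eye, double[][] screen) {
        ax = fitAxis(eye, screen, 0);
        ay = fitAxis(eye, screen, 1);
    }

    double[] map(double dx, double dy) {
        return new double[] {
            ax[0] * dx + ax[1] * dy + ax[2],
            ay[0] * dx + ay[1] * dy + ay[2]
        };
    }

    // Least squares for one screen axis via the 3x3 normal equations A^T A c = A^T b.
    private static double[] fitAxis(double[][] eye, double[][] screen, int axis) {
        double[][] m = new double[3][3];
        double[] rhs = new double[3];
        for (int i = 0; i < eye.length; i++) {
            double[] row = { eye[i][0], eye[i][1], 1.0 };
            for (int r = 0; r < 3; r++) {
                rhs[r] += row[r] * screen[i][axis];
                for (int c = 0; c < 3; c++) m[r][c] += row[r] * row[c];
            }
        }
        return gaussianSolve(m, rhs);
    }

    // Gaussian elimination with partial pivoting for a small square system.
    private static double[] gaussianSolve(double[][] m, double[] rhs) {
        int n = rhs.length;
        for (int col = 0; col < n; col++) {
            int pivot = col;
            for (int r = col + 1; r < n; r++)
                if (Math.abs(m[r][col]) > Math.abs(m[pivot][col])) pivot = r;
            double[] tmpRow = m[col]; m[col] = m[pivot]; m[pivot] = tmpRow;
            double tmp = rhs[col]; rhs[col] = rhs[pivot]; rhs[pivot] = tmp;
            for (int r = col + 1; r < n; r++) {
                double f = m[r][col] / m[col][col];
                for (int c = col; c < n; c++) m[r][c] -= f * m[col][c];
                rhs[r] -= f * rhs[col];
            }
        }
        double[] x = new double[n];
        for (int r = n - 1; r >= 0; r--) {
            double s = rhs[r];
            for (int c = r + 1; c < n; c++) s -= m[r][c] * x[c];
            x[r] = s / m[r][r];
        }
        return x;
    }
}
```

With 12 calibration points and only 6 unknowns, the fit is overdetermined, which helps average out the jitter in pupil detection; it also makes clear why a bad eye image during calibration poisons every gaze estimate afterwards.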
The rad, red eye-tracking glasses are based on the EyeWriter design, the full directions for which can be found on Instructables. Our glasses consist of a hacked PS3 Eye camera fitted with an infrared filter to block out all but IR light, a pair of sunglasses from the ever-wonderful St. Marks Place, some alligator clips, a battery pack and an infrared LED to illuminate the eye. The glasses cost about $50 to build. They’re a little on the large side and tend to slip down my nose (which is a little on the small side), but they’re awesome, uber-nerdy and lots of fun to wear.
The Processing sketch and code can be found here. Feel free to bug me for my eye-tracking glasses, or if you have about $50 and an hour or two to spare, try making your own EyeWriter using the Instructables.
There are many cool possibilities for future eye-tracking projects, and I plan to continue working with it. Once I figure out how to make the eye-tracking system more robust onscreen, I’d like to move off-screen to create a physical moving object that the user can control with their eyes. It’s the next best thing to telekinesis, methinks.
Also, lasers. I want to put lasers on the glasses so the user can have awesome superhero laser vision.
My work with eye-tracking to date led to a few more random observations:
- Eye-tracking is a good way for me to drive other people bananas by controlling what they can see, based on what I’m looking at.
- Eye-tracking is a good way for other people to have a laugh at my expense, depending on what’s happening on the part of the screen I’m not looking at.
- The eye is an inefficient cursor for controlling objects in a larger context, because you can only “see” what you’re looking at.