In the 1990s I created a series of grid structures using acid-etched industrial steel parts. A sensor was located at each intersection.
They were all connected with a combination of RJ11 and RJ45 connectors and appropriate cable. For ease of touring and installation, I designed six grids to be 48″ x 96″; plywood, sheetrock, and other building materials in the United States typically max out at those two dimensions. Correctly fabricated, the modules would have fit easily in cargo vans, elevators, and trucks. Much to my chagrin, I did not factor in the additional length caused by the couplings and the additional offset of the pipe. This caused some extra challenges. But that’s for a different text.
The grid structures were created for people to be able to play music and/or poetry either in a random manner or by controlling where their shadows fell on the grid. In this manner they could then control the placement and pattern of sounds or words if they chose.
I always liked the idea of creating participatory volumes and felt that there must be a better way to get control signals to the computer.
Processing and the availability of low-cost web cameras seemed like one way to update the mechanics of the artwork.
Serial port communication works much as it did in the 1990s, except that it is more robust, runs over a USB cable (USB did not exist back then!), has reliable physical connectors, and Apple computers can now do things undreamt of in that era. The Arduino was essential in making this project work. (It seemed like a heavy-handed solution for the task at hand; I can’t help but think that a simple 16C58B could have given me the hardware interface I needed for this project.) I used the Arduino because I wanted experience with the platform, and because its language is nearly identical to that of Processing.
A screen grab of the Arduino code appears below. With its diagnostic lines uncommented, the code lets me see which outputs are firing and when. For the demonstration and testing I am using incandescent bulbs; because of the heat they radiate, I prefer to keep the lights “off” while waiting to trigger them using camera input.
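The original Arduino sketch is shown only as a screen grab, so the protocol it uses is not reproducible here. As a hedged illustration of the kind of logic involved, the sketch below assumes a hypothetical one-byte serial command format (three bits select one of eight outputs, the low bit sets the state); the struct and function names are my own, not the original code.

```cpp
// Hypothetical one-byte command: bits 1-3 select one of eight
// outputs, bit 0 sets its state (1 = light on, 0 = off). This is
// an illustration only; the original sketch's protocol is not shown.
struct Command {
    int output;  // which relay/bulb driver to address (0-7)
    bool on;     // desired state of that output
};

Command parseCommand(unsigned char b) {
    Command c;
    c.output = (b >> 1) & 0x07;  // three bits pick the output
    c.on     = (b & 0x01) != 0;  // low bit carries the state
    return c;
}
```

In an Arduino `loop()`, each parsed command would feed something like `digitalWrite(pins[c.output], c.on ? HIGH : LOW)`; initializing every output LOW at startup matches the preference for leaving the hot incandescent bulbs off until triggered.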
Clicking on this link will open a new window with a screen capture showing the code being tested with the laptop webcam. A white rectangle appears with contiguous moving black segments, showing where my arm passes in front of the camera. (The QuickTime controller is at the bottom of the file.)
The example above shows the detection code working. The file at this link shows the lights changing in response to the detected movement.
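The moving black segments suggest a simple brightness test: average the camera pixels within each grid column and treat a column as "covered" when its average falls below a threshold. The function below is a hedged sketch of that idea, not the original Processing code; the column count, threshold, and names are illustrative.

```cpp
#include <vector>

// Sketch of shadow detection over one row of camera pixels.
// rowBrightness holds per-pixel brightness values (0-255);
// the row is split evenly into `columns` grid columns, and a
// column counts as covered when its average dips below `threshold`.
std::vector<bool> coveredColumns(const std::vector<int>& rowBrightness,
                                 int columns, int threshold) {
    std::vector<bool> covered(columns, false);
    int width = static_cast<int>(rowBrightness.size()) / columns;
    for (int c = 0; c < columns; ++c) {
        long sum = 0;
        for (int i = 0; i < width; ++i)
            sum += rowBrightness[c * width + i];
        covered[c] = (sum / width) < threshold;  // dark = shadow present
    }
    return covered;
}
```

Each covered column would then be translated into a serial command to the Arduino, turning on the corresponding bulb where the participant's shadow falls.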
ITP, Tisch School of the Arts
New York City