Dreamachine 0.1

Being an old hippy, I’ve always wanted my own Dreamachine, the trance-induction device designed by William S. Burroughs’ collaborator, Brion Gysin. Being a nerd, I’ve never been satisfied with the idea of a simple rotating flicker, but have felt that a more appropriate instrument for my own purposes would need to be… well, just more.

And now, with the ready availability of programmable LED strips, I’ve finally built my first version (which now wraps my cubicle at work) and am working out the details of the more sophisticated model to be installed in my meditation room at home.

The alpha model, code for which is shown below, is a simple strip of RGB LEDs hooked up to an Arduino. The code allows me to specify different waveforms, rates of change, and maximum values for each color channel, and push them down the strip in waves.

An Arduino is slow enough that actually ‘drawing’ colors across the strip would probably be pretty tedious, so all I really do is calculate the next value of each channel, assign it to the first LED, and then shift the whole thing forward one step. Holding the strip in memory – which you have to do for anything remotely interesting – turns out to blow the low memory of the Arduino pretty easily, so I couldn’t use a normal array processing routine to perform the shift… instead, I use C’s memmove, which performs the shift “in place” as it were.

The waveform I need to work on a bit more is the RAND, which needs some first-order smoothing and a slightly more sophisticated ‘colorizer’ than just the threshold.

I’ve gathered circular LEDs and illuminated rotary encoders to build the final version, which will allow me to control all the relevant parameters through a pretty little hardware interface. In the meantime, here’s the desktop model’s direct-control code:

Arduino MIDI Drum .002

After several iterations I’ve come up with the basis for a functional trigger circuit.

The piezo element (across a resistor) runs into the Arduino’s Analog 0 pin, and also feeds a comparator whose reference is supplied by a variable resistor (itself fed to the Arduino’s A5 for monitoring). The output of the comparator drives an interrupt routine attached to the Arduino’s Digital pin 2 (interrupt 0). That interrupt routine polls the piezo’s value at A0, then returns to the main loop – which does nothing right now, but is where MIDI and other processing will go.

The comparator is to provide trigger thresholding in the analog domain, relieving the Arduino of the overhead necessary to do it in software.

It turns out that the acquisition of the piezo value is fast enough that I can grab a significant series of data points into an array and print the array out to see how the thing actually responds to various events. My first discovery is that its first sample isn’t its peak and that it really does ‘ring’. This confirms my thought that a likely peak detection algorithm is to run the samples until I see a falling value, in which case the prior sample is the peak.

I tried printing the array of sampled values to Processing for graphing but am having port reading problems in Processing.

The other thing I saw is that, across a 1 megohm resistor, the piezo maxes out the Arduino’s input very fast under very small impacts, and that reducing to a 100k resistor brings the response into a more reasonable range. This suggests that replacing the fixed resistor with a variable one will let me adjust the overall sensitivity of the final instrument. But I don’t want a pot control for every piezo element, so I’ll use a set of digital potentiometers under control of a single rotary encoder. The comparators for each element can all share the same voltage reference, so only a single pot is required for that function.

So, next steps: set up my first multiplexed piezo array, implement what I now think is the appropriate peak sensing algorithm, and collect the components necessary for my ganged sensitivity controller.

Thanks are due for my work so far to the following sources on Arduino drumkit building and basic comparator use:


Arduino MIDI Drum .001

I am an inveterate tabletop drummer, so when I first heard about the Zendrum my gearlust fizzed, only to be decarbonated by the price of what is really just a bunch of piezo triggers mounted on a nice piece of wood.

So, now that I’ve got the Arduino bug (my first circuit used a potentiometer to control the brightness of an LED [chuff, preen]), all things seem possible and I think to myself, “self… we can do one of these.”

Having obtained a fistful of piezo discs and a MUX board, I’m now faced with the task of sampling the peak values of those discs faster than I can thwap them and converting their data to MIDI events.

Now, I’m obviously not the first person to come up with this idea (here, let me Google that for you), so I can vicariously consider the problems of peak detection, sample rate and so on, before I ever touch jumper wire to breadboard.

For now, I’m going to see if I can get away without external peak detection or sample/hold circuitry. I’m going to write a simple state loop with no delays and see whether it serves the purpose without ornamentation. Something like this:

The MIDI processing itself is dirt simple (the actual I/O will be handled by another board), so it shouldn’t eat up a lot of time. And if either acquisition or output winds up being slower than I think, I can offload either or both processes to auxiliary microcontrollers.

If all this works reasonably well, then I can look at adding a ‘programming’ state (probably switched to by interrupt) which would allow me to assign MIDI key values, modify thresholds, etc.

I wonder just how much of this stuff you can cram into 32K anyway?

FollowBot: First Thoughts

When I decided to attend DragonCon this year I started thinking about elaborations of the idea of ‘costuming’ that went beyond attire and accessories, to environmental effects and devices. I hit on the idea of a “followbot,” based on the Star Wars mousebot which you see zipping around the imperial corridors making electronic squeaky noises.

Such a mousebot would be Arduino-based, and consist of a motor system, steering servos, and some means of controlling direction. I’ve considered making a radio-controlled or even autonomous mousebot, but in this case I want one that will actually track me and stay at heel.

I don’t want to use any optical method of motion tracking, because that would require that my own costume have special features to enable visual recognition. Neither do I want to use infrared (IR) for pretty much the same reasons… even an IR beacon would have to maintain line-of-sight, and so would necessarily have to be a costume feature – not to mention potential interference problems.

That leaves me with radio frequency (RF) tracking, and the problem of determining direction and distance from the platform to me.

My first idea is to mount a circular array of 6 or 8 RF receivers on the platform and use radio signal strength indication (RSSI) to determine the angle from it to a pinging beacon I could carry hidden. The antenna with the highest signal level would be pointing at me, and I can continuously scan the array to track my movement and provide control to the steering motors. Initial reading on this topic suggests that RSSI may be unreliable, since signal strength doesn’t correlate cleanly with distance… in fact, the strength may even drop as the transmitter comes closer in some instances. I suspect this has to do with measuring wavefronts with a single antenna, and I’m going to see whether using the circular array allows me to compensate for that, perhaps by calculating some product of the measurements from adjacent receivers.

So far, my only other idea is to mount a kind of ‘doppler array’ of only 3 or 4 receivers, and measure the time differences of wavefronts. That would require that the signal I transmit is actually coded, so that the processing program would know which ‘pings’ to measure the differences of. Distance measurement would probably then involve a ‘pingback’ of some kind, again for timing comparison. This is a sufficiently complex task that I think I’m going to stick with the RSSI method for now.

Clearly, none of this is happening for this year’s DragonCon. This little project is going to take a while.