So I seem to be getting back into ADB again. Usually I take a long hiatus from a project after a major show, on the order of 6 months or more. I haven’t done that this time. It’s been less than two months since SIGGRAPH, and in the time that’s passed I’ve written a paper about the project, which took a couple of weeks, and I tried some documentation, which only turned out so-so. So it has been relatively slow on the making front. It was at least 5 weeks until I even cracked open the shipping case. A show like that took so much out of me that it’s hard to get back into the zone. Here’s the strange thing. I had been intending to submit documentation of the robot to the Japan Media Arts Festival, but didn’t get it finished in time. That was a bit disappointing, but also a relief. I don’t want to submit the robot for any more opportunities until it is a bit further developed, such that its behavior fully fits the description. Here’s the latest description:
After Deep Blue, or ADB for short, is a snake-like, modular robot designed for haptic interactions with people (see figure 1). It writhes, wriggles, twists and squeezes in response to how it is held and touched. It can be used to explore intimate and emotional relationships with technology through direct physical contact. ADB adapts to, and reciprocates, the energy you put into it through your body. When touched, it comes to life. When stroked, it seeks more contact. And soon, when it is harmed, it will defend itself or try to get away.
It’s that “soon it will” part that I have to change into a “now it does”. In fact, I’m not entirely sure it even does what it is supposed to at this point, which is seeking more contact. It kind of does that, but also seems to have a mind of its own in its movements.
So, I need to get control of the movements. To do this I have to determine whether the sensors are being falsely triggered, and if so, what is causing the false triggers and how to fix them. Alternatively, if the sensors are triggering fine, then it is the software that is not dealing with the stimulus correctly. I already know that to some degree it is both of those problems. A module’s sensors respond to the proximity of other modules, but fortunately not as strongly as to a touch. So I should be able to differentiate between those phenomena (as long as that holds true when many are connected together). As for the software, right now the modules are told to go to discrete positions, whereas what I want is for them to turn until they make contact with skin.
In addition to these immediate issues, I also want to make some changes to the circuit, adding a power-ORing feature that switches between batteries and an external plug. I’ll also reduce the number of send pins for cap sensing (or change the cap sensing circuitry altogether if necessary), implement the gyro, use a faster baud rate for motor communication, and see if I can get batch programming of all the modules to work. There are lots of other things as well. I started scratching the surface this week. More to come.