Sharir and Wei Yei, working on the Future Physical performance commission
Intelligent City along with Sophia Lycouris and Stan Wijnans, gave
a process demo at the WEAR-ME!! Network Exchange.
The demo started with some background: Sharir is Professor of Dance and Technology
at the University of Texas, and Yei has collaborated with him for
the past two years; previously he collaborated with a team of architects
in Salt Lake City.
Sharir and Yei demonstrated three iterations of wearable computing suits
which they have developed in the past two years. Sharir said: "Issues
relating to human/virtual interactions have been around for a long
time, but we have never mastered the technology." He moved
on to show video footage from a project using the first wearable
suit developed with Wei Yei, entitled The Automated Body Project.
This featured the pair's cyber-suit. Sharir explained: "It collected data
from the wearer, including EEG information, talked to a 'mothership'
and returned, via radio-frequency communications, an image representing
the data. You could move your eyes and the image would move, or
extend an arm and the image would extend - the images were projected
on a transparent screen. A dataglove also let me manipulate that
material in real-time."
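The loop Sharir describes can be sketched in code: sensor readings travel to a base station (the "mothership"), which maps them to parameters for the projected image. Everything here is illustrative, with assumed field names and mappings, not the actual system, which used radio-frequency hardware.

```python
# Hypothetical sketch of the cyber-suit's data loop: sensor frame in,
# image parameters back out. All names and scalings are assumptions.
from dataclasses import dataclass


@dataclass
class SensorFrame:
    eeg: float            # normalised EEG amplitude
    arm_extension: float  # 0.0 (at rest) to 1.0 (fully extended)
    gaze_x: float         # horizontal eye position, -1.0 to 1.0


def mothership_map(frame: SensorFrame) -> dict:
    """Map one sensor frame to parameters for the projected image."""
    return {
        "image_offset_x": frame.gaze_x * 100,       # eyes move -> image moves
        "image_scale": 1.0 + frame.arm_extension,   # arm extends -> image extends
        "brightness": min(1.0, frame.eeg),
    }


frame = SensorFrame(eeg=0.6, arm_extension=0.5, gaze_x=-0.2)
params = mothership_map(frame)
print(params)
```

In the performance this mapping ran remotely, with the result returned to a transparent projection screen rather than printed.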
The first iteration of the cyber-suit featured a large, rigid control
pack on the front, and Sharir and Yei set about shrinking as much
of its electronics into the actual suit as possible. Sharir demonstrated
the second version of the suit operating in a performance entitled
Lullaby: "For me, it was the first time I could create a situation
with a slight relationship between a human and a cyber-human. By
wearing the suit, I could activate different types of movements."
Yei then took over to provide an insight into the technology behind
the cyber-suit: "If this technology is going to integrate into
daily lives, it must withstand being stepped on or chucked in the
back of a rucksack. And it must have multiple levels of redundancy.
It has to be washable and scrunchable - which meant that we would
have to build every piece of the hardware and software."
He continued: "We started with the Cyberprint suit, which measured body responses."
Sharir added: "We were interested in creating an alternative
to a motion-capture suit, able to control a cyber-character from
a distance wirelessly." Yei continued: "With the original
version, wearers had to be upright. With Version 2, which we called
Aurora, we stripped out the unnecessary elements - the laptop, USB
chain and 802.11b wireless networking hardware. We designed a new
controller, called Itchy: if a sensor failed, it would know, and
as it calibrated the sensors it would know who was wearing the suit.
We worked towards shrinking everything down; using Bluetooth for
the network provided controllable sensor nodes."
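Yei credits the Itchy controller with two self-checks: flagging failed sensors and recognising the wearer from calibration readings. A minimal sketch of both, assuming simple thresholds and stored per-wearer profiles (the thresholds, profiles and function names are my own illustrations, not the actual implementation):

```python
# Illustrative self-checks for a wearable sensor controller.
# A sensor that returns no value (or flat-lines at zero) is flagged;
# calibration readings are matched against stored wearer profiles.

def failed_sensors(readings: dict) -> list:
    """Return the names of sensors whose readings look dead."""
    return [name for name, value in readings.items()
            if value is None or value == 0.0]


def identify_wearer(calibration: dict, profiles: dict,
                    tolerance: float = 0.1):
    """Match calibration readings against per-wearer profiles."""
    for wearer, profile in profiles.items():
        if all(abs(calibration.get(key, 0.0) - expected) <= tolerance
               for key, expected in profile.items()):
            return wearer
    return None


profiles = {"sharir": {"arm_span": 1.8, "resting_flex": 0.3}}
print(failed_sensors({"elbow": 0.7, "knee": None}))                       # prints ['knee']
print(identify_wearer({"arm_span": 1.82, "resting_flex": 0.28}, profiles))  # prints sharir
```

In the real suit these checks would run on the embedded controller over the Bluetooth sensor network, not on a host machine.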
Yei went on: "Itchy was then developed into Scratchy, which provided a six-degree-of-freedom
inertial measurement unit. With our final Cybersuit prototype, called
Shorty, you can pass data through the suit itself: the circuitry
is inside the fabric, with mounting points for sensors. Our latest
controller, called Gumdrop, has a radio transmitter and processor
on a board measuring 2.8 by 1.5 cm, can be embedded in cloth and
made waterproof, and is based on a Cirrus Logic ASIC chip which
we designed ourselves."
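A six-degree-of-freedom IMU like the one in Scratchy reports three axes of acceleration and three of angular rate. One conventional way to use such a sample is a complementary filter, blending integrated gyro rate with the accelerometer's gravity estimate; this sketch shows that idea for pitch only, with illustrative names and constants rather than anything from the actual suit:

```python
# Complementary-filter pitch estimate from a 6-DOF IMU sample.
# Constants (alpha, dt) and the axis convention are assumptions.
import math


def pitch_from_accel(ax: float, ay: float, az: float) -> float:
    """Estimate pitch (radians) from the gravity direction alone."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))


def complementary_pitch(prev_pitch: float, gyro_y: float,
                        accel: tuple, dt: float,
                        alpha: float = 0.98) -> float:
    """Blend integrated gyro rate with the accelerometer estimate."""
    gyro_pitch = prev_pitch + gyro_y * dt
    return alpha * gyro_pitch + (1 - alpha) * pitch_from_accel(*accel)


# Level and stationary: gravity on the z axis, no rotation rate.
pitch = complementary_pitch(0.0, 0.0, (0.0, 0.0, 9.81), dt=0.01)
print(round(pitch, 6))
```

The gyro term tracks fast movement between samples; the accelerometer term keeps the estimate from drifting over time.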
So it seems Sharir and Yei are close to achieving their goal of controlling
virtual characters by sensing dance movements in real-time, with
the controlling dancers wearing a suit which is truly non-intrusive.
Sharir gave a revealing quote: "I think the term wearable computing
is a misnomer: I prefer the term performance augmentation",
which shows the direction from which he and Yei are approaching the work.
It will be fascinating to see their technology employed in Intelligent City.