Also at Delft is the programmer who was working at the last Cyberstudio, at Motek, to link LifeForms with motion capture (Polhemus system) to animate LifeForms characters in realtime (or to automatically notate a dancer's movements). ACTION -- email Kirk Woolford for his name?
Future Moves people: TU Delft (dr. ir. Hans Furneé, ir. Jan Cees Sabel of Delft Motion Analyses, and ir. Hans van Veenendaal) -- I think Jan Cees Sabel is the guy... email Kirk.
"The project Limelight, named in reminiscence of the Charlie Chaplin film, is a modular program which integrates all components necessary for professional computer-aided choreography. Work in Limelight is project-oriented: in the project window you collect all media belonging to the choreography, ranging from the storyboard, music (as sample and as notation), video, and pictures to animation control data and dance notation. Some media are imported; other media are entered or generated here. The media can be synchronised via a common timecode. An important instrument is Labanotation, from which a computer animation is generated. With LabanPad (a program for interactive Labanotation input, using the Apple Newton palmtop) as host program, notation can be imported into Limelight or newly entered there. In contrast to the small display of the Apple Newton, you can work stationary in a large format. Labanotation is also the basis of a future expert system for choreography."
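The Limelight description above hinges on synchronising heterogeneous media (storyboard, music, video, notation) via a common timecode. A minimal sketch of that idea in Python -- all names, and the 25 fps frame rate, are my assumptions for illustration, not Limelight's actual design:

```python
from dataclasses import dataclass

FPS = 25  # assumed PAL frame rate

@dataclass
class MediaItem:
    name: str         # e.g. "storyboard", "music", "labanotation"
    start_frame: int  # position on the common timecode
    duration: int     # length in frames

def timecode(frame: int, fps: int = FPS) -> str:
    """Render a frame count as hh:mm:ss:ff."""
    ff = frame % fps
    s = frame // fps
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{ff:02d}"

def active_at(items, frame):
    """Names of all media items live at a given frame of the timecode."""
    return [m.name for m in items if m.start_frame <= frame < m.start_frame + m.duration]

project = [
    MediaItem("music", 0, 500),
    MediaItem("labanotation", 100, 300),
    MediaItem("video", 250, 250),
]
```

For example, `active_at(project, 300)` returns all three items, since frame 300 falls inside each item's span on the shared timecode.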
David Rodger's site: http://farben.latrobe.edu.au/motion/dance.html
Jack: http://www.transom.com/
Transom Technologies, Inc. is the world’s leading provider of human modeling and simulation software. Over 100 industrial, government and academic organizations throughout the world use the company’s digital human technology to improve product designs, refine manual workplace procedures, and create compelling simulations.
Flavia Sparacino at MIT Media Lab: http://www-white.media.mit.edu/~flavia/

Sue Ki Wilcox -- This book offers a unique perspective on avatars -- inside guides to lead people through complex virtual worlds -- combining a how-to approach with real-world business concerns. A CD-ROM contains demo software and models for avatars. A companion Web site will also be updated in conjunction with inquiry.com. Table of contents: Catching the Wave; Working with 3D Graphics; USING AVATAR CONSTRUCTION SETS; AvatarMaker: Sven Technologies; Avatar Assembler from Cosmo Software; Poser from MetaCreations; 3D Assistant from 3D Planet; VRML Avatars; Oz: "Who Would You Like to Be Today?"; Le Deuxième Monde: Second World; Figure Sculptor by Attic Graphics; OnLive! The Audio Pioneers; WORKING WITH PRE-BUILT BODIES; Ergonomic Avatars and Virtual Humans; Ready to Wear: 3D Clip Art Avatars; PROFESSIONALS POLISH; Skimming the Surface with Voxels: Animatek; 3D Scanning for a Perfect Avatar; Cybermasks; GIVING THE BREATH OF LIFE; LifeForms: Bringing Avatars to Life; Keeping Things Moving: Behavioral Animation Versus Motion Capture; Beyond Snowcrash: The Future of Avatars.
"Avatars are the 3-D representatives of people that inhabit virtual worlds. They serve as our mediums to interact with 3-D worlds for activities such as playing interactive games or walking through an architectural model of a home. And just like people, the potential variations are infinite, from how they look and move to what they see and think. This is a complete guide to understanding, creating, and using avatars in virtual worlds."
Digital Image Design Incorporated -- interesting input devices, such as Monkey 2, a hands-on desktop input device for keyframing and performance capture. Engineered to provide exceptional freedom in designing motion, Monkey 2 can be manipulated any way you want. Twist him. Bend him. Pose him as you see fit. Your on-screen 3D character follows.
Is it possible to organise a demonstration when in NYC? Works with the following software:
Some basic concepts/ keywords:
1) Keyframing -- Traditional animation proceeds with the head animator sketching "key frames" and having others create all of the frames in between them. The Monkey 2 allows easy creation of key frames for 3D animators, using forward kinematics for complete control of all joints. The Monkey 2 is used in this way in creating animation for entertainment and games.
2) Real-time performance capture -- Some animators prefer to let puppeteers manipulate the armature in real time in order to get puppet-like motion into their scene. Monkey 2 has been carefully designed to allow easy real-time manipulation and a smooth, comfortable feel.
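The forward-kinematics keyframing described above can be sketched with a toy planar chain in Python -- the function names and the per-joint linear in-betweening are illustrative assumptions, not DIDI's software:

```python
import math

def fk_2d(joint_angles, lengths):
    """Forward kinematics of a planar chain: accumulate joint angles
    along the links to place each joint in the plane."""
    x = y = theta = 0.0
    points = [(x, y)]
    for angle, length in zip(joint_angles, lengths):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        points.append((x, y))
    return points

def inbetween(key_a, key_b, t):
    """In-between pose at t in [0, 1]: per-joint linear interpolation
    between two key poses (lists of joint angles)."""
    return [a + (b - a) * t for a, b in zip(key_a, key_b)]
```

Setting joint angles (by hand, or from an armature like the Monkey) and running FK gives the on-screen pose; in-betweens between key poses are then generated automatically.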
Send them email (monkey@didi.com) to see what's up... possibility of borrowing a Monkey?? for the cyberstudio??
Are classical animation techniques better suited to imparting life to computer-animated characters than performances captured from an actor or puppeteer via motion capture or digital input devices? The panel focused on the application and value of the many forms of motion-capture technology in CG character animation: full-body motion capture systems, digital and stop-motion armatures, and other real-time physical input media.
Jason Marchant comments:
What do you want out of an animation program with regards to dance? Life Forms is a good choreographic process tool. I haven't seen the realization of movement made simpler in any other 3D animation tool in terms of time and cost. Life Forms exports directly to VRML 2.0 and accepts BioVision data. Life Forms runs on multiple platforms, like my PB 520.
Lar Lubovitch's approach does make sense. I find working with multiple figures in Life Forms is relatively easy and a great way to visualize complex movement patterns in space to different rhythms. If you're interested in incorporating dance in virtual 3D environments, you might want to check out the Biovision group. I believe they are set up in California, possibly the City of Angels, but it could be San Francisco. When I plugged some of the mocap examples into Life Forms it was as real as animated movement could look -- as good as the 3D Studio MAX dancing baby demo. In mocap every frame is a key frame, but it is interesting to try to combine some keyframe animation with mocap and see how the data can be manipulated.
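Marchant's observation -- that mocap gives a key on every frame, while hand animation gives sparse keys, and that mixing the two is worth trying -- can be sketched as a simple channel blend in Python (all names here are hypothetical, not Life Forms' or BioVision's actual API):

```python
def lerp_keys(keys, frame):
    """Evaluate sparse keyframes {frame: value} at an arbitrary frame
    by linear interpolation (flat extrapolation past the ends)."""
    frames = sorted(keys)
    if frame <= frames[0]:
        return keys[frames[0]]
    if frame >= frames[-1]:
        return keys[frames[-1]]
    for f0, f1 in zip(frames, frames[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return keys[f0] + (keys[f1] - keys[f0]) * t

def blend(mocap, keys, weight):
    """Mix a dense mocap channel (one sample per frame) with a sparse
    keyframed channel; weight=0 is pure mocap, weight=1 pure keyframes."""
    return [(1 - weight) * m + weight * lerp_keys(keys, f)
            for f, m in enumerate(mocap)]
```

With `weight` animated over time, a sequence could ease from captured movement into hand-keyed movement and back -- one way of "seeing how the data can be manipulated".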
Thacker, Eugene. ".../visible_human.html/digital anatomy and the hyper-texted body." Published online in CTheory: Theory, Technology and Culture, Vol. 21, No. 1-2, Article 60, 98/06/02. Editors: Arthur and Marilouise Kroker.
The NPAC Visible Human Viewer: http://www.npac.syr.edu/projects/vishuman/VisibleHuman.html. There are three types of image slices -- axial, sagittal, and coronal. Small (preview) images for each of these viewpoints are displayed in the main panel of the viewer. Moving one of the cutting lines creates a new slice through the Visible Human; you can then select 'load' to pop up, and possibly print/download, the selected 'slice'. (Had to use Internet Explorer -- crashed Netscape.)
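The three slice orientations in the viewer are just cuts along the three axes of one 3D volume. A minimal NumPy sketch (the axis ordering is my assumption for illustration, not NPAC's implementation):

```python
import numpy as np

# Toy stand-in for the Visible Human data set: a 3D intensity volume
# indexed (z, y, x) = (head-to-toe, front-to-back, left-to-right).
volume = np.arange(4 * 3 * 2).reshape(4, 3, 2)

def axial_slice(vol, z):
    """Horizontal cut at height z (top-to-bottom axis)."""
    return vol[z, :, :]

def coronal_slice(vol, y):
    """Front-to-back cut at depth y."""
    return vol[:, y, :]

def sagittal_slice(vol, x):
    """Side-to-side cut at position x."""
    return vol[:, :, x]
```

Moving a cutting line in the viewer amounts to changing the index along one axis and re-extracting the corresponding 2D plane.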
The Biometric Digest - a monthly publication presenting an executive news digest on biometrics, identification, security, fraud, finger imaging, voice recognition, retinal scanning and other means of positive identification.
Floating Point Unit (the Body without Organs piece is named in Thacker's article -- relationship with Artaud) -- another Thacker thing: an Artaud quote, apparently "make human anatomy dance at last".
One topic for further consideration is the linking of ‘traditional’ dance notation systems into this ‘new’ software: i.e. Labanotation/ Benesh -- in realtime?
Digital Choreographic Software for Dance Makers: At present, the only viable software 'tool' for choreographers is LifeForms (some would say Poser). This project proposes to research the possibility of a 'new' software tool for choreographers. The research takes into account that digital space is an 'alternative' space for dancemaking, and therefore the project is immediately concerned with issues of aesthetics, architectures, etc. The project sees itself as creating a framework within which to support, but also to question and challenge, current dancemaking assumptions and methodologies.
Procedure -- to investigate:
At present, this initial pre-research phase of the project is being supported by the V2 -- Media Lab in Rotterdam, NL.