Tuesday, June 02, 2009

Microsoft's Project Natal: Motion Capture, Mime, Puppetry For your Xbox 360?


The computer vision wizards of Microsoft (including Internet sensation Johnny Lee, the guy who hacked a Wiimote into a virtual whiteboard) have been busy working on a controller-less technology that, apparently, can sense shapes and forms and track their motions.

Imagine the uses for puppetry or mime! In the video above, the boy gets to perform the rampages of a giant Japanese monster. The girl drives a car by miming the hands on a steering wheel. I can see this being used for virtual Muppets, where a simple two-handed rod puppet could drive a virtual puppet decorated to look like whatever you want.

Some questions to ponder. Can Project Natal track depth accurately? What's the latency? How many things can it track? If a tracked object gets occluded and then reappears, is there a delay before it gets picked up again?
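
The occlusion question is the one I keep coming back to, because every tracking system has to answer it somehow. Here's a rough sketch, in plain Python with made-up radius and timeout numbers (Natal's actual API hasn't been published), of one common trick: keep a lost track alive for a short grace period and re-attach it by proximity when the object reappears.

import math
import time

class Track:
    def __init__(self, track_id, position):
        self.track_id = track_id
        self.position = position
        self.last_seen = time.time()

class SimpleTracker:
    # Toy tracker, not Natal's actual behavior: match each detection to the
    # nearest existing track, and let unmatched tracks survive a short grace
    # period so a briefly occluded hand or prop keeps its identity.
    MATCH_RADIUS = 0.3    # metres (assumed sensor units)
    GRACE_SECONDS = 0.5   # how long an occluded track survives

    def __init__(self):
        self.tracks = {}
        self.next_id = 0

    def update(self, detections):
        # detections: list of (x, y, z) positions reported this frame
        now = time.time()
        for pos in detections:
            nearest = min(self.tracks.values(),
                          key=lambda t: math.dist(t.position, pos),
                          default=None)
            if nearest and math.dist(nearest.position, pos) < self.MATCH_RADIUS:
                nearest.position, nearest.last_seen = pos, now
            else:
                self.tracks[self.next_id] = Track(self.next_id, pos)
                self.next_id += 1
        # drop tracks that have been unseen longer than the grace period
        self.tracks = {i: t for i, t in self.tracks.items()
                       if now - t.last_seen <= self.GRACE_SECONDS}
        return self.tracks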

Low-cost motion capture / digital puppetry inches closer and closer. I hope Microsoft opens this up to XNA so that indie developers can play with it.

posted by Brian at 10:30 AM

Monday, December 01, 2008

Video Editor from the Future!

posted by Brian at 2:18 PM

Friday, June 20, 2008

SPORE Creature Creator: The Future of Digital Puppet-Building?


Will Wright is the mastermind behind SimCity and The Sims series (the best-selling PC game of all time). His upcoming SPORE takes the player through the stages of living things: starting microbial, then land-based fauna, then social groups (tribal mode), then civilization mode, and finally space mode! But what's really exciting to me is the Creature Creator mode and the technology behind it.

Most modeling and animation packages start with an interface left over from Computer Aided Design (CAD). That is, you model geometry first. When it's how you like it, you add skeletal rigging (bones) and bind the mesh to it. Then you build in all sorts of constraints so that when you move parts of the puppet, it behaves realistically. Finally, after the digital puppet is ready, you can animate it or apply clip-art animation to it. Life takes a while to emerge.
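
To make the contrast concrete, here's a toy sketch of that "geometry first, life last" ordering. Every class and method name below is hypothetical, not any particular package's API.

# Hypothetical names throughout; the point is the order of the steps.
class Mesh:
    def __init__(self, name): self.name = name

class Skeleton:
    def __init__(self, bones): self.bones = bones

class Character:
    def __init__(self, mesh, skeleton):              # 3. bind mesh to bones
        self.mesh, self.skeleton = mesh, skeleton
        self.constraints, self.clips = [], []

    def add_constraint(self, kind):                  # 4. IK handles, limits, etc.
        self.constraints.append(kind)

    def play(self, clip):                            # 5. only now can it move
        self.clips.append(clip)

hero_mesh = Mesh("hero_body")                        # 1. model the geometry first
hero_rig = Skeleton(["spine", "arm_L", "arm_R"])     # 2. then build the rig
hero = Character(hero_mesh, hero_rig)
hero.add_constraint("ik_arm_L")
hero.play("walk_cycle")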

SPORE's interface starts with the premise that you are tweaking a living being. You, the designer*, get to add in functioning hands, legs, and eyes (not just the shapes of them), and this being will react to them, try them out, shake them around, and perhaps disapprove of your modifications. It's very much like the classic Chuck Jones Daffy Duck cartoon Duck Amuck, where the unseen artist keeps redrawing Daffy's background and his body.

I do hope Autodesk and Softimage are watching. Wouldn't it be great to be able to construct creatures in this organic way? Wouldn't it be lovely not to have to worry about building IK handles and binding deformers, and to just start building an instantly animatable character? Granted, there are only 250 shapes available, and with all this built-in procedural structure there are limits to what you can build. But sometimes limitation is exactly what you want. (Who needs the thousands of combinations that DON'T work as an animated being?)
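
For contrast, here's an equally toy sketch of the parts-based alternative, where every catalogue part arrives already rigged and animated, so the creature is "alive" the moment something snaps on. The part names and behaviors are invented for illustration; this isn't SPORE's actual data or code.

# Invented part catalogue; each entry carries its own rig and idle behavior.
PART_LIBRARY = {
    "grasper_hand": {"bones": ["wrist", "fingers"], "idle": "flex"},
    "stalk_eye":    {"bones": ["stalk"],            "idle": "blink"},
    "runner_leg":   {"bones": ["hip", "knee"],      "idle": "shuffle"},
}

class Creature:
    def __init__(self):
        self.parts = []

    def attach(self, part_name, position):
        part = dict(PART_LIBRARY[part_name], name=part_name, position=position)
        self.parts.append(part)
        self.react_to(part)   # no separate rigging step: it works immediately

    def react_to(self, part):
        print(f"creature tries out its new {part['name']} and {part['idle']}s it")

critter = Creature()
critter.attach("stalk_eye", position=(0.0, 1.0, 0.5))
critter.attach("runner_leg", position=(0.3, -1.0, 0.0))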

* It's perhaps unfortunate that SPORE creatures don't seem to "evolve" when they mate, emphasizing the unnecessary (but fun) step of design.

posted by Brian at 7:58 AM

Friday, February 22, 2008

LifeFormz: Mr. Stick


*sigh* OK, I had promised you some LifeFormz footage months ago (last year, in fact), but I became frustrated with the results of digitizing the 15-year-old VHS tape I had -- bad sound, all washed out or too dark, and generally crappy. But just this past week I discovered I had another, much better tape. Yay!

So here's one of everybody's favorite sketches, Mr. Stick. Brian Flumen came up with the idea of a silent film actor who happened to be a stick. Somehow we evolved it into a film-history host showing off a few of Mr. Stick's films: Mr. Stick Goes to Town, its sequel Mr. Stick Comes Back from Town, and Mr. Stick Walks His Dog.

Brian performed the voice of the host while I simultaneously listened and lip-synched along in front of a green screen, chroma-keyed (using an old JVC analog video switcher) with footage taken from U. Penn's Fine Arts Library. Besides wanting Brian to perform, the reason we did it this way was that Penn's UTV station did not have good microphones in the studio at the time, only in the control room. That's why, sadly, most of our sketches did not involve multiple characters speaking. Separately, Brian also performed Mr. Stick himself in front of a green screen, with a chroma-keyed image taken from a book of old streets we found somewhere.

Oh! The piano music... Well, in the grungy basement of the studio, back in a far storage closet, Steve and I found a piano, and one day, a young woman was practicing on it. We asked her if she would play something ragtimey, so she played The Entertainer. Perfect! Steve ended up speeding it up old-school style, by dubbing it off of one S-VHS player to another that was recording at a slower speed. Man, we would have LOVED having Logic or ProTools back then.
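
Funny enough, the digital equivalent of that tape trick is almost nothing: keep the recorded samples but rewrite the sample rate, and playback comes out faster and pitched up, just like dubbing between decks running at different speeds. A tiny sketch using Python's standard wave module; the filenames and ratio are placeholders.

import wave

SPEEDUP = 1.25  # assumed ratio; ~25% faster, pitched up accordingly

# placeholder filenames; point these at your own recording
with wave.open("entertainer_take.wav", "rb") as src:
    params = src.getparams()
    frames = src.readframes(src.getnframes())

with wave.open("entertainer_fast.wav", "wb") as dst:
    dst.setparams(params)
    dst.setframerate(int(params.framerate * SPEEDUP))  # reinterpret the clock
    dst.writeframes(frames)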

The Amiga Video Toaster provided the film-look and black-and-white FX.

One technical challenge we faced was that we could not do compositing after the fact like you can today. So anything being chroma-keyed had to be ready, up and running in either the JVC switcher or the Amiga Video Toaster, or in some cases both! That also meant we needed enough people on hand to operate everything, essentially live, although, editing-wise, we often shot in a film style. This drove UTV nuts because we used WAY more S-VHS tape than everyone else and we produced episodes much, much more slowly than they would have liked. (Not to mention the fact that our puppets and building materials were slowly taking up a huge section of UTV's office!) C'est la vie. We had a hit show and it won a Student Emmy, so they stopped complaining eventually.

Up next, "In Search of the Unknown Unexplained Mysterious Things We Do Not Know Anything About".

posted by Brian at 11:10 PM

Friday, January 11, 2008

Puppet Torture Game on the PS3

Do you remember Sid, the boy in the movie Toy Story who tortured and mangled toys? Do you have a Sony PS3? (No? They're very easy to get, unlike the Nintendo Wii™.) Well, now's your chance to have Sid-like fun as you wreak havoc on your own virtual character doll.

In Sony's new PS3 game Pain, the goal is to fling your puppet (via slingshot, cannon, what-have-you) into brick walls, bowling pins, glass, and other shatterable material. It seems like not so much a game as a toy for letting out the aggression you've accumulated from real life, or perhaps your deep-rooted hatred of representational inanimate objects.

Or looking at it another way, this game is the interactive, digital puppet equivalent of those Jackass movies.

Wouldn't it be cool if you could load in puppet characters / avatars from other games? (Take that, Mario!) Or build your own? Seems like there's a lot of cross-licensing opportunities.

posted by Brian at 3:03 PM

Tuesday, March 27, 2007

Dancing Puppet Game on the Nintendo Wii!


I knew it would happen eventually.

Electronic Arts (Montreal Division) is about to release a game called "Boogie" that seems a lot like my idea for a combined Dance Dance Revolution, Guitar Hero, and puppetry performance game. I'm curious to see how they've used the Wii's controller to make the character dance. Are you just triggering dance motions with certain gestures, or can you really put your own nuance into the movement?
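
Here's a quick sketch of the two philosophies that question contrasts. Nothing here is Boogie's actual design; the clip names and mappings are invented.

# Invented clip names and mappings, purely for illustration.
CANNED_CLIPS = {"shake": "disco_point", "tilt": "hip_sway"}

def triggered_dance(gesture):
    # (a) gesture recognition only: *how* you shook the remote is thrown away
    return CANNED_CLIPS.get(gesture, "idle")

def performed_dance(remote_roll, remote_pitch):
    # (b) continuous mapping: every wobble of the remote becomes character
    # motion, so the performer's timing and nuance survive
    return {"hips": remote_roll * 0.8,        # damped so it stays readable
            "shoulders": remote_pitch * 1.2}  # slightly exaggerated

print(triggered_dance("shake"))
print(performed_dance(remote_roll=15.0, remote_pitch=-5.0))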

I hope it does well, so that publishers will be open to more puppetry-oriented games.

posted by Brian at 1:34 PM

Tuesday, October 24, 2006

Gametrak Fusion -- Is this the Puppet Controller To Die For?

It appears there may be an inexpensive motion capture technology on the horizon (2007). From the specs, it is superior to the Nintendo Wii remote, as it uses ultrasound to detect exact 3-D positions at high speeds, and supposedly it gathers shape information about whatever you've attached the sensor to. They claim this could be used for collision detection and physics.

Also, it's platform-neutral, connects via USB, and will cost about $56.

One could build any number of puppetry controls with it! Virtual rod-puppets, marionette-controllers, shadow puppets. One could even attach it to actual puppets, driving virtual versions on screen.
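
For instance, here's a rough sketch of the marionette idea: treat the sensor's reported 3-D point as the control bar and hang the puppet's head a fixed string-length below it. The read_position() function is just a stand-in for whatever API the real device ends up shipping with.

import random

STRING_LENGTH = 0.4   # metres, assumed

def read_position():
    # placeholder for the real device read; returns (x, y, z) in metres
    return (random.uniform(-0.5, 0.5),
            random.uniform(0.8, 1.2),
            random.uniform(-0.5, 0.5))

def marionette_head(control_bar):
    x, y, z = control_bar
    # the head follows the bar laterally and hangs one string-length below it,
    # never sinking through the virtual stage floor at y = 0
    return (x, max(0.0, y - STRING_LENGTH), z)

for _ in range(3):
    bar = read_position()
    print("bar", bar, "-> head", marionette_head(bar))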

posted by Brian at 1:57 PM

Wednesday, October 04, 2006

Machinima Application Design Requirements, Take One

Examples of Machinima have finally gone viral, such as Overman's Male Restroom Etiquette, which racked up over a million views on YouTube the other day. It was created mostly using the engine and assets of Will Wright's The Sims 2.

There has been much talk of "Hey, this will forever change how we make film!" but before we get carried away, let's think about some of the challenges and requirements of an ideal platform for making movies inexpensively.

* Assets

A primary problem of Machinima is the limited number of existing rigged characters (actors), props, costumes, and environments available in any given engine. The easiest to use are The Sims 2, Second Life, and a few others with communities making things in their spare time. The most flexible are engines where one can model using any available tools (such as Unreal, Panda3D, or Blender), but these require a lot of skill, time, and large teams to produce decent work.

A problem with prefabricated assets is that they follow a particular aesthetic. Sure, you've got a lot of objects to play with in The Sims, but they all look Sims-y. What if you want a dark, noir look? What if you want black & white? What if you want it to look and feel like a movie or video, with gorgeous lighting? You'll end up needing to fix things in post, which can be done, but that adds a lot to the cost. A director has limited choices in any pre-existing game engine.

Currently, low-cost modeling software is too primitive. It does not help a 3-D artist (much less a Machinima director) work at a high, conceptual level. Photographers now have means (to some extent) to alter their works far above the level of pixels of color -- they can use adverbs and similes via Photoshop filters. High-end (expensive) software is beginning to use genetic algorithms, advanced auto-rigging, computational geometry techniques, and novel painting techniques (like ZBrush), but it will be a few years before these are available in low-end or free software. (Though from what I've seen, many of the techniques being incorporated into Will Wright's next game, Spore, could be adopted into inexpensive content creation tools even sooner. Something along the lines of this.)

* Control

As a puppeteer, my biggest gripe with typical game engines is the lack of decent control. Even Suzanne Vega commented (via her avatar) in Second Life that performing in SL was like playing with puppets at home with her kids, only she couldn't move hers the way she wanted. Traditional gamepads and joysticks are good for position and for triggering canned poses, but no software has yet been written that lets them become a means of performance beyond simple, humorous videogame-y moves.
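
Here's a small sketch of what I mean by a means of performance: buttons still pick a canned base pose, but the analog stick continuously bends it, so the performer's own timing shows through. The pose data and ranges are invented for illustration.

# Invented poses and joint ranges; the layering idea is the point.
CANNED_POSES = {
    "neutral": {"head": 0.0,  "arm_L": 0.0,   "arm_R": 0.0},
    "cheer":   {"head": 10.0, "arm_L": 160.0, "arm_R": 160.0},
}

def blend_pose(base_name, stick_x, stick_y, amount=30.0):
    # stick_x / stick_y are in -1..1; angles are in degrees
    pose = dict(CANNED_POSES[base_name])
    pose["head"] += stick_x * amount    # live head turn layered on the pose
    pose["arm_L"] += stick_y * amount   # arms ride the performer's rhythm
    pose["arm_R"] -= stick_y * amount
    return pose

print(blend_pose("cheer", stick_x=0.4, stick_y=-0.7))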

The Sims offers a large number of "canned" behaviors, which I believe is one reason why Male Restroom Etiquette did so well. Sure, it could have been done in Halo with expressionless soldiers. But how much funnier is it to see characters holding themselves in agony as they wait for a toilet, a behavior all too familiar in the Sims?

* Puppeteering / Live Action, vs. Animation and Layered Editing

In music, there are sequencer programs meant explicitly for laying down tracks and editing them. This is akin to the Beatles doing take after take per song on Sgt. Pepper's Lonely Hearts Club Band, then painstakingly crafting each one with effects, filters, sound effects, overdubs, and other production techniques until it worked.

Then there are programs like Ableton Live, meant for triggering musical events or playing in real-time. This can be a lot more spontaneous, like improvisational jazz or improvisational theatre.

Right now, with most game engines, the tools are more akin to puppeteering / live action than they are to full-fledged animation and video editing platforms. Most machinima is edited after the fact, the end result being a video. (Would it be possible to do a LIVE Machinima piece? I think so.)
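
Something like this, say: the same stream of controller events both drives the scene in real time for the audience and gets written out as a timed track that can be re-rendered or edited afterward. The apply_to_scene() call is a stand-in, not any particular engine's API.

import json
import time

def apply_to_scene(event):
    pass   # stand-in: pose characters / move the camera in the engine here

def perform_live(read_input, take_file="take_001.json", duration=5.0):
    take, start = [], time.time()
    while time.time() - start < duration:
        event = read_input()                        # e.g. gamepad state this frame
        apply_to_scene(event)                       # the audience sees it now...
        take.append({"t": time.time() - start,      # ...and it is also recorded
                     "event": event})
        time.sleep(1 / 30)                          # ~30 fps frame pacing
    with open(take_file, "w") as f:
        json.dump(take, f)                          # the performance, as a track
    return take

# e.g. perform_live(lambda: {"stick_x": 0.0, "stick_y": 0.0}, duration=0.1)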

* Content / Storytelling

So what kinds of stories and content can be made with machinima? Over the last hundred years, film has evolved to cover the gamut of topics - romance, horror, drama, comedy, history, sex, religion, documentary, science, you name it. Videogames, so far at least, are largely limited to people, monsters and robots running around with guns, battle-axes or cars.

For machinima to really be significant, we're going to need more assets, and far better control over the characters if they are to be compelling. The democratization of tools for editing video and sound has already made filmmaking easier. The same democratization of the tools that make assets will help tremendously.

posted by Brian at 12:56 PM

Thursday, December 01, 2005

Brave New World of Puppetry, Part II

Andrew over at the Puppet Vision blog is starting some exciting projects regarding digital puppetry, and it got me thinking again about my blog entry from May 2003 (see archive) on this topic, the articles I wrote for the Puppetry Journal, and the early digital puppetry attempts that Ranjit and I did in LifeFormz back in 1993. From what I can tell, Andrew has been building two engines for displaying and triggering both flat and dimensional puppets. This is very cool! But as I mentioned in my blog, there is still much work to be done:
For real-timers like actors or puppeteers, there are few options for manipulating the virtual world that virtual puppets live in. Products do exist, such as the DataGlove™ and various 3D coordinate tracking sensors by Ascension™, Polhemus™ & Vicon™; even musician-oriented mixing boards, remote-control car joysticks, and videogame controllers can be coerced into a form of puppetry interface. However, none of this is ideal for the Puppeteer, whether shadow, hand, rod, talking mouth, or marionette.
We need real-time TWO-handed computer controls built specifically for puppeteers, and software written that can enhance the motion in ways impossible with real puppets.
It sounds like Andrew's working on the latter. Are there any tinkerers out there who are up for the former?

Nazooka over in Belgium is doing some neat stuff, and my friend Dave Barclay over at PerformFX has invented a nifty glove controller. But like nearly everything I've seen (Protozoa, Medialab, the Jim Henson Company, etc.), it's proprietary and unavailable for sale. What we need is an open-source platform for developing real-time-controlled 2-D and 3-D puppet applications with environments, props, lights, and cameras. It sounds like Panda3D is a good start, but now we need ways to interface it with Wacom tablets and other inputs, preferably ones that we can all afford!
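
To show the kind of thing I mean, here's a minimal Panda3D sketch: whatever shows up as the pointer (a Wacom tablet in mouse mode will) drives one joint of a puppet every frame, in real time. The model path and joint name are hypothetical; you'd substitute your own rigged .egg file.

from direct.showbase.ShowBase import ShowBase
from direct.actor.Actor import Actor
from direct.task import Task

class PuppetDemo(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)
        self.puppet = Actor("models/puppet.egg")          # hypothetical model
        self.puppet.reparentTo(self.render)
        # expose one joint so we can pose it directly from code
        self.head = self.puppet.controlJoint(None, "modelRoot", "head")  # hypothetical joint
        self.taskMgr.add(self.drive_head, "drive-head")

    def drive_head(self, task):
        mw = self.mouseWatcherNode
        if mw.hasMouse():
            # pointer position comes in as -1..1; map it to head rotation
            self.head.setHpr(mw.getMouseX() * 45, mw.getMouseY() * 30, 0)
        return Task.cont

PuppetDemo().run()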

posted by Brian at 7:57 AM

Friday, May 30, 2003

Brave new world of puppetry, part I

Like it or not, the world of Motion Arts is going Virtual. Puppetry, the ancient art of manipulating things, is alive and well in today's world of virtual, computer-generated images.

However, the software being written for those who manipulate virtual beings, such as Maya, Softimage, MotionBuilder, Lightwave 3D, 3DStudio Max, Cinema 4D or even Animation Master, is designed for Animators, artisans of a much more recent Artform based on Cinema, Comics (and occasionally Sculpture). Animators, whether the 2D illustration or the 3D puppet variety, are thinking in terms of single photographs or frames. Traditional puppeteers do not think in frames -- their performance is continuous and in real-time, like those of actors, dancers, mime artists, or even stuntpeople.

Computers and Frame-based Arts go hand-in-hand. Digital things broken up into pieces are what computers eat for lunch. The interfaces into computers are like this also -- individual keystrokes, the coordinates of a mouse (or drawing tablet) over time, images from a digital camera, etc. The interfaces that are useful and common enough are designed for the standard one-hand-on-mouse, one-hand-on-keyboard setup that we've been locked into by the legacy of the Desktop GUI Xerox invented in the early '70s.

For real-timers like actors or puppeteers, there are few options for manipulating the virtual world that virtual puppets live in. Products do exist, such as the DataGlove™ and various 3D coordinate tracking sensors by Ascension™, Polhemus™ & Vicon™; even musician-oriented mixing boards, remote-control car joysticks, and videogame controllers can be coerced into a form of puppetry interface. However, none of this is ideal for the Puppeteer, whether shadow, hand, rod, talking mouth, or marionette.
We need real-time TWO-handed computer controls built specifically for puppeteers, and software written that can enhance the motion in ways impossible with real puppets.

Of course, every method of control (interface) has its stylistic nuances. Animation looks like animation, motion captured off a human being looks like that human being, and stop-motion has its own look, too. Some of this nuance is physical, but the rest is in the training philosophy adopted by the performer. Animators tend to adopt the Disney/Nine Old Men "Principles of Animation." Puppeteers adopt similar principles. Actors, Dancers, and Mime artists have many philosophies to choose from within each of their circles. There is a criticism many classically trained Animators have about real-time approaches to Virtual Puppets: using a glove makes the mouth move "like a Muppet." True to some extent, except that software could make the motion style more flexible. IF such software existed.
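
Here's a tiny sketch of what "more flexible" could mean: the raw glove value passes through a style filter before it reaches the mouth, so the same hand motion can be smoothed toward a floatier feel or exaggerated past what the hand actually did. The parameter values are arbitrary.

class MotionStyler:
    # Hypothetical filter, not any existing product's algorithm.
    def __init__(self, smoothing=0.2, exaggeration=1.5):
        self.smoothing = smoothing          # 0..1, higher = follows the hand faster
        self.exaggeration = exaggeration    # >1 pushes past the performed motion
        self.smoothed = None

    def step(self, raw):
        if self.smoothed is None:
            self.smoothed = raw
        # exponential moving average takes out hand jitter
        self.smoothed += self.smoothing * (raw - self.smoothed)
        # then overshoot around the smoothed value for a broader, less Muppety style
        return self.smoothed + self.exaggeration * (raw - self.smoothed)

styler = MotionStyler()
for mouth_open in (0.0, 0.2, 0.9, 0.9, 0.1):    # one glove sample per frame
    print(round(styler.step(mouth_open), 3))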

Advantages of Real-time over Frame-based control

There is always the question -- why bother? Well, if these controls existed and were cheap enough and did what I propose, you could do some things the Frame Artists cannot do:

* Live Virtual Puppet Theatre! There is something magical about seeing things live, even if they are on a screen.
* Improvisation!
* Faster scene takes (for larger CG productions)
* Give children something more to do with their computers than just play standard, violent videogames -- Virtual Puppet storytelling, for instance. This could even be done over the Internet...

Technological Hurdles, and the Future

Alas, despite the power of the 3D graphics cards that have become available only within the last few years, we are still not at the point where one can cost-effectively create an entire Muppet Show, or even one scene from the show, on a computer using real-time tools. It's still more viable to use socks, cheap video cameras, and real sets. But while we wait for the machines to get fast enough, someone needs to build good real-time puppet controls and software for them.

Here's a good start by MIT researcher Andy Wilson.

After all, why should Animation have the monopoly on Virtual Character motion?

posted by Brian at 7:25 PM