The Hollywood Secret to Life Extension and Longevity
Clyde DeSouza
2013-09-26 00:00:00

But once someone tastes success in Hollywood, they never want to leave. Audiences, too, keep demanding one more film from a heartthrob who might be pushing 60 but was their role model for so long that it's unthinkable a newcomer could replace them. After all, "The Rock" saying "I'll be back" is not the same. So what does Hollywood do to solve this dilemma? Digital Surrogate Actors.

High-definition cameras pick up every flaw in human skin, and one too many beauty passes are required in post, since actors may age faster than most under a combination of work stress, lifestyle, and hours at a time spent under stage lighting. The answer: digitize the actor. Nothing less than full-body digital documentation and performance capture.

When such technology goes mainstream (read: cost effective) – and it will, soon – one of the many benefits will be the ability to do an infinite number of "takes". But the main benefit? By opting for a digital surrogate of themselves, an actor gets an extended lease on life – a digital soul, albeit one sold to Hollywood.

But who owns this digital surrogate of an actor? The actors themselves? The studio? The mo-cap/performance capture house? To answer these questions, let's take a look at how an actor's digital surrogate is born.


Realistic Human Skin:

The holy grail of CG human rendering is realism. Big strides are already being made toward digital humans (actors) that are indistinguishable from live talent when viewed in movies – in real time.

Yes, if the whole digital surrogate actor argument is to gain traction in Hollywood, directors will want to work with digital actors in real time, seeing them respond live through the viewfinder. Much of the groundwork here is by one man – Jorge Jimenez – whose algorithms render lifelike skin. He calls his technique separable SSS [5], designed to speed up render times on today's GPUs.

To learn more, browse the GPU Gems 3 chapter from Nvidia on rendering more realistic human skin in CG [6].
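To get a feel for why "separable" matters, here is a minimal sketch in NumPy – not Jimenez's actual code. A skin-like diffusion profile is approximated as a weighted sum of Gaussians, and each Gaussian is applied as two 1-D passes (rows, then columns) instead of one expensive 2-D blur; that separability is what makes the technique cheap enough for real time. The sigma and weight values below are illustrative assumptions, not the published profile.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """1-D Gaussian kernel, normalized to sum to 1."""
    x = np.arange(-radius, radius + 1, dtype=np.float64)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def separable_sss(image, sigmas=(0.6, 1.5, 4.0),
                  weights=(0.4, 0.35, 0.25), radius=8):
    """Approximate a skin diffusion profile as a weighted sum of
    Gaussians, each applied as a horizontal then a vertical 1-D pass.
    `image` is a 2-D float array (one colour channel)."""
    out = np.zeros_like(image, dtype=np.float64)
    for sigma, w in zip(sigmas, weights):
        k = gaussian_kernel(sigma, radius)
        # horizontal pass: blur each row
        h = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, image)
        # vertical pass: blur each column of the horizontal result
        v = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, h)
        out += w * v
    return out
```

For a kernel of radius r, the two 1-D passes cost O(2r) per pixel versus O(r²) for the equivalent 2-D convolution – the same trade that lets a GPU shader run this per frame.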

The Ethics of DSA – Digital Surrogate Actor Ownership:

Who "owns" this digital actor today? Full digital replicas of some actors are already lying around on servers at VFX houses and studios, and with a little retargeting, these performance-capture files can be spliced and edited much like video – even broken down to sentence- and phrase-level facial expressions, then mapped onto any other digital character, all without the actor's consent.

For example: suppose a studio needs Jim Carrey's exaggerated facial expressions for "Mask 3", and Jim Carrey is asking too much for the film. A typical response from the suits at the production studio: "Let's search our servers and see if we've mo-capped and performance-captured him." A few minutes later, on a positive file search: "OK, let's retarget it to a stock digital character and get on with the show."
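The splice-and-retarget step the suits describe is, at its core, a trivial data operation. Here is a hypothetical sketch – the channel names, the plain-dict storage format, and the per-frame value lists are all assumptions for illustration, not any studio's real pipeline:

```python
def splice(curves, start, end):
    """Cut frames [start, end) out of every animation channel."""
    return {name: values[start:end] for name, values in curves.items()}

def retarget(curves, channel_map):
    """Rename capture channels to a target rig's controls;
    unmapped channels are simply dropped."""
    return {channel_map[name]: values
            for name, values in curves.items() if name in channel_map}

# Hypothetical capture: FACS-style action-unit curves, one value per frame.
capture = {
    "AU12_lip_corner_puller": [0.0, 0.4, 0.9, 0.9, 0.2],
    "AU01_inner_brow_raiser": [0.1, 0.1, 0.6, 0.3, 0.0],
}
clip = splice(capture, 1, 4)                                 # phrase-level cut
rig = retarget(clip, {"AU12_lip_corner_puller": "smile_L"})  # map onto a stock character
```

The point is not the code but how little of it there is: once the performance lives on a server as named curves, cutting out a phrase and driving a different face with it needs no consent, no set, and no actor.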

Along with advances like Jimenez's realistic CG face rendering, the studio now has a completely digital actor surrogate within its budget. At this point we are still talking vanilla 3D, but once we have a 3D scan and the digital surrogate, it's a no-brainer to render it in stereoscopic 3D. When the depth channel kicks in, we have total immersion in artificial reality – and the audience is none the wiser.

This is something for actors to think about. It is an open question whether actors have signed contracts giving up the rights to their mo-cap and performance-capture libraries. FACS libraries and digital "skins" are personal property that should be copyrighted, and the right to decide should rest fully with the human the surrogate was made from.

However, there's also a good side to digital actor surrogates.

Think of John Travolta's "strut" in the final scene of Staying Alive, back in 1983. Can he strut the same way today? …err. What if a studio were to decide to remake Staying Alive? If they had the original actor's mo-capped strut, they could simply apply it to the backside of a digital actor in leather trousers. Photo-realistic New York is available today.

How about Michael Jackson's dance style and the moonwalk? He's often imitated, but in the end, the nuances were his. Was his moonwalk ever mo-capped? I do not know. Could it be mo-capped with 100% accuracy from a video file? I'm guessing that with advances in algorithms and multi-view video analysis, it could.

References:

Images used:

Cyberpunk 2077 – http://www.rogersv.com/blog/cyberpunk-2077-platige-images/

Jim Carrey performance capture – http://www.flickr.com/photos/deltamike/3636774647/

[1] Whole Body Scanning – Infinite Realities – http://ir-ltd.net/

[2] FACS – Facial Action Coding System – http://en.wikipedia.org/wiki/Facial_Action_Coding_System

[3] Motion Capture – Nuicapture – http://www.youtube.com/watch?v=mhb6Uqxg9Tg

[4] Voice Banking – http://www.nbcmiami.com/news/local/Woman-With-Lou-Gehrigs-Disease-Recording-Voice-Using-Voice-Banking-216827071.html

[5] Separable Sub-Surface Scattering – http://www.iryoku.com/separable-sss-released

[6] Realistic Skin Rendering – Nvidia – https://developer.nvidia.com/content/gpu-gems-3-chapter-14-advanced-techniques-realistic-real-time-skin-rendering