The Hollywood Secret to Life Extension and Longevity

By Clyde DeSouza
Ethical Technology

Posted: Sep 26, 2013

The year is 2020 and your favorite 1980s actor doesn’t look the way he or she used to. Many iterations of cosmetic and then reconstructive surgery have finally succumbed to gravity and to the shortcomings of the biological substrate that is the human body. This is particularly hard for a superstar actor to come to terms with: years of hard work to reach the top, the adoration of fans, wealth, and ego make a difficult mix to walk away from. An actor’s career, however, need not be over, if he or she chooses the next milestone: voice acting.

But once someone tastes success in Hollywood, they rarely want to leave. Audiences, too, keep demanding one more film from a heartthrob who might be pushing 60 but was their role model for so long that it’s unthinkable for a newcomer to replace them. After all, “The Rock” saying “I’ll be back” is not the same. So what does Hollywood do to solve this dilemma? Digital surrogate actors.

High-definition cameras pick up every flaw in human skin, and one too many beauty passes are required in post, as actors may age faster than most under a combination of work stress, lifestyle, and hours at a time spent under stage lighting. The answer: digitize the actor. Nothing less than full-body digital documentation and performance capture.

When such technology goes mainstream (read: cost-effective), and it will soon, one of its many benefits will be the ability to do an infinite number of “takes”. But the main benefit? By opting for a digital surrogate of themselves, an actor gets an extended lease on life: a digital soul, albeit one sold to Hollywood.

But who owns this digital surrogate of an actor? The actor? The studio? The mo-cap/performance-capture house? To answer these questions, let’s look at how an actor’s digital surrogate is born.

  • The person (actor) undergoes a whole-body scan. [1]

  • More detailed scans are taken of, for instance, the facial skin texture and the hands.

  • This skin texture is “baked”: say the bake happens in September 2013, when the actor is 35 years old. The actor can then choose to “look” 35 in all movies henceforth.

  • Facial expressions are captured using algorithms unique to each facial-capture package. Through CG morph targets and CG face rigs, almost any expression can then be synthesized. See the FACS [2] explanation for more technical details on Facial Action Coding.

  • Signature facial expressions unique to an actor (Jim Carrey, for example) can also be captured as macros rather than re-created via FACS.

  • Finally, motion capture. [3] The solution referenced here shows how Moore’s law is also driving down the price of mo-cap systems. Mocap files can be re-targeted from any actor, though in certain cases the best performance for a particular ‘style’ is best acquired from the original actor.

  • An actor can also “immortalize” his or her voice using voice banking, [4] the technology that people afflicted with ALS have come to rely on. A banked voice may remain useful to an actor long after retirement.
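The capture pipeline above amounts to assembling a bundle of assets tied to one person. A minimal sketch of such a record, with entirely hypothetical field and file names (no real studio schema is implied):

```python
from dataclasses import dataclass, field

@dataclass
class DigitalSurrogate:
    """Hypothetical bundle of assets captured for one actor."""
    actor: str
    baked_year: int                 # year the skin textures were "baked"
    body_scan: str = "body_scan.obj"                      # whole-body mesh
    skin_textures: list = field(default_factory=list)     # face/hand detail maps
    facs_rig: dict = field(default_factory=dict)          # FACS action unit -> morph target
    signature_macros: dict = field(default_factory=dict)  # trademark expressions
    mocap_clips: list = field(default_factory=list)       # performance-capture files
    voice_bank: str = ""                                  # banked voice samples

    def apparent_age(self, birth_year: int) -> int:
        # The surrogate always "looks" the age the actor was at bake time.
        return self.baked_year - birth_year

# A made-up actor captured in 2013, as in the example above.
surrogate = DigitalSurrogate(actor="Jane Doe", baked_year=2013)
surrogate.signature_macros["exaggerated_grin"] = "macro_0042"
print(surrogate.apparent_age(birth_year=1978))  # prints 35: the surrogate stays 35
```

The point of the sketch is that every item in the bulleted list becomes a file or dataset an institution can store, copy, and reuse, which is exactly what the ownership questions below turn on.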


Realistic Human Skin:

The holy grail of CG human rendering is to achieve “realism”. Big strides are already underway that should make digital humans (actors) indistinguishable from live talent when viewed in movies, in real time.
Yes: if the whole digital-surrogate-actor idea is to gain traction in Hollywood, directors will want to work with digital actors in real time, seeing them respond live through the viewfinder. A striking demonstration is the work of one man, Jorge Jimenez, whose algorithms render lifelike skin. He calls the technique separable SSS, [5] and it speeds up render time on today’s GPUs.
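The core idea behind a separable screen-space blur can be shown in a few lines. Instead of one expensive 2D convolution over the lit skin, the blur is split into a horizontal 1D pass followed by a vertical one, cutting the cost per pixel from O(r²) taps to O(2r). This toy version uses a single Gaussian on a grayscale image; the real technique fits sums of Gaussians to measured skin-diffusion profiles and runs on the GPU, so treat this purely as an illustration of separability:

```python
import numpy as np

def gaussian_kernel(radius: int, sigma: float) -> np.ndarray:
    """Normalized 1D Gaussian kernel of length 2*radius + 1."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def separable_blur(img: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Approximate a 2D diffusion blur with two cheap 1D convolutions."""
    k = gaussian_kernel(radius=int(3 * sigma), sigma=sigma)
    # Horizontal pass over rows, then vertical pass over columns.
    h = np.apply_along_axis(lambda row: np.convolve(row, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda col: np.convolve(col, k, mode="same"), 0, h)

lit = np.zeros((32, 32))
lit[16, 16] = 1.0            # a single bright point of diffuse light
soft = separable_blur(lit)   # the light "bleeds" into neighbouring pixels
```

After the two passes, the point of light spreads into a soft halo while its total energy is preserved, which is the visual signature of subsurface scattering in skin.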

To learn more, browse Nvidia’s GPU Gems chapter on rendering more realistic human skin in CG. [6]

The Ethics of DSA (Digital Surrogate Actor) Ownership:

Who “owns” this digital actor today? Full digital replicas of some actors are already lying around on servers at VFX houses and studios, and with a little re-targeting these performance-capture files can be spliced and edited much like video. They can even be broken down to sentence- and phrase-level facial expressions, then mapped onto any other digital character, without the actor’s consent.

For example: suppose a studio needs Jim Carrey’s exaggerated facial expressions for “Mask 3” and Jim Carrey is asking too much for the film. A typical response from the suits at a production studio would be, “Let’s search our servers and see if we ever mo-capped and performance-captured him.” A few minutes later, on a positive file search: “OK, let’s re-target it to a stock digital character and get on with the show.”
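The re-targeting step that hypothetical studio performs is, at its core, renaming the captured expression channels onto another character’s rig. A toy version, with made-up channel names, weights, and rig mapping (no real capture format is implied):

```python
# A performance-capture clip as per-frame FACS action-unit weights.
clip = [
    {"AU12_smile": 0.9, "AU26_jaw_drop": 0.4},   # frame 0
    {"AU12_smile": 1.0, "AU26_jaw_drop": 0.1},   # frame 1
]

# The target rig exposes a different set of morph-target names.
rig_map = {"AU12_smile": "mouthSmile", "AU26_jaw_drop": "jawOpen"}

def retarget(clip, rig_map):
    """Map each frame's capture channels onto the target rig's morph targets."""
    return [
        {rig_map[ch]: weight for ch, weight in frame.items() if ch in rig_map}
        for frame in clip
    ]

for frame in retarget(clip, rig_map):
    print(frame)
```

Because the operation is just a key-for-key mapping over stored numbers, nothing in the data itself records whose face produced the weights, which is precisely why consent has to live in the contract rather than in the file.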

With advances such as Jimenez’s realistic CG face rendering, the studio now has a completely digital actor surrogate within its budget. At this point we are still talking vanilla 3D, but once we have a 3D scan and the digital surrogate, it’s a no-brainer to render it in stereoscopic 3D. When the depth channel kicks in, we have total immersion in artificial reality, and the audience is none the wiser.

This is something for actors to think about. It remains an open question whether actors have signed contracts in which they gave up the rights to their mo-cap and performance-capture libraries. FACS libraries and digital “skins” are personal property that should be copyrightable, and the right to decide how they are used should rest fully with the human the surrogate was made from.

However, there is also a good side to digital actor surrogates.

Think of John Travolta’s “strut” in the final scene of Staying Alive, back in 1983. Can he “strut” the same way now? …err. What if a studio decided to remake Staying Alive? If it had the original actor’s mo-capped strut, it could simply apply it to the backside of a digital actor in leather trousers. A photo-realistic New York backdrop is available today.

How about Michael Jackson’s dance style and the moonwalk? He is often imitated, but in the end the nuances were his. Was his moonwalk ever mo-capped? I do not know. Could it be mo-capped with 100% accuracy from a video file? I’m guessing that, with advances in algorithms and multi-view video analysis, it could.


Images used:

Cyberpunk 2077 – http://www.rogersv.com/blog/cyberpunk-2077-platige-images/

Jim Carrey performance capture – http://www.flickr.com/photos/deltamike/3636774647/

[1] Whole-body scanning – Infinite Realities: http://ir-ltd.net/

[2] FACS – Facial Action Coding System: http://en.wikipedia.org/wiki/Facial_Action_Coding_System

[3] Motion capture – Nuicapture: http://www.youtube.com/watch?v=mhb6Uqxg9Tg

[4] Voice banking: http://www.nbcmiami.com/news/local/Woman-With-Lou-Gehrigs-Disease-Recording-Voice-Using-Voice-Banking-216827071.html

[5] Separable sub-surface scattering: http://www.iryoku.com/separable-sss-released

[6] Realistic skin rendering – Nvidia: https://developer.nvidia.com/content/gpu-gems-3-chapter-14-advanced-techniques-realistic-real-time-skin-rendering

Clyde DeSouza is an author and creative technology evangelist. He explores technologies such as augmented reality, real-time game engines, and stereoscopic 3D, and their influence on human perception.
The IEET is a 501(c)3 non-profit, tax-exempt organization registered in the State of Connecticut in the United States.

Contact: Executive Director, Dr. James J. Hughes,
56 Daleville School Rd., Willington CT 06279 USA 
Email: director @ ieet.org     phone: 860-297-2376