Aug 10, 2013


Dr Mark Sagar with Baby X, a computer-driven simulation based on his own daughter. Photo / Brett Phibbs

Mark Sagar is playing with his computer baby.

“It has a vision system, so if we wave around here, it’ll look at that,” he says, moving his hand in front of a camera on a desktop monitor.

On the screen below, a lifelike image of a baby breaks into a smile. Sagar points to a series of graphs on the right, tracing the neurochemical reactions in the baby’s simulated brain.

“See the activity of the neurons flowing here and we can see neuromodulator levels, such as the oxytocin. If we hide from it, we’ve essentially abandoned the baby. Watch its stress levels – this is cortisol, it’s climbing now.”

The digital baby gets visibly upset as we move out of its sight, and starts to cry.

“Come up and console the baby,” urges Sagar. “Put your face clearly in the middle of the screen.”

As the baby sees me, it relaxes again and starts to smile. Sagar notes the spike in dopamine, which drives the brain’s reward and attention centres, and in oxytocin, the “love hormone” associated with trust and bonding.

“The more time you spend with it, the more oxytocin builds up and it becomes more resilient, just like a real baby.”
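
What the graphs are tracing can be thought of as a feedback loop between stimulus and simulated chemistry. For readers who want a feel for the mechanics, here is a minimal sketch of that loop in Python; the variable names, rate constants and thresholds are illustrative guesses for this article, not values from Sagar’s actual model:

```python
# Minimal sketch of the feedback loop in the demo. All names, rates and
# thresholds are illustrative guesses, not values from Sagar's model.

class BabyState:
    def __init__(self):
        self.cortisol = 0.1   # stress signal: climbs when "abandoned"
        self.oxytocin = 0.1   # bonding signal: builds with face-to-face time
        self.dopamine = 0.0   # reward signal: spikes when a face appears

    def step(self, face_visible, dt=0.1):
        if face_visible:
            self.dopamine = min(1.0, self.dopamine + 0.5 * dt)
            self.oxytocin = min(1.0, self.oxytocin + 0.05 * dt)
            self.cortisol = max(0.0, self.cortisol - 0.2 * dt)
        else:
            # "Abandonment": cortisol climbs, but accumulated oxytocin
            # buffers the stress response -- the resilience Sagar describes.
            stress_gain = 0.3 * (1.0 - self.oxytocin)
            self.cortisol = min(1.0, self.cortisol + stress_gain * dt)
            self.dopamine = max(0.0, self.dopamine - 0.3 * dt)

    @property
    def expression(self):
        if self.cortisol > 0.6:
            return "crying"
        if self.dopamine > 0.3:
            return "smiling"
        return "neutral"
```

In this toy version, hiding from the baby lets cortisol climb while face time builds oxytocin that buffers later stress, which is one way to read the resilience Sagar describes.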

We are playing with Baby X, Sagar’s computer-driven simulation of how a human brain sends emotional signals to a digital face. It’s based on his own 20-month-old daughter, Francesca.

The Auckland University scientist and former Weta Digital animation expert unveiled a prototype at last weekend’s TEDx conference in Auckland and is working on a more developed version, which he hopes to have on long-term public display in the not-too-distant future.

It’s early days but eventually Sagar hopes to create a machine that thinks the same way we do. One that learns, dreams, adapts to its environment and develops a memory and a personality as it grows up – just like a human baby.

“One of my early motivations for this was how do we get a character to animate itself,” says the 46-year-old veteran of CGI blockbusters such as Avatar, King Kong and Spider-Man 2.

“That means almost giving it some sort of digital life force, so you’re trying to put a soul into the machine.”

He took computational neuroscience models of how the brain is thought to work and linked them to biomechanically based models of human faces.
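
In outline, that coupling resembles a two-stage pipeline: a neural model produces muscle activations, and a biomechanical face model turns those activations into deformations of a mesh. The sketch below is only a schematic of the idea, with invented shapes and a plain sigmoid standing in for the neuroscience:

```python
# Schematic only: a neural stage producing muscle activations, fed into a
# biomechanical stage that deforms a face mesh. Shapes and maths invented.
import numpy as np

def brain_step(stimulus, weights):
    """Toy neural stage: stimulus features -> muscle activations in [0, 1]."""
    return 1.0 / (1.0 + np.exp(-weights @ stimulus))

def face_step(activations, muscle_basis, rest_pose):
    """Toy biomechanical stage: blend per-muscle deformation fields."""
    # muscle_basis: (n_muscles, n_vertices, 3) displacement per muscle
    return rest_pose + np.tensordot(activations, muscle_basis, axes=1)

rng = np.random.default_rng(0)
stimulus = rng.normal(size=4)                 # four toy sensory features
weights = rng.normal(size=(3, 4))             # three toy "muscles"
basis = 0.01 * rng.normal(size=(3, 100, 3))   # per-muscle deformations
rest = rng.normal(size=(100, 3))              # 100-vertex face mesh
pose = face_step(brain_step(stimulus, weights), basis, rest)
```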

Sagar cheerfully admits that so far Baby X is thinking “very small thoughts” but believes this could change as the model grows in sophistication and learns from interacting with people and scanning the internet.

“The really long-term goal of where we’re going is to make an electronic consciousness.”

Sagar’s seven-person team at the grandly named Laboratory for Animate Technologies in the university’s bioengineering institute at the city campus on Symonds St are not the only scientists in the world working in this area. What makes their work distinctive, he believes, is the interactive model which allows academics from any brain-related discipline to test their theories and, by trial and error, make Baby X grow smarter.

The initiative builds on Sagar’s pioneering work on computer-generated faces for animated films, which won him two Sci-Tech Academy Awards in 2010 and 2011. His motivation is an unusual blend of scientific and artistic curiosity, which becomes obvious as he peppers the conversation with references to painting, music and a love of story-telling, which ranges from Dostoevsky to science fiction’s obsession with artificial intelligence coming to life.

Sagar’s diverse talents may come from his genes. His mother was an artist and his father a systems analyst. Growing up in Kenya, he remembers his mother encouraging him to draw everything he could see from an early age, including animals in the game parks.

The family moved to New Zealand when Sagar was 3. He became dux of Takapuna Grammar, did a degree in maths and science at Auckland University and then, as he puts it, he became a “beach bum”, travelling round the world for four years, again painting everything in sight. For a while, his stock in trade was doing portraits of customers in English pubs.

When he returned to Auckland University and began a doctorate in bioengineering, the connection between his love of art and science finally fell into place. A colleague was working on computer graphics and Sagar saw how the technique could be applied to eye and heart simulations to make realistic models for doctors to work on. He developed the idea in post-doctoral study at MIT in Boston, making human faces come alive on screen. The result was a short video in which the digitally altered face of an 80-year-old woman reverse-aged into the 20-year-old actress used as a model.

Sagar also used the technology to make Facemail, a digital character which read your email with appropriate expressions, but his immediate future was in the movies. Starting at Sony Pictures (Monster House, Spider-Man 2), he captured expressions on a human actor’s face on video and used software to convert them to instructions which were replicated on a digital character’s face. Back in New Zealand, he refined the process at Weta Digital for Peter Jackson’s King Kong. By the time he got to James Cameron’s 3D breakthrough, Avatar, he could do this in real time, allowing the actors to drive their digital characters’ performances.
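
The capture-and-retarget idea can be sketched as a per-frame fitting problem: given tracked facial landmarks, solve for the expression weights that best reproduce them on the digital character. The plain least-squares formulation below is my assumption for illustration, not the studios’ actual solver:

```python
# Hypothetical per-frame retargeting: fit blendshape weights to tracked
# landmarks by least squares. A plain lstsq solve is an assumption here,
# not the studios' actual method.
import numpy as np

def solve_expression(landmarks, neutral, blendshapes):
    """Find weights w minimising ||neutral + B @ w - landmarks||."""
    n_shapes = blendshapes.shape[0]
    B = blendshapes.reshape(n_shapes, -1).T    # (3 * n_points, n_shapes)
    delta = (landmarks - neutral).reshape(-1)  # observed offsets per frame
    w, *_ = np.linalg.lstsq(B, delta, rcond=None)
    return np.clip(w, 0.0, 1.0)  # crude clamp; real solvers constrain properly
```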

That’s when Sagar started wondering if his digital faces could take on a life of their own. He returned to Auckland University last year, recruited the then 4-month-old Francesca as the model and started work on Baby X.

The emotive digital face demonstrated here is an early version. Sagar hopes that as he adds more building blocks of information and life experience, the baby will help scientists learn about the development of memory and personality.

For instance, he says, you can programme the baby’s chemical levels to make it more anxious (by adding cortisol) or happy (by increasing oxytocin), then watch how it responds to the same events.
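
Using the toy BabyState sketch from earlier, that experiment might look like running the same event sequence past two different baseline chemistries:

```python
# Same event sequence, two baseline chemistries, reusing BabyState above.
anxious = BabyState()
anxious.cortisol = 0.5        # raised stress baseline

contented = BabyState()
contented.oxytocin = 0.8      # a strong bond already built up

events = [True] * 20 + [False] * 30   # face visible, then hidden
for face_visible in events:
    anxious.step(face_visible)
    contented.step(face_visible)

print(anxious.expression, contented.expression)   # -> crying neutral
```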

Rather like Marvin the Paranoid Android from The Hitchhiker’s Guide to the Galaxy, he grins. “I’m sure that book’s been very influential.”

The system also mimics attention modulation, which is the way our brain copes with the huge number of outside influences vying for our concentration. Sagar switches the view to a 3D simulation of the baby’s brain. When he asks me to stand in front of the baby and move my hand, a blue light flares inside the superior colliculus, a visual centre of the brain that detects movement. When I move my eyes, a yellow light blinks. The brain quickly chooses to refocus on my face, overrides the distraction and the blue light fades.
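
Mechanically, that contest can be modelled as a winner-take-all competition between salience scores, with faces given an innate bias. A toy version, with made-up weights:

```python
# Toy winner-take-all attention: competing stimuli get salience scores,
# and faces carry an innate bias. Weights are invented for illustration.

def select_focus(stimuli):
    def salience(s):
        score = s.get("motion", 0.0)   # movement cue, superior-colliculus style
        if s.get("is_face"):
            score += 0.5               # built-in preference for faces
        return score
    return max(stimuli, key=salience)

focus = select_focus([
    {"name": "waving hand", "motion": 0.4, "is_face": False},
    {"name": "visitor's face", "motion": 0.1, "is_face": True},
])
print(focus["name"])   # -> visitor's face, despite the hand moving more
```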

Unfortunately, this selection process does not work for people with Huntington’s or Parkinson’s disease. Huntington’s makes everything in the brain fire up at once and the body twitches uncontrollably. In Parkinson’s the opposite happens – the brain tells facial muscles to move but the signal doesn’t go through, creating a blank stare.

The digital baby can show this process, both in the baby’s facial expressions and in the brain view, giving researchers a powerful tool to test their theories on what makes the circuit break down or overload. It could also help research into amblyopia (lazy eye), a disorder in which the eyes see normally but the brain cannot process the information.

Sagar cautions that the model is only a current best guess about how the brain works, as revealed in our faces.

We understand the dynamics of the eye fairly well (though even the best animators find it very hard to get right). However, “nobody knows everything that happens to make you smile”.

He sees this as an opportunity for scientists to “plug in” rival theories and test which is most likely to be correct, filling in the gaps of knowledge as they go. Language will be especially challenging, as we don’t know much about how the brain works here, but he plans to start with basics.

“Even if it says ‘cat’ when it sees a cat, at least it’s communicating.”

Already the technology behind Baby X looks commercially promising. Sagar reels off the potential uses – interactive lighting and background music in a hotel lobby that change to match the emotional responses of customers, a digital matron in the hospital sternly reminding you to wash your hands, a petrol gauge that shows an increasingly worried face rather than a blinking light when you run low …

Of course, he adds, the animation has to be appropriate. “That petrol gauge would be looking increasingly distressed. It’s not going to yell at you or anything.”

But what happens if his computer baby develops an unpleasant personality? Or the army creates an aggressive fighting drone with a mind of its own?

Sagar is familiar with what he calls “the Skynet question” – a reference to the self-aware artificial intelligence system in the Terminator films that enslaves humans – but he doesn’t think his digital babies would qualify.

“Our ones are likely to have all the fallibilities of people, so the military are unlikely to be interested in them.”

Another popular science fiction dilemma is whether it becomes morally wrong to switch off a machine that becomes capable of independent thought. Sagar’s baby is a long way from that point but he finds the idea fascinating, as it shows how we can bond emotionally with a machine, even when our rational side tells us this makes no sense.

“We want to build a machine that you don’t want to turn off, for that reason. You’ve had something experience the world, do you want to throw that away?”

Sagar sees the “good/bad machine” debate as part of a bigger question about free will and the conflict between our inner desires and higher self.

For artists, he says, this tension lies at the heart of all great drama. Scientifically, it plays out between different regions of the brain as our memories of past behaviours and emotional dispositions battle with the reasoning powers of our cerebral cortex.

He hopes the public will get a chance to learn all this first hand this year if Baby X goes on public display. Sagar wants to show off the machine as it dreams, which he thinks could be visually spectacular. Over time, the experiment could also test a prominent theory that part of the brain called the hippocampus replays memories from the previous day while we sleep, forming associations that evolve into long-term memories in the cortex.
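
That replay theory is often summarised, very loosely, as strengthening associations between items that occurred together during the day. A toy illustration of the idea (nothing here comes from Sagar’s system):

```python
# Loose illustration of the replay idea: "hippocampal" episodes from the
# day are replayed at random, strengthening pairwise associations in a
# "cortical" store. A Hebbian-style counter, nothing from Baby X itself.
from collections import Counter
from itertools import combinations
import random

def sleep_replay(episodes, cortex, replays=100):
    for _ in range(replays):
        episode = random.choice(episodes)
        for a, b in combinations(sorted(episode), 2):
            cortex[(a, b)] += 1   # association grows with each replay
    return cortex

day = [{"mother", "smile"}, {"mother", "voice"}, {"toy", "smile"}]
cortex = sleep_replay(day, Counter())
print(cortex.most_common(2))   # the most strongly associated pairs
```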

And if all these ideas of artificial intelligence with a life of its own sound overwhelming, Sagar has a homegrown reminder that a small human brain is still far more complex than we can imagine.

The latest version of Baby X is based on Francesca at 18 months. That was two months ago, he says, and already “the real baby is mentally advancing, so fast there’s no way we can keep up”.