Creating a personal connect with AI: Pranav Mistry, CEO, Samsung’s STAR Labs

Neon, the artificial human prototype conceptualised by computer scientist and inventor
Pranav Mistry, created waves recently. The President and CEO of Samsung’s STAR Labs told ET’s
Surabhi Agarwal in an exclusive interview that he created Neon because human beings are unable to connect with artificial intelligence (AI) assistants such as Apple’s Siri. The Palanpur (Gujarat)-born Mistry, considered one of the most innovative minds in the world right now, said Neon will be a companion to the elderly and to those who are lonely and could even work as a fashion model or news anchor. The 38-year-old also spoke about the dangers posed by AI, echoing Google parent Alphabet Inc’s chief Sundar Pichai, who recently called upon governments to regulate AI. Edited excerpts:

When you started thinking about Neon, what was the problem you were trying to solve?

Two years ago, when we started thinking about it … (we wanted to) push the boundaries of science so that the machine interfaces become just like humans. It solves a lot of problems and opens up a lot of opportunities that never existed before. A lot of businesses are coming to us (for Neon), from media companies, film and even the fashion industry.

This is not an AI chatbot or an assistant; for that you have your phones. Neon is a front end, a human interface to the machines.

Say, for example, if ICICI Bank has domain-specific knowledge, (and) they need to talk to customers in a particular way (then) think about a powerful front-end interface that can connect to any third-party services for B2B applications. When it comes to consumer applications, in the long run, we need to build something that doesn’t exist – someone who can have the human aspects, emotions…

As a science, this thing never existed. Tomorrow, it might solve healthcare problems, or education problems or the loneliness problem in the United Kingdom.

Initially, what could Neon be used for?

I was talking to the chairman of a big media company in (South) Korea. They say the major problem is breaking news at night, since you need to wake up the whole crew. We’re not here to replace the main news anchor, but how about (if) your favourite news anchor is on your phone and tells you, ‘Hey, you missed yesterday’s game. This is what happened.’ That personal connection is more important. As a kid, I grew up with Mickey Mouse. These cartoon characters are not just in our movies, they are in our dreams. My daughter is sad when Princess Fiona is sad; the human connection is what keeps us attached as humans. And that is the reason behind doing this.

So, it’s like a face to AI assistants like Alexa or Siri?

Yes, AI doesn’t have a face right now. No technology has that interface. Even when our soldiers are in the field, somebody needs to talk to them. Are they mentally healthy? You can put a questionnaire to them: ‘On a scale of 0-6, how are you feeling?’ That’s not the right interface in that situation. If only someone could talk to them…

An old person…has an AI system to give him all the answers at his home, but what he is looking for is someone to talk to. Because, that’s where the loneliness problem is coming in. If you notice all these problems of loneliness or autism…they are coming from countries that already have everything — every single piece of technology — they can watch any movie, but they still feel lonely… because current technology is not solving that problem. Because there is no human face to the technology… and that is why I am doing this. STAR Labs is an independent future factory. No one decides what STAR Labs does. It is fully backed by Samsung, but Samsung does not ask a single question.

Theoretical physicist and cosmologist, the late Stephen Hawking, had said that AI could be dangerous. Others, too, have echoed those thoughts. Are you concerned about what you are building?

Every technology, even the smallest neon bulb, can be misused. Nuclear fission is seen as a threat to the world right now, but the same nuclear fission drives a significant share of the world’s electricity. AI has its own problems, but the reason I feel even more comfortable about this is that if I don’t do this thing right now, in two years’ time someone else will be talking about it…

I feel comfortable because…I understand what the ethical standards of this thing should be, because I am imagining and dreaming of a better world, not just a richer world for a few. So many of my products and ideas, like SixthSense, were open source; when I did Galaxy Gear, it had nothing to do with money.

What are the checks and balances an AI ecosystem needs to have?

In this phase of AI, the understanding of it in the scientific world, the technology world and the consumer world is very different. The core people who are actually behind this know where the technology’s expertise ends. They know that AI has nothing to do with intelligence.

AI is a smart database system. If you do not show a machine 1,000 pictures of an elephant, it cannot recognise one. It doesn’t even know what an elephant is. That is where the limitations of the current approaches to AI lie, because nature doesn’t work this way. My daughter can see a vague drawing and tell you that it’s an elephant, but an AI machine will not do the same, because the approach is not the same.
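The point above can be sketched with a toy supervised classifier. The data, features and labels here are entirely hypothetical (nothing to do with Neon or any real vision system); the sketch only shows that a model trained on a fixed set of labels can never answer outside them:

```python
# A minimal nearest-neighbour classifier: it "knows" only what its
# training examples showed it. Toy 2-D features stand in for images.
from collections import Counter
import math

def nearest_neighbour(train, query, k=3):
    """Label `query` by majority vote of its k nearest training points."""
    by_distance = sorted(train, key=lambda pair: math.dist(pair[0], query))
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Hypothetical training set: (feature vector, label). Note there are
# no "elephant" examples at all.
train = [
    ((0.9, 0.1), "cat"), ((0.8, 0.2), "cat"), ((0.85, 0.15), "cat"),
    ((0.1, 0.9), "dog"), ((0.2, 0.8), "dog"), ((0.15, 0.85), "dog"),
]

# A query unlike anything seen in training is still forced into one of
# the known labels; the model cannot say "elephant", or even "unknown".
label = nearest_neighbour(train, (0.5, 0.5))
print(label)
```

Whatever the query, the classifier's answer is drawn from the labels it was trained on, which is the limitation Mistry describes: without 1,000 elephant pictures, "elephant" is simply not in its vocabulary.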

But we still need the checks, both as a consumer community and as the scientific and technology community. We need to put boundaries in at the architecture and design level, like in science fiction movies; in any robotics movie there used to be the three laws of robotics, such as ‘a robot will not hurt humans’. This is important because anything has the potential to explode or to make the world better.

So, who puts in those rules?

That is for the world to decide, the core thinkers of the world. And that’s why I feel, if I don’t do it (build Neon), someone else will. That’s why I feel much safer that whatever is going to come will come through me. I would love to have that conversation, not at the government level but at the community level: can we ensure there are rules, and that everybody follows them?
