[MUSIC] Can you talk about issues and concerns surrounding AI?

>> So I would say the number one key challenge is privacy. It's been a challenge getting through to patients, to care providers, to hospital administrators, that their information is safe. So what we've had to do is prove that we're actually anonymizing the data, we're de-identifying it from the patient themselves, and then blending it into the system. That's something we've had to educate our users on to get through that barrier, because it is information that we need. We do need their information in order to provide better health outcomes.

>> Yeah, so, any new technology produces a bit of angst in all of us. Change is inevitable, but change is scary, especially when that scare factor is enhanced and propagated by popular movies and social media. So it's only natural. At the same time, we view this ethical dilemma not as something the technology presents today, but as something we expect to become more prevalent in the future. I think it's something we need to start caring about today and not wait until the year 2050. Some people have described AI as the last invention humankind is ever going to make, basically driving us toward a point where artificial intelligence will eclipse biological intelligence, and at that point humans will become irrelevant.

>> And in terms of ethics when it comes to machine learning technology, this is another one of those gray areas where we still have to catch up and think about the law that goes behind machine learning or AI. And we have to do this with a fundamental assumption that machine learning technology is not human; it's just another computer. For example, I have an application called Make Your Own Mozart that enables you to generate your own pieces of music based on the style of Mozart's and Beethoven's piano sonatas. Now, people ask me, who would own the IP of the music generated by this neural network? It seems like a very difficult question to answer, but if you really think about it, it's actually pretty simple at its core. The machine learning algorithm is just another algorithm. So when it generates a piece of music, whoever owns the rights to use the algorithm is the one that owns the piece of music it generated. For example, let's say I was a programmer and I built this for Company XYZ, and XYZ has the rights to use this application; it's their music once they generate it with the neural network. So there are many different concerns to work through here. This was just one of them.

>> As humans, we have a history of creating technology that can be destructive for the human race. On July 16th, 1945, in the New Mexico desert, we exploded a nuclear device, and we've been struggling to contain that technology ever since. We're always living in fear of wiping out civilization with that device. As humans, we've made an imprint on our planet and our environment. We've had a first hand in deforestation. We've dried up entire seas. These are all things with consequences that could be fundamentally destructive to our society as we know it, to the point where we're now looking for a plan B of relocating human civilization to Mars. So artificial intelligence is no different. Artificial intelligence can be used for nefarious reasons and could possibly be our last great invention, or it could be one of our greatest inventions. It's how we apply it.

>> It's not like in the movies, where AI is some big monster that takes us over. It's just a tool, and it's how we use it. [MUSIC]