Peeking Inside the Black-Box

Can we predict AI?

What is it about?

Computer Science

Data/Statistics

IEEE (Journal)

Today, Artificial Intelligence (AI) permeates our daily lives. To put the numbers in perspective, the International Data Corporation (IDC) predicts that global investment in AI will grow from US$12 billion in 2017 to US$52.2 billion in 2021. Indeed, AI has become ubiquitous, and we are used to it making decisions for us, from product and movie recommendations on Netflix and Amazon to friend suggestions on Facebook and personalized ads on Google. This complicates matters, because entrusting important decisions to a system that cannot itself be explained carries obvious dangers. In AI, such a system is called a black box: it can be viewed only in terms of its inputs and outputs, without any knowledge of its internal workings. Its implementation is "opaque". To solve this problem, Explainable Artificial Intelligence (XAI) proposes switching to a more transparent AI.
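As a toy illustration of the black-box idea, consider the sketch below. Every name and number in it is invented for illustration, not taken from the article: the "model" is an opaque function we can only query, and the probe mimics the intuition behind perturbation-based XAI methods, namely nudging each input and watching how the output changes, without ever opening the box.

```python
def black_box(features):
    """An opaque decision system: the caller sees only inputs and outputs.
    (The internals below are a made-up stand-in for, say, a trained model.)"""
    age, income, debt = features
    score = 0.02 * age + 0.5 * income - 0.8 * debt
    return 1 if score > 20 else 0  # e.g. 1 = "approve the loan"

def explain_by_perturbation(model, features, delta=1.0):
    """Crude XAI probe: perturb each input and record whether the
    decision flips. We learn which inputs matter while treating
    the model strictly as a black box."""
    base = model(features)
    influence = {}
    for i, name in enumerate(["age", "income", "debt"]):
        perturbed = list(features)
        perturbed[i] += delta
        influence[name] = model(perturbed) - base
    return base, influence

# Querying the box: the decision alone explains nothing...
decision, influence = explain_by_perturbation(black_box, [30, 45.0, 10.0], delta=10.0)
# ...but the probe reveals which input drives the outcome.
print(decision, influence)
```

Here the probe shows that a change in income, not age or debt, flips the decision; that is exactly the kind of after-the-fact transparency XAI aims to provide for real opaque systems.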