Sketch 14: Understanding the decisions of ML






Summary Limerick

A Local Interpretable Model-Agnostic explains
what's going on in the layers of a neural net's brains:
it's a simpler twin,
taking the same data in,
that shows the same patterns that the data contains
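The limerick's "simpler twin" idea can be sketched in a few lines of code: perturb the data around one point, ask the black-box model for its answers, weight the samples by how close they are, and fit a plain linear model that locally mimics the complex one. This is a minimal illustration of the LIME principle, not the real `lime` library; the `black_box` function and the kernel width are made-up stand-ins.

```python
import numpy as np

def black_box(X):
    # Stand-in for an opaque model (e.g. a neural net):
    # a nonlinear function of two input features.
    return np.tanh(3 * X[:, 0]) + X[:, 1] ** 2

def lime_explain(f, x, n_samples=1000, width=0.5, seed=0):
    """Fit a proximity-weighted linear surrogate to f around the point x."""
    rng = np.random.default_rng(seed)
    # 1. Perturb: sample points in the neighbourhood of x.
    X = x + rng.normal(scale=width, size=(n_samples, x.size))
    y = f(X)
    # 2. Weight each sample by its closeness to x (Gaussian kernel).
    w = np.exp(-np.sum((X - x) ** 2, axis=1) / (2 * width ** 2))
    # 3. Fit a weighted least-squares linear model: the "simpler twin".
    A = np.hstack([X, np.ones((n_samples, 1))])  # add intercept column
    sw = np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(A * sw, y * sw.ravel(), rcond=None)
    return coef[:-1]  # per-feature local importance (drop the intercept)

x0 = np.array([0.0, 1.0])
weights = lime_explain(black_box, x0)
print(weights)  # local slope of each feature near x0
```

The surrogate's coefficients act as the explanation: a large weight for a feature means the black box's output is locally sensitive to that feature around the chosen point.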


Mind Map

4tuitous Knowledge Network

Groups
Hacker Noon

People
Sundar Pichai

Animated Discussion

Share your questions, answers, limericks, mind maps, sketches and insights with others.
Join the discussion on your favourite forum

Quick Quiz

AI models are sometimes called a 'Black Box' because

Understanding the decisions of a Neural Network would be impossible because

A LIME model can indicate how the decisions of a Neural Network were arrived at by

LIME stands for...

You have __ right answers