Local Model Interpretation: An Introduction
Local model interpretation is a set of techniques aimed at answering questions like: Why did the model make this specific prediction? What effect did this specific feature value have on the prediction?
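To make the idea concrete, here is a minimal sketch of local interpretation: for one specific instance, perturb each feature in turn and record how the prediction changes. The model, weights, and instance below are hypothetical toy examples for illustration, not anything from a real library.

```python
def model(x):
    # Hypothetical toy model: a fixed linear combination of three features.
    weights = [0.5, -1.2, 2.0]
    return sum(w * v for w, v in zip(weights, x))

def local_effects(predict, instance, eps=1.0):
    """Return the prediction change caused by nudging each feature by eps.

    This is a crude local sensitivity measure: it answers "what effect did
    this feature value have on this one prediction?" for a single instance.
    """
    base = predict(instance)
    effects = []
    for i in range(len(instance)):
        perturbed = list(instance)
        perturbed[i] += eps
        effects.append(predict(perturbed) - base)
    return effects

instance = [1.0, 2.0, 0.5]
# For a linear model the per-feature effects recover the weights.
print(local_effects(model, instance))
```

Real-world tools such as LIME and SHAP build far more principled versions of this same perturb-and-observe idea, but the question they answer is the one above: how did each feature value push this one prediction?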
Global model interpretation, by contrast, is a set of techniques aimed at answering questions like: How does the model behave in general? Which features drive its predictions, and which features contribute nothing at all?
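A common global technique is permutation importance: shuffle one feature's values across the whole dataset and measure how much the model's overall error grows. The sketch below uses a hypothetical toy model and synthetic data purely for illustration.

```python
import random

def model(x):
    # Hypothetical toy model: only the first two features actually matter.
    return 3.0 * x[0] + 1.0 * x[1] + 0.0 * x[2]

def mse(predict, X, y):
    # Mean squared error of the model over a dataset.
    return sum((predict(x) - t) ** 2 for x, t in zip(X, y)) / len(X)

def permutation_importance(predict, X, y, feature, seed=0):
    """Error increase after shuffling one feature's column across rows.

    A large increase means the model relies heavily on that feature;
    an increase near zero means the feature is useless to the model.
    """
    rng = random.Random(seed)
    column = [row[feature] for row in X]
    rng.shuffle(column)
    X_shuffled = [row[:feature] + [v] + row[feature + 1:]
                  for row, v in zip(X, column)]
    return mse(predict, X_shuffled, y) - mse(predict, X, y)

rng = random.Random(42)
X = [[rng.random() for _ in range(3)] for _ in range(200)]
y = [model(x) for x in X]  # labels generated by the toy model itself

for f in range(3):
    print(f, permutation_importance(model, X, y, f))
```

Here the third feature's importance comes out exactly zero, because shuffling it cannot change any prediction, while the first feature scores highest, matching its larger weight.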
Regardless of the problem you are solving, an interpretable model is always preferable, because both the end user and your boss or co-workers can understand what your model is really doing.
Gilbert Tanner is a robotics researcher and Bachelor student at the University of Klagenfurt.