Explainerdashboard

Latest version: v0.4.8


0.2.5

New Features
- New dashboard tab: WhatIfComponent/WhatIfComposite/WhatIfTab: lets you
explore what-if scenarios by editing multiple features and observing the
resulting shap contributions and pdp plots. Switch it off with the
ExplainerDashboard parameter whatif=False.
- New login functionality: you can restrict access to your dashboard by passing
a list of `[login, password]` pairs:
`ExplainerDashboard(explainer, logins=[['login1', 'password1'], ['login2', 'password2']]).run()`
- Added 'target' parameter to explainer to make plots more descriptive:
e.g. setting target='Fare' shows 'Predicted Fare' instead of
simply 'Prediction' in various plots.
- In detailed shap/interaction summary plots, you can now click on a single
shap value for a particular feature and have that index highlighted
across all features.
- Autodetecting Google Colab environments and setting mode='external'
(and suggesting this for Jupyter notebook environments)
- Confusion matrix now shows both percentages and counts
- Added classifier model performance summary component
- Added cumulative precision component
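The confusion matrix change can be illustrated with a small standalone sketch (plain Python, not the dashboard component itself): each cell carries the raw count alongside its percentage of the total.

```python
from collections import Counter

def confusion_cells(y_true, y_pred, labels):
    """Return {(true, pred): (count, percentage)} for every label pair,
    mirroring a confusion matrix that shows both counts and percentages."""
    counts = Counter(zip(y_true, y_pred))
    total = len(y_true)
    return {
        (t, p): (counts[(t, p)], 100.0 * counts[(t, p)] / total)
        for t in labels for p in labels
    }

cells = confusion_cells(
    y_true=[1, 0, 1, 1, 0, 0],
    y_pred=[1, 0, 0, 1, 0, 1],
    labels=[0, 1],
)
# e.g. cells[(1, 1)] holds two true positives and their share of all samples
```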


Improvements
- Added documentation on how to deploy to Heroku
- Cleaned up modebars for figures
- ClassifierExplainer now asserts that the model has a predict_proba attribute
- With model_output='logodds', the prediction summary still displays probabilities

Other Changes
- Removed the monkeypatching shap_explainer note

0.2.4

New Features
- added ExplainerDashboard parameter "responsive" (defaults to True) to make
the dashboard layout responsive on mobile devices. Set it to False when e.g.
running tests on headless browsers.

Bug Fixes
- Fixed a bug that made RandomForest and XGBoost explainers unpicklable

Improvements
- Added tests for picklability of explainers
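The picklability tests mentioned above amount to a serialize/deserialize round-trip. A minimal stdlib sketch (a plain dict stands in for an explainer's state here):

```python
import pickle

def pickle_roundtrip(obj):
    """Serialize and deserialize, returning the restored object."""
    return pickle.loads(pickle.dumps(obj))

# a plain dict standing in for an explainer's state
explainer_state = {"target": "Fare", "model_output": "probability"}
assert pickle_roundtrip(explainer_state) == explainer_state
```

An explainer is picklable exactly when this round-trip succeeds and the restored object behaves like the original.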

0.2.3.2

0.2.3

Breaking Changes
- RandomForestClassifierExplainer and RandomForestRegressionExplainer will be
deprecated: you can now simply use ClassifierExplainer or RegressionExplainer and the
mixin class will automatically be loaded.

New Features
- Added support for visualizing individual trees of XGBoost models
(XGBClassifier and XGBRegressor)! The XGBExplainer mixin class is loaded
automatically and makes the decisiontree_df(), decision_path() and plot_trees()
methods available; the dashboard Decision Trees tab and components now also work for
XGBoost models.
- new parameter n_jobs for calculations that can be parallelized (e.g. permutation importances)
- contrib_df, plot_shap_contributions: can now order by global shap feature
importance with sort='importance' (as well as 'abs', 'high-to-low'
and 'low-to-high')
- added actual outcome to plot_trees (for both RandomForest and XGB)
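The sort options for contrib_df can be illustrated with a toy sketch (plain Python, not the library code; the 'importance' ordering here uses a hypothetical global-importance ranking passed in by hand):

```python
def sort_contributions(contribs, sort='abs', importances=None):
    """Order (feature, shap_value) pairs the way contrib_df's sort options do:
    'abs': largest absolute contribution first
    'high-to-low' / 'low-to-high': by signed contribution
    'importance': by a global feature-importance ranking supplied separately
    """
    if sort == 'abs':
        return sorted(contribs, key=lambda fv: -abs(fv[1]))
    if sort == 'high-to-low':
        return sorted(contribs, key=lambda fv: -fv[1])
    if sort == 'low-to-high':
        return sorted(contribs, key=lambda fv: fv[1])
    if sort == 'importance':
        return sorted(contribs, key=lambda fv: importances.index(fv[0]))
    raise ValueError(f"unknown sort: {sort}")

contribs = [('Age', -0.5), ('Fare', 1.2), ('Sex', 0.1)]
sort_contributions(contribs, 'abs')  # Fare, Age, Sex
sort_contributions(contribs, 'importance',
                   importances=['Sex', 'Fare', 'Age'])  # Sex, Fare, Age
```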


Improvements
- optimized code for calculating permutation importance, adding possibility to calculate in parallel
- shap dependence component: if no color column is selected, plot standard blue dots instead of ignoring the update
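As a rough sketch of the idea behind parallelizing permutation importance (not the library's implementation): each feature's importance is the score drop after shuffling that one column, and the per-column work is independent, so it maps naturally onto a worker pool. A toy stdlib version:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def permutation_importance(X, y, score_fn, n_jobs=2, seed=0):
    """Toy permutation importance: for each column, shuffle it and
    report the drop in score; columns are processed in parallel."""
    base_score = score_fn(X, y)

    def column_importance(j):
        rng = random.Random(seed + j)    # independent rng per column
        X_perm = [row[:] for row in X]   # copy so columns don't interfere
        col = [row[j] for row in X_perm]
        rng.shuffle(col)
        for row, value in zip(X_perm, col):
            row[j] = value
        return base_score - score_fn(X_perm, y)

    with ThreadPoolExecutor(max_workers=n_jobs) as pool:
        return list(pool.map(column_importance, range(len(X[0]))))

def accuracy(X, y):
    """A 'model' that predicts column 0 directly, so column 1 is irrelevant."""
    return sum(row[0] == target for row, target in zip(X, y)) / len(y)

X = [[i, i % 2] for i in range(8)]
y = list(range(8))
importances = permutation_importance(X, y, accuracy)
# shuffling the irrelevant column 1 costs nothing, so importances[1] == 0.0
```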

Other Changes
- added selenium browser based integration tests for dashboards (also working with github actions)
- added tests for multiclass classification, DecisionTree and ExtraTrees models
- added tests for XGBExplainers
- added proper docstrings to explainer_methods.py

0.2.2

Bug Fixes
- Fix for shap v0.36: import approximate_interactions from shap.utils instead of shap.common
- Fixed a kernel shap bug
- Fixed a contrib_df bug with topx
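The shap v0.36 fix is the usual import-location shim; a sketch of a version that supports both shap versions (and, as an assumption added here, degrades to None when shap is not installed at all):

```python
# shap >= 0.36 moved approximate_interactions from shap.common to shap.utils;
# trying the new location first keeps both versions working.
try:
    from shap.utils import approximate_interactions
except ImportError:
    try:
        from shap.common import approximate_interactions  # shap < 0.36
    except ImportError:
        approximate_interactions = None  # shap not installed in this environment
```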

0.2.1

