[ { "chunk_id": "b8de41cb-b9a8-4e3e-892b-c3932dd68989", "text": "In preparation v2.1 (2018) 1-48 DALEX: Explainers for Complex Predictive Models in R Przemyslaw Biecek przemyslaw.biecek@gmail.com\nFaculty of Mathematics and Information Science\nWarsaw University of Technology\n75 Koszykowa Street, Warsaw, Poland", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 0, "total_chunks": 29, "char_count": 277, "word_count": 35, "chunking_strategy": "semantic" }, { "chunk_id": "f8bae3eb-436c-4dce-86cb-fbfd4e6329ec", "text": "Editor: ...\nAbstract\nPredictive modeling is invaded by elastic, yet complex methods such as neural networks or ensembles (model stacking, boosting or bagging). Such methods are usually described\nby a large number of parameters or hyperparameters - a price that one needs to pay for\nelasticity. The very number of parameters makes models hard to understand. This paper describes a consistent collection of explainers for predictive models, a.k.a.\nblack boxes. Each explainer is a technique for exploration of a black box model. Presented\napproaches are model-agnostic, which means that they extract useful information from any\npredictive method regardless of its internal structure. Each explainer is linked with a specific\naspect of a model. Some are useful in decomposing predictions, some serve better in understanding performance, while others are useful in understanding importance and conditional\nresponses of a particular variable. Every explainer presented in this paper works for a single model or for a collection of\nmodels. In the latter case, models can be compared against each other. Such comparison helps to find strengths and weaknesses of different approaches and gives additional\npossibilities for model validation. 
Presented explainers are implemented in the DALEX package for R. They are based\non a uniform standardized grammar of model exploration which may be easily extended. The current implementation supports the most popular frameworks for classification and\nregression. Keywords: machine learning, R, visualization, model interpretability, modeling. Then we show how our methodology helps to better understand complex\npredictive models, a.k.a. black boxes.", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 1, "total_chunks": 29, "char_count": 1675, "word_count": 229, "chunking_strategy": "semantic" }, { "chunk_id": "8806aff3-e92e-44c7-a1a8-a6dcd941636e", "text": "Predictive modeling has a large number of applications in almost every area of human\nactivity, starting from medicine, marketing, logistics, banking and many others. Due to the\nincreasing amount of collected data, models become more sophisticated and complex. It is believed that there is a trade-off between the interpretability and accuracy of\na model (see e.g., Johansson et al. (2011)). It comes from the observation that the most\nelastic models usually have higher accuracy but in turn they are also more complex. Complexity here means a large number of parameters that affect the final prediction. That\nnumber is big enough to make the model incomprehensible to an ordinary human being. © 2018 Przemyslaw Biecek. 
License: CC-BY 4.0, see https://creativecommons.org/licenses/by/4.0/.", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 2, "total_chunks": 29, "char_count": 787, "word_count": 117, "chunking_strategy": "semantic" }, { "chunk_id": "f7916c08-f181-4fc8-b62d-c0e814b6a21e", "text": "Interpretability may be introduced naturally in the modeling framework, see an example in Figure 1. In many areas the interpretability of a model is very important, see for\nexample Lundberg and Lee (2017), Murad and Tarr, Puri (2017). The reason is\nthat interpretability allows one to confront the model structure with domain knowledge. And\nthis may bring multiple benefits, such as: Very flexible models may be over-fitted to the training data and\nfocused on some biases that result from the manner in which the data was collected\n(sample bias) or some surrogate variables (variable bias). Validation of the model\nstructure helps to identify these biases. In Figure 1 this feature is marked as C. Identification of subsets of observations in which a model has\nlower performance allows us to correct the model in this subset and leads to further\nimprovements of the model. In Figure 1 this feature was marked as D.", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 3, "total_chunks": 29, "char_count": 917, "word_count": 153, "chunking_strategy": "semantic" }, { "chunk_id": "3f254854-f276-40d4-8c4d-06cde8e18f51", "text": "If the model is used to assist people in activities such as selection of a proper\ntherapy, understanding key factors that drive model predictions is very important. 
See more examples in Ribeiro et al. (2016). Sculley et al. (2015) argue that a lack of interpretability leads to hidden\ndebt in machine learning models. Despite initial high performance, the real model\nperformance may deteriorate quickly. Model explainers help to control this debt. Figure 1: Points A and B are from the typical workflow of data modeling. A) domain knowledge and data are turned into models. B) models are used to generate predictions. The\npresented methodology extends this framework with new processes: C) model understanding increases our knowledge and, in consequence, it may lead to a better\nmodel, D) understanding predictions helps to correct wrong decisions and, in\nconsequence, it leads to better models. It is hard to increase knowledge about a domain on the basis of black\nboxes. They may be useful, but they do not lead to any new knowledge about a given\ndiscipline. Understanding model structure may lead to new interesting discoveries. In this paper we present a consistent general framework for local and global model interpretability. This framework covers the best-known approaches to model explanation, such\nas Partial Dependence Plots (Greenwell, 2017), Accumulated Local Effects Plots (Apley,\n2017), Merging Path Plots (Sitko and Biecek, 2017), Break Down Plots (Biecek, 2018),\nPermutational Variable Importance Plots (Fisher et al., 2018) or Ceteris Paribus Plots. 
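Among the listed techniques, permutation variable importance (Fisher et al., 2018) has a particularly compact core idea: shuffle one variable, remeasure the loss, and report the increase. A minimal Python sketch of that idea, for illustration only; DALEX itself implements this in R, and the names below (permutation_importance, rmse) are not DALEX's API:

```python
import random

def rmse(y_true, y_pred):
    """Root mean square error of predictions."""
    n = len(y_true)
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n) ** 0.5

def permutation_importance(predict, X, y, seed=0):
    """For each column j of X: loss after shuffling column j, minus baseline loss."""
    rng = random.Random(seed)
    base = rmse(y, predict(X))
    scores = []
    for j in range(len(X[0])):
        Xp = [row[:] for row in X]        # copy every row before mutating
        col = [row[j] for row in Xp]
        rng.shuffle(col)                  # break the link between x_j and y
        for row, v in zip(Xp, col):
            row[j] = v
        scores.append(rmse(y, predict(Xp)) - base)
    return scores

# Toy model that uses only the first variable, so only x0 should matter.
model = lambda X: [3.0 * row[0] for row in X]
X = [[float(i), float((i * 7) % 5)] for i in range(40)]
y = model(X)
imp = permutation_importance(model, X, y)
```

Here imp[0] is positive (shuffling x0 degrades the fit) while imp[1] is exactly zero, because the toy model ignores the second variable.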
All these explainers are extended in a way that allows us to compare different models\nagainst each other on the same scale.", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 4, "total_chunks": 29, "char_count": 1727, "word_count": 270, "chunking_strategy": "semantic" }, { "chunk_id": "5177d54d-c647-4ab4-b9ec-4b3bd8d8bd2b", "text": "Model comparison is very important since in model\nbuilding one often gets a collection of competing models. Comparisons of these models and\nexploration of structures learned by elastic models give new insights that may be used to\nconstruct better features for new models (assisted training with surrogate models). A\nlot of effort was also put into the graphical side of the explainers. Solutions such as Visualizations for\nConvolutional Networks (Zeiler and Fergus, 2014) or Conditional visualization for statistical\nmodels (O'Connell et al., 2017) show that well-prepared visualizations boost actionability. Also, on purpose, we have not included approaches that do not fit into our grammar of model\nexploration, such as the Individual Conditional Expectations Plot (Goldstein et al., 2015) and\n(Apley, 2017). Nevertheless, they are still available, for example in the ICEbox package.", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 5, "total_chunks": 29, "char_count": 872, "word_count": 129, "chunking_strategy": "semantic" }, { "chunk_id": "a6514563-575e-4eae-9174-83298947129e", "text": "The presented methodology is available as an open source package DALEX for R. 
The R\nlanguage (R Core Team, 2017) is one of the most popular software systems for statistical\nand machine learning modeling. The current implementation of DALEX supports models\ngenerated with the most popular frameworks for classification or regression, such as caret\n(from Jed Wing et al., 2016), mlr (Bischl et al., 2016), Random Forest (Liaw and Wiener,\n2002), Gradient Boosting Machine (Ridgeway, 2017) and Generalized Linear Models (Dobson, 1990). It can also be easily extended to other frameworks and other techniques for\nmodel exploration.", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 6, "total_chunks": 29, "char_count": 626, "word_count": 97, "chunking_strategy": "semantic" }, { "chunk_id": "3710c27a-4598-40c7-9740-5ad5755d610f", "text": "The DALEX package is available under the GPL-3 license at CRAN (1) and at\nGitHub (2), along with technical documentation (3) and extended documentation (4). Example explainers presented in this paper were recorded with the archivist package\n(Biecek and Kosinski, 2017). Each explainer is an R object, which can be downloaded\ndirectly into the R console with hooks added to every section. To save space, we present in this\npaper only the graphical representation of explainers. The tabular representation is available\nthrough attached hooks. 
Figure 2 presents the general architecture of the DALEX package.", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 7, "total_chunks": 29, "char_count": 575, "word_count": 87, "chunking_strategy": "semantic" }, { "chunk_id": "1f6dbeba-2896-452d-a35e-c0a365f2e058", "text": "The presented methodology is model-agnostic and works for any predictive model that returns a numeric score,\nsuch as classification and regression models. To achieve a truly model-agnostic solution, explainers cannot be based on model parameters or model structure. The only assumption here is that we can call the predict\n1. https://cran.r-project.org/package=DALEX\n2. https://github.com/pbiecek/DALEX\n3. https://pbiecek.github.io/DALEX\n4. https://pbiecek.github.io/DALEX docs", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 8, "total_chunks": 29, "char_count": 478, "word_count": 58, "chunking_strategy": "semantic" }, { "chunk_id": "fc008045-f246-4145-8ae9-63ec49a4efba", "text": "function for any selected data points. This function is wrapped together with the model and the\nvalidation dataset. The wrapper serves as a unified interface for a model. Methods for better understanding of the global structure of a model (a.k.a. model explainers) and for better understanding of the local structure of a model (a.k.a. prediction\nexplainers) are implemented in separate functions. 
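The wrapper idea can be sketched briefly. Below is a hypothetical Python analogue of DALEX's explain() wrapper, with invented names (Explainer, residuals); it only illustrates how bundling a model, its predict function, and validation data yields one uniform interface for structurally different models:

```python
class Explainer:
    """Uniform wrapper: any model plus its predict function and validation data.

    Illustrative Python analogue of DALEX's explain(); not the R API.
    """
    def __init__(self, model, data, y, predict_function, label):
        self.data, self.y, self.label = data, y, label
        # downstream explainers only ever need to call self.predict(new_data)
        self.predict = lambda X: predict_function(model, X)

    def residuals(self):
        """True responses minus predictions on the validation data."""
        return [t - p for t, p in zip(self.y, self.predict(self.data))]

# Two structurally different "models" behind the same interface:
data, y = [[1.0], [2.0], [3.0]], [2.0, 4.0, 6.0]
linear = Explainer({"slope": 2.0}, data, y,
                   lambda m, X: [m["slope"] * row[0] for row in X], "linear")
constant = Explainer(5.0, data, y,
                     lambda m, X: [m for _ in X], "constant")
```

Every downstream explainer works only through predict(), data, y and label, which is what makes the approach model-agnostic and lets several models be overlaid on one plot.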
We call these functions explainers, since\nthey are designed to explain a single aspect of a model.", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 9, "total_chunks": 29, "char_count": 479, "word_count": 76, "chunking_strategy": "semantic" }, { "chunk_id": "3bbe9f8a-85d6-485b-aca1-dc889ba29e5b", "text": "As a result, they return numerical summaries in a tabular format. Results from each explainer may be summarized with the\ngeneric plot function. The plot function will work with any number of models and will\noverlay all models in a single chart for cross examination.", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 10, "total_chunks": 29, "char_count": 262, "word_count": 44, "chunking_strategy": "semantic" }, { "chunk_id": "b62c5a0b-9c93-4499-805a-daf7fcd89100", "text": "[Figure 2 diagram labels: yraw := M(x); yhat := p(yraw); explain(model, data, y, predict_function); prediction_breakdown(..., new_x); variable_importance(...); variable_response(..., variable)] Figure 2: Architecture of the DALEX package is based on a simple yet unified grammar. A) Any predictive model with defined input x and output yraw ∈ R may be\nused. B) Models are first enriched with additional metadata, such as a function\nthat calculates predictions, and validation data. The explain() function creates\na wrapper over a model that can be used in further processing. C) Various\nexplainers may be applied to a model. Each explainer calculates numerical\nsummaries that can be plotted with the generic plot() function. 
In this section we present explainers that increase understanding of the global structure of\na model. The primary goal of these explainers is to answer the following questions: How\ngood is the model?", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 11, "total_chunks": 29, "char_count": 944, "word_count": 143, "chunking_strategy": "semantic" }, { "chunk_id": "7fca8de2-b6b4-4788-8e6b-749d1981b3ed", "text": "Which variables are the most important? How are the variables linked\nwith the model response?\n3.1 Explainers for model performance\nModel performance is often summarized with a single number such as precision, recall, F1,\naverage loss or accuracy. Such an approach is handy for model selection. It is easy to construct\na ranking of models and choose the best one on the basis of a single statistic. However, more\ndescriptive statistics are better when it comes to understanding a model. The descriptive statistic most often used for classification models is the ROC (Receiver\nOperating Characteristic) curve. It has many implementations.", "paper_id": "1806.08915", "title": "DALEX: explainers for complex predictive models", "authors": [ "Przemyslaw Biecek" ], "published_date": "2018-06-23", "primary_category": "stat.ML", "arxiv_url": "http://arxiv.org/abs/1806.08915v2", "chunk_index": 12, "total_chunks": 29, "char_count": 632, "word_count": 98, "chunking_strategy": "semantic" }, { "chunk_id": "a4d0b154-9b28-4de0-bf10-03ab47479bc4", "text": "In R, the most widely\nused implementation is the ROCR package (Sing et al., 2005). ROC plots also have\nextensions for regression models. Find an overview of Regression ROC curves in Hernández-Orallo (2013). 
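Figure 3 below compares models through the distributions of their residuals; the same comparison can also be made numerically. A small Python sketch with toy data and illustrative names, computing summaries of |residuals| including the root mean square that the figure marks with a red dot:

```python
def abs_residual_summary(y, y_hat):
    """Median, max and root mean square of |residuals| (the 'red dot' statistic)."""
    r = sorted(abs(t - p) for t, p in zip(y, y_hat))
    n = len(r)
    rms = (sum(v * v for v in r) / n) ** 0.5
    return {"median": r[n // 2], "max": r[-1], "rms": rms}

# Toy data: model A makes many small errors, model B makes a few large ones.
y      = [10.0, 12.0, 9.0, 11.0]
pred_a = [10.0, 13.0, 9.0, 12.0]
pred_b = [14.0, 12.0, 5.0, 11.0]
summary_a = abs_residual_summary(y, pred_a)
summary_b = abs_residual_summary(y, pred_b)
```

Looking at the whole summary rather than a single score is exactly the point made above: two models with similar average loss can have very different residual distributions.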
Distribution of | residuals | Boxplots of | residuals | Red dot stands for root mean square of residuals. [Figure 3 plot area: point clouds for the lm and randomForest models; percentage axis from 0 % to 100 %; horizontal axis | residuals | from 0 to 1000.] Figure 3: Both plots compare distributions of residuals for two models. 
The left plot shows 1 − ECDF (one minus the empirical cumulative distribution function) for absolute values of residuals, while the right plot shows boxplots for absolute values of residuals. Red dots in the right plot stand for the root mean square loss.

The DALEX package offers a selection of tools for exploration of model residuals. Figure 3 presents example explainers for model performance5, created with the model_performance() function. Here the distribution of absolute residuals is compared between two models. The mean square loss is equal for both models, yet we can see that the random forest model has more small residuals and only a small fraction of large residuals: 10% of the residuals in the random forest model are larger than the largest residual in the linear model.

More diagnostic plots are available through the auditor package (Gosiewska and Biecek, 2018), which is closely integrated with DALEX.

3.2 Explainers for conditional effect of a single variable

The DALEX package offers a selection of tools for better understanding of a model's conditional response to a single variable. The current implementation covers:

• Partial Dependence Plots (Greenwell, 2017), as implemented in the pdp package,
• Accumulated Local Effects Plots (Apley, 2017), as implemented in the ALEPlot package,
• Merging Path Plots (Sitko and Biecek, 2017), as implemented in the factorMerger package.

The first two methods were designed for continuous variables, while the third one is designed for categorical variables.
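For continuous variables, the idea behind the first of these methods is easy to state: fix the variable of interest at a grid value for every observation and average the model's predictions. The snippet below is a minimal, model-agnostic sketch in Python, purely for exposition — it is not the pdp package's implementation, and all names in it are illustrative.

```python
import numpy as np

def partial_dependence(predict, X, feature, grid):
    """Model-agnostic partial dependence: average prediction over the
    data set with one feature forced to each value on the grid."""
    X = np.asarray(X, dtype=float)
    profile = []
    for value in grid:
        X_mod = X.copy()
        X_mod[:, feature] = value          # force the feature to `value`
        profile.append(predict(X_mod).mean())
    return np.array(profile)

# toy model: the response depends nonlinearly on feature 0
predict = lambda X: (X[:, 0] - 2.0) ** 2 + X[:, 1]
X = np.array([[0.0, 1.0], [1.0, 2.0], [2.0, 3.0], [3.0, 4.0]])
pd_profile = partial_dependence(predict, X, feature=0, grid=[0.0, 2.0, 4.0])
# at grid value 2.0 the quadratic term vanishes, leaving the mean of X[:, 1]
```

A plot of `pd_profile` against the grid would reveal the quadratic shape regardless of the model's internal structure, which is exactly the use case above.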
Examples for these explainers6, created with the variable_response() function, are presented in Figure 4. On the basis of these explainers it is easy to see that the random forest model learns the nonlinear relation between price and construction year. The linear model is unable to handle such a relation without prior feature engineering. For the categorical variable we can see that both models divide the district variable into three groups of values: downtown (largest responses), three districts close to downtown (middle responses), and all remaining districts.

5. Access this explainer with archivist::aread('pbiecek/DALEX arepo/b4eb1')
6. Access these explainers with archivist::aread('pbiecek/DALEX arepo/3b150') and archivist::aread('pbiecek/DALEX arepo/6cbf4')

[Figure 4 plot area omitted: top panels, Partial Dependence of predicted price against construction.year for the randomForest and lm models; bottom panels, Factor Merger group predictions per district, with Srodmiescie the highest for both models.]
Figure 4: Example explainers for variable responses. The two top plots show responses for the quantitative variable construction year (Partial Dependence Plots), while the bottom plots show responses for the factor variable district (Factor Merger Plots). The left plots show explainers for a single model, while the right panels show plots in which two models are compared. Explainers for a quantitative variable show the expected response given a selected value of the variable. Explainers for a factor variable present the similarity of responses for each possible value.

3.3 Explainers for variable importance

The DALEX package offers a model-agnostic procedure to calculate variable importance. It is based on the permutation approach introduced initially for random forests (Breiman, 2001) and then extended to other models by Fisher et al. (2018).
An example for these explainers7, created with the variable_importance() function, is presented in Figure 5. The initial performance of both models is similar, and for that reason these intervals are left-aligned. For both models the district and surface variables are the most important. The largest difference between these models is the effect of construction year: for the linear model the length of the corresponding interval is almost 0, while for the random forest model it is far from 0. This observation is consistent with the variable effects presented in Figure 4.

The usual practice in variable importance charts is to present only the length of the interval, which corresponds to the loss in the performance metric after the selected variable is shuffled; bars on such plots are anchored at 0. In the DALEX package we propose to present not only the drop in model performance but also the initial model performance.
In that way one can compare variables between models with different initial performance.

[Figure 5 plot area omitted: drop-out loss intervals per variable (e.g. construction.year, district, no.rooms, _full_model_) for each model.]

Figure 5: Example explainers for variable importance. The left panel shows explainers for a single model, a random forest, while the right one compares two models. The importance of every variable is presented as an interval. One end of this interval is the model performance on validation data, while the other end is the performance on a data set with the single variable shuffled.
The longer the interval, the more important the corresponding variable is.

7. Access this explainer with archivist::aread('pbiecek/DALEX arepo/9378c')

4 Prediction understanding

In this section we present explainers that increase understanding of a prediction for a single observation. The primary goal of these explainers is to answer the following questions: How stable is the prediction? Which variables influence the prediction? How can the effects of particular variables be attributed to a single model prediction?

4.1 Explainers for robustness of predictions

Ceteris Paribus Plots show how the model response changes as a function of a single variable. These plots resemble the Partial Dependence Plots presented in Section 3.2; the only difference between them is that Ceteris Paribus Plots are focused on a single observation. CP Plots have many applications: the derivative of the profile is related to local variable importance (as measured in LIME), and the profile may be used to verify constraints related to a variable (such as a monotonic relation) or to assess variable contribution.

An example for this explainer8, created with the ceterisParibus package9, is presented in Figure 6. We can read from it that the variable surface has the largest effect on the model predictions and that it lowers the model prediction for large apartments.
We can also read that small changes in the variable construction year will not affect model predictions.

8. Access this explainer with archivist::aread('pbiecek/DALEX arepo/c8989')
9. https://github.com/pbiecek/ceterisParibus

[Figure 6 plot area omitted: left, Ceteris Paribus profiles of the predicted price against construction.year, surface, floor and no.rooms for the randomForest model; right, all four profiles overlaid on a relative-percentile OX axis.]
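The computation behind a single Ceteris Paribus profile is simple: hold one observation fixed and sweep one of its variables over a grid. The snippet below is an illustrative Python sketch with hypothetical names, not the ceterisParibus package's code.

```python
import numpy as np

def ceteris_paribus(predict, observation, feature, grid):
    """Model response along one variable for a single observation,
    with all other variables held fixed (ceteris paribus)."""
    obs = np.asarray(observation, dtype=float)
    profile = []
    for value in grid:
        x = obs.copy()
        x[feature] = value                       # vary only this variable
        profile.append(float(predict(x[None, :])[0]))
    return profile

# toy model: strong effect of feature 0, additive effect of feature 1
predict = lambda X: X[:, 0] * 10.0 + X[:, 1]
cp = ceteris_paribus(predict, observation=[1.0, 5.0], feature=0,
                     grid=[0.0, 1.0, 2.0])
# → [5.0, 15.0, 25.0]
```

The contrast with the partial-dependence sketch is that no averaging over the data set takes place; the profile belongs to one observation only.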
Figure 6: Ceteris Paribus Plots - explainers for a single observation. The left plot shows how the model response fluctuates for a single observation (predicted y is on the OY axis) when a single variable is changed while all other variables remain constant (the ceteris paribus principle). The right plot shows the effects of all variables in the same coordinate system; on the OX axis values are normalized through a quantile transformation.

4.2 Explainers for variable attribution

The best-known approaches to explanation of a single prediction are the LIME method (Ribeiro et al., 2016), working best for local explanations, and Shapley values (Štrumbelj and Kononenko, 2010, 2014; Lundberg and Lee, 2017), working best for variable attribution. Break Down Plots are fast approximations of Shapley values. The methodology behind this method and a comparison of these three methods are presented in Staniak and Biecek (2018).

An example for BDP explainers10, created with the prediction_breakdown() function, is presented in Figure 7. As one can read from the graph, in both models the largest increase in model prediction is due to the variable district = Srodmiescie (downtown).
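A single-path variable attribution of this kind can be sketched in a few lines: condition on the observation's variables one by one and record how the mean prediction moves away from the population average. This Python snippet is illustrative only (it assumes a fixed variable ordering, whereas the actual method also determines an ordering), and all names are hypothetical.

```python
import numpy as np

def break_down(predict, X, observation, order):
    """Greedy single-path attribution: fix variables one at a time to the
    observation's values and record the change in the mean prediction."""
    X_cur = np.asarray(X, dtype=float).copy()
    baseline = float(predict(X_cur).mean())      # population average
    contributions, prev = {}, baseline
    for j in order:
        X_cur[:, j] = observation[j]             # condition on variable j
        mean_now = float(predict(X_cur).mean())
        contributions[j] = mean_now - prev       # attributed to variable j
        prev = mean_now
    return baseline, contributions

predict = lambda X: 2.0 * X[:, 0] + X[:, 1]      # toy additive model
X = np.array([[0.0, 0.0], [2.0, 2.0]])           # mean prediction = 3.0
base, contrib = break_down(predict, X, observation=[2.0, 0.0], order=[0, 1])
# fixing x0=2 moves the mean from 3 to 5 (+2); fixing x1=0 moves 5 to 4 (-1)
```

By construction, the baseline plus all contributions equals the model's prediction for the observation, which is what the waterfall layout of a Break Down Plot visualizes.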
Large surface lowers the prediction in the random forest model, while the variable number of rooms has a larger impact in the random forest model than in the linear model.

[Figure 7 plot area omitted: waterfall-style Break Down bars (intercept, district, floor, construction.year, no.rooms, surface, final prognosis) for the randomForest and lm models.]

Figure 7: Break Down Plots - explainers for a single observation that attribute parts of the model prediction to individual variables. The left plot shows how the random forest model's response is decomposed into contributions of five variables. The right plot shows decompositions for two models.
The gray rectangles show how the single model prediction differs from the population average (the reference), the blue rectangles show variables that increase the model prediction, while the yellow rectangles are related to variables that lower the model prediction.

Thinking about data modeling is currently dominated by feature engineering and model training. Kaggle competitions turn the data modeling process into a process that returns a single model with the highest accuracy. Tasks of that type may be easily automated. Such thinking about modeling is popular due to the lack of tools that can be used for model validation and richer domain verification.

In this article we have introduced a consistent methodology and a set of tools for model-agnostic explanations. The presented global explainers for model understanding and local explainers for prediction understanding are based on the uniform grammar introduced in Figure 2. Every explainer is constructed in a way that allows for numerical summary, visual summary and comparison of multiple models. The methodology is developed in a way that is easy to extend, with broad technical documentation and rich training materials11. The code is properly maintained and tested with tools for continuous integration.

10. Access this explainer with archivist::aread('pbiecek/DALEX arepo/72b47')

Acknowledgments

The work was partially supported by the European Union Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No 691152 (project RENOIR) and by NCN Opus grant 2016/21/B/ST6/02176.