Key Concepts
Decision trees explain support vector regression models more accurately than LIME and multi-linear regression, as measured by root mean square error (RMSE).
Summary
The content compares the use of decision trees, LIME, and multi-linear regression as techniques to explain support vector regression (SVR) models. The key findings are:
Decision trees outperform LIME in explaining SVR models, achieving lower RMSE values than LIME in 87% of the runs across 5 datasets; this difference is statistically significant.
Multi-linear regression also outperforms LIME in explaining SVR models, achieving lower RMSE values than LIME in 73% of the runs across 5 datasets; however, this difference is not statistically significant.
When used as a local explanation technique, decision trees again perform better than LIME, and this difference is statistically significant.
The superior performance of decision trees in explaining SVR models is attributed to their ability to capture non-linear relationships, which LIME and multi-linear regression struggle with.
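The comparison described above can be sketched as a surrogate-fidelity experiment: fit an interpretable model (a decision tree or a linear model) to the SVR's own predictions, then score how closely the surrogate tracks the SVR via RMSE. This is a minimal illustration of that idea, not the paper's exact protocol; the synthetic dataset, model hyperparameters, and train/test split below are placeholder assumptions.

```python
# Sketch (assumed setup, not the paper's exact protocol): measure how faithfully
# an interpretable surrogate reproduces an SVR model, using RMSE between the
# surrogate's predictions and the SVR's predictions on held-out data.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

# Placeholder synthetic data standing in for the paper's 5 datasets.
X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Black-box model to be explained.
svr = SVR(kernel="rbf").fit(X_train, y_train)
svr_train_pred = svr.predict(X_train)
svr_test_pred = svr.predict(X_test)

def surrogate_rmse(surrogate):
    """Fit a surrogate on the SVR's outputs; return its fidelity as RMSE."""
    surrogate.fit(X_train, svr_train_pred)
    approx = surrogate.predict(X_test)
    return float(np.sqrt(np.mean((approx - svr_test_pred) ** 2)))

rmse_tree = surrogate_rmse(DecisionTreeRegressor(max_depth=4, random_state=0))
rmse_linear = surrogate_rmse(LinearRegression())
print(f"decision-tree surrogate RMSE: {rmse_tree:.3f}")
print(f"multi-linear surrogate RMSE:  {rmse_linear:.3f}")
```

Lower RMSE means the surrogate mimics the SVR more faithfully; repeating this over many runs and datasets, as the paper does, allows a statistical comparison of the explanation techniques.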
Statistics
The RMSE values for the different techniques on the 5 datasets are provided in the table.