courses:xai:p1_2024 (last modified 2024/11/22 12:35 by admin)
The main goal of the project is to take the code: https://
and add support for dynamic windows using the ruptures change-point detection algorithm.

===== Study Rashomon Effect =====
{{:
{{:

The main goal of the project is to reproduce the methods from: https://
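The core phenomenon can be demonstrated with plain scikit-learn: train several near-equally accurate models and compare their feature importances. The dataset and models below are illustrative stand-ins, not the ones from the referenced paper:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Illustrative data; the actual study would use the datasets from the paper.
X, y = make_classification(n_samples=500, n_features=8, n_informative=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A small "Rashomon set": models that differ only in their random seed.
models = [RandomForestClassifier(n_estimators=50, random_state=s).fit(X_tr, y_tr)
          for s in range(5)]
accs = [m.score(X_te, y_te) for m in models]

# Near-identical accuracy, yet the importance rankings can differ per model.
importances = np.array([m.feature_importances_ for m in models])
spread = importances.std(axis=0)  # per-feature disagreement across the set
```

A large `spread` for a feature means the equally accurate models disagree about its role, which is exactly the Rashomon effect the project is meant to study.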

===== LUX Speedup =====
{{:
{{:

Speed up the LUX explainer and confirm that the results are unchanged after the speedup (different ways of speeding it up can be considered).
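Whichever speedup route is chosen, the "results are not changed" requirement can be checked with a small regression harness. `slow_explain` and `fast_explain` below are hypothetical stand-ins for LUX before and after optimization, with memoization shown as one possible speedup:

```python
import time
from functools import lru_cache

# Hypothetical stand-in for the current, expensive LUX computation.
def slow_explain(x: int) -> int:
    time.sleep(0.001)              # simulate expensive work
    return x * x                   # stands in for the produced explanation

# One candidate speedup: memoize repeated queries; other routes (vectorisation,
# pruning) would be validated with exactly the same harness.
@lru_cache(maxsize=None)
def fast_explain(x: int) -> int:
    return slow_explain(x)

inputs = list(range(20)) * 5       # repeated queries benefit from the cache

t0 = time.perf_counter()
slow = [slow_explain(x) for x in inputs]
t_slow = time.perf_counter() - t0

t0 = time.perf_counter()
fast = [fast_explain(x) for x in inputs]
t_fast = time.perf_counter() - t0

assert slow == fast                # explanations must be identical after the speedup
```

The equality assertion is the deliverable here: any optimization that changes the explanations fails the harness regardless of how much faster it is.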

===== PPCF Counterfactual Explainer Evaluation =====
{{:
{{:

The goal is to take: https://
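Independently of how PPCF generates counterfactuals, they are commonly evaluated on validity (does the prediction flip), proximity (distance to the original instance), and sparsity (number of features changed). A generic sketch with these standard metrics; the toy classifier and instances are made up, not PPCF-specific:

```python
import numpy as np

def evaluate_counterfactual(x, x_cf, predict_fn):
    """Common counterfactual quality metrics (illustrative definitions)."""
    validity = predict_fn(x_cf) != predict_fn(x)   # did the prediction flip?
    proximity = float(np.abs(x - x_cf).sum())      # L1 distance to the original
    sparsity = int((x != x_cf).sum())              # number of features changed
    return {"validity": bool(validity), "proximity": proximity, "sparsity": sparsity}

# Toy classifier: class 1 iff the first feature exceeds 0.5 (hypothetical).
predict = lambda v: int(v[0] > 0.5)

x = np.array([0.2, 1.0, 3.0])                      # original instance
x_cf = np.array([0.8, 1.0, 3.0])                   # only the decisive feature changed
metrics = evaluate_counterfactual(x, x_cf, predict)
```

Aggregating these metrics over a test set gives a simple baseline against which PPCF's counterfactuals can be compared.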

===== SHAP on Optuna =====
{{:
{{:

The goal is to run a hyperparameter optimization algorithm such as Optuna, collect its results, and use SHAP to explain the impact of the different hyperparameters on the quality of the optimization process.

===== Explainable Active Learning =====
{{:
{{:

The goal is to use SHAP gradients (implemented in LUX) to select instances that should be labelled by an expert (the Active Learning paradigm) and compare the results with random sampling.
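The experimental loop can be sketched without LUX: below, plain uncertainty sampling stands in for the SHAP-gradient selection criterion (which this sketch does not reproduce), compared against random sampling under the same labelling budget. Dataset, model, and budget are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_pool, y_pool = X[:500], y[:500]   # pool (labels revealed only when queried)
X_test, y_test = X[500:], y[500:]

def run(select, budget=40):
    # Seed with five labelled examples per class so the first fit is valid.
    labeled = [int(i) for i in
               list(np.where(y_pool == 0)[0][:5]) + list(np.where(y_pool == 1)[0][:5])]
    for _ in range(budget):
        model = LogisticRegression(max_iter=1000).fit(X_pool[labeled], y_pool[labeled])
        unlabeled = [i for i in range(len(X_pool)) if i not in labeled]
        labeled.append(select(model, unlabeled))   # the "expert" labels one instance
    model = LogisticRegression(max_iter=1000).fit(X_pool[labeled], y_pool[labeled])
    return model.score(X_test, y_test)

# Uncertainty sampling: pick the pool instance closest to the decision boundary.
def uncertain(model, idx):
    margins = np.abs(model.predict_proba(X_pool[idx])[:, 1] - 0.5)
    return idx[int(np.argmin(margins))]

def random_pick(model, idx):
    return int(rng.choice(idx))

acc_al, acc_rand = run(uncertain), run(random_pick)
```

Swapping `uncertain` for a LUX SHAP-gradient criterion and keeping `random_pick` as the baseline gives exactly the comparison the project asks for.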