@tteofili
Last active January 12, 2023 15:27
Instructions for evaluating ER explanations with CERTEM

  1. Create a new Google Colab notebook
  2. Open the CERTEM-demo Google Colab notebook at https://colab.research.google.com/drive/1e9L4Edbm3_X4V_xH7EFkxsmT3wE4Qwca?usp=sharing
  3. Copy each cell from CERTEM-demo.ipynb into your newly created notebook (in the very same order)
  4. Run all cells (Runtime -> Run all)
  5. Once all cells have finished running, you should see the CERTEM interface
  6. Select one of the datasets in the dropdown menu, either BA or AB (avoid IA, which has visualization issues)
  7. Select one of the existing models by checking one of the boxes: DeepER, DeepMatcher, or Ditto
  8. Choose the index of one of the rows in the table to explain (each row contains a pair of records together with its label and the selected model's prediction)
  9. Click Explain Item $Index (e.g., Explain Item 0) to obtain an explanation for the chosen pair of records
  10. Click Compare $Model (e.g., Compare DeepMatcher) to compare different explanations generated for the same pair of records by different explanation algorithms
  11. You will see a table with different saliency explanations for the same pair of records
    • A darker blue background in a cell is associated with larger values and means higher feature "importance"
    • Some explanations might contain 0.000000 or nan values; this can happen with values smaller than 1e-7
  12. Select the checkbox associated with the row whose saliency seems correct or that, at least, helps you understand the prediction (e.g., if the saliency at row 0 is the best one, select the checkbox named Sys 0).
  13. If none of the proposed saliencies satisfies you, or if you feel something is missing, enter some text in the User Defined Saliency field and select the associated checkbox.
  14. Once you are satisfied with your selection, click the Record button.
  15. Now click on the Counterfactuals tab; you will see one or more counterfactual explanations.
  16. Select the checkbox associated with the counterfactual explanation that seems correct or that, at least, helps you understand the behavior of the system.
  17. Once you are satisfied with your selection, click the Record button.
  18. Repeat steps 6-17 ten times.
  19. Open the Files tab on the left of the Google Colab page
  20. Locate the /root/certem/us.csv file and download it.
  21. Upload the us.csv file to the following online form by 18:00 CET on January 21st, 2023: https://forms.office.com/r/PYP0ncXYqc
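A side note on the 0.000000 values mentioned in step 11: if saliency scores are rendered with six decimal places, any score smaller than about 1e-7 displays as zero even though the underlying value is nonzero. A minimal illustration in plain Python (not CERTEM code):

```python
# Six-decimal display formatting hides very small scores:
tiny = 1e-8
print(f"{tiny:.6f}")  # prints "0.000000"
print(tiny == 0.0)    # prints "False" -- the underlying value is nonzero
```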
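Before uploading, you may want to sanity-check the downloaded file. A small sketch using only the Python standard library (the column layout of us.csv is not documented here, so the helper simply previews whatever the file contains):

```python
import csv
import os

def preview_csv(path, n=5):
    """Print the first n rows of a CSV file as a quick sanity check
    before uploading it; returns all parsed rows."""
    if not os.path.exists(path):
        print(f"{path} not found -- run the notebook and record some answers first")
        return []
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    for row in rows[:n]:
        print(row)
    print(f"{len(rows)} rows total")
    return rows

# In Colab, the recorded answers are written to /root/certem/us.csv:
# preview_csv("/root/certem/us.csv")
```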