91 to 100 of 1,117 Results
May 16, 2024 - InvisibleEye
ZIP Archive - 1.2 GB - MD5: fc6a883632225ab726cbcc79bf5a6579

May 16, 2024 - InvisibleEye
ZIP Archive - 1.3 GB - MD5: 609b281bd8e638a24e8b49384ec70c81

May 16, 2024 - InvisibleEye
ZIP Archive - 1.7 GB - MD5: 74ec086a2365f56e79c28c3d25db6bf3

May 16, 2024 - InvisibleEye
ZIP Archive - 1.5 GB - MD5: 06ccd2ee16d2dbcd20c710e205a42042
Jan 26, 2024 - SalChartQA: Question-driven Saliency on Information Visualisations (Dataset and Reproduction Data)
Python Source Code - 519 B - MD5: 52d002345282b7ccec556fd12877bfb8
dataloader for SalChartQA

Jan 26, 2024 - SalChartQA: Question-driven Saliency on Information Visualisations (Dataset and Reproduction Data)
Python Source Code - 170 B - MD5: 6fcef81e6722e6dec003ea1a6152d008
python environment variables $TORCH_HOME and $TRANSFORMERS_CACHE
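The 170 B file above configures cache locations via environment variables. A hedged sketch of what such a setup module might look like, assuming it simply sets $TORCH_HOME and $TRANSFORMERS_CACHE; the paths are placeholders, not values from the dataset.

```python
# Sketch (assumption, not the dataset's file): point the PyTorch Hub cache and
# the Hugging Face Transformers cache at local directories. These variables
# must be set before torch / transformers are imported to take effect.
import os

os.environ["TORCH_HOME"] = "/path/to/torch_cache"                 # placeholder path
os.environ["TRANSFORMERS_CACHE"] = "/path/to/transformers_cache"  # placeholder path
```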
Jan 26, 2024 - SalChartQA: Question-driven Saliency on Information Visualisations (Dataset and Reproduction Data)
Python Source Code - 4.1 KB - MD5: 57190d4825e65b73b0a8255e242caa20
evaluation script to load VisSalFormer weights and make predictions
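For orientation only: the generic PyTorch pattern that an evaluation script like the one above typically follows when restoring weights and predicting. The import path, checkpoint file name, input shape, and call signature below are assumptions; the actual VisSalFormer also conditions its prediction on a question, so consult the script itself for the real interface.

```python
# Illustrative sketch, not the dataset's evaluation.py: restore saved weights
# into a PyTorch model and run one forward pass. The import path, checkpoint
# name, input shape, and call signature are assumptions.
import torch
from model import VisSalFormer  # hypothetical import of the model-definition file

model = VisSalFormer()
model.load_state_dict(torch.load("vissalformer_weights.pth", map_location="cpu"))
model.eval()

with torch.no_grad():
    image = torch.zeros(1, 3, 224, 224)  # placeholder chart image tensor
    saliency_map = model(image)          # the real model also takes the question text
```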
Jan 26, 2024 - SalChartQA: Question-driven Saliency on Information Visualisations (Dataset and Reproduction Data)
Shell Script - 87 B - MD5: 0e9c6b5e196912cee38ae0adc5d4f3eb
bash script to run evaluation.py

Jan 26, 2024 - SalChartQA: Question-driven Saliency on Information Visualisations (Dataset and Reproduction Data)
Python Source Code - 4.5 KB - MD5: 5c733608c453403dae142c965bb9f14f
definition of the VisSalFormer model

Jan 26, 2024 - SalChartQA: Question-driven Saliency on Information Visualisations (Dataset and Reproduction Data)
Markdown Text - 2.2 KB - MD5: b97033eca57cd5221fa4e8262c37bd25
Readme document