Mar 23, 2021
Holzmüller, David, 2021, "Replication Data for: On the Universality of the Double Descent Peak in Ridgeless Regression", https://doi.org/10.18419/darus-1771, DaRUS, V1
This dataset contains code used to generate the figures in the paper "On the Universality of the Double Descent Peak in Ridgeless Regression" by David Holzmüller, International Conference on Learning Representations 2021. The code is also provided on GitHub. Here, we additionally pro...
Jun 20, 2022
Holzmüller, David; Steinwart, Ingo, 2022, "Code for: Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent", https://doi.org/10.18419/darus-2978, DaRUS, V1
This dataset contains code used to generate figures and tables in our paper "Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent". The code is also available on GitHub. Information on the code and installation instructions can be found in the file README.md.
Nov 5, 2024
Holzmüller, David; Grinsztajn, Léo; Steinwart, Ingo, 2024, "Code and Data for: Better by default: Strong pre-tuned MLPs and boosted trees on tabular data [NeurIPS, arXiv v2]", https://doi.org/10.18419/darus-4555, DaRUS, V1
This dataset contains code and data for our paper "Better by default: Strong pre-tuned MLPs and boosted trees on tabular data", specifically, the NeurIPS version which is also the second version on arXiv. The main code is provided in pytabkit_code.zip and contains further documen... |
Aug 8, 2024
Holzmüller, David; Grinsztajn, Léo; Steinwart, Ingo, 2024, "Code and Data for: Better by default: Strong pre-tuned MLPs and boosted trees on tabular data", https://doi.org/10.18419/darus-4255, DaRUS, V1
This dataset contains code and data for our paper "Better by default: Strong pre-tuned MLPs and boosted trees on tabular data". The main code is provided in pytabkit_code.zip and contains further documentation in README.md and the docs folder. The main code is also provided on Gi... |