Code for: Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent (doi:10.18419/darus-2978)


Document Description

Citation

Title:

Code for: Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent

Identification Number:

doi:10.18419/darus-2978

Distributor:

DaRUS

Date of Distribution:

2022-06-20

Version:

1

Bibliographic Citation:

Holzmüller, David; Steinwart, Ingo, 2022, "Code for: Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent", https://doi.org/10.18419/darus-2978, DaRUS, V1

Study Description

Citation

Title:

Code for: Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent

Identification Number:

doi:10.18419/darus-2978

Authoring Entity:

Holzmüller, David (Universität Stuttgart)

Steinwart, Ingo (Universität Stuttgart)

Grant Number:

EXC 2075 - 390740016

Distributor:

DaRUS

Access Authority:

Holzmüller, David

Steinwart, Ingo

Depositor:

Holzmüller, David

Date of Deposit:

2022-06-02

Holdings Information:

https://doi.org/10.18419/darus-2978

Study Scope

Keywords:

Computer and Information Science, Mathematical Sciences, Artificial Neural Network, Regression

Abstract:

This dataset contains the code used to generate the figures and tables in our paper "Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent". The code is also available on GitHub at https://github.com/dholzmueller/nn_inconsistency. Information on the code and installation instructions can be found in the file README.md.
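To illustrate the setting the paper studies, the following is a minimal, self-contained sketch of a two-layer ReLU network trained by full-batch gradient descent on a toy regression problem. This is illustrative only and is not code from this dataset; all names and hyperparameters here are chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data
X = rng.uniform(-1.0, 1.0, size=(64, 1))
y = np.sin(3.0 * X)

# Two-layer ReLU network: f(x) = W2 @ relu(W1 @ x + b1) + b2
m = 32  # hidden width
W1 = rng.normal(0.0, 1.0, size=(m, 1))
b1 = np.zeros((m, 1))
W2 = rng.normal(0.0, 1.0 / np.sqrt(m), size=(1, m))
b2 = np.zeros((1, 1))

lr = 0.05
losses = []
for step in range(500):
    # Forward pass
    Z = W1 @ X.T + b1            # pre-activations, shape (m, n)
    A = np.maximum(Z, 0.0)       # ReLU
    pred = W2 @ A + b2           # network output, shape (1, n)
    err = pred - y.T
    loss = 0.5 * np.mean(err ** 2)
    losses.append(loss)

    # Backward pass for the mean-squared-error loss
    n = X.shape[0]
    dpred = err / n
    dW2 = dpred @ A.T
    db2 = dpred.sum(axis=1, keepdims=True)
    dA = W2.T @ dpred
    dZ = dA * (Z > 0.0)          # ReLU derivative (0 on the inactive side)
    dW1 = dZ @ X
    db1 = dZ.sum(axis=1, keepdims=True)

    # Full-batch gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The training loss decreases, but the paper's point is about the limit of this procedure: consistency concerns whether the learned function approaches the optimal predictor as the sample size grows, not whether the empirical loss goes down.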

Notes:

Basic instructions for installing and running the software can be found in the README.md file.
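A typical way to obtain the code and install its dependencies might look as follows; the README.md remains authoritative, and these exact commands are an assumption based on the presence of requirements.txt in the file list.

```shell
# Clone the GitHub mirror referenced in the abstract
git clone https://github.com/dholzmueller/nn_inconsistency.git
cd nn_inconsistency

# Install the pinned dependencies into a fresh virtual environment
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```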

Methodology and Processing

Sources Statement

Data Access

Other Study Description Materials

Related Publications

Citation

Title:

Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent

Identification Number:

arXiv:2002.04861

Bibliographic Citation:

David Holzmüller and Ingo Steinwart. Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent, 2020.

Other Study-Related Materials

Files in this dataset (label and MIME type):

custom_paths.py (text/x-python)
eval_nn_setups.py (text/x-python)
eval_star_dataset.py (text/x-python)
LICENSE (text/plain; charset=US-ASCII)
mc_event_estimation.py (text/x-python)
mc_plotting.py (text/x-python)
mc_sgd_keras.py (text/x-python)
mc_training.py (text/x-python)
plot_examples.py (text/x-python)
README.md (text/markdown)
requirements.txt (text/plain)
run_nn_setups.py (text/x-python)
show_training.py (text/x-python)
tex_head.txt (text/plain)
tex_tail.txt (text/plain)
TrainingSetup.py (text/x-python)
train_star_dataset.py (text/x-python)
utils.py (text/x-python)