On the rate of convergence of fully connected deep neural network regression estimates
M Kohler, S Langer
The Annals of Statistics 49 (4), 2231-2249, 2021. Cited by 116.

Approximating smooth functions by deep neural networks with sigmoid activation function
S Langer
Journal of Multivariate Analysis 182, 104696, 2021. Cited by 73.

Estimation of a function of low local dimensionality by deep neural networks
M Kohler, A Krzyżak, S Langer
IEEE Transactions on Information Theory 68 (6), 4032-4042, 2022. Cited by 42.

Statistical theory for image classification using deep convolutional neural networks with cross-entropy loss
M Kohler, S Langer
arXiv preprint arXiv:2011.13602, 2020. Cited by 23.

Analysis of the rate of convergence of fully connected deep neural network regression estimates with smooth activation function
S Langer
Journal of Multivariate Analysis 182, 104695, 2021. Cited by 21.

Discussion of: "Nonparametric regression using deep neural networks with ReLU activation function"
M Kohler, S Langer
2020. Cited by 11.

The smoking gun: Statistical theory improves neural network estimates
A Braun, M Kohler, S Langer, H Walk
2021. Cited by 6.

Convergence rates for shallow neural networks learned by gradient descent
A Braun, M Kohler, S Langer, H Walk
Bernoulli 30 (1), 475-502, 2024. Cited by 5.

Estimation of a regression function on a manifold by fully connected deep neural networks
M Kohler, S Langer, U Reif
Journal of Statistical Planning and Inference 222, 160-181, 2023. Cited by 3.

A statistical analysis of an image classification problem
S Langer, J Schmidt-Hieber
arXiv preprint arXiv:2206.02151, 2022. Cited by 2.

Supplement to "On the rate of convergence of fully connected deep neural network regression estimates"
M Kohler, S Langer
2021. Cited by 2.

Dropout Regularization Versus ℓ2-Penalization in the Linear Model
G Clara, S Langer, J Schmidt-Hieber
arXiv preprint arXiv:2306.10529, 2023.
Ein Beitrag zur statistischen Theorie des Deep Learnings [A contribution to the statistical theory of deep learning]
S Langer
2020.