Session
DFG-PP 2298: Theoretical Foundations of Deep Learning

Presentations

2:00pm - 2:20pm
Foundations of Supervised Deep Learning for Inverse Problems
1University of Siegen; 2Helmholtz Imaging, Deutsches Elektronen-Synchrotron DESY; 3Universität Hamburg

2:20pm - 2:40pm
Combinatorial and implicit views on parameter optimization in neural networks
1University of California, Los Angeles, USA; 2Max Planck Institute for Mathematics in the Sciences

2:40pm - 3:00pm
Regularized, structure-preserving neural networks for the minimal entropy closure of the Boltzmann moment system
1Oak Ridge National Laboratory, USA; 2Karlsruhe Institute of Technology

3:00pm - 3:20pm
Adaptive Step Sizes for Preconditioned Stochastic Gradient Descent
1University of Bayreuth; 2Heidelberg University

3:20pm - 3:40pm
Non-vacuous PAC-Bayes bounds for Models under Adversarial Corruptions
RPTU Kaiserslautern-Landau

3:40pm - 4:00pm
Convergence results for gradient flow and gradient descent systems in artificial neural network training
University of Duisburg-Essen