All sources cited or reviewed
This is a list of all sources we have used in the TransferLab, with links to the referencing content and metadata such as accompanying code, videos, etc. If you think we should look at something, drop us a line.
References
[Li20F] Fourier Neural Operator for Parametric Partial Differential Equations.
[Kim20S] SoftFlow: Probabilistic Framework for Normalizing Flow on Manifolds.
[Pap20P] Prevalence of Neural Collapse during the terminal phase of deep learning training.
[Roc20S] Solving Schrödinger’s equation with Deep Learning.
[Dos20I] An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale.
[Bas20I] Influence Functions in Deep Learning Are Fragile.
[Gon20T] Training deep neural density estimators to identify mechanistic models of neural dynamics.
[Tho20L] Likelihood-free inference by ratio estimation.
[Gri20B] Bootstrap your own latent: A new approach to self-supervised Learning.
[Tej20S] sbi: A toolkit for simulation-based inference.
[Nix20M] Measuring Calibration in Deep Learning.
[Kol20H] How to Exploit Structure while Solving Weighted Model Integration Problems.
[Peh20R] Random Sum-Product Networks: A Simple and Effective Approach to Probabilistic Deep Learning.
[Wu20S] Stronger and Faster Wasserstein Adversarial Attacks.
[Wan20W] When and why PINNs fail to train: A neural tangent kernel perspective.
[Din20R] Revisiting the Evaluation of Uncertainty Estimation and Its Application to Explore Model Complexity-Uncertainty Trade-Off.
[Bra20S] Single Shot MC Dropout Approximation.
[Kri20B] Being Bayesian, Even Just a Bit, Fixes Overconfidence in ReLU Networks.
[Rib20A] Beyond Accuracy: Behavioral Testing of NLP Models with CheckList.
[And20W] What Matters In On-Policy Reinforcement Learning? A Large-Scale Empirical Study.
[Shi20C] On the convergence of physics informed neural networks for linear second-order elliptic and parabolic type PDEs.
[Wit20S] Simulation-Based Inference for Global Health Decisions.
[Maz20L] Leveraging exploration in off-policy algorithms via normalizing flows.
[Kob20N] Normalizing Flows: An Introduction and Review of Current Methods.
[Hu20I] Improved Image Wasserstein Attacks and Defenses.
[Wan20L] Less Is Better: Unweighted Data Subsampling via Influence Function.
[Qiu20Q] Quantifying Point-Prediction Uncertainty in Neural Networks via Residual Estimation with an I/O Kernel.
[Buc20P] Proxy tasks and subjective measures can be misleading in evaluating explainable AI systems.
[Mce20S] Statistical Rethinking: A Bayesian Course with Examples in R and Stan.