All sources cited or reviewed
This is a list of all the sources we have used in the TransferLab, with links to the referencing content and to metadata such as accompanying code, videos, etc. If you think we should look at something, drop us a line.
References
[Mul22T]
The third international verification of neural networks competition (VNN-COMP 2022): Summary and results,
[Obr22E]
Evaluation Metrics for Graph Generative Models: Problems, Pitfalls, and Practical Solutions,
[Pet22E]
Escaping limit cycles: Global convergence for constrained nonconvex-nonconcave minimax problems,
[Sha22L]
Label Encoding for Regression Networks,
[Son22D]
Denoising Diffusion Implicit Models,
[Top22U]
Understanding over-squashing and bottlenecks on graphs via curvature,
[Tra22P]
pyDVL: The Python Data Valuation Library,
[Wei22C]
Chain-of-thought prompting elicits reasoning in large language models,
[Xu22A]
Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy,
[Xu22G]
GeoDiff: A Geometric Diffusion Model for Molecular Conformation Generation,
[You22B]
Bayesian Modeling and Uncertainty Quantification for Learning to Optimize: What, Why, and How,
[Zha22B]
Boosting the Certified Robustness of L-infinity Distance Nets,
[Zha22G]
GreaseLM: Graph REASoning Enhanced Language Models,
[Zha22C]
Comparing Distributions by Measuring Differences that Affect Decision Making,
[Lan21P]
Perfect density models cannot guarantee anomaly detection,
[Kos21O]
Offline Reinforcement Learning with Implicit Q-Learning,
[Mir21F]
The Fundamental Limits of Interval Arithmetic for Neural Networks,
[Kar21S]
A Style-Based Generator Architecture for Generative Adversarial Networks,
[Xia21B]
BarrierNet: A Safety-Guaranteed Layer for Neural Networks,
[Geb21D]
Datasheets for datasets,
[Rag21W]
Worst-Case Robustness in Machine Learning,
[Esc21M]
Mixtures of Laplace Approximations for Improved Post-Hoc Uncertainty in Deep Learning,
[Bro21A]
An Automatic Finite-Sample Robustness Metric: When Can Dropping a Little Data Make a Big Difference?,
[Deu21E]
Explanations for Data Repair Through Shapley Values,
[Awa21D]
DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization,
[Van21I]
An Introduction to Probabilistic Programming,
[Hu21L]
LoRA: Low-Rank Adaptation of Large Language Models,
[Lim21T]
Temporal Fusion Transformers for interpretable multi-horizon time series forecasting,
[Sal21P]
Progressive Distillation for Fast Sampling of Diffusion Models,