NeurIPS 22 accepted papers list
EBM
- Diffusion Models as Plug-and-Play Priors
- Video diffusion models
- Photorealistic Text-to-Image Diffusion Models with Deep Language Understanding
- On analyzing generative and denoising capabilities of diffusion-based generative models
- BinauralGrad: A two-stage conditional diffusion probabilistic model for binaural audio synthesis
- Thompson sampling efficiently learns to control diffusion processes
- Generative Time Series Forecasting with Diffusion, Denoise and Disentanglement
- CARD: Classification and regression diffusion models
- Improving diffusion models for inverse problems using manifold constraints
- Elucidating the design space of diffusion-based generative models
- Score-based diffusion meets annealed importance sampling
- Deep equilibrium approaches to diffusion models
- Flexible diffusion modeling of long videos
- Conditional diffusion process for inverse halftoning
- Diffusion-LM improves controllable text generation
- Riemannian diffusion models
- Denoising diffusion restoration models
- DPM-Solver: A fast ODE solver for diffusion probabilistic model sampling in around 10 steps.
- First hitting diffusion models
- GENIE: Higher-order denoising diffusion solvers.
- Antigen-specific antibody design and optimization with diffusion-based generative models
- Unsupervised representation learning from pre-trained diffusion probabilistic models.
- Exponential family model-based reinforcement learning via score matching
- Convergence for score-based generative modeling with polynomial complexity.
- Score-based models detect manifolds
- Concrete score matching: generalized score matching for discrete data.
- Score-based generative modeling secretly minimizes the Wasserstein distance.
- Wavelet score-based generative modeling
- Riemannian score-based generative modeling.
- End-to-end stochastic optimization with energy-based model
- Adaptive multi-stage density ratio estimation for learning latent space energy-based model
- EGSDE: Unpaired image-to-image translation via energy-guided stochastic differential equations.
- A continuous time framework for discrete denoising models.
GAN
- Masked GANs are robust generation learners.
- Improving GANs via adversarial learning in latent spaces.
- Amortized projection optimization for sliced Wasserstein generative models.
Dynamic systems
ODE
- Neural differential equations for learning to program neural nets through continuous learning rules.
- Do residual neural networks discretize neural ordinary differential equations?
- Constraining Gaussian processes to systems of linear ordinary differential equations.
- Improving neural ordinary differential equations with Nesterov’s accelerated gradient method
SDE
- Learning white noises in neural stochastic differential equations
- Riemannian neural SDE: learning stochastic representation on manifolds.
PDE
- Neural Stochastic PDEs: Resolution invariant learning of continuous spatiotemporal dynamics
Control
- Neural stochastic control
- Markov Chain Score Ascent: A unifying framework of variational inference with Markovian gradients.
Time Series
- Self-supervised contrastive pre-training for time series via time-frequency consistency
- BILCO: An efficient algorithm for joint alignment of time series.
- GT-GAN: General purpose time series synthesis with generative adversarial networks.
- Generative Time Series Forecasting with Diffusion, Denoise and Disentanglement
- Non-stationary transformers: exploring the stationarity in time series forecasting
- Time dimension dances with simplicial complexes: Zigzag filtration curve based Supra-Hodge convolution networks for time-series forecasting
- FiLM: Frequency improved Legendre memory model for long-term time series forecasting.
- Learning latent seasonal-trend representations for time series forecasting.
- SCINet: Time series modeling and forecasting with sample convolution and interaction.
- WaveBound: Dynamically bounding error for stable time series forecasting.
- Causal disentanglement for time series.
- Dynamic sparse network for time series classification: Learning what to “See”
- Efficient learning of nonlinear prediction models with time-series privileged information.
- Multivariate time-series forecasting with temporal polynomial graph neural networks.
- Dynamic graph neural networks under spatio-temporal distribution shift.
- AutoST: Towards the universal modeling of spatio-temporal sequences.
- Neural Stochastic PDEs: Resolution invariant learning of continuous spatio-temporal dynamics
- Variational context adjustment for temporal event prediction under distribution shifts.
- Practical adversarial attacks on spatio-temporal traffic forecasting models.
- Quo Vadis: Is trajectory forecasting the key towards long-term multi-object tracking?
- Contact-aware human motion forecasting.
- Forecasting Human Trajectory from scene history
- Motion forecasting transformer with global intention localization and local movement refinement.
- Representing spatial trajectories as distributions.