A Divergence-Based Method for Weighting and Averaging Model Predictions
A divergence-based method outperforms traditional weighting in small sample scenarios.
Olav Benjamin Vassend
CLVAE model uses a variational autoencoder for long-term customer revenue forecasting, enhancing accuracy.
Jeffrey Näf, Riana Valera Mbelson, Markus Meierer
The Mixed Membership sub-Gaussian Model (MMSG) addresses the limitation of classical GMM by allowing observations to belong to multiple components.
Huan Qing
WassersteinGrad explains dynamic physical field predictions by computing the entropic Wasserstein barycenter, enhancing autoregressive weather forecasting model interpretability.
Younes Essafouri, Laure Raynaud, Luciano Drozda et al.
FedSPDnet outperforms traditional methods on EEG datasets using ProjAvg and RLAvg strategies, enhancing F1 score and robustness.
Thibault Pautrel, Florent Bouchard, Ammar Mian et al.
SQUEAK algorithm achieves low space complexity for kernel ridge regression using unnormalized ridge leverage scores.
Daniele Calandriello, Alessandro Lazaric, Michal Valko
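As background for the SQUEAK entry: ridge leverage scores are a standard quantity in kernel approximation. The sketch below computes exact lambda-ridge leverage scores of a kernel matrix; it is a generic illustration of the quantity, not the SQUEAK streaming algorithm itself, and the RBF-kernel demo data is made up.

```python
import numpy as np

def ridge_leverage_scores(K, lam):
    """lam-ridge leverage scores: tau_i = (K (K + lam*n*I)^{-1})_{ii}."""
    n = K.shape[0]
    M = K + lam * n * np.eye(n)
    # K and M commute, so K @ inv(M) is symmetric; its diagonal gives the scores.
    return np.diag(np.linalg.solve(M, K))

# Example (hypothetical data): RBF kernel on random points.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
K = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1))
tau = ridge_leverage_scores(K, lam=0.1)
```

Each score lies in [0, 1), and their sum is the effective dimension that controls how many columns a sampling-based sketch needs to keep.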
Pliable Rejection Sampling (PRS) learns the proposal distribution using kernel estimation, ensuring high-probability i.i.d. sampling.
Akram Erraqabi, Michal Valko, Alexandra Carpentier et al.
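For context on the PRS entry: PRS learns its proposal by kernel density estimation, but the acceptance step is classic rejection sampling. The sketch below shows only that textbook step with a fixed proposal and envelope constant M, not the adaptive PRS procedure; the triangular-density example is illustrative.

```python
import numpy as np

def rejection_sample(target_pdf, proposal_sample, proposal_pdf, M, n, rng):
    """Draw n i.i.d. samples from target_pdf via rejection sampling.

    Requires target_pdf(x) <= M * proposal_pdf(x) for all x.
    """
    out = []
    while len(out) < n:
        x = proposal_sample(rng)
        # Accept x with probability target(x) / (M * proposal(x)).
        if rng.uniform() * M * proposal_pdf(x) <= target_pdf(x):
            out.append(x)
    return np.array(out)

# Example: target f(x) = 2x on [0, 1]; proposal Uniform(0, 1) with envelope M = 2.
rng = np.random.default_rng(0)
samples = rejection_sample(
    target_pdf=lambda x: 2.0 * x,
    proposal_sample=lambda r: r.uniform(),
    proposal_pdf=lambda x: 1.0,
    M=2.0, n=5000, rng=rng,
)
```

Accepted draws are exactly i.i.d. from the target; what PRS adds is a data-driven proposal that keeps the acceptance rate high.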
Bandit algorithms maximize concave statistical utilities using influence-function gradients.
Matías Carrasco, Alejandro Cholaquidis
Fast estimation of Gaussian mixture components via centering and singular value thresholding without iteration.
Huan Qing
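As background for the entry above: singular value thresholding is a standard non-iterative denoising step. The sketch below shows the generic hard-thresholding operation on a low-rank-plus-noise matrix, under made-up data; it is not the paper's full estimation procedure for Gaussian mixture components.

```python
import numpy as np

def svd_threshold(A, tau):
    """Keep only the singular components of A whose singular value exceeds tau."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > tau
    return (U[:, keep] * s[keep]) @ Vt[keep]

# Example (hypothetical): rank-1 signal plus small noise, denoised in one shot.
rng = np.random.default_rng(0)
u, v = rng.normal(size=50), rng.normal(size=50)
u, v = u / np.linalg.norm(u), v / np.linalg.norm(v)
A = 10.0 * np.outer(u, v) + 0.01 * rng.normal(size=(50, 50))
A_hat = svd_threshold(A, tau=1.0)
```

Because the noise singular values stay well below the threshold while the signal's stays above it, a single SVD recovers the low-rank structure without any iterative refinement.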
Revisiting active sequential prediction-powered mean estimation shows that the confidence width is smallest when the constant probability weight is near one.
Maria-Eleni Sfyraki, Jun-Kun Wang
Spectral bandit algorithms for smooth graph functions achieve linear and sublinear scaling in the effective dimension.
Michal Valko, Rémi Munos, Branislav Kveton et al.
Adaptive kernel selection enhances stability and accuracy of kernelized diffusion maps.
Othmane Aboussaad, Adam Miraoui, Boumediene Hamzi et al.
Bias-aware simulation-based inference framework addresses selection bias, enhancing estimation accuracy.
Jonas Arruda, Sophie Chervet, Paula Staudt et al.
Kometo algorithm achieves fast learning rates in multi-fidelity optimization without known smoothness or fidelity assumptions.
Come Fiegel, Victor Gabillon, Michal Valko
Structural interpretability in SVMs using truncated orthogonal polynomial kernels reveals model complexity.
Víctor Soto-Larrosa, Nuria Torrado, Edmundo J. Huertas
Amortized Optimal Transport using sliced potentials enhances OT plan prediction efficiency across multiple measure pairs.
Minh-Phuc Truong, Khai Nguyen
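For context on the sliced-potentials entry: the underlying building block is sliced optimal transport, which reduces a d-dimensional OT problem to sorted 1-D projections. The sketch below is the standard Monte Carlo sliced 2-Wasserstein distance between two equal-size point clouds, not the paper's amortized predictor; the Gaussian point clouds are illustrative.

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=100, rng=None):
    """Monte Carlo sliced 2-Wasserstein distance between equal-size point clouds."""
    if rng is None:
        rng = np.random.default_rng(0)
    thetas = rng.normal(size=(n_proj, X.shape[1]))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)
    # Project onto each random direction; sorting matches 1-D quantiles.
    px = np.sort(X @ thetas.T, axis=0)
    py = np.sort(Y @ thetas.T, axis=0)
    return np.sqrt(np.mean((px - py) ** 2))

# Example (hypothetical data): identical clouds vs. a cloud shifted by (1, 0).
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
Y = X + np.array([1.0, 0.0])
d_same = sliced_wasserstein(X, X, n_proj=200)
d_shift = sliced_wasserstein(X, Y, n_proj=200)
```

Each 1-D transport plan is solved exactly by sorting, which is what makes sliced formulations cheap enough to amortize across many measure pairs.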
Fast interpretable autoregressive estimation using neural network backpropagation, achieving 12.6x speedup.
Anaísa Lucena, Ana Martins, Armando J. Pinho et al.
OmniAnomaly and PCA perform comparably on the SMD dataset, especially without point adjustment.
Bruna Alves, Ana Martins, Armando J. Pinho et al.
The mixed cumulant model reveals how diffusion models learn data statistics progressively, from simple to complex.
Lorenzo Bardone, Claudia Merger, Sebastian Goldt
VecMol generates 3D molecules using vector-field representations, avoiding explicit graph generation and enhancing geometry-chemistry coherence.
Yuchen Hua, Xingang Peng, Jianzhu Ma et al.