1. Efficient Aggregated Kernel Tests using Incomplete U-statistics
2. KSD Aggregated Goodness-of-fit Test
3. Statistical Depth Meets Machine Learning: Kernel Mean Embeddings and Depth in Functional Data Analysis
4. A Witness Two-Sample Test
5. A Kernel Two-Sample Test for Functional Data
6. Learning Kernel Tests Without Data Splitting
7. Minimax optimality of permutation tests
8. AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data
9. Learning Deep Kernels for Non-Parametric Two-Sample Tests
10. Adaptive test of independence based on HSIC measures
11. On the Optimality of Gaussian Kernel Based Nonparametric Tests against Smooth Alternatives
12. Failing Loudly: An Empirical Study of Methods for Detecting Dataset Shift
13. Post Selection Inference with Incomplete Maximum Mean Discrepancy Estimator
14. Minimax Estimation of Maximum Mean Discrepancy with Radial Kernels
15. Generative Models and Model Criticism via Optimized Maximum Mean Discrepancy
16. Interpretable Distribution Features with Maximum Testing Power
17. A Kernelized Stein Discrepancy for Goodness-of-fit Tests
18. A Kernel Test of Goodness of Fit
19. On the High Dimensional Power of a Linear-Time Two Sample Test under Mean-shift Alternatives
20. Optimal Inference After Model Selection
21. On the Decreasing Power of Kernel and Distance Based Nonparametric Hypothesis Tests in High Dimensions
22. Exact post-selection inference, with application to the lasso
23. Multi-sample comparison of detrital age distributions
24. Optimal kernel choice for large-scale two-sample tests
25. Kernels Based Tests with Non-asymptotic Bootstrap Approaches for Two-sample Problems
26. The two-sample problem for Poisson processes: adaptive tests with a non-asymptotic wild bootstrap approach
27. Estimation of the mean of functional time series and a two-sample problem
28. Universality, Characteristic Kernels and RKHS Embedding of Measures
29. A two-sample test for high-dimensional data with applications to gene-set testing
30. Kernel Measures of Conditional Dependence
31. Goodness-of-fit testing and quadratic functional estimation from indirect observations
32. A Kernel Method for the Two-Sample-Problem
33. Alteration of the chemical environment disrupts communication in a freshwater fish
34. Measuring Statistical Dependence with Hilbert-Schmidt Norms
35. Comparison of serum and heparinized plasma samples for measurement of chemistry analytes
36. Exact and Approximate Stepdown Methods for Multiple Hypothesis Testing
37. Non-asymptotic minimax rates of testing in signal detection
38. Integral Probability Metrics and Their Generating Classes of Functions
39. Minimax testing of the hypothesis of independence for ellipsoids in $\ell_p$
40. U-Statistics: Theory and Practice
41. The Tight Constant in the Dvoretzky-Kiefer-Wolfowitz Inequality
42. Minimax Testing of Nonparametric Hypotheses on a Distribution Density in the $L_p$ Metrics
43. A Distribution Free Version of the Smirnov Two Sample Test in the $p$-Variate Case
44. Asymptotic Minimax Character of the Sample Distribution Function and of the Classical Multinomial Estimator
45. The Kolmogorov-Smirnov Test for Goodness of Fit
46. Theory of Reproducing Kernels
47. A Class of Statistics with Asymptotically Normal Distribution
58. Decoupling: From Dependence to Independence
66. MNIST handwritten digit database
67. Learning Multiple Layers of Features from Tiny Images
69. Asymptotically minimax hypothesis testing for nonparametric alternatives
71. Approximation Theorems of Mathematical Statistics
72. Stepwise Multiple Testing as Formalized Data Snooping