Looks like those kernels (and RKHS) that you mention have many applications.
Can we say that kernels allow us to define metrics and norms on probability measures? Then you can bring the artillery of Functional Analysis to bear, as I attempted to introduce before it was shot down. People like their comfort zones.
I see that it is even possible to use kernels in place of the Kolmogorov-Smirnov test. Are Cauchy sequences hiding in kernel methods?
Beware: some folk think that Cauchy sequences caused the financial crisis.
Hello. I am not sure that kernel methods can be credited with defining metrics on probability spaces, as we can define such objects without introducing kernel methods: for instance the Wasserstein distance, the log-entropy distance, etc.
However, I think that to any such metric we can associate one kernel (or infinitely many), with which we can define a functional space (or infinitely many), along with Cauchy sequences.
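To make the idea of a kernel-induced distance on probability measures concrete: the maximum mean discrepancy (MMD) embeds each distribution into the RKHS of a kernel and takes the norm of the difference. Here is a minimal numerical sketch for 1-D samples; the Gaussian kernel and the bandwidth `sigma` are my own illustrative choices, not something fixed by the discussion above:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gram matrix k(x_i, y_j) for a Gaussian (RBF) kernel on 1-D samples."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * sigma**2))

def mmd2(x, y, sigma=1.0):
    """Biased estimate of the squared MMD between the samples x and y.

    MMD^2 = E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)], estimated by Gram-matrix
    means. It is zero when x and y are the same sample, and grows as the
    two empirical distributions move apart (relative to the bandwidth).
    """
    kxx = gaussian_kernel(x, x, sigma).mean()
    kyy = gaussian_kernel(y, y, sigma).mean()
    kxy = gaussian_kernel(x, y, sigma).mean()
    return kxx + kyy - 2.0 * kxy
```

Because the squared MMD is an RKHS norm of a difference of mean embeddings, it inherits the metric structure (and hence Cauchy sequences) of the RKHS, which is the link to the functional-analytic picture above.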
I am not a specialist in the Kolmogorov-Smirnov test. But a possible consequence is that we could try using distributions other than the Kolmogorov distribution for this test: is that already known? This would lead to infinitely many tests different from the one based on the Jacobi theta function. What could be the purpose of such a construction? I am not sure, but we could adapt the Kolmogorov-Smirnov test to specific applications, exactly as we do for PDEs or machine learning, for instance to lower the number of test samples. Could that be interesting for the stats community?
EDIT: I just read the new posts in this quite related thread and found that @Cuchullain posted a reference, https://arxiv.org/pdf/0805.2368.pdf, which seems to give good first answers about applications of kernels to Kolmogorov-Smirnov-like tests. I think we could contribute to this reference here, giving more general and more precise tests, if there is any interest in such an approach.
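To illustrate how a kernel statistic can play the role of a two-sample test in the spirit of that reference, here is a sketch of a permutation test built on a squared-MMD estimate (redefined inline so the snippet is self-contained). The Gaussian kernel, bandwidth, and permutation count are my own illustrative choices; one would compare its behavior against the classical test (e.g. `scipy.stats.ks_2samp`):

```python
import numpy as np

def mmd2(x, y, sigma=1.0):
    """Biased squared-MMD estimate with a Gaussian kernel (1-D samples)."""
    k = lambda a, b: np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma**2))
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()

def mmd_permutation_test(x, y, n_perm=200, sigma=1.0, seed=0):
    """Permutation p-value for H0: x and y are drawn from the same distribution.

    Under H0 the pooled sample is exchangeable, so we compare the observed
    MMD^2 to its distribution over random relabelings of the pooled data.
    """
    rng = np.random.default_rng(seed)
    observed = mmd2(x, y, sigma)
    pooled = np.concatenate([x, y])
    n = len(x)
    exceed = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if mmd2(perm[:n], perm[n:], sigma) >= observed:
            exceed += 1
    # add-one correction keeps the p-value strictly positive
    return (exceed + 1) / (n_perm + 1)
```

Unlike the Kolmogorov-Smirnov statistic, whose null distribution is fixed (the Jacobi theta function), the null here depends on the chosen kernel, which is exactly the flexibility discussed above: one kernel per application, at the cost of calibrating by permutation.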