- Today, 10:32 am
- Forum: Numerical Methods Forum
- Topic: Universal Approximation theorem
- Replies: **148**
- Views: **11768**

Speak of the devil: Sobolev training, i.e. train on the function and its derivatives. That sounds relevant. http://mcneela.github.io/machine_learning/2018/02/19/A-Synopsis-Of-DeepMinds-Sobolev-Training-Of-Neural-Networks.html Sounds like a logical step. Sobolev spaces are the bread and butter of advanced nu...
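To make the Sobolev-training idea concrete, here is a minimal sketch (my own toy, not from the linked post): instead of a neural network I use a polynomial model so the fit stays a plain linear least-squares problem, and the loss penalises both the value error and the derivative error, which is exactly the Sobolev-style objective.

```python
import numpy as np

def sobolev_fit(x, y, dy, degree=5, lam=1.0):
    """Fit p(x) = sum_k c_k x^k matching both y = f(x) and dy = f'(x)."""
    # Design matrix for the values: columns [1, x, x^2, ..., x^degree].
    V = np.vander(x, degree + 1, increasing=True)
    # Design matrix for the derivative: d/dx of x^k is k * x^(k-1);
    # the constant term contributes a zero column.
    D = np.hstack([np.zeros((len(x), 1)),
                   V[:, :-1] * np.arange(1, degree + 1)])
    # Minimise ||Vc - y||^2 + lam * ||Dc - dy||^2 as one stacked system.
    A = np.vstack([V, np.sqrt(lam) * D])
    b = np.concatenate([y, np.sqrt(lam) * dy])
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

# Fit sin together with its derivative cos on [-pi, pi].
x = np.linspace(-np.pi, np.pi, 50)
c = sobolev_fit(x, np.sin(x), np.cos(x))
p = np.polyval(c[::-1], x)  # polyval expects highest degree first
```

With `lam = 0` this collapses to the usual value-only regression; the derivative term is what distinguishes the Sobolev objective.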

- Today, 9:00 am
- Forum: Numerical Methods Forum
- Topic: Universal Approximation theorem
- Replies: **148**
- Views: **11768**

My one-penny guess: take any square-integrable function [$]\phi[$], and call the convolution [$]\varphi = \phi \ast \phi[$] an activation function. Then you can use it in Cybenko's theorem. That's already a lot of examples. But there exist many more. In fact, give me any probability measure...
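A quick numerical illustration of that construction (my own example): take [$]\phi[$] to be a boxcar (indicator) function, which is certainly square-integrable. Its self-convolution is the triangular "hat" function, which is continuous, bounded and non-polynomial, so it qualifies as an activation in Cybenko-style results.

```python
import numpy as np

# Boxcar phi on a grid; the discrete convolution (scaled by the grid
# size) stands in for the integral defining varphi = phi * phi.
n = 100
phi = np.ones(n)
varphi = np.convolve(phi, phi) / n   # triangular hat, peak value 1
```

The hat function ramps linearly up to 1 and back down, and is symmetric, mirroring the continuous identity that the convolution of an indicator with itself is a triangle.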

- Yesterday, 2:12 pm
- Forum: Numerical Methods Forum
- Topic: Universal Approximation theorem
- Replies: **148**
- Views: **11768**

Homework for Cuch: this recent paper (June 2017) is getting many people excited; it proposes SELU (instead of ReLU or the sigmoid). It works really well, and I'm seeing very stable learning with deep networks. You can go straight to the appendix with the proofs (pages 9-100) that motivate why it should work ...
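For reference, SELU itself is tiny; the whole point of those 90 pages of appendix is the choice of the two constants, which make zero mean / unit variance a fixed point of the activation. A quick sketch checking that numerically (my own check, using the constants from the paper):

```python
import numpy as np

# SELU constants from Klambauer et al. (2017), chosen so that mean 0 /
# variance 1 is a fixed point of the activation map.
LAM = 1.0507009873554805
ALPHA = 1.6732632423543772

def selu(x):
    return LAM * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

# Push standard-normal samples through SELU: mean and variance should
# come back out approximately 0 and 1 (the self-normalising property).
rng = np.random.default_rng(0)
out = selu(rng.standard_normal(1_000_000))
```

With ReLU or the sigmoid, the same experiment drifts away from (0, 1), which is why deep nets with those activations need explicit normalisation layers.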

- Yesterday, 2:01 pm
- Forum: Numerical Methods Forum
- Topic: Universal Approximation theorem
- Replies: **148**
- Views: **11768**

Cuch, you complain that there's not enough maths in ML. Is this paper sufficiently mathy for you? https://arxiv.org/pdf/1908.10828.pdf I tried again to read this paper. Here is the Main Theorem, but I am afraid that an entire life would not be enough to understand it... I wonder if they are using ...

- Yesterday, 12:25 pm
- Forum: Technical Forum
- Topic: Why is Bellman Equation solved by backwards?
- Replies: **26**
- Views: **2090**

Thank you all, but what I cannot understand is the real reason that the Bellman equation is usually solved backwards. Can anyone give an example in which both initial and terminal conditions are well defined, but the Bellman equation can only be solved backwards? I think there are tons of ex...
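Not an answer to the question as posed, but here is the standard picture in code (a toy of my own making): in the finite-horizon equation [$]V_t(s) = \max_a [\, r(s,a) + V_{t+1}(s') \,][/$], the boundary data that is actually given is the terminal condition [$]V_T[/$], so the recursion has no choice but to run backwards from [$]t = T[/$].

```python
import numpy as np

T, S = 5, 3                      # horizon and number of states
r = np.array([[1.0, 0.0],        # r[s, a]: reward for action a in state s
              [0.0, 2.0],
              [0.5, 0.5]])
nxt = np.array([[0, 1],          # nxt[s, a]: deterministic successor state
                [2, 0],
                [1, 2]])

V = np.zeros((T + 1, S))         # terminal condition: V_T(s) = 0
for t in range(T - 1, -1, -1):   # backward induction from T-1 down to 0
    Q = r + V[t + 1][nxt]        # Q[s, a] = r(s, a) + V_{t+1}(nxt[s, a])
    V[t] = Q.max(axis=1)         # Bellman optimality at time t
```

Running the loop forwards is impossible here because evaluating `V[t]` requires `V[t+1]`, which is only pinned down by the terminal condition.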

- Yesterday, 11:03 am
- Forum: Numerical Methods Forum
- Topic: Universal Approximation theorem
- Replies: **148**
- Views: **11768**

Sent! ok thanks. Yes. dduffy AT datasim DOT nl

- Yesterday, 10:22 am
- Forum: Numerical Methods Forum
- Topic: Universal Approximation theorem
- Replies: **148**
- Views: **11768**

ok thanks. Yes. dduffy AT datasim DOT nl

- Yesterday, 10:19 am
- Forum: Numerical Methods Forum
- Topic: Universal Approximation theorem
- Replies: **148**
- Views: **11768**

Oh wait, a function you don't even know? We all know that doesn't make sense. That's one possible Weltanschauung, but one that does not produce interesting cases. In mathematics, you scope the class of problems you want to model, e.g. Sobolev spaces, Fourier, etc., and then investigate which activation ...

- Yesterday, 10:01 am
- Forum: Numerical Methods Forum
- Topic: Universal Approximation theorem
- Replies: **148**
- Views: **11768**

The mathematical precision in Cybenko 1988 has been superseded/improved on here: http://www2.math.technion.ac.il/~pinkus/papers/acta.pdf In particular, Theorems 3.1, 4.1, 5.1, 6.2, 6.7 and Proposition 3.3. Seems like ML's maths is stuck in the 80s. The mathematical subtleties surrounding activation fu...

- Yesterday, 9:57 am
- Forum: Numerical Methods Forum
- Topic: Universal Approximation theorem
- Replies: **148**
- Views: **11768**

I don't buy this paper!! Try to read it, you will understand what I mean. Cuch, you complain that there's not enough maths in ML. Is this paper sufficiently mathy for you? https://arxiv.org/pdf/1908.10828.pdf

- October 9th, 2019, 6:33 am
- Forum: Book And Research Paper Forum
- Topic: MSc Theses on Machine Learning and Computational Finance
- Replies: **9**
- Views: **657**

JohnLeM, adding to your list of questions, I wrote this a while back. @Cuchullain, for me, Gradient Descent is a Swiss-army-knife method: it always produces results, but can get stuck in local minima. Local minima, if it is lucky; that's the least of your worries. GD has a whole lot of issues. Off the top ...
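The local-minima point in two dozen lines (a toy of my own, not from the thesis): plain gradient descent on [$]f(x) = x^4 - 3x^2 + x[/$], which has one deep and one shallow basin, ends up in a different minimum depending purely on where you start.

```python
import numpy as np

def grad(x):
    """f(x) = x^4 - 3x^2 + x  =>  f'(x) = 4x^3 - 6x + 1."""
    return 4 * x**3 - 6 * x + 1

def descend(x, lr=0.01, steps=2000):
    """Vanilla gradient descent from a given starting point."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

x_left = descend(-2.0)    # falls into the deeper (global) basin near -1.3
x_right = descend(2.0)    # trapped in the shallower local minimum near 1.13
```

Both runs converge (the gradient at the endpoint is essentially zero), yet they disagree, which is exactly the "always produces results, possibly the wrong ones" behaviour described above.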

- October 9th, 2019, 6:24 am
- Forum: Book And Research Paper Forum
- Topic: MSc Theses on Machine Learning and Computational Finance
- Replies: **9**
- Views: **657**

6. "but nobody can tell if the resulting algorithm is performant or not." Not sure I completely agree --> cross-validation with 5 folds was used. I don't contest the numerical figures; you did a great job guaranteeing them with this K-fold method, as well as teaching me the existence of this meth...
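For anyone else meeting K-fold for the first time: the mechanics fit in a few lines. This is a hand-rolled sketch (my own, with a deliberately trivial predict-the-train-mean "model" standing in for the real estimator): each fold is held out exactly once and scored on data the model never saw.

```python
import numpy as np

def k_fold_scores(X, y, k=5, seed=0):
    """Return the k held-out MSE scores of a trivial mean predictor."""
    idx = np.random.default_rng(seed).permutation(len(X))
    folds = np.array_split(idx, k)          # k disjoint index sets
    scores = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        pred = y[train].mean()              # "train" the stand-in model
        scores.append(np.mean((y[test] - pred) ** 2))  # held-out MSE
    return scores

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
y = rng.normal(size=100)
scores = k_fold_scores(X, y)
```

The spread of the five scores is the useful part: it estimates how much the reported performance depends on the particular split, which is what guards the headline figures.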
