Wed Dec 05, 2018 1:11 pm

Let's take a step backwards.

Gradient descent methods as we know and love them are nothing more than Euler's (ugh) method applied to an ODE (aka a *gradient system*):

[$]dx/dt = -\nabla f(x)[$], where [$]f[$] is the function to be minimised.

The local minima of [$]f[$] are the stable *critical points* of this ODE system (Poincaré-Lyapunov-Bendixson theory).

I have tried it on a number of benchmark unconstrained optimisation problems, solved using C++ boost::odeint, without any of the infamous learning-rate fudges.

We use more robust ODE solvers than Euler, which means we avoid the well-documented issues with GD methods.

For the evergreen [$]e^5[$] we have the ODE [$]dx/dt = -\nabla(x-e^5)^2 = -2(x - e^5)[$]. We get x = 148.413 for any initial value of the ODE.

Maestro!

Statistics: Posted by ExSan — December 15th, 2018, 12:29 pm


[$]R^+ = 0.675[$] with probability 1/2

[$]R^- = -0.45[$] with probability 1/2

In other words, if you start period n with wealth [$]W_n[$], then [$]W_{n+1} = W_n (1 + R)[$], where the R's are independent draws each time from the above distribution.

I wanted to calculate the exact expected utility of a risk-neutral investor whose expected utility over N trials is [$]\bar{U}_N = E_0 \left(\frac{W_N}{W_0}\right)[$].

Here was my thinking.

First, [$]\bar{U}_N = E_0[e^{X_N}][$], where [$]X_N = \sum_{n=1}^{N} x_n[$] and [$]x_n = \log(1+R_n)[$].

Again, the [$]R_n[$] are i.i.d draws from the simple distribution above.

By the Central Limit Theorem, since the [$]x_n[$] are i.i.d, we expect [$]X_N[$] to tend to a normal distribution with mean [$]N \mu[$] and variance [$]N \sigma^2[$], with the approximation becoming better and better as [$]N \rightarrow \infty[$].

Next, [$]\mu = E[x_n] = E[\log(1 + R)] = \tfrac{1}{2}\left(\log 1.675 + \log 0.55\right) \approx -0.041012[$], and

[$]\sigma^2 = Var[x_n] = \tfrac{1}{2}\left(\log^2 1.675 + \log^2 0.55\right) - \mu^2 \approx 0.3100541[$]

Since [$]X_N[$] is normally distributed for large enough N, we can use the EXACT normal relation (or wikipedia log-normal mean, if you like):

(*) [$]\bar{U}_N = E_0[e^{X_N}] = \exp \{(\mu + 0.5 \, \sigma^2)N \} \approx (1.120769)^N[$], using the values just given for [$](\mu,\sigma^2)[$].

But (*) can't be correct, since the returns are independent each time, and so we must have

(**) [$]\bar{U}_N = \left( E[(1 + R)] \right)^N = (1.1125)^N[$], which contradicts (*).

The brainteaser: where was my mistake?

p.s. Just to be clear, everything I said in that linked thread was correct. The mistake you want to find is in the calculation here. And, yes, I know where it is -- pretty sure.

Statistics: Posted by Alan — December 14th, 2018, 10:10 pm


Although both [$]\pi[$] and [$]e[$] are known to be transcendental, it is not known whether the pair [$]\{\pi, e\}[$] is algebraically independent over [$]\mathbb{Q}[$]. In fact, it is not even known whether [$]\pi + e[$] is irrational. Nesterenko proved in 1996 that:

the numbers [$]\pi[$], [$]e^{\pi}[$], and [$]\Gamma(1/4)[$] are algebraically independent over [$]\mathbb{Q}[$];

the numbers [$]\pi[$], [$]e^{\pi\sqrt{3}}[$], and [$]\Gamma(1/3)[$] are algebraically independent over [$]\mathbb{Q}[$];

for all positive integers [$]n[$], the numbers [$]\pi[$] and [$]e^{\pi\sqrt{n}}[$] are algebraically independent over [$]\mathbb{Q}[$].

Statistics: Posted by ppauper — December 5th, 2018, 2:15 pm


Let's take a step backwards.

Gradient descent methods as we know and love them are nothing more than Euler's (ugh) method applied to an ODE (aka a *gradient system*):

[$]dx/dt = -\nabla f(x)[$], where [$]f[$] is the function to be minimised.

The local minima of [$]f[$] are the stable *critical points* of this ODE system (Poincaré-Lyapunov-Bendixson theory).

I have tried it on a number of benchmark unconstrained optimisation problems, solved using C++ boost::odeint, without any of the infamous learning-rate fudges.

We use more robust ODE solvers than Euler, which means we avoid the well-documented issues with GD methods.

For the evergreen [$]e^5[$] we have the ODE [$]dx/dt = -\nabla(x-e^5)^2 = -2(x - e^5)[$]. We get x = 148.413 for any initial value of the ODE.

Statistics: Posted by Cuchulainn — December 5th, 2018, 1:26 pm


Have you solved it yet?

Statistics: Posted by ExSan — December 5th, 2018, 10:56 am


What a puzzle! Compute [$]e^{\pi}[$] to 2 decimal places with pencil and paper. (Gelfond's constant.)

A follow-on from ExSan's post and ansatz (big conjecture, but in the right direction) is whether [$]\pi[$] and [$]e[$] are algebraically independent, i.e. is there a polynomial relation

[$]a_{n}\pi^{n} + a_{n-1}\pi^{n-1} + ... + a_{0}\pi^{0} = e^5[$]

or

[$]a_{n}e^{n} + a_{n-1}e^{n-1} + ... + a_{0}e^{0} = \pi[$]

where [$]a_{j}, j = 0,..,n[$] are algebraic numbers? Whichever one takes your fancy.

Statistics: Posted by Cuchulainn — December 4th, 2018, 11:31 pm


infinite wisdom too much, will blow your mind, must be managed

Statistics: Posted by Collector — November 2nd, 2018, 10:57 pm


Statistics: Posted by katastrofa — November 2nd, 2018, 10:29 pm


Statistics: Posted by katastrofa — November 2nd, 2018, 10:05 pm


Statistics: Posted by Collector — November 2nd, 2018, 9:59 pm
