Hee Collector,

instead of

Code: Select all

```
import math
...
math.sqrt(5.0)
```

you can also import a set of functions that are inside math and then drop the "math." prefix:

Code: Select all

```
from math import sqrt, exp, log, pow
...
sqrt(5.0)
```

or you can import all functions that are inside math:

Code: Select all

```
from math import *
...
sqrt(5.0)
```

I would go with the middle version. The first is indeed too verbose. The last imports lots of stuff, and maybe it imports some function whose name clashes with a function you made yourself. With the middle one you'll have an explicit list of the functions you've imported from math.
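A hypothetical sketch of the clash the middle option avoids (the function name `pow` is chosen for illustration; any name in math would do):

```python
# Hypothetical sketch of the name-clash risk with "from math import *".
# Assume you wrote your own pow() first; the star import silently rebinds it.
def pow(base, exponent):
    return "my own pow"      # our own function

from math import *           # this rebinds pow to math.pow without warning

print(pow(2, 3))             # 8.0 -- math.pow won, our pow is shadowed
```

With the explicit middle form, `pow` appears right in the import line, so the rebinding is at least visible.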

Cool!

I found another little thing:

I see you sometimes using "2" (integer) and sometimes "2." (float). The "2." is shorthand for "2.0", and it tells Python that you want a float there instead of an integer. In Python 2, division of integers (the "2" version) returns an integer, so 10/4 returns 2! Division of floats, however, returns a float: 10. / 4. = 2.5. This can cause confusion, and it was changed in Python 3: there both 10/4 and 10.0/4.0 give 2.5.

.. so if you want your code to run on Python 2, and you want 10/4 to return 2.5, then you can do two things:

1) replace "2" with "2." or "2.0" to make the constants floats instead of integers. You do this sometimes, but sometimes not.

2) specify at the top that you want the Python 3 behaviour for your code even when run in Python 2, like this:

Code: Select all

```
from __future__ import division
print(10/4)  # 2.5
```

.. anyways, this issue is the reason that you sometimes see people write "2" as "2."
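One complementary note (this goes slightly beyond the post): under either Python version, the `//` operator requests floor division explicitly, so both behaviours remain available side by side:

```python
from __future__ import division  # Python 3 division semantics, even on Python 2

print(10 / 4)     # true division:  2.5
print(10 // 4)    # floor division: 2, unaffected by the __future__ import
print(10.0 // 4)  # floor division on floats: 2.0
```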

Pretty good, and I see even the factorial function seems to work well, at least for n=10000. Not light speed, but a nice language.

There is a nice "arbitrary precision" library for Python which is fast:

http://mpmath.org/doc/current/functions/gamma.html
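A minimal sketch of the n = 10000 case: the standard library already does exact big-integer factorials, and mpmath (the third-party package linked above) gives arbitrary-precision floats via gamma. The mpmath lines are left commented out since the package may not be installed:

```python
import math

n = 10000
exact = math.factorial(n)   # exact big integer, fast in CPython
print(len(str(exact)))      # 35660 decimal digits

# With mpmath installed (pip install mpmath), gamma(n + 1) == n!
# as an arbitrary-precision float:
# from mpmath import mp, gamma
# mp.dps = 50               # 50 significant digits
# print(gamma(n + 1))
```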

- Cuchulainn
**Posts:** 57692 | **Location:** Amsterdam

The end. Why not install a VM on top of OSX, or just buy a Windows machine with C#. Just sayin'. Not so promising either. Why not just use Excel-DNA (does it work on Mac?). Piece of cake.

"I have solved this problem with ExcelDNA. As stated in the ExcelDNA forums it does not support Mac for Office, however there is a workaround. Install Wine for OSX and then run Microsoft Office 2003 or 2010 within it. I used a commercial distribution of Wine called Crossover and it works well. The downside of this approach is that you must install Wine on every target for your AddIn." https://stackoverflow.com/questions/222 ... -in-on-osx

BTW, if you still have an Apple II, then there is this:

https://en.wikipedia.org/wiki/VisiCalc

"Microsoft products work well together"

And if you remain nostalgic about more demanding programming languages à la C++, the following shall work:

Code: Select all

```
import tensorflow as tf

val_ = tf.constant(5.0)
sqrt_ = tf.sqrt(val_)
with tf.Session() as sess:
    print(sess.run(sqrt_))
```

That's a very good minimalistic intro to using TensorFlow in Python!

You don't easily see it, but tf builds a computational graph of operations on tensors and the dependencies between them. Typically you feed values into the graph and then recompute the values that depend on them.

E.g. if you feed 16 for val_ and compute sqrt_ again, you get 4:

Code: Select all

```
with tf.Session() as sess:
    print(sess.run(sqrt_, feed_dict={val_: 16}))
```

TensorFlow's strength is high-performance, cross-platform linear algebra and automatic differentiation; it also runs on the GPU.

If you had to compute the implied vol of 100,000 Barrier Exchange Options then tf would perform really well. It would run on the GPU or parallel cores, and you could use automatic differentiation to solve for the implied vol.
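Setting tf aside for a second, the "solve for the implied" step is plain root finding. A minimal sketch, assuming vanilla Black-Scholes calls (not Barrier Exchange Options) and made-up parameter values, using Newton's method with the analytic vega:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def bs_vega(S, K, T, r, sigma):
    """Analytic derivative of the call price with respect to sigma."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    return S * math.sqrt(T) * math.exp(-0.5 * d1**2) / math.sqrt(2.0 * math.pi)

def implied_vol(price, S, K, T, r, sigma0=0.3, tol=1e-8):
    """Newton iteration: sigma <- sigma - (model - market) / vega."""
    sigma = sigma0
    for _ in range(50):
        diff = bs_call(S, K, T, r, sigma) - price
        if abs(diff) < tol:
            break
        sigma -= diff / bs_vega(S, K, T, r, sigma)
    return sigma

# Round-trip check with hypothetical numbers: price at sigma = 0.2, then invert.
price = bs_call(100.0, 100.0, 1.0, 0.05, 0.2)
print(implied_vol(price, 100.0, 100.0, 1.0, 0.05))  # ~0.2
```

In tf the automatic differentiation would hand you the vega for free, and the batched graph evaluation is what would let the 100,000 options run in parallel on the GPU.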

Yes, I wrote it ugly purposefully! I like this intro on TensorFlow.

While we are at it, do you advise to go for

I haven't used that, because it's a recent feature aimed at a more interactive experience, which I don't need.

I think it was added to offer the same experience as PyTorch, which is growing in popularity?

- Cuchulainn
**Posts:** 57692 | **Location:** Amsterdam

- I am looking for books on NumPy and SciPy, with a focus on the numerical algorithms and their background. I am not interested in having to wade through syntax before getting to these topics.

Any suggestions? Thx!
