SERVING THE QUANTITATIVE FINANCE COMMUNITY

 
User avatar
Traden4Alpha
Posts: 23951
Joined: September 20th, 2002, 8:30 pm

Re: If you are bored with Deep Networks

January 28th, 2018, 5:21 pm

The fundamental working hypothesis of AI is that intelligent behavior can be precisely described as symbol manipulation and can be modeled with the symbol processing capabilities of the computer.

??
The 1980s called and they want their hypothesis back.
 
User avatar
ISayMoo
Topic Author
Posts: 2294
Joined: September 30th, 2015, 8:30 pm

Re: If you are bored with Deep Networks

January 28th, 2018, 9:58 pm

It's still true. We just need to build this computer first.
 
User avatar
Cuchulainn
Posts: 62106
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

February 25th, 2018, 11:27 am

Does TensorFlow support multi-processor CPUs, and can it be customised, e.g. if you don't have GPUs or the cloud?
What's the speedup for this application? What's the maximum number of cores before performance degrades? I suppose it uses a fine-grained thread model with locking and pre-emption?
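On the speedup-ceiling question, Amdahl's law gives a first-order answer: speedup saturates once the serial fraction of the work dominates (it models saturation, not outright degradation from lock contention). A minimal sketch; the 10% serial fraction is a made-up illustrative number:

```python
def amdahl_speedup(serial_fraction, n_cores):
    """Upper bound on parallel speedup when a fixed fraction
    of the work is inherently serial (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# With 10% serial work, speedup saturates well below the core count:
for n in (2, 8, 32, 128):
    print(n, round(amdahl_speedup(0.1, n), 2))
# 2 -> 1.82, 8 -> 4.71, 32 -> 7.8, 128 -> 9.34
```

With 10% serial work the bound is 10x no matter how many cores you add, which is consistent with diminishing returns beyond a few dozen cores.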
 
User avatar
tagoma
Posts: 18354
Joined: February 21st, 2010, 12:58 pm

Re: If you are bored with Deep Networks

March 3rd, 2018, 1:23 pm

This answer on SO, perhaps?
 
# Pin subgraphs to specific (virtual) CPU devices:
with tf.device("/cpu:4"):
  # ...

with tf.device("/cpu:7"):
  # ...

with tf.device("/cpu:0"):
  # ...

# Expose 8 CPU devices to the runtime and serialise the op-level
# thread pools, so that device placement drives the parallelism:
config = tf.ConfigProto(device_count={"CPU": 8},
                        inter_op_parallelism_threads=1,
                        intra_op_parallelism_threads=1)
sess = tf.Session(config=config)
 
User avatar
Cuchulainn
Posts: 62106
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

March 3rd, 2018, 6:39 pm

from concurrent.futures import ThreadPoolExecutor
import time

def wait_on_b():
    time.sleep(5)
    print(b.result())  # b will never complete because it is waiting on a.
    return 5

def wait_on_a():
    time.sleep(5)
    print(a.result())  # a will never complete because it is waiting on b.
    return 6

executor = ThreadPoolExecutor(max_workers=2)
a = executor.submit(wait_on_b)
b = executor.submit(wait_on_a)

I don't use Python (yet), but the issues are language-independent. Threads scale badly, and anecdotal evidence suggests that TensorFlow's speedup degrades beyond 32 processors. Too fine-grained.

Python supports asynchronous futures, which offer an alternative (C++11 also has futures). The scheduler determines what is pinned where, not the application developer (sub-optimal).

https://docs.python.org/3/library/concu ... tures.html
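For contrast with the deadlock above, here is a minimal stdlib sketch of the same API used safely: all tasks are submitted first and are independent of each other, and results are only collected afterwards, so no future can wait on another.

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

# Submit independent tasks first, collect results afterwards:
# no future waits on another, so no deadlock is possible.
with ThreadPoolExecutor(max_workers=2) as executor:
    futures = [executor.submit(square, i) for i in range(5)]
    results = [f.result() for f in futures]

print(results)  # [0, 1, 4, 9, 16]
```

The deadlock in the documentation example comes from futures waiting on each other inside a pool too small to run both; keeping the dependency graph acyclic (or sizing the pool accordingly) avoids it.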
 
User avatar
mkoerner
Posts: 4
Joined: April 8th, 2009, 12:17 pm

Re: If you are bored with Deep Networks

March 7th, 2018, 11:18 am

Does Tensorflow support multi-processor CPUs and can it be customised? e.g. if you don't have GPUs or Cloud?
What's the speedup for this application? What's the maximum number of cores before performance degrades? I suppose it uses a fine-grained thread model with locking and pre-emption?
As far as I know, it supports multi-core CPUs at a number of levels: at a low level by dispatching calls to the MKL, and at a high level by running multiple sessions in parallel (see the FAQ, "Does the runtime parallelize parts of graph execution?"). It is similar in PyTorch, where at a high level the input data can be distributed along a dimension using DataParallel. This post on "Python Global Interpreter Lock (GIL) + numerical data processing" could also be informative if you are interested in Python.
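On the GIL point: threads running pure-Python bytecode are serialised by the interpreter lock, which is why the parallelism has to come from C-level kernels (MKL, NumPy release the GIL) or from multiple processes. A small stdlib sketch of the CPU-bound case; it only demonstrates correctness under threading, since actual timings are machine-dependent:

```python
from concurrent.futures import ThreadPoolExecutor

def count(n):
    # CPU-bound pure-Python loop: it holds the GIL while running,
    # so several of these in threads give no parallel speedup.
    total = 0
    for i in range(n):
        total += i
    return total

with ThreadPoolExecutor(max_workers=4) as ex:
    results = list(ex.map(count, [10_000] * 4))

print(results)  # four copies of sum(range(10_000)) = 49995000
```

Replacing ThreadPoolExecutor with ProcessPoolExecutor sidesteps the GIL at the cost of inter-process data transfer.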
 
User avatar
Cuchulainn
Posts: 62106
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

March 27th, 2018, 9:51 am

History of the tools:

Logistic regression — 1958
Hidden Markov Model — 1960
Stochastic gradient descent — 1960
Support Vector Machine — 1963
k-nearest neighbors — 1967
Artificial Neural Networks — 1975
Expectation Maximization — 1977
Decision tree — 1986
Q-learning — 1989
Random forest — 1995
 
User avatar
Cuchulainn
Posts: 62106
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

April 21st, 2018, 12:57 pm

I find trying to grasp all these articles a bit painful. Lots of theory, theorems, etc., but where is the example, e.g. one that you can check against?

 
User avatar
Traden4Alpha
Posts: 23951
Joined: September 20th, 2002, 8:30 pm

Re: If you are bored with Deep Networks

April 21st, 2018, 4:37 pm

I find trying to grasp all these articles a bit painful. Lots of theory, theorems, etc., but where is the example, e.g. one that you can check against?
This does not sound right to me. My (limited) experience of math has been the opposite: examples are mostly for applied math.

I wonder what percentage of academic math papers have examples?
 
User avatar
Cuchulainn
Posts: 62106
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

April 21st, 2018, 5:07 pm

Maybe this example helps (sections 3 and 5):
https://en.wikipedia.org/wiki/Metric_space

'Concrete' can be at many 'levels', e.g. objects and types.
Just think about metrics for NNs, input representation, the (hopefully bijective) mappings between metric spaces, etc.
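To make "metric" concrete in code: a function d is a metric if it is non-negative, zero exactly on identical points, symmetric, and satisfies the triangle inequality. A small sketch checking these axioms numerically for the Euclidean metric on a few arbitrary sample points:

```python
import itertools
import math

def euclid(x, y):
    """Euclidean distance between two equal-length tuples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

points = [(0.0, 0.0), (1.0, 2.0), (-3.0, 0.5)]

for x, y, z in itertools.product(points, repeat=3):
    assert euclid(x, y) >= 0.0                         # non-negativity
    assert (euclid(x, y) == 0.0) == (x == y)           # identity of indiscernibles
    assert math.isclose(euclid(x, y), euclid(y, x))    # symmetry
    # triangle inequality (small tolerance for float rounding):
    assert euclid(x, z) <= euclid(x, y) + euclid(y, z) + 1e-12

print("metric axioms hold on the sample points")
```

A numerical check on sample points is of course not a proof, but it is exactly the kind of "example you can check against" asked for above.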
 
User avatar
Cuchulainn
Posts: 62106
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

June 6th, 2018, 12:20 pm

Some open research issues (??):
https://pdfs.semanticscholar.org/a2cf/2 ... 9e43dc.pdf

and some guidelines ..

https://www.ucl.ac.uk/~ucfamus/papers/i ... ions17.pdf

It would be useful for the august panel to give their feedback.
 
User avatar
ISayMoo
Topic Author
Posts: 2294
Joined: September 30th, 2015, 8:30 pm

Re: If you are bored with Deep Networks

June 6th, 2018, 9:42 pm

The 1st article is very old (from 2007). A lot of the stuff in it is out of date or applies to things people are no longer really interested in.

The 2nd article (guidelines) is a kind of "Captain Obvious" paper. The author's right about everything (or almost everything; I didn't have time to nitpick), but it's rather well-known stuff.
 
User avatar
Cuchulainn
Posts: 62106
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

June 7th, 2018, 4:35 pm

The 1st article is very old (from 2007). A lot of the stuff in it is out of date or applies to things people are no longer really interested in.

The 2nd article (guidelines) is a kind of "Captain Obvious" paper. The author's right about everything (or almost everything; I didn't have time to nitpick), but it's rather well-known stuff.
Fair enough. So things have progressed in leaps and bounds over 2007-2017?
2007 is not very old; BP is at least 50 years old.
 
User avatar
Cuchulainn
Posts: 62106
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

June 7th, 2018, 7:08 pm

The recent spate of articles is great on description and narrative (the what) but falls short on explanation (the how), as Wittgenstein might say.

Reverse engineering is well-nigh impossible. Was it all black box anno 2007, or has that been resolved, ISayMoo?

Does university education not teach how to write unambiguous algorithms?