Re: If you are bored with Deep Networks

Posted: January 28th, 2018, 5:21 pm
by Traden4Alpha
The fundamental working hypothesis of AI is that intelligent behavior can be precisely described as symbol manipulation and can be modeled with the symbol processing capabilities of the computer.

??
The 1980s called and they want their hypothesis back.

Re: If you are bored with Deep Networks

Posted: January 28th, 2018, 9:58 pm
by ISayMoo
It's still true. Only, we need to build this computer first.

Re: If you are bored with Deep Networks

Posted: February 25th, 2018, 11:27 am
by Cuchulainn
Does TensorFlow support multi-processor CPUs, and can it be customised, e.g. if you don't have GPUs or Cloud?
What's the speedup for this application? What's the maximum number of cores before performance degrades? I suppose it uses a fine-grained thread model with locking and pre-emption?

Re: If you are bored with Deep Networks

Posted: March 3rd, 2018, 1:23 pm
by tagoma
This answer on SO?
 
with tf.device("/cpu:4"):
  # ...

with tf.device("/cpu:7"):
  # ...

with tf.device("/cpu:0"):
  # ...

config = tf.ConfigProto(device_count={"CPU": 8},
                        inter_op_parallelism_threads=1,
                        intra_op_parallelism_threads=1)
sess = tf.Session(config=config)
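
For what it's worth, here is a minimal end-to-end sketch along the same lines (TF 1.x API; the toy matmul and sizes are my own, not from the SO answer):

import tensorflow as tf

# Expose 8 logical CPU devices and serialise both thread pools, so that
# explicit device placement (not threading) decides where work runs.
config = tf.ConfigProto(device_count={"CPU": 8},
                        inter_op_parallelism_threads=1,
                        intra_op_parallelism_threads=1)

with tf.device("/cpu:4"):
    a = tf.random_normal([1000, 1000])
    b = tf.random_normal([1000, 1000])
    c = tf.matmul(a, b)  # this op is pinned to logical CPU device 4

with tf.Session(config=config) as sess:
    print(sess.run(tf.reduce_sum(c)))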

Re: If you are bored with Deep Networks

Posted: March 3rd, 2018, 6:39 pm
by Cuchulainn
import time
from concurrent.futures import ThreadPoolExecutor

def wait_on_b():
    time.sleep(5)
    print(b.result())  # b will never complete because it is waiting on a.
    return 5

def wait_on_a():
    time.sleep(5)
    print(a.result())  # a will never complete because it is waiting on b.
    return 6


executor = ThreadPoolExecutor(max_workers=2)
a = executor.submit(wait_on_b)
b = executor.submit(wait_on_a)
I don't use Python (yet), but the issues are language-independent. Threads scale badly, and anecdotal evidence suggests that speedup in TensorFlow is poor with > 32 processors. Too fine-grained.

Python supports asynchronous futures, which offer an alternative (C++11 also has futures). The scheduler determines what is pinned to what, not the application developer (sub-optimal).

https://docs.python.org/3/library/concu ... tures.html
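
For contrast, here is a minimal sketch (my own, not from that docs page) of futures used without the circular dependency: the tasks are independent, and the caller only blocks when collecting results.

import time
from concurrent.futures import ThreadPoolExecutor

def work(x):
    time.sleep(1)          # stand-in for a real computation
    return x * x

with ThreadPoolExecutor(max_workers=2) as executor:
    futures = [executor.submit(work, i) for i in range(4)]
    results = [f.result() for f in futures]   # blocks here; no circular waits

print(results)  # [0, 1, 4, 9]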

Re: If you are bored with Deep Networks

Posted: March 7th, 2018, 11:18 am
by mkoerner
Does TensorFlow support multi-processor CPUs, and can it be customised, e.g. if you don't have GPUs or Cloud?
What's the speedup for this application? What's the maximum number of cores before performance degrades? I suppose it uses a fine-grained thread model with locking and pre-emption?
As far as I know it supports multi-core CPUs on a number of levels: at a low level by dispatching calls to the MKL, and at a high level by running multiple sessions in parallel (see the FAQ, "Does the runtime parallelize parts of graph execution?"). It is similar in PyTorch, where at a high level the input data can be distributed along a dimension using DataParallel. This post on "Python Global Interpreter Lock (GIL) + numerical data processing" could also be informative if you are interested in Python.
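
For instance, a minimal sketch of the DataParallel route (my own toy model and sizes, not from the post): nn.DataParallel splits the input batch along dim 0 across the visible GPUs and gathers the outputs; without multiple GPUs it simply runs on one device.

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1))

if torch.cuda.is_available():
    model = model.cuda()
    if torch.cuda.device_count() > 1:
        # Replicate the model on each GPU; each replica gets a slice of the batch.
        model = nn.DataParallel(model)

x = torch.randn(256, 128)            # a batch of 256 inputs
if torch.cuda.is_available():
    x = x.cuda()

y = model(x)                         # outputs are gathered back onto one device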

Re: If you are bored with Deep Networks

Posted: March 27th, 2018, 9:51 am
by Cuchulainn
History of the tools

Logistic regression — 1958
Hidden Markov Model — 1960
Stochastic gradient descent — 1960
Support Vector Machine — 1963
k-nearest neighbors — 1967
Artificial Neural Networks — 1975
Expectation Maximization — 1977
Decision tree — 1986
Q-learning — 1989
Random forest — 1995

Re: If you are bored with Deep Networks

Posted: April 21st, 2018, 12:57 pm
by Cuchulainn
I find trying to grasp all these articles a bit painful. Lots of theory/theorems etc., but where is the example, i.e. one that you can check against?

Re: If you are bored with Deep Networks

Posted: April 21st, 2018, 4:37 pm
by Traden4Alpha
I find trying to grasp all these articles a bit painful. Lots of theory/theorems etc., but where is the example, i.e. one that you can check against?

This does not sound right to me. My (limited) experience of math has been the opposite. Examples are mostly for applied math.

I wonder what percentage of academic math papers have examples?

Re: If you are bored with Deep Networks

Posted: April 21st, 2018, 5:07 pm
by Cuchulainn
Maybe this example helps (sections 3 and 5):
https://en.wikipedia.org/wiki/Metric_space

'Concrete' can be at many 'levels', e.g. objects and types.
Just think about metrics for NNs, input representation and the mappings (hopefully bijective) between metric spaces, etc.
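
As a toy illustration (my own, not from the Wikipedia article), here is a numerical check of the metric-space axioms for the Euclidean metric on flattened NN inputs:

import numpy as np

def d(x, y):
    """Euclidean metric on input vectors."""
    return np.linalg.norm(x - y)

rng = np.random.default_rng(0)
x, y, z = (rng.standard_normal(784) for _ in range(3))   # e.g. flattened 28x28 inputs

assert d(x, x) == 0.0                        # d(x, x) = 0
assert np.isclose(d(x, y), d(y, x))          # symmetry
assert d(x, z) <= d(x, y) + d(y, z) + 1e-12  # triangle inequality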

Re: If you are bored with Deep Networks

Posted: June 6th, 2018, 12:20 pm
by Cuchulainn
Some open research issues (??):
https://pdfs.semanticscholar.org/a2cf/2 ... 9e43dc.pdf

and some guidelines ...

https://www.ucl.ac.uk/~ucfamus/papers/i ... ions17.pdf

It would be useful for the august panel to give their feedback.

Re: If you are bored with Deep Networks

Posted: June 6th, 2018, 9:42 pm
by ISayMoo
The 1st article is very old (from 2007). A lot of the stuff in it is out of date or applies to things people are no longer really interested in.

The 2nd article (guidelines) is a kind of "Captain Obvious speaking" paper. The author's right about everything (or maybe almost everything, but I didn't have the time to go nitpicking), but it's rather well-known stuff.

Re: If you are bored with Deep Networks

Posted: June 7th, 2018, 4:35 pm
by Cuchulainn
The 1st article is very old (from 2007). A lot of the stuff in it is out of date or applies to things people are no longer really interested in.

The 2nd article (guidelines) is a kind of "Captain Obvious speaking" paper. The author's right about everything (or maybe almost everything, but I didn't have the time to go nitpicking), but it's rather well-known stuff.
Fair enough. So things have progressed in leaps and bounds in the period 2007-2017?
2007 is not very old. BP (backpropagation) is at least 50 years old.

Re: If you are bored with Deep Networks

Posted: June 7th, 2018, 7:08 pm
by Cuchulainn
The recent spate of articles is great on description and narrative (the what) but falls short on explanation (the how), as Wittgenstein might say.

Reverse engineering is well-nigh impossible. It's all black box anno 2007, or has that been resolved, ISayMoo?

Does university education not teach how to write unambiguous algorithms?