SERVING THE QUANTITATIVE FINANCE COMMUNITY

 
User avatar
ISayMoo
Topic Author
Posts: 1889
Joined: September 30th, 2015, 8:30 pm

Re: If you are bored with Deep Networks

February 25th, 2019, 7:14 pm

It's Open Source. Google and DeepMind use it for their research. What more can I say?
 
User avatar
Cuchulainn
Posts: 59389
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

March 4th, 2019, 10:33 am

This piece is from "No Silver Bullet" (1986) by Fred Brooks 

Artificial intelligence. Many people expect advances in artificial intelligence to provide the revolutionary breakthrough that will give order-of-magnitude gains in software productivity and quality. [3] I do not. To see why, we must dissect what is meant by "artificial intelligence."
D.L. Parnas has clarified the terminological chaos: [4]
Two quite different definitions of AI are in common use today. AI-1: The use of computers to solve problems that previously could only be solved by applying human intelligence. AI-2: The use of a specific set of programming techniques known as heuristic or rule-based programming. In this approach human experts are studied to determine what heuristics or rules of thumb they use in solving problems.... The program is designed to solve a problem the way that humans seem to solve it.
The first definition has a sliding meaning.... Something can fit the definition of AI-1 today but, once we see how the program works and understand the problem, we will not think of it as AI any more.... Unfortunately I cannot identify a body of technology that is unique to this field.... Most of the work is problem-specific, and some abstraction or creativity is required to see how to transfer it.
I agree completely with this critique. The techniques used for speech recognition seem to have little in common with those used for image recognition, and both are different from those used in expert systems. I have a hard time seeing how image recognition, for example, will make any appreciable difference in programming practice. The same problem is true of speech recognition. The hard thing about building software is deciding what one wants to say, not saying it. No facilitation of expression can give more than marginal gains.
Expert-systems technology, AI-2, deserves a section of its own. 
 
User avatar
Cuchulainn
Posts: 59389
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

March 6th, 2019, 10:17 am

Anonymous Quote

I know GD, but I don't have working experience with it. I have some experience with the EM algorithm applied to various Markov models. Some of the points you mentioned I can relate to:
  • The initial guess must be close to the real solution (Analyse Numerique 101).
  • There is no guarantee that GD is applicable in the first place (it assumes the cost function is smooth).
  • It converges only to a local minimum.
  • The method is iterative, so there is no truly reliable quality of service (QoS).
  • It is not very robust.
Beyond those points, what I really dislike is the way many machine-learning people treat problems. In my experience, standard Gaussian or Gaussian-mixture models do okay with iterative optimization, despite the drawbacks above. But when the model becomes more complicated, if you take a close look at how they treat it, it is not rare to find approximations or quick-and-dirty fixes applied without saying so and without any mathematical reasoning behind them. Sometimes these quick-and-dirty tricks make no sense to me, or cannot be fully justified. I like numerical methods, but for this reason I dislike quite a few ML methods.
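The initial-guess and local-minimum points above can be illustrated with a minimal pure-Python sketch (the quartic below is a made-up example, not from the thread): two different starting points lead plain GD to two different local minima.

```python
def grad_descent(f_prime, x0, lr=0.01, steps=2000):
    """Plain gradient descent with a fixed learning rate."""
    x = x0
    for _ in range(steps):
        x -= lr * f_prime(x)
    return x

# f(x) = x^4 - 3x^2 + x has two local minima separated by a local maximum
fp = lambda x: 4 * x**3 - 6 * x + 1

left = grad_descent(fp, x0=-2.0)   # converges to the left basin
right = grad_descent(fp, x0=2.0)   # converges to the right basin

# Both are stationary points (gradient ~ 0), but they are different minima:
# which one you get depends entirely on the initial guess.
print(left, right)
```

This is exactly the "Analyse Numerique 101" point: nothing in the iteration itself tells you which basin you landed in, or whether a better minimum exists elsewhere.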

btw: I was once asked about ML by an interviewer, a computer-science guy. I told him my thoughts, and he was pissed off right away. And I just wanted to discuss it with him....
 
User avatar
katastrofa
Posts: 7937
Joined: August 16th, 2007, 5:36 am
Location: Alpha Centauri

Re: If you are bored with Deep Networks

March 7th, 2019, 7:08 am

I've seen many erroneous uses of standard methods too. Besides, they can't handle some classes of data-mining problems (especially pattern discovery and recognition). I'm under the impression that criticism like the above comes from people who've never worked on them.
 
User avatar
Cuchulainn
Posts: 59389
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

March 7th, 2019, 12:24 pm

 I'm under the impression that criticism like the above comes from people who've never worked on them.
Are you referring to the interviewer or the OP?
If the OP, then I think you might be missing the point. 95% of people haven't worked with these methods, and the OP is trying to separate the wheat from the chaff.
It would be better to say why it is wrong, or else say nothing.
 
User avatar
katastrofa
Posts: 7937
Joined: August 16th, 2007, 5:36 am
Location: Alpha Centauri

Re: If you are bored with Deep Networks

March 7th, 2019, 3:29 pm

OP (ISayMoo) definitely has experience in ML, he develops AI. It was just my general impression about the social-media ML critics, as I would call them.
 
User avatar
Cuchulainn
Posts: 59389
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

March 7th, 2019, 4:16 pm

OP (ISayMoo) definitely has experience in ML, he develops AI. It was just my general impression about the social-media ML critics, as I would call them.
Ah, I meant the Anon (not ISM) who wanted to understand ML but whose interviewer got annoyed...

We can converse as adults about AI. AI critics tend to know what they are talking about. You can call them anything you want, but that doesn't help.
Both @JohnLeM and I feel that NN-for-PDE is just a good old meshless method, and no one out there will confront us on this issue.

'AI is very, very stupid,' says Google's AI leader, at least compared to humans

Be aware of the limits of artificial intelligence, not just the hype.

The AI expectations are sky high.
 
User avatar
ISayMoo
Topic Author
Posts: 1889
Joined: September 30th, 2015, 8:30 pm

Re: If you are bored with Deep Networks

March 7th, 2019, 6:27 pm

I think you and JohnLeM should write a paper, a critical review of existing NN-for-PDE literature, and state your case. Send it to an ML conference or a journal. Post it on Twitter (seriously). Otherwise you're not being noticed, and private emails to individual researchers will be ignored because they're not a public challenge.
 
User avatar
Cuchulainn
Posts: 59389
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

March 7th, 2019, 8:16 pm

I think you and JohnLeM should write a paper, a critical review of existing NN-for-PDE literature, and state your case. Send it to an ML conference or a journal. Post it on Twitter (seriously). Otherwise you're not being noticed, and private emails to individual researchers will be ignored because they're not a public challenge.
I suggested this to John already. I'm not really trying to be noticed just yet. One thing: I won't use external libraries.

Twitter! Everyone would see it! 
 
User avatar
ISayMoo
Topic Author
Posts: 1889
Joined: September 30th, 2015, 8:30 pm

Re: If you are bored with Deep Networks

March 8th, 2019, 8:20 am

I hope you will write it. I would like to read it.
 
User avatar
Cuchulainn
Posts: 59389
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

March 19th, 2019, 12:32 pm

Doing a bit of browsing on this topic.
One thing I notice is that the vanishing- and exploding-gradient problems and their resolution seem to be an active research area.
A bit of a culture shock...

At one level we can see the LSTM as a fancy-pants name for a fix to the vanishing gradient problem (VGP). Is it primarily a numerical problem caused by finite precision? (Reminds me of the endless hullabaloo about negative probabilities.)
Constant Error Carousel unit... who makes these names up?
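A minimal sketch of where the VGP comes from (a toy scalar network, not any particular paper's setup): with sigmoid activations, each chain-rule factor is at most 0.25·|w|, so the backpropagated gradient shrinks geometrically with depth. The LSTM's "constant error carousel" is an additive cell state whose self-connection keeps that factor close to 1.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def backprop_gradient(depth, w=1.0, x=0.5):
    """Gradient of the output w.r.t. the input for a chain of
    scalar sigmoid layers h -> sigmoid(w * h), via the chain rule."""
    # Forward pass: record every activation
    acts = [x]
    for _ in range(depth):
        acts.append(sigmoid(w * acts[-1]))
    # Backward pass: multiply sigma'(w*h) * w at every layer.
    # sigma'(.) = s*(1-s) <= 0.25, so the product decays geometrically.
    g = 1.0
    for h in reversed(acts[:-1]):
        s = sigmoid(w * h)
        g *= s * (1.0 - s) * w
    return g

print(backprop_gradient(5))    # small but workable
print(backprop_gradient(50))   # vanishingly small
```

So the effect is structural (a product of contracting Jacobian factors), not merely a finite-precision artifact: the gradient decays in exact arithmetic too, floating point just makes the tail underflow sooner.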
 
User avatar
Cuchulainn
Posts: 59389
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

March 22nd, 2019, 4:12 pm

Is DL any good for this object? Is it a wheel?

[image]
 
User avatar
ISayMoo
Topic Author
Posts: 1889
Joined: September 30th, 2015, 8:30 pm

Re: If you are bored with Deep Networks

March 22nd, 2019, 8:22 pm

Reinventing the doughnut?
 
User avatar
Cuchulainn
Posts: 59389
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: If you are bored with Deep Networks

March 23rd, 2019, 8:30 pm

 
User avatar
ISayMoo
Topic Author
Posts: 1889
Joined: September 30th, 2015, 8:30 pm

Re: If you are bored with Deep Networks

April 2nd, 2019, 10:02 am

Anonymous Quote

I know GD, but I don't have working experience with it. I have some experience with the EM algorithm applied to various Markov models. Some of the points you mentioned I can relate to:
  • The initial guess must be close to the real solution (Analyse Numerique 101).
  • There is no guarantee that GD is applicable in the first place (it assumes the cost function is smooth).
  • It converges only to a local minimum.
  • The method is iterative, so there is no truly reliable quality of service (QoS).
  • It is not very robust.
GD (in its subgradient form, with a diminishing step size) converges on non-differentiable functions as long as they're convex. E.g. it will handle f(x) = |x| as well as f(x) = x^2.
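A sketch of both cases mentioned above (step-size choices are illustrative, not the only valid ones): for the smooth f(x) = x² a fixed step converges geometrically, while for f(x) = |x| a fixed step would oscillate around the kink, so the standard subgradient method uses a diminishing step 1/k.

```python
def subgrad_abs(x):
    # A subgradient of f(x) = |x|: sign(x) away from 0, 0 at the kink
    return (x > 0) - (x < 0)

# f(x) = x^2: plain GD with a fixed step, gradient f'(x) = 2x
x_sq = 5.0
for _ in range(100):
    x_sq -= 0.1 * 2 * x_sq          # contraction factor 0.8 per step

# f(x) = |x|: subgradient steps of size 1/k; the harmonic series
# diverges (so we reach the minimizer) while 1/k -> 0 (so the
# oscillation around the kink dies out)
x_abs = 5.0
for k in range(1, 20001):
    x_abs -= (1.0 / k) * subgrad_abs(x_abs)

print(x_sq, x_abs)   # both close to the minimizer at 0
```

The price of non-smoothness is the convergence rate: the x² iterate converges geometrically, whereas the |x| iterate only at the slow O(1/k)-style rate typical of subgradient methods.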