
 
User avatar
Cuchulainn
Topic Author
Posts: 63263
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: Machine Learning: Frequently asked Questions

September 18th, 2020, 6:53 pm

Did Jonathan Swift invent GPT-3?

“... Every one knew how laborious the usual method is of attaining to arts and sciences; whereas, by his contrivance, the most ignorant person, at a reasonable charge, and with a little bodily labour, might write books in philosophy, poetry, politics, laws, mathematics, and theology, without the least assistance from genius or study.” 
Chips chips chips Du du du du du Ci bum ci bum bum Du du du du du Ci bum ci bum bum Du du du du du
http://www.datasimfinancial.com
http://www.datasim.nl
 
User avatar
Cuchulainn
Topic Author
Posts: 63263
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: Machine Learning: Frequently asked Questions

September 18th, 2020, 7:00 pm

A.I. can’t solve this: The coronavirus could be highlighting just how overhyped the industry is

https://www.cnbc.com/2020/04/29/ai-has-limited-role-coronavirus-pandemic.html?__source=sharebar|linkedin&par=sharebar

“It’s fascinating how quiet it is,” said Neil Lawrence, the former director of machine learning at Amazon Cambridge.

“This (pandemic) is showing what bulls--t most AI hype is. It’s great and it will be useful one day but it’s not surprising in a pandemic that we fall back on tried and tested techniques.”
Those techniques include good, old-fashioned statistical techniques and mathematical models. The latter is used to create epidemiological models, which predict how a disease will spread through a population. Right now, these are far more useful than fields of AI like reinforcement learning and natural-language processing.

// In fairness, it is not a law of gravity that AI should be good at everything. Maybe stick to statistics?
What is truly fascinating is that Neil Lawrence needed the Covid crisis to make this observation.
He had no other choice.
You also don't hear about ML(Heston) being [$]10^4[$] times faster than FDM(Heston) anymore.
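For the record, the "tried and tested" epidemiological models the article refers to are typically compartmental ODE models. A minimal SIR sketch in Python - assuming SciPy is available, with purely illustrative parameter values - would be:
[code]
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    S, I, R = y
    dS = -beta * S * I              # new infections leave S...
    dI = beta * S * I - gamma * I   # ...enter I, and recover at rate gamma
    dR = gamma * I
    return dS, dI, dR

t = np.linspace(0, 160, 161)                      # days
y0 = (0.999, 0.001, 0.0)                          # population fractions S, I, R
S, I, R = odeint(sir, y0, t, args=(0.3, 0.1)).T   # illustrative beta, gamma
print(f"Peak infected fraction: {I.max():.2%}")
[/code]
Three coupled ODEs and two parameters - nothing "neural" required.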
Chips chips chips Du du du du du Ci bum ci bum bum Du du du du du Ci bum ci bum bum Du du du du du
http://www.datasimfinancial.com
http://www.datasim.nl
 
User avatar
JohnLeM
Posts: 464
Joined: September 16th, 2008, 7:15 pm

Re: Machine Learning: Frequently asked Questions

September 20th, 2020, 10:23 pm

A.I. can’t solve this: The coronavirus could be highlighting just how overhyped the industry is

https://www.cnbc.com/2020/04/29/ai-has-limited-role-coronavirus-pandemic.html?__source=sharebar|linkedin&par=sharebar

“It’s fascinating how quiet it is,” said Neil Lawrence, the former director of machine learning at Amazon Cambridge.

“This (pandemic) is showing what bulls--t most AI hype is. It’s great and it will be useful one day but it’s not surprising in a pandemic that we fall back on tried and tested techniques.”
Those techniques include good, old-fashioned statistical techniques and mathematical models. The latter is used to create epidemiological models, which predict how a disease will spread through a population. Right now, these are far more useful than fields of AI like reinforcement learning and natural-language processing.

// In fairness, it is not a law of gravity that AI should be good at everything. Maybe stick to statistics?
What is truly fascinating is that Neil Lawrence needed the Covid crisis to make this observation.
He had no other choice.
You also don't hear about ML(Heston) being [$]10^4[$] times faster than FDM(Heston) anymore.
hi hi...:)
I can't wait for the next hype! Any bets? I bet on quantum computing with conventional machines!
 
User avatar
katastrofa
Posts: 9665
Joined: August 16th, 2007, 5:36 am
Location: Alpha Centauri

Re: Machine Learning: Frequently asked Questions

September 21st, 2020, 9:43 am

Dude, it's not a joke. It's 20 years since D-Wave - high time to deliver the technology to our desks. I'm quite orgasmic about the prospect of coding it.
 
User avatar
JohnLeM
Posts: 464
Joined: September 16th, 2008, 7:15 pm

Re: Machine Learning: Frequently asked Questions

September 23rd, 2020, 9:27 pm

Dude, it's not a joke. It's 20 years since D-Wave - high time to deliver the technology to our desks. I'm quite orgasmic about the prospect of coding it.
Well, if we are speaking about a quantum approach on conventional computers, I am afraid this will end up like artificial intelligence: rediscovering the wheel and relabeling all of mathematics with "quantum", because it is much trendier than "spectral analysis". Note that this will bring some fresh air; the prefix "neural" was starting to become obsolete.
What is annoying is that I will have to relabel all my PowerPoint slides accordingly :/
 
User avatar
katastrofa
Posts: 9665
Joined: August 16th, 2007, 5:36 am
Location: Alpha Centauri

Re: Machine Learning: Frequently asked Questions

September 23rd, 2020, 11:00 pm

Ta-dah!

I particularly like this section - it's almost as muddy as my posts on Wilmott:

"The brain is definitely a macroscopic physical system operating on the scales (of time, space, temperature) which differ crucially from the corresponding quantum scales. (The macroscopic quantum physical phenomena such as e.g. the Bose-Einstein condensate are also characterized by the special conditions which are definitely not fulfilled in the brain.) In particular, the brain is simply too hot to be able perform the real quantum information processing, i.e., to use the quantum carriers of information such as photons, ions, electrons. As is commonly accepted in brain science, the basic unit of information processing is a neuron. It is clear that a neuron cannot be in the superposition of two states: firing and non-firing. Hence, it cannot produce superposition playing the basic role in the quantum information processing. Superpositions of mental states are created by complex neural networks of neurons (and these are classical neural networks). Quantum cognition community states that the activity of such neural networks can produce effects which are formally described as interference (of probabilities) and entanglement."

That's obviously gobbledegook, but I often read that temperature is a problem for quantum devices/systems - that thermal fluctuations destroy potential quantum effects by default. Especially in the case of a brain - which is not a bunch of loose atoms, but a large network of cells stabilised by physical connections and interactions - that argument doesn't seem to hold. The brain must be robust to thermal fluctuations at finite temperatures, otherwise it would fall apart. Well, unless you fry it.
 
User avatar
Cuchulainn
Topic Author
Posts: 63263
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: Machine Learning: Frequently asked Questions

September 24th, 2020, 12:24 pm

Yeah, alchemistic waffle.
Chips chips chips Du du du du du Ci bum ci bum bum Du du du du du Ci bum ci bum bum Du du du du du
http://www.datasimfinancial.com
http://www.datasim.nl
 
User avatar
JohnLeM
Posts: 464
Joined: September 16th, 2008, 7:15 pm

Re: Machine Learning: Frequently asked Questions

September 24th, 2020, 12:47 pm

Ta-dah!

I particularly like this section - it's almost as muddy as my posts on Wilmott:

"The brain is definitely a macroscopic physical system operating on the scales (of time, space, temperature) which differ crucially from the corresponding quantum scales. (The macroscopic quantum physical phenomena such as e.g. the Bose-Einstein condensate are also characterized by the special conditions which are definitely not fulfilled in the brain.) In particular, the brain is simply too hot to be able perform the real quantum information processing, i.e., to use the quantum carriers of information such as photons, ions, electrons. As is commonly accepted in brain science, the basic unit of information processing is a neuron. It is clear that a neuron cannot be in the superposition of two states: firing and non-firing. Hence, it cannot produce superposition playing the basic role in the quantum information processing. Superpositions of mental states are created by complex neural networks of neurons (and these are classical neural networks). Quantum cognition community states that the activity of such neural networks can produce effects which are formally described as interference (of probabilities) and entanglement."
Ta-dah! -- OK, I declare myself vanquished, you have humiliated me!
More seriously, this quantum AI gobbledegook seems to say that:
1) It is spectral analysis: we are already using quantum algorithms, just as we were already using neural networks before they were trendy. Nothing new under the sun here.
2) However, since we are already using quantum algorithms, we are already ready for the "quantum revolution"! I mean, the real one, with real quantum computers, which should arrive soon, in a galaxy far from here.
 
User avatar
Cuchulainn
Topic Author
Posts: 63263
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: Machine Learning: Frequently asked Questions

September 24th, 2020, 1:12 pm

Are humans ready for the transition?

www.youtube.com/watch?v=NzlG28B-R8Y
Chips chips chips Du du du du du Ci bum ci bum bum Du du du du du Ci bum ci bum bum Du du du du du
http://www.datasimfinancial.com
http://www.datasim.nl
 
User avatar
katastrofa
Posts: 9665
Joined: August 16th, 2007, 5:36 am
Location: Alpha Centauri

Re: Machine Learning: Frequently asked Questions

September 25th, 2020, 11:37 pm

To the other side you mean?
[Image attachment: MobileNetV2_photographer.jpg]
A little background: I was hoping to use one of those large Google models for real-time video image recognition after I saw a very impressive demo, but the results I got range from hair-raising to jaw-dropping, e.g. seeing too much (like above - remember the journalist with a telephoto lens shot by a drone in Baghdad due to human error, an error the AI was supposed to prevent) or not enough (below - a tank and a guy wielding two rifles? Who cares!)
[Image attachment: ResNet100V2_riffle.jpg]
In the figures: the models used and top 3 predictions - from the highest to the lowest probabilities.
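A sketch of this kind of top-3 classification - assuming Keras's pretrained MobileNetV2 and a hypothetical local file photographer.jpg, not necessarily the exact setup used for the figures:
[code]
import numpy as np
from tensorflow.keras.applications.mobilenet_v2 import (
    MobileNetV2, preprocess_input, decode_predictions)
from tensorflow.keras.preprocessing import image

model = MobileNetV2(weights="imagenet")

img = image.load_img("photographer.jpg", target_size=(224, 224))  # hypothetical file
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

preds = model.predict(x)
# decode_predictions maps the 1000-way softmax back to ImageNet class names
for _, label, prob in decode_predictions(preds, top=3)[0]:
    print(f"{label}: {prob:.1%}")
[/code]
Swapping in ResNet50V2 or InceptionV3 only changes the import (and the input size, for Inception).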
Last edited by katastrofa on September 26th, 2020, 12:51 am, edited 2 times in total.
 
User avatar
katastrofa
Posts: 9665
Joined: August 16th, 2007, 5:36 am
Location: Alpha Centauri

Re: Machine Learning: Frequently asked Questions

September 25th, 2020, 11:52 pm

I thought the models were better at recognising pets, but my Leela was everything short of a cat:
[Image attachment: InceptionV3_IMG_6094.JPG]
[Image attachment: ResNet50V2_IMG_6092.JPG]
[Image attachment: InceptionV3_IMG_6090.JPG]
In other photos or according to other models she was also a capuchin, a black-footed ferret and a titi (a small monkey).

I'm starting to wonder if all the spectacular demos of image recognition models aren't fakes... Maybe it's within the ethical principles of this science nouveau.
Last edited by katastrofa on September 26th, 2020, 5:39 am, edited 5 times in total.
 
User avatar
katastrofa
Posts: 9665
Joined: August 16th, 2007, 5:36 am
Location: Alpha Centauri

Re: Machine Learning: Frequently asked Questions

September 25th, 2020, 11:55 pm

BTW, the heatmap is a sensitivity analysis of the output (i.e. the probability distribution over the classes) with respect to the activations of the last convolutional layer. In simple words, it shows where the model sees those strange things - indeed, in my cat!
Strangely, lots of objects are classified as a dishwasher. The classifications are hardly ever correct - in most cases they are nonsense.
I feel kind of sorry for all those people spending their lives tweaking those models to beat some benchmarks. Some of them might have had a chance to be real scientists and do something valuable. Not to mention the wasted computational resources.
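For the curious, a heatmap of this kind - the gradient of the winning class score with respect to the last convolutional activations, averaged per channel, in the style of Grad-CAM - can be sketched as follows, assuming Keras's MobileNetV2, whose final convolutional layer is named "Conv_1":
[code]
import tensorflow as tf

model = tf.keras.applications.MobileNetV2(weights="imagenet")
# A model returning both the last conv activations and the class probabilities
grad_model = tf.keras.Model(
    model.inputs, [model.get_layer("Conv_1").output, model.output])

def class_heatmap(x):  # x: preprocessed image batch of shape (1, 224, 224, 3)
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(x)
        top_score = preds[0, int(tf.argmax(preds[0]))]   # winning class score
    grads = tape.gradient(top_score, conv_out)           # sensitivity to activations
    weights = tf.reduce_mean(grads, axis=(0, 1, 2))      # one weight per channel
    cam = tf.nn.relu(tf.reduce_sum(conv_out[0] * weights, axis=-1))
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()   # normalised spatial map
[/code]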
 
User avatar
JohnLeM
Posts: 464
Joined: September 16th, 2008, 7:15 pm

Re: Machine Learning: Frequently asked Questions

October 6th, 2020, 7:57 pm

BTW, the heatmap is a sensitivity analysis of the output (i.e. the probability distribution over the classes) with respect to the activations of the last convolutional layer. In simple words, it shows where the model sees those strange things - indeed, in my cat!
Strangely, lots of objects are classified as a dishwasher. The classifications are hardly ever correct - in most cases they are nonsense.
I feel kind of sorry for all those people spending their lives tweaking those models to beat some benchmarks. Some of them might have had a chance to be real scientists and do something valuable. Not to mention the wasted computational resources.
@Katastrofa, first of all congratulations, your cat is very cute.
Concerning classification algorithms, my experience is that it can be quite difficult to build efficient ones, particularly if you wish to classify among a large number of categories. Moreover, to be honest, classification algorithms based on neural networks might not be very efficient at that task.

I am not that pessimistic about those people; after all, this is a way to make advances in science. But I agree with you somewhat that they are being a little manipulated. They are used as a free army for the maintenance and evolution of tools and libraries that even their designers could barely justify with a clear mathematical theory.
 
User avatar
Cuchulainn
Topic Author
Posts: 63263
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

Re: Machine Learning: Frequently asked Questions

October 6th, 2020, 8:44 pm

Who needs maths anyway; I remember this infamous post here:

My deeper point was that Cauchy sequences may be an approximation for some things in the real world (as long one does not look too far ahead in time or too closely at tiny epsilons) but Cauchy sequences don't actually occur in the real world. The pure proven properties of Cauchy sequences are only in the minds of mathematicians. Engineers and physicists might use Cauchy sequences as an approximation but must always be aware that they do not actually exist in the physical world.
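For reference, the property being argued about: a sequence [$](x_n)[$] is Cauchy if for every [$]\varepsilon > 0[$] there is an [$]N[$] such that [$]|x_n - x_m| < \varepsilon[$] for all [$]m, n \geq N[$]. It is the completeness of [$]\mathbb{R}[$] - a property of the mathematical model, not of any physical measurement - that guarantees such a sequence actually converges.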
Chips chips chips Du du du du du Ci bum ci bum bum Du du du du du Ci bum ci bum bum Du du du du du
http://www.datasimfinancial.com
http://www.datasim.nl
 
User avatar
JohnLeM
Posts: 464
Joined: September 16th, 2008, 7:15 pm

Re: Machine Learning: Frequently asked Questions

October 6th, 2020, 8:56 pm

Who needs maths anyway; I remember this infamous post here:

My deeper point was that Cauchy sequences may be an approximation for some things in the real world (as long one does not look too far ahead in time or too closely at tiny epsilons) but Cauchy sequences don't actually occur in the real world. The pure proven properties of Cauchy sequences are only in the minds of mathematicians. Engineers and physicists might use Cauchy sequences as an approximation but must always be aware that they do not actually exist in the physical world.
I prefer not to know who wrote this post :)
No need for math today? Shouldn't we reply with just a kids' song?
"Little pig, little pig, let me come in."
"No, no, by the hair on my chinny chin chin."
"Then I'll huff, and I'll puff, and I'll blow your house in."