
The Final Frontier: negative probability in Machine Learning

Posted: April 9th, 2019, 6:22 pm
by Cuchulainn
Machine Learning, Uncertain Information, and the Inevitability of Negative `Probabilities'

http://videolectures.net/mlws04_lowe_mluii/

It was only a matter of time before someone came up with this brainwave.

 At its heart I want to challenge the assumption that probabilities have to be positive. I want to give several arguments, descriptive and formal, to indicate why the use of positive probabilities is an ideal which is both overly restrictive and unrealisable. Indeed I will argue that the use of non-positive `probabilities' is both inevitable and natural. 

Re: The Final Frontier: negative probability in Machine Learning

Posted: April 9th, 2019, 7:15 pm
by JohnLeM
Machine Learning, Uncertain Information, and the Inevitability of Negative `Probabilities'

http://videolectures.net/mlws04_lowe_mluii/

It was only a matter of time before someone came up with this brainwave.

 At its heart I want to challenge the assumption that probabilities have to be positive. I want to give several arguments, descriptive and formal, to indicate why the use of positive probabilities is an ideal which is both overly restrictive and unrealisable. Indeed I will argue that the use of non-positive `probabilities' is both inevitable and natural. 
"The core of the argument is that in modelling the universe through Machine Learning, we are obliged to make inferences based on finite and hence typically less-than-complete information."
We should definitely launch a finite-difference Skynet!

Re: The Final Frontier: negative probability in Machine Learning

Posted: April 9th, 2019, 7:43 pm
by Cuchulainn
[image]

Re: The Final Frontier: negative probability in Machine Learning

Posted: April 9th, 2019, 10:44 pm
by bearish
The negative probability argument video is 15 years old. Oddly coincidental with our own Collector making similar observations, and a certain quant group at Morgan Stanley trying to embed it in deep models of synthetic CDOs. Not necessarily with success. 

Re: The Final Frontier: negative probability in Machine Learning

Posted: April 9th, 2019, 11:31 pm
by Collector
"Issues in Artificial Intelligence, "the OOM theory suffers from the negative probability problem *NPP)


"Learned OOM may sometimes return negative probabilities for some events"

(OOM: "Observable Operator Models are generalization of HMM models")

Issues in Artificial Intelligence, Robotics and Machine Learning
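
Roughly why it happens: an OOM scores a sequence as a product of learned operator matrices applied to an initial state, and the learning procedure only enforces linear normalisation constraints, not positivity. A minimal numerical sketch with my own toy numbers (nothing to do with any published model):

```python
# Toy OOM: P(a1 a2 ... an) = sigma . tau_{an} ... tau_{a1} . w0
import numpy as np

sigma = np.array([1.0, 1.0])                     # evaluation functional
w0    = np.array([0.5, 0.5])                     # initial state, sigma.w0 = 1

tau = {                                          # tau["a"] + tau["b"] = identity,
    "a": np.array([[0.6,  0.5], [ 0.1, 0.6]]),   # so the scores of all length-n
    "b": np.array([[0.4, -0.5], [-0.1, 0.4]]),   # sequences still sum to 1
}

def oom_prob(seq):
    w = w0
    for symbol in seq:                           # apply operators left to right
        w = tau[symbol] @ w
    return float(sigma @ w)

for s in ["a", "b", "aa", "ab", "ba", "bb"]:
    print(s, round(oom_prob(s), 3))
# a 0.9, b 0.1, aa 0.77, ab 0.13, ba 0.13, bb -0.03  <- a negative "probability"
```

The length-2 scores still sum to 1; it's just that one of them comes out negative.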

I look forward to self-driving cars! They are so flexible!

Re: The Final Frontier: negative probability in Machine Learning

Posted: April 10th, 2019, 1:42 am
by JohnLeM
"Issues in Artificial Intelligence, "the OOM theory suffers from the negative probability problem *NPP)


"Learned OOM may sometimes return negative probabilities for some events"

(OOM: "Observable Operator Models are generalization of HMM models")

Issues in Artificial Intelligence, Robotics and Machine Learning

I look forward to self-driving cars! They are so flexible!
Well, this is quite a curious reaction. When I end up computing a negative probability, I usually think "OMG, I should fix this up". Cuchulainn is quite good at removing negative probabilities; I can share a few tricks too.

Re: The Final Frontier: negative probability in Machine Learning

Posted: April 10th, 2019, 9:45 am
by ISayMoo
Holy smoke! The POSITIVE probabilities are what should be used to describe incomplete information. If you have complete information, you don't need probability at all.

You can define an object to describe the difference between two probability distributions if you want, but don't call it "probability".

Re: The Final Frontier: negative probability in Machine Learning

Posted: April 10th, 2019, 10:04 am
by Cuchulainn
There's an H in Nicholson

Re: The Final Frontier: negative probability in Machine Learning

Posted: April 10th, 2019, 10:14 am
by Cuchulainn
Holy smoke! The POSITIVE probabilities are what should be used to describe incomplete information. If you have complete information, you don't need probability at all.

You can define an object to describe the difference between two probability distributions (Q. what's a good name?) if you want, but don't call it "probability".
Well said!

In the same way, the difference of two natural numbers is not a natural number but an integer (a very precise statement), i.e. an equivalence class of ordered pairs of natural numbers under an equivalence relation ~.
Feynman's incomplete article was a step in the right direction but many latched on to it as gospel. Give 'em an inch and they'll take a mile.

I would have no problem if someone defined an order relation on these "probability numbers". That would make things well-defined.
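
To spell out the analogy (standard textbook constructions, nothing exotic):

```latex
% Integers as equivalence classes of ordered pairs of naturals,
% with the order relation coming along for free:
(a,b) \sim (c,d) \iff a+d = b+c, \qquad
\mathbb{Z} := (\mathbb{N}\times\mathbb{N})/\!\sim, \qquad
[(a,b)] \le [(c,d)] \iff a+d \le c+b .

% The analogue for the "difference of two probability distributions": a signed
% measure of total mass zero, with its Jordan decomposition
\mu = P - Q = \mu^{+} - \mu^{-}, \qquad \mu(\Omega) = 0 .
```

So the so-called negative probabilities live in the space of signed measures, the same way negative numbers live in the integers rather than in the naturals.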

Re: The Final Frontier: negative probability in Machine Learning

Posted: April 10th, 2019, 4:52 pm
by JohnLeM
There's an H in Nicholson
I admit, and confess, that I have had a picture of Phyllis Nicolson in my bedroom for the last two days!

Re: The Final Frontier: negative probability in Machine Learning

Posted: April 11th, 2019, 5:34 am
by JohnLeM
You can define an object to describe the difference between two probability distributions (Q. what's a good name?) if you want, but don't call it "probability".
Deep Wasserstein Distance?

Re: The Final Frontier: negative probability in Machine Learning

Posted: April 13th, 2019, 12:15 pm
by ISayMoo
There's loads of them.

Re: The Final Frontier: negative probability in Machine Learning

Posted: June 20th, 2019, 2:33 pm
by ikicker
My personal opinion: I would recompute the probability so it's not negative. Problem solved. You probably need to do something simple with a ratio to adjust P(not) upwards. Create an error-handling function.
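
Something like this rough sketch, say (the clip-and-renormalise approach and the names are my own, just one way to write that error-handling function):

```python
# Rough sketch of the error-handling idea: clip negative entries to zero and
# rescale the rest with a ratio so everything sums to one again.
import numpy as np

def fix_negative_probs(p, tol=1e-12):
    """Return a valid probability vector close to p; raise if p is hopeless."""
    p = np.asarray(p, dtype=float)
    clipped = np.clip(p, 0.0, None)      # floor negative entries at zero
    total = clipped.sum()
    if total <= tol:
        raise ValueError("no positive mass left -- the model, not the vector, is broken")
    return clipped / total               # renormalise the remaining mass

print(fix_negative_probs([0.7, 0.4, -0.1]))   # -> [0.636..., 0.363..., 0.]
```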

You're probably looking at something other than a probability, or using an incorrect distribution. Something went wrong. Someone once told me that they had computed a correlation higher than 1. I don't doubt it happened; I just take it as a sign that something unanticipated went wrong in the math and needs to be fixed.

We have a bigger problem with AI, which is probability distributions that look like waves (multi-modal distributions). You have a missing categorical variable, but until you get it you need to either split the data or do something with it. Also, multi-modal distributions overlap, so they are difficult to bifurcate. Do you perform a custom transformation? Do you run k-means and try to impute the category? It's a complete pain in the ass.
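
And for the wave-shaped case, a quick sketch of the k-means route on toy data, with two clusters assumed (just an illustration):

```python
# Quick sketch of "run k-means and impute the category": split an apparently
# bimodal sample into two groups and summarise each one separately.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Toy bimodal data: two Gaussians standing in for a missing categorical variable.
x = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(3.0, 1.0, 500)])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(x.reshape(-1, 1))
labels = km.labels_                      # the imputed category

for k in range(2):
    group = x[labels == k]
    print(f"cluster {k}: n={group.size}, mean={group.mean():.2f}, std={group.std():.2f}")
# Where the modes overlap, a mixture model (e.g. sklearn's GaussianMixture)
# gives soft assignments instead of a hard split.
```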

Re: The Final Frontier: negative probability in Machine Learning

Posted: June 20th, 2019, 6:11 pm
by Cuchulainn
NP is a thorny issue and it depends who you talk to:

engineers: don't believe in it, but APL supports it
physicists: believe religiously in it; just change reality if necessary
mathematicians: don't care
ML: of course

Amen

Re: The Final Frontier: negative probability in Machine Learning

Posted: June 20th, 2019, 6:18 pm
by Collector
NP is a thorny issue and it depends who you talk to:

engineers: don't believe in it, but APL supports it
physicists: believe religiously in it; just change reality if necessary
mathematicians: don't care
ML: of course

Amen
That's why you also need AI.
AI can be used to detect fake news! And fake probabilities?

AImen