Suppose we have a graph of noisy real-world data such as market tick data of midprices of a given instrument.

The untrained human eye can readily spot trend reversals (these may be random noise, but they are trend reversals nevertheless). For example, an uneducated observer can say: "Hmmm, the prices kept increasing until 14:16, but then they suddenly started decreasing." I want to know how to algorithmically locate the time of a trend reversal (14:16 in this case) in a way that matches ordinary common sense and human judgment.

Of course, this is a difficult problem, but I'd love to have a gateway into industry approaches.

Obvious naive solutions don't work. For example, looking at local minima/maxima will flag a reversal at every point of a sequence like 100 101 100 101 100 101 etc.
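To make the failure concrete, here is a minimal sketch of that naive approach (the function name `naive_reversals` is just for illustration): flag every strict local minimum or maximum as a reversal. On the oscillating sequence above it fires at every interior tick.

```python
def naive_reversals(prices):
    """Return indices that are strict local minima or maxima."""
    flagged = []
    for i in range(1, len(prices) - 1):
        prev, cur, nxt = prices[i - 1], prices[i], prices[i + 1]
        # a strict local max or strict local min counts as a "reversal"
        if (cur > prev and cur > nxt) or (cur < prev and cur < nxt):
            flagged.append(i)
    return flagged

prices = [100, 101, 100, 101, 100, 101]
print(naive_reversals(prices))  # every interior index: [1, 2, 3, 4]
```

Every tick of pure noise is declared a reversal, which is exactly the opposite of what human judgment does.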

Obviously, the problem is not completely well-defined, so there is no unique solution. Signal processing and linear regression are obvious tools. Maybe the best help is just a phrase that will lead to the correct Google search.
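For what it's worth, one way I can imagine operationalizing the linear-regression idea (a sketch under my own assumptions, not a known industry method; `best_breakpoint` is a hypothetical helper): fit two least-squares lines split at every candidate time and pick the split that minimizes the total squared residual.

```python
import numpy as np

def best_breakpoint(y):
    """Return the index splitting y into two segments whose separate
    least-squares line fits have the smallest total squared residual."""
    y = np.asarray(y, dtype=float)
    x = np.arange(len(y))

    def sse(xs, ys):
        # residual sum of squares of a degree-1 least-squares fit;
        # polyfit returns an empty residual array for an exact 2-point fit
        _, res, *_ = np.polyfit(xs, ys, 1, full=True)
        return res[0] if res.size else 0.0

    best_k, best_cost = None, np.inf
    for k in range(2, len(y) - 1):  # at least two points per segment
        cost = sse(x[:k], y[:k]) + sse(x[k:], y[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# rising for 10 ticks, then falling: common sense puts the reversal at t = 10
y = list(range(10)) + list(range(9, -1, -1))
print(best_breakpoint(y))  # -> 10
```

On noisy data this at least returns one defensible reversal time instead of flagging every tick, though it assumes exactly one reversal in the window.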

Thank you,

CommodityQuant