In ML, "learning" refers to "changing parameters in order to better achieve some goal"*), e.g. finding {a, b} in y = ax + b that minimize the least-squares error over a set of {x, y} pairs.
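To make that concrete, here is a minimal sketch of the y = ax + b fit (the data points are made up for illustration; the closed-form formulas a = cov(x, y)/var(x), b = mean(y) - a*mean(x) are the standard least-squares solution):

```python
# Hypothetical data generated by y = 2x + 1 (labels are known -> supervised).
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
# Closed-form least-squares: a = cov(x, y) / var(x), b = mean_y - a * mean_x
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x
print(a, b)  # -> 2.0 1.0
```

"Learning" here is exactly the step that changes the parameters a and b to reduce the error.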
What you describe is memory. A very popular branch of NNs is recurrent neural networks, which process an ordered sequence of inputs and update an internal memory state, which in turn affects the output. This is not called learning, however, if the network doesn't change the parameters that determine how it updates the memory based on new inputs. A self-driving car might see a car overtaking and update a memory state representing that as it happens, but it is not learning to achieve some goal better (e.g. to detect overtaking cars): given the same frames, it will perform the exact same memory updates.
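A toy sketch of that distinction (the update rule and its weights are invented for illustration, not from any real model):

```python
# A recurrent "memory" update with FIXED parameters: the state changes
# with each input, but nothing is being learned.
def update_memory(memory, frame, w_mem=0.9, w_in=0.1):
    # w_mem and w_in never change, so this remembers but does not learn.
    return w_mem * memory + w_in * frame

frames = [0.0, 1.0, 1.0, 0.0]  # e.g. an "overtaking car visible" signal

m1 = 0.0
for f in frames:
    m1 = update_memory(m1, f)

m2 = 0.0
for f in frames:
    m2 = update_memory(m2, f)

assert m1 == m2  # same frames -> the exact same memory trajectory
```

Learning would mean adjusting w_mem and w_in (e.g. by gradient descent on a detection loss), so that replaying the same frames later would produce *different*, better memory updates.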
*) This is supervised learning; there is also unsupervised learning, like finding clusters in data.
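For completeness, a toy example of the unsupervised case (1-D k-means with made-up points and arbitrary initial centers): here learning still means changing parameters (the cluster centers) to better achieve a goal (tight clusters), just without labels.

```python
# Toy 1-D k-means: no labels, the "parameters" are the cluster centers.
points = [1.0, 1.2, 0.8, 10.0, 10.3, 9.7]
centers = [0.0, 5.0]  # arbitrary initial guesses

for _ in range(10):
    groups = ([], [])
    for p in points:
        # Assign each point to its nearest center (True indexes group 1).
        groups[abs(p - centers[1]) < abs(p - centers[0])].append(p)
    # Move each center to the mean of its assigned points.
    centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]

print(centers)  # converges near [1.0, 10.0]
```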