They do, that's called a recurrent model.
And recurrence is critical for a model to have "memory" of past inputs.
It was one of the key advances a while back in sequence processing for predictive systems, e.g. speech recognition.
Recurrence is pretty standard now in sequence models. Plain feedforward (linear) networks are your most basic ones, mostly used to demonstrate the 101 concepts of ML; they don't have a tonne of practical IRL uses aside from some very basic image processing, filter functions, and the like.
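To make the "memory" part concrete, here's a minimal sketch of a recurrent cell in NumPy (the weight sizes and names are just made up for illustration): the hidden state `h` is fed back in at every step, so the output at each step depends on everything seen so far, not just the current input.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 3-dim inputs, 4-dim hidden state.
W_x = rng.normal(size=(4, 3))  # input -> hidden weights
W_h = rng.normal(size=(4, 4))  # hidden -> hidden weights (the recurrent part)

def rnn_step(h, x):
    # New state depends on BOTH the current input and the previous state.
    # This feedback loop is what gives the model "memory".
    return np.tanh(W_x @ x + W_h @ h)

h = np.zeros(4)  # start with an empty memory
for x in [rng.normal(size=3) for _ in range(5)]:
    h = rnn_step(h, x)  # h now summarizes the whole sequence so far
```

A plain feedforward network has no `W_h` term, so each input is processed in isolation; that's the difference in a nutshell.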