Pretty fucking cool that we're postulating dreams are not like machine learning algorithms: instead of primarily serving to adapt to what's experienced during waking consciousness, they may work to keep lower-level, instinctual subconscious brain functions from over-controlling our perception of reality.
The exact opposite happens with LLMs: they treat their hallucinations, derived from opaque training data, as truthful regardless of where that data was pulled from.