Another example, maybe more familiar, comes to mind: making pancakes. As you've probably been told, you should mix your pancake batter only a little, just enough to break up the big lumps of dry flour, and then set it aside for a few minutes to let the smaller lumps hydrate. You could mix it more and get smaller lumps, but that would be bad because it would also make your pancakes come out tougher. So mixing is a process where a little is a good thing, but one that eventually converges to the wrong state, which is why you should stop early.
@11011110 yeah cooking/baking are interesting examples here because they’re usually about the relationship between physics and chemistry: you think you’re doing something with a monotonic/continuous effect until suddenly it changes discretely into a new state/phase, and you’re usually aiming for something prior to the fixed point
@o Somehow that phrase reminds me of an old generative artwork I remember seeing at an exhibit curated by Marshall Bern at Xerox PARC. The artist (whose name I unfortunately do not remember) modified a Xerox copier to replace a few random pixels of any copy with pixels from an image of Zippy the Pinhead, so that if copied enough times, eventually all documents would converge to Zippy.
@11011110
Diminishing returns?
@dougmerritt Negative returns? The point is less about a small likelihood of additional improvement (although that happens too) and more about the idea that too many iterations start making things worse instead of better, and eventually a lot worse.
@11011110
This is a tough one.
There's a quote, rather than a set phrase: "too much of a good thing", from As You Like It.
An asymptotic expansion (https://en.wikipedia.org/wiki/Asymptotic_expansion) is a mathematical series that diverges, but that still provides a useful approximation to something if you truncate it early enough. Maybe the most famous is the Stirling series, the infinite series approximation to the factorial that, when truncated, gives Stirling's approximation (https://en.wikipedia.org/wiki/Stirling%27s_approximation). The full series is divergent for any fixed argument of the factorial, but it takes a lot of terms to diverge and gets very accurate before it does.
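As a quick numerical sketch of what "gets very accurate before it diverges" means, here's some Python (my own illustration, using only the standard form of the series) that compares truncations of the Stirling series for ln(n!) against math.lgamma, with the well-known Bernoulli numbers B_2 through B_20 supplying the correction terms. For a small argument like n = 2 the error shrinks for the first half-dozen correction terms and then starts growing again.

```python
import math

# Bernoulli numbers B_2, B_4, ..., B_20 (standard values), which appear in
# the correction terms of the Stirling series for ln(n!).
BERNOULLI = [1/6, -1/30, 1/42, -1/30, 5/66,
             -691/2730, 7/6, -3617/510, 43867/798, -174611/330]

def stirling_ln_factorial(n, num_terms):
    """Stirling series for ln(n!), truncated after num_terms correction terms."""
    total = n * math.log(n) - n + 0.5 * math.log(2 * math.pi * n)
    for k in range(1, num_terms + 1):
        total += BERNOULLI[k - 1] / (2 * k * (2 * k - 1) * n ** (2 * k - 1))
    return total

n = 2
exact = math.lgamma(n + 1)  # ln(n!) computed directly for comparison
for num_terms in range(len(BERNOULLI) + 1):
    err = abs(stirling_ln_factorial(n, num_terms) - exact)
    print(f"{num_terms:2d} correction terms: error {err:.2e}")
```

For n = 2 the error bottoms out after roughly six correction terms (about πn of them, where the terms themselves are smallest) and then creeps back up; larger arguments just push the turnaround further out.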
I'm convinced there are non-mathematical situations that have analogous behavior. One I frequently encounter involves the bots set loose on Wikipedia to clean up the citations to the literature in the references of Wikipedia articles. Usually, a bot-cleaned citation is better than a hand-written citation, so it is useful to have a bot clean the citations. Once or twice. But the bots tend to run over the same citations many times, occasionally introducing minor errors, like adding a DOI that turns out to point to something related to the cited work rather than to the work itself. After that point, repeated passes of the bot can amplify the mistake until the citation eventually becomes so garbled that one cannot tell what it was originally intended to refer to.
Is there a name for this real-world phenomenon?