Entropy and irreversibility are two fundamental concepts underlying physical processes, and they now also lie at the core of the digital processes used to simulate and evolve artificial models. We will touch on some of these aspects, taking as our starting point the well-defined and well-explored mathematical setting of infinite sequence spaces over finite alphabets. As we will see, universal estimators of the entropy and cross entropy of typical signals, based on the asymptotics of recurrence and waiting times, play an important role in information theory, and we will discuss some mathematical results. Building on their construction, we will introduce and discuss universal estimators of the entropy production of typical signals in the context of the nonequilibrium statistical mechanics of one-sided shifts over finite alphabets. Finally, we will discuss some applications to DNA sequences, in particular toward understanding the so-called Second Chargaff Rule, a still-unexplained family of empirical symmetries present in "most" genetic sequences. We will conclude with some remarks showing how concepts proper to statistical mechanics and thermodynamics, such as entropy and irreversibility, have become fundamental even in the most recent synthetic image generation models.
In collaboration with Giampaolo Cristadoro, Vojkan Jakšić, and Renaud Raquépas.
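As a rough illustration of the recurrence-based estimators mentioned in the abstract, the sketch below implements the classical Ornstein–Weiss return-time estimate of the entropy rate: for an ergodic process, (1/n) log₂ Rₙ converges almost surely to the entropy (in bits per symbol), where Rₙ is the first time the initial n-block reappears. This is a minimal didactic sketch, not the construction developed in the talk; the function names and the fair-coin example are my own.

```python
import math
import random

def first_return_time(seq, n):
    """First time t >= 1 at which the initial n-block of seq reappears.

    Returns None if the block does not recur within the observed window.
    """
    prefix = seq[:n]
    for t in range(1, len(seq) - n + 1):
        if seq[t:t + n] == prefix:
            return t
    return None

def entropy_estimate(seq, n):
    """Ornstein-Weiss estimate (log2 R_n) / n of the entropy rate."""
    r = first_return_time(seq, n)
    return math.log2(r) / n if r is not None else None

# Example: an i.i.d. fair-coin sequence has entropy rate 1 bit/symbol,
# so the estimate should be close to 1 for moderately large n.
random.seed(0)
seq = ''.join(random.choice('01') for _ in range(200000))
print(entropy_estimate(seq, 12))
```

Replacing the initial block of the sequence itself with a block drawn from a second sequence turns the return time into a waiting time, whose analogous asymptotics involve the cross entropy between the two processes.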