savi

@savi@mastodon.online

October 18, 2021

In his latest solo episode, Sean Carroll talks about complexity, and I love his explanation of how physics and information theory view the relationship between entropy and information differently.

In information theory (à la Claude Shannon), high entropy means high information: with a uniform probability distribution, you need every bit of information to describe an outcome, because nothing can be compressed or deduced.

In physics (à la Boltzmann), high entropy means low information: entropy is high when lots of microstates are compatible with your macrostate. When entropy is low, you gain information, because only a few microstates can be compatible; that's telling you something.
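A tiny sketch of both pictures (my own illustration, not from the episode; using Shannon's H = -Σ p·log₂ p and Boltzmann's S = k·ln W with k = 1):

```python
import math
from math import comb, log

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Shannon picture: a uniform distribution over 8 outcomes has maximal
# entropy (3 bits); every bit is needed to pin down an outcome.
uniform = [1 / 8] * 8
print(shannon_entropy(uniform))  # 3.0

# A peaked distribution carries less entropy: outcomes are largely
# predictable, so fewer bits are needed on average.
peaked = [0.9] + [0.1 / 7] * 7
print(shannon_entropy(peaked))

# Boltzmann picture: for 4 coin flips, the macrostate "2 heads" is
# compatible with comb(4, 2) = 6 microstates (high entropy), while
# "4 heads" is compatible with just 1 (low entropy): knowing that
# low-entropy macrostate tells you the exact microstate.
print(log(comb(4, 2)))  # S for "2 heads": ln 6
print(log(comb(4, 4)))  # S for "4 heads": ln 1 = 0.0
```

So the same macrostate-counting that makes physical entropy high is exactly what makes each individual microstate uninformative about the whole.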
