In his latest solo episode, Sean Carroll talks about complexity, and I love how he explains the different ways physics and information theory see the relationship between entropy and information.
In information theory (à la Claude Shannon), high entropy means high information: a uniform probability distribution maximizes entropy, and you need every bit to describe an outcome because nothing about it can be compressed or deduced in advance.
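A minimal sketch of that idea, not from the episode itself: a tiny Python function (the name `shannon_entropy` is my own) computing H = -Σ p·log₂p, showing how a uniform distribution needs the full bit budget while a skewed one doesn't.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2 p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 8 outcomes: entropy is maximal (3 bits),
# so you need all 3 bits to pin down any single outcome.
print(shannon_entropy([1/8] * 8))            # 3.0

# A skewed distribution is more predictable, so each outcome carries
# less information on average and the description compresses.
print(shannon_entropy([0.9] + [0.1/7] * 7))  # ~0.75 bits
```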
In physics (à la Boltzmann), high entropy means low information: entropy is high when lots of microstates are compatible with your macrostate, so knowing the macrostate tells you almost nothing about which microstate you're actually in. When entropy is low, only a few microstates are compatible, and that's telling you something.
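Here is a toy illustration of the Boltzmann picture, again my own sketch rather than anything from the episode: take N coins, call the number of heads the "macrostate" and the full sequence of faces the "microstate", and count how many microstates each macrostate admits.

```python
import math

# Toy system: N coins. The "macrostate" is the total number of heads;
# a "microstate" is the full sequence of individual coin faces.
N = 100

def log_multiplicity(heads):
    """ln W: log of the number of microstates compatible with the macrostate."""
    return math.log(math.comb(N, heads))

# High-entropy macrostate: 50 heads out of 100 has an enormous number of
# compatible microstates, so the macrostate tells you little about which
# microstate you're in.
print(log_multiplicity(50))  # ~66.8

# Low-entropy macrostate: 0 heads has exactly one compatible microstate,
# so the macrostate pins the microstate down completely.
print(log_multiplicity(0))   # 0.0
```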