Can anyone please explain, in layman's terms, the equivalence or similarity between entropy in physics and entropy in information systems? Sorry, I'm no mathematician, but I'm still trying to understand the concepts so that I have a better grasp of both. I have an idea of entropy in physics, but I don't understand what someone means by entropy in information systems, or what its uses and applications are. Thanks for your time.
Entropy in physics vs information systems
There is 1 answer below.
Information entropy (also called Shannon entropy) is a measure of how "surprising" a new piece of information is. A system with high entropy produces big surprises; a system with low entropy produces little surprise.
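To make "surprise" quantitative: Shannon entropy is H = -Σ p·log₂(p), summed over the possible outcomes. Here is a minimal Python sketch (the coin probabilities are made-up examples, just to show the contrast):

    import math

    def shannon_entropy(probabilities):
        """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin is maximally surprising: 1 bit of entropy per flip.
    print(shannon_entropy([0.5, 0.5]))    # 1.0

    # A heavily biased coin is predictable, so each flip carries little surprise.
    print(shannon_entropy([0.99, 0.01]))  # ~0.08 bits
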
Systems with high entropy are difficult to compress, because every bit is surprising and so has to be recorded.
Systems with low entropy are easy to compress, because you can predict what comes next given what you've seen before.
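You can see this directly with any general-purpose compressor. A small sketch using Python's zlib (exact byte counts will vary; the point is the ratio):

    import os
    import zlib

    # Low entropy: every byte is predictable from what came before.
    predictable = b"abc" * 10_000

    # High entropy: every byte is a surprise.
    random_bytes = os.urandom(30_000)

    print(len(zlib.compress(predictable)))   # tiny, on the order of tens of bytes
    print(len(zlib.compress(random_bytes)))  # ~30,000 bytes; no real savings
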
Counter-intuitively, this means that a TV showing static (white noise) is presenting a lot of information because each frame is random, while a TV show has comparatively little information because most frames can be mostly predicted based on the previous frame. Similarly, a good random number generator is defined by having very high entropy/information/surprise.
It also means that the amount of entropy is highly dependent on context. The digits of pi have very high entropy because an arbitrary digit is impossible to predict (assuming pi is normal). But if I know that you will be sending me the digits of pi, then the digits themselves carry zero information, because I could have computed all of them myself.
The reason all of this plays into cryptography is that the goal of a cryptographic system is to generate output that is indistinguishable from random, which is to say that it takes low-entropy information and outputs high-entropy information. The output of a cryptographic algorithm can have no more entropy than its highest-entropy input. Systems whose highest-entropy input is a human-chosen password are going to be very poor crypto systems, because passwords are very predictable (little information; low entropy). A good crypto system will include a high-entropy value such as a well-seeded and unpredictable random number. To the extent that this random number is predictable (has low entropy), the system is weakened.
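To make that gap concrete: if a secret is chosen uniformly at random, its entropy is log₂ of the number of possibilities. A sketch comparing a short password space to a key from the OS's cryptographic random number generator (the 8-lowercase-letter password model is a deliberately simplified assumption; real human-chosen passwords are far from uniform and much weaker):

    import math
    import secrets

    # Simplified assumption: an 8-character password drawn uniformly
    # from 26 lowercase letters.
    password_space = 26 ** 8
    print(math.log2(password_space))  # ~37.6 bits, even in this best case

    # A 128-bit key from the OS CSPRNG: 128 bits of entropy by construction.
    key = secrets.token_bytes(16)
    print(len(key) * 8)               # 128
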
You must be careful at this point not to over-analogize between thermodynamic and information entropy. In particular, one is almost exclusively interested in entropy gradients in thermodynamics, while entropy is treated as an absolute value in information theory (measured in bits). Conversely, information entropy is sometimes incorrectly thought of as a form of energy that is "depleted" when generating random numbers. This is not true in any useful way, and certainly not like heat energy.
Also, how cryptographers use the word entropy isn't precisely the same as how Shannon used it. See Guesswork is not a substitute for Entropy for one discussion of this.
For how this does and doesn't apply to thermodynamics more broadly (and particularly how it applies to the famous Maxwell's Demon), I recommend the Wikipedia article comparing the two kinds of entropy.