Can anyone please explain, in layman's terms, the equivalence or similarity between entropy in physics and entropy in information systems? Sorry, I'm no mathematician, but I'm still trying to understand the concepts. I have an idea of what entropy means in physics, but I don't understand what people mean by entropy in information systems, or what its uses and applications are. Thanks for your time.
Entropy in physics vs information systems
There is 1 answer below.
Information entropy (also called Shannon information) is a measure of the "surprise" carried by a new piece of information. A system with high entropy produces a lot of surprise; a low-entropy system produces very little.
Systems with high entropy are difficult to compress, because every bit is surprising and so has to be recorded.
Systems with low entropy are easy to compress, because you can predict what comes next given what you've seen before.
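To make that concrete, here's a quick Python sketch (my own illustration, not part of the original answer) that estimates the Shannon entropy of a byte string from its symbol frequencies, i.e. H = -Σ p·log2(p):

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte, from the byte-frequency distribution."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy(b"aaaaaaaaaaaaaaaa"))   # 0.0 -- every byte is predictable
print(shannon_entropy(b"the quick brown fox jumps over the lazy dog"))
                                              # in between -- English is partly predictable
print(shannon_entropy(os.urandom(100_000)))   # ~8.0 -- near the maximum for bytes
```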
Counter-intuitively, this means that a TV showing static (white noise) is presenting a lot of information because each frame is random, while a TV show has comparatively little information because most frames can be mostly predicted based on the previous frame. Similarly, a good random number generator is defined by having very high entropy/information/surprise.
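The same contrast shows up directly when you try to compress the two kinds of signal. A small illustrative sketch using Python's built-in zlib (the data here is made up for the example):

```python
import os
import zlib

# Highly repetitive "footage": each chunk is predictable from the last,
# so it compresses to a tiny fraction of its original size.
predictable = b"mostly the same frame " * 10_000
print(len(zlib.compress(predictable)) / len(predictable))   # a tiny ratio

# White noise: every byte is a surprise, so compression barely helps
# (it can even add a little overhead).
noise = os.urandom(len(predictable))
print(len(zlib.compress(noise)) / len(noise))                # close to 1.0
```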
It also means that the amount of entropy is highly dependent on context. The digits of pi have very high entropy because an arbitrary one is impossible to predict (assuming pi is normal). But if I know that you will be sending me the digits of pi, then the digits themselves have zero information because I could have computed all of them myself.
The reason all of this plays into cryptography is that the goal of a cryptographic system is to generate output that is indistinguishable from random, which is to say that it takes low-entropy information and produces high-entropy information. The output of a cryptographic algorithm can have no more entropy than its highest-entropy input. Systems whose highest-entropy input is a human-chosen password are going to be very poor crypto systems because that input is very predictable (it carries little information; low entropy). A good crypto system will include a high-entropy value such as a well-seeded and unpredictable random number. To the extent that this random number is predictable (has low entropy), the system is weakened.
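As a rough illustration of that last point (the password policy and numbers below are assumptions for the example, not from the answer itself): the entropy of a key is bounded by the size of the space it could have been drawn from, and a short human-chosen password lives in a far smaller space than an OS-provided random key.

```python
import math
import secrets

# A hypothetical 8-character password drawn only from lowercase letters:
# at best log2(26^8) ~ 37.6 bits of entropy -- and real passwords are far
# less uniform than that, so the true figure is lower still.
print(math.log2(26 ** 8))

# 32 bytes from the OS cryptographic random source: up to 256 bits of entropy.
key = secrets.token_bytes(32)
print(8 * len(key))
```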
You must be careful at this point not to over-analogize between thermodynamic and information entropy. In particular, one is almost exclusively interested in entropy gradients in thermodynamics, while entropy is treated as an absolute value in information theory (measured in bits). Conversely, information entropy is sometimes incorrectly thought of as a form of energy that is "depleted" when generating random numbers. This is not true in any useful way, and certainly not like heat energy.
Also, how cryptographers use the word entropy isn't precisely the same as how Shannon used it. See "Guesswork is not a substitute for Entropy" for one discussion of this.
For how this does and doesn't apply to thermodynamics more broadly (and particularly how it applies to the famous Maxwell's Demon), I recommend the Wikipedia article comparing the two kinds of entropy.