Can anyone please explain the equivalence or similarity between entropy in physics and entropy in information systems, in layman's terms? Sorry, I'm no mathematician, but I am trying to understand the concepts. I have an idea of entropy in physics, but I don't understand what people mean by entropy in information systems, or what its uses and applications are. Thanks for your time.
Entropy in physics vs information systems
573 Views · Asked by SRaj · 1 answer below
Information entropy (also called Shannon entropy) is a measure of the "surprise" carried by a new piece of information. A system with high entropy produces a lot of surprise; a system with low entropy produces little.
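To put a number on "surprise": Shannon quantifies the surprise of an outcome $x$ that occurs with probability $p(x)$ as $-\log_2 p(x)$ bits, and entropy is the average surprise over all possible outcomes:

$$H(X) = -\sum_x p(x)\,\log_2 p(x)$$

So a fair coin flip carries exactly 1 bit of surprise, while an outcome you could already predict with certainty carries 0 bits.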
Systems with high entropy are difficult to compress, because every bit is surprising and so has to be recorded.
Systems with low entropy are easy to compress, because you can predict what comes next given what you've seen before.
Counter-intuitively, this means that a TV showing static (white noise) is presenting a lot of information because each frame is random, while a TV show has comparatively little information because most frames can be mostly predicted based on the previous frame. Similarly, a good random number generator is defined by having very high entropy/information/surprise.
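If you want to see this numerically, here is a minimal Python sketch (the particular byte strings and the use of zlib for compression are just illustrative choices on my part, not anything from the answer above):

```python
import math
import os
import zlib
from collections import Counter

def shannon_entropy_per_byte(data: bytes) -> float:
    """Empirical Shannon entropy of a byte string, in bits per byte."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

low = b"abab" * 2500      # highly predictable: little surprise per byte
high = os.urandom(10000)  # OS randomness: close to the 8 bits/byte maximum

for name, data in [("repetitive", low), ("random", high)]:
    print(f"{name:10s} entropy ~{shannon_entropy_per_byte(data):.2f} bits/byte, "
          f"zlib: {len(zlib.compress(data))} of {len(data)} bytes")
```

The repetitive string comes out near 1 bit per byte and compresses to almost nothing, while the random bytes sit near 8 bits per byte and don't compress at all.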
It also means that the amount of entropy is highly dependent on context. The digits of pi have very high entropy because an arbitrary one is impossible to predict (assuming pi is normal). But if I know that you will be sending me the digits of pi, then the digits themselves have zero information because I could have computed all of them myself.
The reason all of this matters for cryptography is that the goal of a cryptographic system is to generate output that is indistinguishable from random: it takes low-entropy information and outputs high-entropy information. The output of a cryptographic algorithm can have no more entropy than its highest-entropy input. Systems whose highest-entropy input is a human-chosen password make very poor crypto systems, because such passwords are very predictable (little information; low entropy). A good crypto system will include a high-entropy value such as a well-seeded and unpredictable random number. To the extent that this random number is predictable (has low entropy), the system is weakened.
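To make the password point concrete, here is a rough back-of-the-envelope comparison in Python (the character sets and lengths are assumptions of mine, and these figures are upper bounds that real human-chosen passwords fall well short of):

```python
import math

# Upper bounds, assuming every character is chosen uniformly at random.
# Real human-chosen passwords have far less entropy because they follow patterns.
lowercase_8  = 8 * math.log2(26)    # 8 random lowercase letters
printable_10 = 10 * math.log2(95)   # 10 random printable ASCII characters
key_128      = 16 * 8               # 16 bytes from a well-seeded CSPRNG

print(f"8 random lowercase letters : ~{lowercase_8:.0f} bits")
print(f"10 random printable chars  : ~{printable_10:.0f} bits")
print(f"16 random bytes            : {key_128} bits")
```

Even under the generous uniform-choice assumption, short passwords provide only a few dozen bits of entropy, far below the 128 bits a modern key is expected to have.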
You must be careful at this point not to over-analogize between thermodynamic and information entropy. In particular, thermodynamics is almost exclusively concerned with entropy differences (gradients), while information theory treats entropy as an absolute quantity, measured in bits. Conversely, information entropy is sometimes incorrectly thought of as a kind of energy that is "depleted" when generating random numbers; that is not true in any useful sense, and certainly not in the way heat energy is.
Also, how cryptographers use the word entropy isn't precisely the same as how Shannon used it. See "Guesswork is not a Substitute for Entropy" for one discussion of this.
For how this does and doesn't apply to thermodynamics more broadly (and particularly how it applies to the famous Maxwell's Demon), I recommend the Wikipedia article comparing the two kinds of entropy.