Information theory

Information theory is a branch of applied mathematics and electrical engineering concerned with the quantification of information. It was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and reliably storing and communicating data. Since its inception it has broadened to find applications in many other areas, including statistical inference, natural language processing, cryptography, networks other than communication networks (as in neurobiology), the evolution and function of molecular codes, model selection in ecology, thermal physics, quantum computing, plagiarism detection, and other forms of data analysis.

A key measure of information is entropy, usually expressed as the average number of bits needed for storage or communication. Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome of a roll of a die (six equally likely outcomes). Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL lines).
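
The coin and die comparison follows directly from Shannon's formula: for a uniform distribution over n equally likely outcomes, the entropy is log2(n) bits, so a fair coin carries log2(2) = 1 bit and a fair die log2(6) ≈ 2.585 bits. The short Python sketch below (an illustration of the formula, not code from any particular library) computes both:

    import math

    def entropy(probabilities):
        # Shannon entropy in bits: H = -sum over outcomes of p * log2(p),
        # skipping zero-probability outcomes (their contribution is 0 in the limit).
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy([0.5, 0.5]))   # fair coin flip: 1.0 bit
    print(entropy([1/6] * 6))    # fair die roll: ~2.585 bits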



