IEEE

Category:Information theory

From GHN


The processing of information using applied mathematics and electrical engineering

Subcategories

  • Audio coding - the translation of auditory information into digital code
  • Channel coding - code used to protect information over a channel by correcting errors resulting from noise or other interference
  • Codes - rules for converting one piece of information into another
  • Communication channels - a physical or logical connection between two points that allows for the exchange of an information signal
  • Decoding - translating a coded message back into its original language or form
  • Encoding - the process by which information from a source is changed into symbols to be communicated
  • Error compensation - the encoding or transmission of extra information or code to compensate for possible errors
  • Information entropy - the level of uncertainty associated with a random variable (often refers to the "Shannon entropy")
  • Mutual information - occasionally called transinformation, the quantity that measures the mutual dependence of two random variables
  • Rate distortion theory - the branch of information theory that addresses lossy data compression and determines the minimum amount of information that must be communicated over a channel to reconstruct the source within a given distortion
  • Speech coding - the application of data compression to digital audio signals in order to encode speech
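Two of the quantities listed above, Shannon entropy and mutual information, can be illustrated with a short sketch. This is a minimal example assuming discrete distributions given as probability tables; the function names are illustrative, not drawn from any page in this category.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2 p), over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = sum over x,y of p(x,y) * log2(p(x,y) / (p(x) * p(y))),
    where joint is a 2-D table of joint probabilities p(x, y)."""
    px = [sum(row) for row in joint]            # marginal p(x)
    py = [sum(col) for col in zip(*joint)]      # marginal p(y)
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# A fair coin carries 1 bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))                        # 1.0
# Independent variables share no information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))   # 0.0
# Perfectly correlated binary variables share 1 full bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))       # 1.0
```

Note that mutual information is zero exactly when the joint distribution factors into its marginals, which is the defining property of "mutual dependence" mentioned above.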

Pages in category "Information theory"

The following 64 pages are in this category, out of 64 total.
