

About this resource...

Entropy in thermodynamics and information theory
Source: DBpedia (Wikipedia article)
There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually denoted by H, developed by Claude Shannon and Ralph Hartley in the 1940s. Shannon, although not initially aware of this similarity, commented on it when he publicized information theory in A Mathematical Theory of Communication. This article explores the links between the two concepts and how far they can be regarded as connected.
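To make the parallel concrete, here is a minimal sketch (Python is assumed; the four-state distribution is illustrative and not taken from the source) computing both quantities for the same probability distribution. The Gibbs entropy S = -k_B Σ p_i ln p_i and the Shannon entropy H = -Σ p_i log2 p_i differ only in the base of the logarithm and the multiplicative Boltzmann constant, so S = k_B ln(2) H for any common distribution.

    import math

    # Boltzmann constant in joules per kelvin (exact SI value, shown for illustration)
    K_B = 1.380649e-23

    def shannon_entropy(probs):
        """Shannon entropy H = -sum(p_i * log2(p_i)), in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def gibbs_entropy(probs):
        """Gibbs entropy S = -k_B * sum(p_i * ln(p_i)), in J/K."""
        return -K_B * sum(p * math.log(p) for p in probs if p > 0)

    # Hypothetical four-state system with non-uniform probabilities
    p = [0.5, 0.25, 0.125, 0.125]
    H = shannon_entropy(p)   # 1.75 bits
    S = gibbs_entropy(p)     # equals k_B * ln(2) * 1.75 J/K

    # The ratio recovers H, showing the two formulas agree up to k_B * ln(2)
    print(H, S, S / (K_B * math.log(2)))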

Conceptual map: Entropy in thermodynamics and information theory



Publication date: 28.5.2015
