Shannon's Entropy v. Clausius's Entropy?

Physics Forum. Discuss and ask physics questions, kinematics and other physics problems.
#1  02-22-2006, 07:06 AM  tadchem

<[Only registered users see links. ]> wrote in message
news:%RPKf.15501$[Only registered users see links. ].prodigy.com...

[Only registered users see links. ]
[Only registered users see links. ]
The correspondence between physical entropy and informational entropy is
conceptual. There is no energy involved in changing informational entropy,
and physical energy does not measure uncertainty or information.

The main resemblance is in the way the words are spelled.
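For illustration, Shannon's entropy is a pure number (bits), with no energy units anywhere in it. A minimal Python sketch (the probability distributions are made up for the example):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits.

    Note the result is dimensionless -- no joules, no kelvin.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

Nothing in that calculation refers to a physical system, which is the point of the contrast with Clausius's dS = dQ/T.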

"Analogies are like ropes; they tie things together well, but you won't get
very far if you try to push them." - Thaddeus Stout.


Tom Davidson
Richmond, VA


#2  02-22-2006, 02:55 PM  Guest



"tadchem" <[Only registered users see links. ]> wrote:

How do you increase informational "order" (e.g. alphabetize a list of names)
without doing _any_ work in the thermodynamic sense?


#3  02-22-2006, 04:38 PM  Doune


[Only registered users see links. ] wrote:

<snip>


You can't - but that's not a problem. It doesn't mean that energy has
been stored in the list; it just means that energy has been expended.

Are you waiting for someone to say something like "you cannot increase
informational order without expending physical (thermodynamic) energy"?



SCW

#4  02-22-2006, 05:36 PM  tadchem



[Only registered users see links. ] wrote:

<snip>


Start by looking at what thermodynamic work *is*:

Work can be the integral of force over a distance, the integral of
pressure over a change in volume, or simply a change in potential energy,
among other things:
[Only registered users see links. ]

Each of these definitions requires that the system under consideration
posess a property (mass, charge, pressure, tension, etc) that simply is
not defined for information sciences.
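To make the contrast concrete, here is thermodynamic work as W = ∫F dx computed numerically - a toy sketch with a made-up force profile (a linear spring), not anything from information theory:

```python
def work(force, x0, x1, steps=100_000):
    """Thermodynamic work W = integral of F(x) dx from x0 to x1 (midpoint rule).

    Needs a force in newtons acting over a distance in metres --
    quantities that simply have no analogue for a list of names.
    """
    dx = (x1 - x0) / steps
    return sum(force(x0 + (i + 0.5) * dx) for i in range(steps)) * dx

# Hypothetical linear spring, F = k*x with k = 2 N/m, stretched from 0 to 3 m:
print(work(lambda x: 2.0 * x, 0.0, 3.0))  # ~9.0 J (= k * x**2 / 2)
```

Every term in the integrand carries physical units; remove the mass, charge, or pressure and the definition evaporates.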

The term "work" exists in both contexts - information and physics -
just as "entropy" does. The definitions are context-sensitive and
irreconcilable between contexts.

The result is a trap that leads the unwary into the fallacy of
equivocation.

Tom Davidson
Richmond, VA

#5  02-24-2006, 12:14 AM  Guest



"Doune" <[Only registered users see links. ].uk> wrote:


It's interesting you should say that ;-) If it is true that you cannot
increase informational order without a thermodynamic cost, then why can't
you say that energy is somehow "stored" in the information, in the same sense
that energy is stored in a rock that I carry up a hill? What I find
particularly interesting is that while, as you say, any increase in
informational order comes at some cost in physical entropy, the amount of
energy required to perform computation can, as I understand it, be reduced
to arbitrarily low levels. So while increasing informational order does come
at some thermodynamic cost, that cost does not scale the way physical work
and kinetic energy do in the realm of matter and energy.
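The usual floor put on that cost is Landauer's principle: erasing one bit dissipates at least kT ln 2. A quick back-of-envelope in Python (room temperature chosen arbitrarily):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_limit(temperature_kelvin):
    """Minimum energy in joules to erase one bit at temperature T: kT * ln 2."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (300 K) the bound is about 2.87e-21 J per bit --
# many orders of magnitude below what real electronics actually dissipate,
# which is why the cost can be pushed arbitrarily low in principle.
print(landauer_limit(300.0))
```

This is the sense in which the thermodynamic cost of computation has a nonzero but vanishingly small lower bound.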


#6  02-24-2006, 02:21 AM  MG



<[Only registered users see links. ]> wrote in message
news:NdsLf.24436$[Only registered users see links. ].prodigy.com...

To transfer a bit of information you need a signal stronger than the
noise. Pushing down to the quantum limit, the energy in each bit must be
some multiple of h, since it must be large enough to be detected above the
uncertainty. Shannon expressed the channel capacity in terms of the S/N ratio.

Even with S/N < 1 there is a way of transmitting some information, but the
speed decreases.

Down at the quantum level, where noise is quantized, S/N cannot be less than
1.
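Shannon's channel-capacity formula makes the S/N dependence explicit: C = B·log2(1 + S/N). A small Python illustration (the bandwidth figure is made up, loosely a voice telephone channel):

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity in bits/second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# With S/N = 10 over a 3 kHz channel, capacity is about 10.4 kbit/s.
print(channel_capacity(3000.0, 10.0))
# Even with S/N < 1 the capacity stays positive, just small:
print(channel_capacity(3000.0, 0.5))
```

The capacity only reaches zero as S/N goes to zero, which is the "you can still transmit, just slower" point above.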

As interesting as the parallel between information and thermodynamics is,
in my opinion there is no physical substance to information; it is a
mathematical concept of order in a sequence.

Physically speaking, a random array of numbers and an ordered one are not
that different. Information is subjective: a symphony may sound better to
you and me than the same notes played randomly, but physically the
difference is meaningless.

On the other hand, a gas's temperature has physical meaning.

Information has no mass, momentum, charge, or any of the other stuff that
physics deals with.

When we use electronic circuits to manipulate information, energy
considerations come into play, but this is because we use electrons like
beads on an abacus; we move them around like pieces of clay to keep count.

At that point we use physical quantities arranged in a conventional pattern
to model the information symbols. Rearranging symbols does not require
energy per se, unless you do the "thinking" in a real physical brain, either
biological or electronic.

Conceptually, squaring a number requires no energy. Does 64 contain more
energy than 8? No. But placing 8 marbles on each axis and filling in the
square does require energy.

Just a few thoughts on the subject.

Mauro





#7  02-24-2006, 02:52 AM  Guest



"MG" <[Only registered users see links. ]> wrote:


I liked your comments very much, but here I would just point out that to the
extent that information has any impact on "reality" it must become embedded
in a physical system. To the extent that information is embedded (e.g. in a
series of switches), it does have some minimal "physical" properties, such
as the polarities of the switches.


#8  02-24-2006, 04:28 PM  Guest



"Doune" <[Only registered users see links. ].uk> wrote:


What is it about information that makes it valuable to an organism? Maybe
one way to put it is this: for most animals, useful information increases
the chances of survival and reproduction. This statistical quantity, this
augmented probability, is "lost" when the corresponding information is
degraded. Therefore, what I will prove for my Nobel Comedy Prize is this:
what is "stored" in information, and what you can get out of it, is not the
statistically measured stuff called "energy" but rather that statistically
evaluated phenomenon called probability of survival and reproduction.



#9  02-26-2006, 02:56 AM  Guest



"tadchem" <[Only registered users see links. ]> wrote:


You cited two Wikipedia articles in your post, but you failed to cite the
one that will make you eat your words:

[Only registered users see links. ]

Check it out. I'll await your apology ;-)




#10  02-26-2006, 02:46 PM  tadchem



[Only registered users see links. ] wrote:

<snip repost>


Hold your breath, please.

Purple people are *so* amusing to watch, and I haven't tried to trap
purple people eaters in *decades*.

Tom Davidson
Richmond, VA
