Lecture 4 Order and probability. Information and entropy Flashcards

1
Q

All natural processes lead to a net increase in…

A

Entropy

2
Q

There exists a natural tendency for an ordered set of things to get disordered through…

A

Interplay of natural forces

3
Q

What is entropy?

A

Entropy is a measure of the degree of randomness or disorder of a system.

4
Q

When does the entropy of a system increase?

A

The entropy of a system increases when it becomes more disordered

5
Q

Under what condition can a process occur spontaneously?

A

A process can occur spontaneously only if the sum of the entropies of the system and its surroundings increases.

6
Q

The probability of any kind of distribution will be larger if…

A

It can be realized in a larger number of ways.

7
Q

The number of ways that lead to the realization of a definite situation is a measure of…

A

The probability of its occurrence

8
Q

What is the Boltzmann equation for entropy?

A

S = k · ln W
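The relation on this card can be checked numerically; a minimal Python sketch, where `boltzmann_entropy` is an illustrative helper name and k is the CODATA value of Boltzmann's constant:

```python
import math

k = 1.380649e-23  # Boltzmann's constant in J/K (CODATA value)

def boltzmann_entropy(W):
    """Entropy S = k * ln(W) for a macrostate realized by W microstates."""
    return k * math.log(W)

# A single microstate (W = 1) gives zero entropy,
# and more microstates mean higher entropy:
assert boltzmann_entropy(1) == 0.0
assert boltzmann_entropy(100) > boltzmann_entropy(10)
```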

9
Q

Boltzmann’s constant k was later defined as a universal constant by…

A

Max Planck

10
Q

Boltzmann’s constant k must have the same unit of measure as…

A

Entropy

11
Q

What is the Shannon equation of information theory?

A

I = K · ln P

12
Q

In the Shannon equation of information theory, the unit of I is determined by…

A

The unit of the factor K.

13
Q

Which unit is most commonly used for I in the Shannon equation of information theory and what does it express?

A

The bit (“binary digit”) is most commonly used. It expresses the number of binary yes/no decisions needed to determine a given message.
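The yes/no-decision count on this card is just the base-2 logarithm of the number of equally probable messages; a small Python sketch (`bits_needed` is a hypothetical name used for illustration):

```python
import math

def bits_needed(n_messages):
    """Number of binary yes/no decisions needed to single out
    one of n equally probable messages: I = log2(n)."""
    return math.log2(n_messages)

# 8 equally likely messages can be narrowed down with 3 decisions,
# halving the candidates each time: 8 -> 4 -> 2 -> 1.
assert bits_needed(8) == 3.0
```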

14
Q

The idea of a link between information and entropy was first suggested by…

A

Boltzmann

15
Q

Erwin Schrödinger (1944) made the frequently quoted statement…

A

“The living system feeds on negative entropy.”

16
Q

What term arose from Erwin Schrödinger’s (1944) quoted statement?

A

His statement is the reason why the term “negentropy” is sometimes used.

17
Q

What is the real importance of Shannon’s information equation in biophysics?

A

It is possible to calculate the information content of a protein by this approach. The first requirement is a statistical record of the frequency of occurrence of the individual amino acids in proteins.
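Given such a frequency record, the mean information per amino acid follows from Shannon's formula; a Python sketch, assuming for simplicity that all 20 amino acids occur with equal frequency (real frequencies must come from the statistical survey the card mentions):

```python
import math

def information_per_residue(freqs):
    """Mean Shannon information per monomer, I = -sum(p * log2 p),
    given the occurrence frequency p of each residue type."""
    return -sum(p * math.log2(p) for p in freqs.values() if p > 0)

# Upper bound: all 20 amino acids equally probable -> log2(20) bits
uniform = {i: 1 / 20 for i in range(20)}
print(round(information_per_residue(uniform), 2))  # about 4.32 bits
```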

18
Q

Using the Shannon equation, the information content of ……… can be calculated

A

Using the Shannon equation, the information content of each monomer can be calculated.

19
Q

The information content of a ……….. can be obtained in the same way

A

Nucleic acid

20
Q

One mammalian mitochondrial DNA molecule consists of about

A

15,000 nucleotides

21
Q

Assuming that the four possible bases (adenine, cytosine, guanine, thymine) have an equal probability of occurrence, the information content of each single nucleotide will have a value of…

A

2 bits.
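The 2-bit value follows directly from the equal-probability assumption, and with it the total information content of the mitochondrial DNA molecule from the previous card can be estimated; a quick Python check:

```python
import math

# Four equally probable bases (A, C, G, T) -> log2(4) bits per nucleotide
bits_per_nucleotide = math.log2(4)

# A mammalian mitochondrial DNA of about 15,000 nucleotides then carries
# roughly 15,000 * 2 = 30,000 bits in total.
total_bits = 15000 * bits_per_nucleotide

assert bits_per_nucleotide == 2.0
assert total_bits == 30000.0
```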