# Amanece, que no es poco

The Groins

Their geographical importance.
A history of groins.
The groins of the Americans.
How should one touch the groins?

The sound of groins.
The most famous groins.
Groins and literature.
A kilo of groins.
Children's groins.
The groins and the head: their relation, if any.

The groins in Andalusia, and the Carnation.
General Theory of the State and the groins.
Black groins.
Is there one groin, or are there many groins?
The groins of actors.
The groin and God.

The groin that could dominate me has not yet been born.
Odd groins, and the reason why.
Whorish groins.
A freehand drawing of the groins.
Is the groin flesh?
Check to the groin.
Does a groin satisfy nowadays? Which groin?

# Difference and repetition, the quantitative game science seriously plays

What is the difference between “abababababababababab” and “4c1j5b2p0cv4w1x8rx2y39”? Apart from the alphabet size (the first sequence uses only 2 letters, while the second draws from a larger pool of letters and numbers), the difference lies in the way of repetition. We can represent the first sequence with a much shorter instruction: repeat “ab” ten times (“ab”x10), whereas the second does not lend itself to compression. One could argue that the business of physics has always been about compressing the wide range of natural phenomena into mathematical formulae. One of the most celebrated examples is “F=ma”, which describes an apple falling, the motion of planets, and every imaginable variation of these. Compressibility is intimately linked to complexity, which can be defined as “the length of the shortest possible description”: the shorter the description of a process, the smaller its complexity. Note that complex does not mean complicated (made of many interconnected parts).

So, in order to define compressibility mathematically (we can show it to be proportional to redundancy and inversely proportional to Shannon’s entropy, and thus to uncertainty), we only need to play a little heuristic game and count our savings at the end. For instance, given the sequence “S->abcdcbcdfbcd”, we can define a new symbol A as “A->bcd” and thus write “S->aAcAfA”. Counting the initial and final number of symbols gives 12 and 11 respectively, so we save 1 of the initial 12; compressibility yields C=1/12. Like a discount at a supermarket, we save a little less than 10%. Bear in mind that the letters used here are abstractions that can represent anything we can write down as a discrete sequence of data: spikes in the brain, birds singing, DNA sequences, mice pressing a lever for sucrose in an operant box, or babies babbling while learning to speak.
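The counting exercise above can be sketched in a few lines of Python. This is a toy illustration, assuming the accounting convention implied by the text: a grammar costs one symbol per rule head plus the symbols on its right-hand side, while the raw sequence costs just its length.

```python
def grammar_size(rules):
    # One symbol for each rule head plus its right-hand side
    # (the accounting convention used in the text).
    return sum(1 + len(rhs) for rhs in rules.values())

def compressibility(seq, rules):
    # Savings of the grammar relative to the raw sequence length.
    return (len(seq) - grammar_size(rules)) / len(seq)

seq = "abcdcbcdfbcd"                      # 12 symbols
rules = {"S": seq.replace("bcd", "A"),    # S->aAcAfA
         "A": "bcd"}
print(grammar_size(rules))                # 11 symbols, down from 12
print(compressibility(seq, rules))        # 1/12, a saving of about 8%
```

The `compressibility` helper returns 0 for an incompressible sequence and approaches 1 as the grammar shrinks relative to the original.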
If we now try to compress another sequence, “S->babaabaabaa”, we realize that there are many candidate repeating motifs: some are longer and seem better candidates, others are shorter but more frequent. One should therefore try the same procedure on all of them (which is why the method is called heuristic, and why it has to be implemented on computers in order to deal with sequences millions of characters long). Trying the longest motif, “abaa”, we find that no savings are possible, so compressibility is zero in that case. Trying the most frequent motif, “ba”, we can compress the sequence twice (first defining “A->ba” and then “B->Aa”, obtaining “S->ABBB”), in what is called a depth-2 iteration, but compressibility is still zero. Finally, choosing the grammar “A->aba”, we find “S->bAAAa”, saving 1 symbol [Nevill-Manning and Witten 2000]; our original sequence is thus 1/11 compressible. Unlike Shannon’s approach, this is a scale-independent (or resolution-free) method, because we did not need to decide a priori on an n-gram length. Compressibility is nevertheless intimately related to Shannon’s entropy (H): a random sequence is incompressible (C=0) and has maximum entropy, whereas a very repetitive sequence is highly compressible (C close to 1) and its entropy is close to zero. Thus we learn how two different approaches, with two different mathematical implementations, address the same question: how to detect high-order structure in a sequence of data. In fact, that (plus an initial hypothesis framing what we expect to find, and some necessary statistical checks at the end) is what our work as scientists is all about: discovering repeated patterns in what appears to be noise. Science is the quantitative game of difference and repetition.
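The three candidate grammars can be compared with the same toy accounting. This is a sketch, not a general compressor: Python's `str.replace` substitutes non-overlapping occurrences left to right, which happens to be enough for this example.

```python
seq = "babaabaabaa"  # 11 symbols

def grammar_size(rules):
    # One symbol per rule head plus its right-hand side.
    return sum(1 + len(rhs) for rhs in rules.values())

def savings(rules):
    return len(seq) - grammar_size(rules)

# Longest candidate motif, "abaa": no savings.
g1 = {"S": seq.replace("abaa", "A"), "A": "abaa"}  # S->bAbaA
# Most frequent motif, "ba", iterated to depth 2: still none.
g2 = {"S": seq.replace("ba", "A").replace("Aa", "B"),
      "A": "ba", "B": "Aa"}                        # S->ABBB
# The grammar A->aba saves 1 symbol.
g3 = {"S": seq.replace("aba", "A"), "A": "aba"}    # S->bAAAa

print(savings(g1), savings(g2), savings(g3))       # 0 0 1
```

Trying every candidate like this is exactly the brute-force search that makes the method heuristic rather than analytic.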

[Nevill-Manning and Witten 2000] Craig G. Nevill-Manning and Ian H. Witten, “On-Line and Off-Line Heuristics for Inferring Hierarchies of Repetitions in Sequences”, Proceedings of the IEEE, Vol.88, No.11, November 2000.

# Information theory: a flipping coin, an old script and our brain

What do an ancient script, the activity of the brain and the flipping of a coin have in common? If we consider their measurable properties (as science always tends to proceed when approaching natural phenomena), we might easily claim that all three are objects whose matter and energy are well defined in theory and can be measured in practice. The law of conservation of matter discovered by Lavoisier in 1789 and the law of conservation of energy by Mayer in 1842 are indeed cornerstones of physics and chemistry. However, when trying to explain what is special about living systems, some eminent scientists, like Schrödinger in 1944, pointed to another concept: information. More than kilograms and joules, bits seemed the key to the secret of life. Soon after, Shannon wrote a monumental article [Shannon 1948] in which he proved that the only mathematical function satisfying three basic properties (continuity, monotonicity and additivity) for how much information is produced by a discrete sequence of symbols is the sum of -p*log(p): entropy! We sketch Shannon’s proof and apply it to the three cases mentioned above. First, we calculate the entropy of the heads-and-tails sequence obtained by flipping a coin here and now (thus also illustrating the caveats of performing an experiment). We see that our intuition of a “fair” coin (equiprobable and independent in giving heads or tails) matches the mathematical property of Shannon’s formula that entropy is maximum for a fair coin (maximally unpredictable) and zero for a coin that always yields tails (maximally predictable, so no “surprise”). Then we build on this real example to imagine that, instead of a coin, we roll a die, and then we push our imagination into a “gedanken” experiment in which we roll a die with 27 faces. If each face is a letter of the English alphabet (plus the space), we have a process that could generate English-like scripts.
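Shannon's formula is easy to evaluate for the coin and the 27-faced die. A minimal sketch, using base-2 logarithms so entropy comes out in bits:

```python
import math

def entropy(probs):
    # H = -sum(p * log2(p)), skipping zero-probability outcomes.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit, maximally unpredictable
print(entropy([1.0, 0.0]))   # always tails: 0.0 bits, no surprise
print(entropy([1/27] * 27))  # fair 27-faced die: log2(27) ≈ 4.75 bits
```

The uniform distribution always maximizes H, which is why the fair coin and the fair die sit at the top of their respective scales.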
Adding structure to the frequencies of individual letters and to the probabilities of finding two, three or n letters together (thus biasing the die, the opposite of a “fair” die), we show how the entropy of the sequence produced by the die would tend to the entropy of real English itself. We could then take an old, mysterious script from a lost civilization and calculate its entropy too. Surprisingly, this has been done, and the result approximated the entropy of natural languages such as Tamil or English, while being much lower than that of a random sequence of symbols. Therefore, higher-order structure in any text can be measured mathematically, revealing syntax. The brain completes this exercise of applying information theory to apparently disparate processes. We explain that neural activity, as reflected in the action potentials of neurons, can be described as a binary signal of 0’s and 1’s (no spike and spike). Shannon’s ideas and formulas are then applicable again, to calculate the information encoded by neural activity about the external sensory world: sound, smell, light, heat, and so on; any “secondary quality” translated into a common language of bits in the brain (and by the brain). This is how a large body of scientific work, mostly under the name of computational neuroscience, has made substantial progress in answering what the brain does. So, in this brief journey we learn that our brain (and the mouse’s), the old script and the gambling coin game at a casino reflect processes whose information can be precisely defined in a mathematical sense and intuitively interpreted thanks to Shannon’s theory. An analogy made concrete! However, even though we still have no clue how to leap from information to meaning (and thus explain human language and consciousness from the data), entropy remains a conceptual milestone in the history of science, with offshoots both technological and philosophical.
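The effect of biasing the die toward real letter frequencies can be illustrated with a unigram estimate. This is a toy sketch: the sample string is invented, and a single-letter estimate ignores the pair and triplet structure mentioned above, so it only bounds the entropy per symbol from above.

```python
from collections import Counter
import math

def letter_entropy(text):
    # Unigram entropy estimate in bits per symbol,
    # from the observed letter frequencies.
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

sample = "the quick brown fox jumps over the lazy dog and the cat"
print(letter_entropy(sample))  # biased letter frequencies: below the ceiling
print(math.log2(27))           # ceiling for a fair 27-symbol alphabet
```

A uniform random string over the same 27 symbols would approach the log2(27) ceiling; real text falls below it, and estimates that account for letter context fall lower still.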
From physics’ energy and chemistry’s matter, we jump into biology with the aid of information as entropy and then, pushed by the phenomena neuroscience faces, we are brought up the gradient to face psychology as well (there where life and mind are in confluence), and then immediately compelled to cast the whole gradient into elusive philosophical questions worth answering.

[Shannon 1948] C. E. Shannon, “A Mathematical Theory of Communication”, The Bell System Technical Journal, Vol. 27, pp. 379–423, 623–656, July, October, 1948.

# Perspectives on Animal Behavior Comparisons

“In the use of language, for instance, we depend on the fact that names have been given to objects, qualities, and relations, which fix certain similarities and differences in the flow of experience as boundaries containing it, dividing it, directing it. Whenever we describe, we class things or properties or events together or apart on the basis of the similarities and differences marked by the words we choose. Consequently, to the extent that science begins with description, it begins with comparison.

But no two things, no two qualities, no two events are alike in all respects, or alike in none. Any description singles out some similarities and differences to the exclusion of others, which could be the basis of alternative descriptions. Consequently, a demand for a complete description of anything amounts to a contradiction in terms. A demand for a pure description would be equally incoherent, for, of necessity, the similarities and differences that we pick out when we describe anything will depend on what we intend the description for, our expectations about the matter in question, considerations of relevance to some focus of interest, and other prior assumptions. Comparison necessarily assumes perspective.

Perspectives differ among sciences, among schools within a science, and among scientists within a school. Consequently, the history of science is no less littered with controversy and confusion than the history of philosophy, art, or politics. The conception that we have of something is often so deeply and firmly embedded in our minds that we cannot even entertain the possibility of an alternative. Such a conception governs our very perceiving of the thing; we cannot see it any other way and “know” that that is the way it really is. Anyone who claims otherwise we regard as purblind, or perverse, or just plain crazy. The closed-mindedness that can result from this sort of conviction is one of the sources of what my teacher, Niko Tinbergen, used to call “nothing-but-ism”: behavior is nothing but reflexes; morality is nothing but self-interest; everything is nothing but matter in motion.”

Colin G Beer

# The Riddle Of The Universe — Consciousness

“No phenomenon of the life of the soul is so wonderful and so variously interpreted as consciousness. The most contradictory views are current to-day, as they were 2,000 years ago, not only with regard to the nature of this psychic function and its relation to the body, but even as to its diffusion in the organic world and its origin and development. It is more responsible than any other psychic faculty for the erroneous idea of an “immaterial soul” and the belief in “personal immortality”; many of the gravest errors that still dominate even our modern civilisation may be traced to it. Hence it is that I have entitled consciousness “the central mystery of psychology”: it is the strong citadel of all mystic and dualistic errors, before whose ramparts the best equipped efforts of reason threaten to miscarry. This fact would suffice of itself to induce us to make a special critical study of consciousness from our monistic point of view. We shall see that consciousness is simply a natural phenomenon like any other psychic quality, and that it is subject to the law of substance like all other natural phenomena.

(…)

The only source of our knowledge of consciousness is that faculty itself; that is the chief cause of the extraordinary difficulty of subjecting it to scientific research. Subject and object are one and the same in it: the perceptive subject mirrors itself in its own inner nature, which is to be the object of our inquiry.”

Ernst Haeckel

# Brain and Thought: A Philosophical Illusion

“You began by speaking —should I say again to the philosopher— of the brain such as we see it, such as it stands out in the midst of the presentation: so you assumed it to be a part of presentation, an idea, and you were in idealism. There, I repeat, the relation of the brain to the rest of presentation can only be the relation of part to whole. Thence, all of a sudden, you have fled to a reality supposed to lie beneath the presentation. Very good: but such reality is subspatial, which amounts to saying that the brain is no more an independent entity. What you have to do with now is the totality of the real, in itself unknowable, over which is spread the totality of the presentation. You are now, indeed, in realism; and no more in this realism than in the idealism of a moment ago are the cerebral states the equivalent of the whole of presentation: it is – I must repeat it – the whole world of things which is again implied (but, this time, concealed and unknowable) in the whole of perception. But lo! taking the brain apart and dealing with things separately, you are actually continuing to decompose and recompose reality along the same lines and according to the same laws as presentation, which means that you no longer distinguish the one from the other. Back you are, then, in idealism; there you ought to remain. But not at all! You do indeed preserve the brain as it is given in presentation, therefore as an idea, but you forget that if the real is thus spread out in the presentation, if it is extension and not tension, it can no longer compress within itself the powers and virtualities postulated by realism; unheedingly you erect the cerebral movements into the equivalent of the whole of presentation. You are therefore oscillating from idealism to realism and from realism to idealism, but so quickly that you do not perceive the see-saw motion and you think yourself all the time astride the two systems joined into one. 
This apparent reconciliation of two irreconcilable affirmations is the very essence of the thesis of parallelism.”

Henri Bergson

# Taming the Beast

“So, what are the fears of joining a group? Following is a brief list that other writers have claimed.

Fear of rejection
Fear of not being good enough
Fear of not knowing how (to critique, to [work] in a group, etc.)
Fear of intimacy
Fear of commitment
Fear of being disillusioned
Fear of finding out I can’t write
Fear of finding out how good I am (not very many actually voice this fear, real as it might be)
Fear of failure
Fear of success
Fear of completing something
Fear of not completing something
Fear of being judged
Fear of being found out
Fear of reliving an uncomfortable or difficult experience
Fear of looking foolish
Fear that I’ll lose my voice (and I’ll [do] like everybody else in the group)
Fear that I’ll find out I don’t have anything original to say”

Judy Reeves — Writing Alone, Writing Together

# Modes of Explanation and the Tension of Biology

“This symposium will thoroughly irritate any biologist who comes across it. At first sight it contains a lot of talk and precious little deductive theory; a closer look reveals essays violently attacking the accepted modes of scientific explanation and espousing a biology reformed along more Aristotelian lines. Worse yet, these essays were written by reputable physicists still practicing their trade, emphatically not the “carpenters blaming their tools” who frequent so many theoretical biology congresses. What happened?

This symposium records the attempts of some very intelligent people to digest and understand the disturbing complexities of biology. Many have read Kuhn on scientific revolution, and realise that current models of scientific explanation are as temporary as their predecessors: they are willing to face the possibility that a general theory like that embodied in Einstein’s laws and Maxwell’s equations is impossible in biology. How do they respond?

(…)

In sum, this book gave me a lot to think about. It has two or three articles, especially Bohm’s first paper, and Kerner’s, which I found quite beautiful. Even the abominable papers, of which there were a number, are abominable in interesting ways and forced me to think about what biology should be. This may be a very personal reaction, however: I doubt if this book will have a very great influence, and doubt if it deserves to. It is simply an unvarnished record of the reactions of intelligent people to the oldest problems in science.”

Egbert G. Leigh, Jr.