Additional Info

Related to "Mathematics and the desperate cries from Quantum Mechanics"

Historical context for the idea that the properties of mathematical structures could depend on their history:

Long ago, at the Harvard Extension School where I am a math tutor, a student wrote on the blackboard something similar to the following: y = [stuff] / log(2) = [stuff] / .3010 = [stuff] - .3010. Now, this last step is clearly "wrong", but it got me thinking... Their mistake included the belief that the past history of a number — how it came into being — affected its current properties. The more I thought about this, the more the fact that this was rarely, if ever, the case in the mathematical structures with which I was familiar seemed both arbitrary and limiting.

Additional points:

i) I played around a bit with combinatorics, considering whether it might "matter" how many ways a "5", say, could have come into being — was it the result of 4 + 1, or 3 + 1 + 1, not to mention 7 - 2, etc.? Unfortunately I did not get very far. (A small counting sketch follows after point ii.)

ii) It may be that it is only now, with the advent of computers, that we can practically store, for every symbol, its history — its origin story — along with the symbol itself. (A sketch of what that might look like follows below.)
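
Regarding point (i): a minimal counting sketch, purely illustrative and not part of the original exploration, that enumerates the ways a 5 can arise as a sum of positive integers (its integer partitions). The function name `partitions` is my own choice; note that once subtraction is allowed, as in 7 - 2, the number of possible origin stories becomes unbounded, which is one place the exploration stalls.

```python
def partitions(n, max_part=None):
    """Yield the integer partitions of n as tuples in non-increasing order."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest


if __name__ == "__main__":
    ways = list(partitions(5))
    for way in ways:
        print(" + ".join(map(str, way)))
    # 7 distinct sum-only origin stories: 5, 4+1, 3+2, 3+1+1, 2+2+1, 2+1+1+1, 1+1+1+1+1
    print(f"{len(ways)} distinct sum-only origin stories for 5")
```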
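
Regarding point (ii): a minimal sketch, assuming we model a symbol's origin story as a list of human-readable steps carried alongside its value. The class `HistoriedNumber` and its method names are hypothetical illustrations of the idea, not an existing implementation.

```python
class HistoriedNumber:
    """A value that carries its 'origin story' along with itself."""

    def __init__(self, value, history=None):
        self.value = value
        # The origin story: human-readable steps that produced this value.
        self.history = history if history is not None else [f"literal {value}"]

    def _wrap(self, other):
        return other if isinstance(other, HistoriedNumber) else HistoriedNumber(other)

    def __add__(self, other):
        other = self._wrap(other)
        total = self.value + other.value
        return HistoriedNumber(total, self.history + other.history + [f"summed to {total}"])

    def __sub__(self, other):
        other = self._wrap(other)
        diff = self.value - other.value
        return HistoriedNumber(diff, self.history + other.history + [f"subtracted to {diff}"])

    def __repr__(self):
        return f"{self.value}  <- {'; '.join(self.history)}"


if __name__ == "__main__":
    # Two different origin stories for the "same" number 5.
    print(HistoriedNumber(4) + 1)   # 5  <- literal 4; literal 1; summed to 5
    print(HistoriedNumber(7) - 2)   # 5  <- literal 7; literal 2; subtracted to 5
```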

Related to "Consciousness — a possibly necessary condition for it"

A friend and colleague commented that one could bypass recursive levels of self-reflection and instead imagine a single subroutine — the "I've been asked who I am" subroutine — that simply increments a state variable recording how many times it has been called, and responds commensurately. While I understand his point technically and agree that it is possible, I do not find it a particularly convincing argument against my original supposition; I find his implementation simply too "shallow".
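
For concreteness, a minimal sketch of such a subroutine; the names WhoAmI, times_asked, and who_am_i are illustrative choices of mine, not my colleague's.

```python
class WhoAmI:
    """The 'I've been asked who I am' subroutine, with a single state variable."""

    def __init__(self):
        self.times_asked = 0  # incremented on every query; the only state kept

    def who_am_i(self):
        self.times_asked += 1
        # The answer varies with the count, but nothing here inspects itself.
        return f"I am the thing you have now asked about {self.times_asked} time(s)."


if __name__ == "__main__":
    agent = WhoAmI()
    print(agent.who_am_i())
    print(agent.who_am_i())
```

Nothing in this reflects on its own structure; the response varies only with the count, which is the sense in which it feels "shallow" to me.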

Related to "AI: The importance of a name."

See the Affectionate Technology paper for additional context.

Related to "Information Theory."

Background — "Shannon Information":

From Scientific American article: "Shannon defined the quantity of information produced by a source--for example, the quantity in a message--by a formula similar to the equation that defines thermodynamic entropy in physics. In its most basic terms, Shannon's informational entropy is the number of binary digits required to encode a message."
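
For reference, the standard textbook form (not part of the quoted article): for a source emitting symbols with probabilities p_1, ..., p_n, Shannon's entropy in bits per symbol is

```latex
% Shannon's informational entropy, in bits per symbol, for a source whose
% symbols occur with probabilities p_1, ..., p_n:
H = -\sum_{i=1}^{n} p_i \log_2 p_i
% Compare the Gibbs form of thermodynamic entropy, S = -k_B \sum_i p_i \ln p_i,
% which shares the same mathematical shape, up to the constant and the log base.
```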

Related to "A friend who kept me company during COVID lockdown."

Delightful squirrel photos and videos.