74 points | by apollinaire | 4 days ago
There are only two things about zero that were understood much later: that it is a quantity that in many circumstances should be treated the same way as any other number, and that it requires a special symbol in order to implement a positional system for writing numbers.
The concept of zero itself had already been known many thousands of years before the invention of the positional system for writing numbers.
All the recorded ancient languages were able to answer the question "How many sheep do you see there?" not only with words meaning "one", "two", "three", "four", "few", "many" and so on, but also with a word meaning "none". The same goes for answers to a question like "How much barley do you have?".
Nevertheless, the concept of the quantity "zero" must have been understood later than the concepts of "one", "two", "many" and of negation, because in most languages the words similar in meaning to "none", "nobody", "nothing", "null" are derived from negation words together with words meaning "one" or denoting small or indefinite quantities.
Because the first few numbers, especially 0, 1 and 2, have many distinctive properties in comparison with bigger numbers, not only was 0 initially perceived as being in a class somewhat separate from the other numbers, but usually so were 1 and 2, and sometimes also 3 or even 4.
In many old languages the grammatical behavior of the first few numbers can differ among themselves and can also be quite different from that of the bigger numbers, which behave more or less in the same way, consistent with the expectation that the big numbers were added to the language later, at a time when they were perceived as a uniform category.
> Inside the Chaturbhuj Temple in India (left), a wall inscription features the oldest known instance of the digit zero, dated to 876 CE (right). It is part of the number 270.
This is the real oldest known zero:
> Because we know that the Çaka dynasty began in AD 78, we can date the artifact exactly: to the year AD 683. This makes the “0” in “605” the oldest zero ever found of our base-10, “Hindu-Arabic” number system.
> But the Cambodian stone inscription bears the first known zero within the system that evolved into the numbers we use today.
Maybe the Khmers came up with it or maybe they got the idea from India. Either way, we should set the facts straight.
> By AD 150, Ptolemy, influenced by Hipparchus and the Babylonians, was using a symbol for zero ( — ° ) [25][26] in his work on mathematical astronomy called the Syntaxis Mathematica, also known as the Almagest. [27] This Hellenistic zero was perhaps the earliest documented use of a numeral representing zero in the Old World. [28]
Sure, as "that number between -1 and 1 on the number line", zero is mostly unremarkable, but math doesn't really care about number lines that much: as a singularity that arises from mathematical operations, its behaviour, and its "so what is it, really?" is a proper brain twister.
It's not just old languages: in many languages today (notably Slavic ones), nouns decline differently with "1", "2", "3", "4" and, say, "5".
In many languages, including living ones (Lithuanian, Irish and Slovenian come to mind), there exists the concept of a grammatical number "dual", in addition to singular and plural.
Hence a wonderful piece of Soviet humour: A factory needs 5 fireplace pokers, but none of the workers knows the correct plural form for 5 fireplace pokers; not wanting to appear ignorant when they send their request to management, they request "3 fireplace pokers and 2 more". Some months later they receive the fireplace pokers with a note saying "here are 4 fireplace pokers and 1 more" -- because management didn't know the word either!
I'm a native Russian speaker (though I have been living in an English-speaking country for a while), yet it took me a few moments, and I compulsively consulted Wiktionary to check myself:
"five fireplace-pokers" would be "пять кочерёг", marked as 'irregular' in the table.
https://www.unicode.org/cldr/charts/46/supplemental/language...
grep for "two" (note: "two" means 'dual', not literally '2')
To an ancient Roman, tomorrow is 2 (ok, II) days from now. Today . . . one . . . tomorrow . . . two . . . two days!
> Because the first few numbers, especially 0, 1 and 2
Considering zero one of "the first few numbers", or even considering zero to be a number, or for that matter considering zero at all, are very recent concepts for humans.
A millennium ago, asking someone to count all the sheep in a field that had no sheep would have been like asking today for the sound of one hand clapping.
Already before the ancient Romans, Aristotle had made a classification of the parts of speech based on their meaning, as exposed by the kinds of questions to which they can provide answers (the so-called "Categories" of Aristotle). For some of his categories he coined new names, including the one that was later translated into Latin as "quantity", and he divided the quantities into discrete quantities (providing answers to "How many?") and continuous quantities (providing answers to "How much?"). According to Aristotle's definitions, "zero" was certainly a quantity, together with the other numbers and with other words that can express quantities.
I do not think that there is any preserved text from the Classical period of Ancient Greece in which it is stated whether they thought that zero belongs to what they called numbers (arithmoi) or not. In any case, whether they chose to call zero a "number" or not is less important than the fact that they understood perfectly the more general concept of "quantity", which included various subclasses, like numbers (what we call integer numbers) and measures (what we call real numbers), and to which zero also belonged.
As for the ancient Romans, the most frequent way of saying "some" in Latin, i.e. an unknown number that is at least 1, was "not zero". Therefore, regardless of whether they used the word "number" for zero or not, they had a clear concept of a set consisting of 0, 1, 2, 3 and all the other cardinal numbers, and they identified the subset 1, 2, 3 ... as the complement of the subset containing zero.
The ancient people were much less ignorant than some modern people believe.
In most modern grammar theories and already in some of the works of the Ancient Greeks and Romans, a group of words is recognized as having the same nature if any of those words can be substituted for another of them in a given sentence.
Given the sentences (which have one-to-one correspondences in the ancient languages):
I see zero sheep. (In Latin, "null-" is the word stem for zero.)
I see one sheep.
I see two sheep.
I see three sheep.
...
both a modern grammarian and also some of the ancient ones will conclude that zero, one, two, three ... have the same nature.
So zero was a concept that represented a certain quantity of sheep.
The Romans might have not used the word "number" in reference to zero, but there is no doubt that they understood the concept of a set whose members are 0, 1, 2, 3 and so on.
The proof is that the normal way of saying "some" in Latin, i.e. an unknown number that is at least 1, was to say "not zero", i.e. by identifying the subset 1, 2, 3, ... with the complement of the subset that contains 0 in the set of what now are called natural numbers.
While they did not have standardized words for concepts like set, subset, complement, membership or quantifiers, these concepts of the modern set theory were nevertheless well understood by the educated people in the Ancient World as they were studied within Logic a.k.a. Dialectic.
Is it? I ate all the apples. There are no more apples left. There are zero apples. Don’t we come face to face with nothing left all the time? There are zero guests left. There are zero episodes of the show to watch. There are zero days until Christmas. There is zero money in my wallet.
When there is zero of something, what is there zero of? That question becomes a lot more difficult to answer. In theory you could say there is zero of anything not observed, but this isn't a very precise or useful definition and isn't what's done in practice. The things we choose to actively describe there being zero of depends on knowledge of what existed previously, on cultural and linguistics norms, on context etc.
For example, I can say "there are zero apples in the basket", but this requires me to know that there previously existed apples in the basket as opposed to oranges, or that the addressee of my statement was expecting apples to be there. This knowledge wouldn't be required if there was at least one apple.
Using zero fundamentally requires more mental reasoning than using small positive integers.
"There are zero apples on that apple tree."
You're inventing an abstraction that is less familiar than the reality. If you know apples go in baskets, you've probably picked them. Or that oranges just don't grow around here.
> Using zero fundamentally requires more mental reasoning than using small positive integers.
I disagree. Positive integers don't exist in a vacuum either.
You always start with a context -- the thing you're counting -- and then you are given a number. Whether 3 or 0.
I see no greater conceptual complexity in either case. I think the flaw is in your sentence:
> This knowledge wouldn't be required if there was at least one apple.
Yes it is required. We're not comparing the statements "0 apples" and "1", we're comparing "0 apples" and "1 apple".
The problem with your comparison is that you're already starting from too much.
> You always start with a context
And where does that context come from? The true starting point is nothing. The context needs to arise from something. In order to even form a thought about apples and not any of the other thousands of concepts that you're aware of in the first place, you need a prompt.
In the case of an existing object, the prompt is seeing it. In the case of a non-existing object, the prompt is the combination of social and memory factors that I mentioned before. The former is simple, the latter is complex.
Another thing to consider: humans invented the number zero long after the natural numbers. Children learn to understand it after the natural numbers. The brain processes it differently, as mentioned by the article. There is overwhelming evidence that yes, zero is more complicated than the natural numbers for humans to think about.
If your reasoning ends with the conclusion that it's not actually more complicated, then that is in direct conflict with the evidence and it shows that you must be missing something.
Not necessarily, not at all. It might be, "hmm, do I have any apples left in the sack?" Or "did my daughter retrieve the ears of corn"? Sometimes we randomly come across an object, but in a great many cases (the majority?) we already have the context -- we know what we expect to see (or not see), or what we're looking for, or what we're investigating. Humans are goal-oriented creatures; we're not just responding to current sensory stimuli.
> humans invented the number zero long after the natural numbers
Only in the highly technical sense of a dedicated symbol for balanced financial accounts, or a digit placeholder in a positional number system. Languages all have everyday linguistic equivalents for zero like "none", "no", "aren't any", etc. These mean "zero" in the counting sense, precisely and exactly. There's no evidence at all that these came after something like the number 7.
> and it shows that you must be missing something
Or you yourself, in this case. You are unfortunately looking at an overly limiting definition of context, an overly limiting definition of zero, and you're missing important parts of linguistic history.
Level 1: there's something or there's nothing.
Level 2, if there's something: there's one or more of this thing, and it's not going to transform into the same amount of something different.
Also with stolen paintings: you have no painting, but you have evidence of its existence.
And lastly, in song: "can't buy me love."
And of course physics: T₀. As well as space flight: T-zero.
These are not equivalent statements, though. You simply learned at some early age that zero represents absence, but that is not a natural concept.
The following statement does not apply, though: "there are no oranges left", as this implies that there were oranges available at some time before.
"There are 0 apples left." is the answer to the question "How many apples are left?".
"There are 0 oranges." is the answer to the question "How many oranges are on the table?" (or "in the box" or wherever).
Wherever 0 appears in speech, it is the result of a counting or measuring operation, which provides the answer to a question, expressed or implied.
That counting or measuring operation could have had any other number as its result, instead of 0, which demonstrates that the nature of 0 is the same as that of any other cardinal number, i.e. it is a quantity (term introduced already by Aristotle, in his "Categories", where the various kinds of concepts and the words that name them were classified by the kinds of questions to which they provide answers).
However, this does not match the reality of how we learn about 0. Children learn 0 conceptually way after learning to count, because the concept is trickier to grasp.
Nevertheless, even if the concept of zero was understood later than the concepts of 1, 2, 3 ..., it was still understood in many places at least four to five thousand years ago, i.e. already by the time of the oldest writings that have been preserved, and thousands of years before the invention of the positional system of writing numbers, where the importance of zero greatly increased and where it came to require a dedicated graphic symbol.
"are" and "zero", meaning non-existence/non-"are"-ness, in the same sentence is seen as strange by many, yes, and that include me.
Negative numbers however. They have no good physical models, at least none that I know of. They are mostly just a tool to make accounting easier, or to denote one of 2 otherwise indistinguishable cases (direction on the x axis, direction of angular velocity, electric charge). But the teachers don't insist on confusing the kids with them.
This sentence no verb. Self referentially.
> The whole numbers were synonymous with the integers up until the early 1950s. In the late 1950s, as part of the New Math movement, American elementary school teachers began teaching that whole numbers referred to the natural numbers, excluding negative numbers, while integer included the negative numbers. The whole numbers remain ambiguous to the present day.
https://en.wikipedia.org/wiki/Integer
> In mathematics, the natural numbers are the numbers 0, 1, 2, 3, and so on, possibly excluding 0. Some start counting with 0, defining the natural numbers as the non-negative integers 0, 1, 2, 3, ..., while others start with 1, defining them as the positive integers 1, 2, 3, ... . Some authors acknowledge both definitions whenever convenient. Sometimes, the whole numbers are the natural numbers plus zero. In other cases, the whole numbers refer to all of the integers, including negative integers. The counting numbers are another term for the natural numbers, particularly in primary school education, and are ambiguous as well although typically start at 1.
https://en.wikipedia.org/wiki/Natural_number
What a shame.
In the context of the brain, Buddhism and various branches of Hinduism spent a lot of time pondering nothingness. Several meditative techniques involve meditating on this hollowness, or the ultimate absence.
I do know that zero the numerical placeholder is different from the Buddhist idea of "zeroness", but in the context of the brain it should be similar, I feel. Absence of something means cardinality of zero, but Buddhists literally asked this question about reality itself.
I've also read the linked article (https://www.quantamagazine.org/why-the-human-brain-perceives...), which discusses the discovery of "number neurons" and goes on to say that there might be no "number neurons" after all.
Declaring 2×2 = 0+2+2, 2×1 = 0+2, 2×0 = 0, while 2^2 = 1×2×2, 2^1 = 1×2, 2^0 = 1, seemed arbitrary.
What helped was learning about negative exponents and simplifying powers, so 2^0 = 2^2 × 2^(-2) = 2^2 / 2^2 = 1.
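The pattern is easy to check numerically; a small Python sketch (my own illustration, not from the comment above):

    # Stepping the multiple down by 1 subtracts the base, forcing 2*0 = 0;
    # stepping the exponent down by 1 divides by the base, forcing 2^0 = 1.
    times, power = 2 * 3, 2 ** 3
    for n in range(3, -1, -1):
        print(f"2*{n} = {times}   2^{n} = {power}")
        times -= 2   # undo one addition
        power //= 2  # undo one multiplication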
That said, I clearly still take issue with unintuitive interpretations of "nothing" https://stackoverflow.com/questions/852414/how-to-dynamicall...
Your teacher didn't tell you (or told you, but you didn't recognize it as something valuable and forgot it) that exponentiation by zero is a new definition on top of exponentiation by positive integers. Exponentiation by positive integers is defined somehow, and that definition says nothing about exponentiation by zero. It is a new definition, not something that you deduce.
The same holds for 0^0 or 0/0 (with some amount of confusion, lies, and hypocrisy).
Which I found incredibly silly once I learned about negative exponents and could then deduce the pattern.
Similarly, 0/0 became much more tractable to me once I learned L'Hôpital's rule https://en.wikipedia.org/wiki/L'H%C3%B4pital's_rule
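For instance, the textbook 0/0 form (a standard worked example, not taken from the thread), in LaTeX:

    % both sin x and x vanish at x = 0, so L'Hôpital's rule applies:
    \lim_{x \to 0} \frac{\sin x}{x} = \lim_{x \to 0} \frac{\cos x}{1} = 1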
2×0 = 2×1 + 2×(-1) = 2 - 2 = 0
and 2^0 = 2^1 × 2^(-1) = 2/2 = 1
Inverting the concept and having those patterns stem from a fundamental identity can wait until it's not seen as the mathematical equivalent of "Because I said so". Or, if I were using LaTeX:
2 \times 0 = 1_{+}
Because multiplication, being repeated addition, and exponentiation, being repeated multiplication, both behave the same way. When asked to repeat the operation zero times, they return the unit, or 1, of the underlying group, which is typically denoted as 1_{whatever}. However, the 1 of addition is 0 when using the standard notation for integers.
This has to do with rings and the relationship between the two identities of the two underlying groups. It ultimately stems from the distributive property connecting multiplication and addition.
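That "repeat the operation zero times, get the unit" behavior shows up in everyday code too; a tiny Python check (my own illustration):

    import math

    # An empty sum adds zero terms and returns the additive unit;
    # an empty product multiplies zero factors and returns the multiplicative unit.
    print(sum([]))        # 0
    print(math.prod([]))  # 1 (math.prod exists since Python 3.8)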
I think we invent abstractions because they allow us to reason about patterns in reality in a consistent way. The fact that division by zero is undefined is (to my mind) because it doesn't correspond to any useful pattern in the parts of reality we typically apply arithmetic to (accounting, estimation, etc).
What I like about this point of view is that it encourages thinking about what you are trying to accomplish, rather than fixating on formal rules. In some contexts "division by zero" can correspond to a meaningful pattern - look up the geometry of the projective line, for instance. In such cases you might want to include it in your model, rather than declaring it to be undefined as a convention!
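A quick sketch of that projective-line construction (standard textbook material, summarized by me rather than spelled out in the comment), in LaTeX:

    % Points of the real projective line are nonzero ratios,
    % with the ordinary reals embedded as x -> [x : 1]:
    \mathbb{RP}^1 = \{\, [a : b] \mid (a, b) \neq (0, 0) \,\}
    % Division becomes [a:b] / [c:d] = [ad : bc], so that
    % 1/0 = [1 : 0] = \infty is a legitimate point,
    % while 0/0 = [0 : 0] stays excluded.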
> it doesn't correspond to any useful pattern in the parts of reality we typically apply arithmetic to (accounting, estimation, etc).
You can of course re-define it, but then we aren't talking about the same thing any more. The operation of inverting multiplication is not defined for zero.
Similarly the fact that multiplication and division are inverses is a property of this model. Conceptually you can imagine splitting and copying groups of objects quite independently of one another (and which one you view as fundamental is really a post-hoc choice).
In general these days we mostly see clean mathematical abstractions because all the scaffolding has already been removed by mathematicians past. And as a result people come to believe that this is how mathematics is done. But always there is an initial period of exploration (which eventually gets forgotten) as people try to work out how to axiomatise the various systems they are interested in.
Contrast 0^0 with 0/0. Both of them are considered undefined, but the former is often defined "locally", e.g in a given textbook or article to have the value 1. That's not true for 0/0 because it's not been found to be useful.
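Programming languages tend to make the same local choice; in Python, for example:

    # Python adopts the common convention 0**0 == 1 ...
    print(0 ** 0)    # 1
    print(pow(0, 0)) # 1
    # ... while 0/0 stays undefined:
    # print(0 / 0)   # would raise ZeroDivisionError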
In measure theory it's often useful to augment your reals with positive/negative infinity. In projective geometry it's sometimes meaningful to allow division by zero (to counter one detail of your comment). In nonstandard analysis you would consider infinitesimals to be valid numbers, and in game theory you might consider stuff like the surreals, which are yet another way to view the familiar numbers, with different laws.
You can say we're talking about many different systems here, and that's true in a formal sense. I'm just pointing out that these formal systems come from somewhere, and mathematics is really about the thought-stuff underlying them. You should be willing to bend a rule here and there if it is truer to the concepts you care about - statements like "division by zero is undefined" should never be taken as absolutes.
(but of course this is just my personal philosophy of mathematics, take it with a pinch of salt)
Suppose you want the smallest delta such that

(1) abs(y_i - x_i) <= delta * abs(x_i) for each i.

This number can be computed as

(2) delta = max(abs(y_i - x_i) / abs(x_i)) over all i.
If you wish to allow zero entries in your vectors while keeping the equivalence between (1) and (2), you have to define 0 / 0 = 0.
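A minimal Python sketch of that convention (the names are mine, and returning infinity for a nonzero error against a zero entry is my own choice; the comment above only pins down the 0 / 0 = 0 case):

    import math

    def relative_error(x, y):
        """Smallest delta with |y_i - x_i| <= delta * |x_i| for every i."""
        def ratio(err, ref):
            if err == 0:
                return 0.0       # covers the 0 / 0 = 0 convention
            if ref == 0:
                return math.inf  # nonzero error vs. a zero entry: no finite delta works
            return err / ref
        return max(ratio(abs(yi - xi), abs(xi)) for xi, yi in zip(x, y))

    print(relative_error([1.0, 0.0], [1.01, 0.0]))  # ~0.01; the zero entries don't blow up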
Take out a scientific calculator and start dividing by smaller and smaller numbers:
1/0.9
1/0.09
1/0.009
...
1/0.000000000000000000009
So we can then ask: what happens if we keep doing this, step after step? It seems that we head toward infinity.
Thus the more you do this, the higher the returned number, and I think we've just rediscovered (I have to refresh my memory and hope I'm not saying something very wrong) a concept from calculus. Then we can use a new concept, limits: what happens when that 0.000...N is as close to zero as possible.
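In Python, the experiment above looks like this (purely illustrative):

    # 1/x grows without bound as the divisor shrinks toward zero.
    x = 0.9
    for _ in range(6):
        print(f"1 / {x:.10g} = {1 / x:.6g}")
        x /= 10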
It actually tends towards 1 no matter how small the divisor
It doesn't always go to 0, hence why it is undefined
(no, but since in English I can verb my adjectived nouns into all sorts of noun-y verb craziness, nothing can get nouned and verbed out of adjectiving all day long, because English.)
Zero was invented approximately a thousand years before the pendulum was scientifically developed for use in technology like clocks.
0 is a mathematical convention. We can't have 0 apples in a basket. We either have some apples or we don't have any.
If 0 is weird, negative numbers are also weird. Infinity is weird. Spaces with more than three dimensions are weird.
Complex numbers are weird from an algebraic perspective but not that weird from a geometric perspective.
The empty set is weird.
Everything that's abstract is weird.
To my mind, abstract algebra is weirder than calculus and mathematical analysis; that's why I enjoy calculus and mathematical analysis more.
But being weird and abstract can also be useful.
I just checked my basket, it contains 0 apples.
I wonder where zero is when I point a web cam at a screen and see the tunnel effect. Am I, the observer, the zero camera?