
In mathematics, a natural number can mean either an element of the set {1, 2, 3, ...} (the positive integers) or an element of the set {0, 1, 2, 3, ...} (the non-negative integers). The latter convention is standard in mathematical logic, set theory, and computer science.
Natural numbers have two main purposes: they can be used for counting ("there are 3 apples on the table"), and they can be used for ordering ("this is the 3rd largest city in the country").
Properties of the natural numbers related to divisibility, such as the distribution of prime numbers, are studied in number theory. Problems concerning counting, such as Ramsey theory, are studied in combinatorics.
History of natural numbers and the status of zero
The natural numbers had their origins in the words used to count things, beginning with the number 1.
The first major advance in abstraction was the use of numerals to represent numbers. This allowed systems to be developed for recording large numbers. For example, the Babylonians developed a powerful place-value system based essentially on the numerals for 1 and 10. The ancient Egyptians had a system of numerals with distinct hieroglyphs for 1, 10, and all the powers of 10 up to one million. A stone carving from Karnak, dating from around 1500 BC and now at the Louvre in Paris, depicts 276 as 2 hundreds, 7 tens, and 6 ones; and similarly for the number 4,622.
A much later advance in abstraction was the development of the idea of zero as a number with its own numeral. A zero digit had been used in place-value notation as early as 700 BC by the Babylonians, but they omitted it when it would have been the last symbol in the number.[1] The Olmec and Maya civilizations used zero as a separate number as early as the 1st century BC, apparently developing it independently, but this usage did not spread beyond Mesoamerica. The concept as used in modern times originated with the Indian mathematician Brahmagupta in AD 628. Nevertheless, medieval computists (calculators of Easter), beginning with Dionysius Exiguus in 525, used zero as a number without a Roman numeral to write it; instead, they employed nullus, the Latin word for "nothing".

The first systematic study of numbers as abstractions (that is, as abstract entities) is usually credited to the Greek philosophers Pythagoras and Archimedes. However, independent studies also occurred at around the same time in India, China, and Mesoamerica.
In the nineteenth century, a set-theoretical definition of natural numbers was developed. Under this definition, it was convenient to include 0 (corresponding to the empty set) as a natural number. Including 0 is now the common convention among set theorists, logicians, and computer scientists. Many other mathematicians also include 0, although some keep the older tradition and take 1 to be the first natural number. The set of natural numbers with 0 included is sometimes called the set of whole numbers, while the term counting numbers more often refers to the set beginning with 1.
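For example, in the von Neumann construction standard in set theory, each natural number is identified with the set of all smaller natural numbers, so 0 is the empty set:

0 = ∅
1 = {0} = {∅}
2 = {0, 1} = {∅, {∅}}
3 = {0, 1, 2} = {∅, {∅}, {∅, {∅}}}

and, in general, the successor of n is n ∪ {n}, so that each natural number n is a set with exactly n elements.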
