
Magic Number of Pi

The ratio of a circle’s circumference to its diameter is known as pi. In other words, for any circle, dividing the circumference (the distance around the circle) by the diameter always gives the same number. Pi does not change depending on how big or small the circle is. Pi is frequently represented by the Greek letter π and is pronounced “pie,” exactly like the sweet treat.
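
As a rough illustration of this claim, the short Python sketch below (the function name and step count are ours, not from the article) estimates a circle’s circumference by adding up many tiny straight segments along the curve x² + y² = r², with no use of pi anywhere, and then divides by the diameter. The ratio comes out close to 3.14159 for every radius tried.

```python
import math

# Approximate the circumference of a circle of a given radius by summing
# the lengths of many short chords along the upper half of the curve
# x**2 + y**2 == r**2, then double it and divide by the diameter.
def circumference_over_diameter(radius: float, steps: int = 200_000) -> float:
    xs = [-radius + 2 * radius * i / steps for i in range(steps + 1)]
    upper = [(x, math.sqrt(max(radius * radius - x * x, 0.0))) for x in xs]
    half = sum(math.dist(upper[i], upper[i + 1]) for i in range(steps))
    return (2 * half) / (2 * radius)   # full circumference / diameter

for r in (0.5, 1.0, 7.3):
    # every radius gives (approximately) the same ratio, about 3.14159
    print(r, circumference_over_diameter(r))
```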


Mathematicians have been working for thousands of years to deepen their grasp of pi, sometimes by computing its value ever more precisely. For practical calculations, ancient civilizations such as the Egyptians and the Babylonians needed reasonably accurate estimates of π. Around 250 BC, the Greek mathematician Archimedes developed a method to approximate π with arbitrary accuracy. Using geometrical methods, Chinese and Indian mathematicians in the fifth century AD estimated π to seven digits and five digits, respectively. About a millennium later, formulas based on infinite series were discovered for computing π. In 1706, the Welsh mathematician William Jones was the first to use the Greek letter π to denote the ratio of a circle’s circumference to its diameter.
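
To make Archimedes’ idea concrete, here is a brief Python sketch (our own illustration, not his original computation): a regular polygon is inscribed in a circle of radius 1, the number of sides is repeatedly doubled, and the polygon’s perimeter divided by the diameter approaches π from below.

```python
import math

def inscribed_polygon_pi(doublings: int) -> float:
    """Archimedes-style lower bound for pi from an inscribed polygon."""
    sides = 6        # start from a regular hexagon; each side has length 1
    side_len = 1.0
    for _ in range(doublings):
        # side-length recurrence when the number of sides is doubled
        side_len = math.sqrt(2.0 - math.sqrt(4.0 - side_len * side_len))
        sides *= 2
    return sides * side_len / 2.0   # perimeter / diameter

for k in range(6):
    print(k, inscribed_polygon_pi(k))
# the estimates 3.0, 3.105..., 3.132..., ... approach 3.14159...
```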

Soon after calculus was developed, hundreds of digits of π could be calculated, which was sufficient for all real-world scientific calculations. However, new methods developed by mathematicians and computer scientists in the 20th and 21st centuries, along with rising computing power, have extended the decimal representation of π to many trillions of digits. These computations are driven both by the development of efficient algorithms for evaluating numerical series and by the human desire to set new records. The massive calculations involved have also been used to put supercomputers to the test.
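
As a small, hedged example of the series-based approach (far simpler than the record-setting algorithms), the Python sketch below evaluates John Machin’s 1706 formula, pi = 4·(4·arctan(1/5) − arctan(1/239)), using arbitrary-precision Decimal arithmetic; the function names are ours.

```python
from decimal import Decimal, getcontext

def arctan_recip(x: int, digits: int) -> Decimal:
    """arctan(1/x) from its alternating power series, to about `digits` digits."""
    getcontext().prec = digits + 10               # a few guard digits
    power = Decimal(1) / x                        # holds 1 / x**(2k+1)
    total = power
    x2, k = x * x, 1
    cutoff = Decimal(10) ** -(digits + 5)
    while power > cutoff:
        power /= x2
        term = power / (2 * k + 1)
        total += -term if k % 2 else term         # alternating signs
        k += 1
    return total

def machin_pi(digits: int) -> Decimal:
    """pi via Machin's formula: pi = 4 * (4*arctan(1/5) - arctan(1/239))."""
    getcontext().prec = digits + 10
    pi = 4 * (4 * arctan_recip(5, digits) - arctan_recip(239, digits))
    getcontext().prec = digits
    return +pi                                    # round to the requested precision

print(machin_pi(50))
# 3.1415926535897932384626433832795028841971693993751
```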

Pi is known to be an irrational number, which means that its decimal digits never terminate and never settle into a repeating pattern. Even so, mathematicians have consistently shown an interest in computing the digits of pi. Before computers, fewer than 1,000 digits of pi had been calculated, even though some people spent much of their lives trying. The race accelerated in 1949, when a machine calculated about 2,000 digits. Millions of digits have since been computed, with a supercomputer at the University of Tokyo holding the record (as of September 1999) with a calculation of 206,158,430,000 digits.

================

Mizanur Rahman
Lecturer
Department of Mathematics
National Institute of Science & Technology
