Generally speaking, a computer is a programmable device used for calculation, data storage, and problem solving. In modern times the computer has been used to do all sorts of complex jobs, such as sending spacecraft to Pluto, communicating around the world, and performing medical surgery. But it hasn't always been this way. The first math calculations were probably figured by counting on the fingers. Tally marks on cave walls or in the sand may have been the next step. Let's take a very simple tour of the history of the computer to find out how we got where we are today.
The first computers were only used for basic math. The Chinese abacus, which was invented around 500 BC, allowed for calculations of addition, subtraction, multiplication, and division. It took its name from the Greek word “abax” (which passed into Latin as “abacus”), meaning calculating board. It was, in fact, a board with grooves that allowed the sorting and movement of beads or other small objects for the process of counting. Positional written numerals were not yet in common use, so this was a way to display and keep track of amounts. The actual math was done mentally, and the abacus simply provided a place to record the results.
This was probably the beginning of place value and was followed by the writing of numerals. Fractions and square roots could eventually be calculated on the abacus. The abacus changed many times in design and use from country to country and from century to century. When most of us think of an abacus, we probably imagine one that uses metal rods with beads permanently attached. The beads slide up or down the rods to make calculations. This became the most widely used version of the abacus. Learn more about the abacus at the UCMAS Mental Math School's website. There you will be able to see how calculations are performed on an abacus.
The Slide Rule
By the 1500s, the abacus was beginning to lose its widespread use. In time, abaci (the plural of abacus) were replaced by written arithmetic and algorithms. Around 1622 the slide rule was invented by William Oughtred. This device was a ruler with a sliding attachment that used logarithmic scales to make more complex calculations, such as multiplication and division, quick and easy. The slide rule was used by students and scientists up until the 1970s and even helped to put people on the moon.
In the 1800s, the United States needed a faster way to handle the huge amounts of data produced by census counts. Herman Hollerith is credited with creating a punch card system, called a tabulating machine, for just that purpose. The company he founded later merged with others to become IBM. Learn more about Herman Hollerith and his tabulating machine at Columbia University. Early computers required the use of gears and other mechanical parts to accomplish their tasks.
In the 1930s, J. V. Atanasoff in the United States, Konrad Zuse in Germany, and Bell Laboratories are all credited with creating computers of varying types that used electricity and switches to perform their tasks. These, too, were used mainly for mathematical calculations.
Many would say that the transistor was the greatest invention of the 20th century. It did help further the growth of computers. A transistor is a small device used to control electricity. John Bardeen, Walter Brattain, and William Shockley were awarded the Nobel Prize in Physics in 1956 for its invention. Find out more about the transistor at PBS.org. Transistors aided in the development of more complex computer systems.
While the first modern-day computer run by electricity is probably the huge ENIAC (Electronic Numerical Integrator and Computer), it was nothing like today's computers. It filled an entire large room, weighed about 30 tons, and cost roughly $500,000. It was fast by comparison to anything that had existed up to that point, performing thousands of calculations per second. By the time it was shut down, it had reportedly performed more calculations than had ever been done in history by any other method. The ENIAC was in operation from 1945 to 1955.
The first computer program ran on a computer in 1948 and consisted of 17 instructions for the machine. During the decades of the 1940s, '50s, and '60s, a number of variations of huge-sized computers ran programs for military purposes, engineering design, performing repetitive and complex mathematical calculations, keeping records and data, performing scientific calculations, calculating the movement of the planets and moons of our solar system, streamlining banking processes, and much more. Punch cards and reel-to-reel magnetic tapes were used for storing information. All the while, computers kept getting smaller and smaller. They were still big, but they no longer took up an entire room.
In the early 1950s Grace Hopper developed some of the first compilers, work that paved the way for the COBOL language, and in 1954 IBM began creating the FORTRAN language.
The Growth of Computers
While the size of a computer began to shrink, the computer's purpose and ability to perform tasks changed. The mouse, the computer chip, graphical user interfaces, additional languages, data storage methods, video games, Ethernet (for linking multiple computers), word processing, and much more came into the computer world. All of these changes happened over the next 20 to 25 years. By the late 1970s, personal computers from Apple, Radio Shack, and Commodore hit the market (with IBM following in 1981), and the average family could own a computer to use on a table in their family room.
In the 1960s computers were so large that it was not useful or convenient to share information between one computer and another. They were not portable, and since information was kept on huge reel-to-reel tapes, the only way to share data was to send the tapes through the U. S. Postal Service (the mail). The United States government was concerned that there should be a way for government agencies to share information between computers in various departments and to be able to share it quickly. So they created ARPANET (Advanced Research Projects Agency Network) for just this purpose. The ARPANET was only available to government agencies and the Department of Defense. This network linked computers and allowed the exchange of important information back and forth. It was such a success that soon additional networks were being created.
By 1983, TCP/IP (Transmission Control Protocol/Internet Protocol) had been created as a common set of rules so that computers from different manufacturers would be able to “talk” to one another. The government's ARPANET switched over to TCP/IP to be able to utilize all resources, and thus the internet was born. This allowed for the exchange of information between computers belonging to the government, businesses, and individual citizens.
The Basic Personal Computer
Schools began to use very simple computers for instruction in the 1980s. The video graphics were initially very crude and students were able to do very little except take tests and play very simple games. Not until the word processor and the internet became part of a computer system were they very useful. Video graphics also improved drastically as did storage systems. Today, most of us have a computer available to us at home, at work, or at school. Many of us have access to computers in all three places. The personal computer, also known as the PC, is really the step that made computers a household word.
The basic personal computer consists of a CPU, or Central Processing Unit. This is the working portion of the computer. Programs that tell the computer what to do run through this part of the computer. This is really quite fascinating, since electricity is either on or off. All the processor does is combine on and off signals to translate them into whatever you are asking the computer to do: add numbers, play a game, type words, or find the phone number for your favorite pizza delivery.
Connected to the CPU is the monitor. The monitor is a screen very similar to your television screen; in fact, some people connect their computers to their television screens to get a bigger view of what they are doing with their computer. The monitor uses pixels to display images: tiny dots that glow with instructions from the CPU to show your computer game or your report on penguins. The number of pixels determines the screen resolution, which is written as the width times the height (a related measure, dots per inch, describes how tightly packed the pixels are). The bigger the numbers, the sharper your images look.
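You can see what those resolution numbers mean with a little multiplication. The two resolutions below are just common examples, not ones from this article:

```python
# Multiply width by height to get the total number of pixels on a screen.
old_screen = 640 * 480     # an early PC screen: 307,200 pixels
hd_screen = 1920 * 1080    # a modern HD screen: 2,073,600 pixels

print(old_screen)                # 307200
print(hd_screen)                 # 2073600
print(hd_screen // old_screen)   # 6 -- the HD screen has over 6 times as many pixels
```

More pixels packed into the same space means finer detail, which is why the bigger numbers look better.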
A keyboard is also attached to your CPU. The standard English keyboard is arranged in what is known as the QWERTY layout. This is based on the first 6 letters in the top row being q, w, e, r, t, and y. Typewriters were designed this way to keep frequently used letter pairs apart so the mechanical arms would not jam. When computer keyboards came along, it seemed natural to continue this arrangement, since people who had learned to type could use the same fingering. Alternative layouts have been proposed, but QWERTY remains the standard.
The mouse was a great addition to the computer. It wasn't really needed until computers had video screens. Then some method of moving the cursor from where it is to where you want it to be became a necessity. The original mouse was known as a ball mouse and used a small ball inside a box that rolled on top of your desk. These got dirty very quickly from rolling along dusty desks, which impacted how well they functioned. The mouse was an ever-changing tool. Buttons for making selections, a ball rotated by the user's fingers instead of rolling on the desk, the optical mouse, the laser mouse, the built-in touchpad found in laptops, and cordless mice have all been part of the technology that allows a person to make choices while they compute. More recent monitors allow people to make choices directly on the touch screen using a finger or a special pen, skipping the mouse altogether.
People often connect additional equipment to their computers depending on their needs. These extra pieces are known as peripherals. They include speakers (so they can hear music or game sounds), printers (so they can print what they typed), scanners (so they can save photographs or documents to their computer), and storage devices (so they can store more data). All of these peripherals have also gone through changes over the years to make them better, more efficient, and more affordable to the public.
While the personal computer is used by many, laptops, tablets, and smartphones are quickly becoming the computers of choice. They are much more portable and allow users to take their computers with them everywhere. These systems can have limitations, but given how quickly computers have advanced so far, the day will come when those limitations are history too.
The interactive whiteboard is one of the most recent tools to be added to classrooms. This allows teachers to display information from a personal computer screen to a format large enough to show to an entire class. Tapping on the whiteboard works much the same as a touch screen on a laptop or personal computer. The system works through a projector that displays the computer information to a white screen that reacts to the touch of a pen or a person's finger. Many teachers find that having an interactive whiteboard in their classrooms allows them to share experiences with their classes in a whole new way.
Have you used a computer today? Odds are the answer is yes. Computers have all sorts of applications in the world today. Computers connect our phone calls, run traffic lights, heat and cool our homes, vacuum our floors, and help doctors diagnose disease. We use a computer every time we heat leftovers in the microwave. We can read magazines, books, and the news using computers. We schedule appointments, take photographs, listen to music, and share jokes using computers. We can find out what movies are playing in our local theater and then the theater shows us those same movies using other computers. The movies themselves were created using computer equipment.
Meteorologists can track the weather using computers. Airlines schedule flights and keep track of the planes using computers. Stores keep track of their products to know how much is available in their store and when they need to order more. The computer might even do the ordering. The clerk who scans these products when you purchase them is using a computer. And people pay for that same product using a card with a magnetic strip or a computer chip. Just about every type of business uses a computer in one way or another. Computers are used in communication, transportation, law enforcement, science, medicine, security, banking, education, farming, food production, manufacturing, power companies, water companies, and gas companies. Your family's car probably has more computer-driven functions than were used to land the first spacecraft on the moon. Even your video games were created by and run on computer systems. So yes, you have probably used a computer today, or at least benefitted by the use of computers in some way.
Storage for computer information has also changed over the decades. When modern computers were first created, data was stored on cards with holes punched in them; the locations of the holes encoded the data. Stacks of cards might be needed to accomplish even one simple task. Then reel-to-reel magnetic tape was used. Huge computers used the tape reels to store and retrieve information, and the good-sized rooms that held the reels were often known as libraries. When personal computers came on the scene, big reel-to-reel magnetic tape was replaced by cassette tapes, which could be held in the palm of a hand.
As computers began to get smaller, the storage methods also took on a smaller physical size, but a larger capacity for data. Measuring their storage capacity is done in bytes. A byte is the unit of measure for one character or space. As storage became better and bigger, the measures took on names like kilobytes and megabytes - meaning that a kilobyte (KB) is roughly 1 thousand bytes, and a megabyte (MB) is roughly 1 million bytes.
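A tiny program makes these units concrete. The sentence below is just an example, and we use the round numbers from the paragraph above (storage makers also count this way; computer memory is often measured with 1,024 instead):

```python
# One byte stores one character or space, so we can estimate
# how much plain text fits in a kilobyte or a megabyte.
KILOBYTE = 1000          # roughly 1 thousand bytes
MEGABYTE = 1000 * 1000   # roughly 1 million bytes

sentence = "The quick brown fox jumps over the lazy dog."
print(len(sentence))              # 44 -- about 44 bytes of plain text
print(MEGABYTE // len(sentence))  # 22727 -- that many sentences fit in one megabyte
```

So even the old 80 KB floppy disk described below could hold well over a thousand sentences of plain text.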
The original floppy disk was 8 inches square and held 80 KB of data. These shrank in size to 5 ¼ inches square, but held about 100 KB. They were called floppy disks because, well, they were actually floppy. Those disks were replaced by a 3 ½ inch disk that was also called a floppy, even though it was housed in a stiff plastic cover. Those covers even took on various colors for quick identification. These were more durable than the 5 ¼ inch disks and held 1.44 MB of data. Next came the Zip disks. The Zip disk could hold 100 to 750 MB of data. It was more rigid and a little thicker, and it often meant that the user needed to purchase an additional drive specially made for it, since few computers came with this size drive as part of the original equipment. The compact disc, or CD, which is still used today, replaced the floppy and Zip disks. A CD can hold about 700 MB of data. The flash drive is a smaller storage unit that can come in a variety of data sizes, and yet it is so small it can hang from a keychain. The flash drive is also called a thumb drive because it is about the same size as a person's thumb. Today we have SD cards, which are very tiny portable memory devices used in digital cameras, smartphones, audio players, and other small equipment. A person can even use cloud-based memory, a method of storing data in a system outside of the person's own physical space and using the internet to access this data. This is just a small description of the various memory devices used during the history of computers. To get a more detailed look at storage systems over the years, find what you want at computerhistory.org.
Computer Programs and Coding
The first computer program was used in 1948. But what exactly is a program? Computers are not intelligent. People think that computers are smart because so much can be done with one. But a computer is simply electricity being turned on or off in various patterns. The program tells the computer what pattern to use and when. Because of the “ons” and “offs” in the computer patterns, this is known as a binary system - binary means two. That sounds very simple, but the combination of “ons” and “offs” must be very exact in order to accomplish the task desired.
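If you'd like to peek at those on/off patterns yourself, Python's built-in `bin` function shows the binary form of any number, where 1 means “on” and 0 means “off”:

```python
# Print the first few numbers alongside their binary (on/off) patterns.
for number in range(6):
    print(number, bin(number))
# 0 0b0
# 1 0b1
# 2 0b10
# 3 0b11
# 4 0b100
# 5 0b101
```

The `0b` prefix is just Python's way of saying “this is binary.” Every number, letter, picture, and sound in a computer is ultimately stored as a pattern like these.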
A computer cannot think like a human, so instructions for things that we take for granted or that seem like common sense must be programmed into the computer as part of the task. For example, if your teacher tells you to sharpen a pencil, you understand that you will need to take a pencil to the pencil sharpener, put the pencil into the sharpener, and turn the crank, right? But inside that understanding, you use your common sense to tell you to stand up, walk one foot in front of the other, stop when you get a reasonable distance from the sharpener, hold the pencil in the correct hand, insert it (pointed side into the hole of the sharpener), turn the crank in the correct direction with the other hand while gently pushing the pencil further into the hole, pull the pencil out, check the point, repeat the sharpening if necessary, turn away from the sharpener, walk one foot in front of the other back to the teacher, stop, and gently hand the pencil over. Computers have no common-sense ability, so every step of a task must be explained. If even one small step is left out, that pencil could get sharpened on the eraser end - that is, if pencil sharpening were done by a computer. Even this example is probably not detailed enough. It leaves out the balancing you must perform to walk, using your eyes to get across the room and around the furniture, holding the pencil horizontally to put it into the sharpener, knowing how many times to turn the crank, knowing what a sharp pencil needs to look like, and so on. We all know that a computer is not going to sharpen a pencil - but some day one might!
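Here is a playful sketch of how a programmer might break that task into tiny, exact steps. This is made-up practice code, not a real robot program; the step names and the “5 turns makes it sharp” rule are our own inventions:

```python
# A pretend pencil-sharpening program, showing how every small step
# must be spelled out for a computer.
def sharpen_pencil(pencil):
    steps = []
    steps.append("stand up")
    steps.append("walk to the sharpener")
    # Without this check, the eraser end could go into the hole!
    if pencil["end_in_hand"] != "pointed":
        steps.append("turn the pencil so the pointed end faces the hole")
    steps.append("insert the pointed end into the hole")
    while not pencil["sharp"]:
        steps.append("turn the crank")
        pencil["crank_turns"] += 1
        if pencil["crank_turns"] >= 5:  # pretend 5 turns is always enough
            pencil["sharp"] = True
    steps.append("pull the pencil out and check the point")
    steps.append("walk back and hand the pencil to the teacher")
    return steps

pencil = {"end_in_hand": "eraser", "sharp": False, "crank_turns": 0}
for step in sharpen_pencil(pencil):
    print(step)
```

Notice how the `if` line guards against the eraser-end mistake from the paragraph above: leave it out, and the program happily sharpens the wrong end.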
Computer programs are written in special languages that the computer can understand; writing them is often called coding. Computer languages are specific to their use and purpose. This is because computers need to have everything defined and explained. A computer that installs bolts into a car part needs different instructions than a computer that runs a crossword puzzle game or vacuums the floor. For this reason, there are over 2,000 different languages for creating computer programs. Some languages can be used by students as young as elementary school, while others require years of college study to perfect. The program is often called software. A person who writes computer programs is known as a programmer. They can also be known by other terms, depending upon what their end result will be. Software developers, software engineers, software designers, and coders are other terms for people who write computer programs.
All computer programs have values known as constants and values known as variables. A constant is something that stays the same - for example, the background color of your screen stays the same each time you use it. A variable is something that can change - like the numbers that you type into a calculator when working a math problem. Variables can also be something in a program that tells you a fact that changes, such as the date or the day's temperature, or is often used when filling out a form - name, address, phone number, etc., because those change depending upon who fills out the form.
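In Python, the difference looks like this. The names and values below are just examples we made up (Python marks constants only by a naming custom, capital letters, rather than enforcing them):

```python
# A constant: a value the program treats as never changing.
# By convention, Python programmers write constant names in CAPITAL LETTERS.
BACKGROUND_COLOR = "blue"

# Variables: values that change while the program runs or between users.
name = "Avery"        # different for each person filling out a form
temperature = 72      # different from day to day

print("Background:", BACKGROUND_COLOR)
print("Hello,", name)
temperature = temperature + 3   # a variable can be given a new value
print("New temperature:", temperature)
```

The background color stays `"blue"` every time, while `name` and `temperature` are expected to hold something different each run.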
Coding is something that used to be done only by highly trained professionals, but languages have been created that can be used by nearly anyone. Students are learning to code in classrooms everywhere. Some schools have computer coding instruction during their school day and others have after-school computer coding instruction.
The most popular coding languages for students right now are Scratch and Tynker. Scratch was designed specifically for students ages 8 to 18 by students and staff at MIT (Massachusetts Institute of Technology), beginning in 2003. The entire language is visual and uses blocks that represent commands to get the computer to follow instructions. Kids can learn the purpose of variables, play sounds, animate characters, create interactive stories, and then share them with others. This useful tool gives kids the basics of how to put a program together and starts them on the path to becoming software engineers.
Tynker is another coding language that students may use to write computer code. Tynker works on a different concept: instead of just having kids play with code, it teaches coding skills through an online course designed specifically to instruct students. The visuals are colorful and kid friendly.
So what are you waiting for? Now is your chance to become a computer software engineer and designer. You can get started now while you are in elementary school and maybe it will be YOU who writes a famous computer program that sharpens pencils!! Good luck!!