Neysha Lopez
ENC 3241
Professor
March 10, 2013

Computer Programming: "The Beginning of a New Era"

Abstract

The purpose of this report is to examine the past, the present, and the possible future of computer programming. The report is organized in three major sections. Section one, "History," reviews the beginnings of computer programming and explains how programming began not with computers but with simpler devices such as mechanical calculators and difference engines. Section two, "Current Territory," examines the current status of computer programming, focusing on the two most widely used languages, C and Java. Section three, "Future," explores the possibilities for computer programming in the near and distant future, researching three major concepts expected to shape it: the use of natural language, user-defined languages, and nonprocedural and problem-defining languages.

History

From ancient mechanical calculators and percussion instruments to modern computers, computer programming is constantly evolving. Humans have been thinking about developing artificial intelligence for centuries. In 1887 Charles S. Peirce wrote, "Precisely how much the business of thinking a machine could possibly be made to perform, and what part of it must be

left to the living mind is a question not without conceivable practical importance" (Vardi 5). Although people at that time did not have the equipment to build such a machine, the idea was already forming in their minds. The first instrument in history to be "programmed" was the mechanical calculator. As shown in Figure 1-1, these calculators were mainly used by merchants and stores to simplify the arithmetic of totaling a purchase and giving change to a customer. They were programmed by hand, and unlike modern computers and calculators they performed only one function.

Figure 1-1 ("Burroughs Adding Machine")

Computers were first developed between the 1930s and the 1950s, when the real technological craze started. In 1822 Charles Babbage invented the difference engine: "the difference engine could only be made to execute tasks by changing the gears which executed the calculations. Thus, the earliest form of a computer language was physical motion. Eventually, physical motion was replaced by electrical signals when the US Government built the ENIAC in 1942" (Ferguson 1). The difference engine was used to compute polynomial functions, and unlike the mechanical calculator it could perform many kinds of arithmetic problems. When the difference engine was invented, its programming was manual, which meant a great deal of physical labor: every time a new polynomial function was needed, the difference engine had to be reset by hand. A few years later, when the US government built ENIAC, the first programmable general-purpose electronic digital computer, manual labor gave way to electrical signals programmed by operators. These signals were useful because the machine no longer had to be reset every time an arithmetic function was performed. This marked the beginning of the computer industry as a whole.

Ferguson states that in 1945 John Von Neumann developed two important concepts, the "shared-program technique" and the "conditional control transfer" (1). The shared-program technique held that computer programs should not each be programmed manually; instead, one complex piece of code should be reused automatically to program, and to reprogram, different computer programs. The "conditional control transfer" stated that a program should be able to follow commands in any order and that it should be logical; by logical, Von Neumann meant that it should follow simple commands like "if" and "then," for example, "if a > 1, then it is positive." The first major language, FORTRAN, was created in 1957 (1). This was a huge advancement in technology, though it was restricted to a fixed set of command words such as DO and IF. After these advances, many different and more advanced programming languages have been developed.

Current Territory

"More than 8,500 programming languages have been created and used since Grace Hopper's A-0 compiler" (Bergin 74). We have also seen families of languages flourish through time. These families consist of one original language followed by its "children," which are usually more advanced forms of the original. One example of this is C;

C is the "parent" of C++. As Paul Hyman notes, Dennis Ritchie created the C language between 1969 and 1973 (1). C is currently the most popular language because it is comparatively simple to learn and use. C was the first programming language I learned at the university, and although it is challenging, it feels simple once you have learned other languages.

The second most popular programming language today is Java. From racing to dress-up, most of us have at some point played a Java game on a computer, and Java is famous for its logo appearing while a game loads. "In 1991, a group of Sun Microsystems engineers led by James Gosling decided to develop a language for consumer devices (cable boxes, etc.)" (Carter 1). Most consumers were not interested in Java until the language shifted from consumer devices to web applications; that is when Java became popular, and it is now the number-two programming language. It is popular because it enables people to create whatever kinds of games interest them. Even Steve Jobs, the co-founder of Apple, relied on languages such as C and C++ in creating Apple's gadgets and applications.

Currently, computer programming is something most people must go to school to learn, or at least study in books; in the future, people are expected to be able to create simple programs without any experience. Computer programming is also one of the biggest job markets available. Every organization, from hospitals to fast-food chains, needs computer programmers to develop software, websites, games, and more. As shown in Figure 1-2, by 2020 there will be only 400,000 students studying computer science but more than 1.4 million jobs, that is, one million more jobs than students. Greater demand and a smaller supply of graduates mean better pay and more opportunities for the students who do graduate.


Figure 1-2 ("Schools")

Future

One quote that caught my attention, from an article called "My Compiler Does Not Understand Me," reads as follows: "The last time I looked, we had not even found a way to specify the exact layout of a protocol packet and the byte-endianess of its fields" (Kamp 53). It is funny to think how far science has come, from the discovery of fire to the invention of iPads, and yet we still complain about the little problems and glitches in our gadgets. In the last two centuries we have seen many advances in technology and in everyday life in general; every day, new technologies are discovered and introduced to the public. It is hard to say objectively where computer programming will go next. We already use it in hospitals to help monitor patients' hearts and blood pressure, so perhaps in the blink of an eye we will be using it in flying cars. There are many different opinions on where computer programming is headed, but most people agree on the following: "The major broad concepts that we should expect to see in the future are: (1) use of natural language (e.g. English), (2) user defined languages, (3) nonprocedural and problem defining languages…" (Sammet 608). The first concept is the use of

natural language. This would be a great advancement in programming because it would enable a program to understand any simple command the user gives it, in whatever language the user prefers. This would be a major advantage: every human being knows some natural language, but not everyone knows a specific programming language like Java or C.

The second concept is user-defined languages. Today a programmer must supply not only the code or input but also all of the mathematics and other information necessary to reach a solution. With user-defined languages, the program would be able to find the solution without asking the user for the extra mathematics or information. Programmers do not expect to ask a program "what is my salary" and have it answer with no information at all; still, being asked for less arithmetic and fewer details would save time, and headaches too.

The third concept is nonprocedural and problem-defining languages. These would be very useful because they would allow a program to fix its own errors, or at least suggest how to fix them. Currently a program tells the user where an error is, but the user has to debug and fix it without knowing exactly what is wrong. Most programmers spend more time debugging and fixing errors than actually writing code, so of the three future possibilities for computer programming, nonprocedural and problem-defining languages are the most important.

Conclusion

Programming dates back to the mechanical calculator. In 1822 Charles Babbage invented the difference engine, which was used to compute polynomial functions manually. The engine was manual at first, but programming later shifted to electrical signals when the United States built the ENIAC in 1942.
After this invention, in 1945, John Von Neumann created the "shared-program technique" and the "conditional control transfer" to help programmers write a single code for multiple programs and to make a program follow simple commands like "if" and "then." Currently there are thousands of programming languages available, but the two most popular are C and Java; they are the most popular because they are simpler, more advanced, and more user-friendly than most other languages. In the future, programmers expect natural-language (e.g., English) programming to flourish. They also expect to see programming languages that debug or fix themselves, which would help programmers save time so they can build more new programs and applications. We have come a long way in a short time in computer programming; however, we are nowhere near the finish line. We should expect to see many new inventions, not only in computer programming but in the world generally.

Works Cited

Bergin, Thomas J. (Tim). "A History of the History of Programming Languages." Communications of the ACM 50.5 (2007): 69-74. Business Source Premier. Web. 10 Mar. 2013.

"Burroughs Adding Machine - Class 5." Burroughs Adding Machine - Class 5. N.p., n.d. Web. 10 Mar. 2013.

Carter, Paul. "An Introduction to the Java Programming Language." Binghamton University, n.d. Web. 08 Mar. 2013.

Ferguson, Andrew. "A History of Computer Programming Languages." A History of Computer Programming Languages. Brown University, n.d. Web. 08 Mar. 2013.

Hyman, Paul. "Dennis Ritchie, 1941-2011." Communications of the ACM Dec. 2011: 21. Ergonomics Abstracts. Web. 07 Mar. 2013.

Kamp, Poul-Henning. "My Compiler Does Not Understand Me." Communications of the ACM 55.7 (2012): 51-53. Business Source Premier. Web. 10 Mar. 2013.

Sammet, Jean E. "Programming Languages: History and Future." Communications of the ACM, n.d. EBSCO Host. Web. 10 Mar. 2013.

"Schools Need to Start Teaching Programming." Famigo. N.p., n.d. Web. 09 Mar. 2013.

Vardi, Moshe Y. "Who Begat Computing?" Communications of the ACM 56.1 (2013): 5. Business Source Premier. Web. 09 Mar. 2013.
