Computers are used in information technology (IT) to generate, process, store, retrieve, and exchange many types of data and information.[1] Unlike personal or recreational technology, IT is primarily used in the context of commercial operations.[2] IT is considered a subset of information and communications technology (ICT).[3] An information technology system (IT system) is typically an information system, a communications system, or, more specifically, a computer system, complete with all peripheral devices, software, and hardware, operated by a limited group of IT users.
The term "information technology" in its modern sense first appeared in a 1958 article published in the Harvard Business Review; authors Harold J. Leavitt and Thomas L. Whisler noted that although humans have been storing, retrieving, manipulating, and communicating information since the development of the earliest writing systems[4] "The name of the new technology has not yet been given. It will be referred to as information technology (IT)." [5] Three areas make up their definition: processing procedures, the use of statistical and mathematical methods in decision-making, and the computer modelling of higher-order thinking. [5]
The phrase is frequently used as a synonym for computers and computer networks, but it also refers to other means of information dissemination, including telephones and television. Information technology is linked to a number of economic goods and services, such as computer hardware, software, electronics, semiconductors, the internet, telecom equipment, and e-commerce.[6][a]
There are four different phases of IT development that may be identified based on the storage and processing technologies used: pre-mechanical (3000 BC – 1450 AD), mechanical (1450–1840), electromechanical (1840–1940), and electronic (1940 to present).[4]
Information technology is a subfield of computer science, which can be characterised as the comprehensive study of procedure, structure, and the processing of diverse types of data. As this discipline continues to advance around the globe and gain prominence, computer science-related courses are beginning to appear in K–12 education. However, questions have been raised about the fact that most institutions do not offer advanced placement courses in this field.[8]
History of Computer Technology
Before the 1950s, discussions and early ideas about computer circuits and numerical calculations first took shape at the Massachusetts Institute of Technology (MIT) and Harvard University. Over time, the field of computer science and information technology grew more complex and became able to process more data, and scholarly writings from many groups began to appear in publications.[9]
Alan Turing, J. Presper Eckert, and John Mauchly are regarded as some of the key pioneers of early computing in the mid-1900s. Although they deserve credit for many innovations, the majority of their efforts were devoted to creating the first digital computer. Alongside that work, discussions of issues such as artificial intelligence began to emerge, as Turing started to question what the technology of the period could achieve.[10]
Since ancient times, tools have been employed to facilitate computing, most likely at first in the shape of a tally stick.[11] The Antikythera mechanism, which dates to around the beginning of the first century BC, is generally regarded as the earliest known geared mechanism and mechanical analogue computer.[12] Similar geared devices did not appear in Europe until the 16th century, and the first mechanical calculator that could execute the four fundamental arithmetical operations was not created until 1645.[13]
Electronic computers, employing either relays or valves, began to appear in the early 1940s. The electromechanical Zuse Z3, finished in 1941, was the first programmable computer in history and, by contemporary standards, one of the first devices that may be regarded as a complete computing machine. Colossus, built to decipher German communications during the Second World War, was the first electronic digital computer. It was programmable but not general-purpose, being designed for only a single task. It also lacked the capacity to keep its programme in memory; instead, programming was done by modifying the internal wiring with plugs and switches.[14] The Manchester Baby, which executed its first programme on June 21, 1948, was the first distinctly modern electronic digital stored-program computer.[15]
Bell Laboratories' work on transistor development in the late 1940s made it possible to create a new generation of computers with significantly lower power requirements. The Ferranti Mark I, the first commercially available stored-program computer, had 4050 valves and consumed 25 kilowatts of power. In contrast, the University of Manchester's first transistorised computer, which was operational by November 1953, used only 150 watts in its complete form.[16]
Further advances in semiconductor technology include the metal-oxide-semiconductor field-effect transistor (MOSFET), developed by Mohamed Atalla and Dawon Kahng at Bell Laboratories in 1959; the integrated circuit (IC), developed by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor; and the microprocessor, developed by Ted Hoff, Federico Faggin, Masatoshi Shima, and Stanley Mazor at Intel in 1971. These important inventions led to the development of the personal computer (PC) in the 1970s and the emergence of information and communications technology (ICT).[17]
The phrase "information technology" had been redefined by the year 1984, according to the National Westminster Bank Quarterly Review: "The development of cable television was made possible by the convergence of telecommunications and computing technology (...generally known in Britain as information technology)." The phrase then starts to appear in publications for the International Organization for Standardization starting in 1990. (ISO). [18]
By the twenty-first century, technological advances had already altered the world through the accessibility of various online services. The workforce had undergone a significant transformation, with thirty percent of American workers already employed in this field. There were 136.9 million users personally connected to the Internet, or 51 million households.[19] Beyond the Internet, new types of technology were also being introduced across the globe, improving efficiency and making everyday tasks easier.
As technology revolutionized society, millions of processes could be completed in seconds. Innovations in communication were also crucial, as people began to rely on the computer to communicate over telephone lines and cable. The introduction of email was a major development, since "companies in one part of the world could communicate by e-mail with suppliers and buyers in another part of the world..."[20]
Beyond personal use, computers and technology have also revolutionized the marketing industry, resulting in more buyers of products. In 2002, Americans spent more than $28 billion on goods over the Internet alone, while e-commerce a decade later produced $289 billion in sales.[20] As computers rapidly grow more sophisticated, people have become increasingly reliant on them during the twenty-first century.