
Information Technology

Computers are used in information technology (IT) to generate, process, store, retrieve, and exchange many types of data and information.[1] Unlike personal or recreational technology, IT is primarily used in the context of commercial operations.[2] IT is considered a subset of information and communications technology (ICT).[3] An information technology system (IT system) is typically an information system, a communications system, or, more specifically, a computer system, complete with all hardware, software, and peripheral devices, operated by a limited group of IT users.

The term "information technology" in its modern sense first appeared in a 1958 article published in the Harvard Business Review; authors Harold J. Leavitt and Thomas L. Whisler noted that although humans have been storing, retrieving, manipulating, and communicating information since the development of the earliest writing systems,[4] "the new technology does not yet have a single established name. We shall call it information technology (IT)."[5] Their definition comprises three categories: techniques for processing information, the application of statistical and mathematical methods to decision-making, and the simulation of higher-order thinking through computer programs.[5]

The phrase is frequently used as a synonym for computers and computer networks, but it also covers other means of information dissemination, including telephones and television. Information technology is linked to a number of economic goods and services, such as computer hardware, software, electronics, semiconductors, the Internet, telecom equipment, and e-commerce.[6][a]

There are four different phases of IT development that may be identified based on the storage and processing technologies used: pre-mechanical (3000 BC – 1450 AD), mechanical (1450–1840), electromechanical (1840–1940), and electronic (1940 to present).[4]

Computer science, characterised as the comprehensive study of procedure, structure, and the processing of diverse types of data, includes information technology as one of its subfields. As this discipline continues to advance around the globe and gain prominence, computer science-related courses are beginning to appear in K–12 education. However, questions have been raised about the fact that most institutions do not offer advanced placement courses in this field.[8]


History of Computer Technology


Before the 1950s, discussions and early ideas on computer circuits and numerical calculations first took shape at the Massachusetts Institute of Technology (MIT) and Harvard University. Over time, the field of computer science and information technology grew increasingly complex and able to process more data, and scholarly writings from many groups began to appear in publications.[9]

Alan Turing, J. Presper Eckert, and John Mauchly are regarded as some of the key pioneers of computer technology in the mid-1900s. Although they deserve credit for many innovations, the majority of their efforts were devoted to creating the first digital computer. Alongside this work, discussions of issues such as artificial intelligence began to come up, as Turing was starting to question the capabilities of the technology of the period.[10]

Since ancient times, tools have been employed to aid computation, most likely at first in the form of a tally stick.[11] The Antikythera mechanism, dating to around the beginning of the first century BC, is generally regarded as the earliest known geared mechanism and mechanical analogue computer.[12] Comparable geared devices did not appear in Europe until the 16th century, and the first mechanical calculator capable of performing the four basic arithmetical operations was not developed until 1645.[13]

Early in the 1940s, electronic computers using either relays or valves began to appear. The electromechanical Zuse Z3, completed in 1941, was the earliest programmable computer in history and, by contemporary standards, one of the first machines that could be regarded as a complete computing machine. Colossus, developed to decipher German messages during the Second World War, was the first electronic digital programmable computer. Although it was programmable, it was not general-purpose, having been designed for a single task. It also lacked the capacity to store its programme in memory; instead, programming was done by modifying the internal wiring with plugs and switches.[14] The Manchester Baby, which executed its first programme on June 21, 1948, was the first distinctly modern electronic digital stored-program computer.[15]

Bell Laboratories' work on transistor development in the late 1940s made it possible to build a new generation of computers with significantly lower power requirements. The Ferranti Mark I, the first commercially available stored-program computer, contained 4050 valves and consumed 25 kilowatts of power. By contrast, the University of Manchester's first transistorised computer, operational by November 1953, consumed only 150 watts in its final form.[16]

Other important advances in semiconductor technology include the metal-oxide semiconductor field-effect transistor (MOSFET), developed by Mohamed Atalla and Dawon Kahng at Bell Laboratories in 1959; the integrated circuit (IC), developed by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor; and the microprocessor, developed by Ted Hoff, Federico Faggin, Masatoshi Shima, and Stanley Mazor at Intel in 1971. These key inventions led to the development of the personal computer (PC) in the 1970s and the emergence of information and communications technology (ICT).[17]

By 1984 the phrase "information technology" had been redefined, as the National Westminster Bank Quarterly Review put it: "The convergence of telecommunications and computing technology (...generally known in Britain as information technology) made possible the development of cable television." The phrase then began to appear in publications of the International Organization for Standardization (ISO) starting in 1990.[18]

By the twenty-first century, technological advances had already altered the world thanks to the accessibility of various online services. The workforce underwent a significant transformation, with thirty percent of American workers already employed in this field. Some 136.9 million people, across 51 million households, were personally connected to the Internet.[19] Beyond the Internet, new types of technology were also being introduced worldwide, improving efficiency and making everyday tasks easier.

As technology revolutionized society, millions of processes could be completed in seconds. Innovations in communication were also crucial, as people began to rely on computers to communicate over telephone lines and cable. The introduction of email was particularly significant, since "companies in one part of the world could communicate by e-mail with suppliers and buyers in another part of the world..."[20]

Beyond personal use, computers and technology have also revolutionized the marketing industry, bringing in more buyers for products. In 2002, Americans spent over $28 billion on goods over the Internet alone; a decade later, e-commerce sales had reached $289 billion.[20] And as computers grow more sophisticated by the day, people have become increasingly reliant on them throughout the twenty-first century.


