Monday, November 10, 2008

INFORMATION SYSTEM



The term information system (IS) refers to a system of people, data records, and activities that process data and information in an organization, including both its manual and automated processes. Computer-based information systems are the field of study of information technology, elements of which are sometimes also called an "information system", a usage some consider incorrect.


The study of information systems originated as a sub-discipline of computer science, in an attempt to understand and rationalize the management of technology within organizations. It has since matured into a major field of management, increasingly emphasized as an important area of research in management studies, and is taught at major universities and business schools around the world. Börje Langefors introduced the concept of "Information Systems" at the third International Conference on Information Processing and Computer Science in New York in 1965. [4]

Information technology is an important and malleable resource available to executives.[5] Many companies have created the position of Chief Information Officer (CIO), who sits on the executive board with the Chief Executive Officer (CEO), Chief Financial Officer (CFO), Chief Operating Officer (COO) and Chief Technical Officer (CTO). The CTO may also serve as CIO, and vice versa.

Monday, November 3, 2008

HISTORY OF COMPUTER



The history of computing is longer than the history of computing hardware and modern computing technology; it includes the history of methods intended for pen and paper or for chalk and slate, with or without the aid of tables. The timeline of computing presents a summary list of major developments in computing by date.



The earliest known tool for use in computation was the abacus, thought to have been invented in Babylon circa 2400 BC. It was originally used by drawing lines in sand and moving pebbles along them. Abaci of a more modern design are still used as calculation tools today. The abacus was the earliest known computing aid and the most advanced system of calculation of its era, preceding Greek methods by roughly 2,000 years.

In 1115 BCE, the South Pointing Chariot was invented in ancient China. It was the first known geared mechanism to use a differential gear, which was later used in analog computers. The Chinese also invented a more sophisticated abacus, from around the 2nd century BCE, known as the Chinese abacus.

In the 5th century BCE in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in 3,959 rules known as the Ashtadhyayi, which was highly systematized and technical. Pāṇini used metarules, transformations and recursion with such sophistication that his grammar had computing power equivalent to a Turing machine. Between 200 BCE and 400 CE, Jaina mathematicians in India invented the logarithm. From the 13th century, logarithmic tables were produced by Muslim mathematicians.

The Antikythera mechanism is believed to be the earliest known mechanical analog computer.[1] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.




Mechanical analog computing devices appeared again a thousand years later in the medieval Islamic world and were developed by Muslim astronomers, such as the equatorium by Arzachel,[2] the mechanical geared astrolabe by Abū Rayhān al-Bīrūnī,[3] and the torquetum by Jabir ibn Aflah.[4] The first programmable machines were also invented by Muslim engineers, such as the automatic flute player by the Banū Mūsā brothers[5] and the humanoid robots of Al-Jazari.[6] Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.[7][8]
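The frequency analysis pioneered by Alkindus rests on a simple observation: a substitution cipher disguises letters but not how often they occur, so the most common ciphertext symbols likely stand for the most common plaintext letters. A minimal sketch in Python (the sample ciphertext and function name are illustrative, not from any historical source):

```python
from collections import Counter

def frequency_analysis(ciphertext):
    """Return each letter's relative frequency, most common first.

    In a simple substitution cipher, the most frequent ciphertext
    letters likely map to the most frequent plaintext letters
    (e.g. 'e', 't', 'o' in English).
    """
    letters = [c for c in ciphertext.lower() if c.isalpha()]
    total = len(letters)
    return {ch: n / total for ch, n in Counter(letters).most_common()}

# A Caesar-shifted sentence (shift of 3) still shows a skewed
# letter distribution that betrays the underlying plaintext.
sample = "Wkh txlfn eurzq ira mxpsv ryhu wkh odcb grj"
freqs = frequency_analysis(sample)
```

Here the most frequent ciphertext letter is 'r', which a cryptanalyst would match against a common plaintext letter; since the sample uses a shift of 3, 'r' indeed corresponds to plaintext 'o'.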




When John Napier introduced logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools.
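What made Napier's logarithms so useful for computation is the identity log(a·b) = log(a) + log(b): a laborious multiplication becomes two table lookups, one addition, and one inverse lookup. A small sketch of the idea, using modern floating-point logarithms in place of a printed table (the sample numbers are arbitrary):

```python
import math

# Logarithms turn multiplication into addition:
#   log10(a * b) = log10(a) + log10(b)
# so a hard product reduces to looking up two logs, adding them,
# and looking up the antilog of the sum.
a, b = 37.5, 8.4
log_sum = math.log10(a) + math.log10(b)
product = 10 ** log_sum  # antilog recovers the product, approx. 315.0
```

The same trick underlies the slide rule, where the addition of logarithms is performed mechanically by sliding two logarithmic scales against each other.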

None of the early computational devices were computers in the modern sense; considerable advances in mathematics and theory were required before the first modern computers could be designed.