Monday, November 10, 2008

INFORMATION SYSTEM


The term information system (IS) sometimes refers to a system of persons, data records and activities that process the data and information in an organization, including the organization's manual and automated processes. Computer-based information systems are the field of study of information technology, elements of which are also sometimes called an "information system", a usage some consider incorrect.


The study of information systems originated as a sub-discipline of computer science, in an attempt to understand and rationalize the management of technology within organizations. It has since matured into a major field of management that is increasingly emphasized as an important area of research in management studies, and it is taught at major universities and business schools around the world. Börje Langefors introduced the concept of "information systems" at the third International Conference on Information Processing and Computer Science in New York in 1965.[4]

Information technology is an important and malleable resource available to executives.[5] Many companies have created the position of Chief Information Officer (CIO), who sits on the executive board with the Chief Executive Officer (CEO), Chief Financial Officer (CFO), Chief Operating Officer (COO) and Chief Technical Officer (CTO). The CTO may also serve as CIO, and vice versa.

Monday, November 3, 2008

HISTORY OF COMPUTER



The history of computing is longer than the history of computing hardware and modern computing technology; it includes the history of methods intended for pen and paper, or for chalk and slate, with or without the aid of tables. The timeline of computing presents a summary list of major developments in computing by date.



The earliest known tool for use in computation was the abacus, thought to have been invented in Babylon circa 2400 BC. It was originally used by drawing lines in sand and placing pebbles on them. Abaci of a more modern design are still used as calculation tools today. The abacus was the earliest known calculating tool and the most advanced system of calculation of its time, preceding Greek methods by some 2,000 years.

In 1115 BCE, the South Pointing Chariot was invented in ancient China. It was the first known geared mechanism to use a differential gear, which was later used in analog computers. The Chinese also invented a more sophisticated abacus, known as the Chinese abacus, from around the 2nd century BCE.

In the 5th century BCE in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in 3,959 rules known as the Ashtadhyayi, which was highly systematized and technical. Pāṇini used metarules, transformations and recursions with such sophistication that his grammar has been described as having computing power equivalent to that of a Turing machine. Between 200 BCE and 400 CE, Jaina mathematicians in India invented the logarithm. From the 13th century, logarithmic tables were produced by Muslim mathematicians.

The Antikythera mechanism is believed to be the earliest known mechanical analog computer.[1] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.




Mechanical analog computer devices appeared again a thousand years later in the medieval Islamic world and were developed by Muslim astronomers, such as the equatorium by Arzachel,[2] the mechanical geared astrolabe by Abū Rayhān al-Bīrūnī,[3] and the torquetum by Jabir ibn Aflah.[4] The first programmable machines were also invented by Muslim engineers, such as the automatic flute player by the Banū Mūsā brothers[5] and the humanoid robots by Al-Jazari.[6] Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.[7][8]




After John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools.

None of the early computational devices were really computers in the modern sense, and it took considerable advancement in mathematics and theory before the first modern computers could be designed.

Monday, October 27, 2008

>>> RUBRIC MODEL <<<



Dominican Missal, c. 1240, with rubrics in red (Historical Museum of Lausanne)
Rubrics in an illuminated gradual of c. 1500

A rubric is a word or section of text which is written or printed in red ink to highlight it. The term derives from the Latin rubrica, meaning red ochre or red chalk,[1] and originates in medieval illuminated manuscripts from the 13th century or earlier. In these, red letters were used to highlight initial capitals (particularly of psalms), section headings and names of religious significance, a practice known as rubrication, which was a separate stage in the production of a manuscript.

Rubric can also mean the red ink or paint used to make rubrics, or the pigment used to make it.[2] Although red was most often used, other colours came into use from the late Middle Ages onwards, and the word rubric was used for these also.

* * Information Literacy * *


Several conceptions and definitions of information literacy have become prevalent. For example, one conception defines information literacy in terms of a set of competencies that an informed citizen of an information society ought to possess to participate intelligently and actively in that society (from [1]).

The American Library Association's (ALA) Presidential Committee on Information Literacy, Final Report states that, "To be information literate, a person must be able to recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information" (1989).

Jeremy Shapiro & Shelley Hughes (1996) define information literacy as "a new liberal art that extends from knowing how to use computers and access information to critical reflection on the nature of information itself, its technical infrastructure, and its social, cultural, and philosophical context and impact" (from [2]).

Information literacy is becoming a more important part of K-12 education. It is also a vital part of university-level education (Association of College and Research Libraries, 2007). In our information-centric world, students must develop these skills early so that they are prepared for post-secondary opportunities, whether in the workplace or in further education.

^^ Search Engine ^^





Search engines are very different from subject directories. While humans organize and catalog subject directories, search engines rely on computer programs called spiders or robots to crawl the Web and log the words on each page. With a search engine, keywords related to a topic are typed into a search "box." The search engine scans its database and returns a file with links to websites containing the word or words specified. Because these databases are very large, search engines often return thousands of results. Without search strategies or techniques, finding what you need can be like finding a needle in a haystack.
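To make the crawl-and-index description concrete, here is a minimal sketch in Python of the kind of database a search engine builds: an inverted index mapping each word to the pages that contain it. The page texts and the build_index and search names are hypothetical, purely for illustration; real engines crawl billions of pages and rank, stem, and filter results rather than simply matching words.

```python
# A toy inverted index: maps each word to the set of pages containing it.
# Illustrative only; a real search engine also ranks, stems, and filters.

def build_index(pages):
    """pages: dict mapping URL -> page text (stand-in for a spider's crawl)."""
    index = {}
    for url, text in pages.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(url)
    return index

def search(index, query):
    """Return the pages containing every keyword in the query (AND search)."""
    keywords = query.lower().split()
    if not keywords:
        return set()
    results = set(index.get(keywords[0], set()))
    for word in keywords[1:]:
        results &= index.get(word, set())
    return results

# Hypothetical pages, as a spider might have logged them.
pages = {
    "example.com/fcc": "FCC rules on wireless communications",
    "example.com/wifi": "home wireless network setup guide",
}
index = build_index(pages)
print(search(index, "wireless communications"))  # {'example.com/fcc'}
```

Requiring every keyword to match is what shrinks thousands of hits down to the relevant few, which is why the strategies below focus on choosing and combining keywords.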

To use search engines effectively, it is essential to apply techniques that narrow results and push the most relevant pages to the top of the results list. Below are a number of strategies for boosting search engine performance.

IDENTIFY KEYWORDS
When conducting a search, break the topic down into its key concepts. For example, to find information on what the FCC has said about the wireless communications industry, the keywords might be: FCC, wireless communications.
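As a rough sketch of how key concepts like these might be combined to narrow results, the following Python snippet joins concepts with AND and quotes multi-word phrases. The build_query name is hypothetical, and the exact operator syntax varies from engine to engine.

```python
# Illustrative only: join key concepts into a boolean query string.
# Many engines support AND and quoted phrases, but exact syntax varies.

def build_query(concepts):
    parts = []
    for concept in concepts:
        # Quote multi-word concepts so they are searched as exact phrases.
        parts.append(f'"{concept}"' if " " in concept else concept)
    return " AND ".join(parts)

print(build_query(["FCC", "wireless communications"]))
# FCC AND "wireless communications"
```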


Monday, October 6, 2008

History of the Internet


Prior to the widespread internetworking that led to the Internet, most communication networks were limited by their nature to allowing communication only between stations on the same network, and the prevalent computer networking method was based on the central mainframe model. In the 1960s, computer researchers J. C. R. Licklider and Robert W. Taylor pioneered calls for a joined-up global network to address interoperability problems. Concurrently, several research programs began to study principles of networking between separate physical networks, which led to the development of packet switching. These included the work of Donald Davies (NPL), Paul Baran (RAND Corporation), and Leonard Kleinrock's research programs at MIT and UCLA.

This led to the development of several packet-switched networking solutions in the late 1960s and 1970s, including ARPANET and X.25. Additionally, public-access and hobbyist networking systems grew in popularity, including UUCP and FidoNet. These were, however, still separate, disjointed networks, served only by limited gateways between them. This prompted the application of packet switching to develop a protocol for internetworking, in which multiple different networks could be joined into a super-framework of networks. By defining a simple common network system, the Internet protocol suite, the concept of the network could be separated from its physical implementation.

This internetworking grew into the idea of a global network that would be called the Internet, and adoption spread quickly as existing networks were converted to be compatible with it. The Internet first spread across the advanced telecommunication networks of the western world and then began to penetrate the rest of the world as it became the de facto international standard and global network. However, the disparity of growth led to a digital divide that is still a concern today.