Management Information Systems
A management information system (MIS) is an information system used for decision-making, and for the coordination, control, analysis, and visualization of information in an organization. The study of management information systems involves people, processes, and technology in an organizational context.
In a corporate setting, the ultimate goal of the use of a management information system is to increase the value and profits of the business. This is done by providing managers with timely and appropriate information allowing them to make effective decisions within a shorter period of time.
History:
While it can be argued that the history of management information systems dates as far back as companies using ledgers to keep track of accounting, the modern history of MIS can be divided into five eras, originally identified by Kenneth C. Laudon and Jane Laudon in their seminal textbook Management Information Systems.
- First Era – Mainframe and minicomputer computing
- Second Era – Personal computers
- Third Era – Client/server networks
- Fourth Era – Enterprise computing
- Fifth Era – Cloud computing
The second era (personal computers) began in the 1970s as microprocessors started to compete with mainframes and minicomputers, accelerating the decentralization of computing power from large data centers to smaller offices.
In the late 1970s, minicomputer technology gave way to personal computers and relatively low-cost computers were becoming mass market commodities, allowing businesses to provide their employees access to computing power that ten years before would have cost tens of thousands of dollars.
This proliferation of computers created a ready market for interconnecting networks and the popularization of the Internet. (The first microprocessor—a four-bit device intended for a programmable calculator—was introduced in 1971, and microprocessor-based systems were not readily available for several years.
The MITS Altair 8800 was the first commonly known microprocessor-based system, followed closely by the Apple I and II. It is arguable that the microprocessor-based system did not make significant inroads into minicomputer use until 1979, when VisiCalc prompted record sales of the Apple II on which it ran. The IBM PC introduced in 1981 was more broadly palatable to business, but its limitations gated its ability to challenge minicomputer systems until perhaps the late 1980s to early 1990s.)
The third era (client/server networks) arose as technological complexity increased, costs decreased, and end-users (now ordinary employees) required systems to share information with other employees within an enterprise. Computers on a common network shared information on a server, letting thousands and even millions of people access data simultaneously on networks referred to as intranets.
The fourth era (enterprise computing), enabled by high-speed networks, consolidated the original department-specific software applications into integrated software platforms referred to as enterprise software. These new platforms tied all aspects of the business enterprise together, offering rich information access encompassing the complete management structure.