Wednesday, 25 November 2009

Computer Networks V

CONNECTIVITY MEDIA.

Some type of media is required in order to connect network components. Various types of cables exist for this purpose; as with most hardware, their price is related to their performance. Two PCs can be connected quite simply and cheaply by using a null modem cable. At the upper end of the spectrum, wireless and even satellite connections are used by large corporations and the military.

The earliest cable to become widely used was coaxial cable (nicknamed "coax"). Because it is shielded and resistant to electrical noise, it has proven useful in factory settings. Twisted-pair cable, also called UTP (unshielded twisted pair), has replaced coax in most applications because it is cost effective. Though similar to telephone wire, UTP was slow to gain acceptance because of noise problems. Underwriters Laboratories rates UTP cable from Levels I through V based on performance; Levels I and II are suitable only for low-grade or slower applications.

Fiber-optic cable is the fastest and most expensive of the cable types. Fiber-optic technology has been shown to achieve speeds of several hundred gigabits per second (Gbps) or faster, although most commercial applications to date have settled for between 2.5 and 10 Gbps. Experts have theorized that multiplexing technology can push fiber-optic capacity into the terabits, or trillions of bits, per second (Tbps). For these reasons, it is frequently used for high-volume backbones connecting network segments. Another benefit of fiber-optic cable is that it is immune to electrical interference.

Wireless systems are also used for connecting workstations with the file server. Microwave dishes are among the oldest means of connecting computers over long distances, though they are limited to line-of-sight transmissions and can be affected by weather conditions. Depending upon frequency, microwave equipment can transmit up to 30 miles. Another option is satellite transmission, which has been used to transmit price changes among stores in national retail chains.

Networks also require connectors to interface computing devices with the connecting media. While mainframes usually have connectors built in, most PCs require the addition of a network interface card (NIC). Larger, more powerful computers require more expensive connections due to the cost of their high-performance microprocessors and support circuitry. Such devices often implement the protocol to which electronic messages on the network must conform.

Computer Networks IV

NETWORK DEVICES

The types of machines that can be connected to a network include PCs, intelligent workstations, dumb terminals, host computers, clients, and file and other types of servers. File servers control network activity such as printing and data sharing, and also enforce security. Important factors to consider in selecting a file server include its speed, processor performance, memory, hard drive capacity, and, most importantly, its compatibility with network software.

NETWORK COMPUTERS.

A corporate trend since the mid-1990s has been toward so-called network computers (NCs), a variation on the long-established notion of dumb terminals supported by a powerful central system. Spurred by advances in Internet technology, IT managers found that they could save on the high cost of buying and maintaining full-featured PCs for every desktop when only a handful of corporate applications were used, and these could conceivably be retrieved from (or run off) a central computer, the server. Advances in software and data portability, such as HTML documents on the Web and Sun Microsystems' platform-independent Java language, encouraged the idea that NC users could simply download whatever programs and files they needed from a central repository, rather than storing such information locally on each computer.

SERVERS.

Servers are computers that run software to facilitate various kinds of network activities; the software packages that enable such activities are sometimes also called servers. A single physical computer may host a number of server-related processes. The three main types of server functions are file servers, network servers, and printer servers. File servers can be run in either a dedicated or a nondedicated mode. Nondedicated file servers can be used as workstations as well, although workstation functions can take up much of the processor's capacity, resulting in delays for network users. Also, if a workstation program causes the file server to lock up, the entire network may be affected and data may be corrupted. One compromise for a small office is to use a nondedicated file server as a workstation for a light user. A disk subsystem can increase the performance of a file server in large network applications. Network servers are used to facilitate network activities, such as processing e-mail, while printer servers manage traffic on networked printers.

STORAGE AREA NETWORKS.

Highlighting the need for network storage space, particularly for critical system backups, has been the development of a relatively new set of network technologies known as storage area networks (SANs). SANs, which are high-speed networks of storage devices that can work in conjunction with any number of servers and other network devices, can be deployed as a solution to the inefficiencies of maintaining a host of separate disk subsystems. Although most companies of any size perform routine system backups, the process of backing up as well as restoring data can be slow and cumbersome—a competitive liability for companies that depend heavily on their systems being available 24 hours a day. SANs are used to reduce this liability and improve efficiency.

OTHER NETWORK DEVICES.

Connecting devices such as bridges, routers, and gateways are used to subdivide networks both physically and logically, to extend the range of cabling, and to connect dissimilar networks. Subdividing a network into segments is also useful for isolating faults. Repeaters simply extend the physical distance that network data can travel by receiving and retransmitting information packets; they do not provide isolation between the components they join. Connecting devices are classified according to the functional layer at which they operate.

Bridges operate at OSI layer two (also known as the data link layer; see Figure 2). They are used to isolate segments from a network backbone, to connect two networks with identical lower layers, and to convert one lower-level technology into another. They can also be configured to transmit only appropriate messages (filtering).

Routers operate at layer three (the network layer). Like bridges, they can be used to isolate network segments from a backbone, but unlike bridges, they can connect segments with different lower-layer protocols. Routing can also be performed in software, though usually not as quickly. "Brouters" are hybrids of bridges and routers that can operate at either layer two or layer three.

Gateways operate at layer four (transport layer) or higher. They are required for minicomputer or mainframe access from PCs and are much more complex and costly than other connecting devices. They are capable of converting data for use between dissimilar networks.
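The layer hierarchy described above can be summarized in a small sketch. The device names and layer numbers come from the text; the lookup table and selection helper are purely illustrative, not part of any real networking library:

```python
# Connecting devices classified by the OSI layer at which they operate,
# as described in the text. Higher-layer devices are more complex and
# more costly, so a sensible rule of thumb is to pick the lowest-layer
# device that meets the connection's needs.
OSI_LAYER = {
    "repeater": 1,  # physical layer: extends cabling distance only
    "bridge": 2,    # data link layer: isolates segments, filters frames
    "router": 3,    # network layer: joins segments with different lower layers
    "gateway": 4,   # transport layer or higher: converts between dissimilar networks
}

def simplest_device(required_layer: int) -> str:
    """Return the least complex device that operates at or above the
    layer a given connection requires (illustrative helper)."""
    candidates = [d for d, layer in OSI_LAYER.items() if layer >= required_layer]
    return min(candidates, key=lambda d: OSI_LAYER[d])

assert simplest_device(1) == "repeater"  # just extending cable range
assert simplest_device(2) == "bridge"    # filtering between identical networks
assert simplest_device(4) == "gateway"   # connecting dissimilar networks
```

The helper encodes the cost argument made above: a gateway could do a repeater's job, but the cheapest device that operates at the required layer is normally preferred.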

Computer Networks III

BENEFITS OF NETWORKS

Networks can allow businesses to reduce expenses and improve efficiency by sharing data and common equipment, such as printers, among many different computers. While printers can be shared in other ways, such as by carrying information on floppy disks from one PC to another, or by using manual or electronic data switches, networks can accommodate more users with less frustration. The power of mainframes or minicomputers can be used in harmony with personal computers. The larger machines can process bigger and more complex jobs, such as maintaining the millions of records needed by a national company, while PCs handle smaller jobs, such as word processing, for their individual users. Older equipment can also be rotated to less demanding jobs as workstations are upgraded. Many software programs offer network license agreements, which can be more cost effective than purchasing individual copies for each machine. The costs of implementing a network depend on issues of performance, compatibility, and whether value must be added to a turnkey system through additional programming or the addition of special components.

When all data and applications are coordinated through a single network, backup copies of data for all systems can be made more consistently than if the task were left to individual users. Updated software for all machines on a network can be installed from a single PC. A centralized system simplifies other aspects of administration, too. With the proper software, computer security can also be implemented more effectively on a network than across many individual hard drives. Access to files can be restricted to password holders, or it can be limited to inquiry-only access for public users. Generally, machines with single-user operating systems are more vulnerable than those protected by network security precautions.

Computer Networks II

DEVELOPMENT OF NETWORK TECHNOLOGY

Research facilities sponsored by the U.S. Department of Defense were among the first to develop computer networks. Perhaps the most famous example of such a network is the Internet, which began in 1969 as Arpanet, part of a project to link computers at four research sites. One of the most significant developments in these early networks was the concept of packet switching, which encodes data for transmission over networks into small chunks of information that each carry meta-information about where the data are coming from, where they are going, and how each piece fits into the whole. Packet switching, the basis of all modern networking, enables a transmission to be routed through any number of computers to get to its destination, and provides an efficient means of recovering lost information: if a packet is lost or corrupted, the networking software re-sends only that packet behind the scenes, rather than restarting the entire transmission.
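The idea can be illustrated with a minimal sketch: a message is split into packets that each carry source, destination, and sequencing metadata, so the receiver can reassemble them regardless of arrival order. The field names and the tiny packet size here are invented for the sketch; real datagrams carry far more metadata and are much larger:

```python
import random

PACKET_SIZE = 8  # bytes of payload per packet; real packets are far larger

def packetize(message: bytes, src: str, dst: str):
    """Split a message into packets, each self-describing: where it came
    from, where it is going, and how it fits into the whole."""
    chunks = [message[i:i + PACKET_SIZE] for i in range(0, len(message), PACKET_SIZE)]
    return [
        {"src": src, "dst": dst, "seq": n, "total": len(chunks), "data": chunk}
        for n, chunk in enumerate(chunks)
    ]

def reassemble(packets):
    """Restore the original message. Packets may be routed independently
    and arrive in any order; sequence numbers put them back together."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    if len(ordered) != ordered[0]["total"]:
        raise ValueError("a packet was lost; only that packet need be re-sent")
    return b"".join(p["data"] for p in ordered)

packets = packetize(b"packet switching in action", "host-a", "host-b")
random.shuffle(packets)  # simulate out-of-order delivery over different routes
assert reassemble(packets) == b"packet switching in action"
```

The `ValueError` branch mirrors the retransmission point in the text: because each packet is individually addressed and numbered, the receiver can detect exactly which piece is missing and request only that piece again.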

TECHNICAL STANDARDS.

Several of the field's defining advances occurred in the early 1980s. Coming on the heels of IBM's mid-1970s introduction of the Systems Network Architecture (SNA), a proprietary set of highly stable protocols for networking mainframes and mid-range systems, a few important industry-wide standards were reached that cleared the path for widespread implementation of networking. The first of these was the debut of the Institute of Electrical and Electronics Engineers' (IEEE) 802.x series of standards, which prescribed the technical specifications for various types of network data exchanges. The IEEE standards, which are updated and expanded periodically, are still in force today. Next, a common architecture model called the Open Systems Interconnection (OSI, see below) was adopted by the International Organization for Standardization (ISO). Although the OSI was only a broad model, it provided network developers with an internationally accepted classification of the different network functions and processes and how they ought to work together. The OSI and the IEEE standards were complementary.

COMMERCIAL IMPLEMENTATION.

The Ethernet LAN protocols both influenced the formation of technical standards and became the most widespread embodiment of those standards. Ethernet was pioneered in the late 1970s by Xerox at its famous Palo Alto Research Center (PARC), with assistance from Digital Equipment Corp. (later part of Compaq Corp.) and Intel Corp. Indeed, the experimental Ethernet was the model on which the original IEEE standard was based, and Ethernet quickly became (and still is) the most common commercially produced LAN protocol.

Ethernet employs several hardware standards for various bandwidths and device connections, but it is perhaps best characterized by its use of a protocol called Carrier Sense Multiple Access with Collision Detection (CSMA/CD). CSMA/CD is essentially a set of rules for how competing devices can share finite network resources. Through this protocol a computer on the network can determine whether it can send data immediately or whether it must compete with another device for network services. A collision occurs when two devices attempt to use the same resource, and the CSMA/CD protocol provides a simple mechanism for resolving this contention: it halts the colliding operation (the one initiated last) and retries the transmission at specified intervals until it either succeeds or reaches a maximum number of attempts. If the maximum is reached, the operation may be aborted and data may be lost.

Ethernet—a series of widely used hardware/software protocols for local area networks
Local area networks (LANs)—networks that are confined to a single building or part of a building and that employ technology to capitalize on the advantages of close proximity (usually speed)
Metropolitan area networks (MANs)—networks that are accessed from multiple sites situated in a relatively concentrated area (within 50 km or so) and that function as a faster alternative to wide area networks
Nodes—individual computers on a network
OSI—Open Systems Interconnection model, a broadly defined international model for the hierarchy of data communications between networked computers
Packets—also called datagrams, these are measured pieces of information (usually ranging from 500 to 2,500 bytes in size) in a data transfer that are each separately addressed to their destination and reassembled into the full original message at the receiving end
Protocols—sets of rules dictating how hardware and software communicate with other devices
Storage area networks (SANs)—high-performance networks of storage/backup devices integrated with one or more primary computer networks
Topology—the structure of how networked computers are actually connected to each other and to other network resources
Wide area networks (WANs)—networks that are maintained over two or more separate buildings and use technologies that maximize the ease and cost-effectiveness of connections between distant locations (often at the expense of speed)
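The CSMA/CD contention procedure can be sketched in a few lines. The toy channel model, its collision probabilities, and the retry limit below are illustrative assumptions; real Ethernet adapters implement this in hardware with precise slot timing:

```python
import random

MAX_ATTEMPTS = 16  # retry limit before the operation is aborted (illustrative)

class ToyChannel:
    """A toy shared medium: a transmission collides with some probability."""
    def __init__(self, collision_chance=0.5, seed=42):
        self.rng = random.Random(seed)
        self.collision_chance = collision_chance
    def busy(self):
        return False  # assume an idle medium in this sketch
    def transmit(self, frame):
        # True means the frame got through without a collision.
        return self.rng.random() >= self.collision_chance

def send_with_csma_cd(frame, channel, seed=0):
    """Listen before sending; on collision, back off randomly and retry."""
    rng = random.Random(seed)
    for attempt in range(1, MAX_ATTEMPTS + 1):
        while channel.busy():
            pass  # carrier sense: wait until the medium is quiet
        if channel.transmit(frame):
            return attempt  # success: report how many tries it took
        # Collision detected: choose a random back-off delay before retrying.
        # Ethernet doubles the waiting range each attempt (binary exponential
        # back-off); a real adapter would idle for this many slot times here.
        backoff_slots = rng.randint(0, 2 ** min(attempt, 10) - 1)
    return None  # maximum attempts reached; the frame is dropped

assert send_with_csma_cd(b"frame", ToyChannel(collision_chance=0.0)) == 1
assert send_with_csma_cd(b"frame", ToyChannel(collision_chance=1.0)) is None
```

The two assertions at the end exercise the boundary cases the text describes: a quiet network succeeds immediately, while a hopelessly congested one eventually aborts and loses the data.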

Since its inception Ethernet has enjoyed regular, albeit less rapid, advances in speed parallel to those in microprocessing. The latest generation of Ethernet standards, finalized in late 1998, is Gigabit Ethernet. This Ethernet standard supports transmission of up to 1 billion bits of data per second, representing a hundredfold improvement over the original Ethernet, which carried data at 10 million bits per second (Mbps). Gigabit Ethernet followed an enhanced 100 Mbps standard from the early 1990s known as Fast Ethernet.

Computer Networks I

A computer network consists of two or more computing devices connected by a medium allowing the exchange of electronic information. These computing devices can be mainframes, workstations, PCs, or specialized computers; they can also be connected to a variety of peripherals, including printers, modems, and CD-ROM towers. Most networks are supported by a host of specialized software and hardware that makes these connections possible, including routers, bridges, and gateways, which help accommodate traffic between unlike systems.

Many different types of computer networks exist. Some, such as local area networks (LANs), metropolitan area networks (MANs), and wide area networks (WANs), are defined by their geographic layout and the differing technologies that support such layouts. LANs are by far the most common, and in most cases, the fastest. Networks may be public, such as the Internet; semi-public, such as subscription networks (including subscription-based Internet service providers and other content-based networks); or private, such as internal corporate LANs, WANs, intranets, and extranets. Most networks are private, but of course the relatively few public ones, like the Internet, support a very large user base. Networks may also be open, or linked to other networks, or closed, which means they are self-contained and do not allow connectivity with outside resources. Most modern corporate networks are somewhere in between; they often allow access to the outside, but tightly restrict access from the outside. "Open" can also describe whether network technology is based on widely accepted standards that multiple hardware/software vendors support, versus a closed or proprietary system that is dependent on a single developer (or very few).

Computers and Computer Systems VII

RECENT TRENDS IN COMPUTER SYSTEMS

OPEN SYSTEMS.

The most significant trend in computer systems, aside from the expanding capabilities of the devices that make up those systems, has been the growth of compatibility between software and hardware products from unrelated suppliers. Formerly, all components of a computer system originated from the same manufacturer, and there were no guarantees that these would be compatible with similar components from other sources. A related practice, common in the more distant past though it still occurs today, was designing software that could run on only one manufacturer's computers, or even on one particular model. Open standards in operating systems and CPU instructions have done much to combat such limitations.

Open systems tend to be more cost-efficient and easy to manage, for the buyer isn't dependent on a single vendor and can shop around for the best prices, options, and delivery terms for each piece of the system. When open standards are widely deployed, as in the Windows/Intel PC standard, or better yet, the emerging Internet standards, they can improve worker productivity by offering a familiar interface on different systems. Standards also help facilitate data exchange across companies, such as between customers and suppliers, and allow one company to integrate its system more easily with that of another company, such as in a merger or acquisition.

STORAGE AND MULTIMEDIA.

New storage devices and media (e.g., optical discs and removable storage devices) along with multimedia computing have been two strong development areas since the mid-1990s. In the storage sector, the 1990s saw a parade of newer, higher-capacity storage devices to support burgeoning storage requirements, critical system backups, and data portability. Hard disk capacity on PCs and workstations, and to a lesser extent on midrange systems, has grown vastly since the early 1990s, when it was common to find new PCs with as little as 100 megabytes of disk space. By 1999, the typical new PC was equipped with at least several gigabytes of storage space, and top-of-the-line models came with 20 or more gigabytes.

While hard drives were gaining capacity, new forms of storage emerged. At the start of the 1990s, CD-ROMs were becoming a popular add-on and began showing up as standard features on higher end systems. CD-ROMs, which store data optically and are read by a laser that scans the disc while it spins, offered benefits to both users and software providers because they had relatively large storage capabilities (approximately 650 megabytes of data) and were durable and cheap to use relative to their size (blank discs in the late 1990s cost less than $2 each at retail). Particularly as software applications grew dramatically in size, it became much more practical both for software publishers and users to install program files from CD-ROMs rather than a dozen or more floppy disks. CD-ROMs also led the way toward multimedia use of computers. At their most basic, CD-ROM drives could play conventional audio CDs provided that the user had a sound card and speakers. Many other kinds of software and data products also appeared on CD-ROM in the early and mid-1990s, including a host of database and information retrieval products for business users. Though less common, recordable CD devices were also popular alternatives for users who needed to save large amounts of data for storage or portability, as well as for software developers who wanted to share internally a quick working demonstration version of a program.

While the speed of CD-ROM drives increased steadily, the medium's capacity was limited in light of the needs of more advanced applications, particularly video. By the late 1990s, a successor technology called digital versatile disc (DVD-ROM) was introduced. DVD-ROMs had significantly greater capacity than CD-ROMs, at 4.7 gigabytes, and DVD readers were backward compatible with all of the older CD technology. DVD-ROM offered enough space to fit a full-length digital movie, spawning a new category in the entertainment markets. A related technology, DVD-RAM, was introduced in 1998 and promised double-sided capacities of up to 5.2 gigabytes (to start) in a rewritable format. Planned enhancements to DVD-RAM were expected to boost capacity to 17 gigabytes in the early 2000s.

Another important area of storage development lay in so-called removable storage media, including Iomega Corp.'s Zip and Jaz drives and similar devices. These high-capacity magnetic disk systems allowed users to store 100 megabytes (Zip) or a gigabyte or more (Jaz) on a single removable disk. Zip was the most popular removable-media format, with some 16 million U.S. users in 1999; second-generation Zip drives were designed to handle up to 250 megabytes per disk while retaining compatibility with the older format. Many of the drives were made as external add-on devices for desktop computers and thus could be shared by several users in an office if desired. More recently, several leading computer manufacturers have offered these drives as standard built-in equipment.

Finally, on a larger scale, more advanced storage systems for entire businesses are a topic of much interest to system administrators. With the proliferation of storage devices and storage needs, large companies have found they need sophisticated management techniques to coordinate enterprise-wide storage in a timely and cost-efficient manner. One key solution has been the development of storage area networks (SANs), which are networks of storage devices that link to other corporate computer systems. SANs provide a high degree of storage management power and reliability. Similarly, network storage management software can perform computer-selected archiving of unused files to free up space on high-traffic network file servers.

Computers and Computer Systems VI

COMPUTERS IN BUSINESS

OVERVIEW.

Computers are used in government agencies, nonprofit organizations, and households, but their impact has been greatest in business and industry. The competitive nature of business has created demands spurring continuous advancements in computer technology and systems design. Meanwhile, declining prices and increasing computing power have led a large percentage of all businesses to invest in computer systems.

Computers are used to process data in all areas of business operations:

  • product design and development
  • manufacturing
  • inventory control and distribution
  • sales and marketing
  • transaction processing
  • customer support
  • accounting and financial management and planning
  • personnel management
  • internal and external communications
  • data exchange with suppliers, customers, and government agencies (including tax filings)

SYSTEM DESIGN AND CONFIGURATION.

Computer systems may be designed for a specific industry's use, including all necessary software, and as such are called "turnkey" systems. Vendors that provide integrated computer systems include original equipment manufacturers (OEMs) and value-added resellers (VARs). VARs, as wholesalers, buy computers, software, and peripherals, often from separate suppliers, and configure them as ready-to-use systems. Alternatively, a business can have its computer system (or at least the software) custom-designed by a computer service firm. Increasingly, however, businesses have purchased their computers and other system components separately and installed them on their own, as computers have become more standard and compatible with other makes, and as corporations have built up knowledgeable in-house technology support staffs. Likewise, the trend in software has been toward customizing software that is based on widely used standards, e.g., Oracle-based database management systems, rather than embarking on completely new proprietary software ventures.

KEY APPLICATIONS.

The most common uses of a computer system are for database management, financial management and accounting, and word processing. Database systems are used to keep track of large amounts of changing information on such subjects as clients, vendors, employees, inventory, supplies, product orders, and service requests. Financial and accounting systems are used for a variety of mathematical calculations on large volumes of numerical data, whether in the record keeping of financial service firms or in the general accounting tasks of any business. Using spreadsheets and database management software, computers may be used by accounts payable, accounts receivable, and payroll departments, among others. In addition to merely processing and tabulating financial data, companies use computers to quickly analyze their cash flow, cost-efficiency, and other critical performance information.

Databases are also used to help make strategic decisions through the use of software based on artificial intelligence or other specialized tools. Database technology is increasingly being applied to storing human knowledge, experience, insights, and solutions to past problems within specific business fields. This is known as a knowledge base. Knowledge bases are frequently associated with expert systems, which are a special form of management decision support systems. The goal of such systems is to provide a decision maker with information that will help him or her make the best possible choices. These systems attempt to harness the knowledge and experience of the most highly trained individuals in a field, and pass this information along to everyone in the business who works in that field. This is increasingly important in complex or rapidly changing professional environments. Expert systems are used in regulatory compliance, contract bidding, production control, customer support, and general training, among other areas.

Computer systems increasingly are also being used in telecommunications. Whether over standard telephone lines, higher speed network cable, or fiber-optic lines, businesses routinely use computers for a host of internal and external communications functions over computer networks. These functions include voice and electronic messaging as well as data exchange. The tremendous growth of the Internet and World Wide Web has played a major role in advancing the communications side of computing.

Computers and Computer Systems V

NETWORK COMPUTERS AND INTERNET APPLIANCES.

The advent of commercial uses of the Internet and related technologies in the mid-1990s triggered a movement by corporate buyers away from conventional PCs and toward so-called network computers (NCs) and other specialized network appliances. In essence, these devices were cheaper, simpler computers that were optimized for network-based tasks such as Web browsing, e-mail, and so forth. The theory was that corporations were wasting a great deal of money and time by buying and maintaining full-featured PCs when some employees might need only the limited services of an Internet browser and a word processor. Also known as thin clients, these scaled-back computers relied on a network to supply them with most of their computing power, centralizing maintenance tasks such as software upgrades and reducing users' abilities to inadvertently corrupt software on their computers. Although estimates varied widely, some experts believed that companies could save up to 80 percent of the cost of buying and maintaining a new computer by installing an NC instead. Many corporate buyers have viewed NCs not as a replacement for their PCs, but as a supplement to them. In some cases, NCs have been bought to supplant older, terminal-based technology rather than PCs.

While some of the initial product offerings in this category received only a tepid welcome from business customers, by the late 1990s NCs appeared to be taking firm root in some markets. Although the overall NC market was estimated at only a few hundred thousand units as of 1998, some forecasters called for volume to rise to more than 2 million units by 2002. This was still only a small fraction of the projected shipments of PCs by that time, which were expected to be well over 100 million units.

SPEED AND PERFORMANCE

Computers are traditionally categorized by size and power; however, rapid advancements in speed and processing capabilities at relatively low costs have blurred many of these categories. The major determinants of computing power include (1) clock speed, (2) the amount of main memory (random access memory or RAM), (3) the size and architecture of the processor's memory caches, (4) the number and capacity of pipelines to feed data, (5) the number of processors the system has (for higher power systems), and (6) other features of the processor's design and its connection to the motherboard. In fact, the chip's connection has been one of the biggest impediments to speed, as the materials used often don't transmit data nearly as fast as the chip itself. No one of these factors guarantees a faster or more powerful computer, though. For instance, computer marketing has commonly focused on clock speed, measured in megahertz, as an indicator of overall speed relative to other computers. While clock speed is important, if other aspects of the processor's architecture are less efficient, there may be no advantage to having a computer with a higher clock speed. Instead, what matters is how all the processor components work in tandem—even modifying how instructions are sent to the processor from software applications can influence speed (e.g., RISC versus CISC instruction architectures).
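A toy calculation makes the clock-speed point above concrete: throughput is roughly the clock rate multiplied by the instructions actually completed per cycle (IPC), so a lower-clocked chip with a more efficient design can outperform a higher-clocked one. The two chips and their figures below are invented purely for illustration:

```python
# Sketch of why clock speed (megahertz) alone does not predict performance.
# Throughput ~ clock rate x instructions completed per cycle (IPC); the IPC
# term captures pipeline efficiency, caches, and the other architectural
# factors the text lists. The numbers here are hypothetical.
def instructions_per_second(clock_hz: float, ipc: float) -> float:
    """Rough throughput estimate: cycles per second times work per cycle."""
    return clock_hz * ipc

chip_a = instructions_per_second(clock_hz=500e6, ipc=0.8)  # 500 MHz, less efficient design
chip_b = instructions_per_second(clock_hz=400e6, ipc=1.5)  # 400 MHz, more efficient design

# The chip with the lower clock speed does more useful work per second.
assert chip_b > chip_a
```

This is the same reason MIPS-style figures and raw megahertz fell out of favor as benchmarks: only the product of all the factors working in tandem predicts real performance.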

Computer performance was once regularly gauged by how many instructions the processor could handle per second, measured in million instructions per second (MIPS), and how many floating-point operations it could perform in a second (FLOPS), measured in megaflops or gigaflops. By the late 1980s, these were increasingly viewed as poor predictors of a computer's actual performance. The response has been to develop new measurements that better summarize a processor's abilities in the context of supporting components. One of the most widely known is a battery of performance tests developed by the Standard Performance Evaluation Corporation (SPEC), a consortium of most of the world's top computer manufacturers and related companies. The SPEC benchmarks are updated periodically to address current trends, and the results of SPEC tests on many computer models are made public.

Friday, 20 November 2009

Computers and Computer Systems IV

TYPES OF COMPUTERS

THE MINICOMPUTER.

Until the 1960s all computers were mainframes. They were large and costly systems intended to support many users, sometimes numbering in the hundreds or thousands. A new class of computers, the minicomputer, was introduced in December 1959 with the launch of the PDP-1 by Digital Equipment Corp. (DEC). However, the term "minicomputer" wasn't used until the introduction of the PDP-5 in 1963. These computers were smaller and cheaper than mainframes, and were also programmed to be used interactively, i.e., in real time, instead of in batch mode. Soon after, Hewlett-Packard Co. and Data General also introduced minicomputers, and eventually Wang, Tandem, Datapoint, Prime, and IBM followed suit.

The distinction between the better minicomputers and the lesser mainframes was largely one of marketing. Generally, minicomputers performed the same functions as mainframes but had less storage capacity, processing power, and speed. Traditionally minicomputers had 16-bit processors (as did PCs by the early 1990s), but later ones were 32-bit (as were PCs by the mid-1990s). By contrast, mainframes tended to have 32-bit and 64-bit processors.

Minicomputers became predominant in the 1970s and served a broad range of businesses. The most widely used included the DEC VAX, starting with the VAX-11/780 in 1977, a 32-bit computer that ran DEC's proprietary VMS operating system. The IBM AS/400, introduced in 1988, was one of the most popular minicomputers for small to medium workloads. More recently, the label "midrange systems" has been used more frequently to describe minicomputers.

Mainframes—high-capacity, expensive systems with centralized processing power, usually accessed by terminals or PCs running emulation software, that can run multiple programs at once. Examples: IBM System/390, Amdahl Millennium 2000.
Mini/Midrange Computers—synonymous terms for powerful computers that can host a range of simultaneous users, such as for network servers or small to mid-size database management systems. Examples: IBM AS/400, Hewlett-Packard 3000.
Microcomputers/PCs—desktop and portable computers with extensive built-in processing and storage capabilities. Examples: Compaq Presario, Dell Dimension, Apple Macintosh G3.
Workstations—powerful desktop machines, often containing multiple processors and running Unix or Windows NT, for resource-intensive business or scientific applications. Examples: IBM IntelliStation, Silicon Graphics O2 R10000, Sun UltraSparc.
Network Computers/Thin Clients—scaled-back PC-like devices with less on-board processing power and little or no local storage of programs or data, which are instead housed in a central computer and accessed through simple, universal software tools, such as Web browsers. Examples: IBM Network Station, Sun JavaStation.

THE MICROCOMPUTER.

The development of the microprocessor, a CPU on a single integrated circuit chip, enabled the development of affordable single-user microcomputers. The slow processing power of the early microcomputers, however, made them attractive only to hobbyists and not to the business market. The first microprocessor, the Intel 4004, introduced by Intel Corp. in 1971, could handle only 4 bits at a time. Intel later introduced 8-bit microprocessors, notably the 8008 and 8080. The Altair 8800 was introduced in 1974 and was the first commercially successful microcomputer, although in keeping with the interests of the hobby market, it was actually a kit. In 1977 the personal computer industry got under way with the introduction of off-the-shelf home computers from three separate manufacturers: Commodore Business Machines, Inc.'s PET; Apple Computer, Inc.'s Apple II; and Tandy Corp.'s TRS-80. These were each 8-bit computers that had a maximum of 64 kilobytes of memory and used only floppy disks for storage, instead of an internal hard disk. Popular home computers at the beginning of the 1980s included the Commodore 64 and 128, the Apple II, and the Atari 800. A software package from Digital Research, Inc. (later acquired by Novell Inc.) known as CP/M, which stood for Control Program for Microcomputers, was the dominant operating system of microcomputers at this time.
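The 64-kilobyte ceiling mentioned above follows directly from arithmetic: these 8-bit machines used processors with 16-bit address buses, so they could distinguish at most 2^16 memory locations.

```python
# Why the 8-bit home computers of the late 1970s topped out at 64 KB:
# their CPUs (the MOS 6502 in the PET and Apple II, the Zilog Z80 in the
# TRS-80) put 16 bits on the address bus.
address_bits = 16
addressable_bytes = 2 ** address_bits
print(addressable_bytes)          # 65536 distinct addresses
print(addressable_bytes // 1024)  # = 64 KB
```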

In 1979 one of the most important advances for microcomputers came not in hardware, but in software. That year Software Arts, Inc. introduced the world's first spreadsheet software, VisiCalc. Though crude by modern standards, VisiCalc provided a level of interactivity and productivity that was truly unique at the time. With VisiCalc, businesses for the first time had a reason to seriously consider buying a microcomputer, as even mainframes didn't allow users to sit down and churn out a series of user-defined calculations on the spot. VisiCalc was originally developed for the Apple II, but competing spreadsheets were soon developed for other systems, notably 1-2-3 by Lotus Development Corp. (later a unit of IBM).
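The essence of what made VisiCalc-style spreadsheets compelling, user-defined formulas that recompute automatically when inputs change, can be sketched in a few lines of Python. The cell names and formulas below are hypothetical illustrations, not anything from VisiCalc itself, and the sketch assumes formulas are listed in dependency order rather than doing real dependency analysis.

```python
# Minimal spreadsheet-style recalculation: input cells hold values,
# formula cells compute from earlier cells.
cells = {"A1": 100, "A2": 250}
formulas = {"A3": lambda c: c["A1"] + c["A2"],   # subtotal
            "A4": lambda c: c["A3"] * 1.08}      # subtotal plus 8% tax

def recalc(cells, formulas):
    """Recompute every formula cell; assumes dependency order."""
    values = dict(cells)
    for name, f in formulas.items():
        values[name] = f(values)
    return values

print(recalc(cells, formulas))  # {'A1': 100, 'A2': 250, 'A3': 350, 'A4': 378.0}
```

Change `A1` and every dependent cell updates on the next `recalc`, which is the interactivity that gave businesses their first real reason to buy a microcomputer.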

The familiar term "personal computer" (PC) was coined by IBM with the 1981 launch of its first microcomputer, which was an instant success and helped set new standards for the industry. This was a second major event, after the creation of productivity applications, that helped raise widespread interest in microcomputers, as IBM's name—and marketing channels—signaled legitimacy for business uses. The term PC came to be used for microcomputers generally, and it was also used specifically to designate computers that were compatible with the IBM standard, which was based on the Intel 80x86 chip and Microsoft Corp.'s MS-DOS (Microsoft Disk Operating System). IBM's choice of Microsoft as its operating system vendor was pivotal in the latter's ascent in the software industry, which later would reach monopoly status. By the late 1980s, MS-DOS had overtaken CP/M as the dominant operating system.

Meanwhile, one notable exception to the industry's coalescence around the IBM/Intel/Microsoft axis was Apple Computer's Macintosh line, introduced in 1984. The Mac was based on Apple's own proprietary operating system and graphical user interface called MacOS. The graphical interface attracted a limited but devoted following for the Mac, mostly from schools and businesses engaged in desktop publishing and other graphical work. It wasn't until the 1990s, with the release of advanced versions of Microsoft's Windows operating system, that IBM-compatible PCs would begin to approximate the ease of use Mac users enjoyed. However, new applications and enhancements for the Macintosh platform were slow to develop because Apple chose not to make its operating system available for other developers to freely license and adapt programs to—as Microsoft had done with DOS and Windows—preferring instead to keep tighter control over its product.

By the early 1990s IBM-compatible PCs, which by then were 16-bit or 32-bit machines, had become the fastest growing category of computers. This was largely fueled by business adoption of PCs. Their availability, ease of set-up and operation, and relative low cost brought computer technology to even the smallest of enterprises. The middle and late 1990s saw a race among computer makers to beef up the performance of their PCs and components to improve speed, networking abilities, and multimedia capabilities. By the end of the decade, top-level consumer and business PCs had processors with clock speeds upwards of 500 megahertz (MHz), up from just 66 or less five or six years before. RAM use also skyrocketed, averaging 32 to 64 megabytes, versus just 4 or 8 megabytes earlier in the decade.

WORKSTATIONS.

Workstations are a special class of high-end microcomputers. Some workstations, in fact, are indistinguishable from less powerful midrange computers. Workstations are typically used as standalone machines, albeit usually networked, for resource-intensive applications like computer-aided design, three-dimensional modeling and simulation, and other demanding engineering, scientific, and graphical tasks.

The workstation was first introduced in 1980 by Apollo Computer, which was later absorbed by Hewlett-Packard. It was Sun Microsystems, Inc., though, founded two years later, that soon dominated this market segment by producing affordable workstations from standard, off-the-shelf parts and using an adaptation of the versatile and powerful Unix operating system, which had been developed originally by Bell Laboratories. Workstation performance was further enhanced with the adoption of a microprocessor based on reduced instruction set computing (RISC) architecture, pioneered by IBM in 1986. RISC enabled faster processing by limiting the complexity and diversity of instructions the processor handled. (RISC would later reach the general PC market only in the advanced versions of Intel's Pentium chips and related competitors in the mid-1990s). Sun introduced its first RISC-based workstation, the SPARCstation, in 1989. Soon, other workstation manufacturers such as Hewlett-Packard followed Sun's lead by combining RISC hardware and Unix software. This emergent standard helped generate interest in workstations by making them more compatible and consistent across manufacturers.

Computers and Computer Systems III

THE FIRST THREE GENERATIONS OF COMPUTERS.

The first commercially successful computer was the UNIVAC I, a name derived from "universal automatic computer," introduced by Remington Rand Inc. in 1951. It was based on Mauchly and Eckert's second computer, the EDVAC (from "electronic discrete variable automatic computer"), after the researchers had sold Remington Rand the rights to their invention. EDVAC embodied the stored-program design described by mathematician John von Neumann (1903-1957), and thus became one of the first computers with stored programs. Again, there were other contenders for this title, including another machine developed in Britain. Some believe that the ability to store programs is a defining characteristic of computers, and thus earlier machines like ENIAC don't qualify as computers. However, like ENIAC, UNIVAC and other first-generation computers used vacuum tubes as their primary switching components, and memory devices were made from magnetic drums and thin tubes of liquid mercury. The U.S. Census Bureau took delivery in 1951 of the first UNIVAC machine, weighing 8 tons and consisting of 5,000 vacuum tubes, at a price tag of $159,000 (equivalent to over $1 million in current dollars). In 1954, General Electric Co. acquired the first UNIVAC for commercial purposes, using it to process payroll data.

Due to their high costs, the first computers were aimed at government and research markets, rather than general business and industry. This changed once International Business Machines Corp. entered the computer industry, for the company already had an established sales force and a commercial clientele through its business of leasing electric punched card tabulating machines. The IBM 650 computer, introduced in 1954, used existing IBM punched-card readers, punches, and printers, which its clients already had, and it was also affordable because businesses could lease it. The first business enterprises to rely on computers were those that needed to process large volumes of data, such as banks, insurance companies, and large retail operations. The 650 became the most widely used computer in the world by the end of the 1950s, with 1,800 systems installed, making IBM the world's leading computer manufacturer.

The invention of the transistor in 1947 provided a substitute for vacuum tubes in the second generation of computers. Consequently, the physical size of computer systems was reduced, and reliability improved markedly. Transistor-based computers weren't shipped in large quantity until 1959, however. Computers of this period all used magnetic-core storage systems for main memory. Some used magnetic drums or disks in addition to magnetic tape for auxiliary memory. Examples include the IBM 1401 and the Honeywell 800.

The third generation of computers, dating from the mid-1960s to the mid-1970s, used integrated circuits and large-scale integration, in which large quantities of transistors were put on a single wafer of silicon. They were also the first to use operating systems and database management systems. On-line systems were developed, although most processing was still done in batch mode. Examples of computers from this period included the IBM System/360 and 370, the Amdahl 470 V/6, the Control Data 6000 series, the Burroughs 5500, and the Honeywell 200. Amdahl Corp. was credited with reviving the industry in the 1970s by creating better and cheaper machines and spurring competitive developments by IBM and others.

Computers and Computer Systems II

HISTORICAL BACKGROUND

EARLY HISTORY.

Precursors to computers include the abacus, the slide rule, and the punched-card tabulating machine. The concept of programmable computing is attributed to the British mathematician Charles Babbage (1791-1871) in the mid-19th century, who took the idea of using punched cards to store programs from the automatic loom devised by Joseph-Marie Jacquard (1752-1834). Babbage worked on developing a machine that could perform any kind of analytical computation, not merely arithmetic. Automatic data processing was introduced late in the 19th century by statistician Herman Hollerith (1860-1929), who created an electric tabulating machine that processed data by sorting punched cards. The Hollerith machine was used by the U.S. Census Bureau to process its 1890 census.

There is considerable debate over when the first electronic digital computer was invented. Many in the United States have been taught it was the ENIAC, but there are also British and German claimants to the title based on chronology alone, and some also dispute whether ENIAC even fit the definition of a computer. Part of the confusion over the early British computer, called Colossus, came about because the British government kept it secret for nearly 30 years, until the mid-1970s. The German invention apparently didn't receive much attention because it was created under the Nazi regime in the midst of World War II.

Nonetheless, there is general consensus that many of the major early advances took place in the 1940s. One was completed in 1946 by John W. Mauchly (1907-1980) and J. Presper Eckert, Jr. (1919-1995) at the University of Pennsylvania. Named the electronic numerical integrator and computer, or ENIAC, it was based on designs for an unfinished special-purpose computer made a few years earlier by Iowa State University physics professor John V. Atanasoff. The ENIAC, funded by the U.S. Army to compute artillery shell trajectories, could perform an unprecedented 5,000 additions or 300 multiplications per second. Electronic processing took place through the use of 18,000 vacuum tubes, and the device was programmed manually by plugging wires into three walls of plug-boards containing over 6,000 switches. The tendency for vacuum tubes to burn out, coupled with the difficulties of operating it, made the ENIAC rather unreliable and labor intensive to use.

Thursday, 19 November 2009

Computers and Computer Systems I

A computer is a programmable device that can automatically perform a sequence of calculations or other operations on data without human aid. It can store, retrieve, and process data according to internal instructions.

A computer may be analog, digital, or hybrid, although most today are digital. Digital computers express variables as numbers, usually in the binary system. They are used for general purposes, whereas analog computers are built for specific tasks, typically scientific or technical. The term "computer" is usually synonymous with digital computer, and computers for business are exclusively digital.
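A quick illustration of the binary representation that digital computers rely on: every stored value, including text, is ultimately a number expressed in base 2.

```python
# Everything a digital computer stores is ultimately a binary number;
# even text characters are numeric codes.
for n in (5, 12, 255):
    print(n, "->", format(n, "08b"))
print("A", "->", ord("A"))   # the character 'A' is stored as the number 65
```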

The core of any computer is its central processing unit (CPU), commonly called a processor or a chip. The typical CPU consists of an arithmetic-logic unit to carry out calculations; main memory to store data temporarily for processing; and a control unit to control the transfer between memory, input and output sources, and the arithmetic-logic unit.

A computer as such is not fully functional without various peripheral devices for input and output and other functions. These are typically connected to the computer through cables, although some may be built into the same unit with the CPU, and occasionally non-cable technology, such as radio frequency, is used to link peripherals. Standard input peripherals include keyboards, mice, trackballs, scanners, light pens, microphones, and magnetic strip readers. Common output devices include monitors, printers, plotters, and speakers. Modems facilitate both input and output by communicating with other computers. Other types of peripherals include data storage devices like tape drives or optical disk drives for expanded storage capabilities or system backups.

Finally, for a digital computer to function automatically, it requires programs, or sets of instructions written in computer-readable code. This necessary component is commonly known as software.

The distinction between a computer and a computer system is largely academic, since the terms are used interchangeably. The familiar desktop computer set-up is technically a system, since nearly all such computers are used with at least a couple of the peripherals mentioned above. The notion of a standalone computer may be foreign to many people, but for business and technical purposes there are of course specialized computers that don't require such devices to serve their functions.

Information Technology Infrastructure

Information technology (IT) is the lifeblood of most businesses. It is used to fulfill administrative and production requirements, and it crosses all industries. Generally, the larger the enterprise, the greater the need for a sophisticated and professionally managed information technology infrastructure.

Computers are at the core of IT, but to say information technology is just computers is to ignore the complexity and diversity of the technologies businesses need to stay afloat in the 21st century. A more complete definition of IT is this: all of an organization's hardware and software for storing, retrieving, transmitting, and managing electronic information. Information in this context is in its broadest sense and includes images and digitized sound and video. Among the tools companies use to manage their information are:

  • personal computers, terminals, and workstations
  • network servers (including Internet) and other networking hardware
  • mainframes
  • scanners, printers, and other peripherals
  • all forms of software, including proprietary systems and site licenses for off-the-shelf packages.

These technologies serve many purposes in a business. Some are purely logistical or convenient and thereby save time and resources. Others are essential to the company's output or its competitive advantage. Examples of IT's benefits to different areas of an enterprise include:

  • timely and efficient delivery of products and services
  • higher sales through better understanding of customer behaviors
  • cost savings from fewer staff hours and reduced human or machine error
  • better resource planning through detailed, accurate, and timely financial information.

Medium to large corporations oversee their often substantial investments in these technologies through a specialized department that may be known simply as IT, or as information systems (IS) or management information systems (MIS). This area may be under the direction of a chief information officer (CIO), but many IT departments report ultimately to the company's chief financial officer (CFO).

ACQUISITIONS AND UPGRADES

Most large organizations must purchase and install new hardware and software on a regular basis. In order to do so effectively, IT managers must be familiar simultaneously with business needs and available technologies. Some purchases may be very routine; the corporation may only need additional units of existing devices it has already implemented, or, as is the case with software upgrades, there may only be one logical course of action. However, many IT-acquisition decisions demand strategic vision for the organization. IT decision-makers must be able to match present and future needs with technological solutions, often in the face of rapidly changing technologies and severe financial repercussions from choosing the wrong technology.

The acquisition process can be especially troublesome when custom software is being implemented. These projects are notorious for exceeding cost estimates and taking longer than planned. Moreover, custom software clients must be wary, after added time and expense, of whether the new system will serve all of the needs it was intended to satisfy—and without losing the essential strengths and capabilities of the system it is replacing.

SERVICE AND MAINTENANCE

Another requisite to owning information technology is ensuring that it is compatible with other technologies already in place and that it functions properly. Compatibility issues extend from making software applications work together on a single computer to allowing substantially different computer systems to share information. A mix of new and old technology can present special challenges. Over time, most computer equipment requires some form of servicing, usually due to component failures, user mistakes, or obsolescence. This aspect of IT isn't trivial: performing routine service and maintenance in a large IT environment may require a substantial investment in technical staff hours (or outside services) and replacement equipment.

The Impact of Information Technology on Jobs

Information technology has significantly advanced the way businesses do business and the way people do their jobs all over the world. Data, information and research are available at the speed of light, and workers everywhere have access to them. Be it on the Internet, on a company intranet or on a mobile phone or some other type of electronic device, the technology is farther along now than ever before, with new advances arriving daily.

Before Information Technology

1.    Before current technology came to the forefront, we had about seven basic modes of communication: telephone, telegraph, television, radio, mail, fax machines, eventually the pager (or beeper), and the grapevine over the back fence. Many of those technologies were barely old enough to be retired when mobile telephones, the Internet and intranets came into play. Radio signals, telegraph wires and telephone cabling gave us the ability to transfer tons of information faster than the Pony Express or even physical travel could carry it; now those tons of information have been broken into bytes that move even faster.

The Information Age

2.    The advent of mobility, with people attached to their information sources 24 hours a day, 7 days a week, is still a relatively recent development. With the progress being made, there is no telling what the future holds, but we are already a world away from where we were just 20 years ago. On the job, workers can access the information they need within seconds instead of days. These technologies have all but obliterated the need for post offices or even the expense of overnight mail, since highly sensitive documents can instead be encrypted (electronically scrambled) and digitally signed. Increased security measures have also made it far more difficult for outsiders to access sensitive or private company information, although hacking and computer viruses remain a real threat.
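One building block behind the "digitally signed" documents mentioned above can be sketched with Python's standard library. This is a message authentication code (HMAC) using a shared secret, not a true public-key digital signature, and the key and message here are made up for illustration.

```python
import hmac
import hashlib

# A shared secret known only to sender and recipient (illustrative values).
secret = b"shared-secret-key"
message = b"Quarterly figures attached."

# The sender transmits the message along with this authentication tag.
tag = hmac.new(secret, message, hashlib.sha256).hexdigest()

def verify(key, msg, received_tag):
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_tag)

print(verify(secret, message, tag))               # True: message is intact
print(verify(secret, b"Tampered message.", tag))  # False: any change breaks the tag
```

Real-world document signing uses public-key cryptography so that anyone can verify a signature without holding the secret, but the tamper-detection idea is the same.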

At-Work Technologies: Software

3.    From agriculture to zoos, thousands of different types of technology are at work to help employees get their jobs done faster and more easily, without hauling down heavy boxes for what used to be hours of visual inspection in search of documents or files. Software has also made it possible for people to work from home or remotely from just about anywhere, and to do their work far more efficiently and independently. For instance, in the garment industry, preproduction CAD (computer-aided design) packages are used to make digitized dress and clothing patterns. The patterns are then marked and laid out on computerized fabric layouts, cutting the time and wasted material of physical sampling by about 30 percent.

At-Work Technologies: Hardware and Peripherals

4.    In all environments, no matter what the industry, software technology would be nothing without the tools that make it all possible. These tools include computers and monitors, printers, scanners, digital cameras, web cameras, video cams, handhelds (personal digital assistants or PDAs), digitized faxes, mobile phones and hard-line phones, copy machines, duplicators, intrusion alarms and monitoring equipment, and microphones.


Specifics on Available Technologies

5.    Information technology on the job is available for many different types of uses and applications, Web-based or offline. Some of these technologies include e-commerce (the ability to shop, keep inventory, track returns and handle customer service issues); e-learning (the ability to go to school and earn a certificate or degree without leaving home); day trading (the ability to purchase stock and stock options without physically calling a broker); voice-over-Internet protocol (VoIP, the ability to hold meetings and see, hear and speak with people thousands of miles away using inexpensive hardware and software); and wireless communications such as cell phones and PDAs.

 

The Impact of IT on Organizations


Information technology (IT) is dramatically changing the business landscape. Although organization cultures and business strategies shape the use of IT in organizations, more often the influence is stronger the other way round. IT significantly affects strategic options and creates opportunities and issues that managers need to address in many aspects of their business. This page outlines some of the key impacts of technology and the implications for management on:

  • Business strategy - collapsing time and distance, enabling electronic commerce
  • Organization Culture - encouraging the free flow of information
  • Organization Structures - making networking and virtual corporations a reality
  • Management Processes - providing support for complex decision making processes
  • Work - dramatically changing the nature of professional, and now managerial work
  • The workplace - allowing work from home and on the move, as in telework

"Many of the impacts of Information Technology are straightforward. But they are not necessarily obvious, nor are they trivial" (Jack Nilles, Centre for Future Research)

Survey after survey has shown that many organisations are failing to grasp the opportunities that continual improvements in information technology bring. This page outlines some of the key impacts of technology, taken from an executive presentation that has been delivered to audiences in a range of industries. It highlights the interactions between information systems and strategy, structure, culture, management processes, work, work environment and people. These interactions need the joint attention of business managers, human resource specialists, workers, and systems designers and implementers.

1. Why is this topic important ?

  • Many systems implementations do not meet expectations - or, even worse, fail
  • Many companies have missed strategic (competitive advantage) opportunities afforded by IT

The field is littered with cases of failures, or where introduction of new systems unwittingly altered key organizational processes. Awareness and planning of the issues will help to ensure smoother implementations and increase the overall benefits of information systems.

2. It's a Changing World

We are in the midst of a fundamental transition - from the industrial age to the information age. Yet most of our paradigms and behaviours come to us from the old world. Being so close to this means that we sometimes cannot see the wood for the trees.

  • Eras of Mankind: From physical processes to information ones
  • Industrial Revolution vs Information Revolution
    1760 - today; 1945 - 2000. It's happening 4 times quicker! (Makridakis)
    Note - the industrial revolution delivered a performance improvement factor of about 15. In computing, that improvement happens every 7 or 8 years!
  • Today's changing business environment:
    Globalization :: Competition :: Consumer choice :: Demographics :: Lifestyles :: Knowledge intensive
  • The Organizational Response:
    Productivity :: "Customer first" :: Marketing :: Alliances ::
  • Key Needs - responsiveness and adaptability.

3. The Evolution of IT

Not just a response to the environment, but partly a cause of changes e.g. the Chairman of Matsushita says "without doubt the most powerful driver of all is technology".

  • Changing emphasis: Efficiency -> Effectiveness -> Strategic
  • Eras of Computing
    • Era 1: Accounting
    • Era 2: Operations
    • Era 3: End-User/Office
    • Era 4: the 'Wired Society' (Rockart)
  • or Computation - Communications - Cognition
  • 7 Key Technological Trends (there are many)
    - workstations: yesterday's supercomputer on your desktop
    - portability
    - communications: global networks
    - multimedia
    - smart cards
    - object orientation
    - intelligent software agents
    Result = Total Connectivity

Implications - rethink dimensions of space, time and information (i.e. virtualization)

"any information, anywhere, anytime, anyway you want it" (Davis)
(once we get a few problems e.g. standards, sorted out!)

4. Impact on Business Strategy

These trends open up exciting possibilities for using the resources of space, time and information. In particular information can be made immediately available elsewhere in the organization, it is reproducible at low cost, and is reusable many times.

A strategy planning framework: strategic advantage vs. rethinking deployment of resources

Some examples:

  • US airlines - ticket processing in Bermuda (remote back offices)
  • Retail - distribution flows instantly linked to customer demand
  • Teleselling - round the clock!
  • Customer service: 0800 numbers routed overseas for responses
  • Electronic markets - even in second hand car parts!
  • Benetton - speeding up the value chain from New York to Italy
  • Emerging knowledge markets, such as Bright. (Update 2000: The Bright marketplace no longer exists.)

Today, much IT investment and thinking is still going into getting on top of old problems. Some proportion needs to go into experimenting with these new opportunities. In the UK, companies like Direct Line have completely changed the shape of home and car insurance through their innovative approaches, while in Internet Commerce, Amazon.com, eBay and others are changing the rules of many traditional markets.

5. The Organisational Impacts

A similar evolution to strategy: efficiency, effectiveness, transformation. However, our understanding is not as advanced. We are into a complex world of individual and organisational behaviours - people are not as logical as computer systems!

We need a better understanding of:

  • Work - what it is, how it is changing
  • People - what they expect, how they behave
  • Work environments - do they get the best out of people?
  • Teams and Groups - how can they be made more effective?
  • Organization structures - hierarchies and networks, the formal and the informal
  • Organization cultures - the values and beliefs that drive behaviours

We shall look at some of these in more detail.

6. Work - more knowledge based, more varied and less structured

Simple cybernetics - as variety in the environment increases, the variety in the response mechanisms must also increase to mount an effective response (the law of requisite variety). We can automate the routine and standard, but can only hope to offer decision support and similar tools for less structured work.

Why do many IT systems and BPR initiatives fail? Because the IS tradition is one of formalisation and standardisation vs. providing flexibility to changing needs.

  • Types of work - clerical, professional, technical, managerial, and (hopefully) strategic thinking
  • Differences in variety and routine, differences in IT support
  • From automate to informate to ?
  • How IT enhances information value activities:- creation; access; processing; storage; communication
  • Moving up the knowledge hierarchy - data to wisdom
  • Storage and access: plenty of data, little information!
  • Communications: also complex but some evidence of what is more effective. The communications hierarchy - from simple electronic mail to world-wide computer conferencing. [Note - there is a whole separate set of slides on the role of IT on business communications]

The main organisational challenge is to understand the nature of the work that is happening in various groups, and therefore what style of IT support is needed. Too often we automate what we think is routine, but our inflexible systems cannot adapt as the 'routine' changes.

7. Changing Organisational Structures

The hierarchy served us well in stable times. We still need the hierarchy for some aspects of management, but not for information flows.

  • More automation of routine + more knowledge work
    = downsizing + diamond-shaped + flatter hierarchies
    = the networked organisation
  • Structure is about differentiation - integration
    Differentiation = specializing in skills
    Integration = pulling them together
    Need both - but segmentalism builds barriers
  • Best structure for non routine work is team work
    Therefore structures to meld teams and to co-ordinate different teams
  • IT networks are a powerful mechanism
  • Many examples within large corporations - multi-disciplinary teams, multi-location teams, ad-hoc and changing teams (boundary busting)
  • Today's networks are but stepping stones to neural nets.

IT enables adaptive structures. Think of organisations as neural systems, with sensing mechanisms, information and knowledge flows and responses to stimuli (e.g. customer demands). Our research has shown that too often the organisational aspects of new systems are ignored - job design, team composition, skill requirements all change.

8. People at Work

"People are our most important asset" - If so, then treat them as such!

Sociologists and psychologists can tell us a lot about people. Systems designers often ignore them - which is perhaps not surprising if a systems design team has no psychologists or awareness of human needs!

  • People are not standardized - they are different!
  • People want to be involved
  • Individuals and Information Technology:
    A good system addresses
    - working conditions
    - ease of learning
    - skill enhancing
    - a sense of control
    - feed-back
    - variety
    [items from a check list on system usability]
  • Involve users early
  • Accommodate individual differences:
    - in work-style
    - in personalities
    - in learning strategies

The basic principles of socio-technical design were established in the 1950s. Today they are too often ignored! We need to move from engineer-designed systems to a user-centred design approach. This is not designing systems for the user, nor letting users design their own, but designing systems together in co-operation. There is a wealth of information from the HCI community on this aspect of systems design.

9. Flexible Work Practices

Demographic and lifestyle changes are changing the shape of the firm - more contracting out, more temporary staff. IT can help with flexible work practices in several areas:

  • Remote working (front office and back office in different locations)
  • Teleworking - from home, from other locations, from hotel rooms etc. - location independence
  • Flexible offices - optimising space and facilities to meet tomorrow's needs, not yesterday's!
    Note that technologies such as CTI (Computer Telephony Integration) create opportunities for location independence within a building.

10. Working Environment

Traditional offices were designed for people and paper. More recent offices were built for information systems, computer rooms, cabling etc. But now we must rethink the office again - and shift from homogenised work spaces to heterogeneous 'environments' for different types of tasks and socialisation, e.g. for creative work, customer liaison and co-operative work.

This means more 'environments' (e.g. fully equipped meeting rooms) and less personal space. Architects, designers, furniture specialists and ergonomics specialists are all part of the systems team!

11. Integration - Across IT, HOF and Strategy

Integration is the nub of the issue. If it were simple, we would have done it long ago!

We have developed a useful integration model that can be used in multi-skilled team development:

  • Integration of Business needs, information systems and HOF (human and organisational factors)
    i.e. strategy/operations - technical - social.
  • The integration wheel - from generic to specific
  • Many types of integration and integration processes
  • Depends on purpose and perspective
  • Don't get locked into one part of the model - move around the sectors, and outwards, as well as inwards.

The best way forward for any organization is to go through a process of developing their own integration model or business architecture.

Summary

The impact of IT on organizations, or parts of organizations, depends on many factors including:

  • Organizational culture
  • Stability of Business Environment
  • Nature of Product and Service (e.g. how information intensive)
  • Degree of routinization of work
  • Overall intensity of information technology usage

The main impacts are:

  • People - their work, their feelings and motivation
  • New ways of working and work environments
  • The effectiveness of work-groups
  • New communication patterns - changing people's and management's roles and power
  • Organization structures - IT helps knowledge networking; it also allows significant restructuring for standard operations (e.g. BPR)
  • Integration - across functional and geographic barriers
  • New strategic possibilities exploiting information, time and space (e.g. Internet Commerce)