through interconnected networks. They are becoming so prevalent in the environment that people don't realize when they are being used.

This trend toward smaller and smaller has been somewhat predictable. In 1965, Gordon Moore, who went on to co-found Intel, formulated Moore's law, which predicts that the density of the components on a computer chip will double every 18 to 24 months, thereby doubling the chip's processing power. This prediction has proven to be very accurate. Intel recently announced that it expects the downsizing of silicon chips with good economics to continue through 2029. It is interesting to note that Moore is also credited with a second law: the cost to manufacturers (R&D, manufacturing, testing) of fulfilling Moore's first law follows the opposite trend, increasing exponentially over time.
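To make the arithmetic behind that claim concrete, the short Python sketch below projects the density multiplier implied by a fixed doubling period; the 24-month period and the starting transistor count are illustrative assumptions, not figures from this chapter.

```python
# Rough illustration of Moore's law as stated above: component density
# doubles every 18 to 24 months. The 24-month period and the starting
# transistor count below are illustrative assumptions only.

DOUBLING_PERIOD_MONTHS = 24
STARTING_TRANSISTORS = 2_300  # roughly the 1971 Intel 4004, used here only as a baseline

def projected_density(years: float) -> float:
    """Project component density after `years` years of Moore's-law growth."""
    doublings = (years * 12) / DOUBLING_PERIOD_MONTHS
    return STARTING_TRANSISTORS * (2 ** doublings)

if __name__ == "__main__":
    for years in (2, 10, 20):
        print(f"After {years:2d} years: ~{projected_density(years):,.0f} components per chip")
```

Under these assumptions, ten years corresponds to five doublings, or a 32-fold increase in density.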

This trend toward smaller and smaller is very visible in the offerings for users: laptops, notebooks, tablets, netbooks, and PDAs. Organizations have to determine how best to deliver technology to their users. Screens formatted for a laptop might not display well on a netbook or PDA. Users expect to have access to the data they need to make decisions, and they expect powerful software that allows them to excel at their jobs. Decisions made about user machines affect thousands of computers, not just hundreds. And while it might be relatively easy to simply install a software upgrade on a machine, the organization has to consider the impact on the workers themselves. People tend to resist change; organizations have to plan for that resistance.

Software

While most older hardware has given way to newer and faster computers, most companies use a combination of newly acquired software and older, self-developed software. The latter was developed over a period of years, perhaps more than 25, using COBOL, which, until the early 1990s, was the standard programming language for business applications. Today, many companies' mission-critical systems still run on mainframe technology, using programs written in COBOL; in fact, billions of lines of COBOL code are still functioning in U.S. businesses. COBOL continues to be enhanced, and more lines of code are written each year.

These legacy systems have become a major issue for many firms, though, and were the key issue behind the Y2K (year 2000) problem, which centered on date arithmetic. Many systems stored date fields as MM/DD/YY and performed subtraction to find the number of days between dates (for example, the age of an accounts receivable account was the difference between the invoice date and the current date). Subtracting 10/04/99 from 01/04/00 would produce a negative number, because the algorithm simply took the two-digit year of one date and subtracted it from that of the other. Much noise was made about the Y2K problem, fueled by the fact that the programmers who had written these legacy systems were no longer with the firms and there was little or no documentation available. Some firms used the "impending doom" as a reason to move to ERP systems, others wrote work-arounds, and still others decided to ride out the storm. When the new millennium rolled in, there were no significant computer failures, and some countries that spent little on the Y2K bug fared as well as those that spent a great deal. The instant debate was whether the absence of computer failures was the result of the efforts to compensate for the problem or whether the problem itself had been greatly overstated.
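To illustrate the date-arithmetic flaw described above, here is a minimal Python sketch contrasting a legacy-style two-digit-year calculation with a corrected four-digit-year one; the function names and field layout are hypothetical, not drawn from any actual COBOL system.

```python
from datetime import date

def invoice_age_days_legacy(invoice_mmddyy: str, current_mmddyy: str) -> int:
    """Legacy-style (buggy) age calculation with two-digit years:
    every YY is assumed to mean 19YY, so dates in 2000 compare as
    earlier than dates in 1999."""
    def to_date(mmddyy: str) -> date:
        mm, dd, yy = (int(part) for part in mmddyy.split("/"))
        return date(1900 + yy, mm, dd)  # the faulty assumption
    return (to_date(current_mmddyy) - to_date(invoice_mmddyy)).days

def invoice_age_days_fixed(invoice: date, current: date) -> int:
    """Corrected calculation using full four-digit years."""
    return (current - invoice).days

if __name__ == "__main__":
    # The example from the text: an invoice dated 10/04/99, aged on 01/04/00.
    print(invoice_age_days_legacy("10/04/99", "01/04/00"))              # large negative number
    print(invoice_age_days_fixed(date(1999, 10, 4), date(2000, 1, 4)))  # 92 days
```

Run against the 10/04/99 and 01/04/00 example from the text, the legacy version reports a large negative age, while the corrected version reports 92 days.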
