
Seven more turning points in tech history

Neil McAllister | Sunday, 22 June 2008


Reading e-mail on the Web is annoying; that was Microsoft's judgment in 2000, back when the Web was a page-based medium. Each HTTP request meant a round trip to the server that refreshed the entire page, which was no good for high-activity applications like e-mail clients. So, to make Outlook Web Access 2000 more usable, Microsoft developers created a way for browsers to communicate with Web servers by loading small amounts of data asynchronously.

Surprisingly, the idea stuck. Despite its troubled history with Internet Explorer, the Mozilla Project built similar functionality into Mozilla 1.0 in 2002, calling it XMLHttpRequest. The floodgates were opened, and a new way of coding for the Web was born.

It's hard to believe that Facebook, Gmail, Google Maps, and countless other Ajax-enabled sites owe their existence to Microsoft's lead. But it's a good thing the other browser makers followed it; if they had waited for the W3C to standardize XMLHttpRequest, they would still be waiting today.
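The pattern those early developers pioneered can be sketched in a few lines of browser JavaScript (a minimal illustration only; the URL and callback names here are hypothetical, and modern code would likely use fetch instead):

```javascript
// Minimal sketch of the asynchronous pattern XMLHttpRequest enabled:
// fetch a small piece of data and update part of the page in place,
// with no full-page refresh. The endpoint and callback are hypothetical.
function fetchFragment(url, onData) {
  var xhr = new XMLHttpRequest();   // browser-provided request object
  xhr.open("GET", url, true);       // third argument: asynchronous
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      onData(xhr.responseText);     // e.g. insert new mail into the DOM
    }
  };
  xhr.send(null);
}

// Usage (hypothetical endpoint):
// fetchFragment("/mail/inbox.json", function (text) { /* update page */ });
```

Because the browser keeps rendering while the request is in flight, an e-mail client can poll for new messages without ever reloading the page, which is exactly what made Outlook Web Access feel like an application rather than a sequence of documents.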

In 2003, a black cloud descended over open source's poster child. The SCO Group, led by new CEO Darl McBride, claimed ownership of key portions of the Linux kernel. Linux customers were cautioned to pay license fees to SCO, lest they find themselves on the wrong end of a copyright-infringement lawsuit.

But SCO had underestimated Linux's importance to the enterprise, and particularly to IBM. Why SCO thought it could outmatch Big Blue's lawyers (or its deep pockets) is anyone's guess. What matters is the outcome. One by one, IBM's lawyers knocked down SCO's arguments, establishing Linux's legitimacy as a matter of court record.

As the lawsuits lumbered on, McBride and company were ridiculed, then bankrupted. Meanwhile, the Linux business boomed. With allies such as Computer Associates, IBM, Novell, and Red Hat willing to take up its defense, the open-source operating system was clearly here to stay. Ironically, the lawsuit that was meant to be the death blow for Linux may have succeeded only in ushering in its golden age.

In the early days of PC chip manufacture, speed was the name of the game. All you had to do was crank up the clock cycles and watch performance-hungry customers come running. But as the new century dawned and clock speeds soared into the gigahertz, old chip designs couldn't keep up. They ran too hot and consumed too much power. Enter the Pentium M, a radical new chip pioneered by a team at Intel's Haifa, Israel, labs, led by Mooly Eden.

Though the Pentium M was intended for mobile PCs, its lower power consumption and more efficient instruction pipeline soon made clear that Intel was onto a breakthrough, even for desktops.

Soon the die was cast. Today, Intel's leading Core series of chips, launched in 2006, is derived from the Pentium M, while the company's earlier architecture is due to be retired later this year. From now on, the chips that win the race will have to be not just faster, but smarter.

Before Novell NetWare debuted in 1985, transferring files in the typical office meant handing off a floppy disk. NetWare's affordable PC networking quickly took the computing world by storm; by the late 1980s, Novell claimed a 90% share of the market.

But Novell never foresaw NetWare's Achilles' heel. If Microsoft was famously slow to embrace the Internet, the same went double for Novell.

By most standards, Windows NT was clunky compared to NetWare, but it had one clear advantage: native support for TCP/IP, the core protocol suite of the Internet. NetWare servers relied on Novell's own IPX and SPX protocols, which made it harder to integrate them with FTP clients, Web browsers, and Internet e-mail. As demand for the Internet soared, businesses began replacing their NetWare servers with Windows, and NetWare networks were soon headed the way of the floppy disk.

Running a large IT shop was never easy, but at least you didn't have the government breathing down your neck. That changed in 2002, however, with the passage of the Sarbanes-Oxley Act.

Enron, WorldCom, and other corporate accounting scandals exposed the need for greater accountability on the part of publicly traded companies. But accountability means auditing, and to audit you need records. Unfortunately, the burden of record-keeping required by Sarbanes-Oxley fell to IT.

By 2005, IT managers found themselves spending an inordinate amount of time and money on Sarb-Ox compliance. Vague rules and unproven technologies left many guessing. And as if Sarb-Ox weren't bad enough, health care companies had the added burden of HIPAA to worry about.

Whether the next administration will see fit to revisit these regulations remains to be seen, but for IT there's no going back. Regulatory compliance has cemented its position as a key component of business operations, for better or for worse.

The Macintosh has always stood apart, even before Apple launched its "Think Different" ad campaign in 1997. In defiance of the x86 platform's dominance of the PC chip market, the earliest Macs used Motorola 68000-series CPUs. Later, when performance demanded an upgrade, Apple switched to the PowerPC, but the net effect was the same: Macs and PCs were as fundamentally different as, well, apples and oranges.

But Apple couldn't fight the tide forever. Performance bottlenecks and high power consumption dogged the PowerPC, and by 2005 its future as a general-purpose processor seemed doubtful. In June of that year, Apple announced that it would begin shipping Macs based on Intel processors, ending 20 years of Thinking Different about CPUs.

With that move, the PC processor market effectively became a monoculture. Virtually every mainstream computer you can buy today is based on Intel's x86 architecture. Macs can even run Windows. But it's OK, Mac fans; if what's inside doesn't make you feel different, how it looks still can.

As the year 2000 approached, U.S. companies saw the Y2K problem as a threat to their software. What they didn't anticipate was the impact it would have on the U.S. workforce.

Faced with a shortage of hands to address the Y2K crisis, IT departments looked abroad for answers. They found an untapped gold mine. The rise of the Internet, coupled with social and economic reforms, had fostered a veritable army of highly skilled workers in the developing world.

Indian companies, including Infosys and Wipro, were among the first to popularize offshore IT outsourcing, but companies in Russia, Eastern Europe, China, and elsewhere would soon follow. Meanwhile, the Immigration Act of 1990 had created the H-1B visa program, which made it easier for U.S. companies to import foreign workers.

Today the Y2K problem may be long behind us, but these labor trends show no sign of slowing. As more and more countries awaken to the Internet economy, IT workers must struggle to stay competitive in an increasingly flat world.