Moore’s Law is an empirical observation: the density of computer memory chips doubles about every 18 months, and it has been doing so for the past four decades. Magnetic storage capacity and, to a less regular extent, processor speeds and telecommunications bandwidth have also been increasing exponentially, in complementary fashion. We all know the results, not just in consumer products but also in industrial and business processes that soak up this capacity in all sorts of creative and unexpected ways.

I was reminded of this the other day when a journalist asked me to comment on the failed AOL merger with Time Warner. My response was that the merger DID work: it is called YouTube. YouTube combines AOL’s feeling of being part of a virtual community with network television’s ability to deliver video entertainment and news. In 1999, Steve Case had a vision of this merger, but given the underlying technological base, he thought it would require combining two very large corporations, with gleaming corporate headquarters designed by famous architects, armies of highly paid vice presidents, chauffeured Town Cars, expensive meals, and so on. YouTube pulled the merger off on a comparative shoestring. But YouTube could not have happened without Moore’s Law, which provided the bandwidth and video capability that did not exist in 1999.

Computing does not proceed in a linear fashion, although the steady progression of Moore’s Law may be an exception to this rule. The late Alan J. Perlis once said, “Computer Science is embarrassed by the computer.” So are historians of computing. We have no choice but to follow the computer, however awkward or embarrassing a place it puts us in.