It’s hard to imagine life without the Internet: no smartphones, tablets, PCs, or Netflix, the kids without their games. Impossible, you say? Not really, because we have the Internet thanks to a series of conditions in the United States that made it possible to create it in the first place and that continue to influence its availability. No law says it must stay, nor is there any economic reason why it should if no one can make a profit from it. It exists because people, public institutions, and corporations want it, pay for its use by subscribing to an Internet service, and purchase devices that access it, such as smartphones. As long as these communities can live with each other, we will have the Internet. Maybe.
American history teaches us that the availability of the Internet is precarious, constantly undergoing good and bad changes, and that it routinely needs the attention of all three constituencies (citizens, public institutions, and corporations) to sustain it. Corporations are the most active, because there are revenues and profits to be made from the Internet. Regulators and lawmakers are somewhat less active, trying to keep up with and manage the corporations. Least involved are the people. Normally, the lesson of history is that this balance of involvement works well enough, but periodically everyone has to engage with an information policy issue more intensely than usual. We are at one of those times now.
After the 2016 national elections, the nation is trying to get back to the concerns of daily life. A new administration is picking up issues placed on hold during the campaign; a new Congress is turning its attention to new policy issues, reflecting its political makeup, while federal regulatory agencies are busy as well. Lobbyists have returned to work on their clients’ interests. Citizens must engage as well, for historians consider them to be among the most intense users of information in the world.
The role of the Internet is a subset of a bigger issue: the role of information in society. Facts were as important in shaping how the nation evolved as any other factor: democracy, a sound economy, education, immigrants populating the continent, and so forth. For over two hundred years, Americans promoted access to information because its availability made it possible for them to thrive economically, to lead better-quality lives, and to deal with threats to the nation’s health, military security, and environment. We can skim history’s top lessons and apply them to the Internet today.
One certainty is that Americans will continue to rely extensively on information to assist in all that they do. But insofar as the role of information goes, the United States is at a crossroads, having to decide whether information stays more or less as it has been or whether the nation makes conscious decisions about what should change. Historical evidence (not cynicism) suggests that the United States will move incrementally from one information-related issue to another, changing its collective practices and policies in an evolutionary manner to satisfy what its constituencies want or are willing to concede. We have the experience of using PCs for over 40 years, computers for over 75 years, and telecommunications for over 150 years, from the telegraph to the Internet. The discussion is bigger than Republican versus Democratic perspectives, because it involves all three constituencies, each of which is larger than any major political party. And people will take action in support of their viewpoints.
Taking proactive action is not without precedent. The process by which the Constitution was developed in the eighteenth century offers an exemplary model, though even then it was a two-stage process: the base document followed by a dozen amendments. Decisions were consciously made in support of the free flow of information. With the Constitution in place and an entire governmental infrastructure still to create from scratch, Congress made it a high priority to establish the US Post Office, the nation’s first country-wide information infrastructure. It was easier in those days to be overt in policy matters: the number of people who needed convincing to sign on to a policy was smaller, and public debate was more muted than today. Nor was the Revolutionary War Era a one-time event. Proactive action would be taken at other times, and even in today’s complex social and information ecosystem it still can be.
Passage of the Land Grant College Act (1862) was another of those moments. What is now evident is that the genius of that law lay not in how Congress funded universities but in how it crafted their mission. Universities had to do far more than teach students. They had to create new information and innovative ways of doing things, and proactively put those developments in front of American citizens. That is why, for example, a national network of county agents climbed on their horses, hitched up their carriages, or drove their cars and pickup trucks and visited most, if not all, family farms in America over the next century. Much New Deal legislation of the 1930s offers other examples. In time, we may come to see some of the decisions of the Federal Communications Commission (FCC) that opened up the Internet to growing swaths of the public from the 1970s to the early 2000s as similarly important milestones. The FCC did this as new tools became available that made using the Internet easier.
But history teaches one other lesson: if you want to keep your Internet, pay attention to what the national government and companies do, and engage in political and economic activism, because such activity has always shaped access to information. Protecting the Internet today is as important for the future of information in the United States as the protection Americans gave the Post Office for over two centuries. Most certain is that what gets done on the Internet will keep changing, for better or for worse, and that reality is driven by what the three constituencies do.