The swing started with a single computer...
In the early days of computing, we all used dumb terminals to connect to a central computer.
|"PDP-11" by ToastyKen|
Then we got the "personal" computer with things like the Altair 8800 and the Apple ][. No sharing at all!
We fulfilled Pournelle's Law: "one user, at least one CPU."
In the '80s and '90s, personal computers grew more and more powerful. Multiple processors and dedicated graphics cards concentrated computing power at each user's desk.
The pendulum reached its end-point sometime in the early '90s, when we started networking computers. Centralized data inside companies and email moving across the public Internet started the swing back, away from computing at the user.
Once the Internet became accessible to ordinary people, the pendulum of computing power moved decisively away from the user's workstation and back toward centralized servers.
Larry Ellison of Oracle has been promoting the "thin client" model of computing since the early 1990s.
In this model, lightweight terminals connect to high-powered centralized servers.
Thin clients aren't meant to be full personal computers. They don't have drives or much local storage, but they do have enough computing power to run rich graphical interfaces.
The rise of the Web bent this direction slightly by letting fairly powerful personal computers readily access data and services on remote servers.
What most people refer to as "cloud computing" nowadays is just a virtualization of these remote servers.
But storing your data on a remote system requires enough bandwidth to move that data back and forth. New services like Google Music and Amazon Cloud Drive require a ubiquitous Internet connection.
Consumers barely have enough bandwidth for these services now, and ISPs keep lowering bandwidth caps and raising rates.
This does not bode well for the Ubiquitous Internet. It will become less practical to use remote storage for a while as the telcos and cable companies squeeze consumers.
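A quick back-of-the-envelope calculation shows why bandwidth is the choke point. The library size, upstream speed, and monthly cap below are illustrative assumptions, not measured figures:

```python
# Back-of-the-envelope: what remote storage demands of a home connection.
# All figures below are illustrative assumptions.

library_gb = 50            # assumed size of a personal music library, in GB
upstream_mbps = 1.0        # assumed residential upstream speed, in megabits/s
monthly_cap_gb = 250       # assumed ISP monthly transfer cap, in GB

# Time to upload the whole library at that upstream rate (GB -> megabits)
upload_seconds = (library_gb * 8 * 1000) / upstream_mbps
upload_days = upload_seconds / 86400

# Fraction of the monthly cap the initial upload alone consumes
cap_fraction = library_gb / monthly_cap_gb

print(f"Initial upload: ~{upload_days:.1f} days")   # → Initial upload: ~4.6 days
print(f"Share of monthly cap: {cap_fraction:.0%}")  # → Share of monthly cap: 20%
```

Under these assumptions, just seeding a modest music library ties up the connection for days and burns a fifth of the monthly cap, before any day-to-day syncing or streaming.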
Netflix's streaming video service and OnLive's remote gaming service could easily eat up a consumer's entire bandwidth allotment.
At the same time, local storage keeps growing. While spinning-drive capacity growth has slowed slightly, I believe that's because limitations in operating systems are suppressing demand for larger drives.
The adoption of newer operating systems and the popularity of high-def video content will push demand for local storage, and capacity growth should catch back up to its historic rate within a couple of years.
Expect to see at least 10 terabytes in your pocket by the end of the decade. If we move from spinning disks to solid-state storage, capacity could grow even faster.
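The 10-terabyte figure is plausible under a simple compounding model. The starting capacity and doubling period below are illustrative assumptions:

```python
# Projecting portable storage capacity under an assumed growth rate.
# Starting point and doubling period are illustrative assumptions.

start_year, end_year = 2011, 2020
start_tb = 1.0          # assumed pocketable storage today, in terabytes
doubling_years = 2.0    # assumed capacity doubling period

doublings = (end_year - start_year) / doubling_years
projected_tb = start_tb * 2 ** doublings

print(f"Projected: ~{projected_tb:.0f} TB")  # → Projected: ~23 TB
```

Even with a conservative two-year doubling period, the projection clears 10 TB with room to spare; a faster solid-state curve would only raise it.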
With bandwidth getting more expensive, and storage getting cheaper, the pendulum will probably begin to swing back towards local storage and computing.
Combine that with neighborhood networking like that envisioned by Bob Frankston, and we start seeing a real cloud forming. Every household shares in the bandwidth and shares in the storage. File storage becomes highly redundant and durable.
We already see this sort of storage in services like Windows Live Mesh and Symform, which spread your data across multiple nodes.
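The mesh idea boils down to splitting files into chunks and keeping each chunk on several neighboring households, so no single home going offline loses data. Here is a minimal sketch of that placement scheme; it is an illustration of the concept, not how Windows Live Mesh or Symform are actually implemented, and the peer names are hypothetical:

```python
# Minimal sketch of neighborhood mesh storage: split a file into chunks
# and replicate each chunk on several distinct peers. Illustrative only;
# not the actual Windows Live Mesh or Symform design.
import hashlib

CHUNK_SIZE = 4  # bytes; tiny so the example is easy to follow
REPLICAS = 3    # copies of each chunk, each on a different household

def split_chunks(data: bytes, size: int = CHUNK_SIZE) -> list[bytes]:
    """Break the file into fixed-size chunks (last one may be short)."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def place_chunks(chunks: list[bytes], peers: list[str],
                 replicas: int = REPLICAS) -> dict[str, list[str]]:
    """Assign each chunk, keyed by its hash, to `replicas` distinct peers."""
    placement = {}
    for chunk in chunks:
        key = hashlib.sha256(chunk).hexdigest()
        start = int(key, 16) % len(peers)          # deterministic starting peer
        placement[key] = [peers[(start + r) % len(peers)]
                          for r in range(replicas)]
    return placement

peers = ["house-a", "house-b", "house-c", "house-d", "house-e"]
chunks = split_chunks(b"family photos and home videos")
placement = place_chunks(chunks, peers)
# Any one household can vanish and every chunk still has two surviving copies.
```

With three replicas per chunk, the neighborhood tolerates two simultaneous household failures before any chunk is lost, which is the redundancy and durability the mesh model promises.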
But until we get that mesh, I predict the pendulum will move toward local storage with occasional synchronization over Wi-Fi. Streaming will be reserved for special content because of the extra cost.
But once everything is in a peer-to-peer mesh, we will swing into a new golden age...