Hypertext aspires to link information, implicitly all information, but does it care where that information leads?
The atomic scientists of the Manhattan Project often tell of the horror of their creation, but also of the thrill of making it. As an elite community they assembled the newest knowledge about reality itself and unlocked power unknown and unforeseeable before their own lifetimes.
Vannevar Bush's "As We May Think" describes a machine that would give an individual scientist ready access to a mass of knowledge equal to that of the Manhattan Project, this time stored on microfilm rather than in a collection of human minds.
The internet is a "child of the '60s" built by children of the '60s on the possibly misguided notion that all computers in the world should be connected into a single network as peers. What made them think that was a good idea?
The buildout of the web involved installing browsers on the recently deployed "personal computers". Wiki made the web read/write even though these browsers didn't include the editing capability originally intended. Wikipedia picks this up with a charter to assemble all human knowledge.
The march of hypertext pauses briefly to ask: can we trust Wikipedia? The answer seems to be yes. At least we aren't asking that question so much anymore, since we all find it so useful.
Wikipedians show a burst of moral judgement when they vote to black out Wikipedia in protest of copyright legislation they see as a threat. They and others prevail, but not without consequences.
The distributed web has no center to which moral decisions made elsewhere can be applied. The Law (where did that come from?) goes after DNS, the domain name system. Kim Dotcom is shut down, briefly.
So with this history in mind, is it a good idea to break this last thread of control over hypertext by flooding the world's computers with anything we might think to write?