It was never designed – it just grew, like Topsy. It is a largely manual system masquerading as a computer system, and its data substructure is antediluvian.
Take hypertext – the blue words that, when clicked on, take you somewhere else. It looks very computerised. But the invisible pointers behind the blue words have to be hard-coded, so they all have to be entered, and kept up to date, manually.
Worse, the links point to physical addresses, and the linking system has no intelligence, so if an address changes, the pointer points nowhere – and up pops that nasty 404 message.
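To make the dumbness concrete, here is a minimal sketch in Python (the address is hypothetical): all any after-the-fact checker can do is knock on the hard-coded address and report whether anyone still answers.

```python
# A hard-coded link is only a string; nothing keeps it true.
# The URL below is hypothetical.
import urllib.request
import urllib.error

def link_is_alive(url: str) -> bool:
    """Knock on the hard-coded address and see whether anyone answers."""
    try:
        with urllib.request.urlopen(url, timeout=10):
            return True
    except urllib.error.HTTPError:
        return False  # e.g. the nasty 404: the pointer points nowhere
    except urllib.error.URLError:
        return False  # the whole host has vanished

print(link_is_alive("http://example.com/moved-page.html"))
```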
Partly for those reasons, and because hypertext is fundamentally out of step with humans, those who code the links are compelled to assume omniscience.
They must presume to know where users will want to go from those blue words; they must presume that users will want to go somewhere only from those words; they must presume that those are the only places they will want to go; and they must presume that those places will remain places for as long as the pointers remain pointers.
That matters little when hypertext is there for navigating inside a website or a CD-ROM encyclopaedia.
But for roaming the global Web it is unfriendly and restrictive.
The fundamental problem is that hypertext does not work the way humans work. The links we make in thought and conversation are linguistic.
Our take-off points are words – any word. We bounce all over the place. We are not restricted to some prearranged links hard-coded into our lives.
The only things that work in human society are those that fit the way humans work. So on the web we need to be able to leap linguistically from anywhere to anywhere. Links must be dynamic, automatic, universal, and as up to date as now.
The closest thing to linguistic linking in today’s web is provided by the search-engines. But they are stuck on the outside of the web like so many ticks on a cow. No tick can suck a whole cow dry; no search-engine can index the whole web.
There are hundreds of millions of pages on the web and the best engines have indexed only tens of millions. And the indexes can never be up to date, because their spiders (software agents that crawl round the web dragging out the data) are always far behind events.
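A toy spider, sketched in Python, shows why. It learns of a page only by stumbling on a link to it, so anything new, moved or unlinked stays invisible until some later crawl happens by; the seed address and names are illustrative, not any real engine's code.

```python
# A toy breadth-first spider: it can index only what links already reach.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
import urllib.request

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed: str, limit: int = 50) -> set:
    seen, queue = set(), deque([seed])
    while queue and len(seen) < limit:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                parser = LinkExtractor()
                parser.feed(resp.read().decode("utf-8", errors="replace"))
        except OSError:
            continue  # a dead link: the spider just shrugs and moves on
        for href in parser.links:
            queue.append(urljoin(url, href))  # only linked pages are ever found
    return seen

print(crawl("http://example.com/"))  # hypothetical seed
```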
To be a true computer system, every page must be automatically indexed as it is created; it should be impossible to create a page without the index knowing about it.
Then the web would be universally accessible through the index-engine; the index would always be up to date; and hypertext would be redundant (except for menus, etc.). Every page would be linked linguistically. The only manual operation would be the creation of pages.
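The register-on-creation idea can be sketched in a few lines of Python: page and index entry are born in one operation, and every word becomes a potential take-off point. All names here are hypothetical – a sketch of the principle, not a design.

```python
import re

class IndexedWeb:
    """A toy web in which a page cannot exist unindexed."""
    def __init__(self):
        self.pages = {}  # address -> text
        self.index = {}  # word -> set of addresses

    def create_page(self, address: str, text: str) -> None:
        # Creating and indexing are one operation: the index cannot lag.
        self.pages[address] = text
        for word in re.findall(r"[a-z']+", text.lower()):
            self.index.setdefault(word, set()).add(address)

    def lookup(self, word: str) -> set:
        # Any word is a link: no hard-coded pointer required.
        return self.index.get(word.lower(), set())

web = IndexedWeb()
web.create_page("pages/1", "Hypertext just grew, like Topsy.")
print(web.lookup("Topsy"))  # {'pages/1'}
```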
The web’s inability to keep users up to date automatically is a fundamental shortcoming. If it had been “designed”, you could ask it for everything new or changed in any subject since you last asked or called.
And to prevent out-of-date pages, it should be fundamental that pages can carry start- and end-dates, and so die automatically.
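The mechanics of that are trivial, as this sketch (with hypothetical names) shows: each page carries its own lifespan, and the system simply refuses to serve it outside that window.

```python
from datetime import date

def page_is_live(start: date, end: date, today=None) -> bool:
    """Serve a page only inside its declared lifespan."""
    today = today or date.today()
    return start <= today <= end

# A page dated for 1999 is dead by mid-2000, no webmaster required:
print(page_is_live(date(1999, 1, 1), date(1999, 12, 31), date(2000, 6, 1)))  # False
```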
The Internet is also, damnably, a 7-bit system. In the 21st century we are stuck with the antediluvian data-coding system of 1960s America (when the US Defence Department germ of the Net was born).
The Net was built only for ASCII (the American Standard Code for Information Interchange – a text format), which recognises only American, so it uses only 7 of the 8 bits in a byte.
Now we have a global Net, so we need the whole byte – not just for other languages but also for images, sound, and such things as the control characters in word-processing files. In the meantime, everything that uses the 8th bit has to be translated into 7-bit data.
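That translation is not hypothetical: it is what MIME email does every day, recoding 8-bit bytes as base64 so they survive a 7-bit channel. In Python it looks like this:

```python
import base64

original = "encyclopædia".encode("utf-8")  # the æ needs the 8th bit
wire_form = base64.b64encode(original)     # recoded as pure 7-bit ASCII

print(wire_form)                 # b'ZW5jeWNsb3DDpmRpYQ=='
print(max(wire_form) < 0x80)     # True: every byte fits in 7 bits
print(base64.b64decode(wire_form).decode("utf-8"))  # 'encyclopædia' restored
```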
The thing is a kludge, a mere InterimNet. It is time it was made into what human beings really want. In the mid-1980s a New Zealand system had all the facilities described above – but the NZPO killed it (though its like could easily be retrofitted to our part of the Net).
We have found out what can be done even with a kludge. Now give us a real network in which retrieval matches storage.
An Internet that runs on computers, not on the endless hypertext grindings of web slaves and the errant crawlings of spiders.
Nobilangelo Ceramalus is a writer, desktop publisher, graphics designer, webmaster and image processor.