hypertext   4411


Guide to Cyberspace 6.1: What is the World-Wide Web?
What the World-Wide Web (WWW, W3) project has done is provide users on computer networks with a consistent means to access a variety of media in a simplified fashion. Using a popular software interface to the Web called Mosaic, the Web project has changed the way people view and create information - it has created the first true global hypermedia network.

with an important exception: hypertext contains connections within the text to other documents.

The Web, although still in its infancy, has already enabled many of these examples. It facilitates the easy exchange of hypermedia through networked environments from anything as small as two Macintoshes connected together to something as large as the global Internet.

Because servers usually operate only when documents are requested, they put a minimal amount of workload on the computers they run on.

The phrase "World-Wide Web" is often used to refer to the collective network of servers speaking HTTP as well as the global body of information available using the protocol.

Somewhere along the way, "The World Wide Web" came to mean not just the technologies and protocols that powered it, but the whole damn thing.
4 days ago by thotw
Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web: Tim Berners-Lee: 9780062515872: Amazon.com: Books
The system had to have one other fundamental property. It had to be completely decentralized. That would be the only way a new person somewhere could start to use it without asking for access.... So long as I didn't introduce some central link database, everything would scale nicely... Any node would be able to link to any other node. This would give the system the flexibility that was needed and be the key to a universal system.

Hypertext would be most powerful if it could conceivably point to absolutely anything. Each would have an address by which it could be referenced.

> He pitched his idea around, but people rejected it for being too simple

They also insisted on a central link database to ensure that there were no broken links. Their vision was limited to sending text that was fixed and consistent - in this case, whole books. I was looking at a living world of hypertext, in which all the pages would be constantly changing.

The basic revelation was that one information space could include them all, giving huge power and consistency. Many of the technical decisions arose from that. The need to encode the name or address of every information object in one URI string was apparent. The need to make all documents in some way "equal" was also essential. The system could not constrain the user; a person should be able to link with equal ease to any document wherever it happened to be stored.

I therefore defined HTTP [because FTP and NNTP weren't enough], a protocol simple enough to be able to get a Web page fast enough for hypertext browsing. The target was a fetch of about one-tenth of a second, so there was no time for a conversation. It had to be "Get this document" and "Here it is!"
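The "Get this document" / "Here it is!" exchange really is that terse. A minimal sketch of the request side, composing a raw HTTP/1.0 GET by hand (the host and path here are illustrative examples):

```python
def build_get_request(host: str, path: str) -> bytes:
    """Compose a minimal HTTP/1.0 GET request: one line of intent,
    a Host header, and a blank line ending the conversation."""
    lines = [
        f"GET {path} HTTP/1.0",
        f"Host: {host}",
        "",  # blank line: request complete, no further negotiation
        "",
    ]
    return "\r\n".join(lines).encode("ascii")

request = build_get_request("info.cern.ch", "/hypertext/WWW/TheProject.html")
print(request)
```

One write, one read, done: there is no handshake beyond the transport's, which is what made a sub-second fetch plausible.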

Berners-Lee insisted that his hypertext system was powerful because of the Internet. But he had trouble demonstrating that.
4 days ago by thotw
HTTP is said to be stateless. This means it does not need to hold any information about each request. We've solved some of these problems with query variables. Links can point to anything. Unrecognized headers are ignored, which is forgiving.
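Statelessness means each request has to carry everything the server needs; query variables are one way to push that state into the URL itself. A toy sketch (the `page` parameter is a hypothetical example, not anything standard):

```python
from urllib.parse import urlparse, parse_qs

def handle(url: str) -> str:
    """A stateless handler: nothing is remembered between calls;
    the 'state' (here, a page number) rides along in the query string."""
    query = parse_qs(urlparse(url).query)
    page = int(query.get("page", ["1"])[0])
    return f"rendering page {page}"

# Two independent requests; the server holds nothing in between.
print(handle("http://example.com/results?page=3"))
print(handle("http://example.com/results"))
```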

"to achieve this, the client sends a (weighted) list of the formats it can handle, and the server replies with data in any of these formats that it can produce" - Neither depends on the other.
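That weighted list is the Accept header with its `q` values. A simplified sketch of how a server might rank the client's preferences (it ignores media-type parameters other than `q`, so it is not a full parser):

```python
def parse_accept(header: str):
    """Parse an Accept header into (type, q) pairs, highest preference first.
    Simplified: only the q parameter is honored."""
    prefs = []
    for item in header.split(","):
        parts = [p.strip() for p in item.split(";")]
        q = 1.0  # per HTTP, a missing q defaults to 1.0
        for p in parts[1:]:
            if p.startswith("q="):
                q = float(p[2:])
        prefs.append((parts[0], q))
    return sorted(prefs, key=lambda tq: -tq[1])

print(parse_accept("text/html, application/xml;q=0.9, */*;q=0.1"))
```

The server walks the ranked list and replies with the first format it can actually produce; neither side needs to know the other's full capabilities in advance.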

URIs are flexible enough to support any protocol, not just HTTP. Makes it easy to give an address to anything.
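The scheme prefix is what makes this work: one addressing syntax, many protocols behind it. A quick illustration with Python's standard `urllib.parse` (the hosts are placeholder examples):

```python
from urllib.parse import urlparse

# The same addressing syntax names resources behind different protocols.
for uri in ["http://example.com/page",
            "ftp://ftp.example.com/file.txt",
            "mailto:someone@example.com",
            "gopher://gopher.example.com/1"]:
    parts = urlparse(uri)
    print(parts.scheme, "->", parts.netloc or parts.path)
```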

The web is forgiving and easily tested, which is perhaps responsible for the fast-paced iteration on the platform.

The web is dumb, and ultimately, very forgiving. This is a pattern you will see play out across a lot of its technologies and design decisions.
5 days ago by thotw
The Curse of Xanadu | WIRED
Nelson's life is so full of unfinished projects that it might fairly be said to be built from them, much as lace is built from holes or Philip Johnson's glass house from windows.

Xanadu, a global hypertext publishing system, is the longest-running vaporware story in the history of the computer industry. It has been in development for more than 30 years

Xanadu was meant to be a universal library, a worldwide hypertext publishing tool, a system to resolve copyright disputes, and a meritocratic forum for discussion and debate.

The inventor suffers from an extreme case of Attention Deficit Disorder, a recently named psychological syndrome whose symptoms include unusual sensitivity to interruption.

Xanadu, the ultimate hypertext information system, began as Ted Nelson's quest for personal liberation. The inventor's hummingbird mind and his inability to keep track of anything left him relatively helpless. He wanted to be a writer and a filmmaker, but he needed a way to avoid getting lost in the frantic multiplication of associations his brain produced. His great inspiration was to imagine a computer program that could keep track of all the divergent paths of his thinking and writing. To this concept of branching, nonlinear writing, Nelson gave the name hypertext.

Nelson records everything and remembers nothing. Xanadu was to have been his cure. To assist in the procedure, he called upon a team of professionals, some of whom also happened to be his closest friends and disciples.

In the end, the patient survived the operation. But it nearly killed the doctors.

On his long walk home, he came up with the four maxims that have guided his life: most people are fools, most authority is malignant, God does not exist, and everything is wrong.

Hypertext was invented during his first year at Harvard, when Nelson attempted, as a term project, to create a "writing system" that allowed users to store their work, change it, and print it out. In contrast to the first experimental word processors, Nelson's design included features for comparing alternate versions of text side by side, backtracking through sequential versions, and revision by outline

The word hypertext was coined by Nelson and published in a paper delivered to a national conference of the Association for Computing Machinery in 1965. Adding to his design for a nonsequential writing tool, Nelson proposed a feature called "zippered lists," in which elements in one text would be linked to related or identical elements in other texts.

Impressed by the literary employees of the publishing house, and wanting to impress them in return, he christened his hypertext system Xanadu.

It was a name of uncanny exactitude. Xanadu is the elaborate palace in "Kubla Khan."

To find help, Nelson was forced to go outside official channels. The first disciples he acquired belonged to a group of hackers known as the R.E.S.I.S.T.O.R.S., which stood for Radically Emphatic Students Interested in Science, Technology, and Other Research Studies. Unlike the mainstream programmers Nelson encountered, the Resistors shared Nelson's sense of humor, his mischief, and his lack of respect for authority.

But during a rare period of fierce programming, the three collaborators created an interesting data structure that governed the movement of huge sections of text in and out of the computer's memory. They called their invention "the enfilade."

The first real work had been achieved, and the first concession to secrecy had been made.

Over a network, linked documents, version comparison, and non-sequential writing would create a "docuverse" capable of storing and representing the artistic and scientific legacy of humanity.

Gregory intended to call Nelson, but destiny moved more quickly: the repairman had hardly returned to Ann Arbor when Nelson telephoned the Neuman Computer Exchange and asked the person who answered the phone to trade a thousand copies of Computer Lib for a used PDP-11.

As a guest lecturer in Nelson's class, Miller ran through his ideas for a Xanadu-like software system. Afterward, he was approached by one of the students, Stuart Greene. Miller asked Greene what the reaction to his ideas had been.

Nelson's book brought him growing acclaim, and in 1979, he decided it was time to gather his disciples. He called upon Roger Gregory to lead the effort. Although Gregory was in Ann Arbor, Nelson insisted that everybody move to Swarthmore so he could exercise his influence at close range. Obediently, Gregory rented a house and invited the other programmers to join him. Mark Miller returned to Pennsylvania, where the Xanadu devotees aimed to finish the project in a single, serious summer of coding.

Come September, Gregory stayed in Pennsylvania and rented another house. As programmers came and went, the house provided a frame for Xanadu's slow progress.

The move to Datapoint was a concession to the reality principle, as well as an acknowledgment that the most important aspects of the Swarthmore group's work so far had been design rather than coding. At Datapoint, the Xanadu programmers could explore their ideas in a corporate setting that offered the latest equipment and a decent paycheck.

From its rosy expansion at the turn of the decade, the project had, by 1984, collapsed into a constricted sphere of hackers clustered around Roger Gregory

After witnessing the process for a few months, McClary got the impression that he wasn't part of a software development team but of a sect in the process of self-destruction

They were dead-accurate when they sketched a future of many-to-many communication, universal digital publishing, links between documents, and the capacity for infinite storage. When they started, they had been ahead of their time. But by the mid-1980s, they were barely ahead of it.

He suspected that with the help of Autodesk, which was founded to give its original partners, themselves programmers, a way to produce and sell their tools, Xanadu might be transformed from a cult into a company. And when the founder of Autodesk wrote an enthusiastic note about Xanadu, his executives were inclined to pay attention

Yet in 1988, the Autodesk deal was nothing but good news. On April 6, John Walker issued a press release announcing that Autodesk would acquire 80 percent of Xanadu.

McClary had plenty of experience taking obscure directions from technical managers and turning them into massive, working programs in C. He abandoned his lucrative Michigan consulting practice to rejoin the project he had left unfinished nearly 10 years before.

Gathered together in a nice, new office in Palo Alto, with fully stocked refrigerators and comfortable furniture, the Xanadu team prepared to build the ultimate hypertext system. For once, they had tools, including as much computing power as their hearts desired. Regular paychecks allowed them to be revolutionaries and pay their rent. And even their executive manager accepted that their mission was to change the world.

The basic features of the Xanadu hypertext system planned at Autodesk in 1989 were relatively unchanged from the ones discussed by the early Xanadu programmers at Swarthmore in 1979.

Xanadu was to consist of easy-to-edit documents. Links would be available both to and from any part of any document. Anybody could create a link, even in a document they did not write. And parts of documents could be quoted in other documents without copying. The idea of quoting without copying was called transclusion, and it was the heart of Xanadu's most innovative commercial feature - a royalty and copyright scheme.

Transclusion was extremely challenging to the programmers, for it meant that there could be no redundancy in the grand Xanadu library. Every text could exist only as an original. Every user in the world would have to have instant access to the same underlying collection of documents.
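The "quoting without copying" idea can be sketched as documents holding references into one shared store rather than copies. This is a toy model I'm constructing to illustrate the concept, not Xanadu's actual design; all names and text are invented:

```python
# Toy transclusion: one canonical store of text spans; documents are
# lists of references, so a "quote" is a pointer, never a copy.
store = {}          # span_id -> original text
documents = {}      # doc_id  -> ordered list of span_ids

def publish(doc_id, span_id, text):
    """Author a new span and place it in a document."""
    store[span_id] = text
    documents.setdefault(doc_id, []).append(span_id)

def transclude(doc_id, span_id):
    """Quote another document's span by reference only."""
    documents.setdefault(doc_id, []).append(span_id)

def render(doc_id):
    return " ".join(store[s] for s in documents[doc_id])

publish("essay", "e1", "Everything is deeply intertwingled.")
transclude("review", "e1")                      # quoting without copying
store["e1"] = "Everything is intertwingled."    # one edit, visible everywhere
print(render("review"))
```

Because the span exists only once, royalty accounting and attribution could hang off the reference; that single-copy constraint is also exactly why every user would need access to the same underlying store.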

Although Gregory stayed on, the Xerox PARC programmers won all the battles, beginning with the most important one. Gregory's old Xanadu code was thrown away. The programmer's face, seven years later, still goes slack with disappointment when he thinks about it.

Rather than push their product into the marketplace quickly, where it could compete, adapt, or die, the Xanadu programmers intended to produce their revolution ab initio.

Carol Bartz's task during her first months on the job was to take a stern inventory of the company's most promising projects. And four months after she became CEO of Autodesk, Bartz announced that the company's investment in Xanadu was finished.

It was not until a Xanadu meeting in the summer of 1992 that he first felt the cold shock of reality. "This feeling came over me - my God, they are not going to do it," he says. "I had believed them all this time."

Nelson claims not to remember the details of the conflict, but according to Shapiro, the end came at a board meeting at the end of 1992, when Nelson said frankly that he was not going to cooperate with the plans of any company that had Shapiro in control.

"The front end is the most important thing," Jellinghaus slowly understood. "If you don't have a good front end, it doesn't matter how good the back end is. Moreover, if you do have a good front end, it doesn't matter how bad the back end is."

One evening at the end of November 1994, a group of the programmers, with the approval of Miller and Ann Hardy, went to the Memex office and pulled the plug. They carried the machines out with them, leaving a bare space.

He pointed out that the Web still lacks nearly every one of the advanced features he and his colleagues were trying to realize. There is no transclusion. There is no way to create links inside other writers' documents. There is no way to follow all the references to a specific document. Most importantly, the World Wide Web is no friend to logic. Rather, it permits infinite redundancy and encourages maximum confusion.
I pause for a moment to speak to Xanadu. It's … [more]
5 days ago by thotw
James Lloyd's answer to What is the use of http? - Quora
"HTTP only presumes a reliable transport; any protocol that provides such guarantees can be used" (e.g. TCP).

HTTP is stateless. The lifetime of a connection corresponds to a single request-response sequence. The pure HTTP server implementation treats every request as if it was brand-new.
5 days ago by thotw
Ted Nelson's two-way links - ReadWrite
“Today’s one-way hypertext – the World Wide Web – is far too shallow. The Xanadu project foresaw world-wide hypertext and has always endeavored to create a much deeper system. The Web, however, took over with a very shallow structure. Our simple, but very different structure – for details see “The Xanadu Model” – allows –

Cheap and democratic as it was, Berners-Lee’s Web didn’t have half the features Xanadu promised, and two-way linking was one of them. Without a central server it couldn’t be enforced, and to make authorship of pages as simple as possible – given the state of the art at the time – it had to be left out along with automatic attribution, micropayments, copyright management, unbreakable links, and most of Nelson’s other ideas.

But by bootstrapping the current Web piece-by-piece, instead of trying to develop a grand mind-blowing concept like Xanadu, maybe that’s the way to fulfil Ted Nelson’s vision – even if it’s not the exact system he has in mind.
6 days ago by thotw
HyperText Design Issues: Topology
It may be useful to have bidirectional links from the point of view of managing data. For example: if a document is destroyed or moved, one is aware of what dangling links will be created, and can possibly fix them.

A compromise is that links be one-way in the data model, but that a reverse link is created whenever a link is made, so long as this can be done without infringing protection. An alternative is for the reverse links to be gathered by a background process operating on a basically monodirectionally linked web. See Building Back-links.
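The background-process alternative amounts to periodically inverting the one-way link map. A sketch, with an invented two-page link set as sample data:

```python
from collections import defaultdict

def build_back_links(forward):
    """Invert a one-way link map: for every page, collect the pages
    that point at it. Meant to run periodically over a crawled link set."""
    back = defaultdict(set)
    for source, targets in forward.items():
        for target in targets:
            back[target].add(source)
    return back

forward = {
    "a.html": {"b.html", "c.html"},
    "b.html": {"c.html"},
}
back = build_back_links(forward)
print(sorted(back["c.html"]))   # every page that links to c.html
```

With such an index, deleting or moving a page tells you exactly which links will dangle, without the data model itself ever being bidirectional.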
6 days ago by thotw
A Brief(ish) History of the Web Universe – Part I: The Pre-Web | briankardell
Note that nearly all of the applications discussed thus far, including OWL, were commercial endeavors.  In 1986, authors who wanted to use OWL to publish bought a license for about $500 and then viewers licensed readers for about $100.  To keep this in perspective, in adjusted dollars this is roughly $1,063 for writers and $204 for readers.  This is just how software was done, but it’s kind of different from the open source ethos of the Internet.

It might have been potentially “easy” to create a nice HyperCard stack and auto-transform to HTML based on content type negotiation – but which part was document and which part was application? It was actually much easier to just deliver HTML which could be generated any number of ways – and with the current digital expectations of the day, on the machines they were using, that was just fine. Thus, the simple line mode browser that made the fewest assumptions possible was born as something that could be distributed to all the CERN machines (and all the world’s machines – more on this below).

The Web was free and simple, which speaks to its ideology, but also changed the game a bit and opened up the door for stuff.
6 days ago by thotw
Broad Band – What History’s Female Internet Pioneers can Teach us about Tomorrow - Claire L. Evans - btconfDUS2018 in beyond tellerrand on Vimeo
Wendy Hall

Using links and connections to turn data into knowledge

Wendy Hall - Domesday Discs - BBC project to digitize everyday British life. Released on laser discs.

Hall dedicated her time to creating a system for libraries based on the Domesday Discs and hypertext, at the University of Southampton.

By 1989 she had created an entire hypertext system called Microcosm.

On the web, 404s mean some links are lost. Microcosm used a database called a linkbase that stored just links. It communicated with documents and wasn't embedded directly in them. Links could have multiple sources and multiple destinations.
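A linkbase held separately from documents can be sketched as its own store of many-to-many records. This is a toy model to illustrate the idea, not Microcosm's implementation; the anchors and filenames are invented:

```python
# Toy Microcosm-style linkbase: links live in their own store, outside
# any document, and may have several sources and several destinations.
linkbase = []

def add_link(sources, destinations, label):
    linkbase.append({"sources": set(sources),
                     "destinations": set(destinations),
                     "label": label})

def follow(anchor):
    """Resolve every destination reachable from a given anchor term."""
    out = set()
    for link in linkbase:
        if anchor in link["sources"]:
            out |= link["destinations"]
    return out

add_link(["Gandhi", "Mahatma Gandhi"],          # the idea, not one anchor
         ["bio.html", "salt-march.html"], "Gandhi")
print(sorted(follow("Mahatma Gandhi")))
```

Because the link is attached to the idea rather than to one highlighted span in one file, it follows the term across every document the system knows about.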

Intermedia - first system to include a hyperlink. Brown University.

Sun Link System - created at Sun Microsystems by Amy Pearl

Symbolics Document Examiner - created by Janet Walker. First system to have bookmarks.

NoteCards - Xerox PARC, co-developed by Cathy Marshall

From the book:
"Any text referencing another is considered a form of hypertext: sequels.... footnotes, endnotes, marginalia; and parenthetical asides"

"The Domesday Discs were interactive, using interconnected links that could be navigated with a cursor, much as we're accustomed to doing on the web today."

Southampton University had a massive archival library.

"The Mountbatten archive was the perfect test case for a hypertext project: a vast, interrelated collection of documents, spanning many different media, subject to as many readings as there could be perspectives"

"Because of the nature of Microcosm links, that connection isn't isolated to a single underlined, hyperlink-blue instance of those words. Rather, it's connected to the idea of Gandhi, following the man wherever his name may turn up, across every document in the system."

"In the years between 1984 and 1991, a flurry of hypertext systems like Microcosm emerged from universities and from research labs at technology companies like Apple, IBM, Xerox, Symbolics, and Sun Microsystems. Each suggested different linking conventions..."

"NoteCards, the first system Cathy worked on at Xerox PARC, was modeled after the kind of old-school writing techniques about which we'd soon find ourselves debating the relative merits. The software emulated the way you wrote papers when you were in junior high: with notecards and file boxes. Using hypertext links, users could chain their cards into complex collections, sequences, and mental maps."

NoteCards influenced HyperCard

Hypertext '87 was the first hypertext conference, in Chapel Hill, North Carolina.

Evans' main point is that hypertext is about the "why" behind links and fundamentally about the way in which people use software. It was transformative in this way.

"They also demonstrate just how complex and nuanced hypertext can be, when the technology is explored to its fullest potential: it supports not just links but entire mental maps, systems that model - and more important, change - our minds."

The web debuted at Hypertext '91, but no one really paid attention. Still, by '93, half of all demos were Web-based and TBL became hypertext famous.

"I saw the Web through a Microcosm viewer," she explains. "Of course, Tim saw it completely the other way around."

Here's where I won't start. The Web. Because hypertext certainly didn't start there.

Talks through a few experiments with hypertext, but also about its larger goal.

One of the web's biggest differences is its broken links and one-way-only connections. It removed, in essence, the "why" of hypertext.

Web technologies were simplistic and dumb by design, both to their advantage and detriment.

Evans points out that the web actually lacks many of the features of advanced hypertext systems. This was by design. It was the nature of decentralization.

Want to know what the web is? HTTP, HTML and the URL. That's it.
6 days ago by thotw
Undum returns
Undum, an interactive fiction platform that was never as widely adopted as it should’ve been, is now on GitHub, complete with all the documentation. Go write something!
Links  IF  hypertext 
8 weeks ago by samplereality
Uncle Buddy’s Phantom Funhouse
still want to play this and read the linked articles, video
Shadsy  HyperCard  PhilSalvador  game  document  review  2018  ArthurNewkirk  UncleBuddy  SF  JohnMcDaid  1993  Mac  hypertext  cassette  music 
9 weeks ago by cosmic
Wikirace - Wikipedia
A Wikirace (IPA: /ˈwɪ.ki.rɛɪs/) is a race between any number of participants, using links to travel from one Wikipedia page to another. The first person to reach the destination page, or the person that reaches the destination using the fewest links, wins the race. Intermediary pages may also be required.
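The "fewest links" winner is the shortest path in the link graph, which breadth-first search finds directly. A sketch over a tiny invented link graph (not real Wikipedia data):

```python
from collections import deque

def fewest_links(graph, start, goal):
    """BFS over page links; returns the shortest chain of pages,
    or None if no chain of links connects start to goal."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

graph = {
    "Cheese": ["Milk", "France"],
    "France": ["Paris", "Milk"],
    "Paris": ["Eiffel Tower"],
}
print(fewest_links(graph, "Cheese", "Eiffel Tower"))
```

BFS explores pages in order of distance from the start, so the first path that reaches the goal is guaranteed to use the fewest links.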
wikipedia  wiki  games  hypertext 
11 weeks ago by jbrennan


