Enable wiki URLs to declare a lineup using location.hash in addition to location.pathname #280
Conversation
A few notes about some trade-off decisions we know about.
```coffeescript
state.syncDomWithLocation = ->
  main = ""
  for {site, slug} in state.fromLocation(location)
    dataSite = ""
    if site != "view"
      dataSite = """data-site="#{site}" """
    main += """<div id="#{slug}" #{dataSite}class="page"></div>\n"""
  $('section.main').html(main)
```
This function is where we introduce more confusion to the bootstrapping of wiki-client. We replace the contents of <section class="main"> with our own page divs computed from window.location.
For comparison, here is where @paul90 made a similar intervention in the DAT/hypercore variant.
paul90/wiki-client-dat-variant@15cb656#diff-5b6ac0ea3ef5483a6e071c1ff85d99a3c4ef31d97b567deff46686fb93a25921R17-R33
For the time being, we are deferring the work to understand and change the existing wiki-client bootstrap.
```coffeescript
before ->
  global.$ = (el) -> {attr: (key) -> el[key]}
  global.history =
    pushState: (state, title, url) -> actual = url
  lineup.bestTitle = () -> title
```
We're stubbing jQuery's $, history, and the imported lineup. In specific test cases below we also stub location and document.title. These stubs allow the tests to pass where we do not have these browser APIs, but they are also tightly coupled to the current implementations.
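For readers unfamiliar with the pattern, here is a hypothetical sketch of the same stubbing idea in plain JavaScript (the language CoffeeScript compiles to). The stub names follow the snippet above; the exercising code and values are illustrative, not the PR's actual tests:

```javascript
// Hypothetical sketch of the test-stubbing pattern described above.
// Minimal fakes stand in for browser APIs so code under test can run
// in Node without a DOM.
global.$ = (el) => ({attr: (key) => el[key]})  // jQuery stub: attr() reads a plain property

let actual  // captures the url passed to pushState
global.history = {
  pushState: (state, title, url) => { actual = url }
}

// Exercising the stubs the way code under test would:
global.history.pushState({}, 'Welcome Visitors', '/view/welcome-visitors')
console.log(actual)                                   // '/view/welcome-visitors'
console.log($({id: 'welcome-visitors'}).attr('id'))   // 'welcome-visitors'
```

As the comment above notes, such stubs are convenient but couple the tests to implementation details of the code they fake.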
We have a companion to this PR in a separate repo: readonly.js. This JavaScript program copies client-side code and pages into a folder suitable for publishing with GitHub Pages. There is some chance we'll include the publishing script in this PR before we're done.
Let's consider for a moment what one must do to be sure that a saved hash URL will continue to work in the presence of URL format evolution. First we note that some server will provide the wiki-client that must recognize any particular URL, and the name of that server is already recorded in the saved URL. This gives us considerable latitude. However, if a server operator chooses to vary the scheme for sites made public, they have some obligation to support all variations going forward. This seems a modest burden, but one the authors of an open-source wiki-client must bear.

It is possible that the URL scheme could include some page-specific information. We could, for example, provide a ghost page with some information currently lost to the URL. For example, a search result could include the search term in the URL so that the search is easily repeated. Maybe the hash is base64 JSON?

Continuing in this direction, Wiki-21 has suggested developing the "panel" abstraction. The hash could then consist of a rollup of plugin, panel, and lineup state to be reconstituted at will. These opportunities are beyond this pull request unless some thought experiment along these lines suggests how we might enable this future.
The most recent commit in this PR sets up this format to support static wikis and to open some room for evolution:
The conformance test cases in this PR are a step in the direction of a test suite supporting URL format evolution. But these cases do not yet express that story very well.
This was also in my mind as I was envisioning the internal structure of the parsed URL—looks similar to the panels in wiki-21. And I think this was the motivator to introduce a parameter to give a name to the format, as we similarly give names to items to indicate which plugin should interpret the text. Imagine something like the following as code that would construct a base 64 representation:
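A sketch of what such a base64 construction might look like, in plain JavaScript. All names here are hypothetical, including the "lineup-v1" format name, which plays the role described above of naming the format the way items name their plugin:

```javascript
// Hypothetical sketch: constructing a base64-JSON lineup for location.hash.
// The "format" field names the encoding so future formats can coexist,
// much as items carry a type naming the plugin that should interpret them.
const lineup = {
  format: 'lineup-v1',  // hypothetical format name
  panels: [
    {site: 'view', slug: 'welcome-visitors'},
    {site: 'fed.wiki.org', slug: 'federated-wiki'}
  ]
}

// Node's Buffer is used here; a browser would use btoa(JSON.stringify(lineup)).
const encoded = Buffer.from(JSON.stringify(lineup)).toString('base64')
const decoded = JSON.parse(Buffer.from(encoded, 'base64').toString('utf8'))
console.log(decoded.format, decoded.panels.length)  // 'lineup-v1' 2
```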
The base64 JSON would give ample room for innovation. More terse formats, such as our current implementation, can be transformed into that internal structure; indeed, that's the implementation we currently have.
This is not a complete test of everything state does. We have focused tests around the uses of location.pathname. We will be introducing use of location.hash and want to be sure we know if we break previous behavior.
We introduce an internal structure to represent the Reader's intent for the lineup. We believe this opens several opportunities.

We enable a URL search parameter to provide the intended lineup while preserving support for existing use of URL pathname. Using URL search instead of URL pathname opens the door to statically hosted wikis. (N.b. introducing this additional behavior means this commit is not a pure refactoring.)

We anticipate this intermediate structure enabling other changes to clarify responsibilities between state and lineup. The {site, slug} invites {site, slug, revision} and could open the door for better support of non-European languages with something like {site, slug?, revision, title}.

Previous experiments in the DAT/hypercore wiki variant and seran outputs have used the URL hash where this commit chooses URL search. One rationale is to avoid collisions in our URL semantics with whatever Google are experimenting with in their recent Chrome plugin for links to text fragments. Google use #:~:... to delimit these new fragment links.

One other decision hidden in this code is the choice of the specific search parameter "pathname" to hold the intended lineup. This opens experimenting with other URL structures like the slug@site/slug@site we have in some other experiments.
small steps toward making setUrl adapt to both pathname and search
This commit revisits a decision described in commit 4ea7ab1. We previously chose location.search partly to avoid collisions with Google Chrome's experiments with links to text fragments. We have learned that wiki-server replies with 302 redirects. In this commit we choose to use location.hash to allow us to release this functionality without requiring changes to wiki-server.
This commit enables wiki-client to support a statically hosted wiki while maintaining its support for dynamically hosted node-based wikis.

We hope to find time to follow this commit with some refactoring. In particular, state.syncDomWithLocation() is hacking the DOM of the html page early in the bootstrapping of the client, undoing or redoing some of the work wiki-server does for us. This will unfortunately increase the confusion in the wiki bootstrapping. There are details in the client's bootstrapping that need disentangling between wiki.coffee, legacy.coffee, and state.coffee. Some changes will also be wanted in wiki-server.

What we get in exchange for this increase in confusion is the ability for authors to host wikis in GitHub Pages or in similar static web sites. We believe this paves the way toward easier wiki hosting.
Goals
Enable wiki-client to support a statically hosted wiki.
Preserve support for dynamically hosted, node-based wikis.
Rationale
We introduce an internal structure to represent the Reader's intent for the lineup. We believe this opens several opportunities.
We enable a URL hash parameter to provide the intended lineup while preserving support for existing use of URL pathname. Using URL hash instead of URL pathname opens the door to statically hosted wikis.
We anticipate this intermediate structure enabling other changes to clarify responsibilities between state and lineup.
The {site, slug} invites {site, slug, revision} and could open the door for better support of non-European languages with something like {site, slug?, revision, title}.
We notice that using the URL hash invites collisions. Google are experimenting in a recent Chrome plugin to enable the use of #:~: to delimit links to text fragments. This PR makes no attempt to detect or resolve such collisions.

One other decision hidden in this code is the choice of the specific parameter "pathname" to hold the intended lineup. This opens experimenting with other URL structures like the slug@site/slug@site we have in Seran Outpost and other experiments.
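To make the "pathname" parameter decision concrete, here is a hypothetical JavaScript sketch of how a hash-carried lineup could be parsed into the internal {site, slug} structure. The function name, the fallback logic, and the URL shapes are assumptions for illustration, not the PR's actual code:

```javascript
// Hypothetical sketch: derive the intended lineup from location.hash,
// falling back to location.pathname. The parameter name "pathname"
// follows the decision described above; everything else is illustrative.
function fromLocation(location) {
  const params = new URLSearchParams(location.hash.replace(/^#/, ''))
  const path = params.get('pathname') || location.pathname
  const parts = path.split('/').filter(part => part.length > 0)
  const lineup = []
  // Interpret the path as alternating /site/slug pairs.
  for (let i = 0; i + 1 < parts.length; i += 2) {
    lineup.push({site: parts[i], slug: parts[i + 1]})
  }
  return lineup
}

// Static hosting: the lineup arrives in the hash.
console.log(fromLocation({hash: '#pathname=/view/welcome-visitors', pathname: '/'}))

// Dynamic hosting: the lineup still comes from the pathname.
console.log(fromLocation({hash: '', pathname: '/view/welcome-visitors/fed.wiki.org/federated-wiki'}))
```

A structure like this keeps both hosting modes behind one parse step, which is what lets the rest of the client remain indifferent to where the lineup came from.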