We consider what one might expect of a federated wiki server under diverse circumstances. We hope a brief reflection will guide improvements in servers, clients and the protocols between them.
See also Laptop Server
We have two full-featured servers, ruby and node, that support interactive content creation.
We have various read-only servers that participate by serving json into a client started elsewhere.
We've had marginal success with purely static sites where client assumptions make bootstrapping awkward.
We've made two versions of a c2.com wiki adapter, one batch, the other interactive.
We've come to depend on sitemaps. Sites without them can hardly be considered federated.
We're finding more value in server-side rendering, which reduces the latency of the initial payload and improves accessibility.
We've explored including data by batch or on-demand page generation as well as a streaming side-channel.
We've made drag-and-drop a workable mechanism for connecting previously unrelated sites.
A server that exists to be read remotely must serve JSON with CORS headers. See JSON Schema
A root page, welcome-visitors, must be present and should lead readers to other useful content.
A flag and sitemap should be present in order to appear as a functioning member of a neighborhood.
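The sitemap is a json array with one entry per page. This sketch shows the commonly seen fields; the values are made up here.

```json
[
  { "slug": "welcome-visitors",
    "title": "Welcome Visitors",
    "date": 1390101714019,
    "synopsis": "Welcome to this Federated Wiki site." }
]
```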
A composite of the menu entries from each available plugin's factory.json should be served as factories.json.
A stubbed version of a lineup should be generated for (/view/page)+ urls used for browser history.
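Recovering the lineup from such a url is a small parsing job, sketched here; a server could then stub one page element per slug and let the client fetch and render the rest.

```javascript
// A sketch of parsing a browser-history url of the form (/view/page)+,
// e.g. /view/welcome-visitors/view/recent-changes, into its lineup.
function lineup(url) {
  return [...url.matchAll(/\/view\/([a-z0-9-]+)/g)].map(m => m[1])
}
```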
A server that supports editing must handle ajax posts of actions, perform those actions, and persist the results.
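Performing an action amounts to reducing it over the page's story and appending it to the journal. The action types here (create, add, edit, move, remove) follow the ones the existing servers handle, though the details are simplified in this sketch.

```javascript
// Apply one journal action to a page, a sketch.
function apply(page, action) {
  const story = page.story = page.story || []
  const index = id => story.findIndex(item => item.id === id)
  switch (action.type) {
    case 'create':
      Object.assign(page, action.item)  // item carries title, initial story
      break
    case 'add': {
      const at = action.after ? index(action.after) + 1 : story.length
      story.splice(at, 0, action.item)
      break
    }
    case 'edit':
      story[index(action.id)] = action.item
      break
    case 'move':
      page.story = action.order.map(id => story[index(id)])
      break
    case 'remove':
      story.splice(index(action.id), 1)
      break
  }
  // persist the action so the page's history can be replayed later
  ;(page.journal = page.journal || []).push(action)
  return page
}
```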
A user should be able to claim a server and then later authenticate as the one user with permission to edit.
A family-compliant gradient flag (favicon.png) should be generated for sites that don't yet have one.
A server may host multiple virtual sites from a single server instance (a wiki farm).
A server may accept submissions of local changes to be stored and served as subdomains.
Servers might support more exotic sitemaps that include enough information to trace links forward or backwards through pages.
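One such sitemap could record each page's outbound links so neighbors can trace them without fetching every page. This sketch assumes internal links written in the [[Page Title]] wiki markup.

```javascript
// Reduce a title to its slug, the usual wiki convention.
function asSlug(title) {
  return title.replace(/\s/g, '-').replace(/[^A-Za-z0-9-]/g, '').toLowerCase()
}

// Build a sitemap entry that also lists outbound links, a sketch.
function sitemapEntry(slug, page) {
  const links = new Set()
  for (const item of page.story || []) {
    for (const [, title] of (item.text || '').matchAll(/\[\[([^\]]+)\]\]/g)) {
      links.add(asSlug(title))
    }
  }
  return { slug, title: page.title, links: [...links] }
}
```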
Servers should support some safe and reliable mechanism for deleting unwanted pages and undeleting them when they become wanted. See Deleting Me Softly
Servers shouldn't need to participate in browser history by parsing and echoing lineups.
Servers have fetched pages from remote servers during a fork, but shouldn't continue to. Rather, the client should relay the content.
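A client-relayed fork can be sketched as: the browser fetches the remote page json itself, then submits it to its own origin as a fork action, so the origin server never contacts the remote site. The action shape and the /page/:slug/action route in the commented wiring are assumptions here.

```javascript
// Build the payload the client relays to its own origin, a sketch.
function buildForkAction(remoteSite, page) {
  return {
    type: 'fork',
    site: remoteSite,   // where the content came from
    item: page,         // the full page json, relayed by the client
    date: Date.now()
  }
}

// wiring, in the browser:
// const page = await (await fetch(`//${remoteSite}/${slug}.json`)).json()
// await fetch(`/page/${slug}/action`, {
//   method: 'PUT',
//   body: new URLSearchParams({
//     action: JSON.stringify(buildForkAction(remoteSite, page)) })
// })
```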