
wiki's Introduction

Federated Wiki (Node.js server version)

The original wiki was written in a week and cloned within a week after that. The concept was shown to be fruitful while leaving other implementers room to innovate. When we ask for simple, we are looking for the same kind of simplicity: nothing to distract from our innovation in federation. -- Smallest Federated Wiki

Since the Node version was created, to complement the original Ruby implementation, there has been a risk of the two versions diverging. A first step to prevent divergence was to extract the client code into wiki-client, which was then used by both the Ruby and Node servers. However, with both server repositories still retaining the static components of the client, together with the plug-ins, some risk of divergence remained.

In this latest version of the node version of Federated Wiki, we continue by:

  • including all the client components in the wiki-client, and
  • moving all plug-ins into their own repositories, see below for a list.

When we originally extracted the wiki-client, we included it back into wiki whilst building the wiki package. This had the unforeseen consequence that whenever an updated wiki-client was created it was also necessary to create a new version of the wiki package for the updated client to become available. To avoid this, the client is no longer bundled within the server package.

Here we have a new wiki repository, and package, which only exist to pull together the federated wiki modules (wiki-server, wiki-client, and plug-ins) and start the server.

Using Federated Wiki

Learn how to wiki by reading fed.wiki.org

Running your own Server

The quickest way to set up wiki on your local machine is to install it globally with npm:

$ npm install -g wiki
$ wiki

Visit localhost:3000 to see your wiki. If you choose a host visible to the internet then others in the federation can use your work.

Running a test server

If you would prefer to test wiki without installing system wide, you can do the following (in a directory of your choice):

$ npm install wiki --global-style
$ npx wiki --data ./data

Without the --data argument, running wiki (installed globally or otherwise) will store data in ~/.wiki.

N.B. The wiki packages must be installed with --global-style to work.

Updating the Server Software

From time to time some of the packages that make up the wiki software will be updated. To see if updates are available for any of the wiki packages, run:

$ npm outdated --silent -g | grep '^Package\|^wiki'

If there are any updates available, the globally installed wiki can be updated by re-installing it:

$ npm install -g wiki

We re-install rather than update because npm update -g wiki will only pick up changes if the wiki package itself has been updated.

An alternative approach would be to run npm update in the directory containing the wiki install. The location for running this will vary depending on which platform you are on.

If you installed the wiki package locally, rather than globally, you can run npm outdated --silent | grep '^Package\|^wiki' and npm update in the directory you installed the wiki package.

Server Options

Options for the server can be passed in many ways:

  • As command line flags
  • As a configuration JSON file specified with --config
  • As a config.json file in the root folder or cwd.
  • As env vars prefixed with wiki_

Options higher in the list take precedence. The server will then try to guess all unspecified options.
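As a quick sketch of the env-var form listed above: option names mirror the command-line flags, so the exact names used here (port, data) are assumptions to check against your wiki version.

```shell
# Sketch: setting server options as environment variables with the wiki_
# prefix, mirroring the --port and --data flags (names are assumptions).
export wiki_port=3000
export wiki_data="$HOME/.wiki"

# List the wiki_ options currently set.
env | grep '^wiki_'
```

Remember that a command-line flag or --config file would override these, per the precedence order above.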

Configuring Security

By default, a built-in security module is configured which makes the wiki read-only.

For details on how to configure the bundled Passport-based security module, and on migrating from Mozilla Persona, see the security configuration documentation.

N.B. The Mozilla Persona service closes on 30th November 2016.

Neighborhood Seeding

Two options are available for seeding a neighborhood.

When running a server farm, --autoseed will populate the neighborhood with the other sites in the farm that have been visited.

Adding --neighbours 'comma separated list of sites' will add those sites to the neighborhood.
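Since options can also be supplied via a config.json file (see Server Options above), the seeding options can presumably be expressed there too. This is a sketch; the assumption that config keys mirror the flag names, and the example hostnames, should be checked against your wiki version:

```json
{
  "autoseed": true,
  "neighbours": "fed.wiki.org,ward.asia.wiki.org"
}
```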

Datastore options


NOTE: This release changes how support for different datastores is provided, and how they are configured. The previous configuration method is deprecated, and will be removed in a future version.

There are a number of legacy database page stores; note that they DO NOT work with wiki in farm mode.

A number of datastores are supported. Use the --database and --data options to configure, or use the config.json.

The default location of the datastore is ~/.wiki, which contains two sub-directories pages and status:

  • pages is used with flatfiles, or leveldb, to store your pages, and
  • status stores the site's favicon, and a file containing the identity (email address) of the site owner.
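The layout described above can be sketched on a scratch directory; the real server creates these directories under ~/.wiki itself, so the path here is illustrative only.

```shell
# Recreate the default datastore layout under a scratch directory,
# rather than touching the real ~/.wiki.
WIKI_DATA="$(mktemp -d)/wiki-data"
mkdir -p "$WIKI_DATA/pages" "$WIKI_DATA/status"

# pages/ holds flatfile or leveldb page data; status/ holds the favicon
# and the owner identity file.
ls "$WIKI_DATA"
```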

flatfiles (default)

The default path to store page data is in a "default-data" subdirectory of the install directory. You can override it like this:

$ wiki --data FILESYSTEM_PATH

leveldb

Support for leveldb is added by installing the wiki-storage-leveldb package; this can be achieved by running npm install wiki-storage-leveldb --save in this directory.

The leveldb datastore uses JSON encoded leveldb format and is configured by providing a filesystem path:

$ wiki --database '{"type": "leveldb"}' --data FILESYSTEM_PATH

The leveldb datastore allows for a graceful upgrade path. If a page is not found in leveldb the flatfile datastore will be consulted.

Participation

We're happy to take issues or pull requests regarding the goals and their implementation within this code.

A wider-ranging conversation is documented in the GitHub ReadMe of the founding project, SFW.

The contributing page provides details of the repositories that form the node.js version of Federated Wiki, together with some guidance for developers.

License

MIT License

wiki's People

Contributors

chmac, dobbs, joshuabenuck, lenada, nrn, paul90, sebhoss, wardcunningham


wiki's Issues

Trouble: Can't Get Page

When fetching the following page I get an error - http://wiki.parliamentofthings.org/view/map-of-science:

The page handler has run into problems with this request.

{"slug":"map-of-science"}
The requested url.
/map-of-science.json
The server reported status.
500

From the console I get the following information and warnings:

Using //@ to indicate sourceMappingURL pragmas is deprecated. Use //# instead map-of-science:618:0
Using //@ to indicate sourceMappingURL pragmas is deprecated. Use //# instead jquery-migrate-1.1.1.min.js:3:0
"Window Name: wiki.parliamentofthings.org" map-of-science:666:458
syntax error map-of-science.json:1:1
"pageHandler.get error" Object { readyState: 4, getResponseHeader: .ajax/N.getResponseHeader(), getAllResponseHeaders: .ajax/N.getAllResponseHeaders(), setRequestHeader: .ajax/N.setRequestHeader(), overrideMimeType: .ajax/N.overrideMimeType(), statusCode: .ajax/N.statusCode(), abort: .ajax/N.abort(), state: .Deferred/r.state(), always: .Deferred/r.always(), then: .Deferred/r.then(), 11 more… } 500 "error" "Internal Server Error" map-of-science:666:26625
".page keys " Array [ "154966d1" ] map-of-science:667:7521
"lineup keys" Array [ "154966d1" ] map-of-science:667:7660
"ajax error" Object { type: "ajaxError", timeStamp: 1428657373645, jQuery19106335917615261464: true, isTrigger: true, namespace: "", namespace_re: null, result: undefined, target: HTMLDocument → map-of-science, delegateTarget: HTMLDocument → map-of-science, currentTarget: HTMLDocument → map-of-science, 2 more… } Object { readyState: 4, getResponseHeader: .ajax/N.getResponseHeader(), getAllResponseHeaders: .ajax/N.getAllResponseHeaders(), setRequestHeader: .ajax/N.setRequestHeader(), overrideMimeType: .ajax/N.overrideMimeType(), statusCode: .ajax/N.statusCode(), abort: .ajax/N.abort(), state: .Deferred/r.state(), always: .Deferred/r.always(), then: .Deferred/r.then(), 11 more… } Object { url: "/map-of-science.json?random=9c116c9b", type: "GET", isLocal: false, global: true, processData: true, async: true, contentType: "application/x-www-form-urlencoded; charset=UTF-8", accepts: Object, contents: Object, responseFields: Object, 11 more… } map-of-science:666:13426
Use of getPreventDefault() is deprecated. Use defaultPrevented instead. map-of-science:620:0
"textEditor" "900f81790a29ef8c" Object { append: true } map-of-science:666:5158
"textEditor" "900f81790a29ef8c" Object { append: true } map-of-science:666:5158

neighbourhood seeding

It has been a while since neighbourhood seeding was added in #18

With a number of people having forage or journal sites, and the latest easiest install running in farm mode by default, I wonder if it might be an idea to use the sub-domain structure to direct the seeding, beyond simply adding all active sites within the farm plus a specific list.

The idea would be to add active sub-domains of the accessed domain to the neighbourhood. So, for example, those writing a journal wiki could create something like journal.example.wiki as an entry point for visitors, which would have a seeded neighbourhood including its sub-domains, for example 2014.journal.example.wiki and 2015.journal.example.wiki.

Data sources

Currently we only have the data plug-in. Historically we also had a data path/directory, still referred to in the Metabolism plug-in.

It would be nice to re-introduce this, possibly with provision for the automatic creation of a data page which allows a snapshot of the data to be forked. This would require some thought, though, as the existence of a data page should not block access to the underlying data on the origin server.

how to publish a new plugin

I have a decent literate plugin.
I would like to publish it via npm in order to have it installed on OpenShift as described by Paul Rodwell.
I am not sure whether you used some sort of prepublish script or what you did.
Please explain.
Thanks in advance

Express 4 - testing this update

The express 4 version of the server package wiki-server has been published.

There are a number of significant changes with this version, so the minor version of both the wiki-server and wiki-client has been incremented. While this repository has been updated to use these new versions, there will be a short pause before the wiki package is updated, to allow some extra testing.

To use this new version before the package is published, install it by running:
npm install git+https://github.com/fedwiki/wiki-node.git

Things to note about this update:

  • update to Express 4.
  • update to CoffeeScript 1.8.
  • an option for neighbourhood seeding, see ReadMe
  • Storage options for levelDB, mongoDB and redis are moving to their own packages - support for storage packages added. With this release the database integration remains in the core, using the configuration from the previous version. The new storage packages are used with a slight change to configuration, see the ReadMe. The new storage packages are:

Hubzilla and Zot -- any insights for federated wiki?

Is there anything that the federated wiki community could learn from the Hubzilla as "a distributed network of hubs" at http://hubzilla.org/sandbox/index.html, and/or Zot, described as "a JSON-based web framework for implementing secure decentralised communications and services"?

Hubzilla seems to be a project that has evolved from Friendica, and is finding some alternative ways of getting around some roadblocks to Diaspora*. See the Hubzilla History at https://hubzilla.site/help/history .

Zot is part of Hubzilla at https://github.com/redmatrix/hubzilla/blob/master/include/zot.php , and there's an Intro to Zot at https://hubzilla.site/help/develop.

I've been experimenting with Hubzilla at https://hubzilla.site/channel/systems. I have some confusion about the way they do channels -- they can be (i) social, (ii) forum, (iii) feed, or (iv) special, as described at https://hubzilla.site/help/roles . This seems to be inexperience on my part, so I should think differently.

The project is intriguing to me.

Copying/forking multiple pages (i.e. complete wikis)

If I get to the state where there are a significant number of wiki pages to be compared and merged together, asking a collaborator to fork one page at a time will get to be annoying. As an example, there could be a complete wiki site for a pattern language, and forking each one independently could be tedious.

In looking at Tiddlywiki v5 (which uses node.js), I can see that it would be relatively straightforward for someone to copy a directory of Tiddlers (which are now separate files, rather than one huge file). This means that migrating a whole wiki site is relatively straightforward.

In some previous communications, I think that @WardCunningham had said that he maintains a private wiki site on a machine behind a firewall, and then pushes content out to public sites. Is this done by forking every private page into a public page, or is there some batch copying that is done?

Rename this repo to "wiki" to match npm package "wiki"

This repo dates back to a period of flux when the ruby implementation was still the "reference" implementation. However, the ruby version has fallen into disrepair. This repo is now our reference and should be easily cloned and managed by server operators who wish to keep a variation of their sites but still want to track this repo as their upstream.

All this will be a lot simpler for everyone if this repo were renamed "wiki".

GitHub documentation says this will be easy.
https://help.github.com/articles/renaming-a-repository/

We have 30 forks already. Folks, what do you think? Better if this is "wiki"? Comments here could explain what followers might do with their existing forks. GitHub says even that is not required, though it is preferable. We are trying to simplify after all.

We would also like to see forks of this repo flourish as scoped packages easily installed by anyone from npm. Npm docs say even this is easy.
https://docs.npmjs.com/getting-started/scoped-packages

Comments are welcome as to whether there are other cleanup activities which should be conducted here before making this change. I'm interested specifically in normalizing how we handle independent plugins. Can we think about a name change in about a week?

Newline convention interferes with launching wiki.

I installed the latest npm wiki on my work mac today. When I launch it I get the error message:

$ wiki
env: node\r: No such file or directory

On further examination I find that this is because the script in /usr/local/bin/wiki uses \r\n as its line termination.

I notice that the file in this repo, when cloned to my mac, has the desired \n convention. Could it be that this is a difference between distributing files via npm and git?
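One way to confirm and repair the line-ending problem, sketched here on a scratch copy rather than the installed /usr/local/bin/wiki (dos2unix would do the same job):

```shell
# Simulate the broken launcher: a shebang line terminated with \r\n,
# which makes the kernel look for an interpreter literally named "node\r".
SCRIPT="$(mktemp)"
printf '#!/usr/bin/env node\r\nconsole.log("hi")\r\n' > "$SCRIPT"

# Strip the carriage returns, producing the \n convention the shell expects.
tr -d '\r' < "$SCRIPT" > "$SCRIPT.fixed"
head -1 "$SCRIPT.fixed"
```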

Visibility of References (plugin-references)

It took me a while to find out that references are references, after I first wondered about how pages are composed. It turned out I wasn't able to see the slightly blue background, which would have helped me quite a lot in identifying these as specific elements of the page. I was looking at the display at a (tiny) bit of an angle, which made all the difference here. As this is the first time that I noticed this making a difference, I assume it might be helpful to highlight these a bit more distinctively.

I played around with some settings and would like to suggest highlighting the external links created by the references-plugin in the same way external pages are highlighted. Is that a sound idea in the sense of conformity?

I used the same box-shadow but had to make the border a bit smaller and found an indentation of the paragraph further helpful. The CSS code I used was this (instead of the background-color: #F0F8FF; that was there):

.reference {
    box-shadow: 0px 0px 10px 0px rgba(0, 180, 220, 0.5) inset;
    margin-left: 15px;
    padding: 7px;
}

The paragraph within gets an inline style set by the reference-plugin in line 15. This needs another setting in addition to the above, otherwise the top looks "ugly". Either change it to the following there (if inline is actually desired - plugin-architecture or something?) or, while at it, move it to the CSS as well with something like this:

 .reference p {
    margin-top: 3px;
    margin-bottom: 3px;    
}

The latter is not tested, but .reference p should address the paragraph within the reference-div, right? I would have made this change and created a pull-request after testing, but I wasn't able to find any source for the CSS file yet. Is that maybe generated on-the-fly or something?

localhost:3000 contains a folder /Users/wcunningham/...

I was trying to fix what I consider an aesthetic bug in factory plugin (the list of plugins is centered, so the bullets are not aligned...see image in choose plugin page) and realized that following the contributing instructions I was not able to have my factory.js loaded in my locally run wiki.
Looking at the Chrome Dev Tools Network tab, it is not loaded (contrary for example to my other personal plugins).
Then if I look in the Sources tab, under localhost:3000, I find the following folder
/Users/wcunningham/test-refactor/wiki-client
and in its lib directory I see a factory.js file (not mine).

The instructions for contributing are ok with my plugins, probably because they are new ones not in the standard set...or anyway I am lost. Suggestions?

Sitemaps should ignore forks when dating pages.

A recently forked page will show up in the twins as 'newer' implying that there is something worth seeing there that isn't on the current page. This isn't true. Even with implicit forks, those caused by editing, the forking and editing are recorded in separate actions which means nothing is lost by ignoring those forks too.

The twins includes a category 'same' which shows up when pages are copied from site to site using some operating system commands. It makes sense that copies made with the fork button, and rightfully recorded in the journal, should be characterized the same way in twins.

Hover over fork actions would continue to report the date of the copying as they should.

Support for HTTPS and other privacy technologies

With Let's Encrypt in public beta, I decided to put SFW on HTTPS: I have SFW running on localhost and use the Caddy webserver as a frontend to handle TLS certificates and proxying.

Having Caddy in the front of SFW works well on a http connection, but on a https connection several things seem to fail:

  1. The browser lock icon doesn't go green, probably meaning that there are insecure elements on the page. Which are those?
  2. In the lower right corner, the site icons keep turning (on http they almost instantly come to a halt).
  3. When clicking wiki links, e.g., links on the “How To Wiki” page, the pages turn up with a title only — there are no contents.

Is there anything I can do to remedy these issues, save for running SFW on an insecure connection?

command line wiki reports wrong version

This site says the current version is 0.1.6.

https://www.npmjs.org/package/wiki

If I install it and run the command line wiki -v it reports version 0.1.3.

I'm guessing that the command line wiki program is reporting the version of the wiki-server package which is 0.1.3, but it is only part of wiki.

https://www.npmjs.org/package/wiki-server

I have rev'd the wiki-client, wiki-server and wiki packages, in that order, trying to make the new printing css available to someone who is just running npm install -g wiki. I found that npm update wiki wouldn't update unless I rev'd the wiki package.

Am I publishing correctly? Do we have everything set up right to publish updates?

Shutting down persona.org in November 2016

Mozilla have just announced that persona.org is going to be closed, below is a copy of the announcement.


Hi Everyone,

When the Mozilla Identity team transitioned Persona to community
ownership, we committed resources to operational and security support
throughout 2014 [1], and renewed that commitment for 2015 [2]. Due to
low, declining usage, we are reallocating the project’s dedicated,
ongoing resources and will shut down the persona.org services that we run.

Persona.org and related domains will be taken offline on November 30th,
2016.

If you run a website that relies on Persona, you will need to implement
an alternative login solution for your users. We have assembled a wiki
page with additional information and guidelines for migration [3], but
here are the important things you need to know:

Between now and November 30th, 2016, Mozilla will continue to support
the Persona service at a maintenance level:

  • Security issues will be resolved in a timely manner and the services
    will be kept online, but we do not expect to develop or deploy any
    new features.
  • Support will continue to be available on the dev-identity mailing
    list [4] and in the #services-dev IRC channel [5].
  • All websites that rely on persona.org will need to migrate to
    another means of authentication during this period.

Beginning on November 30th, 2016, the Persona service hosted by Mozilla
will be decommissioned:

  • All services hosted on the persona.org domain will be shut down.
  • Mozilla will retain control of the persona.org domain and will
    not transfer it to a third party.
  • Since the privacy of user data is of utmost importance to Mozilla,
    we will destroy all user data stored on the persona.org servers,
    and will not transfer it to third parties.

We intentionally designed Persona to expose email addresses rather than
opaque identifiers, which should ease the transition to other systems
that provide verified email addresses. You can find guidelines on
alternative login solutions on the wiki [3] and we will continue to
update them over the coming year. We strongly encourage affected teams
to openly discuss and blog about their migrations on the dev-identity
mailing list so that others can learn from their experience.

Thank you for your support and involvement with Persona. If you have any
questions, please post them to the dev-identity mailing list.

Ryan

[1]
http://identity.mozilla.com/post/78873831485/transitioning-persona-to-community-ownership
[2] https://groups.google.com/forum/#!topic/mozilla.dev.identity/rPIm7GxOeNU
[3]
https://wiki.mozilla.org/Identity/Persona_Shutdown_Guidelines_for_Reliers
[4] https://lists.mozilla.org/listinfo/dev-identity
[5] http://irc.mozilla.org/#services-dev

claiming wiki via persona doesn't seem to function

When I log in with Persona with wiki's "Claim with your Email" button, I'm successfully verified by persona, wiki refreshes with my information, then wiki refreshes again with a modal stating:

Login Failure

It looks as if you are accessing the site using an alternative address.

Please check that you are using the correct address to access this site.

This is a private instance.

Latest Coffee-Script breaks wiki

The latest release of coffee-script, 1.7.0, causes problems.

module.js:340
    throw err;
          ^
Error: Cannot find module 'wiki-server/lib/cli'
  at Function.Module._resolveFilename (module.js:338:15)
  at Function.Module._load (module.js:280:25)
  at Module.require (module.js:364:17)
  at require (module.js:380:17)
  at Object.<anonymous> (C:\tmp\wiki\node_modules\wiki\bin\server.js:5:1)
  at Module._compile (module.js:456:26)
  at Object.Module._extensions..js (module.js:474:10)
  at Module.load (module.js:356:32)
  at Function.Module._load (module.js:312:12)
  at Function.Module.runMain (module.js:497:10)
  at startup (node.js:119:16)
  at node.js:902:3

There looks to be an issue over there to cover this, but simply changing server.js to include the extension raises more errors

C:\tmp\wiki\node_modules\wiki\node_modules\wiki-server\lib\cli.coffee:1
(function (exports, require, module, __filename, __dirname) { # **cli.coffee**
                                                              ^
SyntaxError: Unexpected token ILLEGAL
  at Module._compile (module.js:439:25)
  at Object.Module._extensions..js (module.js:474:10)
  at Module.load (module.js:356:32)
  at Function.Module._load (module.js:312:12)
  at Module.require (module.js:364:17)
  at require (module.js:380:17)
  at Object.<anonymous> (C:\tmp\wiki\node_modules\wiki\bin\server.js:5:1)
  at Module._compile (module.js:456:26)
  at Object.Module._extensions..js (module.js:474:10)
  at Module.load (module.js:356:32)
  at Function.Module._load (module.js:312:12)
  at Function.Module.runMain (module.js:497:10)
  at startup (node.js:119:16)
  at node.js:902:3

As that error is in the generated JavaScript it looks as if they have bigger problems...

Quick questions on plugin dev

Here's the expand for my reworked plugin :

expand = (text)->
    mp = new MarkupProcessor('')
    text = mp.page(text)
    text = wiki.resolveLinks(text)

And here's emit

emit = ($item, item) ->
  $item.append """
    <p style="background-color:#eee;padding:15px;">
    #{expand item.text}
    </p>
 """

Two questions.

  1. I get a failure trying to resolveLinks in the unit-tests because it says it can't find "wiki". Presumably it's not in scope in the tests. Though it seems to be when rendered by SFW itself.

This isn't fatal but is there a way to avoid it?

  2. The HTML in the result is still being escaped into <h3> &amp; etc. I'm not doing this in my MarkupProcessor, and I removed the lines that did it in the example expand. So where is it happening? And can I turn it off?

Plug-ins (default set)

While we are adding the Markdown plug-in, #28, are there any of the older plug-ins that should be removed from the default install?

My initial thought is that there are a few that rely on specific server-attached hardware, twadio and txtzyme. And probably logwatch and parse as well.

Thoughts...

Pages with Unicode Titles Don't Work

I ran the wiki on my machine and noticed that pages with (for example) Japanese-only titles don't work. However, if the title includes some ASCII it will work. So "テスト" results in a page I can't edit, but "テストfiddle" results in an editable page, but only "fiddle" shows in links. Is this a bug?

URL semantics

I want to be able to refactor different wikis side by side by pasting one URL after the other

http://change.lab/view/welcome-visitors/http://jon.patterns.wiki.transformap.co/view/welcome-visitors

into the URL bar, like I do at times, usually including the necessary adaptations, when I don't want to leave traces of drags and drops.

If we understand "crude and unappealing" (tm) interfaces as unwelcoming human interaction choices, probably also reading all these /view fragments may not be needed at all one day? Yet I understand their existence is implied if we want to be able to distinguish between a remote and the current location.

Now I am asking myself why I am avoiding the drag and the drop at all.

Migrating to fedwiki org.

Part of the refactor, see WardCunningham/Smallest-Federated-Wiki#403

  • review the documentation, and update
  • move to fedwiki org, as wiki-node

This will replace the existing wiki npm package, so need to get all the docs to reflect this, including details on how to contribute (both to the existing components, and also writing new plugins).

A few more steps are needed once the wiki and wiki-client steps are completed.

  • ensure package.json is updated, using npm rather than git to install the wiki-server and wiki-client packages, as well as new repository and issues url
  • publish to npm - this is the last step, as it will be published as wiki.

Problem building sitemap

After upgrading a Fedora system, a simple error had me digging into SELinux and systemd again.

Since the custom SELinux policies which had been in place for the local, systemd-governed nvm-wiki combination got reverted, the node process didn't have sufficient permissions anymore to fully interact with the flat file storage.

It became obvious when trying to start and access a localhost wiki and SELinux popping up with complaints. After half of them were solved, a newly created page was in browser cache rather than on the server.

An answer to circumvent the general SELinux tragedy with Node.js in a systemd environment can be found in https://github.com/almereyda/node-systemd-selinux

Additionally, due to setting up .nvm and .wiki in the same home folder of the wiki user, but running via the init process, further policies apply. This scenario appears to be rare, which is probably why it runs into such strict constraints.

module systemd-nvm-wiki 1.0;

require {
        type init_t;
        type user_home_t;
        class process execmem;
        class file { create execute execute_no_trans ioctl open read rename unlink write };
        class lnk_file { getattr read };
}

#============= init_t ==============

allow init_t self:process execmem;
allow init_t user_home_t:file { create execute execute_no_trans ioctl open read rename unlink write };
allow init_t user_home_t:lnk_file { getattr read };

Yet an inconsistent state appeared after resolving the different permission errors one after another.
The only usable hint was a console output that stated

Problem building sitemap: a-tempospatial--or-spatiotemporal--investigation e:  [SyntaxError: Unexpected end of input]

As one of the initial policy errors had also been related to the sitemap and a missing rename capability, only following the error message to https://github.com/fedwiki/wiki-server/blob/9ad81cb9f23c680c3a91e4676f704e3f6cb4f7d5/lib/page.coffee#L173 helped in understanding how this could be related to the suspicious Internal Server Error the wiki client displayed when trying to fetch the inconsistent state of the problematic page.

A more verbose error could have helped triaging the faulty storage file in question much faster.

Javascript dependency and Anonymous Network (I2P)

I am basically able to navigate pages of a fedwiki with JavaScript (js) disabled. Some fields, however, get displayed only as a grey box, though naming the type of content. I found out that "html" most often marks headings. Why is js needed to display a heading? Others I have stumbled upon so far are "code" and "method". Allowing arbitrary sites to execute js can at least be seen as a security risk (I do think it should be). I hoped the federation would allow a system where browsing is possible without js, while editing can be done on one's local server with js enabled.

How realistic is that idea? I ask this because I had the idea of making fedwikis run within I2P, the Invisible Internet Project. This would probably require some changes for the federation (due to different addresses) and I'm trying to determine if I should attempt that. Let me briefly explain why that idea appeals to me. I2P makes certification authorities obsolete and combines key-exchange with address-acquisition. That means, once you have the address, you also have the key and thus are done with verification. It also solves the problem of dynamically changing IP addresses and removes the requirement to register a domain. As such, it sounds like fedwiki and I2P would be a rather perfect fit. It provides (adjustable) anonymity in addition, which people require in varying degrees. Being able to provide it should definitely be seen as a plus nowadays. All things that are at least nice to have for a federated application. Actually, the discussion today is more about whether it isn't a "must have". But that aside ...

However, having to enable JS just to browse a site seems unacceptable to me. I assume you take a more relaxed view of the security implications, but for me it's a no-go on all but a few selected sites, so allowing JS for every site I come across while jumping from wiki to wiki is really a non-starter. I understand if you don't share that concern; I'm only asking for your estimate of how easy or hard it would be to achieve JavaScript-free browsing, and cloning to one's own site, in case it doesn't make much sense with the current implementation at all. Don't get me wrong, I don't expect you to build what I would like to have; I'm just trying to find out whether it is sensible to approach this with FedWiki's current state at all. Concerning the networking side, I'm fairly sure it's doable, although I haven't looked into the gory details yet.

Thanks for your time!

Can I run my wiki at http://my.server.com/wiki?

I am trying to set up a wiki server on a machine that already has a Web site running at http://my.server.com/. The easiest workaround is to use a port other than 80 or 443, but then I can't use my own wiki because I am behind a firewall that blocks almost all other ports.

A solution that I have been exploring is to use http://my.server.com/wiki as the root URL, using the ProxyPass feature of Apache (which is running the main Web server). This fails because wiki uses absolute URLs for everything. Can this be configured somehow?
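For reference, a minimal sketch of the reverse-proxy setup being attempted, assuming Apache with mod_proxy and mod_proxy_http enabled (the hostname and port 3000 are placeholders). Note that this alone does not fix the absolute-URL problem described above:

```apache
# Illustrative: forward /wiki on the main site to a local wiki server.
# Requires mod_proxy and mod_proxy_http.
<VirtualHost *:80>
    ServerName my.server.com
    ProxyPass        /wiki http://localhost:3000
    ProxyPassReverse /wiki http://localhost:3000
</VirtualHost>
```

As long as wiki emits absolute URLs, assets and links will still resolve against the site root rather than /wiki, which is exactly the failure reported here.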

Contributing docs

It looks as if contributing.md, which contains developer guidance, appears as the "guidelines for contributing" when raising an issue.

I suggest renaming contributing.md to developer-notes.md, and probably writing something to help those raising new issues.

Are there post Persona plans ?

I am interested in deploying a large wiki farm.
Are there currently alternatives to Persona?
I'm hesitant to begin a wiki farm without an understanding of what the direction will be post-Persona.

Thanks

How to convert a federated wiki into a farm (or subdomains)?

Last year, I started a Wiki Openshift Quickstart based on the instructions from @paul90 to create a wiki. This led to instructions at http://fed.coevolving.com/view/wiki-openshift-quickstart .

In using the wiki, and starting to federate content from other sites, I noticed that searches were starting to return a ridiculous number of irrelevant hits, as the search wouldn't be only on my domain, but other domains as well.

In the fall, I learned from @WardCunningham that the way to keep neighbourhoods small is to run multiple wikis. This makes sense, as when I'm working within a limited interest, I don't really want to be searching the world.

The way that Ward does this seems to be by managing multiple subdomains, i.e. there's fed.wiki.org , and then there's ward.fed.wiki.org.

Is it an easy modification to enable the multiple subdomains? Is there a differentiation here between the subdomain, and a wiki farm? I presume a wiki farm would allow more users to sign up, which isn't my direction. I would encourage people to follow a set of directions to set up their own federated wikis (or would help them to do so) so that separate identities are easily apparent.

Graphs for knowledge representation

Let's start a discussion thread on potential approaches to graphs (which some might call maps) for the federated wiki. @bobbyno and I have spent 90 minutes discussing this so far. I will be travelling for some weeks, so writing may help clarify ideas that we can then discuss together on a future video call.

A story: For a new pattern language (for service systems), federated wiki should work well with one pattern per page. However, if the number of patterns gets large (say, 100), selecting a subset (say, 12) and appreciating the relations between them could be difficult. A map (or graph) could aid visualization of various groupings of patterns.

Christopher Alexander himself had an artistically drawn graph in his 1968 book on Multi-Service Centers.

Some features: To improve the quality of systems thinking (i.e. perspectives on parts and wholes), it would be helpful to differentiate between structures (e.g. part-part arrangements in space) and processes (e.g. part-part arrangements in time). This has been a direction encouraged in the systems engineering community with INCOSE, in Object Process Methodology. The essential constructs are (i) objects (drawn as rectangles), (ii) processes (drawn as ovals) and (iii) states (drawn as rounded rectangles). Edges are directed. If we could get non-technical professionals (e.g. sociologists, not engineers) to draw rectangles and ovals, even if not all of the edges were correctly represented, we would be ahead. The assessment to date is that non-technical systems thinkers can't handle the 13 constructs of SysML.

Two directions
(1) We could start with Trivial Graph Format (TGF), and add some additional markup to differentiate between rectangles, ovals and rounded rectangles. TGF is not a formally accepted standard, but it could be a simpler way to represent the graph.

(2) We could aim for the DOT graph description language and choose a subset of basics to implement. The simple example of an ethane molecule doesn't look too hard. This approach may be more complicated to manage if the diagram gets large.
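Direction (1) is cheap to prototype. Below is an illustrative parser for Trivial Graph Format with a hypothetical shape marker (#rect, #oval, #rrect) appended to node labels; the marker syntax is my own invention for this sketch, not part of TGF:

```javascript
// Sketch: parse TGF ("id label" node lines, a "#" separator, then
// "from to label" edge lines) plus an invented trailing shape marker.
function parseTgf(text) {
  const [nodeLines, edgeLines] = text.trim().split(/^#$/m);
  const nodes = nodeLines.trim().split('\n').map(line => {
    const [id, ...rest] = line.trim().split(' ');
    const label = rest.join(' ');
    const match = label.match(/#(rect|oval|rrect)$/);
    return {
      id,
      label: match ? label.slice(0, match.index).trim() : label,
      shape: match ? match[1] : 'rect'  // default to an object (rectangle)
    };
  });
  const edges = edgeLines.trim().split('\n').map(line => {
    const [from, to, ...rest] = line.trim().split(' ');
    return { from, to, label: rest.join(' ') };
  });
  return { nodes, edges };
}

// A slice of the dishwasher example: a door object and a closing process.
const graph = parseTgf(`
1 door #rect
2 closing the door #oval
#
2 1 acts on
`);
```

A renderer could then map shape to rectangle, oval, or rounded rectangle when drawing, keeping the text representation within reach of non-technical authors.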

Technologies to learn from
(a) Tiddlymap "is a TiddlyWiki plugin that allows you to link your wiki-topics (tiddlers) in order to create clickable graphs". In TiddlyWiki 5 (i.e. the node.js version), the plugin uses Vis.js.
(b) The Graphviz plug-in for DokuWiki "can create directed and non-directed graph images from a textual description language called 'dot' using the Graphviz program".
(c) The Graphviz extension for MediaWiki "lets you create and display graphs as in-line images on wiki pages using tools from the open-source Graphviz and Mscgen projects".
(d) yFiles for HTML "features advanced UI controls for viewing and editing diagrams and unequaled layout algorithms for automatically arranging complex diagrams at the click of a button".
(e) draw.io is a "free online diagram software for making flow charts, process diagrams, org charts, UML, ER and network diagrams", built on top of mxGraph.

@bobbyno and I talked through part of a typical dishwasher example used in systems engineering training. Objects include (i) a door and (ii) a switch. States include (i) switch open and (ii) switch closed. Processes include (i) closing the door, (ii) washing dishes, and (iii) opening the door.

The simplest implementation could be just for TGF. However, satisfying the story described above leads to a slightly larger minimal viable product.

How should we think about approaching the representation of graphs in a wiki?

Installation problems on Debian Jessie

I tried to install on Debian Jessie with

root@host:~# apt-get install npm
[...]
root@host:~# npm install -g wiki
[...]
root@host:~# wiki
/usr/bin/env: node: No such file or directory

During installation several warnings were issued. Mostly because of optional dependencies, but also

/bin/sh: 1: node: not found
gyp: Call to 'node -e "require('nan')"' returned exit status 127. while trying to load binding.gyp
gyp ERR! configure error
gyp ERR! stack Error: `gyp` failed with exit code: 1
gyp ERR! stack     at ChildProcess.onCpExit (/usr/share/node-gyp/lib/configure.js:344:16)
gyp ERR! stack     at ChildProcess.emit (events.js:98:17)
gyp ERR! stack     at Process.ChildProcess._handle.onexit (child_process.js:809:12)
gyp ERR! System Linux 3.16.0-4-amd64
gyp ERR! command "nodejs" "/usr/bin/node-gyp" "rebuild"
gyp ERR! cwd /usr/local/lib/node_modules/wiki/node_modules/wiki-plugin-linkmap/node_modules/ws/node_modules/utf-8-validate
gyp ERR! node -v v0.10.29
gyp ERR! node-gyp -v v0.12.2
gyp ERR! not ok
npm WARN This failure might be due to the use of legacy binary "node"
npm WARN For further explanations, please read
/usr/share/doc/nodejs/README.Debian

Looking into the referenced file, I found: "Scripts calling Node.js as a shell command must be changed to instead use the 'nodejs' command." Now I'm trying to find out what I need to change to get this to run, but my searching has been unsuccessful so far. I searched the net and the issues, but that didn't help. Help appreciated.
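Two workarounds are commonly suggested for this Debian naming clash, where the interpreter is installed as nodejs but scripts expect node. The nodejs-legacy package name comes from Debian itself; the user-level symlink below is just one way to do it without root:

```shell
# Debian Jessie ships the Node.js binary as "nodejs"; wiki's scripts
# invoke "node" via /usr/bin/env.
#
# 1) As root, install the compatibility package that provides /usr/bin/node:
#      apt-get install nodejs-legacy
#
# 2) Or add a "node" symlink somewhere on your PATH, no root needed:
mkdir -p "$HOME/bin"
NODE_BIN="$(command -v nodejs || command -v node || true)"
[ -n "$NODE_BIN" ] && ln -sf "$NODE_BIN" "$HOME/bin/node"
export PATH="$HOME/bin:$PATH"
```

After either fix, the earlier node-gyp failures should also go away on reinstall, since they stem from the same missing node command.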

Handle cookies when not allowed

I noticed that a FedWiki does not display its content properly, even with JavaScript enabled, if cookies are forbidden. It took me a moment to realise what the problem was. I don't know if there is a technical reason, as the client should still be able to retrieve the content; so maybe it's even a bug. If not, maybe consider displaying a header with a warning when a cookie is needed but not present?
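The warning suggested above could start from a simple probe. This is an illustrative sketch, not wiki-client code; doc is injected so the check is testable outside a browser, and in the client you would pass the real document:

```javascript
// Sketch: detect blocked cookies up front so the client can warn the
// reader instead of failing silently.
function cookiesEnabled(doc) {
  try {
    doc.cookie = 'fedwiki_cookie_test=1';
    const enabled = doc.cookie.indexOf('fedwiki_cookie_test=1') !== -1;
    // Clean up the probe value.
    doc.cookie = 'fedwiki_cookie_test=1; expires=Thu, 01 Jan 1970 00:00:00 GMT';
    return enabled;
  } catch (e) {
    return false; // some browsers throw when cookies are disabled
  }
}
```

On startup the client could call cookiesEnabled(document) and, if it returns false, render a banner explaining that some features need cookies.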

wiki-node-server + wiki-client : allow anonymous, collaborative wikis by config + local wiki configs in a farm + multiple editors per wiki

Currently I'm hacking wiki-client to not display the login button at all, to provide loginless Wikis from time to time.


1. single-site servers

This would mean passing the server an --anonymous/--no-authentication/-noauth flag, which tells wiki-client not to present the login button.

2. farms

In the farm case, sometimes I don't want to disable the login for all sites; otherwise, see 1.

It could be nice to add additional metadata to the initial welcome-visitors json, in a place that cannot be edited if anonymous, that provides additional information about a wiki, like the idea above.

I might be playing with just pasting arbitrary JSON into one of the existing factories, to see how it behaves.

If that works, a key-value pair like noauth:true would already be enough for the server and client not to show login at all.
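To make the proposal concrete, the welcome-visitors page JSON might carry the flag like this. The noauth key is purely hypothetical (this issue's own suggestion), and the story field is truncated for brevity:

```json
{
  "title": "Welcome Visitors",
  "story": [],
  "noauth": true
}
```

The server would honour the flag when deciding whether to accept writes, and the client would use it to suppress the login button.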


multiple authentication for collaborative wikis

As of now, every wiki supports only one user. In certain environments, people (e.g. @hhalpin) want a mix of Etherpad and Wikipedia. Hackpad does something very similar already today, but is not open source (and belongs to Dropbox > Condoleezza Rice ...).

WebID (@bblfish) could help to provide this authentication layer for an also semantic wiki implementation. Also see WardCunningham/Smallest-Federated-Wiki#411 for discussions about Semantics and Federated Wiki.

Federated wiki on a laptop

The "Hosting and Installation Guide" says " ... you will need to have a dedicated computer that is always connected to the internet" at https://github.com/WardCunningham/Smallest-Federated-Wiki/wiki/Hosting-and-Installation-Guide .

What is the impact if the computer is not always connected, (i) to the individual working on a disconnected laptop, and (ii) to the federation that cannot rely on a persistent connection?

If a laptop running a federated wiki is disconnected from the Internet for 48 hours (e.g. travel time for an airplane ride plus ground transportation on both ends), and edits are made on the disconnected laptop, will the federation "catch up" on changes? Across the federation, if other forks start showing up, are there indicators that will (eventually) surface this?

Contributing Wiki Plugin Transport

Please consider including the Transport plugin in the standard wiki distribution.

I have been holding onto this hoping to work out better plugin catalog mechanisms but don't want to hold it up any longer. Instead I would like to establish criteria by which we would accept new plugins.

https://github.com/WardCunningham/wiki-plugin-transport
https://www.npmjs.com/package/wiki-plugin-transport

There is one outstanding issue regarding possible error handling and possible CORS issues too. It is my judgement that we will see more Transporters if the capability to invoke them is widely available. Experience will dictate what development/debugging support will be appropriate.

I am happy to invoke whatever github and npm ownership transfers are appropriate. Help me develop a checklist for such transfers. Thank you.

how wiki forever

After running $ wiki the server is on, but not forever. Please tell me how to keep it running forever?
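Two common approaches: a Node process manager such as the forever npm package (npm install -g forever, then forever start $(which wiki)), or a systemd unit on systemd-based distributions. A sketch of the latter follows; the unit name, the dedicated wiki user, and the assumption that wiki is installed globally on PATH are all placeholders to adapt:

```ini
# /etc/systemd/system/wiki.service -- illustrative unit, not shipped by wiki
[Unit]
Description=Federated Wiki server
After=network.target

[Service]
ExecStart=/usr/bin/env wiki
Restart=always
User=wiki

[Install]
WantedBy=multi-user.target
```

Enable it with systemctl enable --now wiki; Restart=always brings the server back after crashes and reboots.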

warnings when `npm install` of this package with io.js

Here's what happened when I did an npm install that included wiki: 0.6.x in the deps:

npm WARN engine [email protected]: wanted: {"node":"0.10"} (current: {"node":"1.6.2","npm":"2.7.1"})

repeated 42 more times for:
wiki-plugin-data wiki-plugin-markdown wiki-plugin-factory wiki-plugin-favicon
wiki-server wiki-plugin-activity wiki-plugin-future wiki-plugin-changes
wiki-plugin-html wiki-plugin-line wiki-plugin-logwatch wiki-plugin-audio
wiki-plugin-chart wiki-plugin-calculator wiki-plugin-calendar
wiki-plugin-federatedwiki wiki-plugin-mathjax wiki-plugin-parse
wiki-plugin-efficiency wiki-plugin-bytebeat wiki-plugin-report
wiki-plugin-flagmatic wiki-plugin-reduce wiki-plugin-metabolism
wiki-plugin-txtzyme wiki-plugin-reference wiki-plugin-force wiki-plugin-bars
wiki-plugin-pushpin wiki-plugin-code wiki-plugin-scatter wiki-plugin-radar
wiki-plugin-paragraph wiki-plugin-image wiki-plugin-twadio wiki-plugin-map
wiki-plugin-grep wiki-plugin-linkmap wiki-plugin-method wiki-plugin-roster
wiki-plugin-pagefold wiki-plugin-rollup wiki-plugin-video

no icons (setup issue?)

I followed the instructions to install, but I don't get the icons, so I cannot use it. I get content for the first page, "Welcome Visitors", but only a blank page with the title when I follow the links.

If I start with -d node_modules/wiki/node_modules/wiki-server/default-data, I get the top-left corner icon, but still no "+" icon etc. at the bottom.

Did the npm install miss a package?
Am I supposed to add some parameter?

Thanks.

PS: Let me know if there's a mailing list better suited for this.

Contributing Wiki Plugin Search

Please consider including the Search plugin in the standard wiki distribution.

I have been holding onto this hoping to work out better plugin catalog mechanisms but don't want to hold it up any longer. Instead I would like to establish criteria by which we would accept new plugins.

See also #68

https://github.com/WardCunningham/wiki-plugin-search
https://www.npmjs.com/package/wiki-plugin-search

There are no outstanding issues for this plugin. The plugin is dependent on a scrape/search service prototype that I host at search.fed.wiki.org. I know how this could be converted to elasticsearch but have not done so because of low search traffic and the experimental opportunities for new kinds of search.

I have been holding onto fedwiki/wiki-client#132 which adds a novel associative search to every page. I will advance this request once the search plugin that it uses has been published.

I am happy to invoke whatever github and npm ownership transfers are appropriate. Help me develop a checklist for such transfers. Thank you.

Punycode subdomains

If we find internationalized subdomains, like the one for the test wiki of our friend François (who is supposed to be the source of the term degrowth), they are parsed to Punycode and only then passed to wiki, leading to an ugly Punycode display within the address bar of our companion's browser.

Should we use some sort of JavaScript URL rewriting to let the user at least believe they can use non-ASCII characters in the desired address of a newly created wiki?

Page out of sync across federation?

As is my usual practice, I went to http://fed.coevolving.com/view/digests-from-sfw-meetings to add a page. The last time that I edited it was 2014-10-01. I didn't see a "new" notification on that page, so I added a new link for the digest on 2014-11-12.

I now see at http://fed.wiki.org/view/digests-from-sfw-meetings that there was an edit for 2014-10-29 over there. Should that have notified me with a "new" update on the page on coevolving.com?

Could this have been a problem introduced because the code base was upgraded last week (and now I'm downlevel)?

Installs, but won't start

Over in WardCunningham/Smallest-Federated-Wiki#403 Ward reports:

I've tried an install on my mac. It installed smoothly but won't start. Here is what I've tried.

$ wiki-exp
env: node\r: No such file or directory
$ wiki-exp --data ~/.wiki
env: node\r: No such file or directory
$ mkdir foo
$ wiki-exp --data foo
env: node\r: No such file or directory

Database Page Stores do not work with farm mode

None of the database-based page storage options work correctly when the server is run in farm mode. This is because each of the servers in the farm gets started with the same database configuration, so they all access the same data.

An issue has been raised for this in each of the storage option repos.

Plugins development

I'm not sure if this is now the right repository to ask in, but is

https://gist.github.com/WardCunningham/fced775fcedcee133b32

up-to-date / still relevant?

I'm finding that when I create a new plugin with that script, symlink it from my wiki/node_modules directory, and restart wiki, I don't see a page at /about-myplugin-plugin.html as the instructions suggest. Nor does it seem to recognise plugins with that name.

(I'm just updating my version of SFW after about 18 months, so I need to update my plugin to the new standard.)

Provide list of / links to special client pages

I do not understand the underlying concepts of the JS client architecture, but I realised that one thing that confused me was the special pages (what's your taxonomy here?) like /recent-changes.html. To my mind these present client functionality, which I would therefore expect to be available in the client. At the moment I had to stumble upon a link to such a page on some server to even learn about it. This could be easily addressed, I guess, for example by placing icons/buttons at the right of the bottom line; at least for me, the browser page usually has a width that leaves a third to a quarter of that bottom line empty on the right. On the other hand, a more distinct separation from the login, which seems rather server-side to me, might be desirable.

If you don't want that for some reason, an alternative might be to provide a list of all special pages in a prominent place, so that every user is bound to notice them?

Edit Actions don't complete

I'm seeing edit actions silently fail. This seems to be associated with large payloads like Factory plugin posting a dropped image. I believe the same root cause is in play for this report, an explicit fork of a remote page to the origin server that also silently fails. I've not seen the problem for images or forks using the ruby/sinatra server.

From the client side using Chrome's inspector the PUT hangs pending for four minutes and then eventually fails. The wiki-client code catches the failure and stores the remote page in browser local storage under the origin name. That's something, but not the desired result.


From the server side the express log doesn't show a PUT for some time. Presumably the PUT is in progress until jQuery ajax times out and closes the connection. jQuery tries several times over the four minute period. None of the express logs show any status code being returned.


The payload for this PUT (if I read Chrome's inspector correctly) is one url encoded line of 24399 characters.

I'm running wiki with one argument, -f. wiki -v reports:

wiki: 0.3.5
wiki-server: 0.2.1
wiki-client: 0.2.12
wiki-plugin-activity: 0.1.1
wiki-plugin-bars: 0.1.3
wiki-plugin-bytebeat: 0.1.1
wiki-plugin-calculator: 0.1.3
wiki-plugin-calendar: 0.1.2
wiki-plugin-changes: 0.1.2
wiki-plugin-chart: 0.2.1
wiki-plugin-code: 0.1.1
wiki-plugin-data: 0.1.1
wiki-plugin-efficiency: 0.1.2
wiki-plugin-factory: 0.1.1
wiki-plugin-favicon: 0.1.1
wiki-plugin-federatedwiki: 0.1.1
wiki-plugin-force: 0.1.1
wiki-plugin-future: 0.1.1
wiki-plugin-html: 0.0.2
wiki-plugin-image: 0.1.1
wiki-plugin-line: 0.1.2
wiki-plugin-linkmap: 0.1.2
wiki-plugin-logwatch: 0.1.1
wiki-plugin-map: 0.1.3
wiki-plugin-mathjax: 0.1.1
wiki-plugin-metabolism: 0.1.1
wiki-plugin-method: 0.1.5
wiki-plugin-pagefold: 0.1.1
wiki-plugin-paragraph: 0.1.1
wiki-plugin-parse: 0.1.1
wiki-plugin-pushpin: 0.1.1
wiki-plugin-radar: 0.1.1
wiki-plugin-reduce: 0.1.1
wiki-plugin-reference: 0.1.1
wiki-plugin-report: 0.1.1
wiki-plugin-rollup: 0.1.1
wiki-plugin-scatter: 0.1.2
wiki-plugin-twadio: 0.1.2
wiki-plugin-txtzyme: 0.1.4
wiki-plugin-video: 0.1.1

The server is running on CentOS release 6.5 (Final). I've seen similar behavior running on Mac OS.

This would seem to be some protocol confusion between jQuery and Node.js. That seems unlikely, so I am unsure how to continue debugging. Suggestions welcome. Pull requests would be delightful.

wiki-node or wiki-node-server?

Hi,

I'm wondering which one of these to use, and what the difference between them is. It's a bit confusing that they have similar names and equal activity :)

Any help is greatly appreciated!

URLs sometimes don't work directly

Copying URLs from the URL bar in the browser and entering the website directly doesn't always work.

In some cases it redirects to 'Welcome Visitors'. Clicking the back button in the browser then leads to the desired page. Here are two examples:
http://idiom.sfw.c2.com/view/welcome-visitors/view/idiom-or-pattern
http://pattern.sfw.c2.com/view/welcome-visitors/view/search-results/view/pattern-backlash

I am using Mozilla Firefox 45.0 on Ubuntu LTS 14.04.

Recently Cloned Page Misbehaves

I can confirm a persistent problem on my end (using the latest Firefox on OS X). Forked pages are not recognised when dragged and dropped onto factories to create reference links. Not all the time, but frequently (I can't pin down why).

I do know the fix, however. If the forked page refuses to allow itself to be dropped onto a factory, you can encourage it by refreshing the web page (Cmd-R); then it will work.

This odd behaviour with recently cloned pages also appears when you drag and drop one onto a new tab. Instead of opening a new panel, it replaces the entire URL. Again, refreshing the web page that contains the recently cloned page restores the normal behaviour.
