DataCouch is currently being rewritten! Stay tuned
[ON HIATUS] distributed, collaborative dataset sharing
If I try to delete a dataset from the Edit -> Delete Dataset menu entry, I get the following error: "Server error: Fatal XHR Error" and nothing happens.
Use GitHub login.
I started with a fresh clone of your latest node_server branch.
Steps to replicate:
Would it be best to add JSON parsing to the node_server branch to prevent this from happening to others in the future?
Log is listed below.
Coles-MacBook-Air:datacouch cole$ node run
The "sys" module is now called "util". It should have a similar interface.
node.js:201
throw e; // process.nextTick error, or 'error' event on first tick
^
Error: Cannot find module 'csv'
at Function._resolveFilename (module.js:334:11)
at Function._load (module.js:279:25)
at Module.require (module.js:357:17)
at require (module.js:368:17)
at Object.<anonymous> (/Users/cole/Public/github/datacouch/service/csv_uploader.js:2:11)
at Module._compile (module.js:432:26)
at Object..js (module.js:450:10)
at Module.load (module.js:351:31)
at Function._load (module.js:310:12)
at Module.require (module.js:357:17)
Coles-MacBook-Air:datacouch cole$ npm install
npm WARN [email protected] dependencies field should be hash of <name>:<version-range> pairs
npm http GET https://registry.npmjs.org/csv/0.0.10
npm http GET https://registry.npmjs.org/couchapp/0.8.1
npm http 304 https://registry.npmjs.org/couchapp/0.8.1
npm http 304 https://registry.npmjs.org/csv/0.0.10
couchapp@0.8.1 ./node_modules/couchapp
csv@0.0.10 ./node_modules/csv
Coles-MacBook-Air:datacouch cole$ node run
The "sys" module is now called "util". It should have a similar interface.
node.js:201
throw e; // process.nextTick error, or 'error' event on first tick
^
Error: Cannot find module 'jsonparse'
at Function._resolveFilename (module.js:334:11)
at Function._load (module.js:279:25)
at Module.require (module.js:357:17)
at require (module.js:368:17)
at Object.<anonymous> (/Users/cole/Public/github/datacouch/node_modules/JSONStream/index.js:2:14)
at Module._compile (module.js:432:26)
at Object..js (module.js:450:10)
at Module.load (module.js:351:31)
at Function._load (module.js:310:12)
at Module.require (module.js:357:17)
Coles-MacBook-Air:datacouch cole$ npm install
npm WARN [email protected] dependencies field should be hash of <name>:<version-range> pairs
Coles-MacBook-Air:datacouch cole$ node run
The "sys" module is now called "util". It should have a similar interface.
node.js:201
throw e; // process.nextTick error, or 'error' event on first tick
^
Error: Cannot find module 'jsonparse'
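The npm warning above suggests the repo's package.json declares its dependencies in a non-hash form (e.g. an array), which is why `npm install` never pulls in jsonparse. A sketch of the hash form npm expects, using the versions visible in the log above (jsonparse's exact version isn't shown there, so "*" is a placeholder):

```json
{
  "dependencies": {
    "csv": "0.0.10",
    "couchapp": "0.8.1",
    "jsonparse": "*"
  }
}
```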
I have encountered a few problems in attempting to contribute datasets (all the data in question is available here: http://bvmou.dyndns.org/_utils/).
In general, when loading from, for example:
http://bvmou.dyndns.org/sf_bike_ped_acc/_design/recline/_rewrite/api/json
I am prompted to select the "docs" array, told that it is fetching, then that it is saving, and then the docs are not reflected in the web interface (even after several hours). It is possible that I am hitting some sort of transfer or CPU limit imposed by IrisCouch.
In two cases I successfully loaded (probably excessively) large CSVs, intending to create geometries that would be recognizable to a mapping library by editing their field names in the Recline interface. I wasn't able to, and instead tried to overwrite them with valid GeoJSON from a local CouchDB, which led to the errors above.
A semi-related workaround might be a dataset type with sample fields and compressed attachments. For example, the 18 MB attachment "nyacc_newline_separated_geo.tar.gz" at http://bv.iriscouch.com/_utils/document.html?new_york_accidents/_design/ny_accidents is something like 255 MB uncompressed, ~300 MB in CouchDB, plus several gigabytes of spatial index. If you are interested in these types of datasets (not a given), that might be a convenience.
Imported a CSV with 98 rows, then clicked Applications and chose Polymapper. The spinner is still spinning. This happens in Surf (a bare WebKit browser) and in Chrome 15.0.874.51 beta.
I think there is a spot in the dataset-update-from-paste API where JSON is being parsed more than once, leading to the error below:
[2012-01-29 23:01:06.771] [INFO] console - [request] POST http://bvm:******@localhost:5983/dcd667629546bef4b657d5275993013357/_ensure_full_commit
undefined:1
s���� �M
^
SyntaxError: Unexpected token '
at Object.parse (native)
at IncomingMessage.<anonymous> (/home/bvm/prg/lang/js/datacouch/node_modules/tako/index.js:423:33)
at IncomingMessage.<anonymous> (events.js:88:20)
at IncomingMessage.<anonymous> (/home/bvm/prg/lang/js/datacouch/node_modules/tako/index.js:303:13)
at IncomingMessage.<anonymous> (events.js:88:20)
at HTTPParser.onMessageComplete (http.js:137:23)
at Socket.ondata (http.js:1387:22)
at TCP.onread (net.js:354:27)
bvm@bvm:~/prg/lang/js/datacouch$ node run.js
Patching tako's index.js after line 420 as below will quieten the error, but this feels inconsistent with the rest of the module, like casting to a void pointer twice to shut up a compiler. I add it here in case it makes the unexpected behavior more obvious, or suggests what should be provisioned beforehand in these cases (i.e., starting a dataset with a paste):
if (req.method === 'PUT' || req.method === 'POST') {
  if (req.headers['content-type'] === 'application/json') {
    req.on('body', function (body) {
      try {
        req.emit('json', JSON.parse(body))
      } catch (e) {
        // body was not parseable JSON (possibly already parsed upstream);
        // emit it as-is rather than crashing the server
        req.emit('json', body)
      }
    })
  }
}
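The double-parse theory above is easy to reproduce in isolation: a JSON body parsed once yields an object, and feeding that object to JSON.parse again coerces it to the string "[object Object]", which throws the kind of SyntaxError shown in the log. A minimal sketch:

```javascript
// Demonstrates the suspected failure mode: JSON.parse applied twice.
var body = JSON.stringify({ rows: [1, 2, 3] })
var once = JSON.parse(body) // first parse: fine, yields an object

var doubleParseFails = false
try {
  JSON.parse(once) // object is coerced to "[object Object]" and throws
} catch (e) {
  doubleParseFails = e instanceof SyntaxError
}
console.log(doubleParseFails) // true
```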
Will this work as a standalone CouchApp, or does it require a middle layer (Node.js) to serve some requests for Twitter authentication and other things?
I have done a bit of interacting with the live system and am finding a couple of things:
In the app install, I am getting a "not found" dialog (also at the burritomap vhost).
From Paste JSON: the "uploading documents" spinner GIF does not close when the upload completes.
The test dataset is a 650 KB set of 177 world countries, with their ISO-type codes and simplified geographies; the JSON array is available here: http://h.sfgeo.org/a300mphach/jsonpnaturalearth/countries.json
In general I am getting some slow, hanging interactions with URLs like:
http://datacouch.com/socket.io/1/jsonp-polling/1-numbesnumbers-5440?t=13-numbers-861&i=0
and the same as above with .../xhr-polling/...
(I am probably doing all this while you are pushing changes ;)
We can maybe break the pieces of this into separate issues and close them as they are addressed; I am not sure which of several possible interactions is triggering the unexpected behavior.
If I try to delete all documents in a dataset (e.g. if the CSV import format was wrong), only some documents are deleted (approximately 1,500 per run). The dataset I tried to empty had ~9,000 documents to start with.
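A likely fix, sketched here as an assumption rather than the actual DataCouch code: since CouchDB's /_bulk_docs is typically fed a bounded number of docs per request, an "empty dataset" action has to loop over fixed-size batches until no rows remain, instead of issuing a single capped delete. The helper name below is hypothetical.

```javascript
// Split _all_docs rows into _bulk_docs deletion payloads of batchSize each.
// Rows are assumed to have the _all_docs shape: { id: ..., value: { rev: ... } }.
function toDeletionBatches(rows, batchSize) {
  var batches = []
  for (var i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize).map(function (row) {
      // CouchDB deletes a doc when it sees _deleted: true with a valid _rev
      return { _id: row.id, _rev: row.value.rev, _deleted: true }
    }))
  }
  return batches
}

// e.g. 9,000 rows in batches of 1,500 -> 6 POSTs to /_bulk_docs
```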
Implement common data enhancements as transformations, so that they can be easily applied to the original data set when it is updated.
For example: City A uploads crime data, where one field is IN ALL CAPS. User B painstakingly rewrites this field as lowercase, except for the first letter. City A appends newer data. Now part of the data is Pretty and part of it is UGLY. Unless User B goes back and updates the data again, it will be in mixed formats, and that would be an abomination.
If there were a way to specify the desired transformation, it could be automatically applied every time the data was uploaded (assuming the uploader approved the transformation).
Specifying a transform:
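A minimal sketch of what a stored transform might look like; the helper names (titleCase, applyTransforms) are assumptions, not part of DataCouch. The idea is that a transform saved with the dataset is re-applied to each document on every new upload.

```javascript
// Title-case a string: "GRAND THEFT AUTO" -> "Grand Theft Auto"
function titleCase(value) {
  return value.toLowerCase().replace(/\b\w/g, function (c) {
    return c.toUpperCase()
  })
}

// Apply a map of { fieldName: transformFn } to one document,
// leaving non-string and unlisted fields untouched.
function applyTransforms(doc, transforms) {
  var out = {}
  Object.keys(doc).forEach(function (k) { out[k] = doc[k] })
  Object.keys(transforms).forEach(function (field) {
    if (typeof out[field] === 'string') {
      out[field] = transforms[field](out[field])
    }
  })
  return out
}

applyTransforms({ crime: 'GRAND THEFT AUTO' }, { crime: titleCase })
// -> { crime: 'Grand Theft Auto' }
```

On re-upload, the same transforms object would be run over the appended documents, keeping City A's new rows consistent with User B's cleanup.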
(Forgive me if this feature exists or is in the works - I saw the video on CfA and this idea popped into my head)
Keep up the good work!
I couldn't find a way from a user profile page back to the main page, except the back button. The data view has its header linked up, but the user pages don't.
Hiya, just wondering what's in store for the rewrite, whether it's alive, etc.? Thanks!
I was trying to edit the column titles in my SFUSD data, and the full column of data seemed to disappear instead.
When opening a dataset that I own from the home page or my profile, the "EDIT" link for the data details doesn't show up. If I switch to Applications and then back to Data, the sidebar reloads and EDIT appears.
Hello,
I would like to use DataCouch programmatically, and therefore load data using the CouchDB API. However, whenever I click on the token link, the page keeps loading for minutes without returning anything. If I open the 'token' hyperlink in a new tab, I get an 'Unauthorized' answer.
Is there any solution? Did I do something wrong?
I am using Firefox (Iceweasel) on Linux.