CouchImport

Introduction

When populating CouchDB databases, the source data is often a CSV or TSV file. couchimport is designed to help you import flat data into CouchDB efficiently. It can be used as the command-line utilities couchimport and couchexport, or its underlying functions can be used programmatically:

  • simply pipe the data file to couchimport on the command line.
  • handles tab or comma-separated data.
  • uses Node.js's streams for memory efficiency.
  • plug in a custom function to add your own changes before the data is written.
  • writes the data in bulk for speed.
  • can also read huge JSON files using a streaming JSON parser.
  • allows multiple HTTP writes to happen at once using the --parallelism option.

[schematic diagram]

Installation

Requirements

  • Node.js and npm

  sudo npm install -g couchimport

Configuration

couchimport's configuration parameters can be stored in environment variables or supplied as command line arguments.

The location of CouchDB

Simply set the COUCH_URL environment variable e.g. for a hosted Cloudant database

  export COUCH_URL="https://myusername:[email protected]"

or a local CouchDB installation:

  export COUCH_URL="http://localhost:5984"

IAM Authentication

Alternatively, if you are using IAM authentication with IBM Cloudant, then supply two environment variables:

  • COUCH_URL - the URL of your Cloudant host e.g. https://myhost.cloudant.com (note absence of username and password in URL).
  • IAM_API_KEY - the IAM API KEY e.g. ABC123515-151215.
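
For example, with placeholder values:

  export COUCH_URL="https://myhost.cloudant.com"
  export IAM_API_KEY="ABC123515-151215"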

The name of the database - default "test"

Define the name of the CouchDB database to write to by setting the COUCH_DATABASE environment variable e.g.

  export COUCH_DATABASE="mydatabase"

Transformation function - default nothing

Define the path of a file containing a transformation function e.g.

  export COUCH_TRANSFORM="/home/myuser/transform.js"

The file should:

  • be a JavaScript file
  • export one function that takes a single doc and returns a single object or an array of objects if you need to split a row into multiple docs.

(see the examples directory, and the sketch below).
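
For illustration, a minimal transform.js might look like this (a sketch only; the price field is a hypothetical column in your source data):

  // transform.js - example transformation function
  // receives one parsed row/document and returns the object(s) to be written
  var transform = function(doc) {
    // convert a hypothetical "price" column from string to number
    doc.price = parseFloat(doc.price);
    // return the modified document, or an array of objects to split the row
    return doc;
  };
  module.exports = transform;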

Delimiter - default "\t"

Define the column delimiter in the input data e.g.

  export COUCH_DELIMITER=","

Running

Simply pipe the text data into "couchimport":

  cat ~/test.tsv | couchimport

This example downloads public crime data, unzips and imports it:

  curl 'http://data.octo.dc.gov/feeds/crime_incidents/archive/crime_incidents_2013_CSV.zip' > crime.zip
  unzip crime.zip
  export COUCH_DATABASE="crime_2013"
  export COUCH_DELIMITER=","
  ccurl -X PUT /crime_2013
  cat crime_incidents_2013_CSV.csv | couchimport

In the above example we use ccurl (https://github.com/glynnbird/ccurl), a command-line utility that uses the same environment variables as couchimport.

Output

The following output is visible on the console when "couchimport" runs:

couchimport
-----------
 url         : "https://****:****@myhost.cloudant.com"
 database    : "test"
 delimiter   : "\t"
 buffer      : 500
 parallelism : 1
 type        : "text"
-----------
  couchimport Written ok:500 - failed: 0 -  (500) +0ms
  couchimport { documents: 500, failed: 0, total: 500, totalfailed: 0 } +0ms
  couchimport Written ok:499 - failed: 0 -  (999) +368ms
  couchimport { documents: 499, failed: 0, total: 999, totalfailed: 0 } +368ms
  couchimport writecomplete { total: 999, totalfailed: 0 } +0ms
  couchimport Import complete +81ms

The configuration, whether default or overridden by environment variables or command-line arguments, is shown. This is followed by a line of output for each block of 500 documents written, plus a cumulative total.

Preview mode

If you want to see a preview of the JSON that would be created from your CSV/TSV files, then add --preview true to your command line:

    > cat text.txt | couchimport --preview true
    Detected a TAB column delimiter
    { product_id: '1',
      brand: 'Gibson',
      type: 'Electric',
      range: 'ES 330',
      sold: 'FALSE' }

As well as showing a JSON preview, preview mode also attempts to detect the column delimiter character for you.

Importing large JSON documents

If your source document is a GeoJSON text file, couchimport can be used. Let's say your JSON looks like this:

{ "features": [ { "a":1}, {"a":2}] }

and you need to import each feature object into CouchDB as a separate document, then this can be achieved using the type="json" argument and specifying the JSON path with the jsonpath argument:

  cat myfile.json | couchimport --database mydb --type json --jsonpath "features.*"

Importing JSON Lines file

If your source document is a JSON Lines text file, couchimport can be used. Let's say your JSON Lines looks like this:

{"a":1}
{"a":2}
{"a":3}
{"a":4}
{"a":5}
{"a":6}
{"a":7}
{"a":8}
{"a":9}

and you need to import each line into CouchDB as a separate document, then this can be achieved using the type="jsonl" argument:

  cat myfile.json | couchimport --database mydb --type jsonl

Importing a stream of JSONs

If your source data is a series of JSON objects concatenated or appended together, couchimport can still be used. Let's say your file looks like this:

{"a":1}{"a":2}  {"a":3}{"a":4}
{"a":5}          {"a":6}
{"a":7}{"a":8}



{"a":9}

and you need to import each JSON object into CouchDB as a separate document, then this can be achieved using the type="jsonl" argument:

  cat myfile.json.blob | couchimport --database mydb --type jsonl

Overwriting existing data

If you are importing data into a CouchDB database that already contains data, and you are supplying a document _id in your source data, then any documents whose _id already exists will fail to write because CouchDB will report a 409 Document Conflict. If you want your supplied data to supersede the existing data, then supply --overwrite true/-o true as a command-line option. This will instruct couchimport to fetch the existing documents' current _rev values and inject them into the imported data stream.
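
For example (the file name is illustrative):

  cat updated.csv | couchimport --database mydb --delimiter "," --overwrite true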

Note: Using overwrite mode is slower because an additional API call is required per batch of data imported. Use caution when importing data into a data set that is being changed by another actor at the same time.

Environment variables

  • COUCH_URL - the url of the CouchDB instance (required, or to be supplied on the command line)
  • COUCH_DATABASE - the database to deal with (required, or to be supplied on the command line)
  • COUCH_DELIMITER - the delimiter to use (default '\t', not required)
  • COUCH_TRANSFORM - the path of a transformation function (not required)
  • COUCHIMPORT_META - a json object which will be passed to the transform function (not required)
  • COUCH_BUFFER_SIZE - the number of records written to CouchDB per bulk write (defaults to 500, not required)
  • COUCH_FILETYPE - the type of file being imported, either "text", "json" or "jsonl" (defaults to "text", not required)
  • COUCH_JSON_PATH - the path into the incoming JSON document (only required for COUCH_FILETYPE=json imports)
  • COUCH_PREVIEW - run in preview mode
  • COUCH_IGNORE_FIELDS - a comma-separated list of field names to ignore on import or export e.g. price,url,image
  • COUCH_OVERWRITE - overwrite existing document revisions with supplied data
  • COUCH_PARALLELISM - the maximum number of HTTP requests to have in flight at any one time (default: 1)
  • COUCH_MAX_WPS - the maximum number of write API calls to make per second (rate limiting) (default: 0 - no rate limiting)
  • COUCH_RETRY - whether to retry requests which yield a 429 response (default: false)
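
For example, a typical environment-based setup for a comma-separated import might look like this (the values are illustrative):

  export COUCH_URL="http://localhost:5984"
  export COUCH_DATABASE="mydb"
  export COUCH_DELIMITER=","
  export COUCH_BUFFER_SIZE=1000
  cat mydata.csv | couchimport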

Command-line parameters

You can also configure couchimport and couchexport using command-line parameters:

  • --help - show help
  • --version - simply prints the version and exits
  • --url/-u - the url of the CouchDB instance (required, or to be supplied in the environment)
  • --database/--db/-d - the database to deal with (required, or to be supplied in the environment)
  • --delimiter - the delimiter to use (default '\t', not required)
  • --transform - the path of a transformation function (not required)
  • --meta/-m - a json object which will be passed to the transform function (not required)
  • --buffer/-b - the number of records written to CouchDB per bulk write (defaults to 500, not required)
  • --type/-t - the type of file being imported, either "text", "json" or "jsonl" (defaults to "text", not required)
  • --jsonpath/-j - the path into the incoming JSON document (only required for type=json imports)
  • --preview/-p - if 'true', runs in preview mode (default false)
  • --ignorefields/-i - a comma-separated list of fields to ignore on input or output (default none)
  • --parallelism - the number of HTTP requests to have in flight at any one time (default 1)
  • --maxwps - the maximum number of write API calls to make per second (default 0 - no rate limiting)
  • --overwrite/-o - overwrite existing document revisions with supplied data (default: false)
  • --retry/-r - whether to retry requests which yield a 429 response (default: false)

e.g.

    cat test.csv | couchimport --database  bob --delimiter ","

couchexport

If you have structured data in a CouchDB or Cloudant database that has fixed keys and values e.g.

{
    "_id": "badger",
    "_rev": "5-a9283409e3253a0f3e07713f42cd4d40",
    "wiki_page": "http://en.wikipedia.org/wiki/Badger",
    "min_weight": 7,
    "max_weight": 30,
    "min_length": 0.6,
    "max_length": 0.9,
    "latin_name": "Meles meles",
    "class": "mammal",
    "diet": "omnivore",
    "a": true
}

then it can be exported to a CSV like so (note how we set the delimiter):

    couchexport --url http://localhost:5984 --database animaldb --delimiter "," > test.csv

or to a TSV like so (we don't need to specify the delimiter since tab \t is the default):

    couchexport --url http://localhost:5984 --database animaldb > test.tsv

or to a stream of JSON:

    couchexport --url http://localhost:5984 --database animaldb --type jsonl

N.B.

  • design documents are ignored
  • the first non-design document is used to define the headings
  • if subsequent documents have different keys, then unexpected things may happen
  • COUCH_DELIMITER or --delimiter can be used to provide a custom column delimiter (not required when tab-delimited)
  • if your document values contain carriage returns or the column delimiter, then this may not be the tool for you
  • you may supply a JavaScript --transform function to modify the data on its way out
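
For example, to apply a transform function on the way out (the file path is illustrative):

    couchexport --url http://localhost:5984 --database animaldb --delimiter "," --transform ./transform.js > test.csv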

Using programmatically

In your project, add couchimport into the dependencies of your package.json or run npm install couchimport. In your code, require the library with

    var couchimport = require('couchimport');

and your options are set in an object whose keys are the same as the COUCH_* environment variables:

e.g.

   var opts = { delimiter: ",", url: "http://localhost:5984", database: "mydb" };

To import data from a readable stream (rs):

    var rs = process.stdin;
    couchimport.importStream(rs, opts, function(err,data) {
       console.log("done");
    });

To import data from a named file:

    couchimport.importFile("input.txt", opts, function(err,data) {
       console.log("done",err,data);
    });

To export data to a writable stream (ws):

   var ws = process.stdout;
   couchimport.exportStream(ws, opts, function(err, data) {
     console.log("done",err,data);
   });

To export data to a named file:

   couchimport.exportFile("output.txt", opts, function(err, data) {
      console.log("done",err,data);
   });

To preview a file:

    couchimport.previewCSVFile('./hp.csv', opts, function(err, data, delimiter) {
      console.log("done", err, data, delimiter);
    });

To preview a CSV/TSV on a URL:

    couchimport.previewURL('https://myhosting.com/hp.csv', opts, function(err, data, delimiter) {
      console.log("done", err, data, delimiter);
    });

Monitoring an import

Both importStream and importFile return an EventEmitter which emits

  • written event on a successful write
  • writeerror event when a complete write operation fails
  • writecomplete event after the last write has finished
  • writefail event when an individual line in the CSV fails to be saved as a doc

e.g.

    couchimport.importFile("input.txt", opts, function(err, data) {
      console.log("done", err, data);
    }).on("written", function(data) {
      // data = { documents: 500, failed: 6, total: 63000, totalfailed: 42 }
    });

The emitted data is an object containing:

  • documents - the number of documents written in the last batch
  • total - the total number of documents written so far
  • failed - the number of documents failed to write in the last batch
  • totalfailed - the number of documents that failed to write in total
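
The other events can be handled in the same way; here is a sketch (the payloads shown in the comments are illustrative):

    couchimport.importFile("input.txt", opts, function(err, data) {
      console.log("done", err, data);
    }).on("writecomplete", function(data) {
      // e.g. data = { total: 63000, totalfailed: 42 }
      console.log("all writes finished", data);
    }).on("writefail", function(data) {
      // emitted when an individual document fails to be saved
      console.log("document write failed", data);
    });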

Parallelism & Rate limiting

Using the COUCH_PARALLELISM environment variable or the --parallelism command-line option, couchimport can be configured to write data in multiple parallel operations. If you have the network bandwidth, this can significantly speed up large data imports e.g.

  cat bigdata.csv | couchimport --database mydb --parallelism 10 --delimiter ","

This can be combined with the COUCH_MAX_WPS/--maxwps parameter to limit the number of write API calls dispatched per second, to make sure you don't exceed the write limit of a rate-limited service.
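
For example, to keep the import to roughly five write API calls per second (the numbers are illustrative; adjust them to your service's limits):

  cat bigdata.csv | couchimport --database mydb --delimiter "," --parallelism 2 --maxwps 5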

couchimport's People

Contributors

assafmo, benjspriggs, dependabot[bot], glynnbird, gr2m, greenkeeperio-bot, jason-cooke, jdfitzgerald, jkryspin, lornajane, micmath, rajrsingh, smeans, terichadbourne


couchimport's Issues

Commas not escaped on export

When using couchexport with a , as a delimiter, the commas inside the fields are not escaped, and therefore the CSV is broken. Usually CSV files will wrap fields containing commas with "s.

Using couchimport v. 0.7.0

Confused about JSONPath parameter

Want to create a test database.
Production database is ~ 3M rows (Cloudant)
I query out my random sample with a _view, and use ?include_docs=true with ccurl piped to a file.

Now I got JSON like this:

{"total_rows":10000,"offset":0,"rows":[
   {"id":" 61001",
    "key":" 61001",
    "value":1,
    "doc": {
       "_id":"61001",
       "_rev":"7-7b34fbeadcb40c5c7034cd3628da5d7c",
       "field1":""}
   },

etc.

So, to import that back up into a new Cloudant database, doing this:

export COUCH_DATABASE="new-db-small"
export COUCH_PARALLELISM=10
export COUCH_FILETYPE=json

export COUCH_JSON_PATH="rows.*.doc"
cat ./data/tickets_small.json | couchimport

and I'm not finding the documents in the source JSON. I tried:

  • rows.* : was wrong, got the id, key, value, doc properties
  • .rows[*].doc : which should work if it was a JSONPath per Stefan Goessner, I think

and a bunch of trial and error.
So, what am I missing? I'm certain you've done exactly this, considering the output of a view.

Crashing Consistently

Loading a JSON file to Cloudant. Any file throws an error around the 117,000 record mark.

Posting the console dump here:

<--- Last few GCs --->

35952 ms: Scavenge 1399.3 (1458.1) -> 1399.3 (1458.1) MB, 1.1 / 0 ms (+ 1.0 ms in 1 steps since last GC) [allocation failure] [incremental marking delaying mark-sweep].
36882 ms: Mark-sweep 1399.3 (1458.1) -> 1398.9 (1458.1) MB, 930.4 / 0 ms (+ 1.0 ms in 1 steps since start of marking, biggest step 1.0 ms) [last resort gc].
37793 ms: Mark-sweep 1398.9 (1458.1) -> 1398.9 (1458.1) MB, 910.4 / 0 ms [last resort gc].

<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x30eacf1e3ac1
2: write [/usr/local/lib/node_modules/couchimport/node_modules/jsonparse/jsonparse.js:~80] [pc=0x38b956c216c0](this=0x1cf13944ecc1 <a Parser with map 0x288e820a8131>,buffer=0xdf746b4d31 <an Uint8Array with map 0x288e82005759)
3: /* anonymous */ [/usr/local/lib/node_modules/couchimport/node_modules/JSONStream/index.js:~20] [pc=0x38b956ca4417] (this=0x1cf13944ee19 <a Stream with map 0x2...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
Abort trap: 6

Export custom data from object

{ "_id": "d4ebc3b9397ba3faaacde2bfb80089e7", "_rev": "2-69b7ea5b6dd7495ff3d26a0fc5630825", "personal": { "firstName": "Dan", "lastName": "Denney", "dob": "12/26/87", "gender": "male" }, "work": { "workTitle": "Software Engineer", "workCompany": "Code School" }, "social": { "website": "dandenney.com", "emailId": "[email protected]", "twitter": "dandenney1", "facebook": "dandenney" }, "address": { "houseNumber": "52", "streetName": "main street", "city": "lodz", "country": "poland" } }

We want to export a simple object with the attributes firstName, lastName and company.
We created transform.js:

// example transformation function
// -- remove leading and trailing quotes
var x = function(doc) {
  doc.firstName = doc.personal.firstName;
  doc.lastName = doc.personal.lastName;
  doc.company = doc.work.workCompany;
  delete doc.work;
  delete doc.social;
  delete doc.address;
  return doc;
}
module.exports = x;

Command used
couchexport --url http://localhost:5984 --database tomtomdemo --delimiter "," --transform "D:\Places\optimus\Tools\couchDbTools\transform.js" > test_1.csv

Result : not expected
_id,personal,work,social,address
d4ebc3b9397ba3faaacde2bfb80089e7,[object Object],[object Object],[object Object],[object Object]

Getting writefail events even for successful document inserts

Almost there -- and looking good, except I'm getting a writefail for every document insert, when they're succeeding?

Writefail
{ id: '01532024366550100050901',
rev: '3-d8fe0cefe3cdc9914d3040d425ea84ff' }
Writefail
{ id: '01532329456550100051600',
rev: '3-0321a1818f40d1bbf5a07412718bc15c' }
Writefail
{ id: '01532420316550100015440',
rev: '2-fe828a1c6bfd16ee3963d6b415e58454' }
{ documents: 0, failed: 500, total: 0, totalfailed: 6500 }

But, I checked in Cloudant ... the document inserts are definitely succeeding.

Getting Killed after importing about 20k records

Hi,
I'm trying to import a fairly big data set (~3 million entries) by using couchimport, but the process always gets killed for some reason.
Here's my output:

root@leb-01122233:~# cat dailydump.txt | couchimport
******************
 COUCHIMPORT - configuration
   {"COUCH_URL":"http://****:****@127.0.0.1:5984","COUCH_DATABASE":"torrents","COUCH_TRANSFORM":null,"COUCH_DELIMETER":"|"}
******************
Written 500  ( 500 )
Written 500  ( 1000 )
...
Written 500  ( 21500 )
Written 500  ( 22000 )
Written 500  ( 22500 )
Written 500  ( 23000 )
Killed

any ideas?

IMPORT BIGDATA

I'm trying to load a large amount of data (a ~100 GB CSV file), and I use the --parallelism and --buffer parameters but I cannot improve the loading time.
Can you help me solve this, or tell me how to make it consume more RAM?

Add a version switch

Allow users to check the tool is installed without actually running the import piece. I expected to be able to do couchimport --version to check that the tool was working and what version I had. Could we add this?

Not getting the writefail event as expected

I'm passing in a file I know should be causing document conflicts, but I'm receiving the written event instead of the writefail. Any thoughts?

couchimport.importFile(jsonFile, opts, function(err, data) {
  console.log("Imported file: " + jsonFile, err, data);
}).on("writefail", function(data) {
  console.log(data);
  console.log("Writefail");
}).on("written", function(data) {
  console.log(data);
  console.log("Written");
}).on("writeerror", function(data) {
  console.log(data);
  console.log("WriteError");
});

{ documents: 2, total: 2 }
Written
Imported file: /dev/shm/SOE/inv_test.json null { total: 2 }

end event listener registration error

Hi all, I've been working on a tool to identify instances of events registered to the wrong object in uses of some JavaScript event-driven APIs, as part of a research project.
The tool flagged line 66 in includes/preview.js, on the registration of the “end” event.

The reason I believe this is indicative of an error is as follows (from looking at the nodejs http API documentation).
The return of agent.get is an http.ClientRequest. But, “end” is an event on a readable stream, and http.ClientRequest is a writable stream.

Since the argument to the callback passed into agent.get is an http.IncomingMessage, which is a readable stream, then my guess is that the listener for “end” maybe should be registered on this variable instead.
Specifically, I would guess the code should instead be

 agent.get(u, function (rs) {
    rs.on('data', function (d) {
      b = Buffer.concat([b, d])
      if (b.length > 10000) {
        rs.destroy()
        alldone()
      }
    });
    rs.on('end', alldone); // this registration has been moved
  }).on('error', alldone)

Thanks!

"file not found" when using IAM_KEY authentication

Fresh install of couchimport.
Verified all runs well when using id and password authentication.

Trying to use IAM_KEY auth.

  1. export IAM_API_KEY=<value from my cloudant service credentials 'apikey'>
  2. run couchimport

get the following:

couchimport

url : "https://6f0e3c7d-3b09-4fd0-b253-c26d43892ac6-bluemix.cloudantnosqldb.appdomain.cloud"
database : "unlocodes_data"
delimiter : "\t"
buffer : 500
parallelism : 1
type : "jsonl"

Error: ENOENT: no such file or directory, open '/Users/kbiegert/.ccurl/keycache.json'
at Object.openSync (fs.js:457:3)
at Object.readFileSync (fs.js:359:35)
at Object.init (/usr/local/Cellar/node/13.8.0/lib/node_modules/couchimport/node_modules/ccurllib/index.js:21:20)
at Object.getToken (/usr/local/Cellar/node/13.8.0/lib/node_modules/couchimport/includes/iam.js:5:14)
at module.exports (/usr/local/Cellar/node/13.8.0/lib/node_modules/couchimport/includes/writer.js:20:7)
at Object.importStream (/usr/local/Cellar/node/13.8.0/lib/node_modules/couchimport/app.js:22:49)
at Object.<anonymous> (/usr/local/Cellar/node/13.8.0/lib/node_modules/couchimport/bin/couchimport.bin.js:48:15)
at Module._compile (internal/modules/cjs/loader.js:1157:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1177:10)
at Module.load (internal/modules/cjs/loader.js:1001:32) {
errno: -2,
syscall: 'open',
code: 'ENOENT',
path: '/Users/kbiegert/.ccurl/keycache.json'
}
events.js:298
throw er; // Unhandled 'error' event
^

Error [ERR_METHOD_NOT_IMPLEMENTED]: The _transform() method is not implemented
at Transform._transform (_stream_transform.js:166:6)
at Transform._read (_stream_transform.js:191:10)
at Transform._write (_stream_transform.js:179:12)
at doWrite (_stream_writable.js:441:12)
at writeOrBuffer (_stream_writable.js:425:5)
at Transform.Writable.write (_stream_writable.js:316:11)
at Transform.ondata (_stream_readable.js:714:22)
at Transform.emit (events.js:321:20)
at addChunk (_stream_readable.js:294:12)
at readableAddChunk (_stream_readable.js:275:11)
Emitted 'error' event on Transform instance at:
at errorOrDestroy (internal/streams/destroy.js:108:12)
at Transform.onerror (_stream_readable.js:746:7)
at Transform.emit (events.js:321:20)
at errorOrDestroy (internal/streams/destroy.js:108:12)
at onwriteError (_stream_writable.js:456:5)
at onwrite (_stream_writable.js:483:5)
at Transform.afterTransform (_stream_transform.js:98:3)
at Transform._transform (_stream_transform.js:166:3)
at Transform._read (_stream_transform.js:191:10)
at Transform._write (_stream_transform.js:179:12) {
code: 'ERR_METHOD_NOT_IMPLEMENTED'
}

Approach to capture errors when running from the command line?

I'll start by saying that couchimport is blazing fast when importing documents into Cloudant, no complaints there!

I'm calling couchimport from the command line to import large JSON files containing 60k documents per file, or so. Is there a way I can get the utility to tell me which documents failed to write, for example, due to revision conflict errors?

cat inv_1.json | couchimport --db invoices --type json --jsonpath "docs.*"

Limit the requests per second this library makes

When using the Cloudant Lite plan on Bluemix, there is a rate limit imposed on customers. Users who exceed that rate of API calls will start receiving HTTP 429 replies.

When importing large data sets, it's best to stick to a maximum API call rate (say 5 per second) to avoid the 429 responses.
