This is a preview release. The API may change.
DataFire is an open source integration framework - think Grunt for APIs, or Zapier for the command line. It is built on top of open standards such as RSS and OpenAPI. Flows can be run locally; on AWS Lambda, Google Cloud, or Azure via the Serverless framework; or on DataFire.io.
DataFire natively supports over 250 public APIs including:
• Slack • GitHub • Twilio • Trello • Spotify • Instagram • Gmail • Google Analytics • YouTube •
as well as MongoDB, RSS feeds, and custom integrations.
Be sure to install DataFire both globally and as a project dependency.
npm install -g datafire
npm install --save datafire
You can use the command line tool to search for and install integrations, as well as make test calls.
See DataFire-flows/headlines for a reference project.
The DataFire-flows account has a few example flows you can clone and try.
- News Headlines - Send yourself a daily e-mail with headlines from NPR, CNN, and NYTimes
- Listen to This - Create a Spotify playlist from tracks posted to Reddit's r/listentothis
- GitHub to Trello - Create Trello cards for every issue in your repo
- Heroku Crash Alerts - Get a Slack message when a Heroku process crashes
See Flows.md for the full documentation
Flows allow you to make a series of calls to different APIs and services. You can synchronize, transfer, and react to data, no matter where it's stored.
You can view this flow in the [examples directory](./examples/0. quickstart).
This quick tutorial will fetch stories from Hacker News, get the details for the top story, then store the results to a local file.
First, let's create a new folder and add the Hacker News integration:
mkdir hacker_news_flow && cd hacker_news_flow
npm install datafire
datafire integrate hacker_news
Now we can create a Flow. Edit ./getTopStory.js:
const datafire = require('datafire');
const fs = require('fs');

// Load the Hacker News integration installed above.
const hackerNews = datafire.Integration.new('hacker_news');

const flow = module.exports =
      new datafire.Flow('Top HN Story', 'Copies the top HN story to a local file');

flow
  .step('stories', {
    // Fetch the list of top story IDs.
    do: hackerNews.getStories(),
    params: {storyType: 'top'},
  })
  .step('story_details', {
    // Look up the details for the highest-ranked story.
    do: hackerNews.getItem(),
    // params can also be a function of the data gathered by previous steps.
    params: data => {
      return {itemID: data.stories[0]};
    },
  })
  .step('write_file', {
    // Write the story details to a local JSON file.
    do: data => {
      fs.writeFileSync('./story.json', JSON.stringify(data.story_details, null, 2));
    },
  });
Now let's run it:
datafire run -f ./getTopStory.js
You should see `story.json` in your current directory.
Run `datafire --help` or `datafire <command> --help` for more info.
datafire list -a # View all available integrations
datafire list -a -q news # Search for integrations by keyword
datafire list # View installed integrations
datafire integrate google-gmail # Add integrations by name
npm install --save @datafire/google-gmail # Or by NPM package
datafire describe google-gmail # Show info and operations
datafire describe google-gmail -o users.messages.list # Show operation details
datafire describe google-gmail -o "GET /{userId}/messages" # Alternative operation name
datafire authenticate google-gmail # Store credentials for later use
# Make a test call to the API
datafire call github -o "GET /users"
# Use stored credentials with --as
datafire call github -o "GET /user" --as account_alias
# Pass parameters with --params.foo
datafire call github -o "GET /users/{username}" --params.username karpathy
# Run a flow
datafire run ./getMessages.js
See Integrations.md for the full documentation
You can add new integrations automatically from an OpenAPI specification or RSS feed. There is also experimental support for writing custom integrations.
See Authentication.md for the full documentation
DataFire can store authentication details for each integration, and multiple accounts can be created for a given integration. Support for basic authentication (username/password), API keys, and OAuth 2.0 is built-in.
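Stored credentials are keyed by an account alias - the name you later pass to `--as`. As a purely illustrative sketch (the exact file format and location are covered in Authentication.md), a set of stored accounts for one integration might look something like:

```json
{
  "account_alias": {
    "api_key": "YOUR_API_KEY"
  },
  "work_account": {
    "access_token": "OAUTH_ACCESS_TOKEN",
    "refresh_token": "OAUTH_REFRESH_TOKEN"
  }
}
```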
See RunningFlows.md for the full documentation
Once you've written a flow, you have a number of options for running it:
- Manually on the command line
- On a schedule with cron
- On AWS Lambda
- Inside a Serverless project
- On DataFire.io
Lambda, Serverless, and DataFire.io all offer ways to run your flow either on a schedule or in response to HTTP requests (webhooks).
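For instance, running the quickstart flow on a cron schedule is a single crontab entry (the project path here is illustrative - point it at your own checkout):

```shell
# Run the Top HN Story flow every day at 8:00 AM.
0 8 * * * cd /path/to/hacker_news_flow && datafire run -f ./getTopStory.js
```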