API Data to CSV file generator

Fetch all the data from an API endpoint. If the endpoint holds, for example, 50,000 records, apiDataToCsv will automatically generate the paginated endpoints needed to retrieve all 50,000 records, make the requests asynchronously, gather the results, and export them to a single CSV file.

How it works

Suppose you have an API endpoint

https://example.com/api/vi/users

This endpoint holds 50,000 records, and you want to export all 50k of them to a single CSV file with a single click. But your backend sends paginated data like this:

https://example.com/api/vi/users?offset=0&limit=200

A single HTTP request returns only 200 records, so you have to make 250 HTTP requests to get all 50k of them.

apiDataToCsv won't reduce the number of HTTP requests, but it will make those 250 requests for you, generating all the endpoints itself based on the total reported in the first response.

Let's assume this is your first API response, with a limit of 10 records:

https://example.com/api/vi/users?offset=0&limit=10

{
  "success": true,
  "message": "Data fetch succeeded.",
  "data": {
    "count": 500,
    "rows": [
      {
        "id": 1,
        "email": "[email protected]",
        "isEmailVerified": true,
        "createdAt": "2023-01-16T19:18:34.000Z",
        "updatedAt": "2023-01-16T19:18:34.000Z"
      },
      ....
      ....
      ....,
      {
        "id": 10,
        "email": "[email protected]",
        "isEmailVerified": true,
        "createdAt": "2023-01-16T19:18:49.000Z",
        "updatedAt": "2023-01-16T19:18:49.000Z"
      }
    ]
  }
}

apiDataToCsv divides count by limit, generates its own endpoints to fetch all 500 records, and exports them to a single CSV file.
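For illustration, here is a minimal sketch (not the library's internal code; all names below are illustrative) of how those endpoints can be derived from the first response's count and limit:

// Sketch only: deriving paginated URLs from the first response.
const count = 500   // taken from data.count in the first response
const limit = 10    // the limit used in the first request

const requestCount = Math.ceil(count / limit) // 50 requests
const urls = Array.from({ length: requestCount }, (_, i) =>
    `https://example.com/api/vi/users?offset=${i * limit}&limit=${limit}`
)
// urls[0]  -> ...offset=0&limit=10
// urls[49] -> ...offset=490&limit=10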

Example

Install

npm i api-data-to-csv

For multiple HTTP requests. Use this if you want to export all the data available at the endpoint.

fetchAllData({
    url: string, 
    query?: string, 
    limit?: number, 
    promiseLimit?: number, 
    total?: number, 
    countKey?: string, 
    listKey?: string
})

Example values:

{
    url: `https://example.com/ap/endpoint`,
    query: 'query={"id": 1}&include={"model": "ExampleModel", "as": "exampleModel"}', // Any query string. Note: don't add 'limit' and 'offset' here
    limit: 200, // Data limit per request. Default: 200
    total: 5000, // Total number of records. If not defined, the package gets it from the first API call
    dataKey: 'data', // Data key in the API response object. Default: 'data'
    dataListKey: 'rows', // Data list key in the API response. Default: 'rows'
    countKey: 'count', // Key holding the total number of records in the API response data. Default: 'count'
    promiseLimit: 10, // Promise concurrency limit. Default: 10
}
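A sketch of a typical call, reusing the users endpoint from the example above; the wrapper function name and the file name are illustrative:

import { fetchAllData, exportToCsv } from 'api-data-to-csv'

async function exportAllUsers() {
    // Fetches every record behind the endpoint (500 rows at limit 200 -> 3 requests)
    const data = await fetchAllData({
        url: 'https://example.com/api/vi/users',
        limit: 200,       // rows per request
        promiseLimit: 10, // at most 10 requests in flight at once
    })
    exportToCsv({ data, fileName: 'users' })
}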

For a single HTTP request. Use this if you want to export only the data returned by the passed endpoint.

fetchData({
    url: string, 
    query?: string, 
    limit?: number, 
    offset?: number
})

Example values:

{
    url: `https://example.com/ap/endpoint`,
    query: 'query={"id": 1}&include={"model": "ExampleModel", "as": "exampleModel"}', // Any query string. Note: don't add 'limit' and 'offset' here
    limit: 50, // Default: 50
    offset: 0, // Default: 0
    dataKey: 'data', // Data key in the API response object. Default: 'data'
    dataListKey: 'rows', // Data list key in the API response. Default: 'rows'
    countKey: 'count', // Key holding the total number of records in the API response data. Default: 'count'
}
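Similarly, a sketch of the single-request variant (the wrapper function name and file name are illustrative), exporting only the first page of data:

import { fetchData, exportToCsv } from 'api-data-to-csv'

async function exportFirstPage() {
    // Fetches only the first 50 records (offset 0) and exports that single page
    const data = await fetchData({
        url: 'https://example.com/api/vi/users',
        limit: 50,
        offset: 0,
    })
    exportToCsv({ data, fileName: 'users_page_1' })
}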

ReactJs example

Codesandbox Demo

import { useState } from 'react'
import { fetchData, fetchAllData, exportToCsv } from 'api-data-to-csv'

function App() {
    const [loading, setLoading] = useState(false)

    // Export every record behind the endpoint (multiple requests)
    const handleAllDownload = async () => {
        setLoading(true)
        try {
            const data = await fetchAllData({
                url: 'https://example.com/api/endpoint',
                limit: 100,
                // total: 500,
                // query: ''
            })
            exportToCsv({ data })
            setLoading(false)
        } catch (error) {
            console.log(error)
            setLoading(false)
        }
    }

    // Export only the data from a single request
    const handleDownload = async () => {
        setLoading(true)
        try {
            const data = await fetchData({
                url: 'https://example.com/api/endpoint',
                limit: 100,
                offset: 0,
                // query: '',
            })
            exportToCsv({ data })
            setLoading(false)
        } catch (error) {
            console.log(error)
            setLoading(false)
        }
    }

    return (
        <div>
            <button onClick={handleAllDownload} disabled={loading}>Export all data to CSV</button>
            <button onClick={handleDownload} disabled={loading}>Export single request to CSV</button>
        </div>
    )
}

Using exportToCsv directly with an array of objects

const data = [
    { id: 1, name: "a", title: "title", complete: false, position: "C" },
    { id: 2, name: "a", title: "title", complete: false, position: "C" },
    { id: 3, name: "a", title: "title", complete: false, position: "C" },
    { id: 4, name: "a", title: "title", complete: false, position: "C" },
    { id: 5, name: "a", title: "title", complete: false, position: "C" }
]

exportToCsv({ data, fileName: 'my_user_data' })
