
bradjasper / importjson


Import JSON into Google Sheets. This library adds various ImportJSON functions to your spreadsheet.

License: GNU Lesser General Public License v3.0

JavaScript 3.68% HTML 86.11% CSS 8.43% Roff 1.78%
spreadsheet google-sheets importjson parsing cryptocurrencies

importjson's Introduction

ImportJSON

NOTE: This repo is currently unmaintained and looking for a new developer. If you are interested please reach out to [email protected]

Import JSON from any URL directly into your Google Sheets. ImportJSON.gs adds an =ImportJSON() function to your spreadsheet, allowing quick and easy JSON importing. To use it, go to Tools > Script Editor and add the ImportJSON.gs file. Your spreadsheet can then access the ImportJSON() function. Use it like this:

=ImportJSON("https://mysafeinfo.com/api/data?list=bestnovels&format=json&rows=20&alias=cnt=count,avg=average_rank,tt=title,au=author,yr=year", "/title")

Here are all the functions available:

Function Description
ImportJSON For use by end users to import a JSON feed from a URL
ImportJSONFromSheet For use by end users to import JSON from one of the Sheets
ImportJSONViaPost For use by end users to import a JSON feed from a URL using POST parameters
ImportJSONBasicAuth For use by end users to import a JSON feed from a URL with HTTP Basic Auth
ImportJSONAdvanced For use by script developers to easily extend the functionality of this library

Review ImportJSON.gs for more info on how to use these in detail.
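For orientation, here are two hedged example calls; the URL, credentials, and query path are placeholders, and the ImportJSONBasicAuth argument order follows the signature ImportJSONBasicAuth(url, username, password, query, parseOptions):

```
=ImportJSON("https://example.com/feed.json", "/items/title", "noHeaders")
=ImportJSONBasicAuth("https://example.com/feed.json", "myuser", "mypassword", "/items/title", "")
```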

Version

  • v1.6.0 (June 2, 2019) Fixed null values (thanks @gdesmedt1)
  • v1.5.0 (January 11, 2019) Adds ability to include all headers in a fixed order even when no data is present for a given header in some or all rows.
  • v1.4.0 (July 23, 2017) - Project transferred to Brad Jasper. Fixed off-by-one array bug. Fixed previous value bug. Added custom annotations. Added ImportJSONFromSheet and ImportJSONBasicAuth.
  • v1.3.0 - Adds ability to import the text from a set of rows containing the text to parse. All cells are concatenated
  • v1.2.1 - Fixed a bug with how nested arrays are handled. The rowIndex counter wasn't incrementing properly when parsing.
  • v1.2.0 - Added ImportJSONViaPost and support for fetchOptions to ImportJSONAdvanced
  • v1.1.1 - Added a version number using Google Scripts Versioning so other developers can use the library
  • v1.1.0 - Added support for the noHeaders option
  • v1.0.0 - Initial release

How can you help?

Website archive

This code base used to be hosted at http://blog.fastfedora.com/projects/import-json and contained a lot of useful information. It has been archived at https://rawgit.com/bradjasper/ImportJSON/master/archive/blog.fastfedora.com/projects/import-json.html

Alternatives

Some of this is possible natively with Google Apps Script's external APIs, like UrlFetch: https://developers.google.com/apps-script/guides/services/external

These require a Google account and explicit permission, but in some cases they may be a good fit.

importjson's People

Contributors

bradjasper, fastfedora, kasitmp, kenorb, luislobo, pbazerque, tommyvernieri


importjson's Issues

Authentication issues with ImportJSON

Hi,

I am trying to import JSON from URL into Google Sheets but I am getting an "Error code 401. Truncated server response: HTTP Digest: Access denied" despite using "user=bob&apikey=xxxx". Could this be because authentication is asked in a "pop-up" window? See attached.

Thanks,

Martin
screenshot1-popup

Figure out how to handle multiple objects in Facebook API using id fields

The Facebook API allows a user to pass in a comma-separated list of IDs using the 'id' URL parameter, which returns a list of objects (see https://developers.facebook.com/docs/reference/api/request-parameters/ ).

However, the returned JSON does not use an array of objects, but a key-value map of objects where the key is the ID of each object. ImportJSON parses this as a single row of data rather than multiple rows of data.

For instance, the following is parsed as a single row with two sets of columns, one group starting with the path '/15765808054' and another starting with the path '/116335138405853':

=ImportJSON("http://graph.facebook.com/?ids=15765808054,116335138405853", "/", "rawHeaders")

This is correct behavior with the current version of ImportJSON. However, it would be useful to be able to parse this as two separate rows. To do so, we'd need a way of either a) ignoring part of the path or b) using wildcards in the path.

If we could ignore the first component in the path, then '/15765808054/about' and '/116335138405853/about' would both look like '/about' and we could treat the two objects as an array.

If we could use wildcard in paths, we could maybe define the query path as '/*/' and treat that query as saying that everything in the first component should be ignored. Alternative syntaxes might be '//' or '/.../', since we may want to use the asterisk as a wildcard character in the future.

Using paths to convert key-value maps into arrays might be a more general solution. So '///' would say: start at the third-level and return everything as if it were a first-level array. With future wildcard support, we could even do '///title*' which says to take the values of all the keys that start with title at the third level and treat them as a first level array.
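To make the discussion concrete, here is a minimal plain-JavaScript sketch (not part of ImportJSON; the function name and output shape are illustrative) of what treating a key-value map as an array of rows could look like, with the top-level keys surfaced as an id column:

```javascript
// Illustrative sketch: convert { id1: {...}, id2: {...} } into rows, so a
// key-value map of objects parses like an array instead of one wide row.
function mapToRows(keyedObjects) {
  var headers = [];
  // First pass: collect every property name in first-seen order so all
  // rows share one header set.
  Object.keys(keyedObjects).forEach(function (id) {
    Object.keys(keyedObjects[id]).forEach(function (key) {
      if (headers.indexOf(key) === -1) headers.push(key);
    });
  });
  // Second pass: one row per top-level key, blank where a property is unset.
  var rows = Object.keys(keyedObjects).map(function (id) {
    return [id].concat(headers.map(function (h) {
      return keyedObjects[id].hasOwnProperty(h) ? keyedObjects[id][h] : "";
    }));
  });
  return [["id"].concat(headers)].concat(rows);
}
```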

Empty value uses previous value

I'm using this great script to import my Trello history into Google Spreadsheets.
For whatever reason, if a node's value is an empty string, it uses the previous value for the same node. It continues doing this until it encounters a non-empty value, and then repeats the issue.
For instance, if "/cards/desc" was one of my queries and returned say, 6 cards, my output might look like (assuming cards 2,3 and 5 have no description):

     Data Column Header:
     value from first card
     value from first card
     value from first card
     value from fourth card
     value from fourth card
     value from sixth card

Data updates

My doc never appears to be pulling the value again from a URL past the first time. How can I make it refresh periodically?

Not importing more than 256 characters

Hello,

I'm trying to import the JSON from this URL.

I place the following in cell A1 after adding ImportJSON.gs to my spreadsheet:

=ImportJSON("https://matthewkosloski.me/foo.json")

It appears that not all the data is being imported.

The first column has all the data (all 256 characters of text).
The second column imports all of the data except for the period at the end (the 257th character).
The third column only imports up to the 256th character (cutting off at "digni").

This is the data that is being imported:

255 Chars 257 Chars 512 Chars
Iste eaque consectetur veritatis. Dolorum at, nihil numquam eius aliquam tenetur assumenda ut harum! Officia praesentium repellat rerum assumenda asperiores magnam aliquid inventore eaque perferendis debitis ad aspernatur blanditiis impedit, harum sit ap. Lorem ipsum dolor sit amet, consectetur adipisicing elit. Magnam impedit officia animi, cumque praesentium sed minima non aut, quisquam ex, esse deleniti perferendis cum nam perspiciatis explicabo ad! Accusamus vel ducimus non consequatur facere, nihil rec Deleniti ad debitis quo sed totam deserunt sequi voluptatum consequuntur, maiores non ipsa qui perspiciatis, nisi praesentium. Velit nisi ex, numquam cum ad quia ipsam rerum deleniti repellendus nemo explicabo esse? Iure, soluta et iusto maxime totam digni

This is what I should be getting:

255 Chars 257 Chars 512 Chars
Iste eaque consectetur veritatis. Dolorum at, nihil numquam eius aliquam tenetur assumenda ut harum! Officia praesentium repellat rerum assumenda asperiores magnam aliquid inventore eaque perferendis debitis ad aspernatur blanditiis impedit, harum sit ap. Lorem ipsum dolor sit amet, consectetur adipisicing elit. Magnam impedit officia animi, cumque praesentium sed minima non aut, quisquam ex, esse deleniti perferendis cum nam perspiciatis explicabo ad! Accusamus vel ducimus non consequatur facere, nihil rec. Deleniti ad debitis quo sed totam deserunt sequi voluptatum consequuntur, maiores non ipsa qui perspiciatis, nisi praesentium. Velit nisi ex, numquam cum ad quia ipsam rerum deleniti repellendus nemo explicabo esse? Iure, soluta et iusto maxime totam dignissimos dolore impedit sapiente, nam accusamus ratione, dolores ut praesentium, nulla sed quis voluptates tempora voluptatum. Consectetur in nisi delectus quasi optio impedit deleniti voluptates natus placeat dolorum dolore ad ullam alias illum maxime veni.

As you can see, there should be a period at the end of "rec" in column two, and column three has much more text.

I pasted my JSON in a linter and it says it's valid JSON. Are you able to recreate this issue?

It appears that it's only importing up to 256 characters of text.

Nevertheless, this is a really cool, valuable tool. Thanks for making it!

Time out

Hi,

Very often I get error "504" on line 212:

504 Gateway Time-out

The server didn't respond in time.

How can I change the connection-timeout value to, for example, 2000 ms?

Anyone have problems with creating charts or aggregate functions on imported data?

First of all, thanks for your code!

I'm unable to create a Chart, or perform simple aggregate functions like =SUM(A1:A10), on imported numeric json data in a Google Sheet. The Sheet is basically acting like the numbers are null, or zero, when charting or aggregates are involved, even though the numeric values are clearly there.

Even weirder: given values of 100 and 101 in cells A1 and A2, respectively, a formula in another cell like =A1+A2 correctly evaluates to 201, but =SUM(A1:A2) evaluates to 0.0.

Can anyone reproduce, or am I going crazy?
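One likely explanation, offered as a guess rather than a confirmed diagnosis: the imported values arrived as text. In Google Sheets, + coerces text to numbers while SUM silently skips text, which matches the 201-versus-0.0 symptom. A hedged workaround is to coerce explicitly:

```
=SUM(ARRAYFORMULA(VALUE(A1:A10)))
```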

how could I get a list of epic keys from my JIRA instance using importJSON?

I can make a POST call using a REST API browser extension that returns me a JSON. The URL is

https://<projectName>.atlassian.net/rest/api/2/search

And the payload is:

{ 
	"jql": "project = <projectName> and issuetype = epic", 
	"startAt": 0, 
	"maxResults": 50, 
	"fields": [ "key" ]
}

I also have to supply the username and password.

I have tried with:
=ImportJSON("https://<projectName>.atlassian.net/rest/api/2/search","/key","user=username&apikey=password&jql": "project = projectName and issuetype = epic", "fields": "key")

I probably have not fully understood the function's signature. The URL is pretty obvious, but the query and parse options are still a bit unclear.

There is an error message:
Request failed for https://projectName.atlassian.net/rest/api/2/search returned code 415 (line 212).

Thanks in advance,

Can I use row headers?

I have a JSON with nested dictionaries like:

{
"row_head_1": {"col_head_1": "data_1_1", "col_head_2": "data_1_2"},
"row_head_2": {"col_head_1": "data_2_1", "col_head_2": "data_2_2"}
}

Can I get a table like:

              col_head_1    col_head_2
row_head_1    data_1_1    data_1_2
row_head_2    data_2_1    data_2_2
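The library has no such option built in, but a minimal sketch in plain JavaScript (dictToTable is an illustrative name, not an ImportJSON function) shows the transformation being asked for:

```javascript
// Illustrative sketch: turn a dictionary of dictionaries into a table whose
// first column holds the row keys and whose header row holds the column keys.
function dictToTable(dict) {
  var rowKeys = Object.keys(dict);
  var colKeys = [];
  rowKeys.forEach(function (rk) {
    Object.keys(dict[rk]).forEach(function (ck) {
      if (colKeys.indexOf(ck) === -1) colKeys.push(ck);
    });
  });
  var table = [[""].concat(colKeys)]; // blank corner cell above row headers
  rowKeys.forEach(function (rk) {
    table.push([rk].concat(colKeys.map(function (ck) {
      return dict[rk].hasOwnProperty(ck) ? dict[rk][ck] : "";
    })));
  });
  return table;
}
```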

Ability to import JSONPath queries

Feature request to include the ability to write queries using JSONPath.

Say I have the following data:

{
    "data": {
        "401": {
            "inactive": {
                "25H": {
                    "Ra-den": [
                        {
                            "date": "1372364585000",
                            "value": 135239,
                            "duration": 399154
                        },
                        {
                            "date": "1373917966000",
                            "value": 138991,
                            "duration": 380561
                        },
                        {
                            "date": "1374699514000",
                            "value": 154499,
                            "duration": 386774
                        },
                        {
                            "date": "1378670807000",
                            "value": 239680,
                            "duration": 354066
                        }
                    ]
                }
            }
        },
        "402": {
            "inactive": {
                "10N": {
                    "Jinrokh_the_Breaker": [
                        {
                            "date": "1370978275000",
                            "value": 200075,
                            "duration": 128514
                        }
                    ]
                }
            }
        }
    }
}

I would like to have the ability to use a query path of $..duration to return all the duration values in the document.

Example:

=ImportJSON("http://raidbots.com/json/playerdata/eu/ysondre/laterr", "$..duration", "noTruncate")

Control over column order

When using with the Trello api (haven't test with other sources) there doesn't seem to be control over column order.
I'm using "/cards/dateLastActivity,/cards/name,/cards/desc,/cards/shortUrl" as my queries but the columns still get rendered as Datelastactivity, Desc, Name, Shorturl

Is there a way for importJSON to ignore "Warning: apc_store()" errors to retrieve remaining data in an API call?

When using the function, I occasionally get the following error as a row of text at the top of the results from my API call:

Warning: apc_store(): Potential cache slam averted for key '/var/sandbox/gw2spidy/config/cnf/env' in /var/sandbox/gw2spidy/vendor/igorw/src/Igorw/Silex/Env.php on line 71 {"sell-or-buy":"sell","count":1000,"page":1,"last_page":3,"total":2946,"results":[ {"listing_datetime":"2018-01-09 05:36:10 UTC","unit_price":40,"quantity":12040413,"listings":74713}, {"listing_datetime":"2018-01-09 04:57:21 UTC","unit_price":32,"quantity":12038672,"listings":74615}, ...

The data is still in the JSON response, but the function can't seem to read past the text error, and therefore it pops an error in the spreadsheet and doesn't pull down any data. Is there a way for the formula to ignore that erroneous text, or would this need to be an issue fixed on the side of the API developer?

Import JSON refresh with Sheet Trigger

Hi,
the script works perfectly and is really helpful for me.
Would it be possible to implement a sheet trigger to create an automatic refresh?
I have tested many variants of refresh, but none work with your script.
Thanks in advance for your help.

Problems with underscore "_" in json attributes

Hi all,

I have problems when JSON attributes contain an underscore. Example:

{ "courses": { "request": { "url": "https" }, "course": { "url": "http", "number": "0000", "number_id": "xxx1" } } }

When I use the ImportJSON function, I can only see 3 columns, not 4.

Missing First line of data

Hello

I've tried the script on a sample JSON source, http://api.geonames.org/postalCodeLookupJSON?postalcode=6600&country=AT&username=demo, which is also attached to this post.

According to the source, I expect to get the first line with code 70805:
"postalcodes":[{"adminCode2":"708","adminCode3":"70805","adminName3":"Breitenwang"...
But in my spreadsheet, I've got:

Admincode2  Admincode3  Adminname3  Admincode1
708         70806       Ehenbichl   07
708         70820       Lechaschau  07
708         70822       Musau       07

The first line is missing.
I've got the same issue with other JSON database exports.

Thx
test.txt

How to sort by columns and not get refreshed

The script is great and easy to use. The problem I have is that whenever I click to sort on some column, the API I'm using to get data gets refreshed and the data comes back sorted the same as before.

I'm wondering how I could solve this: maybe remove onLoad and load the data on a button click, or somehow keep the sort on refresh?

Avoiding cache

Because we can't use references to cells with dynamic values (see #4 (comment)), inserting the code below between L214 and L215 makes the fetches always retrieve fresh copies when you reload the spreadsheet.

  /* Add time stamp
   * Added by ©TriMoon™ <https://github.com/TriMoon>
   *
   * To get the part after the '?' character we use
   * url.split('?')[1]
   */
  url += (url.split('?')[1] ? '&' : '?') + 'time=' + Date.now();

Here are the lines referred to:

  1. function ImportJSONAdvanced(url, fetchOptions, query, parseOptions, includeFunc, transformFunc) {
  2. var jsondata = UrlFetchApp.fetch(url, fetchOptions);

Rows get merged incorrectly with certain JSON

The JSON below gets parsed as three rows when it should get parsed as four rows. See the expected and actual results below the JSON.

The problem lies in the parseData_ function: the rowIndex is not being incremented correctly. A partial solution is to add rowIndex++; on line 284 of version 1.2.0, but this should be tested to make sure it doesn't break other JSON inputs, and it doesn't completely solve the problem.

JSON Input

[
   { 
      "id":"123" 
   },

   { 
      "id":"457",
      "jobs": [
         {
            "rate":"5.45"
         },
         {
            "rate":"5.75",
            "country":"US"
         }
      ]
   },

   {
      "id":"458",
      "jobs": [
         {
            "rate":"5.55",
            "feedback":
               {
                  "score":"5.0"
               }
         }
      ]
   }
]

Expected Results

Id   Jobs Rate  Jobs Country  Jobs Feedback Score
123
457  5.45
457  5.75      US
458  5.55                    5.0

Actual Results

Id   Jobs Rate  Jobs Country  Jobs Feedback Score
123
457  5.45
458  5.55      US            5.0

using concatenate causes "invalid parameter" error (%20 space in parameters)

I am trying to use concatenate to build the input parameters, with a variable, for the importJSON function. The concatenated string looks fine and works if I copy it into the importJSON function, but if I reference the string as part of the function, or use concatenate inside the function, I get an invalid argument error.

Here is the concatenate string. Cell I7 is the date. For example 2017-11-27.

=concatenate("""https://www.alphavantage.co/query?function=TIME_SERIES_DAILY&symbol=MSFT&apikey=demo""",",","""","/Time Series (Daily)/",I7,"/4. close""", ", ","""noHeaders""")

Here is the output string with the date variable:

"https://www.alphavantage.co/query?function=TIME_SERIES_DAILY&symbol=MSFT&apikey=demo","/Time Series (Daily)/2017-11-27/4. close", "noHeaders"

This works if I copy the concatenate output string manually into the function:

=importJSON("https://www.alphavantage.co/query?function=TIME_SERIES_DAILY&symbol=MSFT&apikey=demo","/Time Series (Daily)/2017-11-27/4. close", "noHeaders")

These don't work:

=importJSON(concatenate("""https://www.alphavantage.co/query?function=TIME_SERIES_DAILY&symbol=MSFT&apikey=demo""",",","""","/Time Series (Daily)/",I7,"/4. close""", ", ","""noHeaders"""))

=importJSON(K9) >> K9 is cell where the concatenate function is
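One reading of the error, offered as a suggestion rather than a confirmed fix: CONCATENATE returns a single string, so the whole expression, quotes and commas included, reaches ImportJSON as one malformed URL argument. Keeping the arguments separate and concatenating only the query that contains the date may work:

```
=importJSON("https://www.alphavantage.co/query?function=TIME_SERIES_DAILY&symbol=MSFT&apikey=demo", CONCATENATE("/Time Series (Daily)/", I7, "/4. close"), "noHeaders")
```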

parseData_ function bug

Line 286 should be:
if (i >= 0 && data[state.rowIndex])

It drops the first entry when it puts the response into the spreadsheet.

Option to print empty values on unset requested paths

I'm currently trying to work with a pretty unpredictable API that, depending on the things I request, returns different paths.
For instance, one call for a product will return size, weight, price, name, etc.
Another product returns only price and name.
If you were to import it like:

size | weight | price | name
=ImportJSON("https://api.com/product1", "/size,/weight,/price,/name", "noHeaders")
=ImportJSON("https://api.com/product2", "/size,/weight,/price,/name", "noHeaders")

You'd end up with something like

size | weight | price | name
32   | 10kg   | $80   | blouse
$42  | carrot |       |

Note how price and name shifted to the size and weight fields.
A parseOption to include unset paths in the output as well would solve this problem.
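The requested behavior can be sketched in plain JavaScript; rowForPaths is illustrative only and uses bare key names rather than ImportJSON's slash-prefixed query paths:

```javascript
// Illustrative sketch: emit one cell per requested path, in a fixed order,
// leaving a blank where the object has no value for that path.
function rowForPaths(obj, paths) {
  return paths.map(function (p) {
    return obj.hasOwnProperty(p) ? obj[p] : "";
  });
}
```

With this, the second product would come back as ["", "", "$42", "carrot"], keeping price and name in their own columns.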

Replace script for IMPORTXML (for larger xml files)

Hi,

I want to import a Google Shopping XML feed into a Google Spreadsheet, but Google Spreadsheet shows an error:

Error Resource at url contents exceeded maximum size.

I need to import the data (and update it every 6 hours) and I don't know how to with this error.

An example of what I'm using is this:

=IMPORTXML("http://avambu.xtechcommerce.com/datafeeds/google_shopping","//channel/item")
But with a larger XML.

Is there a script or another solution for this? Will your script work for this type of file?

Thanks!

Add option to return scalar arrays as arrays

Scalar arrays currently get collapsed into a single cell, with values separated by commas.

A use case exists where it would be better to return the values as an array. For instance:

{
   "blah": "blah",
   "aaData": [
      [
         "John",
         "Smith",
         "[email protected]"
      ],
      [
         "Toby",
         "Harris",
         "[email protected]"
      ]
   ]
}

Currently calling ImportJSON with a query of "/aaData" returns a 1 x 1 array containing the value:

John,Smith,[email protected],Toby,Harris,[email protected]

Instead, there should be an option to return the values as a 3 x 2 array:

John | Smith  | [email protected]
Toby | Harris | [email protected]
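A plain-JavaScript sketch of the two behaviors (both function names are illustrative, not ImportJSON APIs):

```javascript
// Current behavior: the nested array collapses into one comma-joined cell.
function flattenScalarArray(values) {
  return values.map(String).join(",");
}

// Proposed option: keep the structure, returning one spreadsheet row per
// inner array (wrapping bare scalars so the result is always 2D).
function scalarArrayToRange(values) {
  return values.map(function (v) {
    return Array.isArray(v) ? v.slice() : [v];
  });
}
```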

Auto-update

Hi!

Thx for your code!

I have a question about auto-update.

I tried running this on a timer:
function myUpdate() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  ss.setActiveSheet(ss.getSheetByName("AUPD"));

  var testIt = ImportJSON("***");

  var cell = ss.getRange("A1");
  cell.setValue(testIt);
}


But the result appears only in cell A1.


How do I properly do an automatic update on a timer?

Querying a specific field doesn't work

I've tried both queries, one with "/title" and one without. The one with "/title" only returns a parsing error.

With error:
=ImportJSON("https://mysafeinfo.com/api/data?list=bestnovels&format=json&rows=20&alias=cnt=count,avg=average_rank,tt=title,au=author,yr=year", "/title")

Without error:
=ImportJSON("https://mysafeinfo.com/api/data?list=bestnovels&format=json&rows=20&alias=cnt=count,avg=average_rank,tt=title,au=author,yr=year")

Rows missing when parsing some arrays

This data set is not returning all of its rows when queried with "/facets/name,/facets/fields/label,/facets/fields/count".

The problem is that the /facets/fields is skipping rows from 'name:LANGUAGE', 'name:TYPE' and 'name:YEAR' among others. It is also merging in the wrong value for /facets/name.

View this sample spreadsheet that uses the following call:

=ImportJSON("http://europeana.eu/api//v2/search.json?wskey=ISXVIQBYBQ&query=*:*&start=1&rows=0&profile=facets", 
            "/facets/name,/facets/fields/label,/facets/fields/count", 
            "noTruncate")

Auto refresh on sheet reload?

Is there any way to make this script load fresh data when the spreadsheet is reloaded by the user, instead of modifying the cell where the function is called?
I tried #57 but no luck on spreadsheet reload...

For Excel ?

Hello,
I really like your project and use it for my calculations. But I can't always use Google Spreadsheet. Could you rewrite this code for MS Excel in VBA?
Or how can I run your code in Excel?

Figure out how to import JSON-P data

JSON-P adds JavaScript into the JSON format so it can be parsed using a <script> tag and assigned to a variable or passed as a parameter into a callback function. The current ImportJSON function expects pure JSON, without the JavaScript code added.

The problem is that JSON-P is not a standard. The code can be anything that is valid JavaScript (although this may change if the "application/json-p" MIME type gets adopted). Auto-detecting a JSON-P stream seems problematic.

One potential solution might be to ignore the first x number of characters and the last y number of characters, then parse the substring between the two. The user of ImportJSON would need to figure out these counts and then pass them in. This would only work if the JSON-P used the same JavaScript code each time, which is highly likely.

In the meantime, users wanting to import JSON-P data can manually change the code in ImportJSONAdvanced to:

{
  var jsondata    = UrlFetchApp.fetch(url, fetchOptions);
  var content     = jsondata.getContentText();
  var ignoreStart = 0;
  var ignoreEnd   = 0;
  var object      = JSON.parse(content.substring(ignoreStart, content.length - ignoreEnd));

  return parseJSONObject_(object, query, parseOptions, includeFunc, transformFunc);
}

Modify the values of ignoreStart and ignoreEnd to match the length of the JavaScript that leads into the JSON and the number of characters at the end of the stream that finish off the JavaScript statement.

If you try this, please post whether this was successful for you as a comment to this issue. If it's working for enough people without errors, then I can add it into the main codebase in a more extensible way.
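The workaround can be expressed as a standalone, testable function (parseJsonp is an illustrative name; in the script itself the same substring goes straight into JSON.parse as shown above):

```javascript
// Illustrative sketch: strip a fixed number of leading and trailing
// characters from a JSON-P payload, then parse the JSON in between.
function parseJsonp(content, ignoreStart, ignoreEnd) {
  return JSON.parse(content.substring(ignoreStart, content.length - ignoreEnd));
}
```

For a feed shaped like callback({...});, ignoreStart would be the length of "callback(" and ignoreEnd the length of ");".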

importJSON doesn't like the "^" symbol

As the title states, it seems the importJSON function doesn't like the ^ symbol as part of the URL. I have verified the URL works and returns JSON if I enter it into my web browser, but the function returns an Invalid Argument error.

ImportJSONBasicAuth doesn't seem to work

I've tried using ImportJSONBasicAuth but I still get an error. I used the prescribed format: ImportJSONBasicAuth(url, username, password, query, parseOptions)

See attached screenshot. (from #46)
screenshot of error

query limitation?

I'm trying to parse specific columns from this JSON.
When I get to around 6-7 columns I get errors. Parsing the fields by themselves seems to work fine.
"Internal error executing the custom function"
Any known issues, or suggested workarounds?
It also sometimes seems to work if I change the order of the query list.

https://dl.dropboxusercontent.com/s/907ibas1qdtddft/ussdagames2.json?dl=0
/startDate,/awayAgeGroup,/awayName,/awayScore,/homeName,/homeScore,/isPlayed

Whitespace in URL causes script to fail

Hi Brad,

I tripped over an unexpected bug yesterday when I accidentally added a carriage return at the end of my URL, and spent ages trying to figure out why my URL was invalid.

I didn't pick it up initially because my URL was so long that it just looked like the URL was wrapping in the Google Sheet cell.

I finally found the problem, but I'd like to suggest adding the following code at line 215, before the URL is fetched:

url = url.replace(/\s/g, ""); // remove whitespace from URL

or something similar, to make it more robust and save some other poor soul from tearing their hair out like I did.

Thanks for taking this valuable project over, and I'm sure it will get even better.

Have an awesome day,

Nikki
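The suggested guard is small enough to test on its own (stripUrlWhitespace is an illustrative wrapper around the one-line fix above):

```javascript
// Illustrative sketch: remove all whitespace, including a stray trailing
// carriage return or newline, before the URL is fetched.
function stripUrlWhitespace(url) {
  return url.replace(/\s/g, "");
}
```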

Rate Request Limiting Loading....

I'm using =ImportJSON() to populate over 5,000 cells and it works really well. The problem is I'm often stuck on a sheet where thousands of cells show "Loading..." My data isn't fast-changing and only needs to be refreshed every 72 hours (or even less often). Is there any way to enforce this sort of behavior?

Additional information: the server managing the JSON responses runs very fast, and most of the time the response is cached. I think the problem is more due to Google Sheets managing thousands of ImportJSON() calls on a single page.

Any insight would be helpful. Thank you.
