
google-bigquery-tools's People

Contributors

amygdala, craigcitro, dcoker, haberman


google-bigquery-tools's Issues

Save as csv file should use .csv extension

What interface shows the new feature? [UI, CLI, REST API]
UI, CLI

What problem does the new feature solve?
Easy access to the file directly from client applications associated with .csv 
files (e.g. Excel, OpenOffice)

How does it solve the problem?
The .csv extension will automatically associate the downloaded file with the 
appropriate application.

Original issue reported on code.google.com by [email protected] on 17 Jan 2012 at 9:53

bq traceback: ImportError: cannot import name anyjson

Install bigquery-2.0.2.tar.gz, either with easy_install or setup.py, then 
run bq.

I expect to see usage, I suppose. Instead I see a traceback:

Traceback (most recent call last):
  File "/usr/local/bin/bq", line 9, in <module>
    load_entry_point('bigquery==2.0.2', 'console_scripts', 'bq')()
  File "/usr/lib/python2.6/dist-packages/pkg_resources.py", line 305, in load_entry_point
    return get_distribution(dist).load_entry_point(group, name)
  File "/usr/lib/python2.6/dist-packages/pkg_resources.py", line 2244, in load_entry_point
    return ep.load()
  File "/usr/lib/python2.6/dist-packages/pkg_resources.py", line 1954, in load
    entry = __import__(self.module_name, globals(),globals(), ['__name__'])
  File "/usr/local/lib/python2.6/dist-packages/bigquery-2.0.2-py2.6.egg/bq.py", line 24, in <module>
    from apiclient import anyjson
ImportError: cannot import name anyjson

bigquery-2.0.2 on Ubuntu 11.10
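The ImportError means the installed google-api-python-client no longer ships apiclient.anyjson, which existed only to pick a JSON backend. A minimal local shim can be sketched as follows; treating the stdlib json module as an adequate stand-in for bq's loads/dumps needs is an assumption, not part of the original report:

```python
# Hypothetical shim: anyjson only chose a JSON backend, so the stdlib json
# module can stand in when apiclient no longer provides it.
try:
    from apiclient import anyjson as backend  # present only in old client libraries
except ImportError:
    import json as backend                    # stdlib fallback with the same loads/dumps API

print(backend.loads('{"word": "raisin"}'))
```

The durable fix at the time was likely installing a google-api-python-client release that still bundled anyjson, or a bq release that dropped the import.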

Original issue reported on code.google.com by [email protected] on 9 Mar 2012 at 7:01

error when running the sample queries

573878651218> SELECT word, COUNT(word) as count FROM 
publicdata:samples.shakespeare WHERE word CONTAINS 'raisin' GROUP BY word;
You have encountered a bug in the BigQuery CLI. Please send an email to 
[email protected] to report this, and include the command
you typed as well as the following information:

Unexpected exception in query operation: String parameters can not be None.

Original issue reported on code.google.com by [email protected] on 19 Jun 2012 at 3:45

The bq command has a confusing error message when presented with a statement it doesn't support

What steps will reproduce the problem?

jeffsilverman@jeffsdesktop:~$ bq query "INSERT INTO 
Baby_names.Baby_names_1982_rev_a ( name, gender, count ) VALUES 
('Xyzzy','M',1);"
Waiting on job_8275e44b1a4245419fc1e02d03baa3f1 ... (0s) Current status: DONE   
Error in query string: Encountered "" at line 1, column 48.
Was expecting one of:

jeffsilverman@jeffsdesktop:~$ bq query "INSERT INTO 
Baby_names.Baby_names_1982_rev_a (name, gender, count ) VALUES ('Xyzzy','M',1);"
Waiting on job_dabd6eed004f4450b014a92dd351e715 ... (0s) Current status: DONE   
Error in query string: Encountered "" at line 1, column 47.
Was expecting one of:

jeffsilverman@jeffsdesktop:~$ bq query "INSERT INTO 
Baby_names.Baby_names_1982_rev_a(name, gender, count ) VALUES ('Xyzzy','M',1);"
Waiting on job_aff3af2feb3b445baf111bb241c48457 ... (0s) Current status: DONE   
Error in query string: Encountered "" at line 1, column 46.
Was expecting one of:


What is the expected output? What do you see instead?
I would expect an error message that INSERT is not supported.  Instead, I see a 
very confusing error message about an error in the query string.


What version of the product are you using? On what operating system?
This is BigQuery CLI v2.0.6
Goobuntu linux 10.04
python 2.6.5

Please provide any additional information below.
If BigQuery doesn't support UPDATE or DELETE, then those statements should also 
generate error messages.
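BigQuery's query language at the time had no INSERT/UPDATE/DELETE, so a client-side pre-check like the one the reporter asks for would give a direct message. A minimal sketch (check_supported is a hypothetical helper, not part of the bq CLI):

```python
UNSUPPORTED = ("INSERT", "UPDATE", "DELETE")

def check_supported(sql):
    """Return an explicit error string for statements BigQuery rejects, else None."""
    words = sql.strip().split(None, 1)
    keyword = words[0].upper() if words else ""
    if keyword in UNSUPPORTED:
        return "Error: %s statements are not supported by BigQuery." % keyword
    return None
```

Running it on the reporter's query would yield "Error: INSERT statements are not supported by BigQuery." instead of the parser's confusing Encountered "" message.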


Original issue reported on code.google.com by [email protected] on 25 Jul 2012 at 11:44

error: Installed distribution httplib2 0.7.2 conflicts with requirement httplib2>=0.7.4 under Python 2.6.5

What steps will reproduce the problem?
1. Attempt to install BigQuery 2.0.6 with the command sudo python setup.py 
install


What is the expected output? What do you see instead?
Installation fails with the error message:
error: Installed distribution httplib2 0.7.2 conflicts with requirement 
httplib2>=0.7.4


What version of the product are you using? On what operating system?
2.0.6.  linux goobuntu 10.04 and Python 2.6.5

Please provide any additional information below.
The documentation says that Python 2.6.5 is acceptable.  It should say that 
Python 2.6.5 is acceptable, but that you have to update httplib2 to version 
0.7.4 first.

To upgrade to version 0.7.4, download it from 
http://pypi.python.org/pypi/httplib2/ and install it with the command
sudo python setup.py install

That will install httplib2 0.7.4 into 
/usr/local/lib/python2.6/dist-packages/httplib2-0.7.4-py2.6.egg.  Then 
installing the BigQuery command line tool will work properly.

The documentation should be updated to show this.
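The failing check is just an ordered version comparison. A stdlib sketch of what setup.py is effectively doing (version_tuple is a hypothetical helper that handles simple x.y.z releases only; tuples are used because comparing the raw strings would rank "0.10" below "0.7"):

```python
def version_tuple(v):
    """'0.7.2' -> (0, 7, 2); simple dotted releases only."""
    return tuple(int(part) for part in v.split("."))

installed, required = "0.7.2", "0.7.4"
if version_tuple(installed) < version_tuple(required):
    print("httplib2 %s is too old; need >= %s" % (installed, required))
```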

Original issue reported on code.google.com by [email protected] on 24 Jul 2012 at 8:36

bq load -F doesn't recognize '\t' and other escape sequences as single characters

Create a tab (\t) separated value file. I used the attached test.csv.

Try to load into a new table with the bq load -F switch:

$ bq load -F '\t' tests.test test.csv a,b,c
BigQuery error in load operation: Field delimiter must be a single character.

I don't receive the error however if I use comma delimiters and -F ','

I expect '\t' to be treated as a single character, but it is not.

bigquery-2.0.2 on Ubuntu 11.10
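The root cause here is shell quoting: inside single quotes, a POSIX shell passes '\t' through as two characters, a backslash and a t, so bq's single-character check is technically right. A quick sketch of what the tool actually receives:

```python
# What bq receives from  -F '\t'  under single quotes: a backslash plus 't',
# i.e. two characters, which is why the single-character check fails.
from_shell = "\\t"
real_tab = "\t"

print(len(from_shell), len(real_tab))
```

In bash, ANSI-C quoting (bq load -F $'\t' ...) or typing a literal Tab inside the quotes passes a real single-character tab.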

Original issue reported on code.google.com by [email protected] on 12 Mar 2012 at 11:16

Attachments:

Excel plugin API - multiple keys

What interface shows the new feature? [UI, CLI, REST API]
BigQuery Excel Connector: add keys per user / deactivate keys per user. 

Also, we would like a way to ensure that clients can only perform read 
requests and, if possible, restrict them to a dataset. This would ensure that 
clients cannot tweak the query to do malicious things.

What problem does the new feature solve?
We want clients to use the bigquery connector tool to directly look at our raw 
data. I think it will be helpful if we can generate keys based on the client's 
requirements - example give one client access for a day while another for a 
month based on the membership/contract. 

How does it solve the problem?


What are the side-effects or complications caused by this feature?
Might complicate the ease of use.

If you have a sample implementation, please add a link to the source code
or running application here:


Original issue reported on code.google.com by [email protected] on 28 May 2014 at 5:46

Bigquery metadata analysis

Can admins get a way to see details about the user who created a table? A 
detailed reporting UI alongside the billing tab would also be appreciated: 
somewhere we can see which tables use the most storage space and which tables 
most analysis is performed on. A <Dataset>.Table query is useful, but I'd 
appreciate it as a standard UI graph/chart. 

What interface shows the new feature? [UI, CLI, REST API]
Browser UI & Java API

What problem does the new feature solve?
Helps in doing company-wide reporting. Currently this is not possible, as far 
as I know. 

How does it solve the problem?


What are the side-effects or complications caused by this feature?
It might ease up looking up metadata info for tables. 

If you have a sample implementation, please add a link to the source code
or running application here:


Original issue reported on code.google.com by [email protected] on 28 May 2014 at 5:40

no support for json import in cli tools

http://googledevelopers.blogspot.com/2012/10/ and 
https://developers.google.com/bigquery/docs/import#import state that json 
import is supported, however the command line tools do not support this feature.

Original issue reported on code.google.com by [email protected] on 2 Oct 2012 at 3:36

Error during Oauth authentication

What steps will reproduce the problem?
1. Install the BigQuery tool: python setup.py install
2. OAuth authentication: bq init
3. Get the authorize_token and pass it to the command prompt

What is the expected output? What do you see instead?
The authentication process should finish without any error

What version of the product are you using? On what operating system?
Latest version
on MacOS 10.8.2

Please provide any additional information below.

The error message:
  File "build/bdist.macosx-10.8-intel/egg/bq.py", line 769, in RunSafely
    return_value = self.RunWithArgs(*args, **kwds)
  File "build/bdist.macosx-10.8-intel/egg/bq.py", line 1930, in RunWithArgs
    client = Client.Get()
  File "build/bdist.macosx-10.8-intel/egg/bq.py", line 590, in Get
    cls.client = Client.Create()
  File "build/bdist.macosx-10.8-intel/egg/bq.py", line 570, in Create
    credentials = _GetCredentialsFromFlags()
  File "build/bdist.macosx-10.8-intel/egg/bq.py", line 370, in _GetCredentialsFromFlags
    credentials = credentials_getter(storage)
  File "build/bdist.macosx-10.8-intel/egg/bq.py", line 310, in _GetCredentialsFromOAuthFlow
    credentials = oauth2client.tools.run(flow, storage)
  File "build/bdist.macosx-10.8-intel/egg/oauth2client/util.py", line 120, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "build/bdist.macosx-10.8-intel/egg/oauth2client/tools.py", line 169, in run
    credential = flow.step2_exchange(code, http=http)
  File "build/bdist.macosx-10.8-intel/egg/oauth2client/util.py", line 120, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "build/bdist.macosx-10.8-intel/egg/oauth2client/client.py", line 1131, in step2_exchange
    headers=headers)
  File "/Library/Python/2.7/site-packages/httplib2/__init__.py", line 1570, in request
    (response, content) = self._request(conn, authority, uri, request_uri, method, body, headers, redirections, cachekey)
  File "/Library/Python/2.7/site-packages/httplib2/__init__.py", line 1317, in _request
    (response, content) = self._conn_request(conn, request_uri, method, body, headers)
  File "/Library/Python/2.7/site-packages/httplib2/__init__.py", line 1252, in _conn_request
    conn.connect()
  File "/Library/Python/2.7/site-packages/httplib2/__init__.py", line 1021, in connect
    self.disable_ssl_certificate_validation, self.ca_certs)
  File "/Library/Python/2.7/site-packages/httplib2/__init__.py", line 80, in _ssl_wrap_socket
    cert_reqs=cert_reqs, ca_certs=ca_certs)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/ssl.py", line 372, in wrap_socket
    ciphers=ciphers)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/ssl.py", line 132, in __init__
    ciphers)
========================================

Unexpected exception in init operation: [Errno 185090050] _ssl.c:340: 
error:0B084002:x509 certificate routines:X509_load_cert_crl_file:system lib
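The X509_load_cert_crl_file failure usually means the CA bundle that Python/OpenSSL tries to load is missing or unreadable. A diagnostic sketch using the modern ssl API (shown here as a general technique, not the exact 2013 stack):

```python
# Show where this Python/OpenSSL looks for CA certificates; if the bundle is
# missing or unreadable, SSL handshakes fail like the traceback above.
import os
import ssl

paths = ssl.get_default_verify_paths()
for candidate in (paths.cafile, paths.capath):
    if candidate:
        print(candidate, "readable:", os.access(candidate, os.R_OK))
```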

Original issue reported on code.google.com by [email protected] on 22 Aug 2013 at 5:03


BigQuery - Enable terminating jobs from the browser

What interface shows the new feature? [UI, CLI, REST API]
UI

What problem does the new feature solve?
We often run jobs that take 500-5000 seconds to complete. If we accidentally 
start a query, there is no way to abort it without waiting for it to finish and 
then deleting the results. It is a lot of overhead; plus, given that BQ limits 
the number of simultaneous query jobs ("message":"Exceeded quota: too many 
concurrent query jobs for this project","reason":"quotaExceeded"), this is a 
wait that we wish to avoid. 

How does it solve the problem?
Same as above. 

What are the side-effects or complications caused by this feature?
Based on how BQ is implemented, I'm not sure if this is possible. Might have 
issues with calculating cost of the query. 

If you have a sample implementation, please add a link to the source code
or running application here:


Original issue reported on code.google.com by [email protected] on 14 Apr 2014 at 3:36

Jobs fail with "Unexpected. Please try later" message.

What steps will reproduce the problem?
1. Jobs fail with unknown error. "Unexpected. Please try later"
2. Job id: job_R1ilG5M5NMVqRAG4l_y0Bstju-s
3.

What is the expected output? What do you see instead?
Completed job or a better error message. 

What version of the product are you using? On what operating system?


Please provide any additional information below.


Original issue reported on code.google.com by [email protected] on 16 Apr 2014 at 2:01

Latest bq (2.0.14) not compatible with latest gsutil (3.34)

It looks like bq (bigquery 2.0.14) requires google-api-python-client == 1.0, 
whereas gsutil (3.34) requires google-api-python-client >= 1.1.

This is a problem since both bq and gsutil are commonly used together (e.g. 
https://docs.google.com/document/d/15TZ7p-NSWWC3ZpvDXba0sM2W9X_8h8pn7q2ha0GicHM/edit)


1. Run the following in terminal (OS X Lion 10.8.4):

pip install bigquery
pip install gsutil
bq

results in an error:
Traceback (most recent call last):
  File "/usr/local/bin/bq", line 5, in <module>
    from pkg_resources import load_entry_point
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 2607, in <module>
    parse_requirements(__requires__), Environment()
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 565, in resolve
    raise DistributionNotFound(req)  # XXX put more info here
pkg_resources.DistributionNotFound: google-api-python-client==1.0

2. Uninstalling and installing bigquery again fixes it, but gsutil then becomes 
broken:

pip uninstall bigquery
pip install bigquery
gsutil

results in:
Traceback (most recent call last):
  File "/usr/local/bin/gsutil", line 5, in <module>
    from pkg_resources import load_entry_point
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 2607, in <module>
    parse_requirements(__requires__), Environment()
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 565, in resolve
    raise DistributionNotFound(req)  # XXX put more info here
pkg_resources.DistributionNotFound: google-api-python-client>=1.1
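Because bq pins google-api-python-client == 1.0 while gsutil wants >= 1.1, the two cannot share one site-packages. Isolating each tool in its own environment sidesteps the conflict; a sketch using the modern venv module (the era-appropriate equivalent was the virtualenv package):

```python
# Create two throwaway environments so bq's pin and gsutil's minimum never
# meet in the same site-packages; each tool would be pip-installed into its own.
import os
import tempfile
import venv

root = tempfile.mkdtemp()
for name in ("bq-env", "gsutil-env"):
    venv.create(os.path.join(root, name), with_pip=False)

print(sorted(os.listdir(root)))   # -> ['bq-env', 'gsutil-env']
```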


Original issue reported on code.google.com by [email protected] on 10 Aug 2013 at 4:37

Cannot find Bigquery.builder()

What steps will reproduce the problem?
1. Download google-bigquery-tools-34ff001fdbc0.zip
2. Import files to Eclipse
3. Eclipse throws an error on the Bigquery.builder() function: function not found. 

What is the expected output? What do you see instead?
Able to compile.

What version of the product are you using? On what operating system?
google-bigquery-tools-34ff001fdbc0

Please provide any additional information below.

Original issue reported on code.google.com by [email protected] on 28 Oct 2013 at 7:22

Random error: java.io.IOException: mark/reset

What steps will reproduce the problem?
1. calling bigquery.insert with local csv file
2.
3.

What is the expected output? What do you see instead?

java.lang.RuntimeException: java.io.IOException: mark/reset not supported
    at com.google.api.client.googleapis.MediaExponentialBackOffPolicy.getNextBackOffMillis(MediaExponentialBackOffPolicy.java:58)
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:833)
    at com.google.api.client.googleapis.MediaHttpUploader.upload(MediaHttpUploader.java:208)
    at com.google.api.services.bigquery.Bigquery$Jobs$Insert.executeUnparsed(Bigquery.java:1419)
    at com.google.api.services.bigquery.Bigquery$Jobs$Insert.execute(Bigquery.java:1441)
    at org.terracotta.bq.BigqueryHandler.upload(BigqueryHandler.java:133)
    at org.terracotta.kit.reflector.BigqueryUploader.doRun(BigqueryUploader.java:62)
    at org.terracotta.kit.reflector.ProcessingThread.run(ProcessingThread.java:21)
Caused by: java.io.IOException: mark/reset not supported
    at java.io.InputStream.reset(InputStream.java:334)
    at com.google.api.client.googleapis.MediaHttpUploader.serverErrorCallback(MediaHttpUploader.java:322)
    at com.google.api.client.googleapis.MediaExponentialBackOffPolicy.getNextBackOffMillis(MediaExponentialBackOffPolicy.java:55)
    ... 7 more


What version of the product are you using? On what operating system?

google-api-services-bigquery-v2-1.4.0-beta

Please provide any additional information below.

It's odd that this IOException happened out of nowhere when the process had 
been running for days. What's more unexpected is that it's wrapped in a 
RuntimeException. 

Original issue reported on code.google.com by [email protected] on 12 Sep 2012 at 6:04

Incorrect file permissions on the bigquery.egg-info directory and the files under it in release bigquery-2.0.12.tar

What steps will reproduce the problem?
1. install bigquery-2.0.12 into a system as root
2. try to install any other python module that uses setuptools/distribute
3.

What is the expected output? What do you see instead?

I have to do this before installing to remove the problem:

  chmod 755 bigquery.egg-info
  chmod 644 bigquery.egg-info/*

What version of the product are you using? On what operating system?

2.0.12 on mac osx 10.8 with fink

Please provide any additional information below.

There is something wrong with how most python modules are being released by the 
cloud team at google.

Writing 
/sw/src/fink.build/root-docutils-py27-0.10-1/sw/lib/python2.7/site-packages/docutils-0.10-py2.7.egg-info

for i in *.txt ; do
  htmldoc=${i%txt}html
  echo "Making $htmldoc"
  PYTHONPATH=.:docutils/utils tools/rst2html.py $i $htmldoc
done
Making BUGS.html
Traceback (most recent call last):
  File "tools/rst2html.py", line 23, in <module>
    publish_cmdline(writer_name='html', description=description)
  File "/sw/src/fink.build/docutils-py27-0.10-1/docutils-0.10/docutils/core.py", line 349, in publish_cmdline
    pub.set_components(reader_name, parser_name, writer_name)
  File "/sw/src/fink.build/docutils-py27-0.10-1/docutils-0.10/docutils/core.py", line 93, in set_components
    self.set_reader(reader_name, self.parser, parser_name)
  File "/sw/src/fink.build/docutils-py27-0.10-1/docutils-0.10/docutils/core.py", line 83, in set_reader
    self.reader = reader_class(parser, parser_name)
  File "/sw/src/fink.build/docutils-py27-0.10-1/docutils-0.10/docutils/readers/__init__.py", line 52, in __init__
    self.set_parser(parser_name)
  File "/sw/src/fink.build/docutils-py27-0.10-1/docutils-0.10/docutils/readers/__init__.py", line 63, in set_parser
    parser_class = parsers.get_parser_class(parser_name)
  File "/sw/src/fink.build/docutils-py27-0.10-1/docutils-0.10/docutils/parsers/__init__.py", line 52, in get_parser_class
    module = __import__(parser_name, globals(), locals(), level=1)
  File "/sw/src/fink.build/docutils-py27-0.10-1/docutils-0.10/docutils/parsers/rst/__init__.py", line 75, in <module>
    from docutils.parsers.rst import states
  File "/sw/src/fink.build/docutils-py27-0.10-1/docutils-0.10/docutils/parsers/rst/states.py", line 120, in <module>
    from docutils.parsers.rst import directives, languages, tableparser, roles
  File "/sw/src/fink.build/docutils-py27-0.10-1/docutils-0.10/docutils/parsers/rst/roles.py", line 78, in <module>
    from docutils.utils.code_analyzer import Lexer, LexerError
  File "/sw/src/fink.build/docutils-py27-0.10-1/docutils-0.10/docutils/utils/code_analyzer.py", line 13, in <module>
    from pygments.lexers import get_lexer_by_name
  File "/sw/lib/python2.7/site-packages/pygments/lexers/__init__.py", line 18, in <module>
    from pygments.plugin import find_plugin_lexers
  File "/sw/lib/python2.7/site-packages/pygments/plugin.py", line 39, in <module>
    import pkg_resources
  File "/sw/lib/python2.7/site-packages/pkg_resources.py", line 2825, in <module>
    add_activation_listener(lambda dist: dist.activate())
  File "/sw/lib/python2.7/site-packages/pkg_resources.py", line 710, in subscribe
    callback(dist)
  File "/sw/lib/python2.7/site-packages/pkg_resources.py", line 2825, in <lambda>
    add_activation_listener(lambda dist: dist.activate())
  File "/sw/lib/python2.7/site-packages/pkg_resources.py", line 2257, in activate
    self.insert_on(path)
  File "/sw/lib/python2.7/site-packages/pkg_resources.py", line 2364, in insert_on
    self.check_version_conflict()
  File "/sw/lib/python2.7/site-packages/pkg_resources.py", line 2403, in check_version_conflict
    for modname in self._get_metadata('top_level.txt'):
  File "/sw/lib/python2.7/site-packages/pkg_resources.py", line 2251, in _get_metadata
    for line in self.get_metadata_lines(name):
  File "/sw/lib/python2.7/site-packages/pkg_resources.py", line 1219, in get_metadata_lines
    return yield_lines(self.get_metadata(name))
  File "/sw/lib/python2.7/site-packages/pkg_resources.py", line 1211, in get_metadata
    return self._get(self._fn(self.egg_info,name))
  File "/sw/lib/python2.7/site-packages/pkg_resources.py", line 1326, in _get
    stream = open(path, 'rb')
IOError: [Errno 13] Permission denied: 
'/sw/lib/python2.7/site-packages/bigquery-2.0.12-py2.7.egg-info/top_level.txt'
### execution of /tmp/fink.Hdbk0 failed, exit code 1
### execution of /tmp/fink.eG47A failed, exit code 1
Removing runtime build-lock...
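Unreadable egg-info files like this typically come from building or installing the release under a restrictive umask as root. A small sketch of how the umask shapes the mode of newly created files (the 0o077 value is an assumption about the packager's environment):

```python
# Files are created with mode 0o666 masked by the process umask, so a build
# done under umask 0o077 yields metadata that group/other cannot read --
# exactly the Permission denied symptom above.
import os
import stat
import tempfile

old_umask = os.umask(0o077)                 # restrictive umask, as a root build might use
path = os.path.join(tempfile.mkdtemp(), "top_level.txt")
with open(path, "w") as f:
    f.write("bq\n")
mode = stat.S_IMODE(os.stat(path).st_mode)
os.umask(old_umask)                         # restore the previous umask

print(oct(mode))                            # 0o600: only the owner can read it
```

Building with umask 022 (or running the chmod commands above after install) avoids the problem.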

Original issue reported on code.google.com by [email protected] on 19 May 2013 at 12:49

Schema files on Windows

What steps will reproduce the problem?
1. Using Windows, load bq tools in shell mode
2. Attempt to load a file (I'm using Google Cloud Storage) into BigQuery with a 
locally specified Schema file
3. The system will fail very quickly with a Bad Request

What is the expected output? What do you see instead?
The schema should be loaded and used.

What version of the product are you using? On what operating system?
BigQuery 2.0.3 on Windows 7

Please provide any additional information below.
The culprit seems to be here: 
http://code.google.com/p/google-bigquery-tools/source/browse/bq/bigquery_client.py?r=2bc09d8c00539d054091298910286f6329bdfe7f#700

A file path of C:\MySchema.txt will be interpreted as a schema in itself, not 
as a JSON file containing a schema.
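A more robust disambiguation than guessing from the string's shape is to check for an existing file first, so a Windows path like C:\MySchema.txt is never mistaken for an inline schema. A sketch (parse_schema is hypothetical, not the actual bq implementation):

```python
# Sketch: prefer "is this a real file?" over string-shape heuristics.
import json
import os

def parse_schema(arg):
    if os.path.isfile(arg):                  # a real file wins, on any OS
        with open(arg) as f:
            return json.load(f)
    # otherwise treat it as inline "name:type,name:type" shorthand
    return [dict(zip(("name", "type"), field.split(":")))
            for field in arg.split(",")]

print(parse_schema("word:STRING,count:INTEGER"))
```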

Original issue reported on code.google.com by [email protected] on 23 Apr 2012 at 3:14

job does not really terminate when --max_bad_records is reached; it continues on for a bit

I'm filing this as a defect because it just seems wrong. I am bq load'ing tsv 
files with millions of lines. It takes a long time to process such files.

I had a job that failed because "Too many errors encountered. Limit is: 0." It 
took a long time to receive this information.

However, looking at --apilog I see that the same error occurred several hundred 
times, so it appears that the max_bad_records condition isn't checked until 
after all or many errors are counted, rather than actively during the 
collection of said errors. Is that right?

Had my job failed on line 171 instead of continuing on to report line 20863 as 
invalid for the same reason, I could have corrected the error quickly and 
resubmitted the job.

This behavior was realized using BigQuery CLI v2.0.3 on an ubuntu installation.

Here is a snippet from --apilog that demonstrates what is disturbing to 
me:

   {
    "reason": "invalid",
    "location": "Line:20648 / Field:5",
    "message": "Value cannot be converted to expected type (check schema): field starts with: \u003cnull\u003e"
   },
   {
    "reason": "invalid",
    "location": "Line:20863 / Field:5",
    "message": "Value cannot be converted to expected type (check schema): field starts with: \u003cnull\u003e"
   },
   {
    "reason": "invalid",
    "message": "Too many errors encountered. Limit is: 0."

Here are two reasons to have failed the job; why did it even bother to arrive 
at the second case if the bad record limit is 0? Thanks.
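The behaviour the reporter expects can be sketched as a fail-fast loop: stop at the first error past the limit instead of collecting hundreds more. (load_rows and validate are hypothetical names for illustration, not BigQuery internals.)

```python
# Fail as soon as the bad-record budget is exhausted, reporting the line that
# broke it, instead of scanning the remaining millions of rows.
def load_rows(rows, validate, max_bad_records=0):
    errors = []
    for lineno, row in enumerate(rows, 1):
        try:
            validate(row)
        except ValueError as exc:
            errors.append((lineno, str(exc)))
            if len(errors) > max_bad_records:
                raise ValueError(
                    "line %d: %s (limit of %d bad records exceeded)"
                    % (lineno, exc, max_bad_records))
    return errors
```

With max_bad_records=0 this job would have died at line 171, letting the reporter fix and resubmit quickly.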

Original issue reported on code.google.com by [email protected] on 6 Apr 2012 at 5:13

bq init not working

What steps will reproduce the problem?
1. installed bigquery-2.0.0.zip on Mac OS X 10.6
2. bq help -> okay
3. bq init 

What is the expected output? What do you see instead?
    It should give me oauth2 url for authorizing client.  Instead I get the following error:

    FATAL Command 'init' unknown
    Run 'bq help' to get help

What version of the product are you using? On what operating system?
    bigquery-2.0.0.zip
    Mac OS X 10.6.8
    Python 2.6.6 (r266:84374, Aug 31 2010, 11:00:51) 
    [GCC 4.0.1 (Apple Inc. build 5493)] on darwin

Original issue reported on code.google.com by [email protected] on 11 Nov 2011 at 11:40

Query default max_rows of sys.maxint does not work on 64bit python

What steps will reproduce the problem?
1. job = client.Query('SELECT mmsi FROM [strata_sq_pos.pos123] GROUP BY mmsi;')
2. dest_table = job['configuration']['query']['destinationTable']
3. fields, rows = client.ReadSchemaAndRows(dest_table)


What is the expected output? What do you see instead?

I would expect to get all the rows back.  I would expect the default parameter 
for max_rows to work.  Instead I get:

BigqueryServiceError: Invalid unsigned integer value: '9223372036854775807'.

max_rows=sys.maxint is not a good default on 64bit python.


What version of the product are you using? On what operating system?

bigquery 2.0.12 in a virtualenv installed with pip on Mac OSX 10.8.2 with 
python 2.7.3 from fink.

Please provide any additional information below.
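The obvious client-side fix is to clamp the default to something the API's unsigned-integer parser accepts. The 2**31 - 1 cap below is an assumption; the report only shows that 2**63 - 1 (sys.maxint on 64-bit builds) is rejected:

```python
# Clamp the client-side default; sys.maxint on 64-bit Python is 2**63 - 1,
# which the server rejects as an invalid unsigned integer.
def safe_max_rows(requested):
    API_LIMIT = 2**31 - 1   # assumption: a 32-bit-safe ceiling the API accepts
    return min(requested, API_LIMIT)

print(safe_max_rows(2**63 - 1))   # -> 2147483647
```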


Original issue reported on code.google.com by [email protected] on 23 Feb 2013 at 8:31

load failure should be more descriptive

Currently, if an import fails, all you get are messages like:

Failure details:
- Expected '"' found 'n'


It would be better if the error messages were like the ones on the web tool 
(including line and column):
Line:1 / Column:120, Expected '"' found 'n'
Too many errors encountered. Limit is: 0.
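The API's error objects already carry a location field (see the --apilog entries in the issue above), so the CLI only needs to print it. A sketch (format_error is a hypothetical helper):

```python
# Prefix the message with its location when the API supplies one, matching
# the web tool's "Line:1 / Column:120, ..." style.
def format_error(err):
    loc = err.get("location")
    msg = err.get("message", "")
    return "%s, %s" % (loc, msg) if loc else msg

sample = {"reason": "invalid", "location": "Line:1 / Column:120",
          "message": "Expected '\"' found 'n'"}
print(format_error(sample))   # -> Line:1 / Column:120, Expected '"' found 'n'
```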

Original issue reported on code.google.com by sebastian.serrano on 5 Nov 2012 at 4:52
