This library has moved to https://github.com/googleapis/google-cloud-python/tree/main/packages/google-cloud-monitoring
License: Apache License 2.0
python-monitoring's Introduction
Google APIs
This repository contains the original interface definitions of public
Google APIs that support both REST and gRPC protocols. Reading the
original interface definitions can provide a better understanding of
Google APIs and help you to utilize them more efficiently. You can also
use these definitions with open source tools to generate client
libraries, documentation, and other artifacts.
Building
Bazel
The recommended way to build the API client libraries is through
Bazel >= 4.2.2.
Bazel packages exist for all of the libraries: Java, Go, Python, Ruby, Node.js, PHP, and C#.
Overview
Google APIs are typically deployed as API services that are hosted
under different DNS names. One API service may implement multiple APIs
and multiple versions of the same API.
Google APIs use Protocol Buffers
version 3 (proto3) as their Interface Definition Language (IDL) to
define the API interface and the structure of the payload messages. The
same interface definition is used for both REST and RPC versions of the
API, which can be accessed over different wire protocols.
There are several ways of accessing Google APIs:
JSON over HTTP: You can access all Google APIs directly using JSON
over HTTP, using
Google API client library
or third-party API client libraries.
Protocol Buffers over gRPC: You can access Google APIs published
in this repository through gRPC, which is
a high-performance binary RPC protocol over HTTP/2. It offers many
useful features, including request/response multiplexing and full-duplex
streaming.
Google Cloud Client Libraries:
You can use these libraries to access Google Cloud APIs. They are based
on gRPC for better performance and provide an idiomatic client surface for
a better developer experience.
This repository uses a directory hierarchy that reflects the Google
API product structure. In general, every API has its own root
directory, and each major version of the API has its own subdirectory.
The proto package names exactly match the directory: this makes it
easy to locate the proto definitions and ensures that the generated
client libraries have idiomatic namespaces in most programming
languages. Alongside the API directories live the configuration files
for the GAPIC toolkit.
NOTE: The major version of an API is used to indicate breaking
changes to the API.
Generate gRPC Source Code
To generate gRPC source code for Google APIs in this repository, you
first need to install both Protocol Buffers and gRPC on your local
machine, then you can run make LANGUAGE=xxx all to generate the
source code. You need to integrate the generated source code into
your application build system.
NOTE: The Makefile is only intended to generate source code for the
entire repository. It is not for generating a linkable client library
for a specific API. Please see the other repositories under
https://github.com/googleapis for generating linkable client libraries.
Go gRPC Source Code
It is difficult to generate Go gRPC source code from this repository,
since Go uses a different directory structure.
Please use this repository instead.
series = monitoring_v3.TimeSeries()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'module' object has no attribute 'TimeSeries'
I am getting this error while trying to create time series data.
Please help with this.
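For reference, older google-cloud-monitoring releases (before 2.0) expose the class only as monitoring_v3.types.TimeSeries, while 2.x exposes monitoring_v3.TimeSeries at the top level, which is usually the cause of this AttributeError. A version-tolerant lookup can be sketched like this (resolve_time_series is a hypothetical helper, not part of the library):

```python
def resolve_time_series(module):
    """Return the TimeSeries class from either library layout.

    Hypothetical helper: google-cloud-monitoring >= 2.0 exposes TimeSeries
    at the top level of monitoring_v3; older releases expose it under
    monitoring_v3.types. This tries both.
    """
    ts = getattr(module, "TimeSeries", None)
    if ts is None:
        types_mod = getattr(module, "types", None)
        ts = getattr(types_mod, "TimeSeries", None) if types_mod else None
    if ts is None:
        raise AttributeError(
            "TimeSeries not found; check your google-cloud-monitoring version")
    return ts

# usage (assumes the library is installed):
# from google.cloud import monitoring_v3
# series = resolve_time_series(monitoring_v3)()
```

Checking the installed version (pip show google-cloud-monitoring) tells you which layout you have.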
state = <grpc._channel._RPCState object at 0x7f1bcf73a128>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f1bcf725588>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626167412.739682685","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >
Traceback (most recent call last):
File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 73, in error_remapped_callable
return callable_(*args, **kwargs)
File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/grpc/_channel.py", line 946, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/grpc/_channel.py", line 849, in _end_unary_response_blocking
raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INVALID_ARGUMENT
details = "One or more errors parsing the query."
debug_error_string = "{"created":"@1618853686.419251000","description":"Error received from peer ipv6:[2a00:1450:4007:807::200a]:443","file":"src/core/lib/surface/call.cc","file_line":1068,"grpc_message":"One or more errors parsing the query.","grpc_status":3}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/ocervello/.virtualenvs/sabre/bin/promql-convert", line 33, in <module>
sys.exit(load_entry_point('promql-to-mql', 'console_scripts', 'promql-convert')())
File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/click-8.0.0rc1-py3.8.egg/click/core.py", line 1041, in __call__
return self.main(*args, **kwargs)
File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/click-8.0.0rc1-py3.8.egg/click/core.py", line 971, in main
rv = self.invoke(ctx)
File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/click-8.0.0rc1-py3.8.egg/click/core.py", line 1309, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/click-8.0.0rc1-py3.8.egg/click/core.py", line 715, in invoke
return callback(*args, **kwargs)
File "/Users/ocervello/Workspace/dev/promgrafana-cloudops/promql_to_mql/cli.py", line 164, in convert
result = test_mql_expression(expression)
File "/Users/ocervello/Workspace/dev/promgrafana-cloudops/promql_to_mql/test.py", line 51, in test_mql_expression
result = client.query_time_series(request)
File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/google/cloud/monitoring_v3/services/query_service/client.py", line 381, in query_time_series
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
return wrapped_func(*args, **kwargs)
File "/Users/ocervello/.virtualenvs/sabre/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 75, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
google.api_core.exceptions.InvalidArgument: 400 One or more errors parsing the query.
Note that this query works in the Metrics Explorer Query editor, so not sure what's going on here.
state = <grpc._channel._RPCState object at 0x7f8c62145d30>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f8c62183800>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1621681575.665381091","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >
.nox/py-3-8/lib/python3.8/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:78: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1125: in create_time_series
rpc(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]
state = <grpc._channel._RPCState object at 0x7fc4faeb8eb0>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fc4faf08280>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626174412.869136579","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7fc4faf48b80>
def test_list_metric_descriptors(capsys):
snippets.list_metric_descriptors(PROJECT_ID)
snippets_test.py:71:
snippets.py:213: in list_metric_descriptors
for descriptor in client.list_metric_descriptors(name=project_name):
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:636: in list_metric_descriptors
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:285: in retry_wrapped_func
return retry_target(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:188: in retry_target
return target()
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
Package name: google-cloud-monitoring
Current release: alpha
Proposed release: GA
Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
Required
28 days elapsed since last alpha/beta release with new API surface
Server API is GA
Package API is stable, and we can commit to backward compatibility
All dependencies are GA
Optional
Most common / important scenarios have descriptive samples
Public manual methods have at least one usage sample each (excluding overloads)
Per-API README includes a full description of the API
Per-API README contains at least one "getting started" sample using the most common API scenario
Manual code has been reviewed by API producer
Manual code has been reviewed by a DPE responsible for samples
'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
capsys = <_pytest.capture.CaptureFixture object at 0x7fa0c7aa5810>
pochan =
@pytest.mark.flaky(rerun_filter=delay_on_aborted, max_runs=5)
def test_enable_alert_policies(capsys, pochan):
# These sleep calls are for mitigating the following error:
# "409 Too many concurrent edits to the project configuration.
# Please try again."
# Having multiple projects will void these `sleep()` calls.
# See also #3310
time.sleep(2)
snippets.enable_alert_policies(pochan.project_name, True)
out, _ = capsys.readouterr()
assert (
"Enabled {0}".format(pochan.project_name) in out
or "{} is already enabled".format(pochan.alert_policy.name) in out
)
time.sleep(2)
snippets.enable_alert_policies(pochan.project_name, False)
out, _ = capsys.readouterr()
assert (
"Disabled {}".format(pochan.project_name) in out
or "{} is already disabled".format(pochan.alert_policy.name) in out
)
E assert ('Disabled projects/python-docs-samples-tests-py37' in 'Policy projects/python-docs-samples-tests-py37/alertPolicies/14818226130155215502 is already disabled\nPolicy projects/python-docs-samples-tests-py37/alertPolicies/2404774893716709115 is already disabled\n' or 'projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319 is already disabled' in 'Policy projects/python-docs-samples-tests-py37/alertPolicies/14818226130155215502 is already disabled\nPolicy projects/python-docs-samples-tests-py37/alertPolicies/2404774893716709115 is already disabled\n')
E + where 'Disabled projects/python-docs-samples-tests-py37' = <built-in method format of str object at 0x7fa0c9cebcf0>('projects/python-docs-samples-tests-py37')
E + where <built-in method format of str object at 0x7fa0c9cebcf0> = 'Disabled {}'.format
E + and 'projects/python-docs-samples-tests-py37' = <snippets_test.PochanFixture object at 0x7fa0c7bb5350>.project_name
E + and 'projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319 is already disabled' = <built-in method format of str object at 0x7fa0c9c00120>('projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319')
E + where <built-in method format of str object at 0x7fa0c9c00120> = '{} is already disabled'.format
E + and 'projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319' = name: "projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319"\ndisplay_name: "snippets-test-ugrikrch..."projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319/conditions/486425858163223454"\n}\nenabled {\n}\n.name
E + where name: "projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319"\ndisplay_name: "snippets-test-ugrikrch..."projects/python-docs-samples-tests-py37/alertPolicies/486425858163221319/conditions/486425858163223454"\n}\nenabled {\n}\n = <snippets_test.PochanFixture object at 0x7fa0c7bb5350>.alert_policy
state = <grpc._channel._RPCState object at 0x7f60843c4510>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f6085c365f0>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626170607.662749462","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >
Is your feature request related to a problem? Please describe.
It would be very useful to have the ability to filter based on user labels, e.g. a label that we have added to a GKE container that ends up in the user_labels section in Stackdriver. This kind of filtering is possible via the UI/API AFAIK, using the metadata.user_labels field.
Describe the solution you'd like
Exposing this field to the Python SDK
Describe alternatives you've considered
Using the API directly, but it's always better to have this functionality in the SDK.
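Until such filtering is surfaced in the client, the filter string can be assembled by hand; the Cloud Monitoring filter language accepts metadata.user_labels."key"="value" clauses. A small sketch (user_label_filter is a hypothetical helper, and the metric type below is only an example):

```python
def user_label_filter(metric_type, user_labels):
    """Build a Cloud Monitoring filter string restricting on user labels.

    Hypothetical helper: combines a metric.type restriction with
    metadata.user_labels clauses, which the filter language accepts.
    """
    parts = ['metric.type = "{}"'.format(metric_type)]
    for key, value in sorted(user_labels.items()):
        parts.append('metadata.user_labels."{}" = "{}"'.format(key, value))
    return " AND ".join(parts)
```

The resulting string can then be passed as the filter argument of list_time_series.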
I see different results for the same query (all parameter values are fixed), which I can't currently explain. Any idea why this can happen? Is this expected?
Execute the provided example function multiple times
You will see different results even though the query uses interval with fixed start/end time
Code example
from google.cloud import monitoring_v3
import time


def list_time_series_aggregate(project_id):
    # [START monitoring_read_timeseries_align]
    client = monitoring_v3.MetricServiceClient()
    project_name = client.project_path(project_id)
    interval = monitoring_v3.types.TimeInterval()
    # fixed point in time, offset 7 days
    interval.end_time.seconds = 1607634588
    interval.end_time.nanos = 836217000
    interval.start_time.seconds = 1607029788
    interval.start_time.nanos = 836217000
    aggregation = monitoring_v3.types.Aggregation()
    aggregation.alignment_period.seconds = 604800  # 7 days
    aggregation.per_series_aligner = (
        monitoring_v3.enums.Aggregation.Aligner.ALIGN_SUM)
    aggregation.cross_series_reducer = (
        monitoring_v3.enums.Aggregation.Reducer.REDUCE_SUM)
    results = client.list_time_series(
        project_name,
        'metric.type="compute.googleapis.com/instance/cpu/utilization"',
        interval,
        monitoring_v3.enums.ListTimeSeriesRequest.TimeSeriesView.FULL,
        aggregation)
    print(list(results)[0].points[0].value.double_value)
Traceback (most recent call last):
File "bug.py", line 1, in <module>
from opentelemetry.exporter.cloud_monitoring import CloudMonitoringMetricsExporter
File "/Users/ant/t/opentelemetry-bug/opentelemetry-bug/lib/python3.8/site-packages/opentelemetry/exporter/cloud_monitoring/__init__.py", line 9, in <module>
from google.cloud.monitoring_v3.proto.metric_pb2 import TimeSeries
ModuleNotFoundError: No module named 'google.cloud.monitoring_v3.proto.metric_pb2'
What is the expected behavior?
Not fail
What is the actual behavior?
Exception
Additional context
My guess is that the issue is due to the release google-cloud-monitoring 2.0.0 on 2020-10-06 (https://pypi.org/project/google-cloud-monitoring/). I tried to pin the version to <2 in the virtual environment and it helped:
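The pin can be recorded in requirements (a stopgap, not a fix; the long-term fix is updating the exporter to the 2.x import paths):

```text
# requirements.txt -- temporary pin until imports are updated for 2.x
google-cloud-monitoring<2.0.0
```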
Is your feature request related to a problem? Please describe.
Currently the exposed API has only MetricServiceClient, which allows writing metrics. The problem is that this client is synchronous, and each call takes ~40ms to complete, which can be critical where performance is needed. To solve this problem it's necessary to create some sort of wrapper (e.g. similar to what the logging library does).
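Until an asynchronous client is available, one stopgap is a background-thread wrapper that batches writes off the hot path. A stdlib-only sketch (BufferedMetricWriter is hypothetical; client is assumed to expose create_time_series(name, time_series) like MetricServiceClient, but here it can be any object with that method):

```python
import queue
import threading


class BufferedMetricWriter:
    """Hypothetical wrapper that moves create_time_series calls off the hot path.

    write() enqueues a series and returns immediately; a daemon thread
    drains the queue and sends batches (the API accepts up to 200 series
    per call).
    """

    def __init__(self, client, project_name, max_batch=200):
        self._client = client
        self._project_name = project_name
        self._max_batch = max_batch
        self._queue = queue.Queue()
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def write(self, series):
        """Enqueue a TimeSeries; does not block on the network."""
        self._queue.put(series)

    def flush(self):
        """Block until everything enqueued so far has been sent."""
        self._queue.join()

    def _drain(self):
        while True:
            # Block for the first item, then opportunistically batch more.
            batch = [self._queue.get()]
            while len(batch) < self._max_batch:
                try:
                    batch.append(self._queue.get_nowait())
                except queue.Empty:
                    break
            self._client.create_time_series(self._project_name, batch)
            for _ in batch:
                self._queue.task_done()
```

This trades delivery guarantees for latency (a crash can lose buffered points), which is often acceptable for metrics.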
The monitoring client is trying to be too smart and starts appending other filter expressions like:
Field filter had an invalid value of "metric.type = "" AND select_slo_burn_rate("projects/1063791683888/services/ist:bmenasha-anthos-cert-labs-1-zone-us-central1-b-central-default-reviews/serviceLevelObjectives/reviews-availability-slo", 30d)": Time series selectors cannot be used with metric type restrictions."
See google/cloud/monitoring_v3/query.py, which always prefixes the filter with metric.type:
Traceback (most recent call last):
File "justwork.py", line 12, in <module>
for time_series in query:
File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/cloud/monitoring_v3/query.py", line 438, in __iter__
for ts in self._client.list_time_series(**params):
File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/page_iterator.py", line 204, in _items_iter
for page in self._page_iter(increment=False):
File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/page_iterator.py", line 235, in _page_iter
page = self._next_page()
File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/page_iterator.py", line 526, in _next_page
response = self._method(self._request)
File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
return wrapped_func(*args, **kwargs)
File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/retry.py", line 277, in retry_wrapped_func
on_error=on_error,
File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/retry.py", line 182, in retry_target
return target()
File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
return func(*args, **kwargs)
File "/home/bmenasha/py-envs/get-metrics3/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
google.api_core.exceptions.InvalidArgument: 400 Field filter had an invalid value of "metric.type = "" AND select_slo_burn_rate("projects/1063791683888/services/ist:bmenasha-anthos-cert-labs-1-zone-us-central1-b-central-default-reviews/serviceLevelObjectives/reviews-availability-slo", 30d)": Time series selectors cannot be used with metric type restrictions.
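A possible workaround is to bypass the Query helper and pass the raw selector to list_time_series yourself, so no metric.type prefix is prepended. A sketch (query_slo_selector is hypothetical; the request-dict form follows the 2.x client, and client here is any object with a compatible list_time_series):

```python
def query_slo_selector(client, project_name, selector, interval):
    """Query a time-series selector without a metric.type prefix.

    Hypothetical helper: builds the request directly instead of going
    through monitoring_v3.query.Query, whose filter always starts with
    'metric.type = ...'.
    """
    request = {
        "name": project_name,
        # e.g. 'select_slo_burn_rate("projects/.../serviceLevelObjectives/...", "30d")'
        "filter": selector,
        "interval": interval,
    }
    return client.list_time_series(request=request)
```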
I'm using a Python 3.7 Cloud Function to read entries from a Pub/Sub topic and stream metrics to Stackdriver.
I use the google-cloud-monitoring 0.34.0 library to export metrics to Stackdriver.
Steps to reproduce
send metrics to exporters (in my case bigquery and stackdriver)
import logging
import time

from google.cloud import monitoring_v3
from exporters.base import Exporter

LOGGER = logging.getLogger(__name__)

DEFAULT_METRIC_TYPE = "custom.googleapis.com/test/error_budget_burn_rate"
DEFAULT_METRIC_DESCRIPTION = ("Speed at which the error budget for a given "
                              "aggregation window is consumed")


class StackdriverExporter(Exporter):
    """Stackdriver Monitoring exporter class."""

    def __init__(self):
        self.client = monitoring_v3.MetricServiceClient()

    def export(self, data, **config):
        """Export data to Stackdriver Monitoring.

        Args:
            data (dict): Data to send to Stackdriver Monitoring.
            config (dict): Stackdriver Monitoring metric config:
                project_id (str): Stackdriver host project id.
                custom_metric_type (str): Custom metric type.
                custom_metric_unit (str): Custom metric unit.

        Returns:
            object: Stackdriver Monitoring API result.
        """
        self.create_metric_descriptor(data, **config)
        self.create_timeseries(data, **config)

    def create_timeseries(self, data, **config):
        """Create a Stackdriver Monitoring timeseries.

        Args:
            data (dict): Data to send to Stackdriver Monitoring.
            config (dict): Metric config.
        """
        metric_type = DEFAULT_METRIC_TYPE
        series = monitoring_v3.types.TimeSeries()
        series.metric.type = config.get('metric_type', metric_type)
        # Write timeseries metric labels.
        series.metric.labels['timestamp_human'] = str(data['timestamp_human'])
        series.metric.labels['error_budget_policy_step_name'] = str(
            data['error_budget_policy_step_name'])
        series.metric.labels['measurement_window_seconds'] = str(
            data['measurement_window_seconds'])
        series.metric.labels['cluster_name'] = str(data['cluster_name'])
        series.metric.labels['msg_vpn_name'] = str(data['msg_vpn_name'])
        series.metric.labels['service_name'] = str(data['service_name'])
        series.metric.labels['feature_name'] = str(data['feature_name'])
        series.metric.labels['slo_name'] = str(data['slo_name'])
        series.metric.labels['alerting_burn_rate_threshold'] = str(
            data['alerting_burn_rate_threshold'])
        # Use the generic resource 'global'.
        series.resource.type = 'global'
        series.resource.labels['project_id'] = str(config['project_id'])
        # Create a new data point.
        point = series.points.add()
        # Define end point timestamp (changed by CF timestamp to ingest all data).
        timestamp = time.time()
        point.interval.end_time.seconds = int(timestamp)
        point.interval.end_time.nanos = int(
            (timestamp - point.interval.end_time.seconds) * 10**9)
        # Set the metric value.
        point.value.double_value = data['error_budget_burn_rate']
        # Record the timeseries to Stackdriver Monitoring.
        project = self.client.project_path(config['project_id'])
        result = self.client.create_time_series(project, [series])
        labels = series.metric.labels
        LOGGER.debug(
            f"timestamp: {timestamp} burnrate: {point.value.double_value}"
            f"{labels['service_name']}-{labels['feature_name']}-"
            f"{labels['slo_name']}-{labels['error_budget_policy_step_name']}")
        return result

    def create_metric_descriptor(self, data, **config):
        """Create a Stackdriver Monitoring metric descriptor.

        Args:
            config (dict): Metric config.

        Returns:
            object: Metric descriptor.
        """
        metric_type = DEFAULT_METRIC_TYPE
        project = self.client.project_path(config['project_id'])
        descriptor = monitoring_v3.types.MetricDescriptor()
        descriptor.type = config.get('metric_type', metric_type)
        descriptor.metric_kind = (
            monitoring_v3.enums.MetricDescriptor.MetricKind.GAUGE)
        descriptor.value_type = (
            monitoring_v3.enums.MetricDescriptor.ValueType.DOUBLE)
        descriptor.description = config.get('metric_description',
                                            DEFAULT_METRIC_DESCRIPTION)
        self.client.create_metric_descriptor(project, descriptor)
        return descriptor
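The inline seconds/nanos split in create_timeseries can be factored out and unit-tested on its own (a stdlib-only sketch; split_timestamp is a hypothetical helper):

```python
def split_timestamp(ts):
    """Split a float epoch timestamp into the (seconds, nanos) pair
    that TimeInterval end_time fields expect."""
    seconds = int(ts)
    nanos = int(round((ts - seconds) * 1e9))
    return seconds, nanos
```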
Stack trace
Traceback (most recent call last):
  File "/env/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "/env/local/lib/python3.7/site-packages/grpc/_channel.py", line 826, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/env/local/lib/python3.7/site-packages/grpc/_channel.py", line 729, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
  status = StatusCode.INTERNAL
  details = "One or more TimeSeries could not be written: An internal error occurred.: timeSeries[0]"
  debug_error_string = "{"created":"@1583919882.548713353","description":"Error received from peer ipv4:173.194.76.95:443","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"One or more TimeSeries could not be written: An internal error occurred.: timeSeries[0]","grpc_status":13}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 383, in run_background_function
    _function_handler.invoke_user_function(event_object)
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 217, in invoke_user_function
    return call_user_function(request_or_event)
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 214, in call_user_function
    event_context.Context(**request_or_event.context))
  File "/user_code/main.py", line 35, in main
    streamer(json.loads(base64.b64decode(event['data']).decode('utf-8')), slo_config)
  File "/user_code/streamer.py", line 29, in streamer
    export(data, exporters)
  File "/user_code/streamer.py", line 55, in export
    ret = exporter.export(data, **config)
  File "/user_code/exporters/stackdriver.py", line 49, in export
    self.create_timeseries(data, **config)
  File "/user_code/exporters/stackdriver.py", line 96, in create_timeseries
    result = self.client.create_time_series(project, [series])
  File "/env/local/lib/python3.7/site-packages/google/cloud/monitoring_v3/gapic/metric_service_client.py", line 1039, in create_time_series
    request, retry=retry, timeout=timeout, metadata=metadata
  File "/env/local/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
    return wrapped_func(*args, **kwargs)
  File "/env/local/lib/python3.7/site-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
    on_error=on_error,
  File "/env/local/lib/python3.7/site-packages/google/api_core/retry.py", line 184, in retry_target
    return target()
  File "/env/local/lib/python3.7/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "/env/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: An internal error occurred.: timeSeries[0]
After various tests, removing the descriptor creation seems to correct the issue!
Igoogle/api/label.proto=google/api/label.proto -Igoogle/api/launch_stage.proto=google/api/launch_stage.proto -Igoogle/api/metric.proto=google/api/metric.proto -Igoogle/protobuf/duration.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/duration_proto/google/protobuf/duration.proto -Igoogle/api/monitored_resource.proto=google/api/monitored_resource.proto -Igoogle/protobuf/struct.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/struct_proto/google/protobuf/struct.proto -Igoogle/api/resource.proto=google/api/resource.proto -Igoogle/rpc/status.proto=google/rpc/status.proto -Igoogle/type/calendar_period.proto=google/type/calendar_period.proto -Igoogle/protobuf/empty.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/empty_proto/google/protobuf/empty.proto -Igoogle/protobuf/field_mask.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/field_mask_proto/google/protobuf/field_mask.proto -Igoogle/protobuf/wrappers.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/wrappers_proto/google/protobuf/wrappers.proto google/monitoring/v3/alert.proto google/monitoring/v3/alert_service.proto google/monitoring/v3/common.proto google/monitoring/v3/dropped_labels.proto google/monitoring/v3/group.proto google/monitoring/v3/group_service.proto google/monitoring/v3/metric.proto google/monitoring/v3/metric_service.proto google/monitoring/v3/mutation_record.proto google/monitoring/v3/notification.proto google/monitoring/v3/notification_service.proto google/monitoring/v3/service.proto google/monitoring/v3/service_service.proto google/monitoring/v3/span_context.proto google/monitoring/v3/uptime.proto google/monitoring/v3/uptime_service.proto` failed (Exit 1) protoc failed: error executing command bazel-out/host/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional 
'--plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin' ... (remaining 54 argument(s) skipped)
Use --sandbox_debug to see verbose messages from the sandbox
google/monitoring/v3/metric.proto:24:1: warning: Import google/protobuf/duration.proto is unused.
google/monitoring/v3/metric.proto:19:1: warning: Import google/api/distribution.proto is unused.
google/monitoring/v3/metric_service.proto:28:1: warning: Import google/protobuf/duration.proto is unused.
google/monitoring/v3/metric_service.proto:25:1: warning: Import google/monitoring/v3/alert.proto is unused.
google/monitoring/v3/notification_service.proto:26:1: warning: Import google/protobuf/struct.proto is unused.
google/monitoring/v3/service.proto:22:1: warning: Import google/protobuf/timestamp.proto is unused.
google/monitoring/v3/service.proto:19:1: warning: Import google/api/monitored_resource.proto is unused.
google/monitoring/v3/uptime_service.proto:24:1: warning: Import google/protobuf/duration.proto is unused.
Traceback (most recent call last):
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate_with_pandoc.py", line 3, in <module>
from gapic.cli import generate
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate.py", line 23, in <module>
from gapic import generator
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/__init__.py", line 21, in <module>
from .generator import Generator
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/generator.py", line 24, in <module>
from gapic.samplegen import manifest, samplegen
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/__init__.py", line 15, in <module>
from gapic.samplegen import samplegen
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/samplegen.py", line 27, in <module>
from gapic.schema import wrappers
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/__init__.py", line 23, in <module>
from gapic.schema.api import API
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/25/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/api.py", line 29, in <module>
from google.api_core import exceptions # type: ignore
ModuleNotFoundError: No module named 'google.api_core'
--python_gapic_out: protoc-gen-python_gapic: Plugin failed with status code 1.
Target //google/monitoring/v3:monitoring-v3-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 1.080s, Critical Path: 0.85s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/kbuilder/.cache/synthtool/python-monitoring/synth.py", line 38, in <module>
proto_output_path="google/cloud/monitoring_v3/proto"
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 197, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/monitoring/v3:monitoring-v3-py']' returned non-zero exit status 1.
2021-01-28 05:44:04,304 autosynth [ERROR] > Synthesis failed
2021-01-28 05:44:04,304 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 8a7e2f7 chore: reorder classes (#73)
2021-01-28 05:44:04,309 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-01-28 05:44:04,315 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
state = <grpc._channel._RPCState object at 0x7f8c62145d30>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f8c62183800>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1621681575.665381091","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >
.nox/py-3-8/lib/python3.8/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:78: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1125: in create_time_series
rpc(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]
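The error text itself asks for a retry after a few seconds, and the snippets test above already wraps the call with the `backoff` package. A minimal stdlib-only sketch of the same retry-with-exponential-backoff idea follows; `call_with_backoff` and `flaky` are hypothetical names, not part of the monitoring client, and in real use the retryable exception types would be the `google.api_core.exceptions` classes shown in the traceback rather than `RuntimeError`:

```python
import random
import time

def call_with_backoff(fn, *, retries=4, base_delay=1.0, retryable=(RuntimeError,)):
    """Call fn(), retrying retryable errors with exponential backoff and jitter."""
    for attempt in range(retries + 1):
        try:
            return fn()
        except retryable:
            if attempt == retries:
                raise  # out of attempts: surface the last error
            # sleep base_delay * 2^attempt, plus proportional jitter
            time.sleep(base_delay * (2 ** attempt) + random.random() * base_delay)

# Example: a flaky operation that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("One or more TimeSeries could not be written")
    return "ok"

result = call_with_backoff(flaky, base_delay=0.01)  # succeeds on the third attempt
```

In production code the `fn` would be a closure over `client.create_time_series(...)`, and a small `base_delay` is used here only to keep the example fast.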
Please investigate and fix this issue within 5 business days. While it remains broken,
this library cannot be updated with changes to the python-monitoring API, and the library grows
stale.
l/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:77:1
DEBUG: Rule 'com_google_protoc_java_resource_names_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "4b714b35ee04ba90f560ee60e64c7357428efcb6b0f3a298f343f8ec2c6d4a5d"
DEBUG: Call stack for the definition of repository 'com_google_protoc_java_resource_names_plugin' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:234:1
DEBUG: Rule 'protoc_docs_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "33b387245455775e0de45869c7355cc5a9e98b396a6fc43b02812a63b75fee20"
DEBUG: Call stack for the definition of repository 'protoc_docs_plugin' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:258:1
DEBUG: Rule 'rules_python' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "48f7e716f4098b85296ad93f5a133baf712968c13fbc2fdf3a6136158fe86eac"
DEBUG: Call stack for the definition of repository 'rules_python' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:42:1
DEBUG: Rule 'gapic_generator_python' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "fe995def6873fcbdc2a8764ef4bce96eb971a9d1950fe9db9be442f3c64fb3b6"
DEBUG: Call stack for the definition of repository 'gapic_generator_python' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:278:1
DEBUG: Rule 'com_googleapis_gapic_generator_go' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "c0d0efba86429cee5e52baf838165b0ed7cafae1748d025abec109d25e006628"
DEBUG: Call stack for the definition of repository 'com_googleapis_gapic_generator_go' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:300:1
DEBUG: Rule 'gapic_generator_php' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "3dffc5c34a5f35666843df04b42d6ce1c545b992f9c093a777ec40833b548d86"
DEBUG: Call stack for the definition of repository 'gapic_generator_php' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:364:1
DEBUG: Rule 'gapic_generator_csharp' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "4db430cfb9293e4521ec8e8138f8095faf035d8e752cf332d227710d749939eb"
DEBUG: Call stack for the definition of repository 'gapic_generator_csharp' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:386:1
DEBUG: Rule 'gapic_generator_ruby' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "a14ec475388542f2ea70d16d75579065758acc4b99fdd6d59463d54e1a9e4499"
DEBUG: Call stack for the definition of repository 'gapic_generator_ruby' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:400:1
DEBUG: /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/rules_python/python/pip.bzl:61:5: DEPRECATED: the pip_repositories rule has been replaced with pip_install, please see rules_python 0.1 release notes
DEBUG: Rule 'bazel_skylib' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "1dde365491125a3db70731e25658dfdd3bc5dbdfd11b840b3e987ecf043c7ca0"
DEBUG: Call stack for the definition of repository 'bazel_skylib' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:35:1
Analyzing: target //google/monitoring/v3:monitoring-v3-py (1 packages loaded, 0 targets configured)
ERROR: /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/upb/bazel/upb_proto_library.bzl:257:29: aspect() got unexpected keyword argument 'incompatible_use_toolchain_transition'
ERROR: Analysis of target '//google/monitoring/v3:monitoring-v3-py' failed; build aborted: error loading package '@com_github_grpc_grpc//': in /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/com_github_grpc_grpc/bazel/grpc_build_system.bzl: Extension file 'bazel/upb_proto_library.bzl' has errors
INFO: Elapsed time: 0.368s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (5 packages loaded, 4 targets configured)
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/kbuilder/.cache/synthtool/python-monitoring/synth.py", line 38, in <module>
proto_output_path="google/cloud/monitoring_v3/proto"
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
return self._generate_code(service, version, "python", False, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 204, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/monitoring/v3:monitoring-v3-py']' returned non-zero exit status 1.
2021-04-27 04:22:37,719 autosynth [ERROR] > Synthesis failed
2021-04-27 04:22:37,720 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at af95e7f chore(revert): revert preventing normalization (#126)
2021-04-27 04:22:37,725 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-04-27 04:22:37,729 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 356, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 191, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 336, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 68, in synthesize_loop
has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
43f189a2a73/external/com_google_api_gax_java/repositories.bzl:60:5
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:217:1
DEBUG: Rule 'com_google_api_codegen' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "47e2d7649bfcef198515f1412853cd1ff784fa65e9543ef80a81ab601e4600c6"
DEBUG: Call stack for the definition of repository 'com_google_api_codegen' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:77:1
DEBUG: Rule 'com_google_protoc_java_resource_names_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "4b714b35ee04ba90f560ee60e64c7357428efcb6b0f3a298f343f8ec2c6d4a5d"
DEBUG: Call stack for the definition of repository 'com_google_protoc_java_resource_names_plugin' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:234:1
DEBUG: Rule 'protoc_docs_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "33b387245455775e0de45869c7355cc5a9e98b396a6fc43b02812a63b75fee20"
DEBUG: Call stack for the definition of repository 'protoc_docs_plugin' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:258:1
DEBUG: Rule 'rules_python' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "48f7e716f4098b85296ad93f5a133baf712968c13fbc2fdf3a6136158fe86eac"
DEBUG: Call stack for the definition of repository 'rules_python' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:42:1
DEBUG: Rule 'gapic_generator_python' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "bad076ad037cc7e23978af204d73abc4479a3a9fc40a016ceb4fe94c0153dcc8"
DEBUG: Call stack for the definition of repository 'gapic_generator_python' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:278:1
DEBUG: Rule 'com_googleapis_gapic_generator_go' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "c0d0efba86429cee5e52baf838165b0ed7cafae1748d025abec109d25e006628"
DEBUG: Call stack for the definition of repository 'com_googleapis_gapic_generator_go' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:300:1
DEBUG: Rule 'gapic_generator_php' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "8ba95eb35076a796b1dad2bb424532b7fc2610ae2f8b4e2bebaed0286fcb2a54"
DEBUG: Call stack for the definition of repository 'gapic_generator_php' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:364:1
DEBUG: Rule 'gapic_generator_ruby' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "a14ec475388542f2ea70d16d75579065758acc4b99fdd6d59463d54e1a9e4499"
DEBUG: Call stack for the definition of repository 'gapic_generator_ruby' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:408:1
DEBUG: /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/rules_python/python/pip.bzl:61:5: DEPRECATED: the pip_repositories rule has been replaced with pip_install, please see rules_python 0.1 release notes
DEBUG: Rule 'bazel_skylib' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "1dde365491125a3db70731e25658dfdd3bc5dbdfd11b840b3e987ecf043c7ca0"
DEBUG: Call stack for the definition of repository 'bazel_skylib' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:35:1
Analyzing: target //google/monitoring/v3:monitoring-v3-py (1 packages loaded, 0 targets configured)
ERROR: /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/upb/bazel/upb_proto_library.bzl:257:29: aspect() got unexpected keyword argument 'incompatible_use_toolchain_transition'
ERROR: Analysis of target '//google/monitoring/v3:monitoring-v3-py' failed; build aborted: error loading package '@com_github_grpc_grpc//': in /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/com_github_grpc_grpc/bazel/grpc_build_system.bzl: Extension file 'bazel/upb_proto_library.bzl' has errors
INFO: Elapsed time: 0.326s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (3 packages loaded, 0 targets configured)
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/kbuilder/.cache/synthtool/python-monitoring/synth.py", line 38, in <module>
proto_output_path="google/cloud/monitoring_v3/proto"
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
return self._generate_code(service, version, "python", False, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 204, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/monitoring/v3:monitoring-v3-py']' returned non-zero exit status 1.
2021-05-04 04:22:32,787 autosynth [ERROR] > Synthesis failed
2021-05-04 04:22:32,787 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at af95e7f chore(revert): revert preventing normalization (#126)
2021-05-04 04:22:32,793 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-05-04 04:22:32,798 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 356, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 191, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 336, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 68, in synthesize_loop
has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
E assert 'snippets-test-enclcibdnr' in 'name display_name\n---------------------------...est-iejfpcqoox\nprojects/python-docs-samples-tests-py38/alertPolicies/6723190430256917261 snippets-test-ndmxzmxehz\n'
E + where 'snippets-test-enclcibdnr' = name: "projects/python-docs-samples-tests-py38/alertPolicies/10500225107291066435"\ndisplay_name: "snippets-test-enclci...jects/python-docs-samples-tests-py38/alertPolicies/10500225107291066435/conditions/10500225107291068640"\n}\nenabled {\n}\n.display_name
E + where name: "projects/python-docs-samples-tests-py38/alertPolicies/10500225107291066435"\ndisplay_name: "snippets-test-enclci...jects/python-docs-samples-tests-py38/alertPolicies/10500225107291066435/conditions/10500225107291068640"\n}\nenabled {\n}\n = <snippets_test.PochanFixture object at 0x7f0449cbc8e0>.alert_policy
state = <grpc._channel._RPCState object at 0x7f580c501090>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f580c4968c0>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626170581.487223623","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7f580c4e8450>
pochan = <snippets_test.PochanFixture object at 0x7f580de09e10>
@pytest.mark.flaky(rerun_filter=delay_on_aborted, max_runs=5)
def test_enable_alert_policies(capsys, pochan):
# These sleep calls are for mitigating the following error:
# "409 Too many concurrent edits to the project configuration.
# Please try again."
# Having multiple projects will void these `sleep()` calls.
# See also #3310
time.sleep(2)
snippets.enable_alert_policies(pochan.project_name, True, "name='{}'".format(pochan.alert_policy.name))
out, _ = capsys.readouterr()
assert (
"Enabled {0}".format(pochan.project_name) in out
or "{} is already enabled".format(pochan.alert_policy.name) in out
)
time.sleep(2)
When I try to compose queries with the Python API, the resulting pandas DataFrame is always empty. Are there any examples of how I'm supposed to use this library?
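One common cause of an empty result with the query helper is a time interval that never covers any data (for example, constructing the query without `days`/`hours`/`minutes`, so the interval is effectively empty). The stand-in below is NOT the real `monitoring_v3` API, just plain Python illustrating the interval filtering that decides whether the DataFrame ends up empty:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical stand-in for the points returned by list_time_series():
# plain (timestamp, value) pairs, one per minute going back 10 minutes.
now = datetime(2024, 1, 1, tzinfo=timezone.utc)
points = [(now - timedelta(minutes=m), float(m)) for m in range(10)]

def points_in_interval(points, start, end):
    """Keep only points inside [start, end]; an interval that misses the
    data yields an empty result, which surfaces as an empty DataFrame."""
    return [(ts, v) for ts, v in points if start <= ts <= end]

# Interval that misses the data entirely -> empty result.
empty = points_in_interval(points, now - timedelta(hours=2), now - timedelta(hours=1))

# Interval covering the last five minutes -> five points.
recent = points_in_interval(points, now - timedelta(minutes=4), now)
print(len(empty), len(recent))  # prints "0 5"
```

With the real library, the equivalent check is making sure the `Query` is constructed with a non-empty interval (e.g. the `days`/`hours`/`minutes` arguments, or `select_interval()`) before calling `as_dataframe()`.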
Is your feature request related to a problem? Please describe.
Feature request: support unit testing alert policies without a real GCP project_id or actual time series in Google Cloud Monitoring.
Describe the solution you'd like
I'd like to have a way to unit test my alert policies.
I have alert policies stored in my repository as JSON that can be loaded into memory as an AlertPolicy object.
I'd like to have unit tests to make sure the conditions are expressed correctly.
The idea is to create a fake time series, apply the alert conditions, then verify whether the alert would fire (or not) as expected.
Create different test cases with different fake time series to cover all the logic in the alert conditions.
Describe alternatives you've considered
I searched the code here but didn't find anything related to evaluating alert conditions against a time series. It seems hard to create unit tests without an MQL evaluation function.
Additional context
I just want to unit test alert policies before sending them to Google Cloud Monitoring. I don't want to wake up on-call engineers at 3 a.m. because of buggy alert conditions.
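In the absence of library support, the evaluation step can be prototyped by hand. The sketch below is a hypothetical, heavily simplified evaluator for a single threshold condition — not the real `AlertPolicy` semantics — but it shows the shape of the unit test the request describes: fake points in, fire/no-fire out.

```python
from dataclasses import dataclass

# Hypothetical, minimal stand-ins for an alert condition and a time series --
# not the real google.cloud.monitoring_v3 types, just enough to test logic.
@dataclass
class ThresholdCondition:
    comparison: str      # "COMPARISON_GT" or "COMPARISON_LT"
    threshold: float
    trigger_count: int   # how many points must violate the threshold

def would_fire(condition, fake_points):
    """Return True if the fake time series would trigger the condition."""
    if condition.comparison == "COMPARISON_GT":
        violations = [p for p in fake_points if p > condition.threshold]
    else:
        violations = [p for p in fake_points if p < condition.threshold]
    return len(violations) >= condition.trigger_count

cond = ThresholdCondition("COMPARISON_GT", threshold=0.9, trigger_count=3)
print(would_fire(cond, [0.95, 0.97, 0.99, 0.5]))   # prints "True": 3 points above
print(would_fire(cond, [0.95, 0.5, 0.5, 0.5]))     # prints "False": only 1 point above
```

Each test case is then just a different fake series asserted against the expected fire/no-fire outcome, which matches the workflow described above.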
state = <grpc._channel._RPCState object at 0x7f8c62145d30>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f8c62183800>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1621681575.665381091","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >
.nox/py-3-8/lib/python3.8/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:78: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1125: in create_time_series
rpc(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]
state = <grpc._channel._RPCState object at 0x7fa08a2c0f10>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fa07bba8f00>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1607079964.947410511","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1062,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >
.nox/py-3-7/lib/python3.7/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:70: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1094: in create_time_series
request, retry=retry, timeout=timeout, metadata=metadata,
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_54__handle_cancellation_from_core.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132284:72: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_55__schedule_rpc_coro.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132290:65: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_56__handle_rpc.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132296:67: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_57__request_call.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132302:71: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_58__server_main_loop.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132308:59: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_59_start.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132314:74: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_60__start_shutting_down.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132320:62: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_61_shutdown.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132326:74: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'?
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_62_wait_for_termination.tp_print = 0;
^~~~~~~~
tp_dict
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp: In function 'PyObject* __Pyx_decode_c_bytes(const char*, Py_ssize_t, Py_ssize_t, Py_ssize_t, const char*, const char*, PyObject* (*)(const char*, Py_ssize_t, const char*))':
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:136866:45: warning: 'PyObject* PyUnicode_FromUnicode(const Py_UNICODE*, Py_ssize_t)' is deprecated [-Wdeprecated-declarations]
return PyUnicode_FromUnicode(NULL, 0);
^
In file included from bazel-out/host/bin/external/local_config_python/_python3/_python3_include/unicodeobject.h:1026:0,
from bazel-out/host/bin/external/local_config_python/_python3/_python3_include/Python.h:97,
from bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:4:
bazel-out/host/bin/external/local_config_python/_python3/_python3_include/cpython/unicodeobject.h:551:42: note: declared here
Py_DEPRECATED(3.3) PyAPI_FUNC(PyObject*) PyUnicode_FromUnicode(
^~~~~~~~~~~~~~~~~~~~~
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp: In function 'void __pyx_f_7_cython_6cygrpc__unified_socket_write(int)':
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:72692:3: warning: ignoring return value of 'ssize_t write(int, const void*, size_t)', declared with attribute warn_unused_result [-Wunused-result]
(void)(write(__pyx_v_fd, ((char *)"1"), 1));
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp: At global scope:
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:144607:1: warning: 'void __Pyx_PyAsyncGen_Fini()' defined but not used [-Wunused-function]
__Pyx_PyAsyncGen_Fini(void)
^~~~~~~~~~~~~~~~~~~~~
Target //google/monitoring/v3:monitoring-v3-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 4.248s, Critical Path: 4.02s
INFO: 8 processes: 8 linux-sandbox.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully
Traceback (most recent call last):
File "/usr/local/lib/python3.9/runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/local/lib/python3.9/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 790, in exec_module
File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
File "/root/.cache/synthtool/python-monitoring/synth.py", line 32, in <module>
v3_library = gapic.py_library(
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 45, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 182, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 27, in run
return subprocess.run(
File "/usr/local/lib/python3.9/subprocess.py", line 524, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/monitoring/v3:monitoring-v3-py']' returned non-zero exit status 1.
2020-12-05 03:08:23,957 autosynth [ERROR] > Synthesis failed
2020-12-05 03:08:23,957 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 2976654 chore(deps): update ubuntu docker tag to v20.10 (#52)
2020-12-05 03:08:23,963 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2020-12-05 03:08:23,968 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/usr/local/lib/python3.9/runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/local/lib/python3.9/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/usr/local/lib/python3.9/subprocess.py", line 456, in check_returncode
raise CalledProcessError(self.returncode, self.args, self.stdout,
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
params = self._build_query_params(headers_only, page_size)
> for ts in self._client.list_time_series(**params):
E TypeError: list_time_series() got an unexpected keyword argument 'aggregation'
.../site-packages/google/cloud/monitoring_v3/query.py:446: TypeError
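For context on that TypeError: the 2.x microgenerated clients flatten only a few request fields into keyword arguments; anything else, including `aggregation`, has to travel inside a single request object or dict. A toy stand-in (not the real `MetricServiceClient`) reproduces the failure mode and the fix:

```python
# Hypothetical stand-in mimicking the 2.x call shape -- NOT the real client.
# Only a few fields are flattened into keywords; extra keywords such as
# `aggregation` raise TypeError, exactly as in the log above.
def list_time_series(request=None, *, name=None, filter=None, interval=None, view=None):
    if request is not None:
        return dict(request)
    return {"name": name, "filter": filter, "interval": interval, "view": view}

# Old (pre-2.0) flattened-kwargs style fails:
try:
    list_time_series(name="projects/p", aggregation={"alignment_period": {"seconds": 60}})
except TypeError as exc:
    print("old call shape:", exc)

# New style: put every field, including aggregation, inside the request dict.
req = list_time_series(request={
    "name": "projects/p",
    "filter": 'metric.type = "compute.googleapis.com/instance/cpu/utilization"',
    "aggregation": {"alignment_period": {"seconds": 60}},
})
print("aggregation" in req)  # prints "True"
```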
state = <grpc._channel._RPCState object at 0x7fab9ffb2a50>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fab9ffe49b0>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626170630.632382368","description":"Error received from peer ipv4:74.125.197.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >
While using the function to monitor CPU utilization via Cloud Functions, we got the error:
cannot import name 'monitoring_v3' from 'google.cloud' (unknown location)
Traceback (most recent call last):
from google.cloud import monitoring_v3
File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/__init__.py", line 17, in <module>
from .services.alert_policy_service import AlertPolicyServiceClient
File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/services/alert_policy_service/__init__.py", line 16, in <module>
from .client import AlertPolicyServiceClient
File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/services/alert_policy_service/client.py", line 33, in <module>
from google.cloud.monitoring_v3.services.alert_policy_service import pagers
File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/services/alert_policy_service/pagers.py", line 27, in <module>
from google.cloud.monitoring_v3.types import alert
File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/types/__init__.py", line 16, in <module>
from .alert import AlertPolicy
File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/types/alert.py", line 18, in <module>
from google.cloud.monitoring_v3.types import common
File "/opt/conda/envs/env/lib/python3.9/site-packages/google/cloud/monitoring_v3/types/common.py", line 48, in <module>
class ServiceTier(proto.Enum):
File "/opt/conda/envs/env/lib/python3.9/site-packages/proto/enums.py", line 72, in __new__
cls = super().__new__(mcls, name, bases, attrs)
File "/opt/conda/envs/env/lib/python3.9/enum.py", line 288, in __new__
enum_member = __new__(enum_class, *args)
TypeError: int() argument must be a string, a bytes-like object or a number, not 'dict'
It's easy to fix with python=3.7.10 if anyone wonders.
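That TypeError during enum creation is typically a symptom of mismatched protobuf / proto-plus / client versions in the environment (presumably why pinning the Python version happened to resolve it). A small stdlib helper can report what is actually installed; the package names listed are assumptions about the relevant stack:

```python
from importlib import metadata

def dist_version(name):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return metadata.version(name)
    except metadata.PackageNotFoundError:
        return None

# Packages whose mismatched versions commonly cause import-time enum errors
# (assumed relevant here; adjust the list to your environment):
for pkg in ("google-cloud-monitoring", "proto-plus", "protobuf"):
    print(pkg, "->", dist_version(pkg) or "not installed")
```

Comparing this output against the versions pinned in the library's `setup.py` is a quick first step before filing a bug.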
I'm getting the following error when using `list(timeseries_response)` to iterate over the pages in the response. The error happens only on large time series responses containing lots of data.
Cloning into 'working_repo'...
Switched to branch 'autosynth'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/synth.py.
On branch autosynth
nothing to commit, working tree clean
HEAD detached at FETCH_HEAD
nothing to commit, working tree clean
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:6aec9c34db0e4be221cdaf6faba27bdc07cfea846808b3d3b964dfce3a9a0f9b
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/monitoring/artman_monitoring.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/group_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/group_service.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/metric.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/metric.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/alert_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/alert_service.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/dropped_labels.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/dropped_labels.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/group.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/group.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/common.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/common.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/notification.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/notification.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/metric_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/metric_service.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/service.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/service_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/service_service.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/alert.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/alert.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/uptime_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/uptime_service.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/mutation_record.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/mutation_record.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/uptime.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/uptime.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/span_context.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/span_context.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/monitoring/v3/notification_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto/notification_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/monitoring-v3/google/cloud/monitoring_v3/proto.
synthtool > Replaced 'def .*\\(([^\\)]+)\n.*metadata=None\\):\n\\s+"""(.*\n)*?\\s+"""\n' in google/cloud/monitoring_v3/gapic/metric_service_client.py.
synthtool > Replaced 'def .*\\(([^\\)]+)\n.*metadata=None\\):\n\\s+"""(.*\n)*?\\s+"""\n' in google/cloud/monitoring_v3/gapic/group_service_client.py.
synthtool > Replaced 'def .*\\(([^\\)]+)\n.*metadata=None\\):\n\\s+"""(.*\n)*?\\s+"""\n' in google/cloud/monitoring_v3/gapic/alert_policy_service_client.py.
synthtool > Replaced 'def .*\\(([^\\)]+)\n.*metadata=None\\):\n\\s+"""(.*\n)*?\\s+"""\n' in google/cloud/monitoring_v3/gapic/service_monitoring_service_client.py.
synthtool > Replaced 'def .*\\(([^\\)]+)\n.*metadata=None\\):\n\\s+"""(.*\n)*?\\s+"""\n' in google/cloud/monitoring_v3/gapic/notification_channel_service_client.py.
synthtool > Replaced 'def .*\\(([^\\)]+)\n.*metadata=None\\):\n\\s+"""(.*\n)*?\\s+"""\n' in google/cloud/monitoring_v3/gapic/uptime_check_service_client.py.
synthtool > Replaced '(^.*$\\n)*' in google/cloud/monitoring_v3/proto/common_pb2.py.
synthtool > No replacements made in google/cloud/monitoring_v3/gapic/alert_policy_service_client.py for pattern then a new `\[CONDITION_ID\]` is created.
, maybe replacement is not longer needed?
synthtool > Replaced ' ::\n\n' in google/cloud/monitoring_v3/gapic/alert_policy_service_client.py.
synthtool > No replacements made in google/cloud/monitoring_v3/proto/metric_service_pb2.py for pattern ^(\s+)have an ``id`` label: :: resource.type =
.*, maybe replacement is not longer needed?
synthtool > Replaced 'from google.cloud.monitoring_v3.gapic import notification_channel_service_client\n' in google/cloud/monitoring_v3/__init__.py.
synthtool > Replaced 'notification_channel_service_client.NotificationChannelServiceClient' in google/cloud/monitoring_v3/__init__.py.
.coveragerc
.flake8
.github/CONTRIBUTING.md
.github/ISSUE_TEMPLATE/bug_report.md
.github/ISSUE_TEMPLATE/feature_request.md
.github/ISSUE_TEMPLATE/support_request.md
.github/PULL_REQUEST_TEMPLATE.md
.github/release-please.yml
.gitignore
.kokoro/build.sh
.kokoro/continuous/common.cfg
.kokoro/continuous/continuous.cfg
.kokoro/docs/common.cfg
.kokoro/docs/docs.cfg
.kokoro/presubmit/common.cfg
.kokoro/presubmit/presubmit.cfg
.kokoro/publish-docs.sh
.kokoro/release.sh
.kokoro/release/common.cfg
.kokoro/release/release.cfg
.kokoro/trampoline.sh
CODE_OF_CONDUCT.md
CONTRIBUTING.rst
LICENSE
MANIFEST.in
docs/_static/custom.css
docs/_templates/layout.html
docs/conf.py.j2
noxfile.py.j2
renovate.json
setup.cfg
Running session blacken
Creating virtual environment (virtualenv) using python3.6 in .nox/blacken
pip install black==19.3b0
Error: pip is not installed into the virtualenv, it is located at /tmpfs/src/git/autosynth/env/bin/pip. Pass external=True into run() to explicitly allow this.
Session blacken failed.
synthtool > Failed executing nox -s blacken:
None
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/synth.py", line 99, in <module>
s.shell.run(["nox", "-s", "blacken"], hide_output=False)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['nox', '-s', 'blacken']' returned non-zero exit status 1.
Synthesis failed
Google internal developers can see the full log here.
state = <grpc._channel._RPCState object at 0x7fa08a2c0f10>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fa07bba8f00>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INTERNAL
E details = "One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]"
E debug_error_string = "{"created":"@1607079964.947410511","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1062,"grpc_message":"One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
E >
.nox/py-3-7/lib/python3.7/site-packages/backoff/_sync.py:94: in retry
ret = target(*args, **kwargs)
snippets_test.py:48: in write
snippets.write_time_series(PROJECT_ID)
snippets.py:70: in write_time_series
client.create_time_series(name=project_name, time_series=[series])
../../../../google/cloud/monitoring_v3/services/metric_service/client.py:1094: in create_time_series
request, retry=retry, timeout=timeout, metadata=metadata,
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.INTERNAL
details = "One or more TimeSeries could...internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]","grpc_status":13}"
???
E google.api_core.exceptions.InternalServerError: 500 One or more TimeSeries could not be written: Internal error encountered. Please retry after a few seconds. If internal errors persist, contact support at https://cloud.google.com/support/docs.: timeSeries[0]
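The test harness in the trace above already retries this transient INTERNAL (500) error via the `backoff` library (visible at `backoff/_sync.py:94`). For readers unfamiliar with the pattern, here is a minimal standard-library sketch of the same exponential-backoff-with-jitter idea; `TransientServerError` and `retry_with_backoff` are illustrative names, standing in for `google.api_core.exceptions.InternalServerError` and the `backoff` decorator:

```python
import random
import time


class TransientServerError(Exception):
    """Stand-in for a retryable server error (gRPC INTERNAL / HTTP 500)."""


def retry_with_backoff(func, max_tries=5, base_delay=1.0, max_delay=30.0):
    """Call func(), retrying TransientServerError with exponential backoff.

    Each failed attempt doubles the delay (capped at max_delay) and adds
    random jitter so many clients do not retry in lockstep.
    """
    for attempt in range(max_tries):
        try:
            return func()
        except TransientServerError:
            if attempt == max_tries - 1:
                raise  # out of retries: surface the error to the caller
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(delay + random.uniform(0, delay))
```

This mirrors the guidance in the error message itself ("Please retry after a few seconds"): transient INTERNAL errors from the monitoring API are generally safe to retry for an idempotent write.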
Igoogle/api/label.proto=google/api/label.proto -Igoogle/api/launch_stage.proto=google/api/launch_stage.proto -Igoogle/api/metric.proto=google/api/metric.proto -Igoogle/protobuf/duration.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/duration_proto/google/protobuf/duration.proto -Igoogle/api/monitored_resource.proto=google/api/monitored_resource.proto -Igoogle/protobuf/struct.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/struct_proto/google/protobuf/struct.proto -Igoogle/api/resource.proto=google/api/resource.proto -Igoogle/rpc/status.proto=google/rpc/status.proto -Igoogle/type/calendar_period.proto=google/type/calendar_period.proto -Igoogle/protobuf/empty.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/empty_proto/google/protobuf/empty.proto -Igoogle/protobuf/field_mask.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/field_mask_proto/google/protobuf/field_mask.proto -Igoogle/protobuf/wrappers.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/wrappers_proto/google/protobuf/wrappers.proto google/monitoring/v3/alert.proto google/monitoring/v3/alert_service.proto google/monitoring/v3/common.proto google/monitoring/v3/dropped_labels.proto google/monitoring/v3/group.proto google/monitoring/v3/group_service.proto google/monitoring/v3/metric.proto google/monitoring/v3/metric_service.proto google/monitoring/v3/mutation_record.proto google/monitoring/v3/notification.proto google/monitoring/v3/notification_service.proto google/monitoring/v3/service.proto google/monitoring/v3/service_service.proto google/monitoring/v3/span_context.proto google/monitoring/v3/uptime.proto google/monitoring/v3/uptime_service.proto` failed (Exit 1) protoc failed: error executing command bazel-out/host/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional 
'--plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin' ... (remaining 54 argument(s) skipped)
Use --sandbox_debug to see verbose messages from the sandbox
google/monitoring/v3/metric.proto:24:1: warning: Import google/protobuf/duration.proto is unused.
google/monitoring/v3/metric.proto:19:1: warning: Import google/api/distribution.proto is unused.
google/monitoring/v3/metric_service.proto:28:1: warning: Import google/protobuf/duration.proto is unused.
google/monitoring/v3/metric_service.proto:25:1: warning: Import google/monitoring/v3/alert.proto is unused.
google/monitoring/v3/notification_service.proto:26:1: warning: Import google/protobuf/struct.proto is unused.
google/monitoring/v3/service.proto:22:1: warning: Import google/protobuf/timestamp.proto is unused.
google/monitoring/v3/service.proto:19:1: warning: Import google/api/monitored_resource.proto is unused.
google/monitoring/v3/uptime_service.proto:24:1: warning: Import google/protobuf/duration.proto is unused.
Traceback (most recent call last):
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate_with_pandoc.py", line 3, in <module>
from gapic.cli import generate
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate.py", line 23, in <module>
from gapic import generator
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/__init__.py", line 21, in <module>
from .generator import Generator
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/generator.py", line 24, in <module>
from gapic.samplegen import manifest, samplegen
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/__init__.py", line 15, in <module>
from gapic.samplegen import samplegen
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/samplegen.py", line 27, in <module>
from gapic.schema import wrappers
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/__init__.py", line 23, in <module>
from gapic.schema.api import API
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/24/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/api.py", line 29, in <module>
from google.api_core import exceptions # type: ignore
ModuleNotFoundError: No module named 'google.api_core'
--python_gapic_out: protoc-gen-python_gapic: Plugin failed with status code 1.
Target //google/monitoring/v3:monitoring-v3-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 1.092s, Critical Path: 0.85s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/kbuilder/.cache/synthtool/python-monitoring/synth.py", line 38, in <module>
proto_output_path="google/cloud/monitoring_v3/proto"
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 193, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/monitoring/v3:monitoring-v3-py']' returned non-zero exit status 1.
2021-01-21 05:43:50,584 autosynth [ERROR] > Synthesis failed
2021-01-21 05:43:50,584 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 8a7e2f7 chore: reorder classes (#73)
2021-01-21 05:43:50,591 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-01-21 05:43:50,597 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
Just use the following import, from google.cloud import monitoring_v3, in any file.
Code example
from google.cloud import monitoring_v3
Stack trace
File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/__init__.py", line 18, in <module>
from .services.alert_policy_service import AlertPolicyServiceClient
File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/services/alert_policy_service/__init__.py", line 18, in <module>
from .client import AlertPolicyServiceClient
File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/services/alert_policy_service/client.py", line 35, in <module>
from google.cloud.monitoring_v3.services.alert_policy_service import pagers
File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/services/alert_policy_service/pagers.py", line 29, in <module>
from google.cloud.monitoring_v3.types import alert
File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/types/__init__.py", line 18, in <module>
from .alert import AlertPolicy
File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/types/alert.py", line 21, in <module>
from google.cloud.monitoring_v3.types import common
File "venv/lib/python3.6/site-packages/google/cloud/monitoring_v3/types/common.py", line 51, in <module>
class ServiceTier(proto.Enum):
File "venv/lib/python3.6/site-packages/proto/enums.py", line 72, in __new__
cls = super().__new__(mcls, name, bases, attrs)
File ".pyenv/versions/3.6.10/lib/python3.6/enum.py", line 201, in __new__
enum_member = __new__(enum_class, *args)
TypeError: int() argument must be a string, a bytes-like object or a number, not 'dict'
The problem is in the file google/cloud/monitoring_v3/types/common.py, in class ServiceTier(proto.Enum). When I remove the property _pb_options = {"deprecated": True}, everything works correctly.
Thanks for stopping by to let us know something could be better!
PLEASE READ: If you have a support contract with Google, please create an issue in the support console instead of filing on GitHub. This will ensure a timely response.
Is your feature request related to a problem? Please describe.
Currently the snippet doesn't show how to add labels to the custom metric, unlike the Java SDK snippet.
Describe the solution you'd like
Add a test custom label to the metric in the create_metric_descriptor function
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
~/miniconda3/lib/python3.8/site-packages/proto/message.py in __getattr__(self, key)
559 try:
--> 560 pb_type = self._meta.fields[key].pb_type
561 pb_value = getattr(self._pb, key)
KeyError: 'WhichOneof'
During handling of the above exception, another exception occurred:
AttributeError Traceback (most recent call last)
<ipython-input-19-a95d81571ff4> in <module>
8 print(type(query))
9
---> 10 query.as_dataframe()
~/miniconda3/lib/python3.8/site-packages/google/cloud/monitoring_v3/query.py in as_dataframe(self, label, labels)
533 :returns: A dataframe where each column represents one time series.
534 """
--> 535 return _dataframe._build_dataframe(self, label, labels)
536
537 def __deepcopy__(self, memo):
~/miniconda3/lib/python3.8/site-packages/google/cloud/monitoring_v3/_dataframe.py in _build_dataframe(time_series_iterable, label, labels)
94 for time_series in time_series_iterable:
95 pandas_series = pandas.Series(
---> 96 data=[_extract_value(point.value) for point in time_series.points],
97 index=[
98 point.interval.end_time.ToNanoseconds() for point in time_series.points
~/miniconda3/lib/python3.8/site-packages/google/cloud/monitoring_v3/_dataframe.py in <listcomp>(.0)
94 for time_series in time_series_iterable:
95 pandas_series = pandas.Series(
---> 96 data=[_extract_value(point.value) for point in time_series.points],
97 index=[
98 point.interval.end_time.ToNanoseconds() for point in time_series.points
~/miniconda3/lib/python3.8/site-packages/google/cloud/monitoring_v3/_dataframe.py in _extract_value(typed_value)
47 def _extract_value(typed_value):
48 """Extract the value from a TypedValue."""
---> 49 value_type = typed_value.WhichOneof("value")
50 return typed_value.__getattribute__(value_type)
51
~/miniconda3/lib/python3.8/site-packages/proto/message.py in __getattr__(self, key)
563 return marshal.to_python(pb_type, pb_value, absent=key not in self)
564 except KeyError as ex:
--> 565 raise AttributeError(str(ex))
566
567 def __ne__(self, other):
AttributeError: 'WhichOneof'
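The AttributeError arises because monitoring_v3 2.x wraps messages in proto-plus objects: attribute access goes through __getattr__ as a field lookup, so raw-protobuf methods like WhichOneof are not found, and the wrapper delegates fields to an underlying self._pb (visible in the proto/message.py frames above). The classes below are a minimal stand-in, not the real proto-plus code, illustrating the failure mode and the workaround of reaching the raw message:

```python
class RawPb:
    """Stand-in for a raw protobuf TypedValue with a populated oneof."""

    double_value = 3.14

    def WhichOneof(self, oneof_name):
        return "double_value"


class ProtoPlusWrapper:
    """Sketch of a proto-plus wrapper's attribute resolution: known field
    names are delegated to the raw message, anything else raises
    AttributeError (mirroring the proto/message.py frames above)."""

    _fields = frozenset({"double_value"})

    def __init__(self, pb):
        self._pb = pb

    def __getattr__(self, key):
        if key in type(self)._fields:
            return getattr(self._pb, key)
        raise AttributeError(str(KeyError(key)))


typed_value = ProtoPlusWrapper(RawPb())
# typed_value.WhichOneof("value") -> AttributeError: "'WhichOneof'"
# Workaround: call the method on the underlying raw message instead.
value_type = typed_value._pb.WhichOneof("value")
```

In the real library the equivalent workaround is to call WhichOneof on the wrapper's underlying _pb (as _dataframe.py was later updated to do when migrating to the v2 proto-plus types).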