Quota Monitoring and Alerting

An easy-to-deploy Looker Studio Dashboard with alerting capabilities, showing usage and quota limits in an organization or folder.

Google Cloud enforces quotas on resource usage for project owners, setting a limit on how much of a particular Google Cloud resource your project can use. Each quota limit represents a specific countable resource, ranging from the number of API requests made per day to the number of load balancers used concurrently by your application.

Quotas are enforced for a variety of reasons:

  • To protect the community of Google Cloud users by preventing unforeseen spikes in usage.
  • To help you manage resources. For example, you can set your own limits on service usage while developing and testing your applications.

We are introducing a new custom quota monitoring and alerting solution for Google Cloud customers.

1. Summary

Quota Monitoring Solution is a stand-alone application providing an easy-to-deploy Looker Studio dashboard with alerting capabilities that shows all usage and quota limits in an organization or folder.

1.1 Four Initial Features

key-features

*The data refresh rate depends on the frequency configured for running the application.

2. Architecture

architecture

The architecture is built using Google Cloud managed services - Cloud Functions, Pub/Sub, Dataflow and BigQuery.

  • The solution is architected to scale using Pub/Sub.
  • Cloud Scheduler is used to trigger Cloud Functions. It also serves as the user interface for configuring the frequency, parent nodes, alert threshold and email IDs. A parent node can be an organization ID, a folder ID, a list of organization IDs or a list of folder IDs.
  • Cloud Functions are used to scan quotas across projects for the configured parent node.
  • BigQuery is used to store data.
  • The alert threshold applies across all metrics.
  • Alerts can be received by email, mobile app, PagerDuty, SMS, Slack, webhooks and Pub/Sub. A Cloud Monitoring custom log metric is used to create the alerts.
  • Easy to get started and deploy with the Looker Studio dashboard. In addition to Looker Studio, other visualization tools can be configured.
  • The Looker Studio report can be scheduled to be emailed to the appropriate team for weekly or daily reporting.

3. Configuring Quota Monitoring and Alerting

configuration

  1. Upload a CSV file with the columns: project_id,email_id,app_code,dashboard_url

  2. For applications with more than one project, the project_id column can take a string containing multiple project IDs separated by "|". Reference: CSV file. A sample file is sketched after this list.

  3. Note: a project will have only one app code, but an app code can have more than one project.

    e.g. If you have two rows in the csv file:

    edge-retail-374401|pub-sub-example-394521, appcode1

    edge-retail-374401, appcode2

    edge-retail-374401 will end up with appcode2.

  4. Cloud Scheduler will trigger configAppAlerts for each app code in the CSV, which will:

    • Create a custom log metric
    • Create a notification channel
    • Create an alert using the custom log metric and notification channel
    • Upload all data to BigQuery
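
For reference, here is a minimal sketch of what the upload file might look like, using the columns from step 1. The email addresses and dashboard URLs are purely hypothetical, and the exact format (including whether a header row is expected) should be checked against the reference CSV file linked above.

    project_id,email_id,app_code,dashboard_url
    edge-retail-374401|pub-sub-example-394521,team-a@example.com,appcode1,https://lookerstudio.google.com/reporting/example-1
    edge-retail-374401,team-b@example.com,appcode2,https://lookerstudio.google.com/reporting/example-2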

4. Deployment Guide


4.1 Prerequisites

  1. Host Project - A project where the BigQuery instance, Cloud Functions and Cloud Scheduler will be deployed, for example Project A.

  2. Target Node - The organization, folder or project that will be scanned for quota metrics, for example Org A and Folder A.

  3. Project Owner role on host Project A. IAM Admin role in target Org A and target Folder A.

  4. Google Cloud SDK is installed. Detailed instructions to install the SDK are here. See the Getting Started page for an introduction to using gcloud and Terraform.

  5. Terraform version >= 0.14.6 installed. Instructions to install Terraform are here.

    • Verify terraform version after installing.
    terraform -version

    The output should look like:

    Terraform v0.14.6
    + provider registry.terraform.io/hashicorp/google v3.57.0

    Note - Minimum required version v0.14.6. Lower terraform versions may not work.

4.2 Initial Setup

  1. On your local workstation, create a new directory in which to run Terraform and store the credential file

    mkdir <directory name like quota-monitoring-dashboard>
    cd <directory name>
  2. Set default project in config to host project A

    gcloud config set project <HOST_PROJECT_ID>

    The output should look like:

    Updated property [core/project].
  3. Ensure that all installed gcloud components are up to date on the local workstation.

    gcloud components update
  4. Cloud Scheduler depends on an App Engine application. Create an App Engine application in the host project, replacing <region> with one of the regions where App Engine is available (list here).

    gcloud app create --region=<region>

    Note: Cloud Scheduler (below) needs to be in the same region as App Engine. Use the same region in terraform as mentioned here.

    The output should look like:

    You are creating an app for project [quota-monitoring-project-1].
    WARNING: Creating an App Engine application for a project is irreversible and the region
    cannot be changed. More information about regions is at
    <https://cloud.google.com/appengine/docs/locations>.
    
    Creating App Engine application in project [quota-monitoring-project-1] and region [us-east1]....done.
    
    Success! The app is now created. Please use `gcloud app deploy` to deploy your first app.
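
Tip: the regions where App Engine is available can also be listed directly from the command line:

    gcloud app regions list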

4.3 Create Service Account

  1. On your local workstation, set up environment variables. Replace the name of the service account in the commands below as needed.

    export DEFAULT_PROJECT_ID=$(gcloud config get-value core/project 2> /dev/null)
    export SERVICE_ACCOUNT_ID="sa-"$DEFAULT_PROJECT_ID
    export DISPLAY_NAME="sa-"$DEFAULT_PROJECT_ID
  2. Verify host project Id.

    echo $DEFAULT_PROJECT_ID
  3. Create Service Account

    gcloud iam service-accounts create $SERVICE_ACCOUNT_ID --description="Service Account to scan quota usage" --display-name=$DISPLAY_NAME

    The output should look like:

    Created service account [sa-quota-monitoring-project-1].
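
Optionally, confirm that the service account now exists:

    gcloud iam service-accounts list --filter="email:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com"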

4.4 Grant Roles to Service Account

4.4.1 Grant Roles in the Host Project

The following roles need to be added to the Service Account in the host project i.e. Project A:

  • BigQuery
    • BigQuery Data Editor
    • BigQuery Job User
  • Cloud Functions
    • Cloud Functions Admin
  • Cloud Scheduler
    • Cloud Scheduler Admin
  • Pub/Sub
    • Pub/Sub Admin
  • Run Terraform
    • Service Account User
  • Enable APIs
    • Service Usage Admin
  • Storage Bucket
    • Storage Admin
  • Scan Quotas
    • Cloud Asset Viewer
    • Compute Network Viewer
    • Compute Viewer
  • Monitoring
    • Notification Channel Editor
    • Alert Policy Editor
    • Viewer
    • Metric Writer
  • Logs
    • Logs Configuration Writer
    • Log Writer
  • IAM
    • Security Admin
  1. Run the following commands to assign the roles (a consolidated loop version is sketched after the individual commands):

    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/bigquery.dataEditor" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/bigquery.jobUser" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/cloudfunctions.admin" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/cloudscheduler.admin" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/pubsub.admin" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/iam.serviceAccountUser" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/storage.admin" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/serviceusage.serviceUsageAdmin" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/cloudasset.viewer" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/compute.networkViewer" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/compute.viewer" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/monitoring.notificationChannelEditor" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/monitoring.alertPolicyEditor" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/logging.configWriter" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/logging.logWriter" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/monitoring.viewer" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/monitoring.metricWriter" --condition=None
    
    gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/iam.securityAdmin" --condition=None
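
Equivalently, the same bindings can be applied with a small loop over the role IDs listed above; this is only a convenience sketch, not a required step:

    # Apply the same role bindings as the individual commands above.
    for ROLE in bigquery.dataEditor bigquery.jobUser cloudfunctions.admin \
        cloudscheduler.admin pubsub.admin iam.serviceAccountUser storage.admin \
        serviceusage.serviceUsageAdmin cloudasset.viewer compute.networkViewer \
        compute.viewer monitoring.notificationChannelEditor \
        monitoring.alertPolicyEditor logging.configWriter logging.logWriter \
        monitoring.viewer monitoring.metricWriter iam.securityAdmin; do
      gcloud projects add-iam-policy-binding $DEFAULT_PROJECT_ID \
        --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" \
        --role="roles/$ROLE" --condition=None
    done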

4.4.2 Grant Roles in the Target Folder

SKIP THIS STEP IF THE FOLDER IS NOT THE TARGET TO SCAN QUOTA

If you want to scan projects in a folder, add the following roles to the service account created in the previous step on the target Folder A:

  • Cloud Asset Viewer
  • Compute Network Viewer
  • Compute Viewer
  • Folder Viewer
  • Monitoring Viewer
  1. Set target folder id

    export TARGET_FOLDER_ID=<target folder id like 38659473572>
  2. Run the following commands to add the roles to the service account:

    gcloud alpha resource-manager folders add-iam-policy-binding  $TARGET_FOLDER_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/cloudasset.viewer"
    
    gcloud alpha resource-manager folders add-iam-policy-binding  $TARGET_FOLDER_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/compute.networkViewer"
    
    gcloud alpha resource-manager folders add-iam-policy-binding  $TARGET_FOLDER_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/compute.viewer"
    
    gcloud alpha resource-manager folders add-iam-policy-binding  $TARGET_FOLDER_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/resourcemanager.folderViewer"
    
    gcloud alpha resource-manager folders add-iam-policy-binding  $TARGET_FOLDER_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/monitoring.viewer"

    Note: If any of these commands fail, run them again.

4.4.3 Grant Roles in the Target Organization

SKIP THIS STEP IF THE ORGANIZATION IS NOT THE TARGET

If you want to scan projects in the organization, add the following roles to the service account created in the previous step on the target Org A:

  • Cloud Asset Viewer
  • Compute Network Viewer
  • Compute Viewer
  • Org Viewer
  • Folder Viewer
  • Monitoring Viewer

org-service-acccount-roles

  1. Set target organization id

    export TARGET_ORG_ID=<target org id ex. 38659473572>
  2. Run the following commands to add the roles to the service account:

    gcloud organizations add-iam-policy-binding  $TARGET_ORG_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com" --role="roles/cloudasset.viewer" --condition=None
    
    gcloud organizations add-iam-policy-binding  $TARGET_ORG_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com"  --role="roles/compute.networkViewer" --condition=None
    
    gcloud organizations add-iam-policy-binding  $TARGET_ORG_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com"  --role="roles/compute.viewer" --condition=None
    
    gcloud organizations add-iam-policy-binding  $TARGET_ORG_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com"  --role="roles/resourcemanager.folderViewer" --condition=None
    
    gcloud organizations add-iam-policy-binding  $TARGET_ORG_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com"  --role="roles/resourcemanager.organizationViewer" --condition=None
    
    gcloud organizations add-iam-policy-binding  $TARGET_ORG_ID --member="serviceAccount:$SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com"  --role="roles/monitoring.viewer" --condition=None

4.5 Download the Source Code

  1. Clone the Quota Monitoring Solution repo

    git clone https://github.com/google/quota-monitoring-solution.git quota-monitorings-solution
  2. Change directories into the Terraform example

    cd ./quota-monitorings-solution/terraform/example

4.6 Set OAuth Token Using Service Account Impersonation

Impersonate your host project service account and set an environment variable with a temporary token to authenticate Terraform. Make sure your user has the Service Account Token Creator role so that it can create short-lived credentials.

gcloud config set auth/impersonate_service_account \
    $SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com

export GOOGLE_OAUTH_ACCESS_TOKEN=$(gcloud auth print-access-token)
  • TIP: If you get an error saying you are unable to impersonate the service account, unset the impersonation, grant the role as shown below, then try again.

    # unset impersonation
    gcloud config unset auth/impersonate_service_account
    
    # set your current authenticated user as var
    PROJECT_USER=$(gcloud config get-value core/account)
    
    # grant IAM role serviceAccountTokenCreator
    gcloud iam service-accounts add-iam-policy-binding $SERVICE_ACCOUNT_ID@$DEFAULT_PROJECT_ID.iam.gserviceaccount.com \
        --member user:$PROJECT_USER \
        --role roles/iam.serviceAccountTokenCreator \
        --condition=None

4.7 Configure Terraform

  1. Verify that you have these 3 files in your local directory:

    • main.tf
    • variables.tf
    • terraform.tfvars
  2. Open the terraform.tfvars file in your favorite editor and change the values of the variables.

    vi terraform.tfvars
  3. For region, use the same region as used for App Engine in earlier steps.

    The variables source_code_base_url, qms_version, source_code_zip and source_code_notification_zip on the QMS module are used to download the source for the QMS Cloud Functions from the latest GitHub release.

    To deploy the latest unreleased code from a local clone of the QMS repository, set qms_version to main.
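
    For illustration only, assuming App Engine was created in us-east1 and you want a tagged release, the relevant lines might look like the sketch below. These values are assumptions about an example environment; keep the other variables in the file as described in variables.tf.

    region      = "us-east1"   # must match the App Engine region from section 4.2
    qms_version = "v4.4.0"     # or "main" to deploy unreleased code from a local clone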

4.8 Run Terraform

  1. Run terraform commands

    • terraform init
    • terraform plan
    • terraform apply
      • On Prompt Enter a value: yes
  2. This will:

    • Enable required APIs
    • Create all resources and connect them.

    Note: If terraform fails, run terraform plan and terraform apply again.

  3. Stop impersonating service account (when finished with terraform)

    gcloud config unset auth/impersonate_service_account

4.9 Testing

  1. Initiate first job run in Cloud Scheduler.

    Console

    Click ‘Run Now’ on the Cloud Scheduler job.

    Note: The status of the ‘Run Now’ button changes to ‘Running’ for a fraction of a second.

    run-cloud-scheduler

    Terminal

    gcloud scheduler jobs run quota-monitoring-cron-job --location <region>
    gcloud scheduler jobs run quota-monitoring-app-alert-config --location <region>
  2. To verify that the program ran successfully, check the BigQuery table (a command-line check is also sketched below). It might take a few minutes for the data to load into BigQuery, and the execution time depends on the number of projects to scan. A sample BigQuery table will look like this: test-bigquery-table
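
    A command-line spot check is also possible. The dataset and table names below are taken from the dashboard queries in the next section; adjust them if your deployment differs:

    # Row count and most recent scan timestamp in the quota monitoring table.
    bq query --use_legacy_sql=false "
      SELECT COUNT(*) AS row_count, MAX(added_at) AS latest_scan
      FROM \`${DEFAULT_PROJECT_ID}.quota_monitoring_dataset.quota_monitoring_table\`"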

4.10 Looker Studio Dashboard setup

  1. Go to the Looker Studio dashboard template. A Looker Studio dashboard will look like this: ds-updated-quotas-dashboard

  2. Make a copy of the template using the copy icon in the top bar (top right corner) ds-dropdown-copy

  3. Click the ‘Copy Report’ button without changing the data source options ds-copy-report-fixed-new-data-source

  4. This will create a copy of the report and open it in Edit mode. If it does not, click the ‘Edit’ button in the top right corner of the copied template: ds-edit-mode-updated

  5. Select any one table; in the example below, ‘Disks Total GB - Quotas’ is selected. In the ‘Data’ tab on the right panel, click the ‘edit data source’ icon ds_edit_data_source. This opens the data source details: ds_datasource_config_step_1

  6. Replace the BigQuery project ID, dataset ID and table name to match your deployment. If you assigned app codes, add the list of project IDs from the CSV file upload to the WHERE clause. Verify the query by running it in the BigQuery editor to check accuracy and syntax:

    # For the org-level dashboard, use the following query
    SELECT
        project_id,
        added_at,
        region,
        quota_metric,
        CASE
            WHEN CAST(quota_limit AS STRING) ='9223372036854775807' THEN 'unlimited'
        ELSE
            CAST(quota_limit AS STRING)
        END AS str_quota_limit,
        SUM(current_usage) AS current_usage,
        ROUND((SAFE_DIVIDE(CAST(SUM(current_usage) AS BIGNUMERIC), CAST(quota_limit AS BIGNUMERIC))*100),2) AS current_consumption,
        SUM(max_usage) AS max_usage,
        ROUND((SAFE_DIVIDE(CAST(SUM(max_usage) AS BIGNUMERIC), CAST(quota_limit AS BIGNUMERIC))*100),2) AS max_consumption
    FROM
        (
            SELECT
                *,
                RANK() OVER (PARTITION BY project_id, region, quota_metric ORDER BY added_at DESC) AS latest_row
            FROM
                `[YOUR_PROJECT_ID].quota_monitoring_dataset.quota_monitoring_table`
        ) t
    WHERE
        latest_row=1
        AND current_usage IS NOT NULL
        AND quota_limit IS NOT NULL
        AND current_usage != 0
        AND quota_limit != 0
        GROUP BY
        project_id,
        region,
        quota_metric,
        added_at,
        quota_limit
    
    # For the app-level dashboard, use the following query; replace the PROJECT_ID placeholders with the project_ids from the CSV file upload
    SELECT 
        project_id,
        added_at,
        region,
        quota_metric,
        CASE
            WHEN CAST(quota_limit AS STRING) ='9223372036854775807' THEN 'unlimited'
        ELSE
            CAST(quota_limit AS STRING)
        END AS str_quota_limit,
        SUM(current_usage) AS current_usage,
        ROUND((SAFE_DIVIDE(CAST(SUM(current_usage) AS BIGNUMERIC), CAST(quota_limit AS BIGNUMERIC))*100),2) AS current_consumption,
        SUM(max_usage) AS max_usage,
        ROUND((SAFE_DIVIDE(CAST(SUM(max_usage) AS BIGNUMERIC), CAST(quota_limit AS BIGNUMERIC))*100),2) AS max_consumption
    FROM
        (
            SELECT
                *,
                RANK() OVER (PARTITION BY project_id, region, quota_metric ORDER BY added_at DESC) AS latest_row
            FROM
                `[YOUR_PROJECT_ID].quota_monitoring_dataset.quota_monitoring_table`
        ) t
    WHERE
        latest_row=1
        AND current_usage IS NOT NULL
        AND quota_limit IS NOT NULL
        AND current_usage != 0
        AND quota_limit != 0
        AND project_id IN ('[PROJECT_ID1]', '[PROJECT_ID2]', ...)
        GROUP BY
        project_id,
        region,
        quota_metric,
        added_at,
        quota_limit
  7. After making sure that the query returns results, replace it in the Looker Studio data source and click the ‘Reconnect’ button in the data source pane. ds_data_source_config_step_3

  8. In the next window, click on the ‘Done’ button. ds_data_source_config_step_2

  9. Once the data source is configured, click the ‘View’ button in the top right corner. Note: you can make additional layout changes in ‘Edit’ mode, such as which metrics are displayed on the dashboard, the color shades for the consumption column, and the number of rows for each table. ds-switch-to-view-mode

4.11 Scheduled Reporting

Quota monitoring reports can be scheduled from the Looker Studio dashboard using ‘Schedule email delivery’. A snapshot of the Looker Studio dashboard will be delivered as a PDF report to the configured email IDs.

ds-schedule-email-button

4.12 Alerting

Alerts about services nearing their quota limits can be configured to be sent via email as well as the following external services:

  • Slack
  • PagerDuty
  • SMS
  • Custom Webhooks

4.12.1 Slack Configuration

To configure notifications to be sent to a Slack channel, you must have the Monitoring Notification Channel Editor role on the host project.

4.12.1.1 Create Notification Channel
  1. In the Cloud Console, use the project picker to select your Google Cloud project, and then select Monitoring, or click the link here: Go to Monitoring
  2. In the Monitoring navigation pane, click Alerting.
  3. Click Edit notification channels.
  4. In the Slack section, click Add new. This brings you to the Slack sign-in page:
    • Select your Slack workspace.
    • Click Allow to enable Google Cloud Monitoring access to your Slack workspace. This action takes you back to the Monitoring configuration page for your notification channel.
    • Enter the name of the Slack channel you want to use for notifications.
    • Enter a display name for the notification channel.
  5. In your Slack workspace:
    • Invite the Monitoring app to the channel by sending the following message in the channel:
    • /invite @Google Cloud Monitoring
    • Be sure you invite the Monitoring app to the channel you specified when creating the notification channel in Monitoring.
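
Optionally, you can verify from the command line that the new Slack notification channel exists in the host project:

    gcloud beta monitoring channels list --project=$DEFAULT_PROJECT_ID
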
4.12.1.2 Configuring Alerting Policy
  1. In the Alerting section, click on Policies.
  2. Find the Policy named ‘Resource Reaching Quotas’. This policy was created via Terraform code above.
  3. Click Edit.
  4. It opens an Edit Alerting Policy page. Leave the current condition metric as is, and click on Next.
  5. In the Notification Options, Select the Slack Channel that you created above.
  6. Click on Save.

You should now receive alerts in your Slack channel whenever a quota reaches the specified threshold limit.

5. Release Notes

v4.0.0: Quota Monitoring across GCP services

New

  • The new version provides visibility into Quotas across various GCP services beyond the original GCE (Compute).
  • New Looker Studio Dashboard template reporting metrics across GCP services

Known Limitations

  • The records are grouped by hour. The scheduler needs to be configured to start running, preferably at the beginning of the hour.
  • Out of the box, the solution is configured to scan quotas once every day. The SQL query that builds the dashboard uses the current date to filter the records. If you change the frequency, update the query so that it correctly reflects the latest data.

v4.4.0

New in v4.4.0

  • The new version includes a fix that converts the data pull process to use the Monitoring Query Language (MQL). This allows QMS to pull the limit and current usage at exactly the same time, so reporting queries can be more tightly scoped, eliminating over-reporting problems.

    To upgrade existing installations:

    • Re-run Terraform to update the Cloud Functions and the scheduled query
    • Update the SQL used in the Looker Studio dashboard according to Step #7 of 4.10 Looker Studio Dashboard setup.

6. What is Next

  1. Graphs (Quota utilization over a period of time)
  2. Search project, folder, org, region
  3. Threshold configurable for each metric

7. Getting Support

Quota Monitoring Solution is a project based on open source contributions. We'd love for you to report issues, file feature requests, and send pull requests (see Contributing). Quota Monitoring Solution is not officially covered by Google Cloud product support.

8. Contributing

