Google · 0.1.0

BigQuery

A Dock-hosted BigQuery starter integration for Harbor. Importing it creates a local http_api port shell plus a Harbor-safe BigQuery action set that uses Harbor's Action Model for project-scoped path inputs and an approval-friendly SQL query action. OAuth bearer tokens stay local to Harbor Node.

Community · Included · Tags: analytics, bigquery, data, google, sql
Category: Data
Action templates: 16
Workflow templates: 1
Publisher: Google

Service Overview

BigQuery is Google's official cloud data warehouse API, covering analytics, SQL, and large-scale data workloads. This Hub entry keeps the service description, setup links, and Harbor-safe import path together so operators can review the integration before applying local credentials and policies.

Service: BigQuery
Service type: Cloud data warehouse API
Base URL: https://bigquery.googleapis.com/

What This Port Does

  • List datasets in a specific Google Cloud project using typed path and pagination inputs.
  • Run a BigQuery SQL query synchronously using a bounded JSON request body for the target project.
  • Fetch metadata and status for a specific BigQuery job using typed project, job, and optional location inputs.
  • Keep service credentials local to Harbor Node so approval, execution, and auditing stay inside the Harbor ecosystem.
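
The actions above map onto BigQuery's public v2 REST endpoints. A hedged sketch of that mapping (the slugs are illustrative Harbor action names; the methods and paths are the documented BigQuery REST API routes):

```json
{
  "list-datasets": { "method": "GET",  "path": "/bigquery/v2/projects/{projectId}/datasets" },
  "run-query":     { "method": "POST", "path": "/bigquery/v2/projects/{projectId}/queries" },
  "get-job":       { "method": "GET",  "path": "/bigquery/v2/projects/{projectId}/jobs/{jobId}" }
}
```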

Operator Setup

  • Create or confirm a Google account, workspace, or tenant with the API access needed for the actions you plan to enable.
  • Import this Hub entry into Harbor, then verify the service base URL and connection settings before testing actions.
  • Store the required Bearer token locally in Harbor Node using the Authorization header, then review automatic versus approval-required actions before publish.
  • Use the vendor setup guide linked below to create service credentials, scopes, or app registration settings that match your Harbor policy.
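
As one possible shape for the locally stored credential, a minimal sketch, assuming a simple header-style secret entry in Harbor Node (the exact schema is Harbor-specific and not documented here; the token value is a placeholder, never commit a real one):

```json
{
  "auth": {
    "type": "bearer",
    "header": "Authorization",
    "value": "Bearer <oauth-access-token>"
  }
}
```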

Install

Use in Harbor UI or the import-by-URL route.
https://hub.breakwaterharbor.net/p/google/bigquery
Local shell: http_api
Actions: 16
Required tier: Community
Secrets imported: Never

Actions

List Datasets (JSON)

{
  "slug": "list-datasets",
  "label": "List Datasets",
  "description": "List datasets in a specific Google Cloud project using typed path and pagination inputs.",
  "method": "GET",
  "path": "/bigquery/v2/projects/{projectId}/datasets",
  "inputs": {
    "parameters": [
      {
        "name": "projectId",
        "in": "path",
        "required": true,
        "schema": {
          "type": "string",
          "minLength": 1
        },
        "description": "Google Cloud project ID that owns the datasets."
      },
      {
        "name": "maxResults",
        "in": "query",
        "schema": {
          "type": "integer",
          "minimum": 1,
          "maximum": 1000,
          "default": 50
        },
        "description": "Maximum number of datasets to return."
      },
      {
        "name": "pageToken",
        "in": "query",
        "schema": {
          "type": "string"
        },
        "description": "Continuation token from a previous dataset list response."
      },
      {
        "name": "all",
        "in": "query",
        "schema": {
          "type": "boolean"
        },
        "description": "Whether to include hidden datasets."
      }
    ]
  },
  "approvalMode": "automatic",
  "requestBodyMode": "none",
  "resultMode": "json_summary"
}
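
The approval-friendly SQL query action mentioned above could follow the same template. A hedged sketch, assuming a `run-query` slug and a `body` input section mirroring BigQuery's documented `jobs.query` request fields; the `approvalMode` and `requestBodyMode` values here are assumptions, since only `automatic` and `none` appear in this entry:

```json
{
  "slug": "run-query",
  "label": "Run SQL Query",
  "description": "Run a BigQuery SQL query synchronously using a bounded JSON request body for the target project.",
  "method": "POST",
  "path": "/bigquery/v2/projects/{projectId}/queries",
  "inputs": {
    "parameters": [
      {
        "name": "projectId",
        "in": "path",
        "required": true,
        "schema": { "type": "string", "minLength": 1 },
        "description": "Google Cloud project ID billed for the query."
      }
    ],
    "body": {
      "type": "object",
      "required": ["query"],
      "properties": {
        "query": { "type": "string", "description": "Standard SQL query text." },
        "useLegacySql": { "type": "boolean", "default": false },
        "maxResults": { "type": "integer", "minimum": 1, "maximum": 1000 },
        "timeoutMs": { "type": "integer", "minimum": 0 }
      }
    }
  },
  "approvalMode": "approval_required",
  "requestBodyMode": "json",
  "resultMode": "json_summary"
}
```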

Workflows

Project Query Check: a future workflow template for running a reviewed SQL check and then inspecting the resulting job.
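
That workflow would chain the SQL query action with the job metadata lookup described under "What This Port Does". A hedged sketch of that job action, following the same template; the slug and labels are illustrative, while the method, path, and `location` query parameter come from BigQuery's documented `jobs.get` endpoint:

```json
{
  "slug": "get-job",
  "label": "Get Job",
  "description": "Fetch metadata and status for a specific BigQuery job using typed project, job, and optional location inputs.",
  "method": "GET",
  "path": "/bigquery/v2/projects/{projectId}/jobs/{jobId}",
  "inputs": {
    "parameters": [
      {
        "name": "projectId",
        "in": "path",
        "required": true,
        "schema": { "type": "string", "minLength": 1 },
        "description": "Google Cloud project that owns the job."
      },
      {
        "name": "jobId",
        "in": "path",
        "required": true,
        "schema": { "type": "string", "minLength": 1 },
        "description": "Identifier of the job to inspect."
      },
      {
        "name": "location",
        "in": "query",
        "schema": { "type": "string" },
        "description": "Geographic location of the job, if it runs outside the US or EU multi-regions."
      }
    ]
  },
  "approvalMode": "automatic",
  "requestBodyMode": "none",
  "resultMode": "json_summary"
}
```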