BigQuery

Create an integration to manage access to BigQuery datasets


Google BigQuery is a fast, scalable, secure, fully managed cloud data warehouse that serves as a primary data store for vast datasets and analytic workloads. It enables businesses to analyze their data using standard SQL and existing business intelligence tools, promoting insightful decision-making and integration with various Google Cloud services.

Through this integration, Apono helps you securely manage access to your BigQuery datasets.


Prerequisites

| Item | Description |
| --- | --- |
| Apono Connector | On-prem connection serving as a bridge between a Google Cloud instance and Apono |
| Apono Premium | Apono plan providing the most features and dedicated account support |
| BigQuery Information | Dataset name for the BigQuery dataset to be integrated |
| Google Cloud Command Line Interface (Google Cloud CLI) | Command-line interface used to manage Google Cloud resources |
| Google Cloud Information | Information for your Google Cloud instance associated with the Apono connector. Google-defined: Organization ID (Organization), Project ID. User-defined: Service Account Name |


Associate BigQuery dataset permissions

To integrate BigQuery, you must create a custom role with BigQuery dataset permissions and assign the role to the service account for the Apono connector.

The instructions in this section use the Google Cloud CLI. However, you can also create a custom role through the Google Console, an IAM client library, or the REST API. Additionally, you can assign the custom role to the Apono connector through the Google Console.
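For reference, the REST path looks like the following sketch. It is illustrative only: the role ID apono_bigquery, title, and description are placeholder values of our choosing, and the command assumes you are already authenticated with gcloud and have set GCP_PROJECT_ID as in the steps below.

    # Create the same custom role via the IAM REST API (projects.roles.create).
    # The role ID, title, and description are illustrative placeholders.
    curl -X POST \
      -H "Authorization: Bearer $(gcloud auth print-access-token)" \
      -H "Content-Type: application/json" \
      "https://iam.googleapis.com/v1/projects/${GCP_PROJECT_ID}/roles" \
      -d '{
            "roleId": "apono_bigquery",
            "role": {
              "title": "Apono BigQuery",
              "description": "Lets the Apono connector manage BigQuery dataset access",
              "includedPermissions": ["bigquery.datasets.get", "bigquery.datasets.update"],
              "stage": "ALPHA"
            }
          }'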


Follow these steps to associate the permissions through the Google Cloud CLI:

  1. In your shell environment, log in to Google Cloud and enable the required APIs.

    gcloud auth login
    gcloud services enable cloudresourcemanager.googleapis.com
    gcloud services enable iam.googleapis.com

  2. Set the environment variables.

    Project:

    export GCP_PROJECT_ID=<GOOGLE_PROJECT_ID>
    export SERVICE_ACCOUNT_NAME=<SERVICE_ACCOUNT_NAME>

    Organization:

    export GCP_ORGANIZATION_ID=<GOOGLE_ORGANIZATION_ID>
    export GCP_PROJECT_ID=<GOOGLE_PROJECT_ID>
    export SERVICE_ACCOUNT_NAME=<SERVICE_ACCOUNT_NAME>

  3. Create the custom role. Be sure to replace the placeholders (<ROLE_ID>, <TITLE>, and <DESCRIPTION>) with actual values of your choosing for the role ID, title, and description of the role.

    Project:

    gcloud iam roles create <ROLE_ID> --project=$GCP_PROJECT_ID --title="<TITLE>" --description="<DESCRIPTION>" --permissions=bigquery.datasets.get,bigquery.datasets.update --stage=ALPHA

    Organization:

    gcloud iam roles create <ROLE_ID> --organization=$GCP_ORGANIZATION_ID --title="<TITLE>" --description="<DESCRIPTION>" --permissions=bigquery.datasets.get,bigquery.datasets.update --stage=ALPHA

  4. Using the role ID defined in the previous step, assign the custom role to the Apono connector service account.

    Project:

    gcloud projects add-iam-policy-binding $GCP_PROJECT_ID --member="serviceAccount:$SERVICE_ACCOUNT_NAME@$GCP_PROJECT_ID.iam.gserviceaccount.com" --role="projects/$GCP_PROJECT_ID/roles/<ROLE_ID>"

    Organization:

    gcloud organizations add-iam-policy-binding $GCP_ORGANIZATION_ID --member="serviceAccount:$SERVICE_ACCOUNT_NAME@$GCP_PROJECT_ID.iam.gserviceaccount.com" --role="organizations/$GCP_ORGANIZATION_ID/roles/<ROLE_ID>"
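After running the commands, you can verify that the role and binding are in place. The following is a quick check for a project-level integration, where <ROLE_ID> is whatever role ID you chose above:

    # Confirm the custom role exists and carries the two BigQuery permissions
    gcloud iam roles describe <ROLE_ID> --project=$GCP_PROJECT_ID

    # Confirm the connector service account holds the role on the project
    gcloud projects get-iam-policy $GCP_PROJECT_ID \
        --flatten="bindings[].members" \
        --filter="bindings.members:serviceAccount:$SERVICE_ACCOUNT_NAME@$GCP_PROJECT_ID.iam.gserviceaccount.com" \
        --format="table(bindings.role)"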

Integrate BigQuery

Follow these steps to complete the integration:

  1. On the Catalog tab, click Google BigQuery. The Connect Integration page appears.

  2. Under Discovery, click Next. The Apono connector section expands.

  3. From the dropdown menu, select a connector. If the desired connector is not listed, click + Add new connector and follow the instructions for creating a GCP connector.

  4. Click Next. The Integration Config section expands.

  5. Define the Integration Config settings.

    | Setting | Description |
    | --- | --- |
    | Integration Name | Unique, alphanumeric, user-friendly name used to identify this integration when constructing an access flow |
    | Project ID | GCP project where BigQuery is enabled |
    | Dataset Name | Name of the BigQuery dataset to integrate (see the lookup tip after these steps) |

  6. Click Next. The Custom Access Details section expands.

  7. Define the Get more with Apono settings.

    | Setting | Description |
    | --- | --- |
    | Credentials Rotation | (Optional) Number of days after which the database credentials must be rotated. Learn more about the Credentials Rotation Policy. |
    | User cleanup after access is revoked (in days) | (Optional) Number of days after access has been revoked that the user should be deleted. Learn more about Periodic User Cleanup & Deletion. |
    | Custom Access Details | (Optional) Instructions explaining how to access this integration's resources. Upon accessing an integration, a message with these instructions is displayed to end users in the User Portal. The message may include up to 400 characters. To view the message as it appears to end users, click Preview. |
    | Integration Owner | (Optional) Fallback approver if no resource owner is found. To define one or several integration owners: (1) From the Attribute dropdown menu, select User or Group under the relevant identity provider (IdP) platform. (2) From the Value dropdown menu, select one or multiple users or groups. NOTE: When Resource Owner is defined, an Integration Owner must be defined. |
    | Resource Owner | (Optional) Group or role responsible for managing access approvals or rejections for the resource. To define one or several resource owners: (1) Enter a Key name. This value is the name of the tag created in your cloud environment. (2) From the Attribute dropdown menu, select an attribute under the IdP platform to which the key name is associated. Apono will use the value associated with the key (tag) to identify the resource owner. When you update the membership of the group or role in your IdP platform, this change is also reflected in Apono. NOTE: When this setting is defined, an Integration Owner must also be defined. |

  8. Click Confirm.
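Tip: If you are unsure of the exact value to enter for Dataset Name in the Integration Config step, you can list the datasets in the target project with the bq tool that ships with the Google Cloud CLI. A quick sketch, assuming you are authenticated and GCP_PROJECT_ID is set as in the earlier steps:

    # List all datasets in the project configured for this integration
    bq ls --project_id=$GCP_PROJECT_ID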

Now that you have completed this integration, you can create access flows that grant permission to your BigQuery datasets.
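To see the effect of a grant on an integrated dataset, you can inspect the dataset's access entries. A minimal sketch, assuming the bq CLI and jq are installed; the dataset name analytics is an illustrative placeholder:

    # Print the dataset's access entries; granted principals appear here
    bq show --format=prettyjson $GCP_PROJECT_ID:analytics | jq '.access'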
