GCP BigQuery

Prerequisites

  • A project in GCP

  • An IAM role with access to the BigQuery service (the sketch below shows one way to verify access)
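
A quick way to verify the key before configuring the forwarder is to list datasets with it. This is a minimal sketch assuming the google-cloud-bigquery Python package and a key file named key.json; both names are placeholders, not part of LogFlow.

    from google.cloud import bigquery
    from google.oauth2 import service_account

    # Load the service-account key that will later go into the json_key field.
    creds = service_account.Credentials.from_service_account_file("key.json")
    client = bigquery.Client(credentials=creds, project=creds.project_id)

    # Raises a permission error if the IAM role lacks BigQuery access.
    print(list(client.list_datasets(max_results=1)))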

Create BigQuery Forwarder

  • Navigate to the create forwarder page

  • Select the BigQuery forwarder

  • Provide the credentials for the IAM role as JSON in the json_key field

  • Provide the project_id, dataset_id, and table_id

  • Set table_expiration in hours; the table is deleted automatically once the expiration time passes

  • Enable the create_table toggle to define the schema for the table

  • Add the required columns for the table (the sketch after this list shows the equivalent BigQuery operations)

  • Provide a name for the forwarder and click Create
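
For reference, the create_table, table_expiration, and column settings map onto ordinary BigQuery table creation. The sketch below is a hand-written illustration using the google-cloud-bigquery client, not LogFlow's internal code; the project, dataset, table, and column names are placeholders.

    from datetime import datetime, timedelta, timezone
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")      # project_id
    table = bigquery.Table(
        "my-project.logflow_dataset.app_logs",          # dataset_id and table_id
        schema=[                                        # columns added in the UI
            bigquery.SchemaField("timestamp", "TIMESTAMP"),
            bigquery.SchemaField("level", "STRING"),
            bigquery.SchemaField("message", "STRING"),
        ],
    )
    # table_expiration: BigQuery deletes the table at this time.
    table.expires = datetime.now(timezone.utc) + timedelta(hours=24)
    client.create_table(table)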

Create BigQuery Forwarder with an existing table

  • Select the BigQuery forwarder

  • Provide the credentials for the IAM role as JSON in the json_key field

  • Provide the project_id, the existing dataset_id, and the existing table_id

  • Set table_expiration in hours; the table is deleted automatically once the expiration time passes

  • Disable the create_table toggle so the forwarder uses the existing table's schema

  • Provide a name for the forwarder and click Create
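
Since incoming log facets are matched against the existing table's columns (see the mapping section below), it can help to confirm the schema before creating the forwarder. A minimal sketch with placeholder names:

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")
    table = client.get_table("my-project.logflow_dataset.app_logs")
    for field in table.schema:
        print(field.name, field.field_type)   # e.g. "timestamp TIMESTAMP"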

Add Columns to a BigQuery Table

  • Navigate to the forwarders page

  • Click the edit icon on the BigQuery forwarder whose schema needs additional columns

  • Click Add Column to update the schema (the sketch after this list shows the equivalent schema update)
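
In BigQuery itself this is an additive schema update: the new column is appended, and existing columns and data are left untouched. A sketch of the equivalent client call, using a hypothetical trace_id column and placeholder names:

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")
    table = client.get_table("my-project.logflow_dataset.app_logs")

    # BigQuery only allows appending columns in a schema update.
    table.schema = list(table.schema) + [bigquery.SchemaField("trace_id", "STRING")]
    client.update_table(table, ["schema"])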

Mapping logs to a BigQuery table

When you forward your logs to BigQuery, the fields in each log are mapped to the columns in the table. By default, Apica maps each column in the forwarder's schema to the log facet of the same name and inserts the facet's value into that column.
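
As a hypothetical illustration of the default mapping, suppose a log event's facets match the column names used in the earlier sketches:

    # Incoming log event; each top-level facet matches a column name.
    log_event = {"timestamp": "2024-05-01T12:00:00Z",
                 "level": "error",
                 "message": "db timeout"}

    # Default mapping: each facet's value becomes the value of the
    # same-named column in the inserted row.
    columns = ["timestamp", "level", "message"]
    row = {col: log_event.get(col) for col in columns}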

To change the default mapping behavior, create a forward rule for the namespace and application that renames the facet to the column name. This helps you manage the mappings between the facets in your logs and the columns in the BigQuery table.
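
For example, if the logs carry a facet named level but the table's column is named severity, a rename in the forward rule bridges the two. Continuing the hypothetical sketch above:

    # Effect of a forward rule renaming the facet "level" to "severity":
    log_event["severity"] = log_event.pop("level")

    # The value now populates the table's "severity" column instead.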