GCP BigQuery
Prerequisites
A project in GCP
An IAM role (service account) that has access to the BigQuery service
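The JSON key for this IAM role (service account) is what you paste into the forwarder configuration later. The following is a minimal sketch, assuming the google-cloud-bigquery Python client and a key file named sa-key.json (both placeholders), to confirm the key can reach BigQuery before configuring the forwarder:

```python
# Minimal sketch: verify that the service-account JSON key can reach BigQuery.
# "sa-key.json" and the project ID are placeholders for your own values.
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file("sa-key.json")
client = bigquery.Client(project="my-gcp-project", credentials=credentials)

# Listing datasets is a quick way to confirm the role grants BigQuery access.
for dataset in client.list_datasets():
    print(dataset.dataset_id)
```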
Create a BigQuery Forwarder
1. Navigate to the create forwarder page.
2. Select the bigquery forwarder.
3. Provide the credentials for the IAM role as JSON in the json_key field.
4. Provide the project_id, dataset_id, and table_id.
5. Set the table_expiration in hours; the table is deleted after the specified expiration time.
6. Enable the create_table toggle to define the schema for the table (the sketch after these steps illustrates the equivalent table definition in BigQuery).
7. Add the required columns for the table.
8. Provide a name for the forwarder and click create.
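The forwarder creates the table for you when create_table is enabled; the sketch below only illustrates what those settings correspond to in BigQuery, namely a table with a defined schema and an expiration time. The project, dataset, table, and column names are placeholders.

```python
# Illustration of what the forwarder settings map to in BigQuery.
# project_id, dataset_id, table_id, and the columns are placeholders.
import datetime
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

# Columns added in the forwarder's schema editor become SchemaFields here.
schema = [
    bigquery.SchemaField("timestamp", "TIMESTAMP"),
    bigquery.SchemaField("namespace", "STRING"),
    bigquery.SchemaField("message", "STRING"),
]

table = bigquery.Table("my-gcp-project.my_dataset.my_table", schema=schema)

# table_expiration in hours: BigQuery deletes the table at this timestamp.
table.expires = datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(hours=24)

client.create_table(table)
```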
Create a BigQuery Forwarder with an existing table
1. Select the bigquery forwarder.
2. Provide the credentials for the IAM role as JSON in the json_key field.
3. Provide the project_id, the existing dataset_id, and the existing table_id.
4. Set the table_expiration in hours; the table is deleted after the specified expiration time.
5. Disable the create_table toggle, since the existing table already defines the schema (the sketch after these steps shows one way to check that schema).
6. Provide a name for the forwarder and click create.
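Before pointing the forwarder at an existing table, it can help to confirm that the table's schema matches the fields you plan to forward. A minimal sketch using the google-cloud-bigquery client; the table reference is a placeholder:

```python
# Sketch: inspect the schema of the existing table the forwarder will write to.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")
table = client.get_table("my-gcp-project.my_dataset.my_table")  # placeholder reference

for field in table.schema:
    print(field.name, field.field_type, field.mode)
print("expires:", table.expires)
```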
Add a Column to a BigQuery Table
1. Navigate to the forwarders page.
2. Click the edit icon on the BigQuery forwarder whose schema needs additional columns.
3. Click add column to update the schema (the sketch after these steps shows the equivalent schema update in BigQuery).
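The add column action corresponds to widening the table's schema in BigQuery. A hedged sketch of that schema update with the Python client (the new column name is a placeholder); note that BigQuery only allows adding NULLABLE or REPEATED columns to an existing table:

```python
# Sketch: add a new NULLABLE column to an existing BigQuery table's schema.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")
table = client.get_table("my-gcp-project.my_dataset.my_table")  # placeholder reference

new_schema = list(table.schema)
new_schema.append(bigquery.SchemaField("status_code", "STRING", mode="NULLABLE"))

table.schema = new_schema
client.update_table(table, ["schema"])  # only the schema property is updated
```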
Mapping logs to a BigQuery table
When you forward your logs to BigQuery, the fields in the log are mapped to the columns in the table. By default, Apica maps the schema from the forwarder to the facets in the logs and inserts the value of each facet as the value of the corresponding column in the table.
To change the default mapping behavior, create a forward rule for the namespace and application that renames the facet to the column name. This helps you manage the mapping between your logs and the BigQuery table schema.
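As an illustration of why the rename matters, BigQuery's streaming insert expects each row's keys to match the table's column names; a facet whose name differs from the column is what the rename in a forward rule resolves. The sketch below uses placeholder facet and column names and emulates that rename outside of Apica:

```python
# Sketch: rows whose keys match the table's column names insert cleanly;
# a facet with a different name (e.g. "http_status") must be renamed to the
# column name (e.g. "status_code"), which is what the forward rule does.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")
table_id = "my-gcp-project.my_dataset.my_table"  # placeholder reference

log_event = {"timestamp": "2024-01-01T00:00:00Z", "namespace": "prod", "http_status": "200"}

# Emulate the rename a forward rule would apply: facet name -> column name.
rename = {"http_status": "status_code"}
row = {rename.get(key, key): value for key, value in log_event.items()}

errors = client.insert_rows_json(table_id, [row])
print(errors or "row inserted")
```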