Pub/Sub push from unbound GCP project

Is it possible to use Pub/Sub push to forward logs to Chronicle SIEM from a GCP project that isn't bound to the tenant?

The Pub/Sub documentation is slightly unclear in this note under https://cloud.google.com/chronicle/docs/reference/feed-management-api#pubsub: "If you need to push logs from an additional Google Cloud project that isn't bound to your Google Security Operations tenant, such as from a user-managed service account, you must set up a log sink."

We have set up a sink in a personal GCP project that feeds VPC logs to a topic. Within the topic we configured the push option, and as the endpoint we added the URL generated by creating a Pub/Sub push feed for the VPC log type. We also created a service account in the personal GCP project with the Pub/Sub Admin role and selected it in the "Enable authentication" section of the push configuration.
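For reference, the setup described above roughly corresponds to the following gcloud commands (all project, topic, subscription, and service account names are placeholders):

```shell
# 1. Log sink in the personal project routing VPC Flow Logs to a topic:
gcloud logging sinks create vpc-logs-sink \
    pubsub.googleapis.com/projects/PERSONAL_PROJECT/topics/vpc-logs-topic \
    --project=PERSONAL_PROJECT \
    --log-filter='resource.type="gce_subnetwork"'

# 2. Push subscription pointing at the Chronicle feed endpoint URL,
#    authenticating as the service account:
gcloud pubsub subscriptions create vpc-logs-push \
    --project=PERSONAL_PROJECT \
    --topic=vpc-logs-topic \
    --push-endpoint="CHRONICLE_FEED_ENDPOINT_URL" \
    --push-auth-service-account=push-sa@PERSONAL_PROJECT.iam.gserviceaccount.com
```

Note also that the sink's writer identity must be granted the Pub/Sub Publisher role on the topic, or messages never reach it in the first place.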
But it gives an HTTP 403 error.

Google doesn't support Pub/Sub pull in Chronicle, which we know would handle this scenario.

1 ACCEPTED SOLUTION

@vgera Chronicle won't directly pick up those logs from Pub/Sub, but you can achieve that result by deploying a Cloud Function ingestion script to pick up the logs from the topic and forward them into Chronicle. We provide an ingestion script that you can use for this purpose here: https://github.com/chronicle/ingestion-scripts/tree/main/pubsub
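As a rough sketch, the script can be deployed as a Cloud Function triggered by the topic; the exact entry point, runtime, and required environment variables (Chronicle credentials, region, log type) are documented in the repo's README, and all names below are placeholders:

```shell
git clone https://github.com/chronicle/ingestion-scripts.git
cd ingestion-scripts/pubsub

# Deploy as a topic-triggered Cloud Function in the personal project.
gcloud functions deploy chronicle-pubsub-ingest \
    --project=PERSONAL_PROJECT \
    --runtime=python39 \
    --trigger-topic=vpc-logs-topic \
    --entry-point=main \
    --env-vars-file=env.yml   # Chronicle credentials, region, log type, etc.
```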

Depending on your configuration and the specifics of your use case, there are two other options that may be easier to set up and manage.

1: Configure the log sink to target the project that has been bound to Chronicle (instead of a resource in that project), then rely on native ingestion to pick up the logs. This will probably be the easiest to implement, but it does require that all the logs you are trying to ingest are supported for direct ingestion. https://cloud.google.com/chronicle/docs/ingestion/cloud/ingest-gcp-logs#option_1_direct_ingestion
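A minimal sketch of that cross-project sink (placeholder project IDs; the sink's writer identity also needs the appropriate Logging role granted in the destination project):

```shell
# In the unbound project, route logs into the Chronicle-bound project.
gcloud logging sinks create to-bound-project \
    logging.googleapis.com/projects/BOUND_PROJECT_ID \
    --project=UNBOUND_PROJECT_ID \
    --log-filter='resource.type="gce_subnetwork"'
```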

2: You can use a GCS bucket as the sink target, then configure a feed with that bucket as the source. This gets you away from the direct-ingestion log type requirement, but it will require you to configure a feed per log type you are trying to ingest.
https://cloud.google.com/chronicle/docs/ingestion/cloud/ingest-gcp-logs#gcp-storage
https://cloud.google.com/chronicle/docs/reference/feed-management-api#gc-storage
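A minimal sketch of the bucket-sink side, with placeholder names (after creating the sink, grant its writer identity roles/storage.objectCreator on the bucket):

```shell
# Bucket to receive the exported logs.
gsutil mb -p UNBOUND_PROJECT_ID gs://chronicle-log-export

# Sink in the unbound project targeting the bucket.
gcloud logging sinks create to-gcs-bucket \
    storage.googleapis.com/chronicle-log-export \
    --project=UNBOUND_PROJECT_ID \
    --log-filter='resource.type="gce_subnetwork"'

# Then create one Chronicle GCS feed per log type, pointing at the bucket.
```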

