How to export GCP Security Command Center assets to Cloud Storage via a Cloud Function?


I have a Cloud Function calling SCC's list_assets and converting the paginated output to a list (to fetch all the results). However, since I have quite a lot of assets in the organization tree, fetching them takes a long time and the Cloud Function times out (540 seconds is the maximum timeout).

asset_iterator = security_client.list_assets(org_name)
asset_fetch_all = list(asset_iterator)

I tried to export via the web UI and it worked fine (it took about 5 minutes). Is there a way to export the assets from SCC directly to a Cloud Storage bucket using the API?


There are 2 answers below.


Try something like this: we use it to upload findings into a bucket. Make sure to give the service account the function runs as the right permissions on the bucket (a sketch of granting that permission follows after the code).

def test_list_medium_findings(source_name):
    # [START list_findings_at_a_time]
    from google.cloud import securitycenter
    from google.cloud import storage

    # Create a new client.
    client = securitycenter.SecurityCenterClient()

    # Set query parameters.
    organization_id = "11112222333344444"
    org_name = "organizations/{org_id}".format(org_id=organization_id)
    all_sources = "{org_name}/sources/-".format(org_name=org_name)

    # Query Security Command Center (replace YourFilter with your own filter string).
    finding_result_iterator = client.list_findings(all_sources, filter_=YourFilter)

    # Set output file settings.
    bucket_name = "YourBucketName"
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    output_file_name = "YourFileName"
    my_file = bucket.blob(output_file_name)

    # Write each finding to a temporary local file.
    with open('/tmp/data.txt', 'w') as file:
        for i, finding_result in enumerate(finding_result_iterator):
            file.write(
                "{}: name: {} resource: {}\n".format(
                    i, finding_result.finding.name, finding_result.finding.resource_name
                )
            )

    # Upload to the bucket.
    my_file.upload_from_filename("/tmp/data.txt")
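
As mentioned above, the function's service account needs write access to the bucket. Here is a minimal sketch of granting that with the google-cloud-storage IAM helpers; the bucket name, the service account email, and the choice of roles/storage.objectCreator are placeholders/assumptions, not part of the original answer.

from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.get_bucket("YourBucketName")  # placeholder bucket name

# Fetch the current IAM policy and add an objectCreator binding for the
# service account the Cloud Function runs as (placeholder email).
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectCreator",
    "members": {"serviceAccount:your-function-sa@your-project.iam.gserviceaccount.com"},
})
bucket.set_iam_policy(policy)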

I developed the same thing in Python, for exporting to BigQuery. Searching in BigQuery is easier than searching in a file, and the code is very similar for Cloud Storage. Here is my working code with BQ:

import os

from google.cloud import asset_v1
from google.cloud.asset_v1.proto import asset_service_pb2
from google.cloud.asset_v1 import enums

def GCF_ASSET_TO_BQ(request):

    client = asset_v1.AssetServiceClient()
    parent = 'organizations/{}'.format(os.getenv('ORGANIZATION_ID'))
    output_config = asset_service_pb2.OutputConfig()
    output_config.bigquery_destination.dataset = 'projects/{}/datasets/{}'.format(os.getenv('PROJECT_ID'),os.getenv('DATASET'))
    content_type = enums.ContentType.RESOURCE
    output_config.bigquery_destination.table = 'asset_export'
    output_config.bigquery_destination.force = True

    response = client.export_assets(parent, output_config, content_type=content_type)

    # To wait for the export to finish:
    # response.result()
    # Do stuff after export
    return "done", 200


if __name__ == "__main__":
    GCF_ASSET_TO_BQ('')

As you can see, some values come from environment variables (ORGANIZATION_ID, PROJECT_ID and DATASET). For exporting to Cloud Storage, you have to change the definition of the output_config like this:

output_config = asset_service_pb2.OutputConfig()
output_config.gcs_destination.uri = 'gs://path/to/file' 
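
Putting it together, a minimal sketch of the same function rewritten for a Cloud Storage destination, using the same older asset_v1 API as above; the bucket path and function name are placeholders.

import os

from google.cloud import asset_v1
from google.cloud.asset_v1.proto import asset_service_pb2
from google.cloud.asset_v1 import enums

def GCF_ASSET_TO_GCS(request):

    client = asset_v1.AssetServiceClient()
    parent = 'organizations/{}'.format(os.getenv('ORGANIZATION_ID'))

    # Write the export to a Cloud Storage object (placeholder path).
    output_config = asset_service_pb2.OutputConfig()
    output_config.gcs_destination.uri = 'gs://YourBucketName/asset_export.json'

    content_type = enums.ContentType.RESOURCE
    response = client.export_assets(parent, output_config, content_type=content_type)

    # Optionally wait for the long-running export to finish:
    # response.result()
    return "done", 200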

You have examples in other languages here.