I have a dataflow in which data (sample below) is fetched from Azure Event Hubs and sent to different destinations based on destination_type, e.g. S3.
Sample events:
{
  "client_name": "foo",
  "destination_type": "s3",
  "data": {
    "key1": "foo-value1",
    "key2": "foo-value2",
    "key3": "foo-value3"
  }
}
{
  "client_name": "bar",
  "destination_type": "s3",
  "data": {
    "key1": "bar-value1",
    "key2": "bar-value2",
    "key3": "bar-value3"
  }
}
I can extract client_name and destination_type using EvaluateJsonPath and turn them into flowfile attributes.
Now, based on client_name and destination_type, I have to configure processor properties dynamically.
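For reference, that extraction is a single EvaluateJsonPath processor with Destination set to flowfile-attribute and one dynamic property per attribute, roughly:

Destination = flowfile-attribute
client_name = $.client_name
destination_type = $.destination_type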
I was planning to store the credentials as sensitive parameters in a parameter context, named in a format like:
CREDS_<client_name>_S3_ACCESSKEY = <Access Key ID>
CREDS_<client_name>_S3_SECRETACCESSKEY = <Secret Access Key>
E.g. for PutS3Object, the Bucket, Access Key ID, and Secret Access Key need to be loaded based on client_name:
#{CREDS_${client_name}_S3_ACCESSKEY}
#{CREDS_${client_name}_S3_SECRETACCESSKEY}
But these do not seem to work; as far as I can tell, a parameter name must be literal, so #{CREDS_${client_name}_S3_ACCESSKEY} is looked up as-is rather than substituting ${client_name} first. Can anyone suggest an alternative way to load sensitive values dynamically?
Those properties on PutS3Object support the variable registry, so you can update the variable registry while the flow is running (it can be slow).
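For illustration, a minimal Python sketch of such a variable-registry update against the NiFi 1.x REST API (the base URL, process group ID, and variable name are placeholders, and the exact entity shape can differ between versions, so verify against your instance's /nifi-api docs). Keep in mind that variables, unlike parameters, are not stored as sensitive values, and that NiFi stops and restarts the affected components during the update, which is why it is slow:

import time
import requests

NIFI = "http://localhost:8080/nifi-api"  # assumption: unsecured local NiFi
PG_ID = "your-process-group-id"          # hypothetical process group ID

# Read the current registry; the response carries the revision that must
# be echoed back with the update.
entity = requests.get(f"{NIFI}/process-groups/{PG_ID}/variable-registry").json()

# Overwrite the variables for this client (name/value are hypothetical).
entity["variableRegistry"]["variables"] = [
    {"variable": {"name": "s3_bucket", "value": "bucket-for-foo"}},
]

# Variable updates are asynchronous: submit an update request, poll until
# it completes, then delete the finished request.
req = requests.post(
    f"{NIFI}/process-groups/{PG_ID}/variable-registry/update-requests",
    json=entity,
).json()
req_id = req["request"]["requestId"]
while not req["request"]["complete"]:
    time.sleep(1)
    req = requests.get(
        f"{NIFI}/process-groups/{PG_ID}/variable-registry/update-requests/{req_id}"
    ).json()
requests.delete(f"{NIFI}/process-groups/{PG_ID}/variable-registry/update-requests/{req_id}")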
Or use ExecuteStreamCommand to pass the Access Key & Secret Key as environment variables and do the S3 put at runtime.
Something like this:
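For illustration, a minimal sketch of a script ExecuteStreamCommand could invoke. It assumes boto3 is installed on the NiFi host, the flowfile content arrives on stdin, and the keys are supplied as dynamic properties on the processor (ExecuteStreamCommand passes dynamic properties to the command as environment variables); all names below are hypothetical:

#!/usr/bin/env python3
# Upload the flowfile content (stdin) to S3 using per-client credentials
# passed in as environment variables by ExecuteStreamCommand.
import os
import sys

import boto3  # assumed available on the NiFi host


def main():
    # Env var names are illustrative; set them as dynamic properties on
    # ExecuteStreamCommand, whose values may use Expression Language.
    access_key = os.environ["AWS_ACCESS_KEY_ID"]
    secret_key = os.environ["AWS_SECRET_ACCESS_KEY"]
    bucket = os.environ["S3_BUCKET"]          # hypothetical
    object_key = os.environ["S3_OBJECT_KEY"]  # hypothetical, e.g. ${filename}

    s3 = boto3.client(
        "s3",
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
    )
    s3.put_object(Bucket=bucket, Key=object_key, Body=sys.stdin.buffer.read())


if __name__ == "__main__":
    main()

On the processor, set Command Path to the Python interpreter, Command Arguments to the script path, and add the environment variables as dynamic properties.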
The second one is a hack, but it works :)