Event-based real-time cloud function examples
APIENDPOINT, APIKEY, and STORAGECLIENTID must be added as environment variables accessible from the function. See Event-based handling for more details about these variables.
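As a quick sanity check before deploying, the variables can be read the same way the function examples below do; a minimal sketch (the empty-string defaults mirror those examples):

```python
import os

# Sketch: read the three required variables (names from this guide);
# empty defaults mirror the function examples below
API_ENDPOINT = os.getenv("APIENDPOINT", "")
API_KEY = os.getenv("APIKEY", "")
STORAGE_CLIENT_ID = os.getenv("STORAGECLIENTID", "")

missing = [name for name, value in [
    ("APIENDPOINT", API_ENDPOINT),
    ("APIKEY", API_KEY),
    ("STORAGECLIENTID", STORAGE_CLIENT_ID),
] if not value]
if missing:
    print("Missing environment variables:", ", ".join(missing))
```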
Amazon S3 lambda function setup
To create an AWS Lambda function, follow the official documentation: https://aws.amazon.com/getting-started/hands-on/run-serverless-code/
The function trigger needs to be set to "All object create events" so that notifications are received only for newly added files.
Python function example:
import requests
import os
import json
from urllib.parse import unquote

def lambda_handler(event, context):
    for eventRecord in event['Records']:
        # S3 URL-encodes object keys in event notifications; decode before forwarding
        eventRecord['s3']['object']['key'] = unquote(eventRecord['s3']['object']['key'].replace("+", " "))
        requests.post(os.getenv('APIENDPOINT', ""),
                      headers={'ApiKey': os.getenv('APIKEY', "")},
                      json={'metadata': json.dumps(eventRecord),
                            'storageClientId': os.getenv('STORAGECLIENTID', "")})
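The unquote(...replace("+", " ")) step above reverses the URL-encoding that S3 applies to object keys in event notifications: spaces arrive as "+" and special characters are percent-encoded. A minimal illustration with a made-up key:

```python
from urllib.parse import unquote

# S3 event notifications URL-encode object keys: spaces become "+"
# and special characters are percent-encoded
raw_key = "uploads/my+photo%281%29.jpg"
decoded_key = unquote(raw_key.replace("+", " "))
print(decoded_key)  # uploads/my photo(1).jpg
```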
Azure Blob function app setup
To deploy the Azure function app, please use the following Terraform script: https://github.com/OPSWAT/metadefender-k8s/tree/main/terraform/azure-function-docker
The STORAGECLIENTID, APIKEY, and APIENDPOINT variables should be configured in the .tfvars file:
resource_group_name = "" #The name of the resource group in which the function app will be created
service_plan_name = "" #The name of the app service plan
storage_account_name = "" #The name of the storage account to be created
docker_registry_server_url = ""
docker_registry_server_username = "" #optional
docker_registry_server_password = "" #optional
docker_image_name = ""
docker_image_tag = ""
AzureWebJobsBlobTrigger = "" #The storage account connection string that triggers the function
CONTAINERNAME = "" #The blob container that needs to be scanned
fn_name_prefix = "" #function name
location = "" #azure region
STORAGECLIENTID = ""
APIKEY = ""
APIENDPOINT = ""
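Once the variables are filled in, the script can be applied with the standard Terraform workflow; a sketch, assuming the values above are saved in a file named function.tfvars (the filename is an assumption):

```shell
# Initialize providers, then plan and apply with the variables file
terraform init
terraform plan -var-file="function.tfvars"
terraform apply -var-file="function.tfvars"
```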
Azure Blob Event Grid RTP configuration
For a detailed example, please use the example here: https://github.com/OPSWAT/metadefender-k8s/tree/main/terraform/CloudFunctions/Azure/webhook-notification
Event notifications for Page and Append blobs are not supported.
For Page and Append blobs, an event is sent as soon as the first block is committed to storage, which can result in events being sent before the upload is complete.
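If the monitored container can also receive Page or Append blobs, the function can filter those events out before forwarding; a minimal sketch, assuming the Microsoft.Storage.BlobCreated event schema, in which the event payload carries a blobType field (verify against your actual events):

```python
# Sketch: only forward Event Grid notifications for block blobs.
# The "blobType" field name follows the Microsoft.Storage.BlobCreated
# event schema (an assumption to verify against your events).
def should_process(event_data: dict) -> bool:
    return event_data.get("blobType") == "BlockBlob"

print(should_process({"blobType": "BlockBlob"}))   # True
print(should_process({"blobType": "AppendBlob"}))  # False
```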
Google Cloud function setup
The google.cloud.storage.object.v1.finalized trigger needs to be set up for the Cloud Function (2nd gen) in order to process newly added objects.
Python function example:
import functions_framework
import json
import requests
import os

# Triggered by a change in a storage bucket
@functions_framework.cloud_event
def hello_gcs(cloud_event):
    requests.post(os.getenv('APIENDPOINT', ""),
                  headers={'ApiKey': os.getenv('APIKEY', "")},
                  json={'metadata': json.dumps(cloud_event.data),
                        'storageClientId': os.getenv('STORAGECLIENTID', "")})
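The request body has the same shape on every platform in this guide; a minimal sketch of the payload that gets posted, using a made-up event body in place of cloud_event.data:

```python
import json
import os

# Sketch: the JSON body posted to APIENDPOINT. The event fields here are
# made up for illustration; the real body comes from cloud_event.data.
sample_event = {"bucket": "example-bucket", "name": "reports/new-file.pdf"}
payload = {
    "metadata": json.dumps(sample_event),
    "storageClientId": os.getenv("STORAGECLIENTID", ""),
}
print(json.loads(payload["metadata"])["name"])  # reports/new-file.pdf
```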
Alibaba Cloud function setup
Follow the official Alibaba documentation for creating a compute function with OSS trigger: https://www.alibabacloud.com/help/en/function-compute/latest/configure-an-oss-trigger
When the Function Compute function is created, it is necessary to specify the bucket to monitor and to subscribe to the following event: oss:ObjectCreated:*
Python function example:
import json
import os
import requests

def handler(event, context):
    # Function Compute delivers the OSS trigger payload as a JSON string
    for eventRecord in json.loads(event)['events']:
        requests.post(os.getenv('APIENDPOINT', ""),
                      headers={'ApiKey': os.getenv('APIKEY', "")},
                      json={'metadata': json.dumps(eventRecord),
                            'storageClientId': os.getenv('STORAGECLIENTID', "")})
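Unlike the Lambda example, Function Compute passes the event as a JSON string rather than a parsed dict, hence the json.loads call above; a minimal sketch with a made-up payload (the field names are an assumption based on the OSS event format, not authoritative):

```python
import json

# Sketch: Function Compute delivers the OSS trigger payload as a JSON
# string; the handler parses it and iterates the 'events' array.
# Field names here are illustrative, not authoritative.
sample_event = json.dumps({
    "events": [
        {
            "eventName": "ObjectCreated:PutObject",
            "oss": {"object": {"key": "docs/report.pdf"}},
        }
    ]
})
for record in json.loads(sample_event)["events"]:
    print(record["oss"]["object"]["key"])  # docs/report.pdf
```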