This demo workflow automates adding VOD sources to AWS MediaTailor Channel Assembly. Further details are available in this AWS Media blog post.
The AWS Serverless Application Model (SAM) template deploys an Amazon S3 bucket that publishes events to Amazon EventBridge, which triggers an AWS Step Functions state machine. The workflow includes an AWS Lambda function with Python code that converts the uploaded CSV file to JSON. The template includes the AWS IAM policies required to run the application, but it is not ready for production workloads.
To learn about AWS MediaTailor Channel Assembly, this workshop walks through defining sources and creating a channel and programs, along with test source content.
Important: this application uses various AWS services, and there are costs associated with these services beyond Free Tier usage. Please see the AWS Pricing page for details. You are responsible for any AWS costs incurred. No warranty is implied in this example.
- Create an AWS account if you do not already have one and log in. The IAM user that you use must have sufficient permissions to make necessary AWS service calls and manage AWS resources.
- AWS CLI installed and configured
- Git installed
- AWS Serverless Application Model (AWS SAM) installed
- Create a new directory, navigate to that directory in a terminal, and clone the GitHub repository:
  git clone git@github.com:aws-samples/mediatailor-vod-upload.git
- Change directory into the cloned repository:
  cd mediatailor-vod-upload
- From the command line, use AWS SAM to build and deploy the AWS resources for the pattern as specified in the template.yml file:
  sam deploy --guided
- During the prompts:
  - Enter a stack name
  - Enter the desired AWS Region
  - Enter the email address to receive notifications of workflow results
  - Allow SAM CLI to create IAM roles with the required permissions
Once you have run sam deploy --guided once and saved arguments to a configuration file (samconfig.toml), you can use sam deploy in future to use these defaults.

Example parameters (you must change these for your account):

  Setting default arguments for 'sam deploy'
  =========================================
  Stack Name [CABulkUpload]:
  AWS Region [ap-southeast-2]:
  Parameter SNSEndpoint [test@test.com]:
  #Shows you resources changes to be deployed and require a 'Y' to initiate deploy
  Confirm changes before deploy [y/N]: y
  #SAM needs permission to be able to create roles to connect to the resources in your template
  Allow SAM CLI IAM role creation [Y/n]: y
  #Preserves the state of previously provisioned resources when an operation fails
  Disable rollback [y/N]: n
  Save arguments to configuration file [Y/n]: y
  SAM configuration file [samconfig.toml]:
  SAM configuration environment [default]:
- Note the outputs from the SAM deployment process. These contain the name of the S3 bucket to upload the CSV file to and the Step Function to use to monitor the workflow.
Note: if you are following the AWS MediaTailor workshop, you can use workshop.csv to test, after first creating a source location named WorkshopTest with the base URL https://channelassembly.videocloud.live
- Upload a CSV file to the created S3 bucket.
- The Step Function is invoked with the event from S3, routed via EventBridge.
- Lambda loads the source CSV file and converts it to JSON.
- The Step Function iterates through each VOD source using a Map state to call the MediaTailor VodSource API.
- A notification email is sent at completion of the workflow.
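The per-source call made from the Map state corresponds to MediaTailor's CreateVodSource API. The sketch below shows what that call looks like with boto3; the function name and the exact item shape are assumptions based on the workflow description, not the sample's actual code.

```python
def register_vod_source(item, client=None):
    """Create one MediaTailor VOD source from a converted CSV item.

    `item` is assumed to have the shape the Lambda emits:
    {"SourceLocationName": ..., "VodSourceName": ...,
     "HttpPackageConfigurations": [...]}
    """
    if client is None:
        # Imported lazily so the sketch can be read without AWS credentials.
        import boto3
        client = boto3.client("mediatailor")
    return client.create_vod_source(
        SourceLocationName=item["SourceLocationName"],
        VodSourceName=item["VodSourceName"],
        HttpPackageConfigurations=item["HttpPackageConfigurations"],
    )
```

Passing a client explicitly also makes the helper easy to unit-test with a stub.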
MediaTailor resources and operations have quotas (see https://docs.aws.amazon.com/mediatailor/latest/ug/quotas.html), so the following limits are applied to this sample workflow:
- a maximum of 500 VOD assets
- a maximum of 10 requests per second to the MediaTailor API
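A simple way to respect a per-second request cap like the one above is to pace calls client-side. This is a minimal illustrative pacer, not the throttling mechanism the sample itself uses:

```python
import time


def throttled(items, max_per_second=10):
    """Yield items no faster than max_per_second, e.g. to stay under
    the 10 requests/second cap applied to MediaTailor API calls."""
    interval = 1.0 / max_per_second
    last = 0.0
    for item in items:
        wait = interval - (time.monotonic() - last)
        if wait > 0:
            time.sleep(wait)
        last = time.monotonic()
        yield item
```

Each yielded item could then drive one CreateVodSource request.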
SourceLocation,VodSourceName,SourceGroup,Type,Path
MySourceLocation1,MyFirstVodSource,hls,HLS,/path/to/my/first/source.m3u8
MySourceLocation1,MyFirstVodSource,dash,DASH,/path/to/my/first/source.mpd
MySourceLocation1,MySecondVodSource,hls,HLS,/path/to/my/second/source.m3u8
MySourceLocation1,MySecondVodSource,dash,DASH,/path/to/my/second/source.mpd
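Rows that share a SourceLocation and VodSourceName describe one VOD source with multiple package configurations. A minimal sketch of that grouping, assuming output keys matching the MediaTailor API (the function name is illustrative; the deployed Lambda's code may differ):

```python
import csv
import io
import json


def csv_to_vod_sources(csv_text):
    """Group CSV rows by (SourceLocation, VodSourceName) so each VOD
    source carries one HttpPackageConfiguration per SourceGroup row."""
    sources = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        key = (row["SourceLocation"], row["VodSourceName"])
        entry = sources.setdefault(key, {
            "SourceLocationName": row["SourceLocation"],
            "VodSourceName": row["VodSourceName"],
            "HttpPackageConfigurations": [],
        })
        entry["HttpPackageConfigurations"].append({
            "SourceGroup": row["SourceGroup"],
            "Type": row["Type"],
            "Path": row["Path"],
        })
    return list(sources.values())


sample = """SourceLocation,VodSourceName,SourceGroup,Type,Path
MySourceLocation1,MyFirstVodSource,hls,HLS,/path/to/my/first/source.m3u8
MySourceLocation1,MyFirstVodSource,dash,DASH,/path/to/my/first/source.mpd
"""
print(json.dumps(csv_to_vod_sources(sample), indent=2))
```

With the two sample rows above, the result is a single VOD source with an HLS and a DASH package configuration.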
The event delivered to the EventBridge rule target (the Step Function in this example) has the following structure:
{
"version": "0",
"id": "xxxx",
"detail-type": "Object Created",
"source": "aws.s3",
"account": "xxxxxxxx",
"time": "2022-01-20T03:45:42Z",
"region": "ap-southeast-2",
"resources": [
"arn:aws:s3:::xxxxxx"
],
"detail": {
"version": "0",
"bucket": {
"name": "cabulkupload-sourcebucket-xxxxx"
},
"object": {
"key": "workshop.csv",
"size": 192,
"etag": "xxx",
"sequencer": "xxxx"
},
"request-id": "xxxx",
"requester": "xxxxxx",
"source-ip-address": "xxxx",
"reason": "PutObject"
}
}
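Code consuming this event only needs the bucket name and object key from the `detail` section. A small illustrative helper (the function name is hypothetical):

```python
def bucket_and_key(event):
    """Pull the uploaded object's location out of an EventBridge
    'Object Created' event from S3, like the example above."""
    detail = event["detail"]
    return detail["bucket"]["name"], detail["object"]["key"]
```

For example, given the event above it returns the bucket name and "workshop.csv".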
- Run the following S3 CLI command to upload an object to the S3 bucket. Note: you must replace the SourceBucketName placeholder with the name of the S3 bucket, which is provided in the stack outputs.
aws s3 cp './test.csv' s3://*SourceBucketName*
- Run the following command to fetch the logs from the deployed Lambda function (use the function name from the stack outputs):
sam logs -n *FunctionName* --region *YourRegion*
- Delete the stack:
  sam delete
See CONTRIBUTING for more information.
This library is licensed under the MIT-0 License. See the LICENSE file.