HLS and DASH Packaging Tutorial
HLS (HTTP Live Streaming) and DASH (Dynamic Adaptive Streaming over HTTP) are two popular methods of delivering video to consumers over the internet. Both are forms of Adaptive Bitrate (ABR) delivery, in which the video player can dynamically select the bitrate being delivered to the end viewer as the viewer’s internet connection speed changes during playback. Rather than stalling and buffering, the player can switch to a lower bitrate and continue playing.
With Hybrik, the creation of HLS or DASH content consists of two tasks: `transcode` and `package`. In the `transcode` task, we create the range of bitrates. In the `package` task, we create the HLS or DASH manifests and, if necessary, remux the transcoded media assets into the required format. You may combine multiple HLS and DASH packaging tasks in a single Hybrik job.
In this tutorial, we’ll look at JSON samples of tasks for transcoding and packaging content. To follow along, download the sample from the bottom of the page.
Transcoding Renditions for ABR
Each bitrate variant that we create for HLS or DASH is called a rendition. Before you get started, you will want to decide on the list of renditions (called a “ladder”) that you want Hybrik to generate. A common bitrate ladder for a 1080p source might look like:
Video Bitrate (kbps) | Resolution | Frame Rate |
---|---|---|
145 | 416x234 | (source frame rate)/2 |
365 | 480x270 | (source frame rate)/2 |
730 | 640 x 360 | Same as source |
1100 | 768 x 432 | Same as source |
2000 | 960 x 540 | Same as source |
3000 | 1280 x 720 | Same as source |
4500 | 1920 x 1080 | Same as source |
6000 | 1920 x 1080 | Same as source |
7800 | 1920 x 1080 | Same as source |
Adaptive Bitrate streaming requires that each rendition be split into a series of segments, each generally 2 to 10 seconds long. This allows the player to switch bitrates on the segment boundaries by loading the next segment from a different rendition than the one currently playing.
HLS and DASH each support a variety of media formats, but they must be segmented as part of the packaging process. Your choice of format may depend on the end-user devices and players you support, but the fragmented MP4 (fMP4) format is a good baseline. An fMP4 file consists of independent “chunks” of MP4 video that have been concatenated, byte for byte, into a single MP4 file. Having all the chunks in a single file can make data management simpler, and an ABR-capable video player can use HTTP Range requests to retrieve only the bytes of the fMP4 file containing the chunks it needs.
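To make the Range-request behavior concrete, here is a sketch of what a player’s fetch of one chunk might look like on the wire (the host, path, and byte offsets are hypothetical; real offsets come from the manifest or the fMP4 index):

```
GET /transcoded/tears_of_steel_720p_800kbps.mp4 HTTP/1.1
Host: cdn.example.com
Range: bytes=600000-1199999
```

The server answers with a 206 Partial Content response containing only those bytes, so the player downloads just the chunk it needs rather than the whole rendition.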
Example Transcode Task
In the following example Transcode Task, we are encoding three video renditions (800kbps, 400kbps, and 200kbps) as well as two audio renditions (128kbps and 64kbps). Each rendition is being encoded into the fMP4 format.
In the `definitions` object at the beginning of the job’s JSON, we have defined `{{destination_mp4s}}` as the destination for our encoded renditions. The filename for each rendition is defined by the `file_pattern` parameter. In this case, we’re using the `{source_basename}` and `{default_extension}` placeholders to concatenate together the source filename, the rendition label, and the correct extension for the output file, so the 800kbps rendition of `tears_of_steel_720p.mov` would be written as something like `tears_of_steel_720p_800kbps.mp4` (assuming `.mp4` is the default extension for fMP4 output). For more on definitions and placeholders, refer to the Definitions And Placeholders Tutorial.
Each video rendition has a `layer_affinities` parameter, which tells Hybrik to match that video layer with audio tracks that have a matching `layer_id`. In this case, we’re generating a low-bitrate audio file with a `layer_id` of `audio_low` to match the two lowest-bitrate video renditions, as well as a high-bitrate audio file labeled `audio_high`, which is matched to the high-bitrate video rendition.
Also note that we are only setting the height of each rendition; Hybrik will infer the missing width from the aspect ratio of the source. For a 16:9 source, for example, a height of 486 yields a width of 864.
{
"definitions": {
"descriptor": "hls_and_dash_packaging",
"source_filename": "tears_of_steel_720p.mov",
"destination_mp4s": "/transcoded",
"destination_hls_media": "/hls_media",
"destination_hls_manifests": "/hls_manifests",
"destination_dash_manifests": "/dash_manifests",
"source_path": "s3://hybrik-examples/public/sources",
"destination_path": "s3://hybrik-examples/public/output/"
},
...
{
"uid": "transcode_all_renditions",
"kind": "transcode",
"task": {
"retry_method": "fail"
},
"payload": {
"location": {
"storage_provider": "s3",
"path": ""
},
"targets": [
{
"file_pattern": "{source_basename}_800kbps{default_extension}",
"existing_files": "replace",
"container": {
"kind": "fmp4",
"segment_duration": 6
},
"video": {
"codec": "h264",
"bitrate_mode": "cbr",
"use_scene_detection": false,
"bitrate_kb": 800,
"vbv_buffer_size_kb": 800,
"profile": "high",
"level": "3.1",
"height": 486,
"par": "1:1",
"layer_affinities": [
"audio_high"
]
}
},
{
"file_pattern": "{source_basename}_400kbps{default_extension}",
"existing_files": "replace",
"container": {
"kind": "fmp4",
"segment_duration": 6
},
"video": {
"codec": "h264",
"bitrate_mode": "cbr",
"use_scene_detection": false,
"bitrate_kb": 400,
"vbv_buffer_size_kb": 400,
"profile": "main",
"level": "3.0",
"height": 360,
"par": "1:1",
"layer_affinities": [
"audio_low"
]
}
},
{
"file_pattern": "{source_basename}_200kbps{default_extension}",
"existing_files": "replace",
"container": {
"kind": "fmp4",
"segment_duration": 6
},
"video": {
"codec": "h264",
"bitrate_mode": "cbr",
"use_scene_detection": false,
"bitrate_kb": 200,
"vbv_buffer_size_kb": 200,
"profile": "baseline",
"level": "3.0",
"height": 252,
"par": "1:1",
"layer_affinities": [
"audio_low"
]
}
},
{
"file_pattern": "{source_basename}_audio_64kbps{default_extension}",
"existing_files": "replace",
"container": {
"kind": "fmp4",
"segment_duration": 6
},
"audio": [
{
"channels": 2,
"codec": "aac_lc",
"sample_rate": 48000,
"bitrate_kb": 64,
"layer_id": "audio_low"
}
]
},
{
"file_pattern": "{source_basename}_audio_128kbps{default_extension}",
"existing_files": "replace",
"container": {
"kind": "fmp4",
"segment_duration": 6
},
"audio": [
{
"channels": 2,
"codec": "aac_lc",
"sample_rate": 48000,
"bitrate_kb": 128,
"layer_id": "audio_high"
}
]
}
]
}
}
}
Packaging for ABR
The Package task creates a final set of assets that can be read by an ABR-capable player. Both HLS and DASH utilize text-based manifest files to tell the player about the various renditions and how to switch between them. HLS uses a master manifest with an `.m3u8` extension, as well as a manifest for each rendition. DASH uses a single manifest, called the Media Presentation Description (MPD), with an `.mpd` extension. Hybrik will automatically generate these manifest files during the `package` task.
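For orientation, here is a rough sketch of what an HLS master manifest for our three video renditions could look like (tag values, bandwidths, and playlist names are illustrative; Hybrik generates the real values):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=928000,RESOLUTION=864x486
tears_of_steel_720p_800kbps.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=464000,RESOLUTION=640x360
tears_of_steel_720p_400kbps.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=264000,RESOLUTION=448x252
tears_of_steel_720p_200kbps.m3u8
```

Each `#EXT-X-STREAM-INF` entry advertises one rendition (here with bandwidth roughly equal to video plus matched audio bitrate); the player reads this file first and then requests the media playlist of whichever rendition fits its current bandwidth.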
During the Package process, it may be necessary to remux the encoded rendition files into a different container. For example, even though we have encoded the renditions into fMP4, the HLS output may require transport stream segments for wider device compatibility. In this case, you can instruct Hybrik to remux the fMP4 files into `.ts` files. You can reuse the same encoded media from a Transcode task within multiple Package tasks (as we will do in the final example).
Example HLS Package Task
In the example below, we are creating an HLS output using the Transport Stream container type. This means the fMP4 renditions we encoded earlier will be remuxed into `.ts` files. We are using output locations from our definitions object: manifest files will go in `{{destination_hls_manifests}}`, while the `.ts` media files will go in `{{destination_hls_media}}`. The name of the master manifest file is set to `master_manifest.m3u8`.
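Note also the `attributes` arrays on both locations below: these set the `ContentType` metadata on the uploaded S3 objects (`application/x-mpegURL` for manifests, `video/MP2T` for transport streams), which helps ensure the files are served with the correct MIME type over HTTP.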
The `segmentation_mode` has been set to `single_ts`, which tells Hybrik to remux each fMP4 file into a single transport stream file. If we had instead set `segmentation_mode` to `segmented_ts`, Hybrik would create many small `.ts` files, each with a duration matching the `segment_duration` set during the Transcode task (6 seconds in our example above).
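To illustrate the difference (durations and byte offsets below are invented for the example), a media playlist produced in single_ts mode references one `.ts` file repeatedly, using `EXT-X-BYTERANGE` to address each 6-second segment within it:

```
#EXTM3U
#EXT-X-VERSION:4
#EXT-X-TARGETDURATION:6
#EXTINF:6.0,
#EXT-X-BYTERANGE:600000@0
tears_of_steel_720p_800kbps.ts
#EXTINF:6.0,
#EXT-X-BYTERANGE:600000@600000
tears_of_steel_720p_800kbps.ts
#EXT-X-ENDLIST
```

In segmented_ts mode, each `#EXTINF` entry would instead point at its own small file (e.g. a numbered series of `.ts` segments), with no byte ranges.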
hls_single_ts
{
"uid": "hls_single_ts",
"kind": "package",
"payload": {
"uid": "main_manifest",
"kind": "hls",
"location": {
"storage_provider": "s3",
"path": "",
"attributes": [
{
"name": "ContentType",
"value": "application/x-mpegURL"
}
]
},
"file_pattern": "master_manifest.m3u8",
"segmentation_mode": "single_ts",
"force_original_media": false,
"media_location": {
"storage_provider": "s3",
"path": "",
"attributes": [
{
"name": "ContentType",
"value": "video/MP2T"
}
]
},
"media_file_pattern": "{source_basename}.ts",
"hls": {
"media_playlist_location": {
"storage_provider": "s3",
"path": ""
}
}
}
},
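Once this task completes, the output bucket should contain roughly the following layout (the per-rendition media playlist names are illustrative; Hybrik derives them from the rendition filenames):

```
{{destination_hls_manifests}}/
  master_manifest.m3u8
  tears_of_steel_720p_800kbps.m3u8   <- one media playlist per rendition
  ...
{{destination_hls_media}}/
  tears_of_steel_720p_800kbps.ts
  tears_of_steel_720p_400kbps.ts
  tears_of_steel_720p_200kbps.ts
  tears_of_steel_720p_audio_64kbps.ts
  tears_of_steel_720p_audio_128kbps.ts
```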
Creating Multiple Packages
Since we are using the same transcoded fMP4 renditions in both our HLS and DASH packaging tasks, we can transcode the renditions and generate both HLS and DASH packages in a single job. In the `connections` array (shown after the DASH task below), we connect the output of the `transcode_all_renditions` task to both the `hls_single_ts` HLS packaging task and the `dash_fmp4` DASH packaging task.
dash_fmp4
{
"uid": "dash_fmp4",
"kind": "package",
"payload": {
"uid": "main_manifest",
"kind": "dash",
"location": {
"storage_provider": "s3",
"path": ""
},
"file_pattern": "master_manifest.mpd",
"segmentation_mode": "fmp4",
"force_original_media": true
}
}
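For reference, here is a heavily abridged sketch of the kind of `.mpd` this task produces (element names follow the DASH schema, but the attribute values and file references are illustrative, not Hybrik’s exact output):

```
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static"
     profiles="urn:mpeg:dash:profile:isoff-on-demand:2011"
     minBufferTime="PT6S">
  <Period>
    <AdaptationSet contentType="video" mimeType="video/mp4">
      <Representation id="video_800" bandwidth="800000" width="864" height="486">
        <BaseURL>tears_of_steel_720p_800kbps.mp4</BaseURL>
      </Representation>
      <!-- ...Representations for the 400kbps and 200kbps renditions... -->
    </AdaptationSet>
    <AdaptationSet contentType="audio" mimeType="audio/mp4">
      <Representation id="audio_128" bandwidth="128000" audioSamplingRate="48000">
        <BaseURL>tears_of_steel_720p_audio_128kbps.mp4</BaseURL>
      </Representation>
      <!-- ...Representation for the 64kbps rendition... -->
    </AdaptationSet>
  </Period>
</MPD>
```

Finally, the `connections` array ties the job together: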
"connections": [
{
"from": [
{
"element": "source_file"
}
],
"to": {
"success": [
{
"element": "transcode_all_renditions"
}
]
}
},
{
"from": [
{
"element": "transcode_all_renditions"
}
],
"to": {
"success": [
{
"element": "hls_single_ts"
},
{
"element": "dash_fmp4"
}
]
}
}
]