Options for HDS Packaging

Here are some of the options that I am aware of for packaging content as Adobe HDS. They are all commercial software.

  1. Adobe Media Server and the f4fpackager tool
  2. Wowza Media Server
  3. Unified Streaming Server
  4. Nginx HDS module

The specification for the manifest format from Adobe is here: http://wwwimages.adobe.com/content/dam/Adobe/en/devnet/hds/pdfs/adobe-media-manifest-specification.pdf

The specification for HDS fragments and the complete setup is here: http://wwwimages.adobe.com/content/dam/Adobe/en/devnet/hds/pdfs/adobe-hds-specification.pdf

Other information: a PHP script that can join f4f/f4m: https://github.com/K-S-V/Scripts

Note that there also appears to be a ts2hds function in gpac that requires further investigation, as it doesn't appear to be built by default: https://github.com/maki-rxrz/gpac
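Going back to option 1, here is a rough sketch of the packaging step with Adobe's f4fpackager. The flags are from memory of Adobe's documentation and the file name is a placeholder, so check them against f4fpackager --help for your version:

# Package a single 1000 kbps rendition into HDS fragments (.f4f/.f4x) plus an .f4m manifest
f4fpackager --input-file=example.f4v --bitrate=1000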

Generating encodes and SMIL files for Akamai HD, Wowza etc

Here is a quick script I generated for encoding files and generating SMIL files for use with Akamai HD so they can be segmented on the fly as HDS or HLS.

Formatted details here: https://gist.github.com/sinkers/148a39f8d926a443501a

or inline below:


#!/bin/bash
VIDSOURCE=$1
OUTNAME=$2
RESOLUTION1="320x180"
RESOLUTION2="512x288"
RESOLUTION3="640x360"
RESOLUTION4="960x540"
RESOLUTION5="1024x576"
RESOLUTION6="1280x720"
RESOLUTION7="1920x1080"
BITRATE1="400000"
BITRATE2="800000"
BITRATE3="1000000"
BITRATE4="1200000"
BITRATE5="1400000"
BITRATE6="2000000"
BITRATE7="4000000"

echo "Encoding $VIDSOURCE"

AUDIO_OPTS="-c:a libfaac -b:a 160000 -ac 2"
AUDIO_OPTS2="-c:a libfaac -b:a 640000 -ac 2"
VIDEO_OPTS1="-c:v libx264 -vprofile main -preset slow"
VIDEO_OPTS2="-c:v libx264 -vprofile main -preset slow"
VIDEO_OPTS3="-c:v libx264 -vprofile main -preset slow"
OUTPUT_HLS="-f mp4"

~/Desktop/workspace/ffmpeg-mac/FFmpeg/ffmpeg -i $VIDSOURCE -y \
$AUDIO_OPTS -s $RESOLUTION1 $VIDEO_OPTS1 -b:v $BITRATE1 $OUTPUT_HLS ${OUTNAME}_${BITRATE1}.mp4 \
$AUDIO_OPTS -s $RESOLUTION2 $VIDEO_OPTS2 -b:v $BITRATE2 $OUTPUT_HLS ${OUTNAME}_${BITRATE2}.mp4 \
$AUDIO_OPTS2 -s $RESOLUTION3 $VIDEO_OPTS3 -b:v $BITRATE3 $OUTPUT_HLS ${OUTNAME}_${BITRATE3}.mp4 \
$AUDIO_OPTS2 -s $RESOLUTION4 $VIDEO_OPTS3 -b:v $BITRATE4 $OUTPUT_HLS ${OUTNAME}_${BITRATE4}.mp4 \
$AUDIO_OPTS2 -s $RESOLUTION5 $VIDEO_OPTS3 -b:v $BITRATE5 $OUTPUT_HLS ${OUTNAME}_${BITRATE5}.mp4 \
$AUDIO_OPTS2 -s $RESOLUTION6 $VIDEO_OPTS3 -b:v $BITRATE6 $OUTPUT_HLS ${OUTNAME}_${BITRATE6}.mp4 \
$AUDIO_OPTS2 -s $RESOLUTION7 $VIDEO_OPTS3 -b:v $BITRATE7 $OUTPUT_HLS ${OUTNAME}_${BITRATE7}.mp4

# Build the SMIL master manifest referencing each rendition and its bitrate
# (the standard <switch> form used by Akamai HD / Wowza for on-the-fly HDS/HLS packaging)
MASTER="<smil>
<head></head>
<body>
<switch>
<video src=\"${OUTNAME}_${BITRATE1}.mp4\" system-bitrate=\"${BITRATE1}\"/>
<video src=\"${OUTNAME}_${BITRATE2}.mp4\" system-bitrate=\"${BITRATE2}\"/>
<video src=\"${OUTNAME}_${BITRATE3}.mp4\" system-bitrate=\"${BITRATE3}\"/>
<video src=\"${OUTNAME}_${BITRATE4}.mp4\" system-bitrate=\"${BITRATE4}\"/>
<video src=\"${OUTNAME}_${BITRATE5}.mp4\" system-bitrate=\"${BITRATE5}\"/>
<video src=\"${OUTNAME}_${BITRATE6}.mp4\" system-bitrate=\"${BITRATE6}\"/>
<video src=\"${OUTNAME}_${BITRATE7}.mp4\" system-bitrate=\"${BITRATE7}\"/>
</switch>
</body>
</smil>"

echo "$MASTER" > "$OUTNAME.smil"
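Usage is just the source file followed by an output name prefix. Assuming the script is saved as encode_akamai.sh (my name for it, not the gist's):

chmod +x encode_akamai.sh
./encode_akamai.sh source.mov myvideo
# Produces myvideo_400000.mp4 through myvideo_4000000.mp4 plus myvideo.smil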

 

CBR vs VBR in adaptive streaming

An interesting debate arose recently about whether CBR or VBR should be used when encoding for adaptive streaming. It started with my comment that adaptive streaming should use CBR because it lets the client better manage the bandwidth it is receiving. The issue with VBR is that if the client thinks it is receiving a 3Mbps stream, and scene complexity pushes the actual rate up to, say, 6Mbps, the client's buffers fill a lot more slowly than expected for that bitrate and the client then shifts down.
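To make that concrete, here is a back-of-the-envelope sketch with made-up numbers: 6-second segments fetched over a 4Mbps link. At the advertised 3Mbps a segment downloads faster than real time, but at a 6Mbps VBR peak it downloads slower than real time, so the buffer drains and the client shifts down.

# Hypothetical numbers for illustration only
SEGMENT_SECONDS=6
ADVERTISED_BPS=3000000   # what the manifest claims
PEAK_BPS=6000000         # what a complex scene actually needs
LINK_BPS=4000000         # what the network can deliver
# 6 s of video arrives in 4.5 s at the advertised rate...
echo "$SEGMENT_SECONDS * $ADVERTISED_BPS / $LINK_BPS" | bc -l
# ...but takes 9 s at the peak rate, so playback outruns the download
echo "$SEGMENT_SECONDS * $PEAK_BPS / $LINK_BPS" | bc -l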

There are a few factors at play here that need to be considered:

  1. Real-world bandwidth available to a device in a consumer environment can vary quite a lot
  2. Video encoding can demand varying amounts of bits to represent an image at a constant quality
  3. Encoding at a constant bit rate may produce overhead from “stuffing” bits that unnecessarily consume storage and bandwidth

In relation to item 1, if we take a normal home environment, not only may the provider's upstream bandwidth vary due to congestion, but other in-home factors come into play, from competition for limited bandwidth between multiple downloads to variations in signal strength over wifi.

In relation to item 2, the number of bits needed to encode 2 seconds of black versus a high-action CGI scene with a lot of colours or rippling water differs significantly.

This post is a work in progress, but if anyone is interested leave me a note and I will follow up.

Note that Apple's widely cited encoding recommendations for HLS refer to a maximum VBR peak of 10% over the target rate.
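One way to get that kind of constrained VBR out of x264 is via the rate-control options ffmpeg already exposes. A minimal sketch, applying the 10% figure to a hypothetical 3Mbps rendition:

# Target 3 Mbps, cap peaks at roughly 10% over target, with a VBV buffer of twice the target
ffmpeg -i input.mp4 -c:v libx264 -b:v 3000k -maxrate 3300k -bufsize 6000k \
-c:a libfaac -b:a 160k output_3000k.mp4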

References:
Android Java based adaptive streaming client
Adaptive Video Streaming over HTTP with Dynamic Resource Estimation

nginx-rtmp – nice tool for managing rtmp streams

nginx is one of the best web servers out there, not only because it is massively scalable but also because it has a nice, clean configuration syntax.

The nginx-rtmp module looks like a very impressive add-on that provides some really nice additional features for video streaming, including:

  • Ability to re-stream rtmp
  • Push rtmp to other locations
  • Ability to record streams
  • Ability to perform on the fly transcodes of content coming through

https://github.com/arut/nginx-rtmp-module
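As a quick sketch of pushing a stream into it, assuming an application block named live in nginx.conf (the application and stream names here are my placeholders, not anything the module mandates):

# Push a local file into nginx-rtmp as if it were a live stream (-re reads at native frame rate)
ffmpeg -re -i input.mp4 -c:v libx264 -c:a libfaac -f flv rtmp://localhost/live/mystream

# Play it back from another machine to confirm the restream is working
ffplay rtmp://<SERVER_IP>/live/mystream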

Using ffmpeg with Akamai HD

Quite often it is useful to put up a test stream or source to check a new CDN configuration. The command below works with authenticated RTMP, which is required when connecting to Akamai or to many Adobe/Flash Media Server installations.

ffmpeg -re -f lavfi -i testsrc=size=1920x1080 -c:v libx264 -b:v 500k -an -s 1920x1080 -x264opts keyint=50 -g 25 -pix_fmt yuv420p -f flv rtmp://<USERNAME>:<PASSWORD>@p.<CPCODE>.i.akamaientrypoint.net/EntryPoint/mystream_1_500@<STREAMID>

You need to get the username, password, entrypoint name and streamid from your Akamai account (Configure -> Live Media)

You can then play it back using the URLs you also see in your Akamai control panel.

Note that for the RTMP stream to be converted to HLS or HDS you need to make sure you have frequent enough keyframes, which is what the keyint option is for.
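For instance, if the source were 30 fps and the packager cut 2-second fragments (both assumptions; match whatever your configuration actually uses), you could pin the keyframe interval like this:

# Hypothetical 30 fps source with 2-second fragments: force a keyframe every 60 frames.
# no-scenecut stops x264 adding extra keyframes at scene changes, keeping fragment boundaries regular.
ffmpeg -re -i input.mp4 -c:v libx264 -b:v 1500k \
-x264opts keyint=60:min-keyint=60:no-scenecut -an \
-f flv rtmp://<USERNAME>:<PASSWORD>@p.<CPCODE>.i.akamaientrypoint.net/EntryPoint/mystream_1_1500@<STREAMID>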

Good study on Netflix CDN usage

http://www-users.cs.umn.edu/~viadhi/netflix.pdf 

Abstract—Netflix is the leading provider of on-demand Internet video streaming in the US and Canada, accounting for 29.7% of the peak downstream traffic in US. Understanding the Netflix architecture and its performance can shed light on how to best optimize its design as well as on the design of similar on-demand streaming services. In this paper, we perform a measurement study of Netflix to uncover its architecture and service strategy. We find that Netflix employs a blend of data centers and Content Delivery Networks (CDNs) for content distribution. We also perform active measurements of the three CDNs employed by Netflix to quantify the video delivery bandwidth available to users across the US. Finally, as improvements to Netflix’s current CDN assignment strategy, we propose a measurement-based adaptive CDN selection strategy and a multiple-CDN-based video delivery strategy, and demonstrate their potentials in significantly increasing user’s average bandwidth.