Blog Archive

Sunday, October 11, 2020

Utility Scripts for ffmpeg

Part of my new ffmpeg toolbox are some utility scripts that I want to present in this article.

  1. Display Video Properties
  2. Extract Image at Time
  3. Calculate Duration of Videos in a Folder
  4. Output Key-Frame Seconds

All scripts output their syntax and exit when called without command-line arguments.
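They all use the same guard pattern at the top; shown here in isolation, with the argument list as a placeholder:

[ -z "$1" ] && {
    echo "SYNTAX: $0 arguments ..." >&2    # usage message goes to stderr
    exit 1    # terminate with a non-zero exit code
}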

Display Video Properties

videoProperties.sh

#######################################################
# Displays quality settings of video and audio.
#######################################################

[ -z "$1" ] &&    {
    echo "SYNTAX: $0 [-a|videoPropertyCsv] videoFile [videoFile ...]" >&2
    echo "Shows significant properties of first video and audio stream." >&2
    echo "-a: show full property listing" >&2
    echo "videoPropertyCsv: comma-separated list of video properties to show" >&2
    exit 1
}

formatProperties()    {
    ffprobe -v error \
            -show_entries format=$1 \
            -of default=noprint_wrappers=1 $2 \
        | sort \
        | uniq \
        | sed 's/^/format /'
}

streamProperties()    {
    ffprobe -v error \
            -select_streams $2 -show_entries stream=$1 \
            -of default=noprint_wrappers=1 $3 \
        | sort \
        | uniq \
        | sed 's/^/'$2' /'
}

for argument in $*
do
    if [ -f "$argument" ]
    then
        videoFile=$argument
        echo "Properties for $videoFile" >&2
        
        if [ -n "$propertyList" ]
        then
            streamProperties $propertyList v:0 $videoFile
        elif [ "$showAll" = "true" ]
        then
            ffprobe -v error -show_format -show_streams $videoFile
        else
            formatProperties format_name,bit_rate $videoFile
            streamProperties codec_name,profile,time_base,pix_fmt,r_frame_rate,width,height,bit_rate v:0 $videoFile
            streamProperties codec_name,sample_rate,channels,bit_rate a:0 $videoFile
        fi
    elif [ "$argument" = "-a" ]
    then
        showAll=true
    else
        propertyList=$argument
    fi
done

For every given video file, it outputs the most important quality properties, prepending "format" for general, "v:0" for video and "a:0" for audio properties, so that we can see which stream each property belongs to. If you give one or more property names (comma-separated without spaces), the output is restricted to video-stream properties of those names (just the first of possibly several video streams).

The option -a ("all") causes a full listing of all stream and format properties. Mind that there may also be programs and subtitles inside the video.

Examples:

videoProperties.sh testvideos/GOPR1486.MP4
format bit_rate=30137653
format format_name=mov,mp4,m4a,3gp,3g2,mj2
v:0 bit_rate=30005060
v:0 codec_name=h264
v:0 height=1080
v:0 pix_fmt=yuvj420p
v:0 profile=High
v:0 r_frame_rate=48000/1001
v:0 time_base=1/48000
v:0 width=1920
a:0 bit_rate=128013
a:0 channels=2
a:0 codec_name=aac
a:0 sample_rate=48000
videoProperties.sh start_time,duration testvideos/GOPR1486.MP4
v:0 duration=26.797604
v:0 start_time=0.000000
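The full property listing would be requested with the -a option; its output (not repeated here) contains all format and stream properties that ffprobe reports:

videoProperties.sh -a testvideos/GOPR1486.MP4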

Extract Image at Time

extractImage.sh

#######################################################
# Extracts an image from given video at given time.
#######################################################

IMAGEFILENAME=image.png

[ -z "$1" -o ! -f "$1" -o -z "$2" ] &&    {
    echo "SYNTAX: $0 videoFile time [imageFilePath]" >&2
    echo "Extracts an image from a video to default image file $IMAGEFILENAME" >&2
    echo "Example:" >&2
    echo "    $0 myvideo.mp4 1:23 title-image.png" >&2
    echo "    would extract the key-frame image at or before minute 1 second 23" >&2
    echo "    from myvideo.mp4 to title-image.png." >&2
    exit 1
}

video=$1
time=$2
image=${3:-$IMAGEFILENAME}

ffmpeg -y \
    -ss $time -i $video -frames:v 1 \
    -f image2 $image

The first argument is the video file, the second the time at which the targeted image occurs, and the optional third the name of the resulting image file.

The ffmpeg command uses input-seeking (-ss before -i), thus it finds a key-frame at, or shortly before, the given time. Mind that you can also specify fractions of a second by appending e.g. ".111" to the seconds.
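If you need the exact frame at the given time rather than the nearest key-frame, output-seeking (-ss placed after -i) decodes everything up to that point and is therefore slower, but frame-accurate. A minimal sketch, with myvideo.mp4 and exact.png as placeholder names:

ffmpeg -y -i myvideo.mp4 -ss 1:23.111 -frames:v 1 -f image2 exact.png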

Calculate Duration of Videos in a Folder

durationOfVideos.sh

#######################################################
# The rounded duration of all videos in given directory.
#######################################################

[ -z "$1" ] && {
    echo "SYNTAX: $0 videoDir" >&2
    exit 1
}

cd $1 || exit 2

sum=0
for video in `ls -1 *.MP4 *.mp4 2>/dev/null | sort`
do
    seconds=`ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 \$video`
    sum=`echo "\$sum \$seconds" | awk '{ print $1 + $2 }'`
    
    echo "$video: $seconds seconds" >&2
done

echo "sum of seconds = $sum" >&2

rounded=`echo \$sum | awk '{ print int($1 + 0.5) }'`
    
hours=`expr \$rounded / 3600`
rest=`expr \$rounded % 3600`
minutes=`expr \$rest / 60`
seconds=`expr \$rest % 60`

echo "$hours:$minutes:$seconds"

This script specializes in MP4 files. If you have others, extend the ls -1 wildcard patterns in the for-loop accordingly.

It changes to the given video directory, loops over all videos and displays their durations in seconds on stderr. Finally it displays the rounded sum in hours:minutes:seconds format on stdout.

Because bc (the arbitrary-precision calculator) is not installed on every UNIX system, I use awk for calculations. awk handles decimal numbers, while the expr ("expression") command can only process integers.
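A quick illustration of the difference, independent of the script:

echo "26.816 44.139" | awk '{ print $1 + $2 }'    # prints 70.955
expr 26 + 44                                      # prints 70, integers only
expr 26.816 + 44.139                              # fails on most systems, expr handles integers only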

Example:

durationOfVideos.sh testvideos
20200905_133128.mp4: 44.139000 seconds
GOPR1486.MP4: 26.816000 seconds
GOPR1487.MP4: 26.901333 seconds
GOPR1488.mp4: 29.802667 seconds
TITLE.MP4: 7.007000 seconds
sum of seconds = 134.666
0:2:15

Output Key-Frame Seconds

keyFrameSeconds.sh

###########################################
# Outputs the seconds when key-frames occur.
###########################################

[ -f "$1" ] || {
    echo "SYNTAX: $0 videoFile" >&2
    echo "Outputs the times when key-frames occur in given video." >&2
    exit 1
}

ffprobe -v error \
    -select_streams v:0 \
    -skip_frame nokey \
    -show_frames -show_entries frame=pkt_pts_time \
    -of default=noprint_wrappers=1:nokey=1 \
    $1

This script may be useful when you want to know where the key-frames are in your video.

Example:

keyFrameSeconds.sh testvideos/GOPR1486.MP4
0.000000
0.500500
1.001000
1.501500
2.002000
2.502500
3.003000
3.503500
4.004000
4.504500
5.005000
5.505500
6.006000
6.506500
7.007000
7.507500
8.008000
8.508500
9.009000
9.509500
10.010000
10.510500
11.011000
11.511500
12.012000
12.512500
13.013000
13.513500
14.014000
14.514500
15.015000
15.515500
16.016000
16.516500
17.017000
17.517500
18.018000
18.518500
19.019000
19.519500
20.020000
20.520500
21.021000
21.521500
22.022000
22.522500
23.023000
23.523500
24.024000
24.524500
25.025000
25.525500
26.026000
26.526500

This shows that in the given video there is a key-frame every half second.
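If you are only interested in the average key-frame interval, you could pipe the script output through a small awk one-liner (not part of the script):

keyFrameSeconds.sh testvideos/GOPR1486.MP4 | awk 'NR > 1 { sum += $1 - prev } { prev = $1 } END { print sum / (NR - 1) }'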




Tuesday, October 6, 2020

Another Video Title with ffmpeg

The first cut is not always the deepest :-)
The title video created by the script in this article shows the first image of the first cut in the cutting-plan cuts.txt as a "still image". It fades in, then the title text fades in, surrounded by a semi-transparent rectangle; after 4 seconds the title fades out and the video starts with the clip from which the title background was taken.

Everything is backed by the simple directory and file structure I introduced in my recent article about fast cutting and joining videos without re-encoding. This is video cut automation with ffmpeg. The script below is an alternative to my very simple title video that I documented recently.

Mind that you need CYGWIN or similar to execute a UNIX shell script on WINDOWS.

Alternative titleForVideos.sh

In the following I will explain the parts of the script in the order they appear. At the end of the article you can find the complete source.

Configurations

#######################################################
# Creates a title for a video with text in file title.txt.
# Developed with ffmpeg 3.4.8-0ubuntu0.2.
#######################################################

# configurations

fontcolor=white	# foreground color
fontsize=100	# size of text
bordercolor=black	# text outline color
boxbordercolor=Silver@0.6	# rectangle color, light gray, 60% opaque
boxborderwidth=40

videoFadeInDuration=1	# seconds
titleFadeDuration=1	# for both fade-in and -out
titleVisibility=4	# without fades
startTitleFadeIn=$videoFadeInDuration	# start title fade-in immediately after video fade-in

titleVideo=TITLE.MP4	# file name of the resulting title video, naming convention used by cutVideos.sh

Here at the top of the script you can edit the configurations that modify the title video. The title will be white text, outlined in black, surrounded by a gray (boxbordercolor) semi-transparent (Silver@0.6) rectangle. The boxborderwidth makes the rectangle bigger.

The videoFadeInDuration is the duration in seconds of the title-video fade-in. After startTitleFadeIn seconds from the beginning of the video the title starts to fade in, and this fade lasts titleFadeDuration seconds. The duration of the title video is determined by titleVisibility, the number of seconds the title text is visible without fades.

Argument Scanning

# argument scanning

[ -z "$1" ] && {
	echo "SYNTAX: $0 videoDir/[TITLEVIDEO.MP4] [titleTextFile]" >&2
	echo "	Creates videoDir/$titleVideo with background image from video in given directory." >&2
	echo "	If TITLEVIDEO.MP4 is not given on commandline, it will be taken from videoDir/cuts.txt by default." >&2
	echo "	The title text is in file title.txt, or in titleTextFile, each must be where the videos are." >&2
	exit 1
}

if [ -d $1 ]	# get start-video and -time from cutting-plan
then
	cd $1 || exit 2
	cuttingPlan=cuts.txt
	[ -f $cuttingPlan ] || {
		echo "No cutting-plan cuts.txt found in `pwd`"
		exit 3
	}
	
	# get first video from cutting-plan, same regexp as in cutVideos.sh
	variableSettingScript=`awk '
		BEGIN	{ IGNORECASE = 1; }	# make all pattern matching case-insensitive
		/^[a-zA-Z0-9_\-]+\.MP4[ \t]*$/	{	# first video file
			videoFile = $1
		}
		/^[0-9]+:[0-9]+/	{	# first start time
			if (videoFile) {	# print shell script
				print "firstVideo=" videoFile "; startTime=" $1
				exit 0
			}
		}
	' \$cuttingPlan`

	eval "$variableSettingScript"	# evaluate shell script printed by awk

	[ -f "$firstVideo" ] ||	{
		echo "Found no video $firstVideo in `pwd`" >&2
		exit 4
	}
elif [ -f $1 ]
then
	cd `dirname \$1` || exit 2
	firstVideo=`basename \$1`
	startTime=0
else
	echo "Given video template or directory does not exist: $1" >&2
	exit 5
fi

titleText=${2:-title.txt}
[ -f $titleText ] || {
	echo "Found no $titleText in `pwd`" >&2
	exit 6
}

Argument checking is boring but necessary to prepare your script for the future, when even you will have forgotten how to use it :-)

The first parameter of this script is the directory where the video clips and the two files cuts.txt (cutting-plan) and title.txt (multiline title text) are. Both are plain text files; I described them in a recent article. If that parameter is empty, the script syntax is displayed and the script terminates.

Optionally you can instead pass a video file from which the title's background image should be taken. In that case the image is taken from the first frame of that video.

The directory is checked for existence, and the first cut and its start time get scanned from the cutting-plan. This is an extended shell technique where you generate some shell script code in an awk script and then execute that code through the eval (→ "evaluate") shell built-in. That way you can set several shell variables in just one awk run. The awk script uses the same patterns as my recently introduced cutVideos.sh script to read the first video and the start time of its first cut.
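Here is a minimal standalone sketch of that awk-and-eval pattern, with a hypothetical input file plan.txt that holds a name on its first line and a number on its second:

variableSettingScript=`awk '
	NR == 1 { print "name=" $1 }
	NR == 2 { print "count=" $1; exit 0 }
' plan.txt`

eval "$variableSettingScript"	# now name and count are set in the calling shell

echo "name=$name, count=$count"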

Last but not least, the script checks for the existence of title.txt, which holds the plain text of the title. This can be a multiline text, but mind that you must center the lines yourself by using spaces; ffmpeg aligns all lines to the left.
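A hypothetical title.txt with two lines could look like this; the leading spaces on the shorter line roughly center it above the longer one:

    Midsummer Tour 2020
A Ride Through the Mountains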

Reading Source Video Properties

# fetch video target properties from first video

echo "Working in `pwd` ..."

streamProperty()	{	# $1 = property name, $2 = stream name, $3 = video file
	ffprobe -v error -select_streams $2 -show_entries stream=$1 -of default=noprint_wrappers=1:nokey=1 $3
}

getVideoProperties()	{	# $1 = video file
	stream=v:0	# first found video
	frameRate=`streamProperty r_frame_rate \$stream \$1`
	pixelFormat=`streamProperty pix_fmt \$stream \$1`
	bitRate=`streamProperty bit_rate \$stream \$1`
	
	stream=a:0	#  first found audio
	audioCodec=`streamProperty codec_name \$stream \$1`
	audioSampleRate=`streamProperty sample_rate \$stream \$1`
	
	echo "r_frame_rate=$frameRate\nbit_rate=$bitRate\npix_fmt=$pixelFormat\naudio_codec=$audioCodec\naudio sample_rate=$audioSampleRate"
}
getVideoProperties $firstVideo

Now that we have a video to take the title background image from, we can also read the properties of that video, so that our title video will have the same technical settings and can be prepended to the cuts without re-encoding.

The shell function streamProperty() encapsulates the ffprobe command that reads video properties. That function gets called by getVideoProperties(), which sets several shell variables that will be used later. It also outputs the properties so that we can compare them to those of the final result video.

Finally we call the getVideoProperties() function with the template video as parameter to get these values into shell variables. Mind that all shell variables are global; there are no local variables except $1 - $9 inside a shell function.
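A tiny illustration of that scoping rule, unrelated to video processing:

setResult()	{
	result="value set inside the function"	# no 'local' keyword, the variable is global
}
setResult
echo "$result"	# prints: value set inside the function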

Image Extraction

# start to work

firstImage=firstFrame.jpg
fadeVideo=fadeVideo.mp4
cleanup()	{
	rm -f $firstImage $fadeVideo
}
error()	{
	cleanup
	exit $1
}

echo "Extracting image at $startTime from $firstVideo as title background ..."
ffmpeg -y -v error \
	-ss $startTime -i $firstVideo -frames:v 1 \
	-f image2 $firstImage || error $?

Now the concrete work starts. As preparation, names for temporary files are assigned, along with a cleanup() function that removes them on script termination. The error() function is a nice convenience for terminating the script with the exit code of the last failed command. We will use it instead of the built-in exit command.

The following ffmpeg command extracts the image at $startTime (read from cuts.txt) to the temporary file firstFrame.jpg ($firstImage). The parameter pair -frames:v 1 gives the number of frames to extract. If we had more than 1 here, we would have to give an image file pattern instead of a plain name.

After this command we have a background for our title in $firstImage file. Now we can weave a video from it, overlaying it with a title.

Title Video Creation

The following command does a lot. It builds a video from an image, fades it in, overlays it with a title that fades in and out, and paints a semi-transparent rectangle behind the title. I have split the command into lines so that I can explain it better; the backslash is the UNIX shell line-continuation character.

startTitleFadeOut=`echo "\$startTitleFadeIn \$titleFadeDuration \$titleVisibility" | awk '{ print $1 + $2 + $3 }'`
duration=`echo "\$startTitleFadeOut \$titleFadeDuration" | awk '{ print $1 + $2 }'`

echo "Creating faded-in $titleVideo of $duration seconds with title from $titleText ..."
ffmpeg -y -v error \
	-loop 1 -i $firstImage -c:v libx264 -t $duration \
	-filter_complex "\
		[0]split[imagevideo][text];\
		[imagevideo]fade=t=in:st=0:d=$videoFadeInDuration[fadedvideo];\
		[text]drawtext=
			textfile=$titleText:\
				fontcolor=$fontcolor:fontsize=h/10:borderw=7:bordercolor=$bordercolor:\
				line_spacing=60:\
				box=1:boxcolor=$boxbordercolor:boxborderw=$boxborderwidth:\
				x=(w-text_w)/2:y=(h-text_h)/2,\
			format=$pixelFormat,\
			fade=t=in:st=$startTitleFadeIn:d=$titleFadeDuration:alpha=1,\
			fade=t=out:st=$startTitleFadeOut:d=$titleFadeDuration:alpha=1[titletext];\
		[fadedvideo][titletext]overlay" \
	-pix_fmt $pixelFormat -r $frameRate -b $bitRate $fadeVideo || error $?

First the start time of the title fade-out gets calculated as the sum of $startTitleFadeIn, $titleFadeDuration and $titleVisibility. The overall duration of the title video is then $startTitleFadeOut plus the fade-out duration of the title.
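With the default configuration this gives startTitleFadeOut = 1 + 1 + 4 = 6 and duration = 6 + 1 = 7 seconds, which matches the "7 seconds" in the log output shown further below.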

The ffmpeg option -y makes ffmpeg overwrite any existing output file without asking, and -v error reduces the log level to errors.

The -loop 1 -i $firstImage -c:v libx264 -t $duration line generates the video from the given $firstImage, giving it a duration of $duration seconds.

The following -filter_complex option seems to be one of the most powerful options of ffmpeg. We can use it to perform several filters in just one ffmpeg run. Let's go through it line by line. This is a DSL (domain-specific language).

[0]split[imagevideo][text];
The input stream number 0 (the image-video created by -loop) gets split into the two streams "imagevideo" and "text".

[imagevideo]fade=t=in:st=0:d=$videoFadeInDuration[fadedvideo];
The "imagevideo" stream will be filtered to fade in at start-time zero with duration $videoFadeInDuration, the result will be named "fadedvideo".

[text]drawtext=textfile=$titleText:
The "imagevideo" stream will be filtered to draw a text taken from file $titleText. Following lines until the closing semicolon ";" are parameterization and further filtering of the initial drawtext.

fontcolor=$fontcolor:fontsize=h/10:borderw=7:bordercolor=$bordercolor:line_spacing=60:
Sets the color of the text font; the size will be a tenth of the video height (h), and the font outline will be 7 pixels thick and of the given bordercolor. The distance between multiple lines is set by line_spacing.

box=1:boxcolor=$boxbordercolor:boxborderw=$boxborderwidth:
The rectangle around the text will be of the given boxbordercolor and be enlarged by the given boxborderwidth.

x=(w-text_w)/2:y=(h-text_h)/2,
This centers the text. The w and h variables are the width and height of the video; text_w and text_h are the ready-calculated text width and height.

format=$pixelFormat,
Filters the stream to given pixel-format.

fade=t=in:st=$startTitleFadeIn:d=$titleFadeDuration:alpha=1,
Fades-in the stream with given start-time (st) and duration (d).

fade=t=out:st=$startTitleFadeOut:d=$titleFadeDuration:alpha=1[titletext];
Fades-out the stream with given start-time (st) and duration (d). Here the text filter ends with a semicolon, and the result gets the name "titletext".

[fadedvideo][titletext]overlay
The stream "fadedvideo" gets overlayed with the stream "titletext". The text-stream has a transparent background, thus the image will be visible underneath.

-pix_fmt $pixelFormat -r $frameRate -b $bitRate $fadeVideo || error $?
Ensures that everything is in the given pixel format, frame rate and bit rate. The output is written to the file $fadeVideo. If the ffmpeg command fails, the error() function is executed with ffmpeg's exit code and the script terminates with a non-zero exit code.

Adding a Silent Audio Track

echo "Adding a silent audio track to $titleVideo ..."
ffmpeg -v error -y \
	-f lavfi -i anullsrc=sample_rate=$audioSampleRate:channel_layout=stereo \
	-i $fadeVideo \
	-c:v copy -c:a $audioCodec \
	-shortest $titleVideo || error $?

cleanup

echo "Successfully created $titleVideo in `pwd`"
getVideoProperties $titleVideo

The lavfi input device is used to generate a silent audio track (anullsrc) with the given sample rate. This gets added to the $fadeVideo result; the video stream gets copied, the audio stream is encoded using the given $audioCodec. The result is written to the file $titleVideo, and we are done! Finally the properties of the result video get printed out.
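As a standalone illustration of the anullsrc source, five seconds of stereo silence could be generated like this (silence.m4a is just a placeholder name):

ffmpeg -f lavfi -i anullsrc=sample_rate=48000:channel_layout=stereo -t 5 -c:a aac silence.m4a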

This script takes some time to execute. It was never below 20 seconds, with the complex filter taking most of the time, although the result is just 7 seconds long. Here is the script output when I ran it over my test video directory:

Working in /media/space/videos/ffmpeg-script/testvideos ...
r_frame_rate=48000/1001
bit_rate=30005060
pix_fmt=yuvj420p
audio_codec=aac
audio sample_rate=48000
Extracting image at 0:0:3.123 from GOPR1486.MP4 as title background ...
Creating faded-in TITLE.MP4 of 7 seconds with title from title.txt ...
Adding a silent audio track to TITLE.MP4 ...
Successfully created TITLE.MP4 in /media/space/videos/ffmpeg-script/testvideos
r_frame_rate=48000/1001
bit_rate=17872542
pix_fmt=yuvj420p
audio_codec=aac
audio sample_rate=48000

Complete Source


#######################################################
# Creates a title for a video with text in file title.txt.
# Developed with ffmpeg 3.4.8-0ubuntu0.2.
#######################################################

# configurations

fontcolor=white	# foreground color
fontsize=100	# size of text
bordercolor=black	# text outline color
boxbordercolor=Silver@0.6	# rectangle color, light gray, 60% opaque
boxborderwidth=40

videoFadeInDuration=1	# seconds
titleFadeDuration=1	# for both fade-in and -out
titleVisibility=4	# without fades
startTitleFadeIn=$videoFadeInDuration	# start title fade-in immediately after video fade-in

titleVideo=TITLE.MP4	# file name of the resulting title video, naming convention used by cutVideos.sh

# argument scanning

[ -z "$1" ] && {
	echo "SYNTAX: $0 videoDir/[TITLEVIDEO.MP4] [titleTextFile]" >&2
	echo "	Creates videoDir/$titleVideo with background image from video in given directory." >&2
	echo "	If TITLEVIDEO.MP4 is not given on commandline, it will be taken from videoDir/cuts.txt by default." >&2
	echo "	The title text is in file title.txt, or in titleTextFile, each must be where the videos are." >&2
	exit 1
}

if [ -d $1 ]	# get start-video and -time from cutting-plan
then
	cd $1 || exit 2
	cuttingPlan=cuts.txt
	[ -f $cuttingPlan ] || {
		echo "No cutting-plan cuts.txt found in `pwd`"
		exit 3
	}
	
	# get first video from cutting-plan, same regexp as in cutVideos.sh
	variableSettingScript=`awk '
		BEGIN	{ IGNORECASE = 1; }	# make all pattern matching case-insensitive
		/^[a-zA-Z0-9_\-]+\.MP4[ \t]*$/	{	# first video file
			videoFile = $1
		}
		/^[0-9]+:[0-9]+/	{	# first start time
			if (videoFile) {	# print shell script
				print "firstVideo=" videoFile "; startTime=" $1
				exit 0
			}
		}
	' \$cuttingPlan`

	eval "$variableSettingScript"	# evaluate shell script printed by awk

	[ -f "$firstVideo" ] ||	{
		echo "Found no video $firstVideo in `pwd`" >&2
		exit 4
	}
elif [ -f $1 ]
then
	cd `dirname \$1` || exit 2
	firstVideo=`basename \$1`
	startTime=0
else
	echo "Given video template or directory does not exist: $1" >&2
	exit 5
fi

titleText=${2:-title.txt}
[ -f $titleText ] || {
	echo "Found no $titleText in `pwd`" >&2
	exit 6
}

# fetch video target properties from first video

echo "Working in `pwd` ..."

streamProperty()	{	# $1 = property name, $2 = stream name, $3 = video file
	ffprobe -v error -select_streams $2 -show_entries stream=$1 -of default=noprint_wrappers=1:nokey=1 $3
}

getVideoProperties()	{	# $1 = video file
	stream=v:0	# first found video
	frameRate=`streamProperty r_frame_rate \$stream \$1`
	pixelFormat=`streamProperty pix_fmt \$stream \$1`
	bitRate=`streamProperty bit_rate \$stream \$1`
	
	stream=a:0	#  first found audio
	audioCodec=`streamProperty codec_name \$stream \$1`
	audioSampleRate=`streamProperty sample_rate \$stream \$1`
	
	echo "r_frame_rate=$frameRate\nbit_rate=$bitRate\npix_fmt=$pixelFormat\naudio_codec=$audioCodec\naudio sample_rate=$audioSampleRate"
}
getVideoProperties $firstVideo

# start to work

firstImage=firstFrame.jpg
fadeVideo=fadeVideo.mp4
cleanup()	{
	rm -f $firstImage $fadeVideo
}
error()	{
	cleanup
	exit $1
}

echo "Extracting image at $startTime from $firstVideo as title background ..."
ffmpeg -y -v error \
	-ss $startTime -i $firstVideo -frames:v 1 \
	-f image2 $firstImage || error $?

startTitleFadeOut=`echo "\$startTitleFadeIn \$titleFadeDuration \$titleVisibility" | awk '{ print $1 + $2 + $3 }'`
duration=`echo "\$startTitleFadeOut \$titleFadeDuration" | awk '{ print $1 + $2 }'`

echo "Creating faded-in $titleVideo of $duration seconds with title from $titleText ..."
ffmpeg -y -v error \
	-loop 1 -i $firstImage -c:v libx264 -t $duration \
	-filter_complex "\
		[0]split[imagevideo][text];\
		[imagevideo]fade=t=in:st=0:d=$videoFadeInDuration[fadedvideo];\
		[text]drawtext=
			textfile=$titleText:\
				fontcolor=$fontcolor:fontsize=h/10:borderw=7:bordercolor=$bordercolor:\
				line_spacing=60:\
				box=1:boxcolor=$boxbordercolor:boxborderw=$boxborderwidth:\
				x=(w-text_w)/2:y=(h-text_h)/2,\
			format=$pixelFormat,\
			fade=t=in:st=$startTitleFadeIn:d=$titleFadeDuration:alpha=1,\
			fade=t=out:st=$startTitleFadeOut:d=$titleFadeDuration:alpha=1[titletext];\
		[fadedvideo][titletext]overlay" \
	-pix_fmt $pixelFormat -r $frameRate -b $bitRate $fadeVideo || error $?

echo "Adding a silent audio track to $titleVideo ..."
ffmpeg -v error -y \
	-f lavfi -i anullsrc=sample_rate=$audioSampleRate:channel_layout=stereo \
	-i $fadeVideo \
	-c:v copy -c:a $audioCodec \
	-shortest $titleVideo || error $?

cleanup

echo "Successfully created $titleVideo in `pwd`"
getVideoProperties $titleVideo







Update 2021-10-26

Here is a fixed variant of the script, keeping the time_base of the video the image is taken from:

#######################################################
# Creates a title for a video with text in file title.txt.
# Developed with ffmpeg 3.4.8-0ubuntu0.2.
#######################################################

# configurations

fontcolor=white	# foreground color
fontsize=100	# size of text
bordercolor=black	# text outline color
boxbordercolor=Silver@0.6	# rectangle color, light gray, 60% opaque
boxborderwidth=40

videoFadeInDuration=1	# seconds
titleFadeDuration=1	# for both fade-in and -out
titleVisibility=4	# without fades
startTitleFadeIn=$videoFadeInDuration	# start title fade-in immediately after video fade-in

titleVideo=TITLE.MP4	# file name of the resulting title video, naming convention used by cutVideos.sh

# argument scanning

[ -z "$1" ] && {
	echo "SYNTAX: $0 videoDir/[TITLEVIDEO.MP4] [titleTextFile [startSecond]]" >&2
	echo "	Creates videoDir/$titleVideo with background image from video in given directory." >&2
	echo "	If TITLEVIDEO.MP4 is not given on commandline, it will be taken from videoDir/cuts.txt by default." >&2
	echo "	The title text is in file title.txt, or in titleTextFile, must be where the videos are." >&2
	echo "	Parameter startSecond only works when video file is given." >&2
	exit 1
}

if [ -d $1 ]	# get start-video and -time from cutting-plan
then
	cd $1 || exit 2
	cuttingPlan=cuts.txt
	[ -f $cuttingPlan ] || {
		echo "No cutting-plan cuts.txt found in `pwd`"
		exit 3
	}
	
	# get first video from cutting-plan, same regexp as in cutVideos.sh
	variableSettingScript=`awk '
		BEGIN	{ IGNORECASE = 1; }	# make all pattern matching case-insensitive
		/^[a-zA-Z0-9_\-]+\.MP4[ \t]*$/	{	# first video file
			videoFile = $1
		}
		/^[0-9]+:[0-9]+/	{	# first start time
			if (videoFile) {	# print shell script
				print "firstVideo=" videoFile "; startTime=" $1
				exit 0
			}
		}
	' \$cuttingPlan`

	eval "$variableSettingScript"	# evaluate shell script printed by awk

	[ -f "$firstVideo" ] ||	{
		echo "Found no video $firstVideo in `pwd`" >&2
		exit 4
	}
elif [ -f $1 ]
then
	cd `dirname \$1` || exit 2
	firstVideo=`basename \$1`
	startTime=${3:-0}
else
	echo "Given video template or directory does not exist: $1" >&2
	exit 5
fi

titleText=${2:-title.txt}
[ -f $titleText ] || {
	echo "Found no $titleText in `pwd`" >&2
	exit 6
}

# fetch video target properties from first video

echo "Working in `pwd` ..."

streamProperty()	{	# $1 = property name, $2 = stream name, $3 = video file
	ffprobe -v error -select_streams $2 -show_entries stream=$1 -of default=noprint_wrappers=1:nokey=1 $3
}

getInverseTimeBase()    {    # $1 = video path
    streamProperty time_base v:0 $1 | sed 's/^1\///'
}

outputVideoProperties()	{	# $1 = video path
	echo "$1:\n  frameRate=`streamProperty r_frame_rate v:0 \$1`\n  pixelFormat=`streamProperty pix_fmt v:0 \$1`\n  timeBase=`streamProperty time_base v:0 \$1`"
}

getVideoAndAudioProperties()	{	# $1 = video file
	stream=v:0	# first found video
	
	frameRate=`streamProperty r_frame_rate \$stream \$1`
	pixelFormat=`streamProperty pix_fmt \$stream \$1`
	timeBase=`streamProperty time_base \$stream \$1`
	
	stream=a:0	#  first found audio
	audioCodec=`streamProperty codec_name \$stream \$1`
	audioSampleRate=`streamProperty sample_rate \$stream \$1`
}
getVideoAndAudioProperties $firstVideo

# start to work

firstImage=firstFrame.jpg
fadeVideo=fadeVideo.mp4
cleanup()	{
	rm -f $firstImage $fadeVideo
}
error()	{
	cleanup
	exit $1
}

echo "Extracting image at $startTime from $firstVideo as title background ..."
ffmpeg -y -v error \
	-ss $startTime -i $firstVideo -frames:v 1 \
	-f image2 $firstImage || error $?

startTitleFadeOut=`echo "\$startTitleFadeIn \$titleFadeDuration \$titleVisibility" | awk '{ print $1 + $2 + $3 }'`
duration=`echo "\$startTitleFadeOut \$titleFadeDuration" | awk '{ print $1 + $2 }'`

# keep time_base
inverseTimeBase=`getInverseTimeBase \$firstVideo`
keepTimeBase="-video_track_timescale $inverseTimeBase"

outputVideoProperties $firstVideo

echo "Creating faded-in $titleVideo of $duration seconds with title from $titleText ..."
ffmpeg -y -v error \
	-loop 1 -i $firstImage -c:v libx264 -t $duration \
	-filter_complex "\
		[0]split[imagevideo][text];\
		[imagevideo]fade=t=in:st=0:d=$videoFadeInDuration[fadedvideo];\
		[text]drawtext=
			textfile=$titleText:\
				fontcolor=$fontcolor:fontsize=h/10:borderw=7:bordercolor=$bordercolor:\
				line_spacing=60:\
				box=1:boxcolor=$boxbordercolor:boxborderw=$boxborderwidth:\
				x=(w-text_w)/2:y=(h-text_h)/2,\
			format=$pixelFormat,\
			fade=t=in:st=$startTitleFadeIn:d=$titleFadeDuration:alpha=1,\
			fade=t=out:st=$startTitleFadeOut:d=$titleFadeDuration:alpha=1[titletext];\
		[fadedvideo][titletext]overlay" \
	-pix_fmt $pixelFormat -r $frameRate $keepTimeBase $fadeVideo || error $?

# outputVideoProperties $fadeVideo

echo "Adding a silent audio track to $titleVideo ..."
ffmpeg -v error -y \
	-f lavfi -i anullsrc=sample_rate=$audioSampleRate:channel_layout=stereo \
	-i $fadeVideo \
	-c:v copy -c:a $audioCodec \
	-shortest $titleVideo || error $?

cleanup

echo "Successfully created $titleVideo in `pwd`"

outputVideoProperties $titleVideo
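As a quick check of the getInverseTimeBase() helper: the test video above reports time_base=1/48000, so the sed expression strips the leading "1/" and keepTimeBase becomes -video_track_timescale 48000.

echo "1/48000" | sed 's/^1\///'	# prints 48000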