I’m new to GStreamer and I’m working with GStreamer-Sharp, Visual Basic and Visual Studio 2022. I’m trying to create a simple test application that prepares a sequence of square, greyscale images (i.e. video frames) at 7.2 fps for presentation to GStreamer via an appsrc, for encoding with x264enc and streaming as RTP over UDP. My pipeline:
appsrc ! video/x-raw,format=GRAY8,width=256,height=256,framerate=72/10 ! x264enc tune=zerolatency qp-max=0 key-int-max=72 bframes=3 intra-refresh=1 noise-reduction=200 ! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=5000
Unfortunately, when I run my application I see no UDP packets issued by the udpsink.
From the x264enc log I can see that video data is arriving and being compressed. However, this activity stops after approximately 50 frames. After a further 4 frames, appsrc begins emitting enough-data signals, presumably because x264enc is no longer taking any data and the appsrc’s limited input buffer has filled.
Looking at the rtph264pay log, I see the arrival of a single input frame, which it then tries to send to the udpsink. However, the udpsink log is completely empty. It’s as though the udpsink doesn’t initialise, or rtph264pay is having difficulty passing its output data to udpsink.
Here’s the rtph264pay log:
gst_rtp_h264_pay_getcaps:<Payload> returning caps video/x-h264, stream-format=(string)avc, alignment=(string)au; video/x-h264, stream-format=(string)byte-stream, alignment=(string){ nal, au }
gst_rtp_h264_pay_getcaps:<Payload> returning caps video/x-h264, stream-format=(string)avc, alignment=(string)au; video/x-h264, stream-format=(string)byte-stream, alignment=(string){ nal, au }
gst_rtp_h264_pay_getcaps:<Payload> Intersect video/x-h264, stream-format=(string)avc, alignment=(string)au; video/x-h264, stream-format=(string)byte-stream, alignment=(string){ nal, au } and filter video/x-h264, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], stream-format=(string){ avc, byte-stream }, alignment=(string)au, profile=(string){ high-4:4:4, high-4:2:2, high-10, high, main, baseline, constrained-baseline, high-4:4:4-intra, high-4:2:2-intra, high-10-intra }
gst_rtp_h264_pay_getcaps:<Payload> returning caps video/x-h264, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], stream-format=(string)avc, alignment=(string)au, profile=(string){ high-4:4:4, high-4:2:2, high-10, high, main, baseline, constrained-baseline, high-4:4:4-intra, high-4:2:2-intra, high-10-intra }; video/x-h264, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], stream-format=(string)byte-stream, alignment=(string)au, profile=(string){ high-4:4:4, high-4:2:2, high-10, high, main, baseline, constrained-baseline, high-4:4:4-intra, high-4:2:2-intra, high-10-intra }
gst_rtp_h264_pay_getcaps:<Payload> Intersect video/x-h264, stream-format=(string)avc, alignment=(string)au; video/x-h264, stream-format=(string)byte-stream, alignment=(string){ nal, au } and filter video/x-h264, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], stream-format=(string){ avc, byte-stream }, alignment=(string)au, profile=(string){ high-4:4:4, high-4:2:2, high-10, high, main, baseline, constrained-baseline, high-4:4:4-intra, high-4:2:2-intra, high-10-intra }
gst_rtp_h264_pay_getcaps:<Payload> returning caps video/x-h264, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], stream-format=(string)avc, alignment=(string)au, profile=(string){ high-4:4:4, high-4:2:2, high-10, high, main, baseline, constrained-baseline, high-4:4:4-intra, high-4:2:2-intra, high-10-intra }; video/x-h264, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], stream-format=(string)byte-stream, alignment=(string)au, profile=(string){ high-4:4:4, high-4:2:2, high-10, high, main, baseline, constrained-baseline, high-4:4:4-intra, high-4:2:2-intra, high-10-intra }
gst_rtp_h264_pay_sink_event:<Payload> New stream detected => Clear SPS and PPS
gst_rtp_h264_pay_send_bundle:<Payload> no bundle, nothing to send
gst_rtp_h264_pay_getcaps:<Payload> Intersect video/x-h264, stream-format=(string)avc, alignment=(string)au; video/x-h264, stream-format=(string)byte-stream, alignment=(string){ nal, au } and filter video/x-h264, codec_data=(buffer)01640014ffe1001967640014f159010086c05b2000000300a000000911e28532c001000568efb2c8b0, stream-format=(string)avc, alignment=(string)au, level=(string)2, profile=(string)high, width=(int)256, height=(int)256, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)36/5, interlace-mode=(string)progressive, colorimetry=(string)1:4:0:0
gst_rtp_h264_pay_getcaps:<Payload> returning caps video/x-h264, codec_data=(buffer)01640014ffe1001967640014f159010086c05b2000000300a000000911e28532c001000568efb2c8b0, stream-format=(string)avc, alignment=(string)au, level=(string)2, profile=(string)high, width=(int)256, height=(int)256, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)36/5, interlace-mode=(string)progressive, colorimetry=(string)1:4:0:0
gst_rtp_h264_pay_setcaps:<Payload> have packetized h264
gst_rtp_h264_pay_setcaps:<Payload> profile 640014
gst_rtp_h264_pay_setcaps:<Payload> nal length 4
gst_rtp_h264_pay_setcaps:<Payload> num SPS 1
gst_rtp_h264_pay_setcaps:<Payload> SPS 0 size 25
gst_rtp_h264_pay_setcaps:<Payload> num PPS 1
gst_rtp_h264_pay_setcaps:<Payload> PPS 0 size 5
gst_rtp_h264_pay_handle_buffer:<Payload> got 861 bytes
gst_rtp_h264_pay_handle_buffer:<Payload> got NAL of size 2
gst_rtp_h264_pay_payload_nal:<Payload> payloading NAL Unit: datasize=2 type=9 pts=1000:00:00.000000000
gst_rtp_h264_pay_payload_nal_fragment:<Payload> sending NAL Unit: datasize=2 mtu=1400
Sadly, nothing further is written to the rtph264pay log, presumably because it can’t process any new data from x264enc while it is unable to pass its current data to the udpsink.
To be clear, I need help to understand why the udpsink isn’t taking data from the upstream rtph264pay, and how to correct the situation. I assume this is because my code creates an invalid pipeline configuration, so I’ve provided a copy of the code below.
I’ve run the code with a simpler, more conventional set of x264enc parameters, with the same result. I’ve also tested the pipeline with gst-launch-1.0 (with appsrc replaced by videotestsrc and a capsfilter), where it works fine. Setting GST_DEBUG_DUMP_DOT_DIR and running both gst-launch-1.0 and my own code, I see ‘.dot’ files that reveal very similar topologies. So I’m confident that my code must be close to correct.
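For reference, the gst-launch-1.0 test looked like this (reconstructed from the pipeline above, with videotestsrc and an inline capsfilter standing in for the appsrc; the exact command is an assumption):
gst-launch-1.0 videotestsrc ! video/x-raw,format=GRAY8,width=256,height=256,framerate=72/10 ! x264enc tune=zerolatency qp-max=0 key-int-max=72 bframes=3 intra-refresh=1 noise-reduction=200 ! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=5000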
This is the topology reported from my application: [pipeline .dot graph image]
Using GStreamer-Sharp, my application configures GStreamer as follows:
Private Sub ConfigurePipeline()
    Gst.Application.Init()

    ' Describe the raw video format: 256x256 GRAY8 at 72/10 (7.2) fps.
    VInfo1 = New VideoInfo()
    VInfo1.SetFormat(VideoFormat.Gray8, SrcSize.Width, SrcSize.Height)
    VInfo1.FpsN = 72
    VInfo1.FpsD = 10
    VCaps = VInfo1.ToCaps()
    Diagnostics.Debug.WriteLine(VCaps.ToString)

    FrameInterval = VInfo1.FpsD / VInfo1.FpsN
    FrameDuration = Util.Uint64ScaleInt(VInfo1.FpsD, Gst.Constants.SECOND, VInfo1.FpsN)

    ' A single mid-grey test frame, reused for every push.
    Dim FrameBytes As UInteger = 256 * 256
    ReDim FrameData(FrameBytes - 1)
    System.Array.Fill(FrameData, 127)

    'Pipe = Parse.Launch("appsrc ! video/x-raw,format=GRAY8,width=256,height=256,framerate=72/10 " &
    '                    "! x264enc tune=zerolatency qp-max=0 key-int-max=72 bframes=3 intra-refresh=1 noise-reduction=200 " &
    '                    "! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=5000")

    Pipe = New Pipeline("Pipe")
    PBus = Pipe.Bus
    PBus.AddSignalWatch()
    AddHandler PBus.Message, AddressOf Handle_PBus_Message

    ' Create the four elements of the pipeline.
    Source = New AppSrc("Source")
    Compress = ElementFactory.Make("x264enc", "Compress")
    Payload = ElementFactory.Make("rtph264pay", "Payload")
    UDPSink = ElementFactory.Make("udpsink", "UDPSink")

    Source.Caps = VCaps
    Source.SetProperty("stream-type", New GLib.Value(AppStreamType.Stream))
    Source.SetProperty("format", New GLib.Value(Gst.Constants.TIME_FORMAT))
    Source.SetProperty("emit-signals", New GLib.Value(True))
    AddHandler Source.NeedData, AddressOf Handle_Source_NeedData
    AddHandler Source.EnoughData, AddressOf Handle_Source_EnoughData

    Compress.SetProperty("tune", New GLib.Value("zerolatency"))
    Compress.SetProperty("qp-max", New GLib.Value(0))
    Compress.SetProperty("key-int-max", New GLib.Value(72))
    Compress.SetProperty("bframes", New GLib.Value(3))
    Compress.SetProperty("intra-refresh", New GLib.Value(1))
    Compress.SetProperty("noise-reduction", New GLib.Value(200))
    Payload.SetProperty("pt", New GLib.Value(96))
    UDPSink.SetProperty("host", New GLib.Value("127.0.0.1"))
    UDPSink.SetProperty("port", New GLib.Value(5000))

    ' Assemble and link: appsrc -> x264enc -> rtph264pay -> udpsink.
    Pipe.Add(Source, Compress, Payload, UDPSink)
    Source.Link(Compress)
    Compress.Link(Payload)
    Payload.Link(UDPSink)

    Dim Result As StateChangeReturn = Pipe.SetState(State.Playing)
    If Result = StateChangeReturn.Failure Then
        Diagnostics.Debug.WriteLine("Unable to set the pipeline to the playing state")
    Else
        MainLoop = New MainLoop()
        MainLoop.Run()
        FrameTimer.Stop()
        Diagnostics.Debug.WriteLine("Mainloop has exited, stopping pipeline")
        Pipe.SetState(State.Null)
    End If

    Diagnostics.Debug.WriteLine("Disposing pipeline elements")
    Pipe.Dispose()
    Source.Dispose()
    Compress.Dispose()
    Payload.Dispose()
    UDPSink.Dispose()
End Sub
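Handle_PBus_Message is not shown above; a minimal sketch of what such a bus watcher might look like, assuming GStreamer-Sharp’s usual MessageArgs signature (the body is illustrative, not my original code):

Private Sub Handle_PBus_Message(o As Object, args As MessageArgs)
    ' Log errors and end-of-stream from the pipeline bus, then stop the main loop.
    Select Case args.Message.Type
        Case MessageType.Error
            Dim GError As GLib.GException = Nothing
            Dim DebugInfo As String = Nothing
            args.Message.ParseError(GError, DebugInfo)
            Diagnostics.Debug.WriteLine($"Bus error: {GError.Message} ({DebugInfo})")
            MainLoop.Quit()
        Case MessageType.Eos
            Diagnostics.Debug.WriteLine("Bus EOS")
            MainLoop.Quit()
    End Select
End Sub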
The AppSrc.NeedData event handler starts a System.Timers.Timer which ticks at 7.2 Hz, and the Timer.Elapsed event handler calls the following method to transfer data to the appsrc:
Private Sub NewFrame()
    Using GSTBuffer As New Buffer(FrameData)
        ' Timestamp the frame and advance the running PTS/DTS.
        GSTBuffer.Pts = Timestamp
        GSTBuffer.Dts = Timestamp
        GSTBuffer.Duration = FrameDuration
        Timestamp += FrameDuration
        Source.PushBuffer(GSTBuffer)
    End Using
End Sub
The AppSrc.EnoughData event handler only prints a message to the console.
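For completeness, a minimal sketch of those two handlers, assuming GStreamer-Sharp’s usual signal-argument types (Gst.App.NeedDataArgs; enough-data carries no arguments) and the FrameTimer field referenced above; the bodies are illustrative, not my original code:

Private Sub Handle_Source_NeedData(o As Object, args As NeedDataArgs)
    ' Start pushing frames at 7.2 Hz (interval = 1000 * 10/72, about 138.9 ms).
    If Not FrameTimer.Enabled Then
        FrameTimer.Interval = 1000.0 * VInfo1.FpsD / VInfo1.FpsN
        FrameTimer.Start()
    End If
End Sub

Private Sub Handle_Source_EnoughData(o As Object, args As EventArgs)
    ' Just report the signal; frames keep coming from the timer.
    Diagnostics.Debug.WriteLine("appsrc signalled enough-data")
End Sub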
I’d be very grateful if someone could examine the above and make any suggestions for where to look for my mistake.
I’ve also posted this question to discourse.gstreamer.com with no response yet. If a helpful reply comes in, I’ll pass it along as an update.
Thanks
2 Answers
With the help of someone on the GStreamer Discourse, I was guided to look more closely at the issue of timestamping. I now have working code, although precisely why it works is a bit of a mystery. The solution was to not set timestamps on the buffers being sent to the appsrc; the pipeline configuration is unchanged from the code above, and only the method that issues new Buffers (i.e. video frames) to the appsrc changes.
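A minimal sketch of that method, assuming the only difference from the question’s NewFrame is that the Pts/Dts assignments are dropped (whether Duration should also be dropped is untested; it is kept here):

Private Sub NewFrame()
    Using GSTBuffer As New Buffer(FrameData)
        ' No Pts/Dts set here: leave timestamping to the pipeline.
        GSTBuffer.Duration = FrameDuration
        Source.PushBuffer(GSTBuffer)
    End Using
End Sub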
I finally got to the bottom of my problem. It looks like I used the wrong enumeration:
Source.SetProperty("format", New GLib.Value(Gst.Constants.TIME_FORMAT))
It turns out that Gst.Constants.TIME_FORMAT is actually a printf-style time format string, not a Gst.Format value. What I needed was:
Source.SetProperty("format", New GLib.Value(3)) ' Gst.Format.Time
This works regardless of whether I set a timestamp on the buffers I pass to the appsrc.
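As an aside, the magic number can be made self-documenting; a sketch, assuming the GStreamer-Sharp binding exposes the Gst.Format enumeration (where Format.Time has the integer value 3):

' Equivalent to New GLib.Value(3), but readable.
Source.SetProperty("format", New GLib.Value(CInt(Gst.Format.Time)))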