

In the next series of posts we'll first talk about how RTP and SDP messages work, and some implementation details in two popular multimedia toolkits: FFmpeg and GStreamer. Afterwards, we'll see how to leverage this knowledge to build a reliable RTP connection between Kurento and mediasoup.

FFmpeg and GStreamer are two of the tools that come to mind for most developers when thinking about writing a quick script that is capable of operating with RTP. Both of these tools offer libraries meant to be used from your programming language of choice, but they also provide handy command-line tools that become an "easier" alternative for those who don't want to write their own programs from scratch.

Of course, "easier" has to be quoted in the previous paragraph, because the fact is that using these command-line tools still requires a good amount of knowledge about what the tool is doing, why, and how. It is important to have a firm grasp of some basic concepts about RTP, to understand what is going on behind the curtains, so we are able to fix issues when they happen.

Our first topic is the Real-time Transport Protocol (RTP), the most popular method to send or receive real-time networked multimedia streams. RTP has become a de-facto standard, given that it's the mandated transport used by WebRTC, and lots of other tools also use RTP for video or audio transmission between endpoints.

The basic principle behind RTP is very simple: an RTP session comprises a set of participants (we'll also call them peers) communicating with RTP, to either send or receive audio or video. Participants wanting to send will partition the media into chunks of data called RTP packets, then send those over UDP to the receivers. Participants expecting to receive data will open a UDP port where they listen for incoming RTP packets. Those packets have to be collected and re-assembled to obtain the media that was originally transmitted by the sender.

However, as the saying goes, the devil is in the details. Let's review several basic concepts and extensions over this initial principle.

RFC 3550 defines what exactly an RTP packet is: "A data packet consisting of the fixed RTP header, a possibly empty list of contributing sources, and the payload data".

     0                   1                   2                   3
     0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
    +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
    |V=2|P|X|  CC   |M|     PT      |       sequence number         |
    +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
    |                           timestamp                           |
    +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
    |           synchronization source (SSRC) identifier            |
    +=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+
    |            contributing source (CSRC) identifiers             |
    |                             ....                              |
    +=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+

All data before the payload is called the RTP Header, and it contains some information needed by participants in the RTP session. The RTP standard definition is more than 15 years old, and it shows: the RTP packet header contains some fields that are defined as mandatory but nowadays are not really used any more by current RTP and WebRTC implementations. Here we'll only talk about those header fields that are most useful; for a full description of all fields defined by the RTP standard, refer to the RFC document at RFC 3550.

Payload Type (PT): identifies the format of the RTP payload. In essence, a Payload Type is an integer number that maps to a previously defined encoding, including clock rate, codec type, codec settings, number of channels (in the case of audio), etc. All this information is needed by the receiver in order to decode the stream.

Originally, the standard provided some predefined Payload Types for commonly used encoding formats at the time. For example, the Payload Type 34 corresponds to the H.263 video codec; more predefined values can be found in RFC 3551. Nowadays, instead of using a table of predefined numbers, applications can define their own Payload Types on the fly, and share them ad-hoc between senders and receivers of the RTP streams. These Payload Types are called dynamic, and are chosen from the dynamic range (96 to 127). An example: in a typical WebRTC session, Chrome might decide that Payload Type 96 will correspond to the video codec VP8, PT 98 will be VP9, and PT 102 will be H.264.

After getting an RTP packet and inspecting its Payload Type field, the receiver will know which decoder should be used to successfully handle the media.
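One way to picture a dynamic Payload Type mapping is as a lookup table that the receiver consults for each incoming packet. A minimal Python sketch, using the PT numbers from the Chrome example above; the `decoder_for` helper is hypothetical, for illustration only:

```python
# Dynamic Payload Type mapping as it might be negotiated for one session.
# 90000 Hz is the standard RTP clock rate for video codecs.
payload_types = {
    96: {"codec": "VP8", "clock_rate": 90000},
    98: {"codec": "VP9", "clock_rate": 90000},
    102: {"codec": "H.264", "clock_rate": 90000},
}

def decoder_for(pt: int) -> str:
    """Return the codec name the receiver should use for this Payload Type."""
    try:
        return payload_types[pt]["codec"]
    except KeyError:
        raise ValueError(f"no codec negotiated for Payload Type {pt}")

assert decoder_for(96) == "VP8"
```

In a real WebRTC session this table is not hard-coded: it is built from the SDP negotiation, which is exactly why both peers end up agreeing on the same numbers.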
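The fixed 12-byte RTP header can be unpacked with nothing but the standard library. A Python sketch of that layout; the `parse_rtp_header` helper and the sample packet bytes are made up for illustration:

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Parse the 12-byte fixed RTP header defined by RFC 3550."""
    if len(packet) < 12:
        raise ValueError("packet shorter than the fixed RTP header")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,            # always 2 for RTP
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "csrc_count": b0 & 0x0F,       # number of CSRC entries that follow
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,     # e.g. 96 for VP8 in a WebRTC session
        "sequence_number": seq,
        "timestamp": ts,
        "ssrc": ssrc,
    }

# Fabricated example: version 2, PT 96, seq 1, ts 0, SSRC 0x11223344.
sample = bytes([0x80, 0x60, 0x00, 0x01, 0, 0, 0, 0, 0x11, 0x22, 0x33, 0x44])
header = parse_rtp_header(sample)
```

Note the network byte order (`!` in the format string): all RTP header fields are big-endian on the wire.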
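One of those details is ordering: UDP delivers datagrams in no guaranteed order, so a receiver typically buffers packets and sorts them by the sequence number carried in each RTP header before re-assembling the media. A toy Python sketch with fabricated (sequence number, payload) pairs:

```python
# Sample (sequence_number, payload) pairs, arriving out of order.
received = [(3, b"C"), (1, b"A"), (2, b"B")]

# Order by sequence number, then concatenate the payloads.
# (A real jitter buffer must also handle gaps, duplicates, and the
# wrap-around of the 16-bit sequence number field.)
received.sort(key=lambda packet: packet[0])
media = b"".join(payload for _, payload in received)
assert media == b"ABC"
```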
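That send/receive principle can be sketched with plain UDP sockets. A minimal Python illustration on the loopback interface; the port number and the dummy media bytes are arbitrary choices, and real RTP packets would of course carry a header in front of each chunk:

```python
import socket

RTP_PORT = 5004  # arbitrary example port

# Receiver: open a UDP port and listen for incoming packets.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", RTP_PORT))
receiver.settimeout(2.0)

# Sender: partition the media into chunks and send each one as a datagram.
# Here a dummy 3-byte buffer stands in for real encoded audio or video.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
media = b"\x01\x02\x03"
chunk_size = 2
for i in range(0, len(media), chunk_size):
    sender.sendto(media[i:i + chunk_size], ("127.0.0.1", RTP_PORT))

# The receiver collects the chunks and re-assembles the original media.
chunks = [receiver.recv(2048) for _ in range(2)]
sender.close()
receiver.close()
```

On the loopback interface the two datagrams arrive in order, so `b"".join(chunks)` recovers the original buffer; over a real network that assumption no longer holds, which is why RTP headers carry sequence numbers.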
