In recent years, live video, live game streaming, and live broadcasts have become hot topics in the Internet industry. This article covers the technical fundamentals of live-streaming protocols and video streaming.
Streaming media overview
Streaming media refers to media that is played over the Internet as it is transmitted, rather than after a complete download. A real-time video or audio stream is sent to a server, which packetizes it and transmits the packets over the network; the client decompresses the data as it arrives and plays the program just as it was sent. In short, streaming delivers audio, video, and other multimedia files over the network as a continuous stream.
A streaming media file format is one that supports playback while the file is still being transmitted.
The streaming method divides multimedia files such as video and audio into compressed packets, which the server then transmits to the user's computer continuously and in real time. With streaming, the user does not have to wait for the entire file to download, as with non-streaming delivery; after a startup delay of a few seconds to tens of seconds, the compressed video or audio begins playing in the corresponding player on the user's computer, while the remaining portions continue to download until playback completes.
Streaming live broadcast protocol
RTP (Real-time Transport Protocol)
RTP is a transport protocol for multimedia data streams on the Internet. It is used together with its control protocol, RTCP, and is built on top of UDP. Unlike HTTP or FTP, RTP does not download the entire video file: it sends data over the network at a fixed rate, and the client plays the video at that same rate. Once part of the video has been played, it cannot be replayed unless the data is requested from the server again.
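To make the packet structure concrete, here is a minimal sketch of parsing the fixed 12-byte RTP header defined in RFC 3550. The sample packet built at the end is synthetic (payload type 96 is an arbitrary dynamic type chosen for illustration):

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Parse the fixed 12-byte RTP header (RFC 3550)."""
    if len(packet) < 12:
        raise ValueError("RTP packet must be at least 12 bytes")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,          # always 2 for current RTP
        "padding": bool(b0 & 0x20),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,   # identifies the media encoding
        "sequence": seq,             # used to detect loss and reorder
        "timestamp": ts,             # media sampling clock
        "ssrc": ssrc,                # identifies the stream source
    }

# Synthetic sample: version 2, payload type 96, seq 1, ts 0, ssrc 0x1234
sample = struct.pack("!BBHII", 0x80, 96, 1, 0, 0x1234)
hdr = parse_rtp_header(sample)
```

The sequence number and timestamp fields are what let a receiver play at the sender's fixed rate and detect lost packets, which RTCP then reports back.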
RTCP (Real-time Transport Control Protocol, also called the RTP Control Protocol)
RTCP is the sister protocol of RTP: it periodically sends control packets carrying transmission statistics, giving senders feedback on the quality of the stream.
RTSP (Real Time Streaming Protocol)
RTSP is a multimedia streaming protocol for controlling audio or video. It provides an extensible framework for controlled, on-demand delivery of real-time data. The media data itself is carried by RTP and RTCP, generally over UDP at the transport layer, which makes RTSP well suited to IPTV scenarios. The data source can be a live feed or stored clips. The purpose of the protocol is to control multiple data-delivery connections and to provide a way to select the transport channel, such as UDP, multicast UDP, or TCP.
The actual transport mechanism underneath RTP is outside the scope of RTSP's definition; the server can choose TCP or UDP to deliver the stream content, which allows it to tolerate network delay.
The biggest difference between RTSP and RTP is that RTSP is a bidirectional control protocol: the client can send requests to the server such as play, fast-forward, and rewind. RTSP can deliver data over RTP, and can also select TCP, UDP, multicast UDP, and other channels, making it very extensible. It is an application-layer network protocol similar to HTTP.
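The HTTP-like, request/response nature of RTSP control can be sketched by formatting the typical DESCRIBE/SETUP/PLAY sequence a client sends. The URL, track name, ports, and session ID below are all hypothetical placeholders:

```python
def rtsp_request(method: str, url: str, cseq: int, extra=None) -> str:
    """Format a minimal RTSP/1.0 request; like HTTP, it is line-based text
    with headers, and CSeq numbers each request in the session."""
    lines = [f"{method} {url} RTSP/1.0", f"CSeq: {cseq}"]
    for key, value in (extra or {}).items():
        lines.append(f"{key}: {value}")
    return "\r\n".join(lines) + "\r\n\r\n"

url = "rtsp://example.com/stream"   # hypothetical server URL
msgs = [
    # Ask the server to describe the media (it replies with SDP)
    rtsp_request("DESCRIBE", url, 1, {"Accept": "application/sdp"}),
    # Choose the transport channel: here unicast RTP/RTCP over UDP
    rtsp_request("SETUP", url + "/track1", 2,
                 {"Transport": "RTP/AVP;unicast;client_port=8000-8001"}),
    # Start playback; PAUSE and Range headers would give seek control
    rtsp_request("PLAY", url, 3, {"Session": "12345678"}),
]
```

The SETUP request's Transport header is where the client selects UDP, multicast UDP, or TCP, as described above.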
RTMP (Real Time Messaging Protocol)
RTMP is a live video protocol developed by Macromedia, now part of Adobe. Like HLS, it can be used for live video, and because it runs over TCP, data is not lost. The difference is that Flash-based RTMP cannot be played in iOS browsers, although its real-time performance is better than HLS. RTMP is an open protocol developed by Adobe Systems for transmitting audio, video, and data between Flash players and servers. On iOS, RTMP is commonly used for pushing streams; the third-party library librtmp-iOS can be used for this, as librtmp wraps the core APIs for the caller.
The RTMP protocol also requires the client and server to establish an RTMP connection through a handshake, and then exchange control information over that connection. RTMP formats the data it transmits: to achieve fair multiplexing during actual transmission, the sender splits each Message into Chunks tagged with a chunk stream ID; each Chunk may be a complete Message or only part of one. The receiver reassembles the Chunks into complete Messages based on the data length carried in each Chunk, the message ID, and the message length, completing the exchange.
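The split-and-reassemble idea can be sketched as follows. This shows only the payload split at RTMP's default maximum chunk size of 128 bytes; a real implementation also prepends per-chunk headers carrying the chunk stream ID, message length, type, and timestamp:

```python
def split_into_chunks(message: bytes, chunk_size: int = 128):
    """Sender side: split an RTMP message body into chunks of at most
    chunk_size bytes (128 is RTMP's default maximum chunk size)."""
    return [message[i:i + chunk_size]
            for i in range(0, len(message), chunk_size)]

def reassemble(chunks):
    """Receiver side: concatenate the chunks of one chunk stream back
    into the complete message."""
    return b"".join(chunks)

msg = bytes(300)                    # a 300-byte message body
chunks = split_into_chunks(msg)     # 128 + 128 + 44 bytes
restored = reassemble(chunks)
```

Because large messages are interleaved as small chunks, a bulky video message cannot starve small, time-critical control messages on the same connection, which is the fairness the protocol is after.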
HTTP
The HTTP video protocols emerged mainly after the Internet became popular, driven by the demand for watching video online.
The original HTTP video delivery was nothing special: a generic progressive HTTP file download. In essence, the video file is simply downloaded, and thanks to the characteristics of the file format itself, once the header information and part of the frame data are present, that portion can already be decoded and played. This obviously requires the file's header information to be placed near the front of the file; tools such as faststart do exactly that.
In this most primitive form, however, the video could not be fast-forwarded or seeked to a part of the file that had not yet been downloaded. To address this, the Range request feature of the HTTP protocol was proposed, which is now supported by almost all HTTP servers. A Range request fetches part of a file, specified by byte offsets. After the video client parses the file's header, it can determine the byte position of any subsequent frame, or calculate an approximate position from information such as the bitrate.
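A minimal sketch of the seek mechanism: once the player has mapped a timestamp to a byte offset (via the parsed header or a bitrate estimate), it asks the server for the file from that offset onward using a standard Range header:

```python
def range_header(offset: int, length=None) -> dict:
    """Build an HTTP Range header for a progressive-download seek.
    With no length, request from the offset to the end of the file;
    otherwise request exactly `length` bytes (Range ends are inclusive)."""
    if length is None:
        return {"Range": f"bytes={offset}-"}
    return {"Range": f"bytes={offset}-{offset + length - 1}"}

# Seek to the 2 MiB mark, e.g. after mapping a timestamp to a byte
# offset using the file's header/index information:
seek = range_header(2 * 1024 * 1024)
# Or probe just the first 64 KiB to read the file header itself:
probe = range_header(0, 64 * 1024)
```

The server replies with `206 Partial Content` and only the requested bytes, so the player can jump anywhere without downloading the intervening data.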
HTTP Live Streaming also has a huge advantage: adaptive bitrate streaming. The client automatically selects among video streams of different bitrates according to network conditions: when bandwidth permits, it uses a high bitrate; when the network is busy, it drops to a low bitrate; and it switches between them automatically. This is very helpful for keeping playback smooth under unstable mobile network conditions. On the server side, the video is provided at multiple bitrates, listed in a master playlist file; the player then adjusts automatically based on playback progress and download speed. It is also very simple to use.
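The selection logic can be sketched against a small, hand-written master playlist (the variant URIs, bandwidths, and resolutions below are made up for illustration; real playlists carry more attributes):

```python
import re

MASTER_PLAYLIST = """\
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080
high/index.m3u8
"""

def parse_variants(playlist: str):
    """Collect (bandwidth, uri) pairs from #EXT-X-STREAM-INF entries."""
    variants, lines = [], playlist.splitlines()
    for i, line in enumerate(lines):
        m = re.search(r"BANDWIDTH=(\d+)", line)
        if m and i + 1 < len(lines):
            variants.append((int(m.group(1)), lines[i + 1]))
    return variants

def pick_variant(variants, measured_bps):
    """Pick the highest-bandwidth variant that fits the measured
    throughput, falling back to the lowest if none fits."""
    fitting = [v for v in variants if v[0] <= measured_bps]
    return max(fitting) if fitting else min(variants)

variants = parse_variants(MASTER_PLAYLIST)
choice = pick_variant(variants, measured_bps=3_000_000)
```

Real players smooth the throughput measurement and add buffer-level heuristics, but the core decision is this simple comparison against the advertised BANDWIDTH values.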
Its real-time performance is relatively poor, however, and latency during live broadcast is relatively high.
HLS (HTTP Live Streaming)
HLS is an HTTP-based streaming protocol implemented by Apple Inc. that supports both live and on-demand streaming. It is mainly used on iOS to provide live and on-demand audio/video solutions for devices such as the iPhone and iPad.
HLS on demand is essentially ordinary segmented HTTP on demand, except that its segments are very small. Compared with common live-streaming protocols such as RTMP, RTSP, and MMS, the biggest difference of HLS live streaming is that the client never receives a single complete data stream.
Instead, the HLS protocol stores the live stream on the server as a series of short media files (MPEG-TS format), and the client continuously downloads and plays these small files. Because the server keeps generating new small files from the latest live data, the client effectively plays the live broadcast simply by playing the files fetched from the server in order. In this sense, HLS live streaming is essentially implemented with on-demand technology. Since the data travels over the HTTP protocol, firewalls and proxies are not a concern, and because the segment files are very short, the client can quickly select and switch bitrates to adapt playback to different bandwidth conditions. These same characteristics, however, mean that HLS latency is generally higher than that of conventional streaming live-broadcast protocols.
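The client side of this loop can be sketched by parsing a live media playlist: the server advances `#EXT-X-MEDIA-SEQUENCE` as old segments expire, and the client downloads only the segments it has not yet played. The playlist below is a hand-written example (simplified; real playlists carry more tags):

```python
def parse_media_playlist(text: str):
    """Extract the media sequence number and segment URIs from a live
    HLS media playlist."""
    seq, segments = 0, []
    for line in text.splitlines():
        if line.startswith("#EXT-X-MEDIA-SEQUENCE:"):
            seq = int(line.split(":", 1)[1])
        elif line and not line.startswith("#"):
            segments.append(line)       # a segment URI line
    return seq, segments

LIVE_PLAYLIST = """\
#EXTM3U
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:6.0,
seg42.ts
#EXTINF:6.0,
seg43.ts
#EXTINF:6.0,
seg44.ts
"""

seq, segs = parse_media_playlist(LIVE_PLAYLIST)
# A live client re-fetches this playlist roughly every target duration
# and downloads segments whose number (seq + index) is new to it.
```

With three 6-second segments in the window, a client joining at the live edge is already ~18 seconds behind the encoder, which illustrates why HLS latency is inherently higher than RTMP's.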
WebRTC
On the web side, streaming media is implemented with WebRTC. When Google first launched WebRTC, the industry giants either dismissed it or strongly resisted it. It uses the RTP protocol for transport.
Author: Tang Zhuanlin
Copyright statement: This article is the blogger's original work; please include a link to the original post when reposting.