Reposted from http://blog.csdn.net/Tinnal/article/details/2871734, with thanks to the original author for sharing.
This is a sender that packetizes an MPEG ES stream over RTP (RFC 3550) following RFC 2250. Learning RTP has been a hard road. The RTP programs you can collect online are all libraries that only handle sending RTP packets, such as jrtplib; their demos just send strings back and forth and build little chat programs, and neither at home nor abroad is there a demo tied to a real application. My goal was simple: write a video streaming server, nothing complicated, just enough to get the fundamentals straight, because only then can you aim at the right target. Unlike those RTP libraries with no real application behind them, when you search Baidu or Google for a streaming server on Linux, what comes back is no more than a few categories:
1.FFSERVER
The demo of FFMPEG2; it is well known mainly because programs of this kind are so scarce. FFMPEG2 itself is excellent and I still use it, but this demo has a whiff of showmanship about it: it feels more like a demonstration of the FFMPEG2 library than a real video streaming server. On reflection, that is probably exactly what the author intended, but it is not what I wanted. For encoding and decoding I lean heavily toward the FFMPEG "grab bag"; for the other parts I would rather pick other specialists.
2. Darwin and Helix
Both are very famous, though software is all I can call them: even with Darwin's source available, code of that scale is not suited to embedded systems. As software they really are renowned, genuinely built to run a commercial streaming operation, but I am a developer, not a streaming service provider. Sorry, Apple; sorry, RealNetworks.
3.LIVE555
If the two above had zero relevance to me (a conclusion reached, of course, only after N weeks of painful struggle), then LIVE555 really did offer a way out: a codebase of very manageable size, yet a very powerful media solution (I call it a solution because its feature set is so rich). For a long while I tried to understand its source, but like many people online I eventually gave in. After all, to weld so many different things together the framework has to get complicated, because all these quite different things must be abstracted layer by layer until they look the same (philosophy!). Its structural complexity is one reason I broke off analyzing it, but not the main one, and it is nowhere near as bad as many people online make out. If you are a keen C++ enthusiast you may even fall in love with this code; for a C devotee, of course, it is torture. Count me among the C++ enthusiasts for now, heh, I really admire this source. The main reason is that I did not want to be tied to any one library. LIVE555 does have codec capability, but I would rather it did only the server's job.
So in the end I came back to the old road: with no help to be had, you help yourself and start from the most basic RFCs. After N days (weeks) of English I finally understood how to carry MPEG data in RTP, with quite a bit of help from LIVE555 along the way (by analyzing LIVE555 packets captured with Ethereal). Here is the program first; I will write up the theory when I have time. It is a single .cpp file that compiles under vs.net 2003, and the video file it plays is inside http://www.cnitblog.com/Files/tinnal/ES流解釋程序.rar
The playback client is VLC, available at http://www.videolan.org/. Choose Open Network Stream, select "UDP/RTP", enter the program's output port 1000, and only then run the program; you will see the test broadcast video in VLC. If your IP address differs, just change it in the source. As for the "theory", it amounts to the key parts of RFC 3550, RFC 2250, and ISO 13818-2.
#include <stdio.h>
#include <stdlib.h>
#include <conio.h>
#include <string.h>
#include <winsock2.h>
//#include "mem.h"
//
#define PACK_STARTCODE (unsigned int)0x000001ba
#define SYSTEM_HEADER_STARTCODE (unsigned int)0x000001bb
#define PICTURE_START_CODE (unsigned int)0x00000100
#define GROUP_START_CODE (unsigned int)0x000000B8
#define ISO_11172_ENDCODE (unsigned int)0x000001b9
#define SEQUENCE_HEADER_CODE (unsigned int)0x000001b3
#define PACKET_BUFFER_END (unsigned int)0x00000000
#define MAX_RTP_PKT_LENGTH 1440
#define HEADER_LENGTH 16
#define DEST_IP "192.168.0.98"
#define DEST_PORT 1000
#define MPA 14 /* MPEG audio payload type (RFC 3551) */
#define MPV 32 /* MPEG video payload type (RFC 3551) */
typedef struct
{
/* byte 0 */
unsigned char csrc_len:4; /* expect 0 */
unsigned char extension:1; /* expect 0; no extension header is used */
unsigned char padding:1; /* expect 0 */
unsigned char version:2; /* expect 2 */
/* byte 1 */
unsigned char payload:7; /* RTP_PAYLOAD_RTSP */
unsigned char marker:1; /* expect 1 */
/* bytes 2, 3 */
unsigned short seq_no;
/* bytes 4-7 */
unsigned long timestamp;
/* bytes 8-11 */
unsigned long ssrc; /* stream number is used here. */
} RTP_FIXED_HEADER;
typedef struct {
//byte 0
unsigned char TR_high2:2; /* Temporal Reference high 2 bits*/
unsigned char T:1; /* video specific head extension flag */
unsigned char MBZ:5; /* unused */
//byte1
unsigned char TR_low8:8; /* Temporal Reference low 8 bits*/
//byte 2
unsigned char P:3; /* picture type; 1=I,2=P,3=B,4=D */
unsigned char E:1; /* set if last byte of payload is slice end code */
unsigned char B:1; /* set if start of payload is slice start code */
unsigned char S:1; /* sequence header present flag */
unsigned char N:1; /* N bit; used in MPEG 2 */
unsigned char AN:1; /* Active N bit */
//byte 3
unsigned char FFC:3; /* forward_f_code */
unsigned char FFV:1; /* full_pel_forward_vector */
unsigned char BFC:3; /* backward_f_code */
unsigned char FBV:1; /* full_pel_backward_vector */
} MPEG_VID_SPECIFIC_HDR; /* 4 BYTES */
enum reading_status {
SLICE_AGAIN,
SLICE_BREAK,
UNKNOWN,
SLICE,
SEQUENCE_HEADER,
GROUP_START,
PICTURE
};
void validate_file();
float frame_rate(int buffer_index);
unsigned int read_picture_type(int buffer_index);
unsigned int read_FBV(int buffer_index);
unsigned int read_BFC(int buffer_index);
unsigned int read_FFV(int buffer_index);
unsigned int read_FFC(int buffer_index);
unsigned int extract_temporal_reference(int buffer_index);
unsigned int find_next_start_code(unsigned int *buffer_index);
void reset_buffer_index(void);
BOOL InitWinsock();
//This program is only for learning and testing RTP encapsulation of MPEG-2 data; it serves no other purpose.
//It compiles under VS.NET 2003; with minor changes it should also build under Linux.
//Tested with VLC, which correctly receives and decodes the TEST.MPV stream this program sends.
//
//Author: 馮富秋 Tinnal
//E-mail: [email protected]
#include "MPEG2RTP.h"
#pragma comment(lib,"Ws2_32")
unsigned char buf[MAX_RTP_PKT_LENGTH + 4]; //input buffer
enum reading_status state = SEQUENCE_HEADER;
unsigned int g_index_in_packet_buffer = HEADER_LENGTH;
static unsigned long g_time_stamp = 0;
static unsigned long g_time_stamp_current =0;
static float g_frame_rate = 0;
static unsigned int g_delay_time = 0;
static unsigned int g_timetramp_increment = 0;
FILE *mpfd;
SOCKET socket1;
RTP_FIXED_HEADER *rtp_hdr;
MPEG_VID_SPECIFIC_HDR *mpeg_hdr;
#if 0
void Send_RTP_Packet(unsigned char *buf,int bytes)
{
int i = 0;
int count = 0;
printf("\nPacket length %d\n", bytes);
printf("RTP Header: [M]:%s [sequence number]:0x%x [timestamp]:0x%lx\n",
rtp_hdr->marker == 1 ? "TRUE" : "FALSE",
ntohs(rtp_hdr->seq_no),
rtp_hdr->timestamp);
printf(" [TR]:%d [AN]:%d [N]:%d [Sequence Header]:%s"
"\n [Begin Slice]:%s [End Slice]:%s"
"\n [Picture Type]:%d"
"\n [FBV]:%d [BFC]:%d [FFV]:%d [FFC]:%d\n",
(mpeg_hdr->TR_high2 << 8 | mpeg_hdr->TR_low8),
mpeg_hdr->AN, mpeg_hdr->N, mpeg_hdr->S == 1 ? "TRUE" : "FALSE",
mpeg_hdr->B == 1 ? "TRUE" : "FALSE", mpeg_hdr->E == 1 ? "TRUE" : "FALSE",
mpeg_hdr->P,
mpeg_hdr->FBV, mpeg_hdr->BFC, mpeg_hdr->FFV, mpeg_hdr->FFC);
while(bytes --)
{
printf("%02x ",buf[count++]);
if(++i == 16)
{
i=0;
printf("\n");
}
}
printf("\n");
}
#else
int Send_RTP_Packet(unsigned char *buf, int bytes)
{
return send( socket1, (char*) buf, bytes, 0 );
}
#endif
int main(int argc, char *argv[])
{
unsigned int next_start_code;
unsigned int next_start_code_index;
unsigned int sent_bytes;
unsigned short seq_num =0;
unsigned short stream_num = 10;
struct sockaddr_in server;
int len =sizeof(server);
#if 0
mpfd = fopen("E:\\tinnal\\live555\\vc_proj\\es\\Debug\\test.mpv", "rb");
#else
if (argc < 2)
{
printf("\nUSAGE: %s mpegfile\nExiting..\n\n",argv[0]);
exit(0);
}
mpfd = fopen(argv[1], "rb");
#endif
if (mpfd == NULL )
{
printf("\nERROR: could not open input file %s\n\n",argv[1]);
exit(0);
}
rtp_hdr = (RTP_FIXED_HEADER*)&buf[0];
mpeg_hdr = (MPEG_VID_SPECIFIC_HDR*)&buf[12];
memset((void *)rtp_hdr,0,12); //zero-out the rtp fixed hdr
memset((void *)mpeg_hdr,0,4); //zero-out the video specific hdr
memset((void *)buf,0,MAX_RTP_PKT_LENGTH + 4);
InitWinsock();
server.sin_family=AF_INET;
server.sin_port=htons(DEST_PORT); //the server's listening port
server.sin_addr.s_addr=inet_addr(DEST_IP); //the server's address
socket1=socket(AF_INET,SOCK_DGRAM,0);
connect(socket1, (const sockaddr *)&server, len) ;
//read the first packet from the mpeg file
//always read 4 extra bytes in (in case there's a startcode there)
//but dont send more than MAX_RTP_PKT_LENGTH in one packet
fread(&(buf[HEADER_LENGTH]), MAX_RTP_PKT_LENGTH-HEADER_LENGTH+4, 1,mpfd);
validate_file();
do
{
/* initialization of the two RTP headers */
rtp_hdr->seq_no = htons(seq_num ++);
rtp_hdr->payload = MPV;
rtp_hdr->version = 2;
rtp_hdr->marker = 0;
rtp_hdr->ssrc = htonl(stream_num);
mpeg_hdr->S = mpeg_hdr->E = mpeg_hdr->B= 0;
do{
next_start_code = find_next_start_code(&next_start_code_index);
if ((next_start_code >0x100) && (next_start_code<0x1b0) )
{
// a slice start code preceded only by headers: set the B (beginning-of-slice) bit
if(state == SEQUENCE_HEADER
|| state ==GROUP_START
|| state ==PICTURE
|| state == UNKNOWN)
{
state = SLICE;
mpeg_hdr->B = 1;
}
// a further slice start code: cut the payload here so it ends on a slice boundary (E bit)
else if (state == SLICE ||state == SLICE_AGAIN)
{
state = SLICE_AGAIN;
sent_bytes = next_start_code_index;
mpeg_hdr->E = 1;
}
// a slice that was split across packets has just ended: flush this packet
else if (state == SLICE_BREAK)
{
state = UNKNOWN;
sent_bytes = next_start_code_index;
mpeg_hdr->E = 1;
goto Sent_Packet;
}
}
switch(next_start_code)
{
case SEQUENCE_HEADER_CODE:
// a sequence header while inside a slice: the current picture is finished
if(state == SLICE || state == SLICE_AGAIN)
{
state = SEQUENCE_HEADER;
sent_bytes = next_start_code_index;
// this packet carries the end of a picture: set the RTP marker bit
rtp_hdr->marker = 1;
goto Sent_Packet;
}
state = SEQUENCE_HEADER;
g_frame_rate = frame_rate(next_start_code_index);
g_delay_time = (unsigned int)(1000.0 / g_frame_rate +0.5); //ms
g_timetramp_increment = (unsigned int)(90000.0 / g_frame_rate +0.5); //90K Hz
mpeg_hdr->S=1;
break;
case GROUP_START_CODE:
// a GOP header while inside a slice: the current picture is finished
if(state == SLICE || state == SLICE_AGAIN)
{
state = GROUP_START;
sent_bytes = next_start_code_index;
// this packet carries the end of a picture: set the RTP marker bit
rtp_hdr->marker = 1;
goto Sent_Packet;
}
state = GROUP_START;
break;
case PICTURE_START_CODE:
// a new picture start code ends the previous picture
if(state == SLICE || state == SLICE_AGAIN)
After finishing this test program I gained a lot of confidence and reread RFC 3550 several more times. If you actually read the program, you will notice that I only send RTP and never send RTCP packets, so multiple RTP streams cannot be synchronized. I did not carry the coding further because I felt this was enough; to be clear, that is not to say RTP works fine without RTCP! The next step is to strip out this program's low-level send routine and use the RTP library JRTPLIB instead; that, I think, is what a JRTPLIB demo should look like. If someone asks why JRTPLIB is needed when a program like this already does the job: in fact, I skipped the RTCP code for several reasons:
1. RTCP involves a lot of bookkeeping: computing RTCP transmission intervals, collecting RTP statistics. None of it is hard, just tedious, and I did not want to write it.
2. RTCP and RTP were originally designed not for applications like video on demand but for video conferencing. RTCP carries a participant-management aspect that many scenarios never use.
3. I wanted to keep things simple, so I did not implement synchronization between streams, such as a film's video and audio streams; that is exactly what RTCP is for.
I could not be bothered to write all that, because the various RTP libraries already do this work very well. That, I think, is the biggest benefit of using a library.