
Live555 Real-Time Video Streaming: Application Notes


1. Environment (Linux)
Download the source from the official site: http://www.live555.com/liveMedia/public/
live555 version used here: "2018.12.14"
See also http://www.live555.com/liveMedia/faq.html — this FAQ is worth reading carefully.
2. Build
Configure for the target platform to generate the corresponding Makefile, then run make.
2.1 ARM platform:
Switch to your cross-compile toolchain:
cp config.armlinux config.arm
vi config.arm
CROSS_COMPILE?= arm-buildroot-linux-uclibcgnueabi-
Generate the Makefile: ./genMakefiles arm
2.2 64-bit Linux (x86-64):
./genMakefiles linux-64bit
2.3 32-bit Linux (x86):
./genMakefiles linux
make

This produces mediaServer/live555MediaServer.

3. Testing
3.1 The mediaServer build produces live555MediaServer. Serve an H.264 elementary-stream file with:

live555MediaServer test.264

The server announces the rtsp:// URL that clients should open. If you see

Correct this by increasing "OutPacketBuffer::maxSize" to at least 186818, before creating this 'RTPSink'. (Current value is 100000.)

raise OutPacketBuffer::maxSize in ServerMediaSession* createNewSMS() in DynamicRTSPServer.cpp:

if (strcmp(extension, ".264") == 0) {
  // Assumed to be a H.264 Video Elementary Stream file:
  NEW_SMS("H.264 Video");
  OutPacketBuffer::maxSize = 300000; // was 100000; allow for some possibly large H.264 frames
  sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(env, fileName, reuseSource));
}

createNewSMS() is called when a client first requests the stream — the server invokes it from lookupServerMediaSession() while handling RTSP DESCRIBE/SETUP.
3.2 testProgs
The testProgs directory contains assorted test programs; the official site documents what each one does and how to use it. Essentially all of them take a file as the input source. Below are two ways to use a live stream as the input source instead, based on modifying testH264VideoStreamer and testOnDemandRTSPServer respectively.

4. Using a live video stream, rather than a file, as the input source

The simplest method: push the live stream into a FIFO (named pipe), or to stdin, and pass the pipe's name as the input file name; no server-side code changes are needed. This is not covered in detail here, but a sketch follows.
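A minimal feeder sketch, assuming the unmodified live555MediaServer runs in the same directory and clients open rtsp://<server-ip>/test.264; the encoded Annex-B byte stream arrives on stdin (e.g. run as `your_encoder | ./feeder`, where your_encoder is a placeholder for whatever produces the H.264 stream):

#include <stdio.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>

int main() {
  // Create the named pipe the server will open as if it were a .264 file
  // (a pre-existing pipe is fine, so a failing mkfifo() is ignored):
  mkfifo("test.264", 0666);
  int fd = open("test.264", O_WRONLY); // blocks until the server opens the "file"
  if (fd < 0) { perror("open"); return 1; }

  char buf[64 * 1024];
  ssize_t n;
  while ((n = read(STDIN_FILENO, buf, sizeof buf)) > 0) {
    if (write(fd, buf, (size_t)n) != n) break; // relay the stream into the pipe
  }
  close(fd);
  return 0;
}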

4.1 Method 1: based on testH264VideoStreamer

Following the model of "liveMedia/DeviceSource.cpp", define an H264LiveVideoSource class (a FramedSource subclass, as DeviceSource is) and fill in its members. In testH264VideoStreamer, play() opens its input like this:

void play() {
  // Open the input file as a 'byte-stream file source':
  ByteStreamFileSource* fileSource
    = ByteStreamFileSource::createNew(*env, inputFileName);
}

Here H264LiveVideoSource replaces that ByteStreamFileSource (the full class is listed further below). In main() of testH264VideoStreamer.cpp the session is created:

ServerMediaSession* sms
  = ServerMediaSession::createNew(*env, "testStream", NULL,
      "Session streamed by \"testH264VideoStreamer\"",
      True /*SSM*/);

Modify the play() function as follows:

void play() {
  // Open the input source. With "#if 1" the live source is used; the
  // original ByteStreamFileSource path is kept under "#else":
#if 1
  H264LiveVideoSource* fileSource = new H264LiveVideoSource(*env);
  if (fileSource == NULL) {
    *env << "Unable to open file \"" << inputFileName
         << "\" as a byte-stream file source\n";
    exit(1);
  }
#else
  ByteStreamFileSource* fileSource
    = ByteStreamFileSource::createNew(*env, inputFileName);
  if (fileSource == NULL) {
    *env << "Unable to open file \"" << inputFileName
         << "\" as a byte-stream file source\n";
    exit(1);
  }
#endif
  FramedSource* videoES = fileSource;

  // Create a framer for the Video Elementary Stream:
  videoSource = H264VideoStreamFramer::createNew(*env, videoES);

  // Finally, start playing:
  *env << "Beginning to read from file...\n";
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
}

4.2 Method 2: based on testOnDemandRTSPServer
1) Set the variable "reuseFirstSource" to "True".
2) Model a new class H264LiveVideoServerMediaSubsession on H264VideoFileServerMediaSubsession, implementing the two pure virtual functions "createNewStreamSource()" and "createNewRTPSink()". In createNewStreamSource(), use the H264LiveVideoSource above in place of ByteStreamFileSource.

Inheritance chain of H264VideoRTPSink:
H264VideoRTPSink -> H264or5VideoRTPSink -> VideoRTPSink -> MultiFramedRTPSink -> RTPSink -> MediaSink -> Medium
Inheritance chain of H264VideoRTPSource:
H264VideoRTPSource -> MultiFramedRTPSource -> RTPSource -> FramedSource -> MediaSource -> Medium
Inheritance chain of H264VideoStreamFramer:
H264VideoStreamFramer -> H264or5VideoStreamFramer -> MPEGVideoStreamFramer -> FramedFilter -> FramedSource -> MediaSource -> Medium

The concrete implementation follows.

H264LiveVideoServerMediaSubsession.hh

#ifndef _H264_LIVE_VIDEO_SERVER_MEDIA_SUBSESSION_HH
#define _H264_LIVE_VIDEO_SERVER_MEDIA_SUBSESSION_HH

#include "OnDemandServerMediaSubsession.hh"
#include "liveMedia.hh"
#include "UsageEnvironment.hh"
#include "GroupsockHelper.hh"

class H264LiveVideoServerMediaSubsession: public OnDemandServerMediaSubsession {
public:
  H264LiveVideoServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource);
  ~H264LiveVideoServerMediaSubsession();
  static H264LiveVideoServerMediaSubsession* createNew(UsageEnvironment& env,
                                                       Boolean reuseFirstSource);

public: // new virtual functions, defined by all subclasses
  virtual FramedSource* createNewStreamSource(unsigned clientSessionId,
                                              unsigned& estBitrate);
      // "estBitrate" is the stream's estimated bitrate, in kbps
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* inputSource);
  virtual char const* getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource);
  static H264LiveVideoServerMediaSubsession* createNew(UsageEnvironment& env,
                                                       FramedSource* source);
  static void afterPlayingDummy(void* ptr);
  static void chkForAuxSDPLine(void* ptr);
  void chkForAuxSDPLine1();

private:
  FramedSource* m_pSource;
  char* m_pSDPLine;
  RTPSink* m_pDummyRTPSink;
  char m_done;
};

#endif

H264LiveVideoServerMediaSubsession.cpp

H264LiveVideoServerMediaSubsession::H264LiveVideoServerMediaSubsession(
    UsageEnvironment& env, Boolean reuseFirstSource)
  : OnDemandServerMediaSubsession(env, reuseFirstSource) {
  m_pSource = NULL;
  m_pSDPLine = NULL;
  m_pDummyRTPSink = NULL;
  m_done = 0;
}

H264LiveVideoServerMediaSubsession::~H264LiveVideoServerMediaSubsession() {
  if (m_pSDPLine) {
    free(m_pSDPLine); // allocated with strdup()
  }
}

H264LiveVideoServerMediaSubsession* H264LiveVideoServerMediaSubsession::createNew(
    UsageEnvironment& env, Boolean reuseFirstSource) {
  return new H264LiveVideoServerMediaSubsession(env, reuseFirstSource);
}

FramedSource* H264LiveVideoServerMediaSubsession::createNewStreamSource(
    unsigned clientSessionId, unsigned& estBitrate) {
  estBitrate = 500; // kbps
  return H264VideoStreamFramer::createNew(envir(), new H264LiveVideoSource(envir()));
}

RTPSink* H264LiveVideoServerMediaSubsession::createNewRTPSink(
    Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic,
    FramedSource* inputSource) {
  return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}

char const* H264LiveVideoServerMediaSubsession::getAuxSDPLine(
    RTPSink* rtpSink, FramedSource* inputSource) {
  if (m_pSDPLine) {
    return m_pSDPLine;
  }

  m_pDummyRTPSink = rtpSink;
  if (NULL == m_pDummyRTPSink) return NULL;

  // For a live source the SPS/PPS are not available until the framer has seen
  // them, so play into a dummy sink and poll until auxSDPLine() is ready:
  m_done = 0;
  m_pDummyRTPSink->startPlaying(*inputSource, 0, 0);
  chkForAuxSDPLine(this);
  envir().taskScheduler().doEventLoop(&m_done); // returns once m_done != 0

  char const* dasl = m_pDummyRTPSink->auxSDPLine();
  if (dasl) m_pSDPLine = strdup(dasl);
  m_pDummyRTPSink->stopPlaying();
  return m_pSDPLine;
}

void H264LiveVideoServerMediaSubsession::afterPlayingDummy(void* ptr) {
  H264LiveVideoServerMediaSubsession* This = (H264LiveVideoServerMediaSubsession*)ptr;
  This->m_done = ~0;
}

void H264LiveVideoServerMediaSubsession::chkForAuxSDPLine(void* ptr) {
  H264LiveVideoServerMediaSubsession* This = (H264LiveVideoServerMediaSubsession*)ptr;
  This->chkForAuxSDPLine1();
}

void H264LiveVideoServerMediaSubsession::chkForAuxSDPLine1() {
  if (m_pDummyRTPSink->auxSDPLine()) {
    m_done = ~0;
  } else {
    // Not ready yet; check again after roughly one frame interval
    // (FRAME_PER_SEC is your stream's frame rate, e.g. 25):
    double delay = 1000.0 / (FRAME_PER_SEC); // ms
    int to_delay = delay * 1000;             // us
    nextTask() = envir().taskScheduler().scheduleDelayedTask(to_delay,
        chkForAuxSDPLine, this);
  }
}

Modify testOnDemandRTSPServer.cpp by adding the following to main():

// A H.264 live video stream:
{
  OutPacketBuffer::maxSize = 300000;
  char const* streamName = "h264LiveVideo";
  char const* inputFileName = "test";
  ServerMediaSession* sms
    = ServerMediaSession::createNew(*env, streamName, streamName,
                                    descriptionString, True);
  UsageEnvironment& envr = rtspServer->envir();
  envr << "\n\"" << sms << "\"\n";
  if (NULL == sms) printf("sms is null \n");
  sms->addSubsession(H264LiveVideoServerMediaSubsession::createNew(*env, True));
  rtspServer->addServerMediaSession(sms);

  announceStream(rtspServer, sms, streamName, inputFileName);
}

H264LiveVideoSource.hh

#ifndef _H264_LIVE_VIDEO_SOURCE_HH
#define _H264_LIVE_VIDEO_SOURCE_HH

#ifndef _FRAMED_SOURCE_HH
#include "FramedSource.hh"
#endif
#include "DeviceSource.hh"

class H264LiveVideoSource: public FramedSource {
public:
  H264LiveVideoSource(UsageEnvironment& env); // called only by createNew()
  virtual ~H264LiveVideoSource();

private:
  // redefined virtual functions:
  virtual void doGetNextFrame();
  //virtual void doStopGettingFrames();
  virtual unsigned maxFrameSize() const; // overrides FramedSource::maxFrameSize()
  static void getNextFrame(void* ptr);
  void GetFrameData();

private:
  void* m_pToken;
  char* m_pFrameBuffer;
  char* fTruncatedBytes;
  int fTruncatedBytesNum;
};

#endif

H264LiveVideoSource.cpp

#include "H264LiveVideoSource.hh" //#include "InputFile.hh" #include "GroupsockHelper.hh"#define FRAME_BUF_SIZE (1024*1024) #define FMAX (300000) H264LiveVideoSource::H264LiveVideoSource(UsageEnvironment& env):FramedSource(env), m_pToken(0), m_pFrameBuffer(0)fTruncatedBytesNum(0),fTruncatedBytes(0) {m_pFrameBuffer = new char[FRAME_BUF_SIZE];fTruncatedBytes = new char[FRAME_BUF_SIZE];if(m_pFrameBuffer == NULL || fTruncatedBytes== NULL ){printf("[MEDIA SERVER] error malloc data buffer failed\n");return;}memset(m_pFrameBuffer,0,FRAME_BUF_SIZE);//fMaxSize = FMAX;printf("[H264LiveVideoSource] fMaxSize:%d\n",fMaxSize); }H264LiveVideoSource::~H264LiveVideoSource() {envir().taskScheduler().unscheduleDelayedTask(m_pToken);if(m_pFrameBuffer){delete[] m_pFrameBuffer;m_pFrameBuffer = NULL;}if(fTruncatedBytes){delete[] fTruncatedBytes;fTruncatedBytes = NULL;} } int H264LiveVideoSource::maxFrameSize() {return FRAME_BUF_SIZE; } void H264LiveVideoSource::doGetNextFrame() {int uSecsToDelay = 40000; // 40 msm_pToken = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay,(TaskFunc*)getNextFrame, this);//printf("m_pToken =%p \n" ,m_pToken); } void H264LiveVideoSource::getNextFrame(void *ptr) {H264LiveVideoSource *p=(H264LiveVideoSource *)ptr;if(NULL == p)printf("null point \n");p->GetFrameData(); }#include <sys/types.h> #include <sys/stat.h> #include <string.h> #include <fcntl.h> #include <unistd.h> #include <limits.h>typedef struct {unsigned long long timeTick; //時間(ms)unsigned int dataLen; //數(shù)據(jù)長度unsigned char dataType; //數(shù)據(jù)類型(DataType_E)unsigned char rsv[3]; unsigned long long timeStamp; //編碼時間戳(us)unsigned char iFrame; //是否為關鍵幀unsigned char frameRate; //幀率int encodeType; //編碼類型VideoEncodeType_Eunsigned short width; //視頻寬度unsigned short height; //視頻高度unsigned char rsv1[8];unsigned char data[0]; }IFVFrameHeader_S; void H264LiveVideoSource::GetFrameData() { #if 1//memcpy(fTo,m_pFrameBuffer,fFrameSize);int read = ShareBufGetOneFrame(g_BufHandle[0], FRAME_BUF_SIZE, (char *)m_pFrameBuffer); //這里要改成你自己實際獲取視頻幀的函數(shù)if (read == 0){printf("read byte =0 \n");fFrameSize =0;// FramedSource::afterGetting(this);return; }IFVFrameHeader_S *pFrameHead = reinterpret_cast<IFVFrameHeader_S *>(m_pFrameBuffer);if(pFrameHead == NULL ){printf("pFrameHead =0 \n");fFrameSize =0;return;}if(iframetype == 0){ if(1==pFrameHead->iFrame){iframetype =1;}else{//printf("no i frame \n");//fFrameSize =0;//gettimeofday(&fPresentationTime,NULL);//FramedSource::afterGetting(this);//return;}}int framelen=pFrameHead->dataLen;#if 0if(pFrameHead->dataLen > fMaxSize)pFrameHead->dataLen = fMaxSize;memcpy(fTo,m_pFrameBuffer+sizeof(IFVFrameHeader_S),pFrameHead->dataLen);fFrameSize =pFrameHead->dataLen;#else//printf("pFrameHead->dataLen =%d fMaxSize=%u\n",pFrameHead->dataLen,fMaxSize);if(framelen > fMaxSize){framelen = fMaxSize;fNumTruncatedBytes = pFrameHead->dataLen-fMaxSize;memcpy(fTo,pFrameHead->data,framelen);memmove(fTruncatedBytes,pFrameHead->data + framelen,fNumTruncatedBytes);fFrameSize =framelen;}else{if(fNumTruncatedBytes > 0 ) {memmove(fTo,fTruncatedBytes,fTruncatedBytesNum);memmove(fTo + fTruncatedBytesNum,pFrameHead->data,framelen);fFrameSize += fTruncatedBytesNum;// printf("send last truncted %d bytes\n",fTruncatedBytesNum);fTruncatedBytesNum = 0;}else{memcpy(fTo,pFrameHead->data,framelen);fFrameSize =framelen; }}#endiffDurationInMicroseconds = 1000000/25;gettimeofday(&fPresentationTime,NULL);//*nextPT=fPresentationTime;FramedSource::afterGetting(this); #else#define FIFO_NAME "./test.264"//#define BUFFER_SIZE 
(30000)static int fd=-1;//static u_int64_t fFileSize =0;if(fd ==-1){fd = open(FIFO_NAME,O_RDONLY);if(fd > 0){// fFileSize =5316637;// GetFileSize(FIFO_NAME, fd);}}if(fd ==-1){printf("open file %s fail \n",FIFO_NAME);return;}int len =0;int remain = fMaxSize;//if(remain >fMaxSize)// remain =fMaxSize;if((len = read(fd,fTo,remain))>0){//memmove(fTo,m_pFrameBuffer,len);gettimeofday(&fPresentationTime, NULL);fFrameSize=len;}else{if(fd >0){::close(fd);fd = -1;//printf("GetFrameData close file %d\n",len);}}fDurationInMicroseconds = 1000000/25;gettimeofday(&fPresentationTime,NULL);// printf("fMaxSize=%d fFrameSize=%d\n",fMaxSize,fFrameSize);//nextTask() = envir().taskScheduler().scheduleDelayedTask(0,// (TaskFunc*)FramedSource::afterGetting, this);FramedSource::afterGetting(this);#endif }

Common live555 modification points:

1. Maximum size of one input frame
StreamParser.cpp:

#define BANK_SIZE 1500000 // the larger your frames, the larger this value must be

2. Maximum RTP buffer sizes
(1) Source side — MultiFramedRTPSource.cpp:
BufferedPacket::BufferedPacket() sizes the input buffer; its upper bound, i.e. the maximum BufferedPacket size, is

#define MAX_PACKET_SIZE 65536

(2) Sink side — MultiFramedRTPSink.cpp:

#define RTP_PAYLOAD_MAX_SIZE 1456 // (1500-14-20-8)/4*4: ethernet=14, IP=20, UDP=8, rounded to a multiple of 4 bytes

and the static variable in MediaSink.cpp:

OutPacketBuffer::maxSize = 600000; // allow for some possibly large H.265 frames (2000000 by default)

This is best an integer multiple of RTP_PAYLOAD_MAX_SIZE. If it is too small, the server keeps printing: Correct this by increasing "OutPacketBuffer::maxSize" to at least ...
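When you control the server's main() (instead of patching MediaSink.cpp), the usual pattern is to raise the static before any RTPSink exists — a minimal sketch, with 300000 as an example value, mirroring the testOnDemandRTSPServer change shown earlier:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Must run before the first RTPSink is created; size it to the largest
  // encoded frame you expect:
  OutPacketBuffer::maxSize = 300000;

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  // ... create ServerMediaSessions and add subsessions here ...

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}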

3. Failure to obtain the local IP address
RTSPServer::rtspURLPrefix() derives the server address via ourIPAddress(envir()). If that lookup fails, ourIPAddress() in GroupsockHelper.cpp can be patched to query a network interface directly (excerpt; needs <sys/ioctl.h>, <net/if.h> and <arpa/inet.h>):

ourIPAddress() {
  if (badAddressForUs(from)) {
#if 0
    char tmp[100];
    sprintf(tmp, "This computer has an invalid IP address: %s",
            AddressString(from).val());
    env.setResultMsg(tmp);
    from = 0;
#endif
    // Fall back to reading the address of "eth0" directly:
    struct ifreq req;
    int ret = 0;
    char szIpBuf[32];
    sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (-1 != sock) {
      memset(&req, 0, sizeof(req));
      strncpy(req.ifr_name, "eth0", sizeof(req.ifr_name));
      ret = ioctl(sock, SIOCGIFADDR, &req);
      if (-1 == ret) {
        close(sock);
      } else {
        memset(&szIpBuf, 0, sizeof(szIpBuf));
        strcpy(szIpBuf, inet_ntoa(((struct sockaddr_in*)&req.ifr_addr)->sin_addr));
        close(sock);
        fromAddr.sin_addr.s_addr = our_inet_addr(szIpBuf);
        from = fromAddr.sin_addr.s_addr;
      }
    } else {
      char tmp[100];
      sprintf(tmp, "This computer has an invalid IP address: %s",
              AddressString(from).val());
      env.setResultMsg(tmp);
      from = 0;
    }
  }
  // ... rest of ourIPAddress() unchanged
}

4. Memory leak
In RTCPInstance::processIncomingReport() (RTCP.cpp), "reason" is allocated without freeing the previous buffer. Add the release before the allocation:

if (NULL != reason) {
  delete[] reason;
  reason = NULL;
}
reason = new char[reasonLength + 1];
5. Filling DeltaTfiDivisor when SEI data is absent
In H264or5VideoStreamParser::H264or5VideoStreamParser():

// According to the H.264 and H.265 specs, if SEI data is not filled in,
// frame_field_info_present_flag is zero, so DeltaTfiDivisor must be set
// to 2.0 for H.264 and 1.0 for H.265:
if (fHNumber == 264) {
  DeltaTfiDivisor = 2.0;
} else {
  DeltaTfiDivisor = 1.0;
}
6. Long-running RTSP pulls fail with "Hit limit when reading incoming packet over TCP"
Consider raising maxRTCPPacketSize in RTCP.cpp:

static unsigned const maxRTCPPacketSize = 1456;

7. Latency grows the longer the stream plays
At the end of MultiFramedRTPSink::sendPacketIfNecessary() (MultiFramedRTPSink.cpp), the next packet is scheduled uSecondsToGo microseconds in the future, so every frame queues a small delay. Set uSecondsToGo to 0, as sketched below.
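A sketch of that spot; the scheduleDelayedTask() call is the one already at the end of sendPacketIfNecessary(), and only the delay value changes. The trade-off is that per-frame output pacing is lost, so the packets of a large frame go out back-to-back.

// End of MultiFramedRTPSink::sendPacketIfNecessary(), sketched:
uSecondsToGo = 0; // was derived from fNextSendTime - timeNow; 0 sends immediately

// Delay this amount of time before sending the next packet:
nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecondsToGo,
    (TaskFunc*)sendNext, this);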

8. Trimming the source tree

Only the directories you actually need have to be kept; the rest can be deleted. Within liveMedia there are files for many media types; unneeded ones can be removed too, but the corresponding createNew calls must then be deleted from MediaSubsession::createSourceObjects(), or the build fails (see the sketch below).
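For illustration only — the branch shape is paraphrased and argument lists vary across live555 versions — createSourceObjects() dispatches on fCodecName with one branch per codec, so removing (say) VP8 support means deleting both its source files and its branch:

// Illustrative shape of one dispatch branch inside
// MediaSubsession::createSourceObjects():
} else if (strcmp(fCodecName, "VP8") == 0) { // VP8 video
  fReadSource = fRTPSource
    = VP8VideoRTPSource::createNew(env(), fRTPSocket, fRTPPayloadFormat,
                                   fRTPTimestampFrequency);
}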
