
Handling ffmpeg's skewed (sliding) video output on Android

Published: 2023/12/10

On some devices, ffmpeg playback shows a garbled (corrupted) picture.


The original code was as follows:

while(av_read_frame(formatCtx, &packet) >= 0 && !_stop && NULL != window && bInit) {
    // Is this a packet from the video stream?
    if(packet.stream_index == videoStream) {
        // Decode video frame
        avcodec_decode_video2(codecCtx, decodedFrame, &frameFinished, &packet);
        // Did we get a video frame?
        if(frameFinished) {
            // Convert the image from its native format to RGBA
            sws_scale(sws_ctx, (uint8_t const * const *)decodedFrame->data,
                      decodedFrame->linesize, 0, codecCtx->height,
                      frameRGBA->data, frameRGBA->linesize);
            if(packet.dts == AV_NOPTS_VALUE && decodedFrame->opaque
                    && *(uint64_t*)decodedFrame->opaque != AV_NOPTS_VALUE) {
                pts = *(uint64_t *)decodedFrame->opaque;
                LOGD("pst1: %d", pts);
            } else if(packet.dts != AV_NOPTS_VALUE) {
                pts = packet.dts;
                LOGD("pst2: %d", pts);
            } else {
                pts = 0;
                LOGD("pst3: %d", pts);
            }
            //pts = av_q2d(codecCtx->time_base) * 1000000.0 * i * 2;
            pts *= 1000;
            if(0 == pts || 0 == baseTime) {
                baseTime = av_gettime() - pts;
                LOGD("BASETIME: %d", baseTime);
            } else {
                waitTime = (baseTime + pts) - av_gettime();
                LOGD("WAITTIME: %d, %d", waitTime, pts);
            }
            if(waitTime > 0)
                usleep(waitTime);
            if(!_stop) {
                synchronized(lockWindow) {
                    if(!_stop && NULL != window) {
                        // lock the window buffer
                        if (ANativeWindow_lock(pWin, &windowBuffer, NULL) < 0) {
                            LOGE("cannot lock window");
                        } else {
                            // draw the frame on the buffer
                            memcpy(windowBuffer.bits, buffer, width * height * RGB_SIZE);
                            // unlock the window buffer and post it to display
                            ANativeWindow_unlockAndPost(pWin);
                            // count number of frames
                            ++i;
                        }
                    }
                }
            }
        }
    }
}

Careful analysis showed that some resolutions displayed correctly while others did not, which suggested the rows were being offset by the width. The test results:

(F = garbled picture, O = displays correctly)

ORG:  176  x 144   F
X2:   352  x 288   O
X3:   528  x 432   F
X4:   704  x 576   O
X6:   1056 x ?     O

X1.1: 193  x 158   F
X1.2: 211  x 172   F
X1.5: 264  x 216   F

X0.5: 88   x 72    F

X2:   352  x 290   O
X2:   352  x 600   O
X2:   352  x 720   O
X4:   704  x 720   O
X6:   1056 x 720   O

Probing widths near 1280:

1280 -> 1312
1056
1184
1248 ok

The widths that display correctly appear to align on a %64+32 boundary, which suggested a memory-alignment issue. The ANativeWindow_Buffer structure looks like this:

typedef struct ANativeWindow_Buffer {
    // The number of pixels that are shown horizontally.
    int32_t width;
    // The number of pixels that are shown vertically.
    int32_t height;
    // The number of *pixels* that a line in the buffer takes in
    // memory. This may be >= width.
    int32_t stride;
    // The format of the buffer. One of WINDOW_FORMAT_*
    int32_t format;
    // The actual bits.
    void* bits;
    // Do not touch.
    uint32_t reserved[6];
} ANativeWindow_Buffer;

Logging stride and width showed that the picture is correct exactly when stride == width. The comment on stride confirms the cause: rows in the buffer are padded for memory alignment. The code was adjusted accordingly:

if(packet.stream_index == videoStream) {
    // Decode video frame
    avcodec_decode_video2(codecCtx, decodedFrame, &frameFinished, &packet);
    // Did we get a video frame?
    if(frameFinished) {
        // Convert the image from its native format to RGBA
        sws_scale(sws_ctx, (uint8_t const * const *)decodedFrame->data,
                  decodedFrame->linesize, 0, codecCtx->height,
                  frameRGBA->data, frameRGBA->linesize);
        if(packet.dts == AV_NOPTS_VALUE && decodedFrame->opaque
                && *(uint64_t*)decodedFrame->opaque != AV_NOPTS_VALUE) {
            pts = *(uint64_t *)decodedFrame->opaque;
            LOGD("pst1: %d", pts);
        } else if(packet.dts != AV_NOPTS_VALUE) {
            pts = packet.dts;
            LOGD("pst2: %d", pts);
        } else {
            pts = 0;
            LOGD("pst3: %d", pts);
        }
        pts *= 1000;
        if(0 == pts || 0 == baseTime) {
            baseTime = av_gettime() - pts;
            LOGD("BASETIME: %d", baseTime);
        } else {
            waitTime = (baseTime + pts) - av_gettime();
            LOGD("WAITTIME: %d, %d", waitTime, pts);
        }
        if(waitTime > 0)
            usleep(waitTime);
        if(!_stop) {
            synchronized(lockWindow) {
                if(!_stop && NULL != window) {
                    // lock the window buffer
                    if (ANativeWindow_lock(pWin, &windowBuffer, NULL) < 0) {
                        LOGE("cannot lock window");
                    } else {
                        if(windowBuffer.width >= windowBuffer.stride) {
                            // no row padding: one flat copy is safe
                            memcpy(windowBuffer.bits, buffer, width * height * RGB_SIZE);
                        } else {
                            // stride > width: copy row by row, skipping the
                            // alignment padding at the end of each line
                            for(int row = 0; row < height; ++row)
                                memcpy((uint8_t*)windowBuffer.bits + windowBuffer.stride * row * RGB_SIZE,
                                       buffer + width * row * RGB_SIZE,
                                       width * RGB_SIZE);
                        }
                        // unlock the window buffer and post it to display
                        ANativeWindow_unlockAndPost(pWin);
                        // count number of frames
                        ++i;
                    }
                }
            }
        }
    }
}


Copying the frame row by row skips the alignment padding at the end of each destination line, which resolves the skewed output.


Reposted from: https://www.cnblogs.com/yangykaifa/p/6992349.html
