What to do when FillBuffer in a DirectShow filter gets no frame

Editor: 李國帥

QQ: 9611153  WeChat: lgs9611153

Date: 2012-07-31

Background:

While developing a DirectShow filter some time ago, I ran into a problem: when the receiving filter's FillBuffer function got no data frame, the thread would spin in an empty loop and CPU usage climbed, so I looked through some references. There are basically only a few approaches: wait with Sleep, repeat the previous frame, or send a blank black frame.

Below is some material I found; I hope it is useful to someone.

 

Related material:

 

If you look at the docs for FillBuffer, it can return S_OK (prefer this to the equivalent NOERROR) or S_FALSE.

You cannot return S_OK unless you have provided a valid sample in the parameter pSample.

You can return S_FALSE if you are at the end of the stream but not to indicate that you do not have a sample ready.

I provided the information on the loop above but here it is in code:

while (true)
{
    {
        CAutoLock lock(&m_lock); // shared CCritSec with the function in the other thread that gets the sample data

        hr = CheckForNewSample(pSample); // returns VFW_S_STATE_INTERMEDIATE when a sample is not ready,
                                         // S_OK when a sample is provided in pSample,
                                         // and S_FALSE when end of stream is detected
        if (hr != VFW_S_STATE_INTERMEDIATE)
            return hr;
    } // drop the lock before sleeping so the producer thread can run

    ::Sleep(1); // adjust the sleep value to suit your needs
}

Hello,

I'm in the case where my source filter doesn't yet have any sample to push, but I don't know what to do in FillBuffer():

- I can't return S_OK since no sample is produced

- I can't return S_FALSE since it is not an end of stream

- I can't do a Sleep() until I have samples to produce, since it blocks the video rendering window.

 

I'm stuck.

 

Do you have any idea?

 

Thanks.

Answer 1

You block in FillBuffer until you have samples to push.  Yes, it blocks the rendering window.  Your source filter can send the EC_BUFFERING_DATA events to let the application know that you are buffering.
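
A minimal sketch of what this answer describes, assuming a CSourceStream-derived pin (so m_pFilter is available, as in the code further below); WaitForNetworkSample() and CopySampleData() are hypothetical helpers around the network receive thread, not part of the base classes:

// Sketch only: block inside FillBuffer and notify the application that we are buffering.
// WaitForNetworkSample() and CopySampleData() are hypothetical helpers.
HRESULT CNetSourcePin::FillBuffer(IMediaSample *pSample)
{
    // Tell the app buffering has started (EventParam1 = TRUE).
    m_pFilter->NotifyEvent(EC_BUFFERING_DATA, TRUE, 0);

    HRESULT hr = WaitForNetworkSample();          // blocks until a frame arrives or the stream ends

    // Buffering finished (EventParam1 = FALSE).
    m_pFilter->NotifyEvent(EC_BUFFERING_DATA, FALSE, 0);

    if (hr == S_FALSE)
        return S_FALSE;                           // end of stream

    return CopySampleData(pSample);               // copies the frame into pSample and sets timestamps
}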

Answer 2

Hi!

I overloaded the DoProcessingLoop function and then have FillBuffer return 99 (I sometimes have frames that I process but don't want to be displayed).

In DoProcessingLoop, if hr == 99, I skip the hr = Deliver(pSample); call and just do pSample->Release(), and everything continues fine.
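
A hedged sketch of that override, modeled on CSourceStream::DoBufferProcessingLoop from the DirectShow base classes; the pin class name and the private skip code (99) are assumptions, everything else mirrors the stock loop:

static const HRESULT S_SKIP_FRAME = 99;           // arbitrary private code returned by FillBuffer

HRESULT CMyPushPin::DoBufferProcessingLoop(void)
{
    Command com;
    OnThreadStartPlay();

    do {
        while (!CheckRequest(&com)) {
            IMediaSample *pSample;
            HRESULT hr = GetDeliveryBuffer(&pSample, NULL, NULL, 0);
            if (FAILED(hr)) {
                Sleep(1);
                continue;                         // no buffer available yet, try again
            }

            hr = FillBuffer(pSample);

            if (hr == S_OK) {
                Deliver(pSample);                 // normal path: push the sample downstream
                pSample->Release();
            } else if (hr == S_SKIP_FRAME) {
                pSample->Release();               // frame was processed but must not be displayed
            } else {                              // S_FALSE or an error: end the stream
                pSample->Release();
                DeliverEndOfStream();
                return S_OK;
            }
        }
        if (com == CMD_RUN || com == CMD_PAUSE)
            Reply(NOERROR);                       // acknowledge run/pause requests as the base class does
        else if (com != CMD_STOP)
            Reply((DWORD)E_UNEXPECTED);
    } while (com != CMD_STOP);

    return S_FALSE;
}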

 

Answer 3

Not sure how kosher this is, but I did this in FillBuffer:

FILTER_STATE State;
m_pFilter->GetState(0, &State);

if (State == State_Paused || State == State_Stopped)
{
    pIMediaSample->SetActualDataLength(0);  // TODO: Find better way to do this.
    return S_OK;
}

If no data is available you must wait, otherwise the CPU will spin in an empty loop:

LONG nCount = m_pRTPRecv->GetData(TRUE, pData, lDataLen, nTimeStamp, cbFrameType);

if (nCount == 0)
{
    // Got no data from the buffer before the timeout: deliver an empty sample.
    pms->SetActualDataLength(0);  // an empty buffer may crash some downstream filters
    pms->SetTime(NULL, NULL);     // no timestamps, so the sample is rendered immediately

    Sleep(5);                     // even with Sleep, CPU usage can stay high when many threads do this
    //CNetLog::printlog(_T("%s: no frame\n"), __FUNCTIONT__);

    return NOERROR;
}

http://www.tech-archive.net/Archive/Development/microsoft.public.win32.programmer.directx.video/2006-10/msg00426.html

Source Filter - What to do when no sample is available?

 

Hi,

 

I'm currently implementing a video conferencing application using DirectShow. I've got a DirectShow source filter (derived from CSource) which reads data (compressed audio or video) from the network and passes it along to the Audio/Video decompressor.

 

My dilemma at the moment is: what should I do when the FillBuffer method is called but no new data is available from the network yet? I have 2 alternatives, it seems:

 

1) block until there is some new data. The problem here is that blocking will freeze the entire graph including the renderer filter so this might cause jerky playback (or would it?)

 

2) return an empty sample by calling SetActualDataLength(0) on the IMediaSample pointer passed to the FillBuffer method. The problem here is that if there isn't any new data for a little while, DirectShow might go into a crazy loop calling FillBuffer over and over again (without doing anything in between) which might cause a lot of wasted CPU cycles (and cause the CPU load to reach 100%).

 

I've tried both solutions and both seem to work fine for either audio or video. So I don't quite know which one to pick and I'm not sure whether any of my worries could actually pose real problems.

 

So is there a "best practice" method to follow in these cases, or does anybody know which method would be better and for what reason?

 

Thanks.

 

Reply 1

1) This is what capture filters do. They block till there is another sample.

 

This should work just fine. Of course the playback will be jerky - you're freezing the output!

(The blocking approach is only used in capture filters.)

2) Provided that you set the timestamps correctly (to the time that the sample would have if it were real), this will not waste CPU time.

It may, however, crash downstream filters which expect real data to be present.

At best it will probably make the screen go black.

If you want to take this approach, I would send an emptied buffer of the right size (and timestamps).

(To go black, send a black compressed frame of the appropriate size.) (Supplying incorrect data or lengths may crash downstream filters.)

The one which you've missed is to repeat the last sample.

(Send a repeated frame.)

This probably isn't good for audio <g>, but will work fine for video (and is preferred in some cases, as things like VMR9 processing rely on a 'signal' on pin 0 to present changes (such as other input pins or, I think, the bitmap) to the display).
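
A minimal sketch of the "repeat the last sample" idea for video, assuming the pin caches the previously delivered frame and tracks stream time itself; GetNewFrame(), m_lastFrame, m_rtNextFrame and m_rtFrameLength are all assumed members, not part of the base classes:

// Sketch: when no new frame has arrived, resend the cached frame with fresh
// timestamps so the renderer keeps receiving a 'signal' on pin 0.
HRESULT CNetSourcePin::FillBuffer(IMediaSample *pSample)
{
    BYTE *pData = NULL;
    HRESULT hr = pSample->GetPointer(&pData);
    if (FAILED(hr))
        return hr;

    long cbFrame = 0;
    if (GetNewFrame(pData, pSample->GetSize(), &cbFrame))       // hypothetical: copies a fresh frame if one exists
    {
        m_lastFrame.assign(pData, pData + cbFrame);             // remember it for the next gap
    }
    else if (!m_lastFrame.empty())
    {
        memcpy(pData, m_lastFrame.data(), m_lastFrame.size());  // no new data: repeat the previous frame
        cbFrame = (long)m_lastFrame.size();
    }
    pSample->SetActualDataLength(cbFrame);

    // Timestamps must advance as if the frame were real, otherwise the graph may stall.
    REFERENCE_TIME rtStart = m_rtNextFrame;                     // 100 ns units
    REFERENCE_TIME rtStop  = rtStart + m_rtFrameLength;         // e.g. 333333 for ~30 fps
    pSample->SetTime(&rtStart, &rtStop);
    pSample->SetSyncPoint(TRUE);
    m_rtNextFrame = rtStop;

    return S_OK;
}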

Reply 2

You could also use a combination of the techniques: block in FillBuffer; if no data is received after a certain time and your socket read times out with a 10060 error, then deliver an empty sample to keep things rolling slowly. (A combined approach.)
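
A sketch of that combined approach, assuming the receive thread signals a Win32 auto-reset event (m_hFrameReady, an assumed member) whenever a frame lands in the queue, and CopyQueuedFrame() is a hypothetical helper:

// Sketch: wait a bounded time for the network thread, then fall back to an empty sample
// so the graph keeps ticking over slowly instead of blocking forever or spinning.
HRESULT CNetSourcePin::FillBuffer(IMediaSample *pSample)
{
    DWORD wait = WaitForSingleObject(m_hFrameReady, 500);       // 500 ms, roughly matching the socket timeout

    if (wait == WAIT_OBJECT_0)
        return CopyQueuedFrame(pSample);                        // copy the frame, set timestamps, return S_OK

    // Timed out (the socket read would have failed with WSAETIMEDOUT / 10060):
    // deliver a zero-length sample; note the warning above that some downstream
    // filters may not tolerate empty buffers.
    pSample->SetActualDataLength(0);
    pSample->SetTime(NULL, NULL);
    return S_OK;
}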

Reply 3

1) Sounds like a nice idea. I might try that if only I can find some time to optimize the thing once it's working.

 

The thing is that I do not set any timestamps at this stage. The audio and video frames are just rendered as they arrive. I did try to set timestamps, with limited success, in order to get lip synchronization (I've got 2 different graphs for the audio and video) but ran into quite a few problems, so I decided to remove all the timestamps for now since I need to get a beta version out right now, and it actually works better without any timestamps.

I'll have to reintroduce timestamps later but that's another problem. So it might go into a crazy loop if there are no frames for too long.

 

2) Problem is: what is the right size? It depends, I guess, on the codec used, so this means I'd need customized code for every codec I might use to encode audio and video.

 

But I think I'll take your advice and simply block, as it works anyway and I would not want to have some filters crash because of my empty buffers.

Nothing crashed during my testing even when I returned 0-length buffers, but this problem might appear if I decide to use other codecs, so I'll play it safe here.

 

"The one which you've missed is to repeat the last sample."

I actually thought of that but, as you said, it wouldn't work quite well for audio.

"This probably isn't good for audio <g>, but will work fine for video (and is preferred in some cases, as things like VMR9 processing rely on a 'signal' on pin 0 to present changes (such as other input pins or, I think, the bitmap) to the display)."

That's interesting. I'll have to have a closer look at how the VMR9 (which happens to be the video renderer I'm using at the moment) works.

 
