Package tvi.webrtc

Class SurfaceTextureHelper

java.lang.Object
tvi.webrtc.SurfaceTextureHelper

public class SurfaceTextureHelper extends Object
Helper class for using a SurfaceTexture to create WebRTC VideoFrames. To create WebRTC VideoFrames, render onto the SurfaceTexture; the frames will be delivered to the listener. Only one texture frame can be in flight at a time, so the frame must be released in order to receive a new one. Call stopListening() to stop receiving new frames, and call dispose() to release all resources once the texture frame has been released.
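
A minimal lifecycle sketch, assuming the application owns an EglBase instance; the thread name, texture size, and producer wiring are illustrative:

  import android.view.Surface;
  import tvi.webrtc.EglBase;
  import tvi.webrtc.SurfaceTextureHelper;

  final class CaptureExample {
    void run() {
      EglBase eglBase = EglBase.create();
      SurfaceTextureHelper helper =
          SurfaceTextureHelper.create("CaptureThread", eglBase.getEglBaseContext());
      if (helper == null) {
        eglBase.release();
        return; // EGL failed to initialize
      }
      helper.setTextureSize(1280, 720);
      helper.startListening(frame -> {
        // Only one texture frame is in flight at a time; release it to get the next.
        frame.release();
      });
      // Render onto the SurfaceTexture, e.g. from a camera or decoder:
      Surface producerSurface = new Surface(helper.getSurfaceTexture());
      // ... produce frames into producerSurface ...
      helper.stopListening(); // no more onFrame() callbacks after this returns
      helper.dispose();
      producerSurface.release();
      eglBase.release();
    }
  }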
  • Method Details

    • create

      public static SurfaceTextureHelper create(String threadName, EglBase.Context sharedContext, boolean alignTimestamps, YuvConverter yuvConverter, SurfaceTextureHelper.FrameRefMonitor frameRefMonitor)
      Construct a new SurfaceTextureHelper sharing OpenGL resources with `sharedContext`. A dedicated thread and handler are created for handling the SurfaceTexture. May return null if EGL fails to initialize a pixel buffer surface and make it current. If alignTimestamps is true, the frame timestamps will be aligned to rtc::TimeNanos(). In that case there is no need to align the timestamps again in PeerConnectionFactory.createVideoSource(), which makes them more accurate and closer to the actual creation time.
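      For example (a sketch; the thread name is illustrative, `eglBase` is the EglBase instance from the class-level example above, and passing null for the frame ref monitor matches the four-argument overload below):

        SurfaceTextureHelper helper = SurfaceTextureHelper.create(
            "DecoderThread",              // name of the dedicated handler thread
            eglBase.getEglBaseContext(),  // shared EGL context
            /* alignTimestamps= */ true,  // align frame timestamps to rtc::TimeNanos()
            new YuvConverter(),           // converter used for texture-to-I420
            /* frameRefMonitor= */ null); // no frame ref monitoring
        if (helper == null) {
          // EGL failed to initialize a pixel buffer surface; handle the failure.
        }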
    • create

      public static SurfaceTextureHelper create(String threadName, EglBase.Context sharedContext)
      Same as above, with alignTimestamps set to false and yuvConverter set to a new YuvConverter.
    • create

      public static SurfaceTextureHelper create(String threadName, EglBase.Context sharedContext, boolean alignTimestamps)
      Same as above, with yuvConverter set to a new YuvConverter.
    • create

      public static SurfaceTextureHelper create(String threadName, EglBase.Context sharedContext, boolean alignTimestamps, YuvConverter yuvConverter)
      Create a SurfaceTextureHelper without a frame ref monitor.
    • startListening

      public void startListening(VideoSink listener)
      Start to stream textures to the given `listener`. If you need to change the listener, call stopListening() first.
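      For example, switching listeners (a sketch; the sink names are illustrative):

        VideoSink firstSink = frame -> frame.release();
        VideoSink secondSink = frame -> frame.release();
        helper.startListening(firstSink);
        // ...
        helper.stopListening(); // required before installing a different listener
        helper.startListening(secondSink);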
    • stopListening

      public void stopListening()
      Stop listening. The listener set in startListening() is guaranteed not to receive any more onFrame() callbacks after this function returns.
    • setTextureSize

      public void setTextureSize(int textureWidth, int textureHeight)
      Use this function to set the texture size. Note: do not call setDefaultBufferSize() yourself, since this class needs to be aware of the texture size.
    • forceFrame

      public void forceFrame()
      Forces a frame to be produced. If no new frame is available, the last frame is sent to the listener again.
    • setFrameRotation

      public void setFrameRotation(int rotation)
      Set the rotation of the delivered frames.
    • getSurfaceTexture

      public android.graphics.SurfaceTexture getSurfaceTexture()
      Retrieve the underlying SurfaceTexture. The SurfaceTexture should be passed in to a video producer such as a camera or decoder.
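      For example, wrapping the SurfaceTexture in an android.view.Surface, a common pattern for Camera2 and MediaCodec producers (a sketch; the size is illustrative):

        helper.setTextureSize(640, 480); // set the size before frames are produced
        Surface surface = new Surface(helper.getSurfaceTexture());
        // Hand `surface` to the producer, e.g. CaptureRequest.Builder#addTarget(surface)
        // or MediaCodec#configure(format, surface, null, 0).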
    • getHandler

      public android.os.Handler getHandler()
      Retrieve the handler that calls onFrame(). This handler is valid until dispose() is called.
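      For example, posting work onto the helper's thread (a sketch):

        helper.getHandler().post(() -> {
          // Runs on the same dedicated thread that calls onFrame().
        });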
    • isTextureInUse

      public boolean isTextureInUse()
      Returns whether a texture frame is currently in flight, i.e. has been delivered to the listener but not yet released.
    • dispose

      public void dispose()
      Call dispose() to stop receiving frames and release all resources. OpenGL resources are released and the handler is stopped once the texture frame has been released. You are guaranteed not to receive any more onFrame() callbacks after this function returns.
    • textureToYuv

      @Deprecated public VideoFrame.I420Buffer textureToYuv(VideoFrame.TextureBuffer textureBuffer)
      Deprecated. Use toI420() instead.
      Posts to the correct thread to convert `textureBuffer` to I420.
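      The preferred replacement calls toI420() on the buffer itself (a sketch; `textureBuffer` is any VideoFrame.TextureBuffer the caller holds a reference to):

        VideoFrame.I420Buffer i420 = textureBuffer.toI420();
        // ... consume the I420 pixel data ...
        i420.release();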