RCRTCSurfaceTextureHelper
Helper class for using a SurfaceTexture to create WebRTC VideoFrames. To create WebRTC VideoFrames, render onto the SurfaceTexture; the frames will be delivered to the listener. Only one texture frame can be in flight at a time, so a frame must be released before a new frame can be received. Call stopListening() to stop receiving new frames. Call dispose() to release all resources once the texture frame has been released.
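The one-frame-in-flight contract above can be sketched with a small stand-in model. This is not the real SDK: the `Dispatcher`, `Frame`, and `FrameListener` names are hypothetical, and the code only illustrates the documented rule that the next frame is delivered only after the current one is released.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Hypothetical listener, mirroring an onFrame() callback.
interface FrameListener { void onFrame(Frame frame); }

// Hypothetical frame handle; release() frees the slot for the next frame.
final class Frame {
    private final int id;
    private final Runnable onRelease;
    Frame(int id, Runnable onRelease) { this.id = id; this.onRelease = onRelease; }
    int id() { return id; }
    void release() { onRelease.run(); }
}

// Toy dispatcher modeling the helper's delivery rule: at most one
// frame in flight; pending frames wait until the current one is released.
final class Dispatcher {
    private final Queue<Integer> pending = new ArrayDeque<>();
    private FrameListener listener;
    private boolean frameInFlight = false;

    void startListening(FrameListener l) { listener = l; deliverNext(); }
    void stopListening() { listener = null; }

    // Called when a new frame has been rendered onto the SurfaceTexture.
    void frameAvailable(int id) { pending.add(id); deliverNext(); }

    private void deliverNext() {
        if (listener == null || frameInFlight || pending.isEmpty()) return;
        frameInFlight = true;
        int id = pending.poll();
        listener.onFrame(new Frame(id, () -> { frameInFlight = false; deliverNext(); }));
    }
}

public class FrameInFlightDemo {
    public static void main(String[] args) {
        Dispatcher d = new Dispatcher();
        StringBuilder log = new StringBuilder();
        d.startListening(frame -> {
            log.append("got ").append(frame.id()).append("; ");
            // Releasing the frame allows the next one to be delivered.
            frame.release();
        });
        d.frameAvailable(1);
        d.frameAvailable(2);
        System.out.println(log);  // prints "got 1; got 2; "
    }
}
```

A listener that forgets to call release() would receive the first frame and then nothing, which is the symptom the documentation's release requirement guards against.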
Functions
public static RCRTCSurfaceTextureHelper create(String threadName, Context sharedContext, boolean alignTimestamps, YuvConverter yuvConverter)
Construct a new SurfaceTextureHelper sharing OpenGL resources with |sharedContext|.
public static RCRTCSurfaceTextureHelper create(String threadName, Context sharedContext, boolean alignTimestamps)
Same as above with yuvConverter set to a new YuvConverter.
public static RCRTCSurfaceTextureHelper create(String threadName, Context sharedContext)
Same as above with alignTimestamps set to false and yuvConverter set to a new YuvConverter.
Retrieve the handler that calls onFrame().
Retrieve the underlying SurfaceTexture.
Set the rotation of the delivered frames.
Use this function to set the texture size.
Start to stream textures to the given |listener|.
Stop listening.
Posts to the correct thread to convert |textureBuffer| to I420.
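The conversion to I420 must run on the helper's own thread (the thread that owns the OpenGL context), so the call is posted there rather than executed on the caller's thread. A minimal sketch of that post-and-wait pattern, using a single-threaded executor as a stand-in for the helper's handler; `convertToI420` and the string "buffers" are hypothetical placeholders for the real texture work:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ThreadMarshalDemo {
    // Stand-in for the helper's dedicated texture thread.
    static final ExecutorService helperThread = Executors.newSingleThreadExecutor();

    // Hypothetical conversion routine; the real conversion does OpenGL work
    // and therefore must execute on the thread that owns the GL context.
    static String convertToI420(String textureBuffer) {
        return "I420(" + textureBuffer + ")";
    }

    // Posts the conversion to the helper thread and blocks for the result,
    // mirroring how the call is marshaled onto the correct thread.
    static String textureToYuv(String textureBuffer) throws Exception {
        return helperThread.submit(() -> convertToI420(textureBuffer)).get();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(textureToYuv("frame0"));  // prints "I420(frame0)"
        helperThread.shutdown();
    }
}
```

Blocking the caller is one design choice; the real helper could equally deliver the converted buffer asynchronously, but the thread-affinity requirement is the same either way.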