# Custom Video Source

This guide explains how to use the `CustomVideoSource` class to push video frames from external sources directly to the WebRTC video pipeline.

## Overview

The `CustomVideoSource` allows you to provide video frames from custom sources such as:
- Video files
- Network streams
- Generated video (patterns, animations, etc.)
- Video processing libraries
- Any other source of raw video frames

This is particularly useful when you need to:
- Stream pre-recorded video
- Process video before sending it
- Generate synthetic video
- Integrate with external video APIs

## Basic Usage

### Creating a Custom Video Source

To use a custom video source, you first need to create an instance:

```java
import dev.onvoid.webrtc.media.video.CustomVideoSource;

// Create a new CustomVideoSource instance
CustomVideoSource videoSource = new CustomVideoSource();
```

### Creating a Video Track

Once you have a custom video source, you can create a video track with it:

```java
import dev.onvoid.webrtc.PeerConnectionFactory;
import dev.onvoid.webrtc.media.video.VideoTrack;

// Create a PeerConnectionFactory (you should already have this in your WebRTC setup)
PeerConnectionFactory factory = new PeerConnectionFactory();

// Create a video track using the custom video source
VideoTrack videoTrack = factory.createVideoTrack("video-track-id", videoSource);
```

### Pushing Video Frames

The key feature of `CustomVideoSource` is the ability to push video frames directly to the WebRTC pipeline:

```java
import dev.onvoid.webrtc.media.video.NativeI420Buffer;
import dev.onvoid.webrtc.media.video.VideoFrame;

// Allocate an I420 buffer with the desired dimensions
int width = 640;
int height = 480;
NativeI420Buffer buffer = NativeI420Buffer.allocate(width, height);

// Fill the buffer with your video data
// ...

// Create a video frame from the buffer with a capture timestamp in nanoseconds
VideoFrame frame = new VideoFrame(buffer, System.nanoTime());

// Push the frame to the WebRTC pipeline
videoSource.pushFrame(frame);

// Don't forget to release the frame when done
frame.dispose();
```
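
The "fill the buffer" step above is intentionally left open, since it depends on your source. As a minimal, self-contained sketch, the helper below generates a luma (Y) plane containing a moving grayscale gradient; `TestPattern` and `lumaGradient` are hypothetical names, and the generated bytes would then be copied into the buffer's Y plane (with the U and V planes set to 128 for neutral chroma):

```java
// Hypothetical helper that generates a Y (luma) plane as a horizontal
// gradient. Copying it into the I420 buffer's luma plane, with U and V
// filled with 128, produces a grayscale test pattern.
public class TestPattern {

    public static byte[] lumaGradient(int width, int height, int frameIndex) {
        byte[] luma = new byte[width * height];
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                // Shift the gradient by frameIndex so successive frames show motion
                luma[row * width + col] = (byte) ((col + frameIndex) % 256);
            }
        }
        return luma;
    }
}
```

Each call with an incrementing `frameIndex` yields a visibly different frame, which is useful for verifying that frames actually reach the remote peer.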

## Video Format Considerations

When pushing video frames, you need to consider the following:

### Resolution
- Common resolutions: 320x240, 640x480, 1280x720, 1920x1080
- Higher resolutions require more bandwidth and processing power

### Frame Rate
- Common frame rates: 15, 24, 30, 60 fps
- Higher frame rates provide smoother video but require more bandwidth
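
The frame rate translates directly into a scheduling period when you push frames on a timer. A small sketch (plain Java, no library types) shows the rounding involved; note that integer millisecond periods round down, so the effective rate drifts slightly above the target, while scheduling in microseconds (which `ScheduledExecutorService` also accepts via `TimeUnit.MICROSECONDS`) stays much closer:

```java
// Converting a target frame rate into a scheduling period.
public class FramePacing {

    // Millisecond precision: 1000 / 30 = 33 ms, i.e. ~30.3 fps effective.
    public static long periodMs(int frameRate) {
        return 1000L / frameRate;
    }

    // Microsecond precision: 1_000_000 / 30 = 33333 us, ~30.0003 fps effective.
    public static long periodUs(int frameRate) {
        return 1_000_000L / frameRate;
    }
}
```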

### Color Format
- WebRTC primarily uses the I420 (YUV 4:2:0) format
- You may need to convert from other formats (RGB, RGBA, NV12, etc.)
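
As a rough illustration of what such a conversion involves, the sketch below converts packed 3-byte RGB into the three I420 planes using BT.601 studio-swing coefficients. It assumes even width and height; `RgbToI420` is a hypothetical helper name, and production code would normally use an optimized library such as libyuv rather than a per-pixel loop:

```java
// Per-pixel RGB -> I420 conversion sketch (BT.601 studio-swing).
// Input: packed RGB, 3 bytes per pixel; output: { Y, U, V } planes.
public class RgbToI420 {

    public static byte[][] convert(byte[] rgb, int width, int height) {
        byte[] y = new byte[width * height];
        byte[] u = new byte[(width / 2) * (height / 2)];
        byte[] v = new byte[(width / 2) * (height / 2)];

        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int i = (row * width + col) * 3;
                int r = rgb[i] & 0xFF;
                int g = rgb[i + 1] & 0xFF;
                int b = rgb[i + 2] & 0xFF;

                y[row * width + col] = (byte) (((66 * r + 129 * g + 25 * b + 128) >> 8) + 16);

                // 4:2:0 chroma subsampling: one U/V sample per 2x2 pixel block
                if (row % 2 == 0 && col % 2 == 0) {
                    int ci = (row / 2) * (width / 2) + (col / 2);
                    u[ci] = (byte) (((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128);
                    v[ci] = (byte) (((112 * r - 94 * g - 18 * b + 128) >> 8) + 128);
                }
            }
        }
        return new byte[][] { y, u, v };
    }
}
```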

## Advanced Usage

### Continuous Video Streaming

For continuous streaming, you'll typically push video frames from a separate thread:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import dev.onvoid.webrtc.media.video.CustomVideoSource;
import dev.onvoid.webrtc.media.video.NativeI420Buffer;
import dev.onvoid.webrtc.media.video.VideoFrame;

public class VideoStreamer {
    private final CustomVideoSource videoSource;
    private final ScheduledExecutorService executor;
    private final int width = 640;
    private final int height = 480;
    private final int frameRate = 30;

    public VideoStreamer(CustomVideoSource videoSource) {
        this.videoSource = videoSource;
        this.executor = Executors.newSingleThreadScheduledExecutor();
    }

    public void start() {
        // Schedule the task to run at the desired frame rate
        int periodMs = 1000 / frameRate;
        executor.scheduleAtFixedRate(this::pushNextVideoFrame, 0, periodMs, TimeUnit.MILLISECONDS);
    }

    public void stop() {
        executor.shutdown();
    }

    private void pushNextVideoFrame() {
        try {
            // Create a buffer for the frame
            NativeI420Buffer buffer = NativeI420Buffer.allocate(width, height);

            // Fill the buffer with your video data
            // ...

            // Create and push the frame
            VideoFrame frame = new VideoFrame(buffer, System.nanoTime());
            videoSource.pushFrame(frame);

            // Release resources
            frame.dispose();
        }
        catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```
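
Note that `stop()` above only initiates an orderly shutdown; it does not wait for an in-flight `pushNextVideoFrame` run to finish, which matters if you dispose of the video source immediately afterwards. A stricter variant, sketched here with only `java.util.concurrent` types (`StreamerShutdown` and `stopAndWait` are illustrative names):

```java
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class StreamerShutdown {

    // Returns true if the executor terminated cleanly within the grace period.
    public static boolean stopAndWait(ScheduledExecutorService executor)
            throws InterruptedException {
        // Stop scheduling new frame pushes; already-running tasks continue
        executor.shutdown();
        if (!executor.awaitTermination(1, TimeUnit.SECONDS)) {
            // Still running after the grace period: cancel the periodic task
            executor.shutdownNow();
            return false;
        }
        return true;
    }
}
```

Only after `stopAndWait` returns is it safe to dispose of the `CustomVideoSource`.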

## Integration with Video Tracks

### Adding Sinks to Monitor Video

You can add sinks to the video track to monitor the video frames:

```java
import dev.onvoid.webrtc.media.video.VideoTrackSink;

// Create a sink to monitor the video frames
VideoTrackSink monitorSink = frame -> {
    System.out.println("Received frame: " +
            frame.getBuffer().getWidth() + "x" +
            frame.getBuffer().getHeight() + ", " +
            "rotation: " + frame.getRotation() + ", " +
            "timestamp: " + frame.getTimestampUs() + "µs");

    // You can process or analyze the frame here
    // Note: don't modify the frame directly
};

// Add the sink to the video track
videoTrack.addSink(monitorSink);

// When done, remove the sink
videoTrack.removeSink(monitorSink);
```

## Cleanup

When you're done with the custom video source, make sure to clean up resources. Stop pushing frames first, so that no frame is pushed to an already-disposed source:

```java
// If you're using a scheduled executor for pushing frames, stop it first
videoStreamer.stop();

// Dispose of the video track
videoTrack.dispose();

// Dispose of the video source to prevent memory leaks
videoSource.dispose();
```

## Conclusion

The `CustomVideoSource` provides a flexible way to integrate external video sources with WebRTC. By understanding the video format parameters and properly managing the video frame flow, you can create applications that use custom video from virtually any source.

For more information on video processing and other WebRTC features, refer to the additional guides in the documentation.