I'm interested in using this library with a long, continuous MPEG-TS stream (rather than a segmented one, as in HLS).
I believe I need to follow this spec:
https://w3c.github.io/media-source/isobmff-byte-stream-format.html
I guess the following steps are needed:
- Make all STTS, STSC, and STCO boxes zero-length (empty).
- Add TRAF boxes containing TFDT boxes.
- Use relative addressing.
- Use TRUN boxes.
- Pre-create an initialization segment, to be followed by the media segments. Each media segment should consist of MOOF and MDAT boxes.
The first step seems simple, but I don't know how to do the others. I'd be happy to contribute, but I'm not very experienced with MP4's internal structure, so any help is appreciated.
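As background for the MOOF/MDAT layout, here is a minimal sketch (illustrative only, not this library's API) of how an MP4 box is serialized: a 32-bit big-endian size, a 4-character type, then the payload. Both the init segment (FTYP + MOOV, with MVEX/TREX nested in MOOV) and each media segment (MOOF + MDAT) are built by nesting boxes like this:

```javascript
// Serialize one MP4 box: 4-byte big-endian size + 4-char type + payload.
// A real muxer nests many of these (tfhd/tfdt/trun inside traf inside moof, etc.).
function box(type, payload) {
  const size = 8 + payload.length;
  const out = new Uint8Array(size);
  new DataView(out.buffer).setUint32(0, size); // box size, big-endian
  for (let i = 0; i < 4; i++) out[4 + i] = type.charCodeAt(i); // box type
  out.set(payload, 8); // box payload (child boxes or raw data)
  return out;
}

// An empty mdat box is just the 8-byte header: [0,0,0,8, 'm','d','a','t'].
const mdat = box('mdat', new Uint8Array(0));
```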
Thanks
MOTIVATION:
I would like to show live video with very low latency. I'm using a WebSocket to send data to the client as soon as possible (I'm aware of other alternatives like MPEG-DASH, but they are out of scope for now).
By pre-dividing the TS into chunks of whole TS and PES packets, I managed to work around the jBinary streaming problem mentioned in this issue:
jDataView/jBinary#41
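For context, the pre-division into whole TS packets can be sketched like this (a generic helper, not my actual code; TS packets are 188 bytes and start with the sync byte 0x47):

```javascript
// Split an incoming byte stream into whole 188-byte TS packets,
// returning any trailing partial packet so it can be buffered
// until the next chunk arrives.
const TS_PACKET_SIZE = 188;
const SYNC_BYTE = 0x47;

function splitTsPackets(buffer) {
  // Align to the first sync byte (skips junk before the first packet).
  let start = buffer.indexOf(SYNC_BYTE);
  if (start < 0) return { packets: [], remainder: new Uint8Array(0) };
  const packets = [];
  while (start + TS_PACKET_SIZE <= buffer.length && buffer[start] === SYNC_BYTE) {
    packets.push(buffer.subarray(start, start + TS_PACKET_SIZE));
    start += TS_PACKET_SIZE;
  }
  return { packets, remainder: buffer.subarray(start) };
}
```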
In addition, I changed the mpegts library to support a streaming context (in index.js), so future calls to mpegts can reuse segments passed to it previously.
Right now, every TS segment is converted into a complete MP4 file. The problem is that the beginning of each segment is missing some information (e.g., the first frames have no keyframe to refer to).
By creating fragmented MP4, I will be able to feed MSE segment by segment.
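Once the stream is fragmented, feeding MSE over the WebSocket could look roughly like this (a sketch; the codec string and WebSocket URL are placeholder assumptions, and `appendBuffer` calls must wait for `updateend`, hence the queue):

```javascript
// Pure FIFO that holds segments while the SourceBuffer is busy updating.
class SegmentQueue {
  constructor(append) { this.append = append; this.queue = []; this.busy = false; }
  push(seg) { this.queue.push(seg); this.flush(); }
  flush() {
    if (this.busy || this.queue.length === 0) return;
    this.busy = true;
    this.append(this.queue.shift());
  }
  done() { this.busy = false; this.flush(); } // call on 'updateend'
}

// Browser-only wiring (guarded so the pure part runs anywhere):
if (typeof MediaSource !== 'undefined') {
  const ms = new MediaSource();
  const video = document.querySelector('video');
  video.src = URL.createObjectURL(ms);
  ms.addEventListener('sourceopen', () => {
    // Codec string is an assumption; it must match the actual stream.
    const sb = ms.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
    const q = new SegmentQueue(buf => sb.appendBuffer(buf));
    sb.addEventListener('updateend', () => q.done());
    const ws = new WebSocket('ws://example/stream'); // hypothetical endpoint
    ws.binaryType = 'arraybuffer';
    // Server sends the init segment first, then moof+mdat segments.
    ws.onmessage = e => q.push(new Uint8Array(e.data));
  });
}
```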