Problem

Some projects using WebRTC functionality have indicated that they need to record data on “source to screen” performance - that is, how much time elapses between the moment some event (typically frame capture) occurs in “real life” on the media-generating side and the moment the same event is presented to the user on the media-consuming side.
Approach

I’ve sketched out an approach in this repo:
It consists of a mechanism for telling the media engine to note the times at which certain events happen to a frame, and a way to get those notes back to JS. It is intended to be predictable and not too resource-intensive.
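To make the shape of such an API more concrete, here is a minimal TypeScript sketch of what "note event times and read them back from JS" could look like. All names here (`enableFrameTiming`, `getFrameTimingEvents`, `FrameTimingEvent`) are hypothetical placeholders for illustration, not the interface defined in the repo, and the sketch ignores cross-device clock alignment.

```ts
// Hypothetical sketch only: these names are illustrative, not the proposal's API.

// A note recorded by the media engine for a single frame.
interface FrameTimingEvent {
  frameId: number;                                               // which frame the note applies to
  event: 'capture' | 'encode' | 'packetize' | 'decode' | 'render';
  timestamp: DOMHighResTimeStamp;                                // when the event happened
}

// Ask the media engine to start noting event times for frames on this track.
declare function enableFrameTiming(track: MediaStreamTrack): void;

// Retrieve the notes collected so far.
declare function getFrameTimingEvents(track: MediaStreamTrack): Promise<FrameTimingEvent[]>;

// Usage: compute a per-frame "source to screen" delay from the collected notes.
// (A real deployment would need the capture timestamps carried from the sender
// and mapped onto the receiver's clock; that is glossed over here.)
async function logSourceToScreenDelay(track: MediaStreamTrack): Promise<void> {
  const events = await getFrameTimingEvents(track);

  // Group the notes by frame.
  const byFrame = new Map<number, Partial<Record<FrameTimingEvent['event'], number>>>();
  for (const e of events) {
    const entry = byFrame.get(e.frameId) ?? {};
    entry[e.event] = e.timestamp;
    byFrame.set(e.frameId, entry);
  }

  // Report capture-to-render time for frames that have both notes.
  for (const [frameId, times] of byFrame) {
    if (times.capture !== undefined && times.render !== undefined) {
      console.log(`frame ${frameId}: ${(times.render - times.capture).toFixed(1)} ms source to screen`);
    }
  }
}
```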
Comments are welcome, as is guidance on the WICG process - this is my first attempt to use this forum for an API proposal.