HTML5 audio streaming: precisely measure latency?
So, I need to precisely measure the latency between handing audio to the streaming server (Icecast?) and the audio coming out of the speakers of the computer hosting the player. Some blue-sky possibilities:
- Add metadata to the audio stream, and parse it from the playing audio (I understand this isn't possible using the standard audio element)
- Add brief periods of pure silence to the audio, and then detect them in the browser (can audio elements yield the actual audio samples?)
- Query the server and the browser as to the various buffer depths
Any thoughts as to how I could do this?
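For the second possibility, the Web Audio API can expose the raw samples an `<audio>` element is playing, so the silence markers could be detected client-side. A minimal sketch, assuming an RMS threshold and FFT size that are my own choices, not anything prescribed:

```javascript
// Pure helper: decide whether a buffer of time-domain samples is "silence".
// Samples are floats in [-1, 1]; the 0.001 RMS cutoff is an assumption.
function isSilent(samples, threshold = 0.001) {
  let sumSquares = 0;
  for (let i = 0; i < samples.length; i++) {
    sumSquares += samples[i] * samples[i];
  }
  return Math.sqrt(sumSquares / samples.length) < threshold;
}

// Browser-only wiring: tap the <audio> element through an AnalyserNode
// so embedded silent periods can be spotted as they come out of playback.
function watchForSilence(audioEl, onSilence) {
  const ctx = new AudioContext();
  const source = ctx.createMediaElementSource(audioEl);
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;
  source.connect(analyser);
  analyser.connect(ctx.destination); // keep the audio audible
  const buf = new Float32Array(analyser.fftSize);
  (function poll() {
    analyser.getFloatTimeDomainData(buf);
    if (isSilent(buf)) onSilence(performance.now());
    requestAnimationFrame(poll);
  })();
}
```

Timestamping when a known silent gap reaches the analyser, against when you injected it server-side, would give one end-to-end latency sample.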
You can use the `timeupdate` event of the `<audio>` element, which is fired three to four times per second, to perform precise animations during streaming of media by checking the `.currentTime` of the `<audio>` element, where animations or transitions can be started or stopped up to several times per second.
If it is available in the browser, you can use `fetch()` to request the audio resource and call `response.body.getReader()`, which returns a `ReadableStream` of the resource; create a new `MediaSource` object and set the `src` of the `<audio>` element to an object URL of the `MediaSource`; append the first stream chunks at the `sourceopen` event to a `SourceBuffer` with `.mode` set to `"sequence"`; append the remainder of the chunks to the `SourceBuffer` as `updateend` events fire.
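The `fetch()`/`MediaSource` flow above might be sketched like this. The MIME type is an assumption, and the chunk queue exists because `appendBuffer()` throws if called while the `SourceBuffer` is still updating:

```javascript
// Queue chunks and append them one at a time: appendBuffer() must not be
// called while the SourceBuffer's updating flag is true.
function makeAppender(sourceBuffer) {
  const queue = [];
  function pump() {
    if (!sourceBuffer.updating && queue.length) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  }
  sourceBuffer.addEventListener("updateend", pump);
  return function enqueue(chunk) {
    queue.push(chunk);
    pump();
  };
}

// Browser-only wiring: stream the resource into an <audio> element.
// "audio/mpeg" is an assumed codec; use whatever your stream actually is.
function streamToAudio(audioEl, url, mimeType = "audio/mpeg") {
  const mediaSource = new MediaSource();
  audioEl.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener("sourceopen", async () => {
    const sb = mediaSource.addSourceBuffer(mimeType);
    sb.mode = "sequence";
    const enqueue = makeAppender(sb);
    const reader = (await fetch(url)).body.getReader();
    for (;;) {
      const { done, value } = await reader.read();
      if (done) break;
      enqueue(value);
    }
  });
}
```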
If `response.body.getReader()` is not available in the browser, you can still use the `progress` event of the `<audio>` element to check `.currentTime` and start or stop animations or transitions at the required second of streaming media playback.
Use the `canplay` event of the `<audio>` element to play the media once the stream has accumulated adequate buffers at the `MediaSource` to proceed with playback.
To perform precise animations, you can use an object with properties set to numbers corresponding to the seconds of `<audio>` playback where an animation should occur, and values set to the CSS property of the element which should be animated; for example, at `0`, and at every sixty seconds until the media playback has concluded.
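The seconds-to-CSS mapping could look like the following. The schedule contents are made up for illustration, and `dueActions` is kept pure so each entry fires exactly once even though `timeupdate` only arrives a few times per second:

```javascript
// Hypothetical schedule: keys are playback seconds, values are the CSS
// changes to apply when playback reaches that second.
const schedule = {
  0:   { opacity: "0" },
  60:  { opacity: "1" },
  120: { opacity: "0" },
};

// Pure helper: return the CSS values whose scheduled time falls inside
// (lastTime, currentTime], so no entry is missed or applied twice.
function dueActions(schedule, lastTime, currentTime) {
  return Object.entries(schedule)
    .filter(([t]) => Number(t) > lastTime && Number(t) <= currentTime)
    .map(([, css]) => css);
}

// Browser-only wiring against the <audio> element's timeupdate events.
function wireAnimations(audioEl, el, schedule) {
  let last = -1;
  audioEl.addEventListener("timeupdate", () => {
    for (const css of dueActions(schedule, last, audioEl.currentTime)) {
      Object.assign(el.style, css);
    }
    last = audioEl.currentTime;
  });
}
```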
There is no way for you to measure the latency directly, but any audio element generates events such as `playing` when it has just (re)started playing (fired quite often), `stalled` if streaming has stopped, or `waiting` if data is loading. So what you can do is manipulate your video based on these events: pause while `stalled` or `waiting` is fired, then continue playing the video once `playing` fires again.
But I advise you to check the other events that might affect your flow (`error`, for example, would be important for you).
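A sketch of that event wiring, assuming a paired video element you want to keep in step with the audio stream:

```javascript
// Pause the video while the audio stream is stalled or waiting for data,
// and resume it once the 'playing' event fires again.
function syncVideoToAudio(audioEl, videoEl) {
  const pause = () => videoEl.pause();
  audioEl.addEventListener("stalled", pause);
  audioEl.addEventListener("waiting", pause);
  audioEl.addEventListener("playing", () => videoEl.play());
  // Worth handling too: playback errors would otherwise go unnoticed.
  audioEl.addEventListener("error", () =>
    console.error("audio stream error", audioEl.error));
}
```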
What I would try first is to create a timestamp with `performance.now()`, process the data, and record it in a blob with the new MediaRecorder API.
MediaRecorder will ask the user for access to their sound card; this can be a problem for your app, but it looks mandatory to get the real latency.
As soon as this is done, there are many ways to measure the actual latency between the generation and the actual rendering: basically, a sound event.
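A rough sketch of that approach; the 250 ms chunk interval is an assumption, and a real probe would search the recorded blob for a known marker sound rather than using the chunk boundary as the "heard" time:

```javascript
// Pure helper: elapsed milliseconds between two performance.now() stamps.
function latencyMs(sentAt, heardAt) {
  return heardAt - sentAt;
}

// Browser-only sketch: timestamp before handing audio off, then capture the
// machine's audio input with MediaRecorder and timestamp when data arrives.
// getUserMedia will prompt the user for microphone/sound-card access.
async function probeLatency(onLatency) {
  const sentAt = performance.now();
  // ...hand the audio to the streaming server here...
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream);
  recorder.ondataavailable = (e) => {
    // e.data is a Blob of captured audio covering the last chunk interval.
    onLatency(latencyMs(sentAt, performance.now()), e.data);
  };
  recorder.start(250); // assumed 250 ms chunks
}
```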
For further reference and example: