How to create a local audio livestream server with ffmpeg and Python?
Simply put, this is what I’m trying to accomplish: I navigate to something like http://localhost:8080/ in my browser, and the browser shows its built-in audio player playing whatever the ffmpeg process is streaming, not just serving a local audio file. (“Built-in” here meaning the page looks the same as if you had opened an mp3 file directly in your browser.)
At first I thought it would be easy, since ffmpeg can stream over various protocols. I seem to have misunderstood, though: while I can stream something over RTP with it, I can’t access that stream from my browser. Some Stack Overflow questions I found seem to imply that you can do this with the output options -f mpegts http://localhost:8080, but when I try this, ffmpeg freezes for a second and then I get these errors:
[tcp @ 00000210f70b0700] Connection to tcp://localhost:8080 failed: Error number -138 occurred
[out#0/mpegts @ 00000210f7080ec0] Error opening output http://localhost:8080: Error number -138 occurred
Error opening output file http://localhost:8080.
Error opening output files: Error number -138 occurred
but I have no problem with -f rtp rtp://localhost:8080. (Like I said, though, I can’t access that through the browser.)
So I suspect I need something else to “pick up” the RTP stream and serve it over HTTP, but I haven’t been able to find anything on that, probably because I don’t know the right terms to search for. It seems like something that should be easily doable in Python, and that would be my preferred language over JavaScript, if possible.
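For context, this is roughly the shape I have in mind in Python: a small HTTP server that starts an ffmpeg process per request and relays its stdout to the browser. It’s an untested sketch, input.mp3 and the port are just placeholders, and I don’t know if this is the right approach:

```python
# Sketch: relay ffmpeg's stdout through a tiny HTTP server so a browser
# visiting http://localhost:8080/ gets a playable audio stream.
# NOTE: "input.mp3" and the port are placeholders, not a working setup.
import subprocess
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer


def ffmpeg_command(source):
    # Build an ffmpeg invocation that re-encodes the source to MP3 and
    # writes the result to stdout ("-"). -re reads the input at its
    # native rate, so it behaves like a live stream rather than a file copy.
    return ["ffmpeg", "-re", "-i", source, "-vn", "-f", "mp3", "-"]


class StreamHandler(BaseHTTPRequestHandler):
    source = "input.mp3"  # placeholder input

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "audio/mpeg")
        self.end_headers()
        proc = subprocess.Popen(
            ffmpeg_command(self.source),
            stdout=subprocess.PIPE,
            stderr=subprocess.DEVNULL,
        )
        try:
            # Copy ffmpeg's output to the browser until either side stops.
            while True:
                chunk = proc.stdout.read(4096)
                if not chunk:
                    break
                self.wfile.write(chunk)
        except (BrokenPipeError, ConnectionResetError):
            pass  # the browser closed the connection / stopped playback
        finally:
            proc.kill()


if __name__ == "__main__":
    ThreadingHTTPServer(("localhost", 8080), StreamHandler).serve_forever()
```

I don’t know if serving audio/mpeg like this is enough for the browser to show its built-in player, or whether the RTP stream still needs to be involved somewhere.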
Can anyone point me in the right direction? Or let me know if I’m misunderstanding something? Thanks.