I use wavesurfer.js to display and play back audio, with some user-customizable filters that change the playback in different ways.
I would now like to also process audio from an audio input in real time. I’ve tried the Microphone plugin, which seems like it does what I need, and I got it to display incoming audio, but I can’t figure out whether it supports actual playback as well – does it?
It seemed like the play() function might do that, but I can’t get it to actually ‘play’ anything. On the other hand, if it doesn’t support playback – why not, and is there an alternative way to ‘connect’ the microphone (or ideally any selectable audio source) to my wavesurfer setup, e.g. via the AudioContext backend? (In fact, I could even live without the visualization if I could ‘hear’ the input instead.)
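For reference, here is roughly what I had in mind for the AudioContext route – plain Web Audio with no wavesurfer involved, just to check whether the microphone can be made audible at all (this is my own sketch, not anything from the wavesurfer or Microphone plugin API):

```javascript
// Sketch: route the microphone input straight to the speakers via the
// Web Audio API. If this works, filter nodes could presumably be
// inserted between source and destination later.
navigator.mediaDevices.getUserMedia({ audio: true })
  .then((stream) => {
    const audioCtx = new AudioContext();
    // Wrap the live MediaStream as a Web Audio source node.
    const source = audioCtx.createMediaStreamSource(stream);
    // Connect directly to the output so the input becomes audible.
    source.connect(audioCtx.destination);
  })
  .catch((err) => console.error('getUserMedia failed:', err));
```

(Note to self: browsers may require a user gesture before the AudioContext will start, and monitoring the mic through the speakers can cause feedback without headphones.)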
I’m still pretty new to the Web Audio API and wavesurfer.js.
Any help is appreciated, thanks a lot!