Swift: High-Performance Multi-Track Waveform Drawing from Background Data

I am developing a macOS audio application and need to display multiple audio track waveforms in a custom NSView embedded in an NSScrollView. My audio track model contains a URL to a potentially very large audio file. I need to:

  1. Read the audio file and calculate the waveform data (peak values) on a background thread to avoid blocking the main thread.

  2. Store the pre-calculated waveform data for each track efficiently.

  3. When the custom NSView's draw(_:) method is called, retrieve only the visible portion of the pre-calculated data and draw the waveforms.

  4. Handle view resizing and scrolling gracefully, triggering re-drawing only for the visible portion.
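To make step 1 concrete, here is a minimal sketch of the kind of background peak calculation I have in mind. The function name, the single-channel handling, and the `bucketSize` (samples per peak value) are assumptions for illustration, not a definitive implementation:

```swift
import AVFoundation

/// Downsample an audio file into per-bucket peak values; intended to be
/// called off the main thread. `bucketSize` is the number of source
/// samples collapsed into one peak value (illustrative default).
func computePeaks(for url: URL, bucketSize: AVAudioFrameCount = 4096) throws -> [Float] {
    let file = try AVAudioFile(forReading: url)
    var peaks: [Float] = []
    peaks.reserveCapacity(Int(file.length / Int64(bucketSize)) + 1)

    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: bucketSize) else {
        return peaks
    }
    while file.framePosition < file.length {
        try file.read(into: buffer, frameCount: bucketSize)
        guard buffer.frameLength > 0,
              let channel = buffer.floatChannelData?[0] else { break }
        // Peak = maximum absolute sample in this bucket (first channel only).
        var peak: Float = 0
        for i in 0..<Int(buffer.frameLength) {
            peak = max(peak, abs(channel[i]))
        }
        peaks.append(peak)
    }
    return peaks
}
```

The resulting `[Float]` per track is the pre-calculated data from step 2 that the view would index into during drawing.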

What is the best pattern for managing the background data processing while keeping the UI responsive? I am currently using DispatchQueue.global().async, but I am unsure how best to synchronize the data between the background processing and the main-thread drawing logic without causing performance issues during scrolling.
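For reference, this is roughly my current approach: compute on a background queue, hop back to the main queue to store the result, and invalidate only the visible rect. `WaveformView`, `peaksByTrack`, and `computePeaks(for:)` are illustrative names standing in for my actual model and peak-calculation code:

```swift
import AppKit

final class WaveformView: NSView {
    // Accessed from the main thread only, so draw(_:) never races
    // with the background work.
    private var peaksByTrack: [UUID: [Float]] = [:]

    func loadWaveform(trackID: UUID, fileURL: URL) {
        DispatchQueue.global(qos: .userInitiated).async {
            // computePeaks(for:) is the hypothetical background step
            // that reads the file and downsamples it to peak values.
            guard let peaks = try? computePeaks(for: fileURL) else { return }
            DispatchQueue.main.async {
                self.peaksByTrack[trackID] = peaks
                // Redraw only what is currently on screen in the scroll view.
                self.setNeedsDisplay(self.visibleRect)
            }
        }
    }

    override func draw(_ dirtyRect: NSRect) {
        guard let peaks = peaksByTrack.values.first, !peaks.isEmpty else { return }
        // Map dirtyRect's horizontal extent to a slice of peak indices,
        // so off-screen peaks are never touched.
        let pointsPerPeak = bounds.width / CGFloat(peaks.count)
        let first = max(0, Int(dirtyRect.minX / pointsPerPeak))
        let last = min(peaks.count - 1, Int(dirtyRect.maxX / pointsPerPeak))
        guard first <= last else { return }

        let midY = bounds.midY
        let path = NSBezierPath()
        for i in first...last {
            let x = CGFloat(i) * pointsPerPeak
            let halfHeight = CGFloat(peaks[i]) * bounds.height / 2
            path.move(to: NSPoint(x: x, y: midY - halfHeight))
            path.line(to: NSPoint(x: x, y: midY + halfHeight))
        }
        NSColor.systemBlue.setStroke()
        path.stroke()
    }
}
```

My concern is whether the main-queue hop plus `setNeedsDisplay(_:)` is the right synchronization point, or whether a different mechanism (e.g. an actor or a lock around the peak storage) would behave better under heavy scrolling.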