SCRAPL: Scattering Transform with Random Paths for Machine Learning

Image generated by Gemini AI
Researchers have introduced SCRAPL (Scattering transform with Random Paths for machine Learning), a novel optimization method that streamlines the use of wavelet scattering transforms in neural network training. By sampling scattering paths stochastically, SCRAPL makes the joint time-frequency scattering transform efficient enough to serve as a training objective for analyzing sound patterns, such as matching a granular synthesizer to the Roland TR-808. The method includes an importance sampling heuristic to improve model convergence and performance. The code is released as a Python package, with audio samples available online, facilitating broader application in audio processing tasks.
Introducing SCRAPL: A New Approach to Machine Learning with Scattering Transforms
Researchers have developed the Scattering Transform with Random Paths for Machine Learning (SCRAPL), a method aimed at making scattering transforms efficient enough for deep learning applications. It addresses the significant computational overhead of evaluating wavelet scattering coefficients, which are critical for perceptual quality assessment in computer vision and audio processing.
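To see where that overhead comes from, note how scattering paths multiply: each first-order wavelet band spawns a family of second-order bands, and each path needs its own filtering pass. The toy NumPy sketch below is an illustration only, not the authors' implementation; the Gaussian bandpass filters, scale choices, and global averaging are assumptions made for brevity.

```python
# Toy illustration of why second-order scattering is costly: the number of
# paths grows roughly quadratically with the number of first-order scales J.
import numpy as np

def gabor_filter(n, center, width):
    """Frequency-domain Gaussian bandpass centered at `center` (in bins)."""
    freqs = np.arange(n)
    return np.exp(-0.5 * ((freqs - center) / width) ** 2)

def scattering_paths(x, J=6):
    """First- and second-order scattering coefficients of a 1-D signal.

    Second-order paths only exist when the second scale is coarser than
    the first (j2 > j1), giving J*(J-1)/2 paths on top of the J
    first-order ones.
    """
    n = len(x)
    X = np.fft.fft(x)
    coeffs = {}
    for j1 in range(J):
        f1 = gabor_filter(n, center=n / 2 ** (j1 + 1), width=n / 2 ** (j1 + 2))
        u1 = np.abs(np.fft.ifft(X * f1))          # |x * psi_{j1}|
        coeffs[(j1,)] = u1.mean()                 # low-pass -> S1 coefficient
        U1 = np.fft.fft(u1)
        for j2 in range(j1 + 1, J):               # only coarser second scales
            f2 = gabor_filter(n, center=n / 2 ** (j2 + 1), width=n / 2 ** (j2 + 2))
            u2 = np.abs(np.fft.ifft(U1 * f2))     # ||x * psi_{j1}| * psi_{j2}|
            coeffs[(j1, j2)] = u2.mean()          # S2 coefficient
    return coeffs

rng = np.random.default_rng(0)
S = scattering_paths(rng.standard_normal(1024), J=6)
print(len(S))  # 6 first-order + 15 second-order = 21 paths
```

With J = 6 the toy transform already touches 21 paths; a joint time-frequency version multiplies this further across temporal and frequential rates, which is what makes per-step evaluation during training expensive.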
SCRAPL offers a stochastic optimization scheme that streamlines the evaluation of multivariable scattering transforms, reducing the burden on computing resources. It has been implemented specifically for the joint time-frequency scattering transform (JTFS), which demodulates spectrotemporal patterns across multiple scales and rates.
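The core stochastic idea can be caricatured in a few lines: instead of summing a distance over every scattering path at each training step, draw one path at random and rescale, which leaves the expected loss unchanged. The sketch below is a hedged illustration, not the package's actual API; the cosine "features" stand in for real per-path scattering coefficients.

```python
# Minimal sketch (an assumption-laden stand-in, not the authors' code) of a
# one-random-path-per-step surrogate loss in the spirit of SCRAPL.
import numpy as np

rng = np.random.default_rng(0)
N_PATHS = 21  # e.g. all first- and second-order paths of a small transform

def per_path_distance(x, y, p):
    """Hypothetical stand-in for |S_p(x) - S_p(y)| along scattering path p."""
    w = np.cos(2 * np.pi * (p + 1) * np.arange(len(x)) / len(x))  # fixed feature
    return abs(w @ x - w @ y)

def full_loss(x, y):
    """Exact (expensive) loss: distance summed over every path."""
    return sum(per_path_distance(x, y, p) for p in range(N_PATHS))

def scrapl_step_loss(x, y):
    """One uniformly random path per step; scaling by N_PATHS keeps the
    estimator unbiased: E_p[N_PATHS * d_p(x, y)] == full_loss(x, y)."""
    p = rng.integers(N_PATHS)
    return N_PATHS * per_path_distance(x, y, p)

x, y = rng.standard_normal(256), rng.standard_normal(256)
est = np.mean([scrapl_step_loss(x, y) for _ in range(5000)])
# `est` approaches full_loss(x, y) as the number of steps grows.
```

Each training step thus pays for a single path instead of all of them, while gradient descent still minimizes the full multi-path objective in expectation.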
- Unsupervised Sound Matching: SCRAPL has been applied to differentiable digital signal processing (DDSP) tasks, focusing on unsupervised sound matching between a granular synthesizer and the iconic Roland TR-808 drum machine.
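The paper also mentions an importance sampling heuristic over paths. The sketch below illustrates one plausible form of such a heuristic; the proportional-to-recent-loss sampling rule, the EMA update, and the stand-in distance are all assumptions for illustration, not the authors' exact scheme.

```python
# Hedged sketch of importance sampling over scattering paths: paths that
# recently contributed large distances are sampled more often, and the
# 1/probability reweighting keeps the loss estimator unbiased.
import numpy as np

rng = np.random.default_rng(0)
N_PATHS = 21

def per_path_distance(x, y, p):
    """Hypothetical stand-in for the distance along scattering path p."""
    w = np.cos(2 * np.pi * (p + 1) * np.arange(len(x)) / len(x))
    return abs(w @ x - w @ y)

scores = np.ones(N_PATHS)  # running importance score per path

def importance_sampled_loss(x, y, ema=0.9):
    probs = scores / scores.sum()
    p = rng.choice(N_PATHS, p=probs)              # biased draw toward "hot" paths
    d = per_path_distance(x, y, p)
    scores[p] = ema * scores[p] + (1 - ema) * d   # update the heuristic score
    # Dividing by probs[p] keeps the estimator unbiased for the sum over paths.
    return d / probs[p]

x, y = rng.standard_normal(256), rng.standard_normal(256)
est = np.mean([importance_sampled_loss(x, y) for _ in range(4000)])
```

Concentrating samples on high-distance paths lowers the variance of the per-step loss relative to uniform sampling, which is the usual motivation for importance sampling in stochastic optimization.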
The team has made its code and audio samples publicly available, and SCRAPL is distributed as a Python package, making it accessible to practitioners in the field.
📰 Original Source: https://arxiv.org/abs/2602.11145v1
All rights and credit belong to the original publisher.