
Revisiting Neural Program Smoothing for Fuzzing

Summary

Testing with randomly generated inputs (fuzzing) has gained significant traction due to its capacity to expose program vulnerabilities automatically. Fuzz testing campaigns generate large amounts of data, making them ideal candidates for the application of machine learning (ML). Neural program smoothing (NPS), a specific family of ML-guided fuzzers, aims to use a neural network as a smooth approximation of the program target for new test case generation. In this paper, we conduct the most extensive evaluation to date of NPS fuzzers against standard gray-box fuzzers (>11 CPU years and >5.5 GPU years), and make the following contributions: (1) We find that the original performance claims for NPS fuzzers do not hold; we trace this gap to fundamental, implementation, and experimental limitations of prior works. (2) We contribute the first in-depth analysis of the contribution of machine learning and gradient-based mutations in NPS. (3) We implement Neuzz++, which shows that addressing the practical limitations of NPS fuzzers improves performance, but standard gray-box fuzzers almost always surpass NPS-based fuzzers. (4) As a consequence, we propose new guidelines targeted at benchmarking fuzzing based on machine learning, and present MLFuzz, a platform with GPU access for easy and reproducible evaluation of ML-based fuzzers. Neuzz++, MLFuzz, and all our data are public.
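To illustrate the core idea behind neural program smoothing, the following is a minimal toy sketch (not the paper's Neuzz++ implementation; all names and the toy "program" are hypothetical): a differentiable surrogate model is trained to predict coverage from input bytes, and the gradient of its prediction with respect to the input is used to rank byte positions for mutation.

```python
# Toy sketch of neural program smoothing (NPS): a smooth surrogate
# approximates the target program's coverage behavior, and its input
# gradients guide which bytes to mutate. Hypothetical example, not Neuzz++.
import numpy as np

rng = np.random.default_rng(0)

def program_coverage(x):
    # Toy "program": a single coverage bit fires when byte 3 exceeds 0.5.
    return np.array([1.0 if x[3] > 0.5 else 0.0])

# Training corpus: random inputs and their observed coverage.
X = rng.random((256, 8))
Y = np.array([program_coverage(x) for x in X])

# Surrogate model: logistic regression as the smooth approximation,
# trained by batch gradient descent on the cross-entropy loss.
W = np.zeros((8, 1))
b = np.zeros(1)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ W + b)))  # forward pass
    grad_logits = (p - Y) / len(X)          # cross-entropy gradient
    W -= 1.0 * (X.T @ grad_logits)
    b -= 1.0 * grad_logits.sum(axis=0)

def mutation_positions(x, k=2):
    # Gradient of the predicted coverage w.r.t. the input bytes;
    # the largest-magnitude entries mark the bytes worth mutating.
    p = 1.0 / (1.0 + np.exp(-(x @ W + b)))
    g = p * (1.0 - p) * W[:, 0]
    return np.argsort(-np.abs(g))[:k]

seed = rng.random(8)
print(mutation_positions(seed))  # byte 3 should rank highly
```

Real NPS fuzzers replace the logistic regression with a neural network over full edge-coverage bitmaps and obtain input gradients via backpropagation, but the mutation-ranking principle is the same.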

Conference / Medium

ESEC/FSE 2023

Date published

2023-07-27

Date last modified

2023-09-29 14:24:55