Add workflow to prime chanmon_consistency fuzz corpus #4578
joostjager wants to merge 1 commit into lightningdevkit:main
Conversation
👋 Thanks for assigning @TheBlueMatt as a reviewer!
I've thoroughly reviewed the entire diff, cross-referencing with the existing fuzz infrastructure.
No issues found. The prior review's inline comment about newline handling is correctly addressed in the current code (line 44). No new bugs, security vulnerabilities, or logic errors identified.
Add a manual workflow that primes the persistent chanmon consistency honggfuzz corpus with comma-separated hex seeds. This lets failing or challenging inputs from offline fuzzing be kept in the rolling corpus so future CI continues exercising them. Regular minimization drops duplicates and inputs that no longer add value.
Codecov Report

✅ All modified and coverable lines are covered by tests.

```
@@            Coverage Diff             @@
##             main    #4578      +/-   ##
==========================================
+ Coverage   87.15%   87.17%   +0.02%
==========================================
  Files         161      161
  Lines      109251   109251
  Branches   109251   109251
==========================================
+ Hits        95215    95243      +28
+ Misses      11560    11534      -26
+ Partials     2476     2474       -2
```
Add a manual workflow for priming the persistent chanmon_consistency_target honggfuzz corpus with known interesting inputs.

The goal is to preserve failing or challenging fuzz strings discovered during larger offline runs and feed them back into regular CI, so current and future code keeps being exercised against those cases instead of letting the inputs get lost. The workflow restores the latest rolling main corpus, decodes comma-separated hex seeds from the dispatch input into the chanmon consistency corpus, and saves a fresh main-prefix cache entry for subsequent fuzz runs to restore.
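The decode-and-prime step can be sketched roughly as follows. This is a hedged illustration, not the workflow's actual code: the `SEEDS` variable, the corpus path, and the per-file naming scheme are assumptions.

```shell
#!/bin/sh
# Sketch of decoding comma-separated hex seeds into a honggfuzz corpus dir.
# SEEDS and CORPUS_DIR are hypothetical names; the real workflow may differ.
SEEDS="0001ff,deadbeef"                                    # workflow_dispatch input
CORPUS_DIR="hfuzz_workspace/chanmon_consistency_target/input"

mkdir -p "$CORPUS_DIR"
for seed in $(printf '%s' "$SEEDS" | tr ',' ' '); do
  # Decode each hex seed into a binary corpus file; naming the file after
  # its hex content keeps re-runs idempotent (same seed -> same file).
  printf '%s' "$seed" | xxd -r -p > "$CORPUS_DIR/$seed"
done
```

Keeping each seed as its own file lets honggfuzz treat it as an independent corpus entry, so later minimization can drop it individually if it stops adding coverage.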
The normal fuzz job still owns execution and minimization. Once the primed cache is picked up by a main fuzz run, honggfuzz minimization removes duplicate or non-contributing inputs, so only seeds that add useful coverage remain in the rolling corpus.
Tested on my fork.