⚡ Lightning-Fast Node_modules Cleanup: From 24GB to 3GB in Seconds
If you’re a JavaScript developer, you know the pain. That node_modules
folder that somehow grows to consume half your hard drive. Today I built a tool that transformed my cleanup process from a tedious 84-second scan to lightning-fast operations, and freed up 21GB of disk space in the process.
The Problem: 140 Node_modules Directories, 24GB of Space
Running the original cleanup script on my machine revealed the scale of the problem:
```
Found 140 node_modules directories
Total: 24GB in 140 directories
```
The original script took 84 seconds just to scan and calculate sizes. That’s painful when you’re trying to quickly free up space.
The Solution: Parallel Processing and Smart Optimizations
I rewrote the script with several key improvements:
1. Parallel Processing Support
The script automatically detects and uses GNU parallel for 8x faster processing:
```bash
# Processes directories in parallel batches
if command -v parallel >/dev/null 2>&1; then
  USE_PARALLEL=true
  echo "✓ Using GNU parallel for faster processing"
fi
```
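The detection above only picks the strategy; the actual fan-out could look something like this minimal sketch (the `/tmp` paths are hypothetical stand-ins, not the script's real scan):

```bash
# Minimal sketch: size each node_modules dir with du, using GNU parallel
# when present and a plain sequential loop otherwise.
mkdir -p /tmp/nm-demo/app-a/node_modules /tmp/nm-demo/app-b/node_modules
find /tmp/nm-demo -type d -name node_modules > /tmp/nm-demo/paths.txt

if parallel --version 2>/dev/null | grep -q GNU; then
  # One du job per path; -k keeps output in the same order as the input
  sizes=$(parallel -k 'du -sk {}' < /tmp/nm-demo/paths.txt)
else
  # Sequential fallback: identical output, one directory at a time
  sizes=$(while IFS= read -r d; do du -sk "$d"; done < /tmp/nm-demo/paths.txt)
fi
printf '%s\n' "$sizes"
```

Checking `parallel --version` for the string "GNU" also guards against the unrelated moreutils `parallel`, which takes different arguments.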
2. Smart Filtering Options
- Filter by age (30/60/90+ days)
- Filter by size (only show directories > 1MB)
- Filter by project type (React, Vue, Angular, Node.js)
- Range selection for surgical precision
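The age filter, for instance, maps naturally onto `find -mtime`. A hypothetical sketch (demo paths invented for illustration):

```bash
# Select node_modules directories whose mtime is more than 30 days old.
mkdir -p /tmp/age-demo/stale/node_modules /tmp/age-demo/fresh/node_modules
touch -t 202301010000 /tmp/age-demo/stale/node_modules  # backdate the stale one
old_dirs=$(find /tmp/age-demo -type d -name node_modules -prune -mtime +30)
printf '%s\n' "$old_dirs"
```

`-prune` stops `find` from descending into each matched `node_modules`, which matters when those trees hold hundreds of thousands of files.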
3. Safety Features
- Dry run mode (`-d`) to preview what would be deleted
- Confirmation prompts before destructive operations
- Progress indicators for long operations
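The dry-run guard could look something like this (a minimal sketch with hypothetical names, not the script's actual code):

```bash
# With the dry-run flag set, report what would be deleted instead of
# deleting, and confirm before anything destructive.
DRY_RUN=true  # imagine getopts set this from the -d flag
delete_dir() {
  if [ "$DRY_RUN" = true ]; then
    echo "[dry run] would delete: $1"
  else
    printf 'Delete %s? (y/n): ' "$1"
    read -r answer
    [ "$answer" = y ] && rm -rf "$1"
  fi
}
mkdir -p /tmp/safety-demo/node_modules
msg=$(delete_dir /tmp/safety-demo/node_modules)
printf '%s\n' "$msg"
```

Routing every removal through one function means there is exactly one place where `rm -rf` can fire, which keeps the safety logic auditable.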
4. Rich Statistics
The tool provides detailed breakdowns:
```
By Project Type:
  react      :  52 directories, 15.2GiB
  node       :  48 directories,  6.8GiB
  unknown    :  20 directories,  2.0GiB
By Age:
  90+ days   : 108 directories, 21.0GiB
  31-60 days :  12 directories,  2.1GiB
  0-30 days  :  20 directories,  0.9GiB
```
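The per-type breakdown hinges on classifying each directory. One way to sketch it (a hypothetical helper that greps the sibling `package.json`, not necessarily how the script does it):

```bash
# Classify a node_modules path by a telltale dependency in the project's
# package.json; fall back to "node", then "unknown".
project_type() {
  pkg="$(dirname "$1")/package.json"
  if grep -q '"react"' "$pkg" 2>/dev/null; then echo react
  elif grep -q '"vue"' "$pkg" 2>/dev/null; then echo vue
  elif grep -q '"@angular/core"' "$pkg" 2>/dev/null; then echo angular
  elif [ -f "$pkg" ]; then echo node
  else echo unknown
  fi
}
mkdir -p /tmp/type-demo/node_modules
printf '{ "dependencies": { "react": "^18.0.0" } }\n' > /tmp/type-demo/package.json
kind=$(project_type /tmp/type-demo/node_modules)
printf '%s\n' "$kind"
```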
Real-World Impact
Using the tool to clean up node_modules older than 30 days:
```
Finding node_modules older than 30 days...
Found 108 directories totaling 21GiB
Delete these? (y/n): y
✅ Deleted 108 old node_modules directories
```
Results:
- Before: 140 directories, 24GiB total
- After: 32 directories, 3.3GiB total
- Space freed: ~21GiB (87.5% reduction!)
The cleanup was completed in under 30 seconds, compared to the 84-second scan time of the original script.
Usage Examples
```bash
# Dry run to see what would be deleted
./clean-node-modules-fast.sh -d

# Show all directories (no size filter)
./clean-node-modules-fast.sh -m 0

# Scan a different directory
./clean-node-modules-fast.sh -p ~/projects

# Only show large directories (>100M)
./clean-node-modules-fast.sh -m 100M

# Sort by date instead of size
./clean-node-modules-fast.sh -s date
```
The Technical Details
The performance improvements come from:
- Batch Processing: Instead of running `du` on each directory individually, we batch operations
- Optimized `du` calls: Using `-sk` (kilobytes) is faster than `-sh` (human-readable)
- Sequential fallback: Works even without GNU parallel installed
- Smart caching: Paths are found once, then processed in parallel
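The batching idea can be sketched in a few lines (my reconstruction under the assumptions above, with throwaway demo paths, not the script itself): feed all paths to `du` via `xargs` instead of spawning one `du` per directory, then sum the kilobyte column.

```bash
# Batch all node_modules paths into as few du invocations as possible,
# then total the first (kilobyte) column with awk.
mkdir -p /tmp/batch-demo/a/node_modules /tmp/batch-demo/b/node_modules
total_kb=$(find /tmp/batch-demo -type d -name node_modules -prune -print0 \
  | xargs -0 du -sk \
  | awk '{ sum += $1 } END { print sum }')
printf 'Total: %s KiB across batched du calls\n' "$total_kb"
```

Using `-print0`/`xargs -0` keeps paths with spaces intact, and one `du` process amortizes startup cost across every directory in the batch.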
Installation
- Save the script to your system
- Make it executable: `chmod +x clean-node-modules-fast.sh`
- Optionally install GNU parallel for maximum speed: `brew install parallel`
Pro Tips
After cleanup, the script reminds you:
- Use pnpm instead of npm: It uses a global store, reducing duplication
- Set up a global cache: `pnpm config set store-dir ~/.pnpm-store`
- Regular maintenance: Run monthly to keep disk usage in check
Conclusion
What started as frustration with slow disk cleanup turned into a powerful tool that’s now part of my regular maintenance routine. The combination of parallel processing, smart filtering, and safety features makes it both fast and reliable.
Final Stats:
- Directories scanned: 140
- Directories deleted: 108
- Space reclaimed: 21GiB (87.5% reduction)
- Time saved: From 84 seconds down to under 30 seconds
The script is available as a GitHub Gist - feel free to adapt it for your needs!
Have you built any tools to solve your development pain points? I’d love to hear about them!