I've recently had to manually check my system for two different vulnerable packages, and I'm starting to feel like local development has become riskier than ever. Has anyone else started building and testing new proofs of concept in complete isolation? What measures are you taking to stay safe?
5 Answers
Back in IT school during the 2000s, it was understood that you shouldn't use bleeding-edge technology that hasn't been proven. That mindset seems to have faded: now we deploy new releases without much thought, instead of sticking to stable, battle-tested versions before jumping on new ones.
Honestly, I’m really frightened by how frequent supply chain attacks have become. I’ve started using Docker and virtual machines for testing everything because I don’t want to risk running anything directly on my main machine. It feels like installing a new library is a gamble these days, and I’m always on edge, checking everything!
You’re definitely not alone in this. I’ve moved to fully isolated environments with limited network access for testing, which has helped me feel a bit more secure.
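One minimal sketch of that kind of isolation: running an untrusted proof of concept inside a throwaway Docker container with networking disabled and the project mounted read-only. The image name and script path here are placeholders, not anything from the thread.

```shell
#!/bin/sh
# Sketch: execute an untrusted script in a disposable container.
# --network none  : no outbound network during the run
# --read-only     : container root filesystem is read-only
# --tmpfs /tmp    : writable scratch space only in /tmp
# -v ...:ro       : project mounted read-only, so the PoC can't modify it
docker run --rm \
  --network none \
  --read-only \
  --tmpfs /tmp \
  -v "$PWD":/src:ro \
  node:20-slim \
  node /src/poc.js
```

The container is deleted on exit (`--rm`), so even if the code misbehaves, nothing persists on the host.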
The straightforward solution is to avoid external dependencies unless you've thoroughly audited them. For smaller utility packages, consider vendoring them: copy the audited source into your own repository and import it from there. It's a bit more work, but it saves you from unexpected surprises when a new release ships.
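A minimal sketch of what vendoring looks like in practice; the package name `tiny-util` is hypothetical, and the first two lines only simulate an installed package so the example is self-contained.

```shell
#!/bin/sh
set -eu
# Simulate an installed registry package for the demo (hypothetical name):
mkdir -p node_modules/tiny-util
echo 'module.exports = (s) => s.trim();' > node_modules/tiny-util/index.js

# Vendor it: copy the audited source into the repo.
mkdir -p vendor/tiny-util
cp -R node_modules/tiny-util/. vendor/tiny-util/

# Imports then point at ./vendor/tiny-util instead of the registry copy,
# and vendor/ is committed to version control.
ls vendor/tiny-util
```

Because the vendored copy is committed, a later compromised release on the registry can't silently change what your code runs.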
There are some simple strategies you can implement to enhance your safety. First, consider the `minimumReleaseAge` setting (available in pnpm) to avoid installing very new releases, which are the ones most likely to be compromised before anyone notices. Second, pin exact package versions and commit your lockfile, so a malicious new release can't be pulled in automatically. These basic steps can shield you from major infections in widely-used packages, although niche packages may still pose risks.
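As a sketch, assuming you use pnpm (which supports this setting), the release-age delay goes in `pnpm-workspace.yaml`:

```yaml
# pnpm-workspace.yaml (sketch; assumes pnpm, which supports this setting)
# Refuse to install versions published less than a day ago (value is in
# minutes), giving maintainers and scanners time to catch a compromise.
minimumReleaseAge: 1440
```

Exact pinning is then just a matter of using exact versions (no `^`/`~` ranges) in `package.json` and committing the lockfile.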
