This really captures a tension that's becoming harder to ignore:
AI isn't "creating" bad practices; it's removing the barriers that used to slow them down.
Vibe coding existed long before LLMs, but now someone can ship fast without knowing (or caring) how fragile their implementation is. And because demos look good and investors want speed, it's easy to mistake velocity for progress.
What worries me most isn't the code itself; it's the cultural signal it sends. If early-stage success gets decoupled from technical soundness, then short-term incentives will dominate, and we’ll lose the developers who understand how to build lasting systems.
The best teams will be the ones who don't fight AI, but who also don't let AI drive them straight into a wall.