Cursor claims more than 30,000 Nvidia engineers commit 3x more code and find coding 'a lot more fun than it used to be', but is it too good to be true?
Observers are skeptical about the quality of the code produced with AI
- Cursor reports Nvidia engineers now commit three times more code than before
- Nvidia maintains that defect rates stayed flat despite the reported surge in output
- AI-assisted workflows contributed to DLSS 4 and smaller GPU die sizes
Nvidia has rolled out generative AI tools across a large portion of its engineering workforce, with Cursor integrated into daily development workflows.
The company says more than 30,000 engineers now rely on this setup, with internal claims pointing to three times higher code output than previous processes.
This claim has attracted attention partly because volume-based metrics have long been treated cautiously within software engineering.
Productivity claims versus engineering reality
This deployment is an operational change touching core software, including GPU drivers and infrastructure code that supports gaming, data centers, and AI training systems.
These products are widely considered mission-critical, where errors can have visible and sometimes costly consequences.
Nvidia claims defect rates have remained flat despite the surge in output, suggesting that internal controls and testing requirements remain in place.
Driver code, firmware, and low-level system components typically pass through extensive validation before release, regardless of how quickly they are written.
This approach is not new, as Nvidia has previously relied on AI-assisted workflows, including internal systems used to improve DLSS over multiple hardware generations.
Some of Nvidia’s recent outcomes are cited as examples of AI-supported development delivering tangible results.
DLSS 4 and reductions in GPU die size relative to comparable designs are often referenced as outcomes tied to broader use of internal optimization tools.
These examples suggest that AI assistance, when applied within tightly controlled environments, can contribute to measurable improvements.
At the same time, Nvidia’s software stack has faced criticism in recent years, with users pointing to driver regressions and update-related issues.
Cursor also claims that coding is “a lot more fun than it used to be,” but this sits alongside productivity figures that remain difficult to independently assess.
Lines of code committed over a given period have never been a reliable indicator of software quality, stability, or long-term value.
True software quality is better measured by stability, maintainability, and impact on end-user performance, and output volume alone says little about these.
Nvidia also benefits commercially from promoting AI-driven development, given its central role in supplying the hardware behind these systems.
In that context, skepticism around messaging and metrics is expected, even if the underlying tools deliver real efficiencies in specific, tightly managed scenarios.

Efosa has been writing about technology for over 7 years, initially driven by curiosity but now fueled by a strong passion for the field. He holds both a Master's degree and a PhD in the sciences, which provided him with a solid foundation in analytical thinking.