For Python (or PyPI) this is easier, since their data is available on Google BigQuery [1], so you can just run
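Something along these lines should do it (a sketch using the Node BigQuery client; the bigquery-public-data.pypi.distribution_metadata table and its name/version columns are assumptions based on the public dataset):

    // Sketch: count distinct versions per package in the public PyPI dataset.
    const { BigQuery } = require('@google-cloud/bigquery');

    async function topPackagesByVersionCount() {
      const query = `
        SELECT name, COUNT(DISTINCT version) AS versions
        FROM \`bigquery-public-data.pypi.distribution_metadata\`
        GROUP BY name
        ORDER BY versions DESC
        LIMIT 10`;
      const [rows] = await new BigQuery().query({ query });
      return rows;
    }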
The winner is: https://pypi.org/project/elvisgogo/#history

The package with the most versions still listed on PyPI is spanishconjugator [2], which consistently published ~240 releases per month between 2020 and 2024.
[1] https://console.cloud.google.com/bigquery?p=bigquery-public-...
[2] https://pypi.org/project/spanishconjugator/#history
Incidentally I once ran into a mature package that had lived in the 0.0.x lane forever and treated every release as a patch, racking up a huge version number, and I had to remind the maintainer that users depending with caret ranges won't get those updates automatically. (In semver caret ranges never change the leftmost non-zero digit; in 0.0.x that digit is the patch version, so ^0.0.123 is just a hard pin to 0.0.123). There may occasionally be valid reasons to stay on 0.0.x though (e.g. @types/web).
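A quick check with the semver package (the same range semantics npm applies; the version numbers here are made up for illustration):

    const semver = require('semver');
    semver.satisfies('0.0.124', '^0.0.123'); // false: caret pins 0.0.x exactly
    semver.satisfies('0.1.1', '^0.1.0');     // true: patch updates flow at 0.x
    semver.satisfies('1.2.3', '^1.1.0');     // true: minor + patch flow at >=1.0.0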
Maybe that is intentional? Which package is it?
It's the type definitions for developing chrome extensions. They'd been incrementing in the 0.0.x lane for almost a decade and bumped it to 0.1.0 after I raised the issue, so I doubt it was intentional:
https://www.npmjs.com/package/@types/chrome?activeTab=versio...
This is part of the DefinitelyTyped project. DT tends to get a lot of one-off contributions just for fixing the one error a dev is experiencing. So maybe they all just copied the version incrementing that previous commits had done, and no one in particular ever took the responsibility to say "this is ready now".
One of the 'winners' I randomly googled.
> carrot-scan -> 27708 total versions
> Command-line tool for detecting vulnerabilities in files and directories.
I can't help but feel there is something absurd about this.
Each version is likely a new vulnerability that got submitted; that doesn't seem so weird.
Shouldn't vulnerabilities be "data" in this context? You bump the vulns database but keep the code at the same version if the logic is the same.
Anthony Fu’s epoch versioning scheme (to differentiate breaking change majors from "marketing" majors) could yield easy winners here, at least on the raw version number alone (not the number of sequential versions released):
https://antfu.me/posts/epoch-semver
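If I'm reading the linked post right, the published major becomes EPOCH * 100 + MAJOR, so big raw numbers are guaranteed by construction (tiny sketch; the values are made up):

    // Epoch semver: {EPOCH * 100 + MAJOR}.MINOR.PATCH
    const publishedMajor = (epoch, major) => epoch * 100 + major;
    publishedMajor(5, 2); // 502, i.e. the package ships as 502.x.y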
So 19494 is the largest? That's far lower than I expected. There's nobody out there that has put a date in a version number (e.g., 20250915)?
The "winner" just had its 3000th release on GitHub, already a few patch versions past the version referenced in this article (which was published today): https://github.com/wppconnect-team/wa-version
After double-checking some things, the real winner is actually: https://github.com/nice-registry/all-the-package-names
I made a fairly significant (dumb) mistake in the logic for extracting valid semver versions. I was doing a falsy check, so if any of major/minor/patch in the version was a 0, the whole package was ignored.
The post has been updated to reflect this.
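For illustration, a hypothetical reconstruction of that class of bug (keep() and pkg are stand-ins, not the post's actual code):

    // Buggy: 0 is falsy, so any version with a zero component
    // (e.g. "1.0.3" or "0.2.0") fails the check and the package is dropped.
    if (major && minor && patch) keep(pkg);

    // Fixed: test for presence, not truthiness.
    if (major != null && minor != null && patch != null) keep(pkg);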
This package also seems to just have a misbehaving GitHub Action that is stuck in a loop.
Hmm yeah, I decided that one counts because the new packages have (slightly) different content, although it might be the case that the changes are junk/pointless anyway.
Brief reminder/clarification that these tools circumvent WhatsApp's ToS, and that they are used to:
1- Spam
2- Scam
3- Avoid paying for the WhatsApp API (which is the only form of monetization)
And the reason this thing gets so many updates is probably a cat-and-mouse game: Meta continuously updates their software to defeat these kinds of hacks, and the maintainers keep pace, whether in an automated or manual fashion.
Considering the $18 billion price tag and the current mixing of user data between Meta and WhatsApp, I believe Meta now has more revenue streams in mind than just the API pricing.
Large number of released packages due to renovatebot / dependabot patching + release automation!
If this was an actual measurement of productivity that bot deserves a raise!
> Time to fetch version data for each one of those packages: ~12 hours (yikes)
The author could improve the batching in fetchAllPackageData by not waiting for all 50 (BATCH_SIZE) promises to resolve at once. I just published a package for proper promise batching last week: https://www.npmjs.com/package/promises-batched
What's the benefit of promises like this here?
Just spin up a loop of 50 call chains. When one completes you just do the next on next tick. It's like 3 lines of code. No libraries needed. Then you're always doing 50 at a time. You can still use await.
async function work() { await thing(); process.nextTick(work); } // thing() = one unit of work
for (let i = 0; i < 50; i++) work(); // 50 concurrent chains
then maybe a separate timer to check how many tasks are active I guess.
Promise.all waits for all 50 promises to resolve, so if one of those promises takes 3s while the other 49 take 0.5s, you're wasting 2.5s awaiting each batch.
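For comparison, a minimal no-library pool along those lines (a sketch; the fetch call is a stand-in for whatever fetchAllPackageData does per package):

    // Fixed-concurrency pool: each worker claims the next index as soon as its
    // current task settles, so one slow request never stalls the others.
    async function runPool(urls, limit = 50) {
      const results = new Array(urls.length);
      let next = 0;
      async function worker() {
        while (next < urls.length) {
          const i = next++; // safe: no await between read and increment
          results[i] = await fetch(urls[i]).then((r) => r.json());
        }
      }
      await Promise.all(Array.from({ length: limit }, worker));
      return results;
    }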
The implementation is rather simple, but more than 3 LoC: https://github.com/whilenot-dev/promises-batched/blob/main/s...
I know. My point is you can do better without a library.
Ah this is cool, thanks!
Haha, good luck finding a real project that holds that title. It's always some squatted name, a dependency confusion experiment, or a troll publishing a package with version 99999.99999.99999 just to see what breaks. The "king" of that hill changes all the time. Just another day in the NPM circus.
I wonder if the author could have replicated the CouchDB database locally to make their life easier.
> I was recently working on a project that uses the AWS SDK for JavaScript. When updating the dependencies in said project, I noticed that the version of that dependency was v3.888.0. Eight hundred eighty eight. That’s a big number as far as versions go.
It also isn’t the first AWS SDK. A few of us in… 2012 IIRC… wrote the first one because AWS didn’t think node was worth an SDK.