1. Buy expired NPM maintainer email domains.
2. Re-create maintainer emails
3. Take over packages
4. Submit legitimate security patches that include package.json version bumps pointing to a malicious dependency you pushed
5. Enjoy world domination.
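Steps 1–3 can be probed mechanically: the public npm registry exposes maintainer emails in package metadata. A minimal sketch, assuming Node 18+ for the global `fetch`; the registry URL is real, but checking whether a domain is actually expired (WHOIS/DNS) is left out:

```javascript
// Sketch: list the maintainer email domains for an npm package, so they
// could then be checked against WHOIS/DNS for expiry.

// Pure helper: pull the domain out of an email address.
function emailDomain(email) {
  const at = email.lastIndexOf("@");
  return at === -1 ? null : email.slice(at + 1).toLowerCase();
}

// Fetch package metadata from the public registry and collect the
// distinct maintainer email domains.
async function maintainerDomains(pkg) {
  const res = await fetch(`https://registry.npmjs.org/${pkg}`);
  const meta = await res.json();
  const domains = new Set();
  for (const m of meta.maintainers ?? []) {
    const d = emailDomain(m.email ?? "");
    if (d) domains.add(d);
  }
  return [...domains];
}

// Usage (performs a network request):
// maintainerDomains("foreach").then(console.log);
```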

I just noticed "foreach" on npm is controlled by a single maintainer.

I also noticed they let their personal email domain expire, so I bought it before someone else did.

I now control "foreach" on NPM, and the 36826 projects that depend on it.

@lrvick@mastodon.social foreach sounds like a package that you shouldn't need with Array.prototype.forEach :blobfoxthonking:


@Johann150 yes, that's true. It made it into ECMAScript 5.1.

Now if, for _some_ weird reason, you have a system that requires some _older_ build target, you need a polyfill.

That's what packages like this provided, and they should be well past EOL nowadays. There are better-suited, highly automated polyfills.
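For context, the entirety of what such a package provides is roughly the following. This is a minimal sketch of a `forEach` polyfill, not the actual `foreach` source:

```javascript
// Minimal sketch of what a forEach polyfill package provides.
// Array.prototype.forEach has been native since ES5.1, so this only
// matters for ancient build targets.
function forEach(arr, fn, thisArg) {
  for (let i = 0; i < arr.length; i++) {
    // Skip holes in sparse arrays, like the native method does.
    if (i in arr) fn.call(thisArg, arr[i], i, arr);
  }
}

// The native method makes the dependency redundant:
const seen = [];
forEach(["a", "b"], (x) => seen.push(x));
["a", "b"].forEach((x) => seen.push(x));
// seen is now ["a", "b", "a", "b"]
```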

Anyway, the issue is very real. This has happened before and will happen again.

It's also just the same for most per-language package managers out there, and this is why version pinning is a thing.

So it could happen to PyPI (Python), RubyGems (Ruby), Crates (Rust), … too :-(
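In npm terms, pinning means an exact version in package.json (plus a committed lockfile) instead of a semver range. A sketch with illustrative version numbers: the exact pin will never silently pick up a newer (possibly hijacked) release, while the caret range allows any "compatible" update to install automatically.

```json
{
  "dependencies": {
    "foreach": "2.0.6",
    "some-other-lib": "^1.3.0"
  }
}
```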

@RyunoKi …and browser extensions and game mods. Heck, whatever allows regaining access to an account via mail, basically.

No 2FA on your Google Dev account? Too bad 🙃

@RyunoKi Google boo whatever. Try releasing a Chrome extension without one :P

(Or an Android app).


Why would I want to write for Chrome?
That doesn't help Firefox at all.

@bekopharm @Johann150 @RyunoKi Attacks on PyPI (the one I know best) and the others based on the fact that anybody can upload a new package have already happened and will keep happening.

With npm it happens more often because of the higher visibility, the culture of tiny libraries that multiplies the attack surface by a few orders of magnitude, and other social factors.
@clacke @federico3 @bekopharm @wolf480pl @Sandra @lrvick @technicallypossible @ruffni @Johann150 @RyunoKi Personally I very much prefer to live in a world where there are two layers in the distribution of software (libraries).

One where everybody can upload, and I as a moderately competent software person can go to discover new things, review them (both as code and as maintainership situation), decide whether or not I want to trust them.

And another where I, as a tired person who needs the software|library *now*, can go and get things (and trust automatic updates) knowing that somebody has already given them at least a bit of review; that there are automatic systems in place to keep checking that they keep working; that there are procedures in place to replace the people involved in this review if they disappear; and that if there are security patches, I will get them applied without having to upgrade to a completely new version at a random time. Basically, all the things that I would otherwise have to do personally to safely use code taken directly from the first layer in production.

@valhalla @clacke @federico3 @bekopharm @Sandra @lrvick @technicallypossible @ruffni @Johann150 @RyunoKi also, since with the first layer you have to re-audit with every update, you may as well vendor that dependency (as in, put a copy of a specific version in your repo), so arguably github could be enough as the first layer

@clacke @federico3 @bekopharm @wolf480pl @Sandra @lrvick @technicallypossible @ruffni @Johann150 @RyunoKi Honestly having a central point for all things python which is just little more than a directory feels nicer than the alternative of having to go through github, condemning to oblivion everybody who wants to host elsewhere (including self-host).

Some kind of distributed directory would be even better, but I'm not the person who will write one any time soon :D
@clacke @federico3 @bekopharm @wolf480pl @Sandra @lrvick @technicallypossible @ruffni @Johann150 @RyunoKi Also, my personal choice rather than vendoring would be to package and upload for debian¹: 90+% of the work has already been done, I might as well do the last bit and make my work useful for everybody else.

¹ because that's what I use and what runs on production. substitute with fedora, arch, whatever else may apply.

Hard to do with multiple projects on the same machine.

Nor when using Docker or VMs.

(Anybody want to stop getting notified?)
@clacke @federico3 @bekopharm @wolf480pl @Sandra @lrvick @technicallypossible @ruffni @Johann150

@RyunoKi @valhalla
Each project can be installed with the required OS dependencies. In case you need to test something against an older Debian release you can just use a simple chroot or systemd-nspawn as a container. Less messy and more secure than docker.

That sounds like something I need to research more.

So far I only used chroot for repairing broken installations.

There's a series of articles starting from:
Most of the time you just need an ephemeral run akin to running chroot.
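An ephemeral run with systemd-nspawn looks roughly like this. A sketch, assuming a Debian tree built beforehand with debootstrap; both commands require root:

```shell
# One-time setup: build a minimal Debian tree.
debootstrap stable /srv/debian-stable

# Ephemeral run: a boot-less container on a throwaway snapshot of the
# tree; --ephemeral discards all changes on exit, akin to a disposable
# chroot.
systemd-nspawn --ephemeral -D /srv/debian-stable /bin/bash
```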

@yes @Johann150 @RyunoKi sorry, failed to parse that, but yes, that's a very common thing in npm too due to its popularity.

@bekopharm @Johann150 @RyunoKi cargo and rust made the same mistakes the nodejs ecosystem did early on, and they are about to learn just how dire the consequences are when combined with normalizing vendored c deps, version pinning and being a compiled language. people may actually die from bugs in old rust power builds of software. an ecosystem of lies and false promises is about to start imploding lol

That's a different attack vector.

The above turns a benevolent package into a malicious one with seemingly no change in authorship (same email address).
