#orogene

2025-05-30

so this is happening.

All my #Rust #RustLang repos are moved over. Now I just need to figure out CI, republish to crates.io, and archive the github side (and leave a tombstone note in their readmes).

I have a bunch of other repos I'll either archive or delete as well.

#KDL and #orogene will remain github-side for now because they're a bit more dependent on github services, but I would like to at least move orogene over eventually. KDL might be stuck there, unfortunately, though I might move just kdl-rs.

#bevy #miette

[Image: screenshot of zkat's profile page on codeberg.org showing a number of Rust projects, including miette.]
2024-10-03

I was having nice warm fuzzy feelings because even though #orogene itself is kind of on pause (for now), it’s actually spawned multiple widely-used projects:

  1. #miette was extracted from orogene, because it’s what I was building for rich errors/error codes/etc
  2. #kdl was created with the intention of being the configuration file for orogene, as well as an alternative syntax for package.json with more bells and whistles
  3. We had to write an NPM-compatible #semver package in Rust, because the semantics are actually different from Cargo’s semver implementation: crates.io/crates/node-semver
  4. The orogene resolver is published as a wasm NPM package, and it’s currently used by vscode.dev for doing in-browser, full-fledged IntelliSense (because you need access to your dependencies to get their types, source definitions, etc). You can literally GoToDef, and you don’t need a vm/desktop for it.

I think that’s it so far? And of course I’ve had the opportunity to share a lot of lessons about performance that have been picked up by more recent package managers in other ecosystems!
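On point 3, here’s a quick sketch of the kind of range that makes the difference, using the node-semver crate. The API usage here is from memory, so double-check the crate docs; the gist is that npm ranges allow hyphen ranges, `x` wildcards, and `||` unions, which Cargo’s semver grammar doesn’t:

```rust
// Hedged sketch: parse an npm-style range that Cargo's semver grammar
// doesn't accept (hyphen range + `||` union) and check versions against it.
use node_semver::{Range, Version};

fn main() {
    let range: Range = "1.2.3 - 1.8.0 || ^2.1.0".parse().unwrap();

    let inside: Version = "1.5.0".parse().unwrap();
    let outside: Version = "2.0.0".parse().unwrap();

    assert!(inside.satisfies(&range));   // covered by the hyphen range
    assert!(!outside.satisfies(&range)); // above 1.8.0 but below ^2.1.0
}
```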

2024-09-06

very excited to see a feature I worked on finally see the light of day.

This feature is backed by a wasm version of #orogene!

It's extremely cool to just be able to load arbitrary package code like this, without needing a vm/filesystem. And it's so fast! Please go try it out!

2024-08-14

warm cache + lockfile:

#orogene: 29s
NPM: 59s
bun: 34s

(I did not expect bun to be slower here. huh.)

2024-08-14

#orogene web: 177.5s
orogene desktop: 157.861s
bun: 106.3s
npm: 540s

not bad for something done in someone's spare time, rather than with VC money, and that can run in the browser (!)

This is full cold start. No cache, no lockfile, nothing. Just a package.json and some vibes.

2024-08-14

back to poking at adding type acquisition support to vscode.dev and finally fixed some issues that were making it slow. Tested with a huge package.json, yeah, it took a while. Tested on orogene desktop, took about the same amount of time. ok.

npm? 3x the amount of time (9 minutes!!), even though orogene web can only open two fetch requests at a time and has to serialize stuff between the Rust and JS layers.

lol. lmao. And orogene desktop was only a few seconds faster!

I should jump back on #orogene dev some time. This thing is still pretty solid.

2024-01-30

my 2c on opt-in vs opt-out:

both are bad in the case of things like telemetry.

If you're going to do something that the user might want to make an important decision about (such as their privacy), you should of course have them opt in, but opt-in with default-to-no means that something important might not happen, and enablement of that feature will be too low to make it useful. But of course, if you just enable it and yolo, then you're... well, you're probably violating some laws, tbh, so I'll leave it at that.

I want a special term for opt decisions that doesn't imply a default, something like "active opt-in", where the application requires that you make a decision either way, during setup, if it's going to do things like telemetry.

Because, in all seriousness, telemetry is useful, and when privacy is respected, it genuinely helps make products better, but opt-out is gross af, and opt-in won't get you the data you need.

This is what I decided to do with #orogene, btw. It sends anonymized crash reports, but you're prompted the first time you launch orogene, both options are presented equally, and you can't just skip it (well, you technically can if you're in a non-interactive environment, in which case we default to no telemetry).
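For a rough idea of what "active opt-in" looks like in practice, here's a minimal sketch using only the standard library. It's not orogene's actual implementation; the function name and prompt text are made up:

```rust
use std::io::{self, BufRead, IsTerminal, Write};

/// Hypothetical "active opt-in" prompt: the user must answer either way,
/// and non-interactive environments fall back to "no telemetry".
fn telemetry_consent() -> bool {
    // Non-interactive (CI, pipes, scripts): don't guess, default to off.
    if !io::stdin().is_terminal() {
        return false;
    }
    loop {
        print!("Send anonymized crash reports? [yes/no]: ");
        io::stdout().flush().expect("flush stdout");

        let mut answer = String::new();
        if io::stdin().lock().read_line(&mut answer).unwrap_or(0) == 0 {
            // EOF: treat it like a non-interactive environment.
            return false;
        }
        match answer.trim().to_ascii_lowercase().as_str() {
            "yes" | "y" => return true,
            "no" | "n" => return false,
            // Neither answer is the default; keep asking until we get one.
            _ => eprintln!("Please answer yes or no."),
        }
    }
}

fn main() {
    let enabled = telemetry_consent();
    println!("telemetry enabled: {enabled}");
}
```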

2024-01-16

@janriemer just gotta get #orogene out into the world now and world domination will be complete 🦀🦀🦀

2023-11-30

The video for my #PackagingCon2023 talk is now up on youtube!

I blurted out a bunch of stuff about the kind of work I've done speeding up #NPM and #Orogene, in hopes that it would help some poor soul out there. It's kind of a high-level overview, but it talks about a lot of different things. It's also the first talk I've given in YEARS.

Check it out if you're interested! youtube.com/watch?v=eh3VME3opn

2023-11-23

I’m so backlogged on foss stuff. As soon as I’m able to, I need to:

👉🏻 wrap up spec discussions/executive decisions and release a KDL 2.0 RC

👉🏻 notify the (many!) kdl implementors about the RC for final feedback, and so they have a chance to update their parsers

👉🏻 tag a new version of miette with an MSRV bump and a long tail of contribution patches.

👉🏻 probably before the above, respond to a bunch of pending discussions and make some executive decisions

👉🏻 update kdl-rs to parse KDL 2.0 specifically, and fill out some missing APIs

👉🏻 go through several of my smaller crates (supports-color, is-ci, etc) and make sure there are no unreleased patches. Tag and release as needed.

Ideally, I’ll be able to use this long weekend for all this. And if it all works out, this will kickstart my coding engines a bit and I’ll be able to get back to #Banchan work, which really needs attention again.

As you may have noticed, this also means #orogene is going on the backburner for a bit. But that’s ok. There’s no rush on it, and the thing I was working on is huge and stressful, so I don’t feel like touching it again yet :)

2023-10-16

thoughts from the weekend: I swear to fuck this stunt I'm trying to pull with #orogene better work because trying to get this to happen is proving to be a massive pain while I figure out Swift <-> Rust bridging.

2023-10-10

I have become entirely brainwormed by this idea I got for #orogene and I'm trying really hard to not just take a 2-week vacation and go into a cave about it.

2023-10-10

I guess #orogene is gonna be pivoting a bit 👀

2023-10-10

I keep having ideas for #orogene and then I get sad that there’s no way I have the time or resources to pull them off in any reasonable timeline lol. I guess I’ll just keep tinkering.

2023-10-10

What if you didn’t have to wait for node_modules to populate, but you also don’t need a bunch of plugins for every tool and IDE you use to make it PnP aware? 🤔 #orogene

2023-10-09

#orogene v0.3.34 is out. It has several performance-oriented fixes, but they didn't seem to make much of a difference in the (limited) testing I did. Give it a shot and see if it made a difference for you!

Changelog: github.com/orogene/orogene/rel

$ npx oro -h

2023-10-09

I'm sorry to say. #orogene is not going to be the fastest package manager. I'm sorry.

2023-10-09

I just spent maybe a week deep in the #orogene perf mines and learned a ton and... didn't really speed things up very much. At least I now know where a lot of the current costs are, and I'm not sure I'm willing to pay the price of making things go faster in exchange for loss of safety/consistency. I've probably already given up too much tbh.

I should probably just give up the perf crusade for now and focus on what I think will matter most for Orogene: a new DX that makes it a delight to work with.

2023-10-08

🙈 a whole 7% of the fast-path #orogene flamegraph is taken up by tracing and indicatif (progress bar stuff), and I can't get over that lol
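If anyone's fighting the same thing: the usual way to shave the tracing half of that (just a generic tracing-subscriber sketch, not what orogene actually does; indicatif is a separate story) is to filter events before the formatting work runs:

```rust
// Sketch: install a fmt subscriber whose filter drops most events up front,
// so filtered-out spans/events stay cheap. Needs tracing-subscriber with the
// "env-filter" feature enabled.
use tracing_subscriber::EnvFilter;

fn main() {
    tracing_subscriber::fmt()
        .with_env_filter(
            // Respect RUST_LOG if set, otherwise only keep warnings and errors.
            EnvFilter::try_from_default_env().unwrap_or_else(|_| EnvFilter::new("warn")),
        )
        .init();

    tracing::debug!("this gets filtered out and costs very little");
    tracing::warn!("this one gets through");
}
```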
