
sm-discuss - Re: [SM-Discuss] How to handle the rust ecosystem with offline builds

List: sm-discuss AT lists.ibiblio.org (Public SourceMage Discussion List)

  • From: Ismael Luceno <ismael AT iodev.co.uk>
  • To: Thomas Orgis <thomas-forum AT orgis.org>
  • Cc: sm-discuss AT lists.ibiblio.org
  • Subject: Re: [SM-Discuss] How to handle the rust ecosystem with offline builds
  • Date: Thu, 1 Oct 2020 22:39:47 +0200

On 01/Oct/2020 19:39, Thomas Orgis wrote:
> On Thu, 1 Oct 2020 18:46:31 +0200
> Ismael Luceno <ismael AT iodev.co.uk> wrote:
>
<...>
>
> Looks like sources to me. Others seem to also contain .rs sources.
>
> Cargo gets all dependencies, then builds them in some order … then
> lumps everything into the target binary. Where does your A, B, C
> picture come into play?

If you build projects A and B using the same target directory, cargo
reuses anything it has already built.
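
For instance (rough sketch, the paths and project names are just
placeholders), pointing two builds at a shared target directory:

    # Share one target directory between otherwise unrelated projects.
    export CARGO_TARGET_DIR=/var/cache/cargo-target
    (cd /usr/src/project-a && cargo build --release)
    (cd /usr/src/project-b && cargo build --release)
    # Dependencies common to both are compiled only once, as long as
    # the versions, features and compiler flags match.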

> I guess a .crate can contain sources or binary libs, and you don't know
> from the outside?

AFAIK, always sources.
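
You can check: a .crate from crates.io is just a gzipped tarball of the
sources (the path and crate name below are only an example):

    # List the contents of a cached crate file.
    tar tzf "$CARGO_HOME/registry/cache/"*/libc-0.2.79.crate
    # prints libc-0.2.79/Cargo.toml, libc-0.2.79/src/lib.rs, ...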

> > If you start pinning commits from git repos then it's chaos.
>
> To be clear: Are you intending to take over control of dependencies
> from rust packages? Apparently so …

Only to remove pinned versions, i.e. the .lock files, so that you don't
get different versions of the same library in different "final" binaries.
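
Roughly (just a sketch of the idea):

    # Drop upstream's pins and let cargo re-resolve to the newest
    # semver-compatible versions it can see.
    rm -f Cargo.lock
    cargo generate-lockfile
    # ...or edit Cargo.lock in place if only a few entries need forcing.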

>
> > We need to force versions for sure (replace/edit .lock files), because
> > upstream often can't be trusted to make the right choices, but we need to
> > be able to install multiple versions too.
>
> You trust us to make better choices? Sure, often the list of deps will
> be outdated. But if we're the only ones messing around, our builds will
> differ from everything the people use out there. Nobody tests our setup
> (not even as little as is the case for 'normal' stuff right now).
>
> Anyone got a hint on how other distros handle this? Is rust stuff
> integrated at all? What about go?
>
> I don't see a conflict with the versions … the directories/crate names
> contain version numbers. Do rust packages know compile-time options,
> though? Well, no issue if we have only source crates. It could be fun
> working with a central CARGO_HOME, though. We'd need to use it as
> upstream to pull copies from into the spell's CARGO_HOME, as things are
> built there. See where we went from the craze about out-of-tree builds
> before? ;-)

The problem is we would then need to provide packages and duplicate the
dependencies in the DEPENDS file for every version mentioned in a .lock
file.
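
To illustrate the scale of it, a generated DEPENDS file would end up
looking something like this (spell names follow the rust-{ver}-{pkg}-{ver}
scheme quoted below and are purely hypothetical):

    depends rust-1.46                      &&
    depends rust-1.46-libc-0.2.79          &&
    depends rust-1.46-serde-1.0.116        &&
    depends rust-1.46-serde_derive-1.0.116
    # ...one line per entry in the upstream Cargo.lock.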

> > Not a problem if we let cargo manage it under a special target
> > directory, I guess, one per rust version.
>
> Different versions of rust?! More madness?

Of course, did you expect it to end there? X-D

You need to compile with the "approved" version of rust for each app.

> > rust-{ver}-{pkg}-{ver} spells
> >
> > And we copy them all at each upgrade; the rust spell should also be
> > versioned.
>
> Well, a section could be handy for that … with the addition that the
> section name becomes part of the spell identification. But without
> that, you need to automate the hell out of this. Well, like Cargo
> automates it right now. Should the spells even exist as classic spell
> files? Should sorcery generate them on the fly? Same for CPAN and
> friends actually?

Not an option unless we make internet access mandatory for those packages;
i.e. even if you had the sources you wouldn't be able to build them, unless
we add some sort of cache for that and new semantics for updating it...

<...>
> > This will be produced at each PRE_BUILD from the downloaded sources.
> >
> > Then I guess the spells only need to keep build artifacts around (that's
> > CARGO_TARGET_DIR), and somehow work around cargo's dependency management,
>
> Would be nice to have things built once only that way. This does put us
> roughly back into the comfort level of static libraries (with us having
> to write a good deal of the linker).
>
> In general: With all these existing ecosystem packaging systems, I
> feel it is a huge waste of time if we spend time managing spells
> mirroring the respective package databases. Grimoire releases could
> contain dumps of those databases though, to have reproducible builds
> from mirrored sources even if those database services are gone.
>
> (Heck, how big would a dump of CTAN, CPAN, CRAN, PyPI, npm, … be, each?
> Maybe separate grimoires. Or just storing info on just-cast spells on
> demand.)

It is a lot of duplication and we can do very little about it... other
than automate the heck out of the conversion, no matter how it's
implemented.

The problem isn't the design of the system itself but the fact that it's
meant to operate on its own, always connected to the net.

Yes, cargo provides an offline mode, but it still requires you to
connect to a registry first to fetch everything, so you would need
to duplicate that step to work fully offline.
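
Concretely, the two-step dance cargo expects (the first command is the
one that needs the network):

    # Online: resolve the dependency graph and download every crate
    # into CARGO_HOME.
    cargo fetch
    # Later, possibly without network access:
    cargo build --release --offline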

In the end, you need an alternate implementation of cargo to make all of
it work.

So we need the duplication one way or another in order to provide basic
functionality like dependency resolution without depending on an internet
connection...

There's no way to sugarcoat it.

Maybe a way would be to edit the Cargo.toml files to provide local paths
to the dependencies and in that way avoid the registry, which would also
mean CARGO_HOME could be read-only.
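
For reference, cargo already ships something close to that without
editing Cargo.toml by hand: "cargo vendor" copies the dependency sources
into a local directory and prints a config snippet that redirects the
registry there (just a sketch, untested inside a spell):

    # Copy all dependency sources into ./vendor and redirect the
    # registry to it; CARGO_HOME is then not needed for sources.
    mkdir -p .cargo
    cargo vendor vendor > .cargo/config
    # The generated .cargo/config says roughly:
    #   [source.crates-io]
    #   replace-with = "vendored-sources"
    #   [source.vendored-sources]
    #   directory = "vendor"
    cargo build --release --offline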



