Re: [BuildStream] Proposal: Allowing download-only sources to work on local files
- From: Benjamin Schubert <contact benschubert me>
- To: Tom Mewett <tom mewett codethink co uk>
- Cc: "buildstream-list gnome org" <buildstream-list gnome org>
- Subject: Re: [BuildStream] Proposal: Allowing download-only sources to work on local files
- Date: Wed, 11 Dec 2019 09:45:32 +0000
Hey all!
Sorry for the late reply,
TLDR: counter-proposal at the end, only add '%{project-path}'.
On Monday, 02 December 2019 10:19, Tom Mewett via buildstream-list <buildstream-list gnome org> wrote:
Sometimes, a BuildStream project may need to include proprietary
software. This software may come in a deb/zip/tar archive but may not
be available publicly for direct download. To work with this, one
method is to have elements source the archive from a location in the
project, but not to distribute the archive. Then, instructions are
provided for the user to obtain the software and put it in the right
place.
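To make the workflow concrete, here is a hypothetical element for such a case with today's BuildStream: the archive is referenced via a 'local' source, the user is told out-of-band how to obtain it and where to place it, and the file itself is never committed or distributed (element name, path, and version are illustrative only):

```yaml
# elements/vendor-sdk.bst -- illustrative sketch, not from the proposal
kind: import
sources:
- kind: local
  # The archive must be placed here by hand following the project's
  # instructions; it is deliberately kept out of version control.
  path: files/vendor-sdk-1.2.tar.gz
```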
This is an interesting case, which actually has more problems than that, and I'm not sure
BuildStream has a good story around it currently.
One of the problems is that you can't restrict BuildStream from pushing specific sources/artifacts
to a cache, so as soon as somebody has 'push' rights, they might distribute the
source/artifact without intending to. But let's open a new discussion around this if that's
needed.
Firstly, I'm going to use the term "URL" in quite a general way here.
That's just to keep with existing terminology; it should probably be
changed to avoid confusion.
I disagree with changing the terminology here. Nothing in 'URL' means it should only
target a resource on the web. A local file path _is_ a valid URL. As such, it is also
understood in the same way by all Python tooling that deals with URLs. More on this later.
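To illustrate that point, Python's standard `urllib.parse` module treats a `file://` URL exactly like an `https://` one: same parsed fields, same tooling, just a different scheme (the paths below are made up for the example):

```python
from urllib.parse import urlparse

# A 'file://' URL parses exactly like an 'https://' one: same fields,
# same tooling, just a different scheme.
local = urlparse("file:///home/user/project/files/vendor.tar.gz")
remote = urlparse("https://example.com/files/vendor.tar.gz")

print(local.scheme)   # "file"
print(local.path)     # "/home/user/project/files/vendor.tar.gz"
print(remote.scheme)  # "https"
print(remote.path)    # "/files/vendor.tar.gz"
```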
I propose that the act of "resolving a URL to a file" be generalised and have
plugins only operate on the result of that resolution, not knowing or
caring whether the file was downloaded or is local.
While this would work in theory, I don't think it should be done in practice.
The reason is that we would then have a set of sources (say Zip, Deb, Tar) that
would work with both a local path and a remote one. However, other plugins like 'git'
would then only support remote ones. Why such a discrepancy? And then, if we wanted support
for all kinds of sources, this proposal would require splitting plugins in two:
SourceFetchers and Sources, which would add complexity and make it harder to extend or write
new plugins.
- All source plugins which operate on files are unified as subclasses of
a single base class, say FileBasedSource
- This base class handles the 'url' and 'ref' keys of the source config
- First it checks whether 'url' is a fully-qualified URL or is just a
relative path. If it's the former, it is fetched as necessary and
stored in the source cache
- If the URL is a relative path, specifying a ref is optional. If it is
given and is different from what is calculated, an error is thrown
The 'local' and 'remote' sources could then also be unified into some
kind of 'copy' source, as 'copy' would act as either one depending on
whether a local path or a full URL was given.
If desirable, the 'url' key could be split into mutually-exclusive
'url' and 'path' keys which would decide the behaviour.
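For reference, the two modes that this proposal describes for the _same_ source kind would look roughly like this (keys, URLs, and the ref value are illustrative, not an agreed design):

```yaml
# One hypothetical FileBasedSource, two behaviours -- illustrative sketch
sources:
- kind: tar
  # Fully-qualified URL: fetched and stored in the source cache, ref required
  url: https://example.com/vendor.tar.gz
  ref: 6c9f6f68a131ec6381da82f2bff978083ed7f4f7991d931bfa767b7965ebc94b
- kind: tar
  # Relative path: resolved inside the project, ref optional
  url: files/vendor.tar.gz
```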
This would lead to two radically different ways of handling the _same_ source type, making it
harder to understand what is happening and how to write a source plugin correctly.
I think this is too complex and will lead to confusion for newcomers.
Another advantage would be that local files imported into the project
can be given refs, meaning that they would not need to be present to
compute cache keys of depending elements. (This is not possible with
the current 'local' source.)
That would mean that, depending on whether a ref is set or not, a change would
be picked up or not. I think this will lead to too much confusion when editing
said files.
---
Based on those points, I dislike the current proposal and believe we can do
better at providing an easy experience for those cases.
Here is my alternate proposal, which builds heavily upon Tristan's:
1) We add a '%{project-path}' variable to BuildStream, handled by the core,
which resolves to the path of the directory containing 'project.conf'.
Sources that want to be local _have_ to use the 'file://' protocol.
2) The rest stays as is.
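Mechanically, expanding such a variable into a valid 'file://' URL is trivial; a sketch of what the core would do internally, using Python's pathlib (the function name and the '%{project-path}' placeholder handling are assumptions for illustration, not BuildStream's actual implementation):

```python
from pathlib import PurePosixPath

def resolve_project_url(project_path: str, url: str) -> str:
    """Expand a hypothetical '%{project-path}' variable in a source URL.

    A sketch only -- real variable expansion would live in BuildStream's core.
    """
    if "file://%{project-path}" in url:
        # PurePosixPath.as_uri() turns '/srv/myproject' into
        # 'file:///srv/myproject', giving a well-formed file URL.
        url = url.replace("file://%{project-path}",
                          PurePosixPath(project_path).as_uri())
    return url

print(resolve_project_url("/srv/myproject",
                          "file://%{project-path}/files/vendor.tar.gz"))
# file:///srv/myproject/files/vendor.tar.gz
```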
This means:
1) Local files still work as they do now, nothing needs to change there.
2) When needing to import a tar/zip/deb/etc. from a local directory, the user
can specify a `file://%{project-path}/...` URL, which _is valid_ and _is supported_
by the core currently. This keeps all the semantics of a remote file, and the exact
same handling.
3) We document an example with `file://`.
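Under this scheme, importing a local archive would read like any other tar source, only with a 'file://' URL (the path and ref value below are illustrative):

```yaml
# Illustrative sketch: a tar source pointing inside the project tree
sources:
- kind: tar
  url: file://%{project-path}/files/vendor-sdk-1.2.tar.gz
  ref: 6c9f6f68a131ec6381da82f2bff978083ed7f4f7991d931bfa767b7965ebc94b
```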
I believe this is the simplest implementation we can do, with the least added complexity
to sources, both in terms of handling in the core, and around usage.
Ben