* use filepath to check for known packages
This fixes an issue on Windows. Without this patch, jb would delete
the folder after installation because the path was not recognized as known.
* reduce temp directory length by hashing
Hashing makes it harder to hit the Windows length limit for file
and directory names.
* further reduce temp dir length
* do not build binaries for windows/amd*
* feat: go-like import style
jb now creates a directory structure inside of vendor/ that mirrors how Go
lays out packages (e.g. github.com/grafana/jsonnet-libs). This is reflected in
the final import paths, which become go-like as well.
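Conceptually, the vendor location is derived from the dependency's remote source. A minimal sketch of that mapping (the function name is hypothetical, not jb's API):

```go
package main

import (
	"fmt"
	"path"
)

// vendorPath maps a dependency's source to its go-like location under
// vendor/, e.g. vendor/github.com/grafana/jsonnet-libs/ksonnet-util.
// The same string is what users write in their jsonnet import paths.
func vendorPath(host, owner, repo, subdir string) string {
	return path.Join("vendor", host, owner, repo, subdir)
}

func main() {
	fmt.Println(vendorPath("github.com", "grafana", "jsonnet-libs", "ksonnet-util"))
	// a jsonnet file would then import:
	//   import "github.com/grafana/jsonnet-libs/ksonnet-util/kausal.libsonnet"
}
```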
* refactor(spec/deps): named regexes
* feat: make goImportStyle configurable
Defaults to off, can be enabled in `jsonnetfile.json`
* fix: integration test
* doc: license headers
* fix(deps): remove GO_IMPORT_STYLE
No longer an option: jb now always uses the go-style layout and symlinks the legacy location.
* feat: symlink to legacy location
* feat: allow to disable legacy links
* fix(test): legacyImports in integration tests
* fix(spec): test
* fix: respect legacyName aliases
Previously it was possible to alias a package by changing its `name`.
While names are now absolute (and computed), legacy links should still respect
old aliases to avoid breaking existing code.
* fix(test): integration
* fix(init): keep legacyImports enabled for now
* feat: rewrite imports
Adds a command that automatically rewrites imports from the legacy to the absolute style.
* fix(tool): rewrite confused by prefixing packages
When one package name was a prefix of another, the rewrite broke.
Fixed by using a proper regular expression, and added a test to make sure it
works as expected.
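A prefix-safe rewrite has to require that the matched import ends exactly where the package name ends; a naive substring replace would also hit longer names. A minimal sketch of the idea (the exact pattern used by jb may differ):

```go
package main

import (
	"fmt"
	"regexp"
)

// rewriteImport replaces imports of oldPkg with newPkg, but only when
// the match is immediately followed by a path separator or the closing
// quote. A package that is a mere prefix of another (e.g. "util" vs
// "util-extra") is therefore left alone.
func rewriteImport(src, oldPkg, newPkg string) string {
	re := regexp.MustCompile(`"` + regexp.QuoteMeta(oldPkg) + `(/|")`)
	return re.ReplaceAllString(src, `"`+newPkg+`$1`)
}

func main() {
	src := `import "util/a.libsonnet"; import "util-extra/b.libsonnet"`
	fmt.Println(rewriteImport(src, "util", "github.com/example/util"))
	// only the "util" import is rewritten; "util-extra" is untouched
}
```

`regexp.QuoteMeta` keeps dots and other metacharacters in package names from being interpreted as regex syntax.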
* Update cmd/jb/init.go
* fix: exclude local packages from legacy linking
Local packages still use the old style, which is fine. Legacy linking
used to mangle them; it now simply ignores symlinks that match a local package.
So far, `pkg` and `pkg/jsonnetfile` had overlapping functionality for
choosing and loading jsonnetfiles.
This fully switches to the separate package `pkg/jsonnetfile`, which appears
to have been created for exactly this purpose.
Not relying on an undocumented ETag header from the GitHub archive API is
probably for the best. This is slightly slower due to an extra round-trip,
but still much faster than cloning the repository to resolve the ref.
Signed-off-by: Benoit Gagnon <benoit.gagnon@ubisoft.com>
If the server supports it, fetch the specific revision with --depth 1;
otherwise, fall back to a normal fetch.
This replaces the previous "clone" operation. The bandwidth and time savings
can be significant depending on the history of the repository (number of
commits).
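The fetch strategy boils down to trying two git invocations in order. A sketch of that order using the git CLI (jb's real code may drive git differently; the function name is hypothetical):

```go
package main

import "fmt"

// fetchArgs returns the git invocations to try, in order: first a
// shallow fetch of the exact revision (--depth 1), then a normal
// full fetch as a fallback for servers that reject fetching an
// arbitrary SHA (e.g. those without uploadpack.allowReachableSHA1InWant).
func fetchArgs(remote, ref string) [][]string {
	return [][]string{
		{"git", "fetch", "--depth", "1", remote, ref},
		{"git", "fetch", remote, ref},
	}
}

func main() {
	for _, cmd := range fetchArgs("origin", "4b825dc6") {
		fmt.Println(cmd)
	}
}
```

The caller would run the first command and only fall back to the second when it fails, which is where the bandwidth savings come from.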