Adds a comprehensive test to verify that `nix-prefetch-url` correctly
handles S3 URLs with query parameters (e.g., custom endpoints and regions).
Previously, nix-prefetch-url would fail with "invalid store
path" errors when given S3 URLs with query parameters like
`?endpoint=http://server:9000&region=eu-west-1`, because it incorrectly
extracted the filename from the query parameters instead of the path.
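A minimal sketch of the intended behaviour (illustrative only, not the actual nix-prefetch-url code; `baseNameFromUrl` is a hypothetical helper): the default name has to come from the path component, with the query string stripped first.

```cpp
#include <iostream>
#include <string>

// Hypothetical helper: derive a default file name from the *path* component
// of a URL, so query parameters like `?endpoint=...&region=...` never leak
// into the store path name.
std::string baseNameFromUrl(const std::string & url)
{
    // Drop the query string / fragment before looking for the last path segment.
    std::string path = url.substr(0, url.find_first_of("?#"));
    auto slash = path.find_last_of('/');
    return slash == std::string::npos ? path : path.substr(slash + 1);
}

int main()
{
    // Prints "cow", not "cow?endpoint=http://server:9000&region=eu-west-1".
    std::cout << baseNameFromUrl(
        "s3://my-bucket/cow?endpoint=http://server:9000&region=eu-west-1")
              << "\n";
}
```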
Add `test_public_bucket_operations` to validate that store operations
work correctly on public S3 buckets without requiring credentials.
Tests `nix store info` and `nix copy` operations.
Add cleanup of the client store in the `finally` block of the `setup_s3` decorator.
Uses `nix store delete --ignore-liveness` to properly handle GC roots
and only attempts deletion if the path exists.
Nix attempts to set the stack size to 64 MB during initialization, which is
required for the repl tests to run successfully. Skip the tests on systems
where the hard stack limit is less than this value rather than failing.
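A rough sketch of such a guard (illustrative only, not the actual test-harness change), comparing the hard limit reported by `getrlimit` against the 64 MiB that Nix asks for:

```cpp
#include <sys/resource.h>
#include <cstdio>

int main()
{
    constexpr rlim_t requiredStack = 64 * 1024 * 1024; // the 64 MiB Nix requests

    struct rlimit lim{};
    if (getrlimit(RLIMIT_STACK, &lim) != 0) {
        std::perror("getrlimit");
        return 1;
    }

    // Skip rather than fail when even the hard limit is too small to allow
    // raising the soft limit to 64 MiB.
    if (lim.rlim_max != RLIM_INFINITY && lim.rlim_max < requiredStack) {
        std::puts("SKIP: hard stack limit is below 64 MiB");
        return 77; // conventional "skipped" exit code in automake-style runners
    }

    std::puts("stack limit is sufficient");
    return 0;
}
```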
This is necessary to fix nix-everything-llvm.
The problem here is that nix-cli is taken from the previous
stage that is built with libstdc++, but this derivation builds
plugins with libc++ and the plugin load fails miserably.
Turns out there's a much better API for this that doesn't have the
footguns of the previous method.
isLegalRefName is somewhat of a misnomer, since it's mainly used to
validate user inputs that can be either references, branch names,
pseudorefs, or tags.
This commit replaces the AWS C++ SDK with a lighter curl-based approach
for S3 binary cache operations.
- Removed the dependency on the heavy aws-cpp-sdk-s3 and aws-cpp-sdk-transfer
- Added lightweight aws-crt-cpp for credential resolution only
- Leverages curl's native AWS SigV4 authentication, which requires curl >= 7.75.0 (see the sketch after this list)
- `S3BinaryCacheStore` now delegates to `HttpBinaryCacheStore`
- New function `s3ToHttpsUrl` converts `ParsedS3URL` to `ParsedURL`
- Multipart uploads are no longer supported (they may be reimplemented later)
- The build now requires curl >= 7.75.0 for AWS SigV4 support
Fixes: #13084, #12671, #11748, #12403, #5947
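As a sketch of the curl-side mechanism (the URL, region, and credentials below are placeholders, and this is not the actual `HttpBinaryCacheStore` code), curl's `CURLOPT_AWS_SIGV4` option can sign a plain HTTPS request the way the SDK used to:

```cpp
#include <curl/curl.h>

int main()
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL * curl = curl_easy_init();
    if (!curl) return 1;

    // An s3://example-bucket/nar/xyz.nar.xz?region=eu-west-1 URL would be
    // rewritten to a plain HTTPS URL (roughly what s3ToHttpsUrl is for).
    curl_easy_setopt(curl, CURLOPT_URL,
        "https://example-bucket.s3.eu-west-1.amazonaws.com/nar/xyz.nar.xz");

    // Let curl compute the AWS Signature Version 4 headers (curl >= 7.75.0).
    // Format: "provider1:provider2:region:service".
    curl_easy_setopt(curl, CURLOPT_AWS_SIGV4, "aws:amz:eu-west-1:s3");

    // Credentials resolved elsewhere (e.g. via aws-crt-cpp) are passed as
    // username:password; placeholders here.
    curl_easy_setopt(curl, CURLOPT_USERPWD, "AKIAEXAMPLE:SECRETKEYEXAMPLE");

    CURLcode res = curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return res == CURLE_OK ? 0 : 1;
}
```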
Resolve the derivation before creating a building goal, in a context
where we know which output(s) we want. That way we have a chance to
download just the outputs we want.
Fixes #13247
(cherry picked from commit 39f6fd9b46)
curl is not very strict about validating the URLs passed to it. We
should reflect this in our handling of URLs that we get from the user
in <nix/fetchurl.nix> or builtins.fetchurl. ValidURL was an attempt to
rectify this, but it turned out to be too strict. The only good way to
resolve this is to pass (in some cases) the user-provided string verbatim
to curl. Other usages in libfetchers still benefit from structured
ParsedURL parsing and validation, though. For example:
nix store prefetch-file --name foo 'https://cdn.skypack.dev/big.js@^5.2.2'
error: 'https://cdn.skypack.dev/big.js@^5.2.2' is not a valid URL: leftover
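A sketch of what "verbatim to curl" means in practice (illustrative only, not the libstore code): the raw user string goes straight to `CURLOPT_URL`, and curl itself decides whether it can fetch it.

```cpp
#include <curl/curl.h>
#include <cstdio>

int main()
{
    // Hand the raw user-supplied string to curl unmodified; curl tolerates
    // characters (like '^') that a strict URL parser would reject.
    const char * userUrl = "https://cdn.skypack.dev/big.js@^5.2.2";

    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL * curl = curl_easy_init();
    if (!curl) return 1;

    curl_easy_setopt(curl, CURLOPT_URL, userUrl);
    curl_easy_setopt(curl, CURLOPT_NOBODY, 1L); // HEAD-style probe only

    CURLcode res = curl_easy_perform(curl);
    std::printf("curl result: %s\n", curl_easy_strerror(res));

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}
```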
Instead of specifying environment variables all the time, we can embed
the `__asan_default_options` symbol in all executables / shared objects.
This reduces code duplication.
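The mechanism looks roughly like this (a sketch; the option string is an example, not the one Nix actually embeds):

```cpp
// Defining this symbol in every executable / shared object lets
// AddressSanitizer pick up its options at startup without any
// ASAN_OPTIONS environment variable being set.
extern "C" const char * __asan_default_options()
{
    // Example options only.
    return "abort_on_error=1:detect_leaks=0";
}
```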
This barfed with
error: [json.exception.type_error.302] type must be string, but is array
on `nix build github:malt3/bazel-env#bazel-env` because it has an `exportReferencesGraph` with a value like `["string",...["string"]]`.
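A sketch of how such a nested value could be flattened into plain strings with nlohmann::json, the library producing the quoted error (illustrative only, not necessarily the actual fix):

```cpp
#include <nlohmann/json.hpp>
#include <iostream>
#include <string>
#include <vector>

// Collect every string in a value like ["string", ... ["string"]] instead of
// assuming each element is already a string (the assumption that triggered
// json.exception.type_error.302).
static void collectStrings(const nlohmann::json & v, std::vector<std::string> & out)
{
    if (v.is_string())
        out.push_back(v.get<std::string>());
    else if (v.is_array())
        for (const auto & elem : v)
            collectStrings(elem, out);
    // other types could be rejected here with a proper error message
}

int main()
{
    auto value = nlohmann::json::parse(R"(["a", ["b", ["c"]]])");
    std::vector<std::string> strings;
    collectStrings(value, strings);
    for (const auto & s : strings)
        std::cout << s << "\n"; // prints a, b, c
}
```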
Best I can tell, this was never supposed to be exposed to the user
and has been this way since 2.19.
2.18 did not expose this file to the user:
nix run nix/2.18-maintenance -- eval --expr "import <nix/derivation-internal.nix>"
error: getting status of '/__corepkgs__/derivation-internal.nix': No such file or directory