// cargo/sources/registry/mod.rs

//! A `Source` for registry-based packages.
//!
//! # What's a Registry?
//!
//! [Registries] are central locations where packages can be uploaded to,
//! discovered, and searched for. The purpose of a registry is to have a
//! location that serves as permanent storage for versions of a crate over time.
//!
//! Compared to git sources (see [`GitSource`]), a registry provides many
//! packages as well as many versions simultaneously. Git sources can also
//! have commits deleted through rebases, whereas registries cannot have their
//! versions deleted.
//!
//! In Cargo, [`RegistryData`] is an abstraction over each kind of actual
//! registry, and [`RegistrySource`] connects those implementations to the
//! [`Source`] trait. Two prominent features these abstractions provide are
//!
//! * A way to query the metadata of a package from a registry. The metadata
//!   comes from the index.
//! * A way to download package contents (a.k.a. source files) that are required
//!   when building the package itself.
//!
//! We'll cover each functionality later.
//!
//! [Registries]: https://doc.rust-lang.org/nightly/cargo/reference/registries.html
//! [`GitSource`]: super::GitSource
//!
//! # Different Kinds of Registries
//!
//! Cargo provides multiple kinds of registries. Each of them serves the index
//! and package contents in a slightly different way. Namely,
//!
//! * [`LocalRegistry`] --- Serves the index and package contents entirely on
//!   a local filesystem.
//! * [`RemoteRegistry`] --- Serves the index ahead of time from a Git
//!   repository, and package contents are downloaded as needed.
//! * [`HttpRegistry`] --- Serves both the index and package contents on demand
//!   over an HTTP-based registry API. This is the default starting from 1.70.0.
//!
//! Each registry has its own [`RegistryData`] implementation, and can be
//! created from either [`RegistrySource::local`] or [`RegistrySource::remote`].
//!
//! [`LocalRegistry`]: local::LocalRegistry
//! [`RemoteRegistry`]: remote::RemoteRegistry
//! [`HttpRegistry`]: http_remote::HttpRegistry
//!
//! # The Index of a Registry
//!
//! One of the major difficulties with a registry is that hosting so many
//! packages may quickly run into performance problems when dealing with
//! dependency graphs. It's infeasible for cargo to download the entire contents
//! of the registry just to resolve one package's dependencies, for example. As
//! a result, cargo needs some efficient method of querying what packages are
//! available on a registry, what versions are available, and what the
//! dependencies for each version are.
//!
//! To solve the problem, a registry must provide an index of package metadata.
//! The index of a registry is essentially an easily query-able version of the
//! registry's database for a list of versions of a package as well as a list
//! of dependencies for each version. The exact format of the index is
//! described later.
//!
//! See the [`index`] module for topics about the management, parsing, caching,
//! and versioning for the on-disk index.
//!
//! ## The Format of The Index
//!
//! The index is a store for the list of versions of all known packages, so its
//! format on disk is optimized slightly to ensure that `ls registry` doesn't
//! produce a list of all packages ever known. The index also needs to avoid
//! containing millions of files, which could end up hitting filesystem limits
//! at some point. To this end, a few decisions were made about the format of
//! the registry:
//!
//! 1. Each crate will have one file corresponding to it. Each version for a
//!    crate will just be a line in this file (see [`cargo_util_schemas::index::IndexPackage`] for its
//!    representation).
//! 2. There will be two tiers of directories for crate names, under which
//!    crates corresponding to those tiers will be located.
//!    (See [`cargo_util::registry::make_dep_path`] for the implementation of
//!    this layout hierarchy.)
//!
//! As an example, here is a directory hierarchy of an index:
//!
//! ```notrust
//! .
//! ├── 3
//! │   └── u
//! │       └── url
//! ├── bz
//! │   └── ip
//! │       └── bzip2
//! ├── config.json
//! ├── en
//! │   └── co
//! │       └── encoding
//! └── li
//!     ├── bg
//!     │   └── libgit2
//!     └── nk
//!         └── link-config
//! ```
//!
//! The root of the index contains a `config.json` file with a few entries
//! corresponding to the registry (see [`RegistryConfig`] below).
//!
//! Otherwise, there are three numbered directories (1, 2, 3) for crates with
//! names 1, 2, and 3 characters in length. The 1/2 directories simply have the
//! crate files underneath them, while the 3 directory is sharded by the first
//! letter of the crate name.
//!
//! For all other crates, the top-level directory contains many two-letter
//! directory names, each of which has many sub-folders with two letters. At
//! the end of all these are the actual crate files themselves.
//!
//! The purpose of this layout is to hopefully cut down on `ls` sizes as well
//! as to allow efficient lookup based on the crate name itself.
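//!
//! As a sketch (simplified; the real logic lives in
//! [`cargo_util::registry::make_dep_path`]), the two-tier rule maps a crate
//! name to its index path roughly like this:
//!
//! ```
//! fn index_path(name: &str) -> String {
//!     match name.len() {
//!         1 => format!("1/{name}"),
//!         2 => format!("2/{name}"),
//!         3 => format!("3/{}/{name}", &name[..1]),
//!         _ => format!("{}/{}/{name}", &name[0..2], &name[2..4]),
//!     }
//! }
//! assert_eq!(index_path("url"), "3/u/url");
//! assert_eq!(index_path("bzip2"), "bz/ip/bzip2");
//! ```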
//!
//! See [The Cargo Book: Registry Index][registry-index] for the public
//! interface on the index format.
//!
//! [registry-index]: https://doc.rust-lang.org/nightly/cargo/reference/registry-index.html
//!
//! ## The Index Files
//!
//! Each file in the index is the history of one crate over time. Each line in
//! the file corresponds to one version of a crate, stored in JSON format (see
//! the [`cargo_util_schemas::index::IndexPackage`] structure).
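//!
//! As an abbreviated illustration (most fields omitted; see the index format
//! docs for the full schema), one such line might look like:
//!
//! ```json
//! {"name":"foo","vers":"0.1.0","deps":[],"cksum":"<sha256>","features":{},"yanked":false}
//! ```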
//!
//! As new versions are published, new lines are appended to this file. **The
//! only modifications to this file that should happen over time are yanks of a
//! particular version.**
//!
//! # Downloading Packages
//!
//! The purpose of the index was to provide an efficient method to resolve the
//! dependency graph for a package. After resolution has been performed, we need
//! to download the contents of packages so we can read the full manifest and
//! build the source code.
//!
//! To accomplish this, [`RegistryData::download`] will "make" an HTTP request
//! per-package requested to download tarballs into a local cache. These
//! tarballs will then be unpacked into a destination folder.
//!
//! Note that because versions uploaded to the registry are frozen forever, the
//! HTTP download and unpacking can all be skipped if the version has already
//! been downloaded and unpacked. This caching allows us to only download a
//! package when absolutely necessary.
//!
//! # Filesystem Hierarchy
//!
//! Overall, the `$HOME/.cargo` directory looks like this when talking about
//! the registry (remote registries, specifically):
//!
//! ```notrust
//! # A folder under which all registry metadata is hosted (similar to
//! # $HOME/.cargo/git)
//! $HOME/.cargo/registry/
//!
//!     # For each registry that cargo knows about (keyed by hostname + hash)
//!     # there is a folder which is the checked out version of the index for
//!     # the registry in this location. Note that this is done so cargo can
//!     # support multiple registries simultaneously
//!     index/
//!         registry1-<hash>/
//!         registry2-<hash>/
//!         ...
//!
//!     # This folder is a cache for all downloaded tarballs (`.crate` file)
//!     # from a registry. Once downloaded and verified, a tarball never changes.
//!     cache/
//!         registry1-<hash>/<pkg>-<version>.crate
//!         ...
//!
//!     # Location in which all tarballs are unpacked. Each tarball is known to
//!     # be frozen after downloading, so transitively this folder is also
//!     # frozen once it's unpacked (it's never unpacked again)
//!     # CAVEAT: They are not read-only. See rust-lang/cargo#9455.
//!     src/
//!         registry1-<hash>/<pkg>-<version>/...
//!         ...
//! ```
//!

use std::cell::RefCell;
use std::collections::HashSet;
use std::fs;
use std::fs::{File, OpenOptions};
use std::io;
use std::io::Read;
use std::io::Write;
use std::path::{Path, PathBuf};

use anyhow::Context as _;
use cargo_util::paths;
use flate2::read::GzDecoder;
use futures::FutureExt as _;
use serde::Deserialize;
use serde::Serialize;
use tar::Archive;
use tracing::debug;

use crate::core::dependency::Dependency;
use crate::core::global_cache_tracker;
use crate::core::{Package, PackageId, SourceId};
use crate::sources::PathSource;
use crate::sources::source::MaybePackage;
use crate::sources::source::QueryKind;
use crate::sources::source::Source;
use crate::util::cache_lock::CacheLockMode;
use crate::util::interning::InternedString;
use crate::util::{CargoResult, Filesystem, GlobalContext, LimitErrorReader, restricted_names};
use crate::util::{VersionExt, hex};

/// The `.cargo-ok` file is used to track if the source is already unpacked.
/// See [`RegistrySource::unpack_package`] for more.
///
/// Not to be confused with the `.cargo-ok` file in git sources.
const PACKAGE_SOURCE_LOCK: &str = ".cargo-ok";

pub const CRATES_IO_INDEX: &str = "https://github.com/rust-lang/crates.io-index";
pub const CRATES_IO_HTTP_INDEX: &str = "sparse+https://index.crates.io/";
pub const CRATES_IO_REGISTRY: &str = "crates-io";
pub const CRATES_IO_DOMAIN: &str = "crates.io";

/// The content inside `.cargo-ok`.
/// See [`RegistrySource::unpack_package`] for more.
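///
/// Serialized with `serde_json`, version 1 of the file looks like:
///
/// ```json
/// {"v":1}
/// ```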
#[derive(Deserialize, Serialize)]
#[serde(rename_all = "kebab-case")]
struct LockMetadata {
    /// The version of the `.cargo-ok` file.
    v: u32,
}

/// A [`Source`] implementation for a local or a remote registry.
///
/// This contains common functionality that is shared between each registry
/// kind, with the registry-specific logic implemented as part of the
/// [`RegistryData`] trait referenced via the `ops` field.
///
/// For general concepts of registries, see the [module-level documentation](crate::sources::registry).
pub struct RegistrySource<'gctx> {
    /// A unique name of the source (typically used as the directory name
    /// where its cached content is stored).
    name: InternedString,
    /// The unique identifier of this source.
    source_id: SourceId,
    /// The path where crate files are extracted (`$CARGO_HOME/registry/src/$REG-HASH`).
    src_path: Filesystem,
    /// Local reference to [`GlobalContext`] for convenience.
    gctx: &'gctx GlobalContext,
    /// Abstraction for interfacing to the different registry kinds.
    ops: Box<dyn RegistryData + 'gctx>,
    /// Interface for managing the on-disk index.
    index: index::RegistryIndex<'gctx>,
    /// A set of packages that should be allowed to be used, even if they are
    /// yanked.
    ///
    /// This is populated from the entries in `Cargo.lock` to ensure that
    /// `cargo update somepkg` won't unlock yanked entries in `Cargo.lock`.
    /// Otherwise, the resolver would think that those entries no longer
    /// exist, and it would trigger updates to unrelated packages.
    yanked_whitelist: RefCell<HashSet<PackageId>>,
    /// Yanked versions that have already been selected during queries.
    ///
    /// As of this writing, this is for not emitting the `--precise <yanked>`
    /// warning twice, with the assumption of (`dep.package_name()` + `--precise`
    /// version) being sufficient to uniquely identify the same query result.
    selected_precise_yanked: RefCell<HashSet<(InternedString, semver::Version)>>,
}

/// The [`config.json`] file stored in the index.
///
/// The config file may look like:
///
/// ```json
/// {
///     "dl": "https://example.com/api/{crate}/{version}/download",
///     "api": "https://example.com/api",
///     "auth-required": false             # unstable feature (RFC 3139)
/// }
/// ```
///
/// [`config.json`]: https://doc.rust-lang.org/nightly/cargo/reference/registry-index.html#index-configuration
#[derive(Deserialize, Debug, Clone)]
#[serde(rename_all = "kebab-case")]
pub struct RegistryConfig {
    /// Download endpoint for all crates.
    ///
    /// The string is a template which will generate the download URL for the
    /// tarball of a specific version of a crate. The substrings `{crate}` and
    /// `{version}` will be replaced with the crate's name and version
    /// respectively. The substring `{prefix}` will be replaced with the
    /// crate's prefix directory name, and the substring `{lowerprefix}` will
    /// be replaced with the crate's prefix directory name converted to
    /// lowercase. The substring `{sha256-checksum}` will be replaced with the
    /// crate's sha256 checksum.
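    ///
    /// As an illustrative sketch (hypothetical helper, not the real
    /// implementation), the basic `{crate}`/`{version}` substitution behaves
    /// like:
    ///
    /// ```
    /// fn expand(template: &str, name: &str, vers: &str) -> String {
    ///     template.replace("{crate}", name).replace("{version}", vers)
    /// }
    /// assert_eq!(
    ///     expand("https://example.com/api/{crate}/{version}/download", "foo", "0.1.0"),
    ///     "https://example.com/api/foo/0.1.0/download",
    /// );
    /// ```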
    ///
    /// For backwards compatibility, if the string does not contain any
    /// markers (`{crate}`, `{version}`, `{prefix}`, or `{lowerprefix}`), it
    /// will be extended with `/{crate}/{version}/download` to
    /// support registries like crates.io which were created before the
    /// templating setup was created.
    ///
    /// For more on the template of the download URL, see [Index Configuration](
    /// https://doc.rust-lang.org/nightly/cargo/reference/registry-index.html#index-configuration).
    pub dl: String,

    /// API endpoint for the registry. This is what's actually hit to perform
    /// operations like yanks, owner modifications, publish new crates, etc.
    /// If this is None, the registry does not support API commands.
    pub api: Option<String>,

    /// Whether all operations require authentication. See [RFC 3139].
    ///
    /// [RFC 3139]: https://rust-lang.github.io/rfcs/3139-cargo-alternative-registry-auth.html
    #[serde(default)]
    pub auth_required: bool,
}

/// Result from loading data from a registry.
#[derive(Debug, Clone)]
pub enum LoadResponse {
    /// The cache is valid. The cached data should be used.
    CacheValid,

    /// The cache is out of date. Returned data should be used.
    Data {
        raw_data: Vec<u8>,
        /// Version of this data to determine whether it is out of date.
        index_version: Option<String>,
    },

    /// The requested crate was not found.
    NotFound,
}

/// An abstract interface to handle both a local and remote registry.
///
/// This allows [`RegistrySource`] to abstractly handle each registry kind.
///
/// For general concepts of registries, see the [module-level documentation](crate::sources::registry).
#[async_trait::async_trait(?Send)]
pub trait RegistryData {
    /// Performs initialization for the registry.
    ///
    /// This should be safe to call multiple times; the implementation is
    /// expected to not do any work if it is already prepared.
    fn prepare(&self) -> CargoResult<()>;

    /// Returns the path to the index.
    ///
    /// Note that different registries store the index in different formats
    /// (remote = git, http & local = files).
    fn index_path(&self) -> &Filesystem;

    /// Returns the path of the directory that stores the cache of `.crate` files.
    ///
    /// The directory is currently expected to contain a flat list of all `.crate` files,
    /// named `<package-name>-<version>.crate`.
    fn cache_path(&self) -> &Filesystem;

    /// Loads the JSON for a specific named package from the index.
    ///
    /// * `root` is the root path to the index.
    /// * `path` is the relative path to the package to load (like `ca/rg/cargo`).
    /// * `index_version` is the version of the requested crate data currently
    ///    in cache. This is useful for checking if a local cache is outdated.
    async fn load(
        &self,
        root: &Path,
        path: &Path,
        index_version: Option<&str>,
    ) -> CargoResult<LoadResponse>;

    /// Loads the `config.json` file and returns it.
    ///
    /// Local registries don't have a config, and return `None`.
    async fn config(&self) -> CargoResult<Option<RegistryConfig>>;

    /// Invalidates locally cached data.
    fn invalidate_cache(&self);

    /// If quiet, the source should not display any progress or status messages.
    fn set_quiet(&mut self, quiet: bool);

    /// Is the local cached data up-to-date?
    fn is_updated(&self) -> bool;

    /// Prepare to start downloading a `.crate` file.
    ///
    /// Despite the name, this doesn't actually download anything. If the
    /// `.crate` is already downloaded, then it returns [`MaybeLock::Ready`].
    /// If it hasn't been downloaded, then it returns [`MaybeLock::Download`]
    /// which contains the URL to download. The [`crate::core::package::Downloads`]
    /// system handles the actual download process. After downloading, it
    /// calls [`Self::finish_download`] to save the downloaded file.
    ///
    /// `checksum` is currently only used by local registries to verify the
    /// file contents (because local registries never actually download
    /// anything). Remote registries will validate the checksum in
    /// `finish_download`. For already downloaded `.crate` files, it does not
    /// validate the checksum, assuming the filesystem does not suffer from
    /// corruption or manipulation.
    async fn download(&self, pkg: PackageId, checksum: &str) -> CargoResult<MaybeLock>;

    /// Finish a download by saving a `.crate` file to disk.
    ///
    /// After [`crate::core::package::Downloads`] has finished a download,
    /// it will call this to save the `.crate` file. This is only relevant
    /// for remote registries. This should validate the checksum and save
    /// the given data to the on-disk cache.
    ///
    /// Returns a [`File`] handle to the `.crate` file, positioned at the start.
    async fn finish_download(
        &self,
        pkg: PackageId,
        checksum: &str,
        data: &[u8],
    ) -> CargoResult<File>;

    /// Returns whether or not the `.crate` file is already downloaded.
    fn is_crate_downloaded(&self, _pkg: PackageId) -> bool {
        true
    }

    /// Validates that the global package cache lock is held.
    ///
    /// Given the [`Filesystem`], this will make sure that the package cache
    /// lock is held. If not, it will panic. See
    /// [`GlobalContext::acquire_package_cache_lock`] for acquiring the global lock.
    ///
    /// Returns the [`Path`] to the [`Filesystem`].
    fn assert_index_locked<'a>(&self, path: &'a Filesystem) -> &'a Path;
}

/// The status of [`RegistryData::download`] which indicates if a `.crate`
/// file has already been downloaded, or if not then the URL to download.
pub enum MaybeLock {
    /// The `.crate` file is already downloaded. [`File`] is a handle to the
    /// opened `.crate` file on the filesystem.
    Ready(File),
    /// The `.crate` file is not downloaded, here's the URL to download it from.
    ///
    /// `descriptor` is just a text string to display to the user of what is
    /// being downloaded.
    Download {
        url: String,
        descriptor: String,
        authorization: Option<String>,
    },
}

mod download;
mod http_remote;
pub(crate) mod index;
pub use index::IndexSummary;
mod local;
mod remote;

/// Generates a unique name for a [`SourceId`] so that its index files can be
/// stored under a unique path.
fn short_name(id: SourceId, is_shallow: bool) -> String {
    // CAUTION: This should not change between versions. If you change how
    // this is computed, it will orphan previously cached data, forcing the
    // cache to be rebuilt and potentially wasting significant disk space. If
    // you change it, be cautious of the impact. See `test_cratesio_hash` for
    // a similar discussion.
    let hash = hex::short_hash(&id);
    let ident = id.url().host_str().unwrap_or("").to_string();
    let mut name = format!("{}-{}", ident, hash);
    if is_shallow {
        name.push_str("-shallow");
    }
    name
}

impl<'gctx> RegistrySource<'gctx> {
    /// Creates a [`Source`] of a "remote" registry.
    /// It could be either an HTTP-based [`http_remote::HttpRegistry`] or
    /// a Git-based [`remote::RemoteRegistry`].
    ///
    /// * `yanked_whitelist` --- Packages allowed to be used, even if they are yanked.
    pub fn remote(
        source_id: SourceId,
        yanked_whitelist: &HashSet<PackageId>,
        gctx: &'gctx GlobalContext,
    ) -> CargoResult<RegistrySource<'gctx>> {
        assert!(source_id.is_remote_registry());
        let name = short_name(
            source_id,
            gctx.cli_unstable()
                .git
                .map_or(false, |features| features.shallow_index)
                && !source_id.is_sparse(),
        );
        let ops = if source_id.is_sparse() {
            Box::new(http_remote::HttpRegistry::new(source_id, gctx, &name)?) as Box<_>
        } else {
            Box::new(remote::RemoteRegistry::new(source_id, gctx, &name)) as Box<_>
        };

        Ok(RegistrySource::new(
            source_id,
            gctx,
            &name,
            ops,
            yanked_whitelist,
        ))
    }

    /// Creates a [`Source`] of a local registry, with [`local::LocalRegistry`] under the hood.
    ///
    /// * `path` --- The root path of a local registry on the file system.
    /// * `yanked_whitelist` --- Packages allowed to be used, even if they are yanked.
    pub fn local(
        source_id: SourceId,
        path: &Path,
        yanked_whitelist: &HashSet<PackageId>,
        gctx: &'gctx GlobalContext,
    ) -> RegistrySource<'gctx> {
        let name = short_name(source_id, false);
        let ops = local::LocalRegistry::new(path, gctx, &name);
        RegistrySource::new(source_id, gctx, &name, Box::new(ops), yanked_whitelist)
    }

    /// Creates a source of a registry. This is an inner helper function.
    ///
    /// * `name` --- Name of a path segment which may affect where `.crate`
    ///   tarballs, the registry index and cache are stored. Expected to be unique.
    /// * `ops` --- The underlying [`RegistryData`] type.
    /// * `yanked_whitelist` --- Packages allowed to be used, even if they are yanked.
    fn new(
        source_id: SourceId,
        gctx: &'gctx GlobalContext,
        name: &str,
        ops: Box<dyn RegistryData + 'gctx>,
        yanked_whitelist: &HashSet<PackageId>,
    ) -> RegistrySource<'gctx> {
        // Before starting to work on the registry, make sure that
        // `<cargo_home>/registry` is marked as excluded from indexing and
        // backups. Older versions of Cargo didn't do this, so we do it here
        // regardless of whether `<cargo_home>` exists.
        //
        // This does not use `create_dir_all_excluded_from_backups_atomic` for
        // the same reason: we want to exclude it even if the directory already
        // exists.
        //
        // IO errors in creating and marking it are ignored, e.g. in case we're on a
        // read-only filesystem.
        let registry_base = gctx.registry_base_path();
        let _ = registry_base.create_dir();
        cargo_util::paths::exclude_from_backups_and_indexing(&registry_base.into_path_unlocked());

        RegistrySource {
            name: name.into(),
            src_path: gctx.registry_source_path().join(name),
            gctx,
            source_id,
            index: index::RegistryIndex::new(source_id, ops.index_path(), gctx),
            yanked_whitelist: RefCell::new(yanked_whitelist.clone()),
            ops,
            selected_precise_yanked: RefCell::new(HashSet::new()),
        }
    }

    /// Decode the [configuration](RegistryConfig) stored within the registry.
    ///
    /// This requires that the index has been at least checked out.
    pub async fn config(&self) -> CargoResult<Option<RegistryConfig>> {
        self.ops.config().await
    }

    /// Unpacks a downloaded package into a location where it's ready to be
    /// compiled.
    ///
    /// No action is taken if the source looks like it's already unpacked.
    ///
    /// # History of interruption detection with `.cargo-ok` file
    ///
    /// Cargo has always included a `.cargo-ok` file ([`PACKAGE_SOURCE_LOCK`])
    /// to detect if extraction was interrupted, but it was originally empty.
    ///
    /// In 1.34, Cargo was changed to create the `.cargo-ok` file before it
    /// started extraction to implement fine-grained locking. After it was
    /// finished extracting, it wrote two bytes to indicate it was complete.
    /// It would use the length check to detect if it was possibly interrupted.
    ///
    /// In 1.36, Cargo changed to not use fine-grained locking, and instead used
    /// a global lock. The use of `.cargo-ok` was no longer needed for locking
    /// purposes, but was kept to detect when extraction was interrupted.
    ///
    /// In 1.49, Cargo changed to not create the `.cargo-ok` file before it
    /// started extraction to deal with `.crate` files that inexplicably had
    /// a `.cargo-ok` file in them.
    ///
    /// In 1.64, Cargo changed to detect `.crate` files with `.cargo-ok` files
    /// in them in response to [CVE-2022-36113], which dealt with malicious
    /// `.crate` files making `.cargo-ok` a symlink causing cargo to write "ok"
    /// to any arbitrary file on the filesystem it has permission to.
    ///
    /// In 1.71, `.cargo-ok` changed to contain a JSON `{ v: 1 }` to indicate
    /// the version of it. A failure of parsing will result in a heavy-hammer
    /// approach that unpacks the `.crate` file again. This is in response to a
    /// security issue that the unpacking didn't respect umask on Unix systems.
    ///
    /// This is all a long-winded way of explaining the circumstances that might
    /// cause a directory to contain a `.cargo-ok` file that is empty or
    /// otherwise corrupted. If it was extracted by a version of Rust before
    /// 1.34, everything should be fine. However, an empty file created by
    /// versions 1.36 to 1.49 indicates that the extraction was interrupted and
    /// that we need to start again.
    ///
    /// Another possibility is that the filesystem is simply corrupted, in
    /// which case deleting the directory might be the safe thing to do. That
    /// is probably unlikely, though.
    ///
    /// To be safe, we delete the directory and start over again if an empty
    /// `.cargo-ok` file is found.
    ///
    /// [CVE-2022-36113]: https://blog.rust-lang.org/2022/09/14/cargo-cves.html#arbitrary-file-corruption-cve-2022-36113
    fn unpack_package(&self, pkg: PackageId, tarball: &File) -> CargoResult<PathBuf> {
        let package_dir = format!("{}-{}", pkg.name(), pkg.version());
        let dst = self.src_path.join(&package_dir);
        let path = dst.join(PACKAGE_SOURCE_LOCK);
        let path = self
            .gctx
            .assert_package_cache_locked(CacheLockMode::DownloadExclusive, &path);
        let unpack_dir = path.parent().unwrap();
        match fs::read_to_string(path) {
            Ok(ok) => match serde_json::from_str::<LockMetadata>(&ok) {
                Ok(lock_meta) if lock_meta.v == 1 => {
                    self.gctx
                        .deferred_global_last_use()?
                        .mark_registry_src_used(global_cache_tracker::RegistrySrc {
                            encoded_registry_name: self.name,
                            package_dir: package_dir.into(),
                            size: None,
                        });
                    return Ok(unpack_dir.to_path_buf());
                }
                _ => {
                    if ok == "ok" {
                        tracing::debug!("old `ok` content found, clearing cache");
                    } else {
                        tracing::warn!("unrecognized .cargo-ok content, clearing cache: {ok}");
                    }
                    // See the doc comment of `unpack_package` for why everything is removed.
                    paths::remove_dir_all(dst.as_path_unlocked())?;
                }
            },
            Err(e) if e.kind() == io::ErrorKind::NotFound => {}
            Err(e) => anyhow::bail!("unable to read .cargo-ok file at {path:?}: {e}"),
        }
        dst.create_dir()?;

        let bytes_written = unpack(self.gctx, tarball, unpack_dir, &|_| true)?;
        update_mtime_for_generated_files(unpack_dir);

        // Now that we've finished unpacking, create and write to the lock file to indicate that
        // unpacking was successful.
        let mut ok = OpenOptions::new()
            .create_new(true)
            .read(true)
            .write(true)
            .open(&path)
            .with_context(|| format!("failed to open `{}`", path.display()))?;

        let lock_meta = LockMetadata { v: 1 };
        write!(ok, "{}", serde_json::to_string(&lock_meta).unwrap())?;

        self.gctx
            .deferred_global_last_use()?
            .mark_registry_src_used(global_cache_tracker::RegistrySrc {
                encoded_registry_name: self.name,
                package_dir: package_dir.into(),
                size: Some(bytes_written),
            });

        Ok(unpack_dir.to_path_buf())
    }

    /// Unpacks the `.crate` tarball of the package in a given directory.
    ///
    /// Returns the path to the crate tarball directory,
    /// which is always `<unpack_dir>/<pkg>-<version>`.
    ///
    /// This makes a few assumptions:
    ///
    /// * The associated tarball already exists.
    /// * If this is a local registry,
    ///   the package cache lock must be synchronized externally;
    ///   Cargo does not take care of locking it.
    pub fn unpack_package_in(
        &self,
        pkg: &PackageId,
        unpack_dir: &Path,
        include: &dyn Fn(&Path) -> bool,
    ) -> CargoResult<PathBuf> {
        let path = self.ops.cache_path().join(pkg.tarball_name());
        let path = self.ops.assert_index_locked(&path);
        let dst = unpack_dir.join(format!("{}-{}", pkg.name(), pkg.version()));
        let tarball =
            File::open(path).with_context(|| format!("failed to open {}", path.display()))?;
        unpack(self.gctx, &tarball, &dst, include)?;
        update_mtime_for_generated_files(&dst);
        Ok(dst)
    }

    /// Turns the downloaded `.crate` tarball file into a [`Package`].
    ///
    /// This unconditionally sets the checksum for the returned package, so it
    /// should only be called after an integrity check. That is to say,
    /// you need to call either [`RegistryData::download`] or
    /// [`RegistryData::finish_download`] before calling this method.
    async fn get_pkg(&self, package: PackageId, path: &File) -> CargoResult<Package> {
        let path = self
            .unpack_package(package, path)
            .with_context(|| format!("failed to unpack package `{}`", package))?;
        let src = PathSource::new(&path, self.source_id, self.gctx);
        src.load()?;
        let mut pkg = match src.download(package).await? {
            MaybePackage::Ready(pkg) => pkg,
            MaybePackage::Download { .. } => unreachable!(),
        };

        // After we've loaded the package, configure its summary's `checksum`
        // field with the checksum we know for this `PackageId`.
        let cksum = self
            .index
            .hash(package, &*self.ops)
            .now_or_never()
            .expect("a downloaded dep now pending!?")
            .expect("summary not found");
        pkg.manifest_mut()
            .summary_mut()
            .set_checksum(cksum.to_string());

        Ok(pkg)
    }
}

#[async_trait::async_trait(?Send)]
impl<'gctx> Source for RegistrySource<'gctx> {
    async fn query(
        &self,
        dep: &Dependency,
        kind: QueryKind,
        f: &mut dyn FnMut(IndexSummary),
    ) -> CargoResult<()> {
        let mut req = dep.version_req().clone();

        // Handle `cargo update --precise` here.
        if let Some((_, requested)) = self
            .source_id
            .precise_registry_version(dep.package_name().as_str())
            .filter(|(c, to)| {
                if to.is_prerelease() && self.gctx.cli_unstable().unstable_options {
                    req.matches_prerelease(c)
                } else {
                    req.matches(c)
                }
            })
        {
            req.precise_to(&requested);
        }

        let mut called = false;
        let callback = &mut |s| {
            called = true;
            f(s);
        };

        // If this is a locked dependency, then it came from a lock file and in
        // theory the registry is known to contain this version. If, however, we
        // come back with no summaries, then our registry may need to be
        // updated, so we fall back to performing a lazy update.
        if kind == QueryKind::Exact && req.is_locked() && !self.ops.is_updated() {
            debug!("attempting query without update");
            self.index
                .query_inner(dep.package_name(), &req, &*self.ops, &mut |s| {
                    if matches!(s, IndexSummary::Candidate(_) | IndexSummary::Yanked(_))
                        && dep.matches(s.as_summary())
                    {
                        // We are looking for a package from a lock file so we do not care about yank
                        callback(s)
                    }
                })
                .await?;
            if called {
                return Ok(());
            } else {
                debug!("falling back to an update");
                self.invalidate_cache();
            }
        }

        let mut called = false;
        let callback = &mut |s| {
            called = true;
            f(s);
        };

        let mut precise_yanked_in_use = false;
        self.index
            .query_inner(dep.package_name(), &req, &*self.ops, &mut |s| {
                let matched = match kind {
                    QueryKind::Exact | QueryKind::RejectedVersions => {
                        if req.is_precise() && self.gctx.cli_unstable().unstable_options {
                            dep.matches_prerelease(s.as_summary())
                        } else {
                            dep.matches(s.as_summary())
                        }
                    }
                    QueryKind::AlternativeNames => true,
                    QueryKind::Normalized => true,
                };
                if !matched {
                    return;
                }
                // Next filter out all yanked packages. Some yanked packages may
                // leak through if they're in a whitelist (aka if they were
                // previously in `Cargo.lock`).
                match s {
                    s if kind == QueryKind::RejectedVersions => callback(s),
                    s @ IndexSummary::Candidate(_) => callback(s),
                    s @ IndexSummary::Yanked(_) => {
                        if self.yanked_whitelist.borrow().contains(&s.package_id()) {
                            callback(s);
                        } else if req.is_precise() {
                            precise_yanked_in_use = true;
                            callback(s);
                        }
                    }
                    IndexSummary::Unsupported(summary, v) => {
                        tracing::debug!(
                            "unsupported schema version {} ({} {})",
                            v,
                            summary.name(),
                            summary.version()
                        );
                    }
                    IndexSummary::Invalid(summary) => {
                        tracing::debug!("invalid ({} {})", summary.name(), summary.version());
                    }
                    IndexSummary::Offline(summary) => {
                        tracing::debug!("offline ({} {})", summary.name(), summary.version());
                    }
                }
            })
            .await?;
        if precise_yanked_in_use {
            let name = dep.package_name();
            let version = req
                .precise_version()
                .expect("--precise <yanked-version> in use");
            if self
                .selected_precise_yanked
                .borrow_mut()
                .insert((name, version.clone()))
            {
                let mut shell = self.gctx.shell();
                shell.print_report(
                    &[Level::WARNING
                        .secondary_title(format!(
                            "selected package `{name}@{version}` was yanked by the author"
                        ))
                        .element(
                            Level::HELP.message("if possible, try a compatible non-yanked version"),
                        )],
                    false,
                )?;
            }
        }
        if called {
            return Ok(());
        }
        if kind == QueryKind::AlternativeNames || kind == QueryKind::Normalized {
            // Attempt to handle misspellings by searching for a chain of names
            // related to the original name. The resolver will later
            // reject any candidates that have the wrong name, and with this it'll
            // have enough information to offer "a similar crate exists" suggestions.
            // For now we only try canonicalizing `-` to `_` and vice versa.
            // More advanced fuzzy searching may come in the future.
            for name_permutation in [
                dep.package_name().replace('-', "_"),
                dep.package_name().replace('_', "-"),
            ] {
                let name_permutation = name_permutation.into();
                if name_permutation == dep.package_name() {
                    continue;
                }
                self.index
                    .query_inner(name_permutation, &req, &*self.ops, &mut |s| {
                        if !s.is_yanked() {
                            f(s);
                        } else if kind == QueryKind::AlternativeNames {
                            f(s);
                        }
                    })
                    .await?;
            }
        }
        Ok(())
    }

    fn supports_checksums(&self) -> bool {
        true
    }

    fn requires_precise(&self) -> bool {
        false
    }

    fn source_id(&self) -> SourceId {
        self.source_id
    }

    fn invalidate_cache(&self) {
        self.index.clear_summaries_cache();
        self.ops.invalidate_cache();
    }

    fn set_quiet(&mut self, quiet: bool) {
        self.ops.set_quiet(quiet);
    }

    async fn download(&self, package: PackageId) -> CargoResult<MaybePackage> {
        let hash = self.index.hash(package, &*self.ops).await?;
        match self.ops.download(package, &hash).await? {
            MaybeLock::Ready(file) => self.get_pkg(package, &file).await.map(MaybePackage::Ready),
            MaybeLock::Download {
                url,
                descriptor,
                authorization,
            } => Ok(MaybePackage::Download {
                url,
                descriptor,
                authorization,
            }),
        }
    }

    async fn finish_download(&self, package: PackageId, data: Vec<u8>) -> CargoResult<Package> {
        let hash = self.index.hash(package, &*self.ops).await?;
        let file = self.ops.finish_download(package, &hash, &data).await?;
        self.get_pkg(package, &file).await
    }

    fn fingerprint(&self, pkg: &Package) -> CargoResult<String> {
        Ok(pkg.package_id().version().to_string())
    }

    fn describe(&self) -> String {
        self.source_id.display_index()
    }

    fn add_to_yanked_whitelist(&self, pkgs: &[PackageId]) {
        self.yanked_whitelist.borrow_mut().extend(pkgs);
    }

    async fn is_yanked(&self, pkg: PackageId) -> CargoResult<bool> {
        self.index.is_yanked(pkg, &*self.ops).await
    }
}

impl RegistryConfig {
    /// File name of [`RegistryConfig`].
    const NAME: &'static str = "config.json";
}

/// Gets the maximum unpack size that Cargo permits
/// based on a given `size` of your compressed file.
///
/// Returns the larger of `size * max compression ratio`
/// and a fixed max unpacked size.
///
/// In reality, the compression ratio usually falls in the range of 2:1 to 10:1.
/// We choose 20:1 to cover almost all possible cases.
/// Any ratio higher than this is considered a zip bomb.
///
/// In the future we might want to introduce a configurable size.
///
/// Some real-world data from common compression algorithms:
///
/// * <https://www.zlib.net/zlib_tech.html>
/// * <https://cran.r-project.org/web/packages/brotli/vignettes/brotli-2015-09-22.pdf>
/// * <https://blog.cloudflare.com/results-experimenting-brotli/>
/// * <https://tukaani.org/lzma/benchmarks.html>
fn max_unpack_size(gctx: &GlobalContext, size: u64) -> u64 {
    const SIZE_VAR: &str = "__CARGO_TEST_MAX_UNPACK_SIZE";
    const RATIO_VAR: &str = "__CARGO_TEST_MAX_UNPACK_RATIO";
    const MAX_UNPACK_SIZE: u64 = 512 * 1024 * 1024; // 512 MiB
    const MAX_COMPRESSION_RATIO: usize = 20; // 20:1

    let max_unpack_size = if cfg!(debug_assertions) && gctx.get_env(SIZE_VAR).is_ok() {
        // For integration tests only.
        gctx.get_env(SIZE_VAR)
            .unwrap()
            .parse()
            .expect("a max unpack size in bytes")
    } else {
        MAX_UNPACK_SIZE
    };
    let max_compression_ratio = if cfg!(debug_assertions) && gctx.get_env(RATIO_VAR).is_ok() {
        // For integration tests only.
        gctx.get_env(RATIO_VAR)
            .unwrap()
            .parse()
            .expect("a max compression ratio")
    } else {
        MAX_COMPRESSION_RATIO
    };

    u64::max(max_unpack_size, size * max_compression_ratio as u64)
}

/// Sets the current [`umask`] value for the given tarball. No-op on non-Unix
/// platforms.
///
/// On Windows, tar only looks at user permissions and tries to set the "read
/// only" attribute, so this is a no-op there as well.
///
/// [`umask`]: https://man7.org/linux/man-pages/man2/umask.2.html
#[allow(unused_variables)]
fn set_mask<R: Read>(tar: &mut Archive<R>) {
    #[cfg(unix)]
    tar.set_mask(crate::util::get_umask());
}

/// Unpacks a tarball with zip bomb and overwrite protections.
fn unpack(
    gctx: &GlobalContext,
    tarball: &File,
    unpack_dir: &Path,
    include: &dyn Fn(&Path) -> bool,
) -> CargoResult<u64> {
    let mut tar = {
        let size_limit = max_unpack_size(gctx, tarball.metadata()?.len());
        let gz = GzDecoder::new(tarball);
        let gz = LimitErrorReader::new(gz, size_limit);
        let mut tar = Archive::new(gz);
        set_mask(&mut tar);
        tar
    };
    let mut bytes_written = 0;
    let prefix = unpack_dir.file_name().unwrap();
    let parent = unpack_dir.parent().unwrap();
    for entry in tar.entries()? {
        let mut entry = entry.context("failed to iterate over archive")?;
        let entry_path = entry
            .path()
            .context("failed to read entry path")?
            .into_owned();

        if let Ok(path) = entry_path.strip_prefix(prefix) {
            if !include(path) {
                continue;
            }
        } else {
            // We're going to unpack this tarball into the global source
            // directory, but we want to make sure that it doesn't accidentally
            // (or maliciously) overwrite source code from other crates. Cargo
            // itself should never generate a tarball that hits this error, and
            // crates.io should also block uploads with these sorts of tarballs,
            // but be extra sure by adding a check here as well.
            anyhow::bail!(
                "invalid tarball downloaded, contains \
                     a file at {entry_path:?} which isn't under {prefix:?}",
            )
        }

        // Prevent unpacking the lockfile from the crate itself.
        if entry_path
            .file_name()
            .map_or(false, |p| p == PACKAGE_SOURCE_LOCK)
        {
            continue;
        }
        bytes_written += entry.size();
        let mut result = entry.unpack_in(parent).map_err(anyhow::Error::from);
        if cfg!(windows) && restricted_names::is_windows_reserved_path(&entry_path) {
            result = result.with_context(|| {
                format!(
                    "`{}` appears to contain a reserved Windows path, \
                        it cannot be extracted on Windows",
                    entry_path.display()
                )
            });
        }
        result.with_context(|| format!("failed to unpack entry at `{}`", entry_path.display()))?;
    }

    Ok(bytes_written)
}

/// Workaround for rust-lang/cargo#16237
///
/// Generated files should have the same deterministic mtime as other files.
/// However, since we forgot to set the mtime for those files when uploading,
/// they always have an older mtime (1973-11-29) that prevents zip from
/// packing them (zip requires >1980).
///
/// This workaround updates the mtime after we unpack the tarball at the destination.
fn update_mtime_for_generated_files(pkg_root: &Path) {
    const GENERATED_FILES: &[&str] = &["Cargo.lock", "Cargo.toml", ".cargo_vcs_info.json"];
    // Hardcoded value to be removed once alexcrichton/tar-rs#420 is merged and released.
    // See also rust-lang/cargo#16237
    const DETERMINISTIC_TIMESTAMP: i64 = 1153704088;

    for file in GENERATED_FILES {
        let path = pkg_root.join(file);
        let mtime = filetime::FileTime::from_unix_time(DETERMINISTIC_TIMESTAMP, 0);
        if let Err(e) = filetime::set_file_mtime(&path, mtime) {
            tracing::trace!("failed to set deterministic mtime for {path:?}: {e}");
        }
    }
}