Skip to main content

alloc/
boxed.rs

1//! The `Box<T>` type for heap allocation.
2//!
3//! [`Box<T>`], casually referred to as a 'box', provides the simplest form of
4//! heap allocation in Rust. Boxes provide ownership for this allocation, and
5//! drop their contents when they go out of scope. Boxes also ensure that they
6//! never allocate more than `isize::MAX` bytes.
7//!
8//! # Examples
9//!
10//! Move a value from the stack to the heap by creating a [`Box`]:
11//!
12//! ```
13//! let val: u8 = 5;
14//! let boxed: Box<u8> = Box::new(val);
15//! ```
16//!
17//! Move a value from a [`Box`] back to the stack by [dereferencing]:
18//!
19//! ```
20//! let boxed: Box<u8> = Box::new(5);
21//! let val: u8 = *boxed;
22//! ```
23//!
24//! Creating a recursive data structure:
25//!
26//! ```
27//! # #[allow(dead_code)]
28//! #[derive(Debug)]
29//! enum List<T> {
30//!     Cons(T, Box<List<T>>),
31//!     Nil,
32//! }
33//!
34//! let list: List<i32> = List::Cons(1, Box::new(List::Cons(2, Box::new(List::Nil))));
35//! println!("{list:?}");
36//! ```
37//!
38//! This will print `Cons(1, Cons(2, Nil))`.
39//!
40//! Recursive structures must be boxed, because if the definition of `Cons`
41//! looked like this:
42//!
43//! ```compile_fail,E0072
44//! # enum List<T> {
45//! Cons(T, List<T>),
46//! # }
47//! ```
48//!
49//! It wouldn't work. This is because the size of a `List` depends on how many
50//! elements are in the list, and so we don't know how much memory to allocate
51//! for a `Cons`. By introducing a [`Box<T>`], which has a defined size, we know how
52//! big `Cons` needs to be.
53//!
54//! # Memory layout
55//!
56//! For non-zero-sized values, a [`Box`] will use the [`Global`] allocator for its allocation. It is
57//! valid to convert both ways between a [`Box`] and a raw pointer allocated with the [`Global`]
58//! allocator, given that the [`Layout`] used with the allocator is correct for the type and the raw
59//! pointer points to a valid value of the right type. More precisely, a `value: *mut T` that has
60//! been allocated with the [`Global`] allocator with `Layout::for_value(&*value)` may be converted
61//! into a box using [`Box::<T>::from_raw(value)`]. Conversely, the memory backing a `value: *mut T`
62//! obtained from [`Box::<T>::into_raw`] may be deallocated using the [`Global`] allocator with
63//! [`Layout::for_value(&*value)`].
64//!
65//! For zero-sized values, the `Box` pointer has to be non-null and sufficiently aligned. The
66//! recommended way to build a Box to a ZST if `Box::new` cannot be used is to use
67//! [`ptr::NonNull::dangling`].
68//!
69//! On top of these basic layout requirements, a `Box<T>` must point to a valid value of `T`.
70//!
71//! So long as `T: Sized`, a `Box<T>` is guaranteed to be represented
72//! as a single pointer and is also ABI-compatible with C pointers
73//! (i.e. the C type `T*`). This means that if you have extern "C"
74//! Rust functions that will be called from C, you can define those
75//! Rust functions using `Box<T>` types, and use `T*` as corresponding
76//! type on the C side. As an example, consider this C header which
77//! declares functions that create and destroy some kind of `Foo`
78//! value:
79//!
80//! ```c
81//! /* C header */
82//!
83//! /* Returns ownership to the caller */
84//! struct Foo* foo_new(void);
85//!
86//! /* Takes ownership from the caller; no-op when invoked with null */
87//! void foo_delete(struct Foo*);
88//! ```
89//!
90//! These two functions might be implemented in Rust as follows. Here, the
91//! `struct Foo*` type from C is translated to `Box<Foo>`, which captures
92//! the ownership constraints. Note also that the nullable argument to
93//! `foo_delete` is represented in Rust as `Option<Box<Foo>>`, since `Box<Foo>`
94//! cannot be null.
95//!
96//! ```
97//! #[repr(C)]
98//! pub struct Foo;
99//!
100//! #[unsafe(no_mangle)]
101//! pub extern "C" fn foo_new() -> Box<Foo> {
102//!     Box::new(Foo)
103//! }
104//!
105//! #[unsafe(no_mangle)]
106//! pub extern "C" fn foo_delete(_: Option<Box<Foo>>) {}
107//! ```
108//!
109//! Even though `Box<T>` has the same representation and C ABI as a C pointer,
110//! this does not mean that you can convert an arbitrary `T*` into a `Box<T>`
111//! and expect things to work. `Box<T>` values will always be fully aligned,
112//! non-null pointers. Moreover, the destructor for `Box<T>` will attempt to
113//! free the value with the global allocator. In general, the best practice
114//! is to only use `Box<T>` for pointers that originated from the global
115//! allocator.
116//!
117//! **Important.** At least at present, you should avoid using
118//! `Box<T>` types for functions that are defined in C but invoked
119//! from Rust. In those cases, you should directly mirror the C types
120//! as closely as possible. Using types like `Box<T>` where the C
121//! definition is just using `T*` can lead to undefined behavior, as
122//! described in [rust-lang/unsafe-code-guidelines#198][ucg#198].
123//!
124//! # Considerations for unsafe code
125//!
126//! **Warning: This section is not normative and is subject to change, possibly
127//! being relaxed in the future! It is a simplified summary of the rules
128//! currently implemented in the compiler.**
129//!
130//! The aliasing rules for `Box<T>` are the same as for `&mut T`. `Box<T>`
131//! asserts uniqueness over its content. Using raw pointers derived from a box
132//! after that box has been mutated through, moved or borrowed as `&mut T`
133//! is not allowed. For more guidance on working with box from unsafe code, see
134//! [rust-lang/unsafe-code-guidelines#326][ucg#326].
135//!
136//! # Editions
137//!
138//! A special case exists for the implementation of `IntoIterator` for arrays on the Rust 2021
139//! edition, as documented [here][array]. Unfortunately, it was later found that a similar
140//! workaround should be added for boxed slices, and this was applied in the 2024 edition.
141//!
142//! Specifically, `IntoIterator` is implemented for `Box<[T]>` on all editions, but specific calls
143//! to `into_iter()` for boxed slices will defer to the slice implementation on editions before
144//! 2024:
145//!
146//! ```rust,edition2021
147//! // Rust 2015, 2018, and 2021:
148//!
149//! # #![allow(boxed_slice_into_iter)] // override our `deny(warnings)`
150//! let boxed_slice: Box<[i32]> = vec![0; 3].into_boxed_slice();
151//!
152//! // This creates a slice iterator, producing references to each value.
153//! for item in boxed_slice.into_iter().enumerate() {
154//!     let (i, x): (usize, &i32) = item;
155//!     println!("boxed_slice[{i}] = {x}");
156//! }
157//!
158//! // The `boxed_slice_into_iter` lint suggests this change for future compatibility:
159//! for item in boxed_slice.iter().enumerate() {
160//!     let (i, x): (usize, &i32) = item;
161//!     println!("boxed_slice[{i}] = {x}");
162//! }
163//!
164//! // You can explicitly iterate a boxed slice by value using `IntoIterator::into_iter`
165//! for item in IntoIterator::into_iter(boxed_slice).enumerate() {
166//!     let (i, x): (usize, i32) = item;
167//!     println!("boxed_slice[{i}] = {x}");
168//! }
169//! ```
170//!
171//! Similar to the array implementation, this may be modified in the future to remove this override,
172//! and it's best to avoid relying on this edition-dependent behavior if you wish to preserve
173//! compatibility with future versions of the compiler.
174//!
175//! [ucg#198]: https://github.com/rust-lang/unsafe-code-guidelines/issues/198
176//! [ucg#326]: https://github.com/rust-lang/unsafe-code-guidelines/issues/326
177//! [dereferencing]: core::ops::Deref
178//! [`Box::<T>::from_raw(value)`]: Box::from_raw
179//! [`Global`]: crate::alloc::Global
180//! [`Layout`]: crate::alloc::Layout
181//! [`Layout::for_value(&*value)`]: crate::alloc::Layout::for_value
182//! [valid]: ptr#safety
183
184#![stable(feature = "rust1", since = "1.0.0")]
185
186use core::borrow::{Borrow, BorrowMut};
187use core::clone::CloneToUninit;
188use core::cmp::Ordering;
189use core::error::{self, Error};
190use core::fmt;
191use core::future::Future;
192use core::hash::{Hash, Hasher};
193use core::marker::{Tuple, Unsize};
194#[cfg(not(no_global_oom_handling))]
195use core::mem::MaybeUninit;
196use core::mem::{self, SizedTypeProperties};
197use core::ops::{
198    AsyncFn, AsyncFnMut, AsyncFnOnce, CoerceUnsized, Coroutine, CoroutineState, Deref, DerefMut,
199    DerefPure, DispatchFromDyn, LegacyReceiver,
200};
201#[cfg(not(no_global_oom_handling))]
202use core::ops::{Residual, Try};
203use core::pin::{Pin, PinCoerceUnsized};
204use core::ptr::{self, NonNull, Unique};
205use core::task::{Context, Poll};
206
207#[cfg(not(no_global_oom_handling))]
208use crate::alloc::handle_alloc_error;
209use crate::alloc::{AllocError, Allocator, Global, Layout};
210use crate::raw_vec::RawVec;
211#[cfg(not(no_global_oom_handling))]
212use crate::str::from_boxed_utf8_unchecked;
213
214/// Conversion related impls for `Box<_>` (`From`, `downcast`, etc)
215mod convert;
216/// Iterator related impls for `Box<_>`.
217mod iter;
218/// [`ThinBox`] implementation.
219mod thin;
220
221#[unstable(feature = "thin_box", issue = "92791")]
222pub use thin::ThinBox;
223
224/// A pointer type that uniquely owns a heap allocation of type `T`.
225///
226/// See the [module-level documentation](../../std/boxed/index.html) for more.
227#[lang = "owned_box"]
228#[fundamental]
229#[stable(feature = "rust1", since = "1.0.0")]
230#[rustc_insignificant_dtor]
231#[doc(search_unbox)]
232// The declaration of the `Box` struct must be kept in sync with the
233// compiler or ICEs will happen.
234pub struct Box<
235    T: ?Sized,
236    #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
237>(Unique<T>, A);
238
239/// Monomorphic function for allocating an uninit `Box`.
240#[inline]
241// The is a separate function to avoid doing it in every generic version, but it
242// looks small to the mir inliner (particularly in panic=abort) so leave it to
243// the backend to decide whether pulling it in everywhere is worth doing.
244#[rustc_no_mir_inline]
245#[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
246#[cfg(not(no_global_oom_handling))]
247fn box_new_uninit(layout: Layout) -> *mut u8 {
248    match Global.allocate(layout) {
249        Ok(ptr) => ptr.as_mut_ptr(),
250        Err(_) => handle_alloc_error(layout),
251    }
252}
253
254/// Helper for `vec!`.
255///
256/// This is unsafe, but has to be marked as safe or else we couldn't use it in `vec!`.
257#[doc(hidden)]
258#[unstable(feature = "liballoc_internals", issue = "none")]
259#[inline(always)]
260#[cfg(not(no_global_oom_handling))]
261#[rustc_diagnostic_item = "box_assume_init_into_vec_unsafe"]
262pub fn box_assume_init_into_vec_unsafe<T, const N: usize>(
263    b: Box<MaybeUninit<[T; N]>>,
264) -> crate::vec::Vec<T> {
265    unsafe { (b.assume_init() as Box<[T]>).into_vec() }
266}
267
268impl<T> Box<T> {
269    /// Allocates memory on the heap and then places `x` into it.
270    ///
271    /// This doesn't actually allocate if `T` is zero-sized.
272    ///
273    /// # Examples
274    ///
275    /// ```
276    /// let five = Box::new(5);
277    /// ```
278    #[cfg(not(no_global_oom_handling))]
279    #[inline(always)]
280    #[stable(feature = "rust1", since = "1.0.0")]
281    #[must_use]
282    #[rustc_diagnostic_item = "box_new"]
283    #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
284    pub fn new(x: T) -> Self {
285        // This is `Box::new_uninit` but inlined to avoid build time regressions.
286        let ptr = box_new_uninit(<T as SizedTypeProperties>::LAYOUT) as *mut T;
287        // Nothing below can panic so we do not have to worry about deallocating `ptr`.
288        // SAFETY: we just allocated the box to store `x`.
289        unsafe { core::intrinsics::write_via_move(ptr, x) };
290        // SAFETY: we just initialized `b`.
291        unsafe { mem::transmute(ptr) }
292    }
293
294    /// Constructs a new box with uninitialized contents.
295    ///
296    /// # Examples
297    ///
298    /// ```
299    /// let mut five = Box::<u32>::new_uninit();
300    /// // Deferred initialization:
301    /// five.write(5);
302    /// let five = unsafe { five.assume_init() };
303    ///
304    /// assert_eq!(*five, 5)
305    /// ```
306    #[cfg(not(no_global_oom_handling))]
307    #[stable(feature = "new_uninit", since = "1.82.0")]
308    #[must_use]
309    #[inline(always)]
310    #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
311    pub fn new_uninit() -> Box<mem::MaybeUninit<T>> {
312        // This is the same as `Self::new_uninit_in(Global)`, but manually inlined (just like
313        // `Box::new`).
314
315        // SAFETY:
316        // - If `allocate` succeeds, the returned pointer exactly matches what `Box` needs.
317        unsafe { mem::transmute(box_new_uninit(<T as SizedTypeProperties>::LAYOUT)) }
318    }
319
320    /// Constructs a new `Box` with uninitialized contents, with the memory
321    /// being filled with `0` bytes.
322    ///
323    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
324    /// of this method.
325    ///
326    /// # Examples
327    ///
328    /// ```
329    /// let zero = Box::<u32>::new_zeroed();
330    /// let zero = unsafe { zero.assume_init() };
331    ///
332    /// assert_eq!(*zero, 0)
333    /// ```
334    ///
335    /// [zeroed]: mem::MaybeUninit::zeroed
336    #[cfg(not(no_global_oom_handling))]
337    #[inline]
338    #[stable(feature = "new_zeroed_alloc", since = "1.92.0")]
339    #[must_use]
340    pub fn new_zeroed() -> Box<mem::MaybeUninit<T>> {
341        Self::new_zeroed_in(Global)
342    }
343
344    /// Constructs a new `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
345    /// `x` will be pinned in memory and unable to be moved.
346    ///
347    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin(x)`
348    /// does the same as <code>[Box::into_pin]\([Box::new]\(x))</code>. Consider using
349    /// [`into_pin`](Box::into_pin) if you already have a `Box<T>`, or if you want to
350    /// construct a (pinned) `Box` in a different way than with [`Box::new`].
351    #[cfg(not(no_global_oom_handling))]
352    #[stable(feature = "pin", since = "1.33.0")]
353    #[must_use]
354    #[inline(always)]
355    pub fn pin(x: T) -> Pin<Box<T>> {
356        Box::new(x).into()
357    }
358
359    /// Allocates memory on the heap then places `x` into it,
360    /// returning an error if the allocation fails
361    ///
362    /// This doesn't actually allocate if `T` is zero-sized.
363    ///
364    /// # Examples
365    ///
366    /// ```
367    /// #![feature(allocator_api)]
368    ///
369    /// let five = Box::try_new(5)?;
370    /// # Ok::<(), std::alloc::AllocError>(())
371    /// ```
372    #[unstable(feature = "allocator_api", issue = "32838")]
373    #[inline]
374    pub fn try_new(x: T) -> Result<Self, AllocError> {
375        Self::try_new_in(x, Global)
376    }
377
378    /// Constructs a new box with uninitialized contents on the heap,
379    /// returning an error if the allocation fails
380    ///
381    /// # Examples
382    ///
383    /// ```
384    /// #![feature(allocator_api)]
385    ///
386    /// let mut five = Box::<u32>::try_new_uninit()?;
387    /// // Deferred initialization:
388    /// five.write(5);
389    /// let five = unsafe { five.assume_init() };
390    ///
391    /// assert_eq!(*five, 5);
392    /// # Ok::<(), std::alloc::AllocError>(())
393    /// ```
394    #[unstable(feature = "allocator_api", issue = "32838")]
395    #[inline]
396    pub fn try_new_uninit() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
397        Box::try_new_uninit_in(Global)
398    }
399
400    /// Constructs a new `Box` with uninitialized contents, with the memory
401    /// being filled with `0` bytes on the heap
402    ///
403    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
404    /// of this method.
405    ///
406    /// # Examples
407    ///
408    /// ```
409    /// #![feature(allocator_api)]
410    ///
411    /// let zero = Box::<u32>::try_new_zeroed()?;
412    /// let zero = unsafe { zero.assume_init() };
413    ///
414    /// assert_eq!(*zero, 0);
415    /// # Ok::<(), std::alloc::AllocError>(())
416    /// ```
417    ///
418    /// [zeroed]: mem::MaybeUninit::zeroed
419    #[unstable(feature = "allocator_api", issue = "32838")]
420    #[inline]
421    pub fn try_new_zeroed() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
422        Box::try_new_zeroed_in(Global)
423    }
424
425    /// Maps the value in a box, reusing the allocation if possible.
426    ///
427    /// `f` is called on the value in the box, and the result is returned, also boxed.
428    ///
429    /// Note: this is an associated function, which means that you have
430    /// to call it as `Box::map(b, f)` instead of `b.map(f)`. This
431    /// is so that there is no conflict with a method on the inner type.
432    ///
433    /// # Examples
434    ///
435    /// ```
436    /// #![feature(smart_pointer_try_map)]
437    ///
438    /// let b = Box::new(7);
439    /// let new = Box::map(b, |i| i + 7);
440    /// assert_eq!(*new, 14);
441    /// ```
442    #[cfg(not(no_global_oom_handling))]
443    #[unstable(feature = "smart_pointer_try_map", issue = "144419")]
444    pub fn map<U>(this: Self, f: impl FnOnce(T) -> U) -> Box<U> {
445        if size_of::<T>() == size_of::<U>() && align_of::<T>() == align_of::<U>() {
446            let (value, allocation) = Box::take(this);
447            Box::write(
448                unsafe { mem::transmute::<Box<MaybeUninit<T>>, Box<MaybeUninit<U>>>(allocation) },
449                f(value),
450            )
451        } else {
452            Box::new(f(*this))
453        }
454    }
455
456    /// Attempts to map the value in a box, reusing the allocation if possible.
457    ///
458    /// `f` is called on the value in the box, and if the operation succeeds, the result is
459    /// returned, also boxed.
460    ///
461    /// Note: this is an associated function, which means that you have
462    /// to call it as `Box::try_map(b, f)` instead of `b.try_map(f)`. This
463    /// is so that there is no conflict with a method on the inner type.
464    ///
465    /// # Examples
466    ///
467    /// ```
468    /// #![feature(smart_pointer_try_map)]
469    ///
470    /// let b = Box::new(7);
471    /// let new = Box::try_map(b, u32::try_from).unwrap();
472    /// assert_eq!(*new, 7);
473    /// ```
474    #[cfg(not(no_global_oom_handling))]
475    #[unstable(feature = "smart_pointer_try_map", issue = "144419")]
476    pub fn try_map<R>(
477        this: Self,
478        f: impl FnOnce(T) -> R,
479    ) -> <R::Residual as Residual<Box<R::Output>>>::TryType
480    where
481        R: Try,
482        R::Residual: Residual<Box<R::Output>>,
483    {
484        if size_of::<T>() == size_of::<R::Output>() && align_of::<T>() == align_of::<R::Output>() {
485            let (value, allocation) = Box::take(this);
486            try {
487                Box::write(
488                    unsafe {
489                        mem::transmute::<Box<MaybeUninit<T>>, Box<MaybeUninit<R::Output>>>(
490                            allocation,
491                        )
492                    },
493                    f(value)?,
494                )
495            }
496        } else {
497            try { Box::new(f(*this)?) }
498        }
499    }
500}
501
502impl<T, A: Allocator> Box<T, A> {
503    /// Allocates memory in the given allocator then places `x` into it.
504    ///
505    /// This doesn't actually allocate if `T` is zero-sized.
506    ///
507    /// # Examples
508    ///
509    /// ```
510    /// #![feature(allocator_api)]
511    ///
512    /// use std::alloc::System;
513    ///
514    /// let five = Box::new_in(5, System);
515    /// ```
516    #[cfg(not(no_global_oom_handling))]
517    #[unstable(feature = "allocator_api", issue = "32838")]
518    #[must_use]
519    #[inline]
520    pub fn new_in(x: T, alloc: A) -> Self
521    where
522        A: Allocator,
523    {
524        let mut boxed = Self::new_uninit_in(alloc);
525        boxed.write(x);
526        unsafe { boxed.assume_init() }
527    }
528
529    /// Allocates memory in the given allocator then places `x` into it,
530    /// returning an error if the allocation fails
531    ///
532    /// This doesn't actually allocate if `T` is zero-sized.
533    ///
534    /// # Examples
535    ///
536    /// ```
537    /// #![feature(allocator_api)]
538    ///
539    /// use std::alloc::System;
540    ///
541    /// let five = Box::try_new_in(5, System)?;
542    /// # Ok::<(), std::alloc::AllocError>(())
543    /// ```
544    #[unstable(feature = "allocator_api", issue = "32838")]
545    #[inline]
546    pub fn try_new_in(x: T, alloc: A) -> Result<Self, AllocError>
547    where
548        A: Allocator,
549    {
550        let mut boxed = Self::try_new_uninit_in(alloc)?;
551        boxed.write(x);
552        unsafe { Ok(boxed.assume_init()) }
553    }
554
555    /// Constructs a new box with uninitialized contents in the provided allocator.
556    ///
557    /// # Examples
558    ///
559    /// ```
560    /// #![feature(allocator_api)]
561    ///
562    /// use std::alloc::System;
563    ///
564    /// let mut five = Box::<u32, _>::new_uninit_in(System);
565    /// // Deferred initialization:
566    /// five.write(5);
567    /// let five = unsafe { five.assume_init() };
568    ///
569    /// assert_eq!(*five, 5)
570    /// ```
571    #[unstable(feature = "allocator_api", issue = "32838")]
572    #[cfg(not(no_global_oom_handling))]
573    #[must_use]
574    pub fn new_uninit_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
575    where
576        A: Allocator,
577    {
578        let layout = Layout::new::<mem::MaybeUninit<T>>();
579        // NOTE: Prefer match over unwrap_or_else since closure sometimes not inlineable.
580        // That would make code size bigger.
581        match Box::try_new_uninit_in(alloc) {
582            Ok(m) => m,
583            Err(_) => handle_alloc_error(layout),
584        }
585    }
586
587    /// Constructs a new box with uninitialized contents in the provided allocator,
588    /// returning an error if the allocation fails
589    ///
590    /// # Examples
591    ///
592    /// ```
593    /// #![feature(allocator_api)]
594    ///
595    /// use std::alloc::System;
596    ///
597    /// let mut five = Box::<u32, _>::try_new_uninit_in(System)?;
598    /// // Deferred initialization:
599    /// five.write(5);
600    /// let five = unsafe { five.assume_init() };
601    ///
602    /// assert_eq!(*five, 5);
603    /// # Ok::<(), std::alloc::AllocError>(())
604    /// ```
605    #[unstable(feature = "allocator_api", issue = "32838")]
606    pub fn try_new_uninit_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
607    where
608        A: Allocator,
609    {
610        let ptr = if T::IS_ZST {
611            NonNull::dangling()
612        } else {
613            let layout = Layout::new::<mem::MaybeUninit<T>>();
614            alloc.allocate(layout)?.cast()
615        };
616        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
617    }
618
619    /// Constructs a new `Box` with uninitialized contents, with the memory
620    /// being filled with `0` bytes in the provided allocator.
621    ///
622    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
623    /// of this method.
624    ///
625    /// # Examples
626    ///
627    /// ```
628    /// #![feature(allocator_api)]
629    ///
630    /// use std::alloc::System;
631    ///
632    /// let zero = Box::<u32, _>::new_zeroed_in(System);
633    /// let zero = unsafe { zero.assume_init() };
634    ///
635    /// assert_eq!(*zero, 0)
636    /// ```
637    ///
638    /// [zeroed]: mem::MaybeUninit::zeroed
639    #[unstable(feature = "allocator_api", issue = "32838")]
640    #[cfg(not(no_global_oom_handling))]
641    #[must_use]
642    pub fn new_zeroed_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
643    where
644        A: Allocator,
645    {
646        let layout = Layout::new::<mem::MaybeUninit<T>>();
647        // NOTE: Prefer match over unwrap_or_else since closure sometimes not inlineable.
648        // That would make code size bigger.
649        match Box::try_new_zeroed_in(alloc) {
650            Ok(m) => m,
651            Err(_) => handle_alloc_error(layout),
652        }
653    }
654
655    /// Constructs a new `Box` with uninitialized contents, with the memory
656    /// being filled with `0` bytes in the provided allocator,
657    /// returning an error if the allocation fails,
658    ///
659    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
660    /// of this method.
661    ///
662    /// # Examples
663    ///
664    /// ```
665    /// #![feature(allocator_api)]
666    ///
667    /// use std::alloc::System;
668    ///
669    /// let zero = Box::<u32, _>::try_new_zeroed_in(System)?;
670    /// let zero = unsafe { zero.assume_init() };
671    ///
672    /// assert_eq!(*zero, 0);
673    /// # Ok::<(), std::alloc::AllocError>(())
674    /// ```
675    ///
676    /// [zeroed]: mem::MaybeUninit::zeroed
677    #[unstable(feature = "allocator_api", issue = "32838")]
678    pub fn try_new_zeroed_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
679    where
680        A: Allocator,
681    {
682        let ptr = if T::IS_ZST {
683            NonNull::dangling()
684        } else {
685            let layout = Layout::new::<mem::MaybeUninit<T>>();
686            alloc.allocate_zeroed(layout)?.cast()
687        };
688        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
689    }
690
691    /// Constructs a new `Pin<Box<T, A>>`. If `T` does not implement [`Unpin`], then
692    /// `x` will be pinned in memory and unable to be moved.
693    ///
694    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin_in(x, alloc)`
695    /// does the same as <code>[Box::into_pin]\([Box::new_in]\(x, alloc))</code>. Consider using
696    /// [`into_pin`](Box::into_pin) if you already have a `Box<T, A>`, or if you want to
697    /// construct a (pinned) `Box` in a different way than with [`Box::new_in`].
698    ///
699    /// # Examples
700    ///
701    /// ```
702    /// #![feature(allocator_api)]
703    /// use std::alloc::System;
704    ///
705    /// let x = Box::pin_in(1, System);
706    /// ```
707    #[cfg(not(no_global_oom_handling))]
708    #[unstable(feature = "allocator_api", issue = "32838")]
709    #[must_use]
710    #[inline(always)]
711    pub fn pin_in(x: T, alloc: A) -> Pin<Self>
712    where
713        A: 'static + Allocator,
714    {
715        Self::into_pin(Self::new_in(x, alloc))
716    }
717
718    /// Converts a `Box<T>` into a `Box<[T]>`
719    ///
720    /// This conversion does not allocate on the heap and happens in place.
721    #[unstable(feature = "box_into_boxed_slice", issue = "71582")]
722    pub fn into_boxed_slice(boxed: Self) -> Box<[T], A> {
723        let (raw, alloc) = Box::into_raw_with_allocator(boxed);
724        unsafe { Box::from_raw_in(raw as *mut [T; 1], alloc) }
725    }
726
727    /// Consumes the `Box`, returning the wrapped value.
728    ///
729    /// # Examples
730    ///
731    /// ```
732    /// #![feature(box_into_inner)]
733    ///
734    /// let c = Box::new(5);
735    ///
736    /// assert_eq!(Box::into_inner(c), 5);
737    /// ```
738    #[unstable(feature = "box_into_inner", issue = "80437")]
739    #[inline]
740    pub fn into_inner(boxed: Self) -> T {
741        *boxed
742    }
743
744    /// Consumes the `Box` without consuming its allocation, returning the wrapped value and a `Box`
745    /// to the uninitialized memory where the wrapped value used to live.
746    ///
747    /// This can be used together with [`write`](Box::write) to reuse the allocation for multiple
748    /// boxed values.
749    ///
750    /// # Examples
751    ///
752    /// ```
753    /// #![feature(box_take)]
754    ///
755    /// let c = Box::new(5);
756    ///
757    /// // take the value out of the box
758    /// let (value, uninit) = Box::take(c);
759    /// assert_eq!(value, 5);
760    ///
761    /// // reuse the box for a second value
762    /// let c = Box::write(uninit, 6);
763    /// assert_eq!(*c, 6);
764    /// ```
765    #[unstable(feature = "box_take", issue = "147212")]
766    pub fn take(boxed: Self) -> (T, Box<mem::MaybeUninit<T>, A>) {
767        unsafe {
768            let (raw, alloc) = Box::into_non_null_with_allocator(boxed);
769            let value = raw.read();
770            let uninit = Box::from_non_null_in(raw.cast_uninit(), alloc);
771            (value, uninit)
772        }
773    }
774}
775
776impl<T: ?Sized + CloneToUninit> Box<T> {
777    /// Allocates memory on the heap then clones `src` into it.
778    ///
779    /// This doesn't actually allocate if `src` is zero-sized.
780    ///
781    /// # Examples
782    ///
783    /// ```
784    /// #![feature(clone_from_ref)]
785    ///
786    /// let hello: Box<str> = Box::clone_from_ref("hello");
787    /// ```
788    #[cfg(not(no_global_oom_handling))]
789    #[unstable(feature = "clone_from_ref", issue = "149075")]
790    #[must_use]
791    #[inline]
792    pub fn clone_from_ref(src: &T) -> Box<T> {
793        Box::clone_from_ref_in(src, Global)
794    }
795
796    /// Allocates memory on the heap then clones `src` into it, returning an error if allocation fails.
797    ///
798    /// This doesn't actually allocate if `src` is zero-sized.
799    ///
800    /// # Examples
801    ///
802    /// ```
803    /// #![feature(clone_from_ref)]
804    /// #![feature(allocator_api)]
805    ///
806    /// let hello: Box<str> = Box::try_clone_from_ref("hello")?;
807    /// # Ok::<(), std::alloc::AllocError>(())
808    /// ```
809    #[unstable(feature = "clone_from_ref", issue = "149075")]
810    //#[unstable(feature = "allocator_api", issue = "32838")]
811    #[must_use]
812    #[inline]
813    pub fn try_clone_from_ref(src: &T) -> Result<Box<T>, AllocError> {
814        Box::try_clone_from_ref_in(src, Global)
815    }
816}
817
818impl<T: ?Sized + CloneToUninit, A: Allocator> Box<T, A> {
819    /// Allocates memory in the given allocator then clones `src` into it.
820    ///
821    /// This doesn't actually allocate if `src` is zero-sized.
822    ///
823    /// # Examples
824    ///
825    /// ```
826    /// #![feature(clone_from_ref)]
827    /// #![feature(allocator_api)]
828    ///
829    /// use std::alloc::System;
830    ///
831    /// let hello: Box<str, System> = Box::clone_from_ref_in("hello", System);
832    /// ```
833    #[cfg(not(no_global_oom_handling))]
834    #[unstable(feature = "clone_from_ref", issue = "149075")]
835    //#[unstable(feature = "allocator_api", issue = "32838")]
836    #[must_use]
837    #[inline]
838    pub fn clone_from_ref_in(src: &T, alloc: A) -> Box<T, A> {
839        let layout = Layout::for_value::<T>(src);
840        match Box::try_clone_from_ref_in(src, alloc) {
841            Ok(bx) => bx,
842            Err(_) => handle_alloc_error(layout),
843        }
844    }
845
846    /// Allocates memory in the given allocator then clones `src` into it, returning an error if allocation fails.
847    ///
848    /// This doesn't actually allocate if `src` is zero-sized.
849    ///
850    /// # Examples
851    ///
852    /// ```
853    /// #![feature(clone_from_ref)]
854    /// #![feature(allocator_api)]
855    ///
856    /// use std::alloc::System;
857    ///
858    /// let hello: Box<str, System> = Box::try_clone_from_ref_in("hello", System)?;
859    /// # Ok::<(), std::alloc::AllocError>(())
860    /// ```
861    #[unstable(feature = "clone_from_ref", issue = "149075")]
862    //#[unstable(feature = "allocator_api", issue = "32838")]
863    #[must_use]
864    #[inline]
865    pub fn try_clone_from_ref_in(src: &T, alloc: A) -> Result<Box<T, A>, AllocError> {
866        struct DeallocDropGuard<'a, A: Allocator>(Layout, &'a A, NonNull<u8>);
867        impl<'a, A: Allocator> Drop for DeallocDropGuard<'a, A> {
868            fn drop(&mut self) {
869                let &mut DeallocDropGuard(layout, alloc, ptr) = self;
870                // Safety: `ptr` was allocated by `*alloc` with layout `layout`
871                unsafe {
872                    alloc.deallocate(ptr, layout);
873                }
874            }
875        }
876        let layout = Layout::for_value::<T>(src);
877        let (ptr, guard) = if layout.size() == 0 {
878            (layout.dangling_ptr(), None)
879        } else {
880            // Safety: layout is non-zero-sized
881            let ptr = alloc.allocate(layout)?.cast();
882            (ptr, Some(DeallocDropGuard(layout, &alloc, ptr)))
883        };
884        let ptr = ptr.as_ptr();
885        // Safety: `*ptr` is newly allocated, correctly aligned to `align_of_val(src)`,
886        // and is valid for writes for `size_of_val(src)`.
887        // If this panics, then `guard` will deallocate for us (if allocation occuured)
888        unsafe {
889            <T as CloneToUninit>::clone_to_uninit(src, ptr);
890        }
891        // Defuse the deallocate guard
892        core::mem::forget(guard);
893        // Safety: We just initialized `*ptr` as a clone of `src`
894        Ok(unsafe { Box::from_raw_in(ptr.with_metadata_of(src), alloc) })
895    }
896}
897
898impl<T> Box<[T]> {
899    /// Constructs a new boxed slice with uninitialized contents.
900    ///
901    /// # Examples
902    ///
903    /// ```
904    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
905    /// // Deferred initialization:
906    /// values[0].write(1);
907    /// values[1].write(2);
908    /// values[2].write(3);
909    /// let values = unsafe { values.assume_init() };
910    ///
911    /// assert_eq!(*values, [1, 2, 3])
912    /// ```
913    #[cfg(not(no_global_oom_handling))]
914    #[stable(feature = "new_uninit", since = "1.82.0")]
915    #[must_use]
916    pub fn new_uninit_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
917        unsafe { RawVec::with_capacity(len).into_box(len) }
918    }
919
920    /// Constructs a new boxed slice with uninitialized contents, with the memory
921    /// being filled with `0` bytes.
922    ///
923    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
924    /// of this method.
925    ///
926    /// # Examples
927    ///
928    /// ```
929    /// let values = Box::<[u32]>::new_zeroed_slice(3);
930    /// let values = unsafe { values.assume_init() };
931    ///
932    /// assert_eq!(*values, [0, 0, 0])
933    /// ```
934    ///
935    /// [zeroed]: mem::MaybeUninit::zeroed
936    #[cfg(not(no_global_oom_handling))]
937    #[stable(feature = "new_zeroed_alloc", since = "1.92.0")]
938    #[must_use]
939    pub fn new_zeroed_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
940        unsafe { RawVec::with_capacity_zeroed(len).into_box(len) }
941    }
942
943    /// Constructs a new boxed slice with uninitialized contents. Returns an error if
944    /// the allocation fails.
945    ///
946    /// # Examples
947    ///
948    /// ```
949    /// #![feature(allocator_api)]
950    ///
951    /// let mut values = Box::<[u32]>::try_new_uninit_slice(3)?;
952    /// // Deferred initialization:
953    /// values[0].write(1);
954    /// values[1].write(2);
955    /// values[2].write(3);
956    /// let values = unsafe { values.assume_init() };
957    ///
958    /// assert_eq!(*values, [1, 2, 3]);
959    /// # Ok::<(), std::alloc::AllocError>(())
960    /// ```
961    #[unstable(feature = "allocator_api", issue = "32838")]
962    #[inline]
963    pub fn try_new_uninit_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
964        let ptr = if T::IS_ZST || len == 0 {
965            NonNull::dangling()
966        } else {
967            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
968                Ok(l) => l,
969                Err(_) => return Err(AllocError),
970            };
971            Global.allocate(layout)?.cast()
972        };
973        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
974    }
975
976    /// Constructs a new boxed slice with uninitialized contents, with the memory
977    /// being filled with `0` bytes. Returns an error if the allocation fails.
978    ///
979    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
980    /// of this method.
981    ///
982    /// # Examples
983    ///
984    /// ```
985    /// #![feature(allocator_api)]
986    ///
987    /// let values = Box::<[u32]>::try_new_zeroed_slice(3)?;
988    /// let values = unsafe { values.assume_init() };
989    ///
990    /// assert_eq!(*values, [0, 0, 0]);
991    /// # Ok::<(), std::alloc::AllocError>(())
992    /// ```
993    ///
994    /// [zeroed]: mem::MaybeUninit::zeroed
995    #[unstable(feature = "allocator_api", issue = "32838")]
996    #[inline]
997    pub fn try_new_zeroed_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
998        let ptr = if T::IS_ZST || len == 0 {
999            NonNull::dangling()
1000        } else {
1001            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
1002                Ok(l) => l,
1003                Err(_) => return Err(AllocError),
1004            };
1005            Global.allocate_zeroed(layout)?.cast()
1006        };
1007        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
1008    }
1009}
1010
1011impl<T, A: Allocator> Box<[T], A> {
1012    /// Constructs a new boxed slice with uninitialized contents in the provided allocator.
1013    ///
1014    /// # Examples
1015    ///
1016    /// ```
1017    /// #![feature(allocator_api)]
1018    ///
1019    /// use std::alloc::System;
1020    ///
1021    /// let mut values = Box::<[u32], _>::new_uninit_slice_in(3, System);
1022    /// // Deferred initialization:
1023    /// values[0].write(1);
1024    /// values[1].write(2);
1025    /// values[2].write(3);
1026    /// let values = unsafe { values.assume_init() };
1027    ///
1028    /// assert_eq!(*values, [1, 2, 3])
1029    /// ```
1030    #[cfg(not(no_global_oom_handling))]
1031    #[unstable(feature = "allocator_api", issue = "32838")]
1032    #[must_use]
1033    pub fn new_uninit_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
1034        unsafe { RawVec::with_capacity_in(len, alloc).into_box(len) }
1035    }
1036
1037    /// Constructs a new boxed slice with uninitialized contents in the provided allocator,
1038    /// with the memory being filled with `0` bytes.
1039    ///
1040    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
1041    /// of this method.
1042    ///
1043    /// # Examples
1044    ///
1045    /// ```
1046    /// #![feature(allocator_api)]
1047    ///
1048    /// use std::alloc::System;
1049    ///
1050    /// let values = Box::<[u32], _>::new_zeroed_slice_in(3, System);
1051    /// let values = unsafe { values.assume_init() };
1052    ///
1053    /// assert_eq!(*values, [0, 0, 0])
1054    /// ```
1055    ///
1056    /// [zeroed]: mem::MaybeUninit::zeroed
1057    #[cfg(not(no_global_oom_handling))]
1058    #[unstable(feature = "allocator_api", issue = "32838")]
1059    #[must_use]
1060    pub fn new_zeroed_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
1061        unsafe { RawVec::with_capacity_zeroed_in(len, alloc).into_box(len) }
1062    }
1063
1064    /// Constructs a new boxed slice with uninitialized contents in the provided allocator. Returns an error if
1065    /// the allocation fails.
1066    ///
1067    /// # Examples
1068    ///
1069    /// ```
1070    /// #![feature(allocator_api)]
1071    ///
1072    /// use std::alloc::System;
1073    ///
1074    /// let mut values = Box::<[u32], _>::try_new_uninit_slice_in(3, System)?;
1075    /// // Deferred initialization:
1076    /// values[0].write(1);
1077    /// values[1].write(2);
1078    /// values[2].write(3);
1079    /// let values = unsafe { values.assume_init() };
1080    ///
1081    /// assert_eq!(*values, [1, 2, 3]);
1082    /// # Ok::<(), std::alloc::AllocError>(())
1083    /// ```
1084    #[unstable(feature = "allocator_api", issue = "32838")]
1085    #[inline]
1086    pub fn try_new_uninit_slice_in(
1087        len: usize,
1088        alloc: A,
1089    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
1090        let ptr = if T::IS_ZST || len == 0 {
1091            NonNull::dangling()
1092        } else {
1093            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
1094                Ok(l) => l,
1095                Err(_) => return Err(AllocError),
1096            };
1097            alloc.allocate(layout)?.cast()
1098        };
1099        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
1100    }
1101
1102    /// Constructs a new boxed slice with uninitialized contents in the provided allocator, with the memory
1103    /// being filled with `0` bytes. Returns an error if the allocation fails.
1104    ///
1105    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
1106    /// of this method.
1107    ///
1108    /// # Examples
1109    ///
1110    /// ```
1111    /// #![feature(allocator_api)]
1112    ///
1113    /// use std::alloc::System;
1114    ///
1115    /// let values = Box::<[u32], _>::try_new_zeroed_slice_in(3, System)?;
1116    /// let values = unsafe { values.assume_init() };
1117    ///
1118    /// assert_eq!(*values, [0, 0, 0]);
1119    /// # Ok::<(), std::alloc::AllocError>(())
1120    /// ```
1121    ///
1122    /// [zeroed]: mem::MaybeUninit::zeroed
1123    #[unstable(feature = "allocator_api", issue = "32838")]
1124    #[inline]
1125    pub fn try_new_zeroed_slice_in(
1126        len: usize,
1127        alloc: A,
1128    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
1129        let ptr = if T::IS_ZST || len == 0 {
1130            NonNull::dangling()
1131        } else {
1132            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
1133                Ok(l) => l,
1134                Err(_) => return Err(AllocError),
1135            };
1136            alloc.allocate_zeroed(layout)?.cast()
1137        };
1138        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
1139    }
1140
1141    /// Converts the boxed slice into a boxed array.
1142    ///
1143    /// This operation does not reallocate; the underlying array of the slice is simply reinterpreted as an array type.
1144    ///
1145    /// If `N` is not exactly equal to the length of `self`, then this method returns `None`.
1146    ///
1147    /// # Examples
1148    ///
1149    /// ```
1150    /// #![feature(alloc_slice_into_array)]
1151    /// let box_slice: Box<[i32]> = Box::new([1, 2, 3]);
1152    ///
1153    /// let box_array: Box<[i32; 3]> = box_slice.into_array().unwrap();
1154    /// ```
1155    #[unstable(feature = "alloc_slice_into_array", issue = "148082")]
1156    #[inline]
1157    #[must_use]
1158    pub fn into_array<const N: usize>(self) -> Option<Box<[T; N], A>> {
1159        if self.len() == N {
1160            let (ptr, alloc) = Self::into_raw_with_allocator(self);
1161            let ptr = ptr as *mut [T; N];
1162
1163            // SAFETY: The underlying array of a slice has the exact same layout as an actual array `[T; N]` if `N` is equal to the slice's length.
1164            let me = unsafe { Box::from_raw_in(ptr, alloc) };
1165            Some(me)
1166        } else {
1167            None
1168        }
1169    }
1170}
1171
1172impl<T, A: Allocator> Box<mem::MaybeUninit<T>, A> {
1173    /// Converts to `Box<T, A>`.
1174    ///
1175    /// # Safety
1176    ///
1177    /// As with [`MaybeUninit::assume_init`],
1178    /// it is up to the caller to guarantee that the value
1179    /// really is in an initialized state.
1180    /// Calling this when the content is not yet fully initialized
1181    /// causes immediate undefined behavior.
1182    ///
1183    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
1184    ///
1185    /// # Examples
1186    ///
1187    /// ```
1188    /// let mut five = Box::<u32>::new_uninit();
1189    /// // Deferred initialization:
1190    /// five.write(5);
1191    /// let five: Box<u32> = unsafe { five.assume_init() };
1192    ///
1193    /// assert_eq!(*five, 5)
1194    /// ```
1195    #[stable(feature = "new_uninit", since = "1.82.0")]
1196    #[inline(always)]
1197    pub unsafe fn assume_init(self) -> Box<T, A> {
1198        // This is used in the `vec!` macro, so we optimize for minimal IR generation
1199        // even in debug builds.
1200        // SAFETY: `Box<T>` and `Box<MaybeUninit<T>>` have the same layout.
1201        unsafe { core::intrinsics::transmute_unchecked(self) }
1202    }
1203
1204    /// Writes the value and converts to `Box<T, A>`.
1205    ///
1206    /// This method converts the box similarly to [`Box::assume_init`] but
1207    /// writes `value` into it before conversion thus guaranteeing safety.
1208    /// In some scenarios use of this method may improve performance because
1209    /// the compiler may be able to optimize copying from stack.
1210    ///
1211    /// # Examples
1212    ///
1213    /// ```
1214    /// let big_box = Box::<[usize; 1024]>::new_uninit();
1215    ///
1216    /// let mut array = [0; 1024];
1217    /// for (i, place) in array.iter_mut().enumerate() {
1218    ///     *place = i;
1219    /// }
1220    ///
1221    /// // The optimizer may be able to elide this copy, so previous code writes
1222    /// // to heap directly.
1223    /// let big_box = Box::write(big_box, array);
1224    ///
1225    /// for (i, x) in big_box.iter().enumerate() {
1226    ///     assert_eq!(*x, i);
1227    /// }
1228    /// ```
1229    #[stable(feature = "box_uninit_write", since = "1.87.0")]
1230    #[inline]
1231    pub fn write(mut boxed: Self, value: T) -> Box<T, A> {
1232        unsafe {
1233            (*boxed).write(value);
1234            boxed.assume_init()
1235        }
1236    }
1237}
1238
1239impl<T, A: Allocator> Box<[mem::MaybeUninit<T>], A> {
1240    /// Converts to `Box<[T], A>`.
1241    ///
1242    /// # Safety
1243    ///
1244    /// As with [`MaybeUninit::assume_init`],
1245    /// it is up to the caller to guarantee that the values
1246    /// really are in an initialized state.
1247    /// Calling this when the content is not yet fully initialized
1248    /// causes immediate undefined behavior.
1249    ///
1250    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
1251    ///
1252    /// # Examples
1253    ///
1254    /// ```
1255    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
1256    /// // Deferred initialization:
1257    /// values[0].write(1);
1258    /// values[1].write(2);
1259    /// values[2].write(3);
1260    /// let values = unsafe { values.assume_init() };
1261    ///
1262    /// assert_eq!(*values, [1, 2, 3])
1263    /// ```
1264    #[stable(feature = "new_uninit", since = "1.82.0")]
1265    #[inline]
1266    pub unsafe fn assume_init(self) -> Box<[T], A> {
1267        let (raw, alloc) = Box::into_raw_with_allocator(self);
1268        unsafe { Box::from_raw_in(raw as *mut [T], alloc) }
1269    }
1270}
1271
1272impl<T: ?Sized> Box<T> {
1273    /// Constructs a box from a raw pointer.
1274    ///
1275    /// After calling this function, the raw pointer is owned by the
1276    /// resulting `Box`. Specifically, the `Box` destructor will call
1277    /// the destructor of `T` and free the allocated memory. For this
1278    /// to be safe, the memory must have been allocated in accordance
1279    /// with the [memory layout] used by `Box` .
1280    ///
1281    /// # Safety
1282    ///
1283    /// This function is unsafe because improper use may lead to
1284    /// memory problems. For example, a double-free may occur if the
1285    /// function is called twice on the same raw pointer.
1286    ///
1287    /// The raw pointer must point to a block of memory allocated by the global allocator.
1288    ///
1289    /// The safety conditions are described in the [memory layout] section.
1290    /// Note that the [considerations for unsafe code] apply to all `Box<T>` values.
1291    ///
1292    /// # Examples
1293    ///
1294    /// Recreate a `Box` which was previously converted to a raw pointer
1295    /// using [`Box::into_raw`]:
1296    /// ```
1297    /// let x = Box::new(5);
1298    /// let ptr = Box::into_raw(x);
1299    /// let x = unsafe { Box::from_raw(ptr) };
1300    /// ```
1301    /// Manually create a `Box` from scratch by using the global allocator:
1302    /// ```
1303    /// use std::alloc::{alloc, Layout};
1304    ///
1305    /// unsafe {
1306    ///     let ptr = alloc(Layout::new::<i32>()) as *mut i32;
1307    ///     // In general .write is required to avoid attempting to destruct
1308    ///     // the (uninitialized) previous contents of `ptr`, though for this
1309    ///     // simple example `*ptr = 5` would have worked as well.
1310    ///     ptr.write(5);
1311    ///     let x = Box::from_raw(ptr);
1312    /// }
1313    /// ```
1314    ///
1315    /// [memory layout]: self#memory-layout
1316    /// [considerations for unsafe code]: self#considerations-for-unsafe-code
1317    #[stable(feature = "box_raw", since = "1.4.0")]
1318    #[inline]
1319    #[must_use = "call `drop(Box::from_raw(ptr))` if you intend to drop the `Box`"]
1320    pub unsafe fn from_raw(raw: *mut T) -> Self {
1321        unsafe { Self::from_raw_in(raw, Global) }
1322    }
1323
1324    /// Constructs a box from a `NonNull` pointer.
1325    ///
1326    /// After calling this function, the `NonNull` pointer is owned by
1327    /// the resulting `Box`. Specifically, the `Box` destructor will call
1328    /// the destructor of `T` and free the allocated memory. For this
1329    /// to be safe, the memory must have been allocated in accordance
1330    /// with the [memory layout] used by `Box` .
1331    ///
1332    /// # Safety
1333    ///
1334    /// This function is unsafe because improper use may lead to
1335    /// memory problems. For example, a double-free may occur if the
1336    /// function is called twice on the same `NonNull` pointer.
1337    ///
1338    /// The non-null pointer must point to a block of memory allocated by the global allocator.
1339    ///
1340    /// The safety conditions are described in the [memory layout] section.
1341    /// Note that the [considerations for unsafe code] apply to all `Box<T>` values.
1342    ///
1343    /// # Examples
1344    ///
1345    /// Recreate a `Box` which was previously converted to a `NonNull`
1346    /// pointer using [`Box::into_non_null`]:
1347    /// ```
1348    /// #![feature(box_vec_non_null)]
1349    ///
1350    /// let x = Box::new(5);
1351    /// let non_null = Box::into_non_null(x);
1352    /// let x = unsafe { Box::from_non_null(non_null) };
1353    /// ```
1354    /// Manually create a `Box` from scratch by using the global allocator:
1355    /// ```
1356    /// #![feature(box_vec_non_null)]
1357    ///
1358    /// use std::alloc::{alloc, Layout};
1359    /// use std::ptr::NonNull;
1360    ///
1361    /// unsafe {
1362    ///     let non_null = NonNull::new(alloc(Layout::new::<i32>()).cast::<i32>())
1363    ///         .expect("allocation failed");
1364    ///     // In general .write is required to avoid attempting to destruct
1365    ///     // the (uninitialized) previous contents of `non_null`.
1366    ///     non_null.write(5);
1367    ///     let x = Box::from_non_null(non_null);
1368    /// }
1369    /// ```
1370    ///
1371    /// [memory layout]: self#memory-layout
1372    /// [considerations for unsafe code]: self#considerations-for-unsafe-code
1373    #[unstable(feature = "box_vec_non_null", issue = "130364")]
1374    #[inline]
1375    #[must_use = "call `drop(Box::from_non_null(ptr))` if you intend to drop the `Box`"]
1376    pub unsafe fn from_non_null(ptr: NonNull<T>) -> Self {
1377        unsafe { Self::from_raw(ptr.as_ptr()) }
1378    }
1379
1380    /// Consumes the `Box`, returning a wrapped raw pointer.
1381    ///
1382    /// The pointer will be properly aligned and non-null.
1383    ///
1384    /// After calling this function, the caller is responsible for the
1385    /// memory previously managed by the `Box`. In particular, the
1386    /// caller should properly destroy `T` and release the memory, taking
1387    /// into account the [memory layout] used by `Box`. The easiest way to
1388    /// do this is to convert the raw pointer back into a `Box` with the
1389    /// [`Box::from_raw`] function, allowing the `Box` destructor to perform
1390    /// the cleanup.
1391    ///
1392    /// Note: this is an associated function, which means that you have
1393    /// to call it as `Box::into_raw(b)` instead of `b.into_raw()`. This
1394    /// is so that there is no conflict with a method on the inner type.
1395    ///
1396    /// # Examples
1397    /// Converting the raw pointer back into a `Box` with [`Box::from_raw`]
1398    /// for automatic cleanup:
1399    /// ```
1400    /// let x = Box::new(String::from("Hello"));
1401    /// let ptr = Box::into_raw(x);
1402    /// let x = unsafe { Box::from_raw(ptr) };
1403    /// ```
1404    /// Manual cleanup by explicitly running the destructor and deallocating
1405    /// the memory:
1406    /// ```
1407    /// use std::alloc::{dealloc, Layout};
1408    /// use std::ptr;
1409    ///
1410    /// let x = Box::new(String::from("Hello"));
1411    /// let ptr = Box::into_raw(x);
1412    /// unsafe {
1413    ///     ptr::drop_in_place(ptr);
1414    ///     dealloc(ptr as *mut u8, Layout::new::<String>());
1415    /// }
1416    /// ```
1417    /// Note: This is equivalent to the following:
1418    /// ```
1419    /// let x = Box::new(String::from("Hello"));
1420    /// let ptr = Box::into_raw(x);
1421    /// unsafe {
1422    ///     drop(Box::from_raw(ptr));
1423    /// }
1424    /// ```
1425    ///
1426    /// [memory layout]: self#memory-layout
1427    #[must_use = "losing the pointer will leak memory"]
1428    #[stable(feature = "box_raw", since = "1.4.0")]
1429    #[inline]
1430    pub fn into_raw(b: Self) -> *mut T {
1431        // Avoid `into_raw_with_allocator` as that interacts poorly with Miri's Stacked Borrows.
1432        let mut b = mem::ManuallyDrop::new(b);
1433        // We need to give Miri (specifically, Stacked Borrows) a chance to recognize this as a
1434        // safe-to-raw-pointer cast. To achieve this, we first create a mutable reference, and then
1435        // cast that to a raw pointer -- this cast is recognized by the aliasing model and leads to
1436        // a suitable retag.
1437        // It would be wrong for `into_raw_with_allocator` to do the same as that would induce
1438        // uniqueness assumptions (from the `&mut`) that we only want with the default allocator.
1439        (&mut **b) as *mut T
1440    }
1441
1442    /// Consumes the `Box`, returning a wrapped `NonNull` pointer.
1443    ///
1444    /// The pointer will be properly aligned.
1445    ///
1446    /// After calling this function, the caller is responsible for the
1447    /// memory previously managed by the `Box`. In particular, the
1448    /// caller should properly destroy `T` and release the memory, taking
1449    /// into account the [memory layout] used by `Box`. The easiest way to
1450    /// do this is to convert the `NonNull` pointer back into a `Box` with the
1451    /// [`Box::from_non_null`] function, allowing the `Box` destructor to
1452    /// perform the cleanup.
1453    ///
1454    /// Note: this is an associated function, which means that you have
1455    /// to call it as `Box::into_non_null(b)` instead of `b.into_non_null()`.
1456    /// This is so that there is no conflict with a method on the inner type.
1457    ///
1458    /// # Examples
1459    /// Converting the `NonNull` pointer back into a `Box` with [`Box::from_non_null`]
1460    /// for automatic cleanup:
1461    /// ```
1462    /// #![feature(box_vec_non_null)]
1463    ///
1464    /// let x = Box::new(String::from("Hello"));
1465    /// let non_null = Box::into_non_null(x);
1466    /// let x = unsafe { Box::from_non_null(non_null) };
1467    /// ```
1468    /// Manual cleanup by explicitly running the destructor and deallocating
1469    /// the memory:
1470    /// ```
1471    /// #![feature(box_vec_non_null)]
1472    ///
1473    /// use std::alloc::{dealloc, Layout};
1474    ///
1475    /// let x = Box::new(String::from("Hello"));
1476    /// let non_null = Box::into_non_null(x);
1477    /// unsafe {
1478    ///     non_null.drop_in_place();
1479    ///     dealloc(non_null.as_ptr().cast::<u8>(), Layout::new::<String>());
1480    /// }
1481    /// ```
1482    /// Note: This is equivalent to the following:
1483    /// ```
1484    /// #![feature(box_vec_non_null)]
1485    ///
1486    /// let x = Box::new(String::from("Hello"));
1487    /// let non_null = Box::into_non_null(x);
1488    /// unsafe {
1489    ///     drop(Box::from_non_null(non_null));
1490    /// }
1491    /// ```
1492    ///
1493    /// [memory layout]: self#memory-layout
1494    #[must_use = "losing the pointer will leak memory"]
1495    #[unstable(feature = "box_vec_non_null", issue = "130364")]
1496    #[inline]
1497    pub fn into_non_null(b: Self) -> NonNull<T> {
1498        // SAFETY: `Box` is guaranteed to be non-null.
1499        unsafe { NonNull::new_unchecked(Self::into_raw(b)) }
1500    }
1501}
1502
1503impl<T: ?Sized, A: Allocator> Box<T, A> {
1504    /// Constructs a box from a raw pointer in the given allocator.
1505    ///
1506    /// After calling this function, the raw pointer is owned by the
1507    /// resulting `Box`. Specifically, the `Box` destructor will call
1508    /// the destructor of `T` and free the allocated memory. For this
1509    /// to be safe, the memory must have been allocated in accordance
1510    /// with the [memory layout] used by `Box` .
1511    ///
1512    /// # Safety
1513    ///
1514    /// This function is unsafe because improper use may lead to
1515    /// memory problems. For example, a double-free may occur if the
1516    /// function is called twice on the same raw pointer.
1517    ///
1518    /// The raw pointer must point to a block of memory allocated by `alloc`.
1519    ///
1520    /// The safety conditions are described in the [memory layout] section.
1521    /// Note that the [considerations for unsafe code] apply to all `Box<T, A>` values.
1522    ///
1523    /// # Examples
1524    ///
1525    /// Recreate a `Box` which was previously converted to a raw pointer
1526    /// using [`Box::into_raw_with_allocator`]:
1527    /// ```
1528    /// #![feature(allocator_api)]
1529    ///
1530    /// use std::alloc::System;
1531    ///
1532    /// let x = Box::new_in(5, System);
1533    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1534    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1535    /// ```
1536    /// Manually create a `Box` from scratch by using the system allocator:
1537    /// ```
1538    /// #![feature(allocator_api, slice_ptr_get)]
1539    ///
1540    /// use std::alloc::{Allocator, Layout, System};
1541    ///
1542    /// unsafe {
1543    ///     let ptr = System.allocate(Layout::new::<i32>())?.as_mut_ptr() as *mut i32;
1544    ///     // In general .write is required to avoid attempting to destruct
1545    ///     // the (uninitialized) previous contents of `ptr`, though for this
1546    ///     // simple example `*ptr = 5` would have worked as well.
1547    ///     ptr.write(5);
1548    ///     let x = Box::from_raw_in(ptr, System);
1549    /// }
1550    /// # Ok::<(), std::alloc::AllocError>(())
1551    /// ```
1552    ///
1553    /// [memory layout]: self#memory-layout
1554    /// [considerations for unsafe code]: self#considerations-for-unsafe-code
1555    #[unstable(feature = "allocator_api", issue = "32838")]
1556    #[inline]
1557    pub unsafe fn from_raw_in(raw: *mut T, alloc: A) -> Self {
1558        Box(unsafe { Unique::new_unchecked(raw) }, alloc)
1559    }
1560
1561    /// Constructs a box from a `NonNull` pointer in the given allocator.
1562    ///
1563    /// After calling this function, the `NonNull` pointer is owned by
1564    /// the resulting `Box`. Specifically, the `Box` destructor will call
1565    /// the destructor of `T` and free the allocated memory. For this
1566    /// to be safe, the memory must have been allocated in accordance
1567    /// with the [memory layout] used by `Box` .
1568    ///
1569    /// # Safety
1570    ///
1571    /// This function is unsafe because improper use may lead to
1572    /// memory problems. For example, a double-free may occur if the
1573    /// function is called twice on the same raw pointer.
1574    ///
1575    /// The non-null pointer must point to a block of memory allocated by `alloc`.
1576    ///
1577    /// The safety conditions are described in the [memory layout] section.
1578    /// Note that the [considerations for unsafe code] apply to all `Box<T, A>` values.
1579    ///
1580    /// # Examples
1581    ///
1582    /// Recreate a `Box` which was previously converted to a `NonNull` pointer
1583    /// using [`Box::into_non_null_with_allocator`]:
1584    /// ```
1585    /// #![feature(allocator_api)]
1586    ///
1587    /// use std::alloc::System;
1588    ///
1589    /// let x = Box::new_in(5, System);
1590    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1591    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1592    /// ```
1593    /// Manually create a `Box` from scratch by using the system allocator:
1594    /// ```
1595    /// #![feature(allocator_api)]
1596    ///
1597    /// use std::alloc::{Allocator, Layout, System};
1598    ///
1599    /// unsafe {
1600    ///     let non_null = System.allocate(Layout::new::<i32>())?.cast::<i32>();
1601    ///     // In general .write is required to avoid attempting to destruct
1602    ///     // the (uninitialized) previous contents of `non_null`.
1603    ///     non_null.write(5);
1604    ///     let x = Box::from_non_null_in(non_null, System);
1605    /// }
1606    /// # Ok::<(), std::alloc::AllocError>(())
1607    /// ```
1608    ///
1609    /// [memory layout]: self#memory-layout
1610    /// [considerations for unsafe code]: self#considerations-for-unsafe-code
1611    #[unstable(feature = "allocator_api", issue = "32838")]
1612    // #[unstable(feature = "box_vec_non_null", issue = "130364")]
1613    #[inline]
1614    pub unsafe fn from_non_null_in(raw: NonNull<T>, alloc: A) -> Self {
1615        // SAFETY: guaranteed by the caller.
1616        unsafe { Box::from_raw_in(raw.as_ptr(), alloc) }
1617    }
1618
1619    /// Consumes the `Box`, returning a wrapped raw pointer and the allocator.
1620    ///
1621    /// The pointer will be properly aligned and non-null.
1622    ///
1623    /// After calling this function, the caller is responsible for the
1624    /// memory previously managed by the `Box`. In particular, the
1625    /// caller should properly destroy `T` and release the memory, taking
1626    /// into account the [memory layout] used by `Box`. The easiest way to
1627    /// do this is to convert the raw pointer back into a `Box` with the
1628    /// [`Box::from_raw_in`] function, allowing the `Box` destructor to perform
1629    /// the cleanup.
1630    ///
1631    /// Note: this is an associated function, which means that you have
1632    /// to call it as `Box::into_raw_with_allocator(b)` instead of `b.into_raw_with_allocator()`. This
1633    /// is so that there is no conflict with a method on the inner type.
1634    ///
1635    /// # Examples
1636    /// Converting the raw pointer back into a `Box` with [`Box::from_raw_in`]
1637    /// for automatic cleanup:
1638    /// ```
1639    /// #![feature(allocator_api)]
1640    ///
1641    /// use std::alloc::System;
1642    ///
1643    /// let x = Box::new_in(String::from("Hello"), System);
1644    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1645    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1646    /// ```
1647    /// Manual cleanup by explicitly running the destructor and deallocating
1648    /// the memory:
1649    /// ```
1650    /// #![feature(allocator_api)]
1651    ///
1652    /// use std::alloc::{Allocator, Layout, System};
1653    /// use std::ptr::{self, NonNull};
1654    ///
1655    /// let x = Box::new_in(String::from("Hello"), System);
1656    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1657    /// unsafe {
1658    ///     ptr::drop_in_place(ptr);
1659    ///     let non_null = NonNull::new_unchecked(ptr);
1660    ///     alloc.deallocate(non_null.cast(), Layout::new::<String>());
1661    /// }
1662    /// ```
1663    ///
1664    /// [memory layout]: self#memory-layout
1665    #[must_use = "losing the pointer will leak memory"]
1666    #[unstable(feature = "allocator_api", issue = "32838")]
1667    #[inline]
1668    pub fn into_raw_with_allocator(b: Self) -> (*mut T, A) {
1669        let mut b = mem::ManuallyDrop::new(b);
1670        // We carefully get the raw pointer out in a way that Miri's aliasing model understands what
1671        // is happening: using the primitive "deref" of `Box`. In case `A` is *not* `Global`, we
1672        // want *no* aliasing requirements here!
1673        // In case `A` *is* `Global`, this does not quite have the right behavior; `into_raw`
1674        // works around that.
1675        let ptr = &raw mut **b;
1676        let alloc = unsafe { ptr::read(&b.1) };
1677        (ptr, alloc)
1678    }
1679
1680    /// Consumes the `Box`, returning a wrapped `NonNull` pointer and the allocator.
1681    ///
1682    /// The pointer will be properly aligned.
1683    ///
1684    /// After calling this function, the caller is responsible for the
1685    /// memory previously managed by the `Box`. In particular, the
1686    /// caller should properly destroy `T` and release the memory, taking
1687    /// into account the [memory layout] used by `Box`. The easiest way to
1688    /// do this is to convert the `NonNull` pointer back into a `Box` with the
1689    /// [`Box::from_non_null_in`] function, allowing the `Box` destructor to
1690    /// perform the cleanup.
1691    ///
1692    /// Note: this is an associated function, which means that you have
1693    /// to call it as `Box::into_non_null_with_allocator(b)` instead of
1694    /// `b.into_non_null_with_allocator()`. This is so that there is no
1695    /// conflict with a method on the inner type.
1696    ///
1697    /// # Examples
1698    /// Converting the `NonNull` pointer back into a `Box` with
1699    /// [`Box::from_non_null_in`] for automatic cleanup:
1700    /// ```
1701    /// #![feature(allocator_api)]
1702    ///
1703    /// use std::alloc::System;
1704    ///
1705    /// let x = Box::new_in(String::from("Hello"), System);
1706    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1707    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1708    /// ```
1709    /// Manual cleanup by explicitly running the destructor and deallocating
1710    /// the memory:
1711    /// ```
1712    /// #![feature(allocator_api)]
1713    ///
1714    /// use std::alloc::{Allocator, Layout, System};
1715    ///
1716    /// let x = Box::new_in(String::from("Hello"), System);
1717    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1718    /// unsafe {
1719    ///     non_null.drop_in_place();
1720    ///     alloc.deallocate(non_null.cast::<u8>(), Layout::new::<String>());
1721    /// }
1722    /// ```
1723    ///
1724    /// [memory layout]: self#memory-layout
1725    #[must_use = "losing the pointer will leak memory"]
1726    #[unstable(feature = "allocator_api", issue = "32838")]
1727    // #[unstable(feature = "box_vec_non_null", issue = "130364")]
1728    #[inline]
1729    pub fn into_non_null_with_allocator(b: Self) -> (NonNull<T>, A) {
1730        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1731        // SAFETY: `Box` is guaranteed to be non-null.
1732        unsafe { (NonNull::new_unchecked(ptr), alloc) }
1733    }
1734
1735    #[unstable(
1736        feature = "ptr_internals",
1737        issue = "none",
1738        reason = "use `Box::leak(b).into()` or `Unique::from(Box::leak(b))` instead"
1739    )]
1740    #[inline]
1741    #[doc(hidden)]
1742    pub fn into_unique(b: Self) -> (Unique<T>, A) {
1743        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1744        unsafe { (Unique::from(&mut *ptr), alloc) }
1745    }
1746
1747    /// Returns a raw mutable pointer to the `Box`'s contents.
1748    ///
1749    /// The caller must ensure that the `Box` outlives the pointer this
1750    /// function returns, or else it will end up dangling.
1751    ///
1752    /// This method guarantees that for the purpose of the aliasing model, this method
1753    /// does not materialize a reference to the underlying memory, and thus the returned pointer
1754    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1755    /// Note that calling other methods that materialize references to the memory
1756    /// may still invalidate this pointer.
1757    /// See the example below for how this guarantee can be used.
1758    ///
1759    /// # Examples
1760    ///
1761    /// Due to the aliasing guarantee, the following code is legal:
1762    ///
1763    /// ```rust
1764    /// #![feature(box_as_ptr)]
1765    ///
1766    /// unsafe {
1767    ///     let mut b = Box::new(0);
1768    ///     let ptr1 = Box::as_mut_ptr(&mut b);
1769    ///     ptr1.write(1);
1770    ///     let ptr2 = Box::as_mut_ptr(&mut b);
1771    ///     ptr2.write(2);
1772    ///     // Notably, the write to `ptr2` did *not* invalidate `ptr1`:
1773    ///     ptr1.write(3);
1774    /// }
1775    /// ```
1776    ///
1777    /// [`as_mut_ptr`]: Self::as_mut_ptr
1778    /// [`as_ptr`]: Self::as_ptr
1779    #[unstable(feature = "box_as_ptr", issue = "129090")]
1780    #[rustc_never_returns_null_ptr]
1781    #[rustc_as_ptr]
1782    #[inline]
1783    pub fn as_mut_ptr(b: &mut Self) -> *mut T {
1784        // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1785        // any references.
1786        &raw mut **b
1787    }
1788
1789    /// Returns a raw pointer to the `Box`'s contents.
1790    ///
1791    /// The caller must ensure that the `Box` outlives the pointer this
1792    /// function returns, or else it will end up dangling.
1793    ///
1794    /// The caller must also ensure that the memory the pointer (non-transitively) points to
1795    /// is never written to (except inside an `UnsafeCell`) using this pointer or any pointer
1796    /// derived from it. If you need to mutate the contents of the `Box`, use [`as_mut_ptr`].
1797    ///
1798    /// This method guarantees that for the purpose of the aliasing model, this method
1799    /// does not materialize a reference to the underlying memory, and thus the returned pointer
1800    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1801    /// Note that calling other methods that materialize mutable references to the memory,
1802    /// as well as writing to this memory, may still invalidate this pointer.
1803    /// See the example below for how this guarantee can be used.
1804    ///
1805    /// # Examples
1806    ///
1807    /// Due to the aliasing guarantee, the following code is legal:
1808    ///
1809    /// ```rust
1810    /// #![feature(box_as_ptr)]
1811    ///
1812    /// unsafe {
1813    ///     let mut v = Box::new(0);
1814    ///     let ptr1 = Box::as_ptr(&v);
1815    ///     let ptr2 = Box::as_mut_ptr(&mut v);
1816    ///     let _val = ptr2.read();
1817    ///     // No write to this memory has happened yet, so `ptr1` is still valid.
1818    ///     let _val = ptr1.read();
1819    ///     // However, once we do a write...
1820    ///     ptr2.write(1);
1821    ///     // ... `ptr1` is no longer valid.
1822    ///     // This would be UB: let _val = ptr1.read();
1823    /// }
1824    /// ```
1825    ///
1826    /// [`as_mut_ptr`]: Self::as_mut_ptr
1827    /// [`as_ptr`]: Self::as_ptr
1828    #[unstable(feature = "box_as_ptr", issue = "129090")]
1829    #[rustc_never_returns_null_ptr]
1830    #[rustc_as_ptr]
1831    #[inline]
1832    pub fn as_ptr(b: &Self) -> *const T {
1833        // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1834        // any references.
1835        &raw const **b
1836    }
1837
1838    /// Returns a reference to the underlying allocator.
1839    ///
1840    /// Note: this is an associated function, which means that you have
1841    /// to call it as `Box::allocator(&b)` instead of `b.allocator()`. This
1842    /// is so that there is no conflict with a method on the inner type.
1843    #[unstable(feature = "allocator_api", issue = "32838")]
1844    #[inline]
1845    pub fn allocator(b: &Self) -> &A {
1846        &b.1
1847    }
1848
1849    /// Consumes and leaks the `Box`, returning a mutable reference,
1850    /// `&'a mut T`.
1851    ///
1852    /// Note that the type `T` must outlive the chosen lifetime `'a`. If the type
1853    /// has only static references, or none at all, then this may be chosen to be
1854    /// `'static`.
1855    ///
1856    /// This function is mainly useful for data that lives for the remainder of
1857    /// the program's life. Dropping the returned reference will cause a memory
1858    /// leak. If this is not acceptable, the reference should first be wrapped
1859    /// with the [`Box::from_raw`] function producing a `Box`. This `Box` can
1860    /// then be dropped which will properly destroy `T` and release the
1861    /// allocated memory.
1862    ///
1863    /// Note: this is an associated function, which means that you have
1864    /// to call it as `Box::leak(b)` instead of `b.leak()`. This
1865    /// is so that there is no conflict with a method on the inner type.
1866    ///
1867    /// # Examples
1868    ///
1869    /// Simple usage:
1870    ///
1871    /// ```
1872    /// let x = Box::new(41);
1873    /// let static_ref: &'static mut usize = Box::leak(x);
1874    /// *static_ref += 1;
1875    /// assert_eq!(*static_ref, 42);
1876    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1877    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1878    /// # drop(unsafe { Box::from_raw(static_ref) });
1879    /// ```
1880    ///
1881    /// Unsized data:
1882    ///
1883    /// ```
1884    /// let x = vec![1, 2, 3].into_boxed_slice();
1885    /// let static_ref = Box::leak(x);
1886    /// static_ref[0] = 4;
1887    /// assert_eq!(*static_ref, [4, 2, 3]);
1888    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1889    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1890    /// # drop(unsafe { Box::from_raw(static_ref) });
1891    /// ```
1892    #[stable(feature = "box_leak", since = "1.26.0")]
1893    #[inline]
1894    pub fn leak<'a>(b: Self) -> &'a mut T
1895    where
1896        A: 'a,
1897    {
1898        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1899        mem::forget(alloc);
1900        unsafe { &mut *ptr }
1901    }
1902
1903    /// Converts a `Box<T>` into a `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
1904    /// `*boxed` will be pinned in memory and unable to be moved.
1905    ///
1906    /// This conversion does not allocate on the heap and happens in place.
1907    ///
1908    /// This is also available via [`From`].
1909    ///
1910    /// Constructing and pinning a `Box` with <code>Box::into_pin([Box::new]\(x))</code>
1911    /// can also be written more concisely using <code>[Box::pin]\(x)</code>.
1912    /// This `into_pin` method is useful if you already have a `Box<T>`, or you are
1913    /// constructing a (pinned) `Box` in a different way than with [`Box::new`].
1914    ///
1915    /// # Notes
1916    ///
1917    /// It's not recommended that crates add an impl like `From<Box<T>> for Pin<T>`,
1918    /// as it'll introduce an ambiguity when calling `Pin::from`.
1919    /// A demonstration of such a poor impl is shown below.
1920    ///
1921    /// ```compile_fail
1922    /// # use std::pin::Pin;
1923    /// struct Foo; // A type defined in this crate.
1924    /// impl From<Box<()>> for Pin<Foo> {
1925    ///     fn from(_: Box<()>) -> Pin<Foo> {
1926    ///         Pin::new(Foo)
1927    ///     }
1928    /// }
1929    ///
1930    /// let foo = Box::new(());
1931    /// let bar = Pin::from(foo);
1932    /// ```
1933    #[stable(feature = "box_into_pin", since = "1.63.0")]
1934    pub fn into_pin(boxed: Self) -> Pin<Self>
1935    where
1936        A: 'static,
1937    {
1938        // It's not possible to move or replace the insides of a `Pin<Box<T>>`
1939        // when `T: !Unpin`, so it's safe to pin it directly without any
1940        // additional requirements.
1941        unsafe { Pin::new_unchecked(boxed) }
1942    }
1943}
1944
1945#[stable(feature = "rust1", since = "1.0.0")]
1946unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Box<T, A> {
1947    #[inline]
1948    fn drop(&mut self) {
1949        // the T in the Box is dropped by the compiler before the destructor is run
1950
1951        let ptr = self.0;
1952
1953        unsafe {
1954            let layout = Layout::for_value_raw(ptr.as_ptr());
1955            if layout.size() != 0 {
1956                self.1.deallocate(From::from(ptr.cast()), layout);
1957            }
1958        }
1959    }
1960}
1961
1962#[cfg(not(no_global_oom_handling))]
1963#[stable(feature = "rust1", since = "1.0.0")]
1964impl<T: Default> Default for Box<T> {
1965    /// Creates a `Box<T>`, with the `Default` value for `T`.
1966    #[inline]
1967    fn default() -> Self {
1968        let mut x: Box<mem::MaybeUninit<T>> = Box::new_uninit();
1969        unsafe {
1970            // SAFETY: `x` is valid for writing and has the same layout as `T`.
1971            // If `T::default()` panics, dropping `x` will just deallocate the Box as `MaybeUninit<T>`
1972            // does not have a destructor.
1973            //
1974            // We use `ptr::write` as `MaybeUninit::write` creates
1975            // extra stack copies of `T` in debug mode.
1976            //
1977            // See https://github.com/rust-lang/rust/issues/136043 for more context.
1978            ptr::write(&raw mut *x as *mut T, T::default());
1979            // SAFETY: `x` was just initialized above.
1980            x.assume_init()
1981        }
1982    }
1983}
1984
1985#[cfg(not(no_global_oom_handling))]
1986#[stable(feature = "rust1", since = "1.0.0")]
1987impl<T> Default for Box<[T]> {
1988    /// Creates an empty `[T]` inside a `Box`.
1989    #[inline]
1990    fn default() -> Self {
1991        let ptr: Unique<[T]> = Unique::<[T; 0]>::dangling();
1992        Box(ptr, Global)
1993    }
1994}
1995
1996#[cfg(not(no_global_oom_handling))]
1997#[stable(feature = "default_box_extra", since = "1.17.0")]
1998impl Default for Box<str> {
1999    #[inline]
2000    fn default() -> Self {
2001        // SAFETY: This is the same as `Unique::cast<U>` but with an unsized `U = str`.
2002        let ptr: Unique<str> = unsafe {
2003            let bytes: Unique<[u8]> = Unique::<[u8; 0]>::dangling();
2004            Unique::new_unchecked(bytes.as_ptr() as *mut str)
2005        };
2006        Box(ptr, Global)
2007    }
2008}
2009
2010#[cfg(not(no_global_oom_handling))]
2011#[stable(feature = "pin_default_impls", since = "1.91.0")]
2012impl<T> Default for Pin<Box<T>>
2013where
2014    T: ?Sized,
2015    Box<T>: Default,
2016{
2017    #[inline]
2018    fn default() -> Self {
2019        Box::into_pin(Box::<T>::default())
2020    }
2021}
2022
2023#[cfg(not(no_global_oom_handling))]
2024#[stable(feature = "rust1", since = "1.0.0")]
2025impl<T: Clone, A: Allocator + Clone> Clone for Box<T, A> {
2026    /// Returns a new box with a `clone()` of this box's contents.
2027    ///
2028    /// # Examples
2029    ///
2030    /// ```
2031    /// let x = Box::new(5);
2032    /// let y = x.clone();
2033    ///
2034    /// // The value is the same
2035    /// assert_eq!(x, y);
2036    ///
2037    /// // But they are unique objects
2038    /// assert_ne!(&*x as *const i32, &*y as *const i32);
2039    /// ```
2040    #[inline]
2041    fn clone(&self) -> Self {
2042        // Pre-allocate memory to allow writing the cloned value directly.
2043        let mut boxed = Self::new_uninit_in(self.1.clone());
2044        unsafe {
2045            (**self).clone_to_uninit(boxed.as_mut_ptr().cast());
2046            boxed.assume_init()
2047        }
2048    }
2049
2050    /// Copies `source`'s contents into `self` without creating a new allocation.
2051    ///
2052    /// # Examples
2053    ///
2054    /// ```
2055    /// let x = Box::new(5);
2056    /// let mut y = Box::new(10);
2057    /// let yp: *const i32 = &*y;
2058    ///
2059    /// y.clone_from(&x);
2060    ///
2061    /// // The value is the same
2062    /// assert_eq!(x, y);
2063    ///
2064    /// // And no allocation occurred
2065    /// assert_eq!(yp, &*y);
2066    /// ```
2067    #[inline]
2068    fn clone_from(&mut self, source: &Self) {
2069        (**self).clone_from(&(**source));
2070    }
2071}
2072
2073#[cfg(not(no_global_oom_handling))]
2074#[stable(feature = "box_slice_clone", since = "1.3.0")]
2075impl<T: Clone, A: Allocator + Clone> Clone for Box<[T], A> {
2076    fn clone(&self) -> Self {
2077        let alloc = Box::allocator(self).clone();
2078        self.to_vec_in(alloc).into_boxed_slice()
2079    }
2080
2081    /// Copies `source`'s contents into `self` without creating a new allocation,
2082    /// so long as the two are of the same length.
2083    ///
2084    /// # Examples
2085    ///
2086    /// ```
2087    /// let x = Box::new([5, 6, 7]);
2088    /// let mut y = Box::new([8, 9, 10]);
2089    /// let yp: *const [i32] = &*y;
2090    ///
2091    /// y.clone_from(&x);
2092    ///
2093    /// // The value is the same
2094    /// assert_eq!(x, y);
2095    ///
2096    /// // And no allocation occurred
2097    /// assert_eq!(yp, &*y);
2098    /// ```
2099    fn clone_from(&mut self, source: &Self) {
2100        if self.len() == source.len() {
2101            self.clone_from_slice(&source);
2102        } else {
2103            *self = source.clone();
2104        }
2105    }
2106}
2107
2108#[cfg(not(no_global_oom_handling))]
2109#[stable(feature = "box_slice_clone", since = "1.3.0")]
2110impl Clone for Box<str> {
2111    fn clone(&self) -> Self {
2112        // this makes a copy of the data
2113        let buf: Box<[u8]> = self.as_bytes().into();
2114        unsafe { from_boxed_utf8_unchecked(buf) }
2115    }
2116}
2117
2118#[stable(feature = "rust1", since = "1.0.0")]
2119impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for Box<T, A> {
2120    #[inline]
2121    fn eq(&self, other: &Self) -> bool {
2122        PartialEq::eq(&**self, &**other)
2123    }
2124    #[inline]
2125    fn ne(&self, other: &Self) -> bool {
2126        PartialEq::ne(&**self, &**other)
2127    }
2128}
2129
2130#[stable(feature = "rust1", since = "1.0.0")]
2131impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for Box<T, A> {
2132    #[inline]
2133    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
2134        PartialOrd::partial_cmp(&**self, &**other)
2135    }
2136    #[inline]
2137    fn lt(&self, other: &Self) -> bool {
2138        PartialOrd::lt(&**self, &**other)
2139    }
2140    #[inline]
2141    fn le(&self, other: &Self) -> bool {
2142        PartialOrd::le(&**self, &**other)
2143    }
2144    #[inline]
2145    fn ge(&self, other: &Self) -> bool {
2146        PartialOrd::ge(&**self, &**other)
2147    }
2148    #[inline]
2149    fn gt(&self, other: &Self) -> bool {
2150        PartialOrd::gt(&**self, &**other)
2151    }
2152}
2153
2154#[stable(feature = "rust1", since = "1.0.0")]
2155impl<T: ?Sized + Ord, A: Allocator> Ord for Box<T, A> {
2156    #[inline]
2157    fn cmp(&self, other: &Self) -> Ordering {
2158        Ord::cmp(&**self, &**other)
2159    }
2160}
2161
2162#[stable(feature = "rust1", since = "1.0.0")]
2163impl<T: ?Sized + Eq, A: Allocator> Eq for Box<T, A> {}
2164
2165#[stable(feature = "rust1", since = "1.0.0")]
2166impl<T: ?Sized + Hash, A: Allocator> Hash for Box<T, A> {
2167    fn hash<H: Hasher>(&self, state: &mut H) {
2168        (**self).hash(state);
2169    }
2170}
2171
2172#[stable(feature = "indirect_hasher_impl", since = "1.22.0")]
2173impl<T: ?Sized + Hasher, A: Allocator> Hasher for Box<T, A> {
2174    fn finish(&self) -> u64 {
2175        (**self).finish()
2176    }
2177    fn write(&mut self, bytes: &[u8]) {
2178        (**self).write(bytes)
2179    }
2180    fn write_u8(&mut self, i: u8) {
2181        (**self).write_u8(i)
2182    }
2183    fn write_u16(&mut self, i: u16) {
2184        (**self).write_u16(i)
2185    }
2186    fn write_u32(&mut self, i: u32) {
2187        (**self).write_u32(i)
2188    }
2189    fn write_u64(&mut self, i: u64) {
2190        (**self).write_u64(i)
2191    }
2192    fn write_u128(&mut self, i: u128) {
2193        (**self).write_u128(i)
2194    }
2195    fn write_usize(&mut self, i: usize) {
2196        (**self).write_usize(i)
2197    }
2198    fn write_i8(&mut self, i: i8) {
2199        (**self).write_i8(i)
2200    }
2201    fn write_i16(&mut self, i: i16) {
2202        (**self).write_i16(i)
2203    }
2204    fn write_i32(&mut self, i: i32) {
2205        (**self).write_i32(i)
2206    }
2207    fn write_i64(&mut self, i: i64) {
2208        (**self).write_i64(i)
2209    }
2210    fn write_i128(&mut self, i: i128) {
2211        (**self).write_i128(i)
2212    }
2213    fn write_isize(&mut self, i: isize) {
2214        (**self).write_isize(i)
2215    }
2216    fn write_length_prefix(&mut self, len: usize) {
2217        (**self).write_length_prefix(len)
2218    }
2219    fn write_str(&mut self, s: &str) {
2220        (**self).write_str(s)
2221    }
2222}
2223
2224#[stable(feature = "rust1", since = "1.0.0")]
2225impl<T: fmt::Display + ?Sized, A: Allocator> fmt::Display for Box<T, A> {
2226    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
2227        fmt::Display::fmt(&**self, f)
2228    }
2229}
2230
2231#[stable(feature = "rust1", since = "1.0.0")]
2232impl<T: fmt::Debug + ?Sized, A: Allocator> fmt::Debug for Box<T, A> {
2233    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
2234        fmt::Debug::fmt(&**self, f)
2235    }
2236}
2237
2238#[stable(feature = "rust1", since = "1.0.0")]
2239impl<T: ?Sized, A: Allocator> fmt::Pointer for Box<T, A> {
2240    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
2241        // It's not possible to extract the inner Uniq directly from the Box,
2242        // instead we cast it to a *const which aliases the Unique
2243        let ptr: *const T = &**self;
2244        fmt::Pointer::fmt(&ptr, f)
2245    }
2246}
2247
2248#[stable(feature = "rust1", since = "1.0.0")]
2249impl<T: ?Sized, A: Allocator> Deref for Box<T, A> {
2250    type Target = T;
2251
2252    fn deref(&self) -> &T {
2253        &**self
2254    }
2255}
2256
2257#[stable(feature = "rust1", since = "1.0.0")]
2258impl<T: ?Sized, A: Allocator> DerefMut for Box<T, A> {
2259    fn deref_mut(&mut self) -> &mut T {
2260        &mut **self
2261    }
2262}
2263
2264#[unstable(feature = "deref_pure_trait", issue = "87121")]
2265unsafe impl<T: ?Sized, A: Allocator> DerefPure for Box<T, A> {}
2266
2267#[unstable(feature = "legacy_receiver_trait", issue = "none")]
2268impl<T: ?Sized, A: Allocator> LegacyReceiver for Box<T, A> {}
2269
2270#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2271impl<Args: Tuple, F: FnOnce<Args> + ?Sized, A: Allocator> FnOnce<Args> for Box<F, A> {
2272    type Output = <F as FnOnce<Args>>::Output;
2273
2274    extern "rust-call" fn call_once(self, args: Args) -> Self::Output {
2275        <F as FnOnce<Args>>::call_once(*self, args)
2276    }
2277}
2278
2279#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2280impl<Args: Tuple, F: FnMut<Args> + ?Sized, A: Allocator> FnMut<Args> for Box<F, A> {
2281    extern "rust-call" fn call_mut(&mut self, args: Args) -> Self::Output {
2282        <F as FnMut<Args>>::call_mut(self, args)
2283    }
2284}
2285
2286#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2287impl<Args: Tuple, F: Fn<Args> + ?Sized, A: Allocator> Fn<Args> for Box<F, A> {
2288    extern "rust-call" fn call(&self, args: Args) -> Self::Output {
2289        <F as Fn<Args>>::call(self, args)
2290    }
2291}
2292
2293#[stable(feature = "async_closure", since = "1.85.0")]
2294impl<Args: Tuple, F: AsyncFnOnce<Args> + ?Sized, A: Allocator> AsyncFnOnce<Args> for Box<F, A> {
2295    type Output = F::Output;
2296    type CallOnceFuture = F::CallOnceFuture;
2297
2298    extern "rust-call" fn async_call_once(self, args: Args) -> Self::CallOnceFuture {
2299        F::async_call_once(*self, args)
2300    }
2301}
2302
2303#[stable(feature = "async_closure", since = "1.85.0")]
2304impl<Args: Tuple, F: AsyncFnMut<Args> + ?Sized, A: Allocator> AsyncFnMut<Args> for Box<F, A> {
2305    type CallRefFuture<'a>
2306        = F::CallRefFuture<'a>
2307    where
2308        Self: 'a;
2309
2310    extern "rust-call" fn async_call_mut(&mut self, args: Args) -> Self::CallRefFuture<'_> {
2311        F::async_call_mut(self, args)
2312    }
2313}
2314
2315#[stable(feature = "async_closure", since = "1.85.0")]
2316impl<Args: Tuple, F: AsyncFn<Args> + ?Sized, A: Allocator> AsyncFn<Args> for Box<F, A> {
2317    extern "rust-call" fn async_call(&self, args: Args) -> Self::CallRefFuture<'_> {
2318        F::async_call(self, args)
2319    }
2320}
2321
2322#[unstable(feature = "coerce_unsized", issue = "18598")]
2323impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Box<U, A>> for Box<T, A> {}
2324
2325#[unstable(feature = "pin_coerce_unsized_trait", issue = "150112")]
2326unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Box<T, A> {}
2327
2328// It is quite crucial that we only allow the `Global` allocator here.
2329// Handling arbitrary custom allocators (which can affect the `Box` layout heavily!)
2330// would need a lot of codegen and interpreter adjustments.
2331#[unstable(feature = "dispatch_from_dyn", issue = "none")]
2332impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Box<U>> for Box<T, Global> {}
2333
2334#[stable(feature = "box_borrow", since = "1.1.0")]
2335impl<T: ?Sized, A: Allocator> Borrow<T> for Box<T, A> {
2336    fn borrow(&self) -> &T {
2337        &**self
2338    }
2339}
2340
2341#[stable(feature = "box_borrow", since = "1.1.0")]
2342impl<T: ?Sized, A: Allocator> BorrowMut<T> for Box<T, A> {
2343    fn borrow_mut(&mut self) -> &mut T {
2344        &mut **self
2345    }
2346}
2347
2348#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2349impl<T: ?Sized, A: Allocator> AsRef<T> for Box<T, A> {
2350    fn as_ref(&self) -> &T {
2351        &**self
2352    }
2353}
2354
2355#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2356impl<T: ?Sized, A: Allocator> AsMut<T> for Box<T, A> {
2357    fn as_mut(&mut self) -> &mut T {
2358        &mut **self
2359    }
2360}
2361
2362/* Nota bene
2363 *
2364 *  We could have chosen not to add this impl, and instead have written a
2365 *  function of Pin<Box<T>> to Pin<T>. Such a function would not be sound,
2366 *  because Box<T> implements Unpin even when T does not, as a result of
2367 *  this impl.
2368 *
2369 *  We chose this API instead of the alternative for a few reasons:
2370 *      - Logically, it is helpful to understand pinning in regard to the
2371 *        memory region being pointed to. For this reason none of the
2372 *        standard library pointer types support projecting through a pin
2373 *        (Box<T> is the only pointer type in std for which this would be
2374 *        safe.)
2375 *      - It is in practice very useful to have Box<T> be unconditionally
2376 *        Unpin because of trait objects, for which the structural auto
2377 *        trait functionality does not apply (e.g., Box<dyn Foo> would
2378 *        otherwise not be Unpin).
2379 *
2380 *  Another type with the same semantics as Box but only a conditional
2381 *  implementation of `Unpin` (where `T: Unpin`) would be valid/safe, and
2382 *  could have a method to project a Pin<T> from it.
2383 */
2384#[stable(feature = "pin", since = "1.33.0")]
2385impl<T: ?Sized, A: Allocator> Unpin for Box<T, A> {}
2386
2387#[unstable(feature = "coroutine_trait", issue = "43122")]
2388impl<G: ?Sized + Coroutine<R> + Unpin, R, A: Allocator> Coroutine<R> for Box<G, A> {
2389    type Yield = G::Yield;
2390    type Return = G::Return;
2391
2392    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2393        G::resume(Pin::new(&mut *self), arg)
2394    }
2395}
2396
2397#[unstable(feature = "coroutine_trait", issue = "43122")]
2398impl<G: ?Sized + Coroutine<R>, R, A: Allocator> Coroutine<R> for Pin<Box<G, A>>
2399where
2400    A: 'static,
2401{
2402    type Yield = G::Yield;
2403    type Return = G::Return;
2404
2405    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2406        G::resume((*self).as_mut(), arg)
2407    }
2408}
2409
2410#[stable(feature = "futures_api", since = "1.36.0")]
2411impl<F: ?Sized + Future + Unpin, A: Allocator> Future for Box<F, A> {
2412    type Output = F::Output;
2413
2414    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
2415        F::poll(Pin::new(&mut *self), cx)
2416    }
2417}
2418
2419#[stable(feature = "box_error", since = "1.8.0")]
2420impl<E: Error> Error for Box<E> {
2421    #[allow(deprecated)]
2422    fn cause(&self) -> Option<&dyn Error> {
2423        Error::cause(&**self)
2424    }
2425
2426    fn source(&self) -> Option<&(dyn Error + 'static)> {
2427        Error::source(&**self)
2428    }
2429
2430    fn provide<'b>(&'b self, request: &mut error::Request<'b>) {
2431        Error::provide(&**self, request);
2432    }
2433}
2434
2435#[unstable(feature = "allocator_api", issue = "32838")]
2436unsafe impl<T: ?Sized + Allocator, A: Allocator> Allocator for Box<T, A> {
2437    #[inline]
2438    fn allocate(&self, layout: Layout) -> Result<NonNull<[u8]>, AllocError> {
2439        (**self).allocate(layout)
2440    }
2441
2442    #[inline]
2443    fn allocate_zeroed(&self, layout: Layout) -> Result<NonNull<[u8]>, AllocError> {
2444        (**self).allocate_zeroed(layout)
2445    }
2446
2447    #[inline]
2448    unsafe fn deallocate(&self, ptr: NonNull<u8>, layout: Layout) {
2449        // SAFETY: the safety contract must be upheld by the caller
2450        unsafe { (**self).deallocate(ptr, layout) }
2451    }
2452
2453    #[inline]
2454    unsafe fn grow(
2455        &self,
2456        ptr: NonNull<u8>,
2457        old_layout: Layout,
2458        new_layout: Layout,
2459    ) -> Result<NonNull<[u8]>, AllocError> {
2460        // SAFETY: the safety contract must be upheld by the caller
2461        unsafe { (**self).grow(ptr, old_layout, new_layout) }
2462    }
2463
2464    #[inline]
2465    unsafe fn grow_zeroed(
2466        &self,
2467        ptr: NonNull<u8>,
2468        old_layout: Layout,
2469        new_layout: Layout,
2470    ) -> Result<NonNull<[u8]>, AllocError> {
2471        // SAFETY: the safety contract must be upheld by the caller
2472        unsafe { (**self).grow_zeroed(ptr, old_layout, new_layout) }
2473    }
2474
2475    #[inline]
2476    unsafe fn shrink(
2477        &self,
2478        ptr: NonNull<u8>,
2479        old_layout: Layout,
2480        new_layout: Layout,
2481    ) -> Result<NonNull<[u8]>, AllocError> {
2482        // SAFETY: the safety contract must be upheld by the caller
2483        unsafe { (**self).shrink(ptr, old_layout, new_layout) }
2484    }
2485}