Rollup of 7 pull requests #76637
Merged 23 commits on Sep 12, 2020
Changes from all commits (23):
ccf1f58
rustdoc: fix min_const_generics with ty::Param
lcnr Sep 3, 2020
b869aa5
Add saturating methods for `Duration`
marmeladema Aug 30, 2020
75e471a
Add MaybeUninit::drop.
m-ou-se Sep 8, 2020
caef832
Fix doc comment on MaybeUninit::drop.
m-ou-se Sep 8, 2020
656a17b
Rename MaybeUninit::drop to assume_init_drop.
m-ou-se Sep 9, 2020
a14efd1
Rename MaybeUninit::read to assume_init_read.
m-ou-se Sep 9, 2020
a94b2cb
Add safety docs about T's invariants in MaybeUninit::assume_init_drop.
m-ou-se Sep 9, 2020
43c7a9b
Fix broken doc links in MaybeUninit.
m-ou-se Sep 9, 2020
493c037
Eliminate mut reference UB in Drop impl for Rc<T>
carbotaniuman Sep 9, 2020
8f43fa0
Add WeakInner<'_> and have Weak::inner() return it
carbotaniuman Sep 9, 2020
bb57c9f
Format
carbotaniuman Sep 9, 2020
954361a
Update `std::os` module documentation.
CDirkx Sep 11, 2020
9abc6bd
Add revisions to const generic const_evaluatable_checked tests.
hameerabbasi Sep 11, 2020
5e188f5
Add revisions to const generic type-dependent UI tests.
hameerabbasi Sep 11, 2020
b729368
Address review comments
carbotaniuman Sep 11, 2020
bb9ce7c
Add missing examples on binary core traits
GuillaumeGomez Sep 11, 2020
7344f93
Rollup merge of #76114 - marmeladema:duration-saturating-ops, r=shepm…
RalfJung Sep 12, 2020
5d90d6e
Rollup merge of #76297 - lcnr:const-ty-alias, r=varkor
RalfJung Sep 12, 2020
c20356e
Rollup merge of #76484 - fusion-engineering-forks:maybe-uninit-drop, …
RalfJung Sep 12, 2020
a49451c
Rollup merge of #76530 - carbotaniuman:fix-rc, r=RalfJung
RalfJung Sep 12, 2020
2477f07
Rollup merge of #76583 - CDirkx:os-doc, r=jonas-schievink
RalfJung Sep 12, 2020
90c5b8f
Rollup merge of #76599 - hameerabbasi:const-generics-revs, r=lcnr
RalfJung Sep 12, 2020
0ed4bc5
Rollup merge of #76615 - GuillaumeGomez:missing-examples-binary-ops, …
RalfJung Sep 12, 2020
113 changes: 75 additions & 38 deletions library/alloc/src/rc.rs
@@ -295,6 +295,13 @@ impl<T: ?Sized + Unsize<U>, U: ?Sized> CoerceUnsized<Rc<U>> for Rc<T> {}
impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Rc<U>> for Rc<T> {}

impl<T: ?Sized> Rc<T> {
#[inline(always)]
fn inner(&self) -> &RcBox<T> {
// This unsafety is ok because while this Rc is alive we're guaranteed
// that the inner pointer is valid.
unsafe { self.ptr.as_ref() }
}

fn from_inner(ptr: NonNull<RcBox<T>>) -> Self {
Self { ptr, phantom: PhantomData }
}
@@ -469,7 +476,7 @@ impl<T> Rc<T> {
// the strong count, and then remove the implicit "strong weak"
// pointer while also handling drop logic by just crafting a
// fake Weak.
this.dec_strong();
this.inner().dec_strong();
let _weak = Weak { ptr: this.ptr };
forget(this);
Ok(val)
@@ -735,7 +742,7 @@ impl<T: ?Sized> Rc<T> {
/// ```
#[stable(feature = "rc_weak", since = "1.4.0")]
pub fn downgrade(this: &Self) -> Weak<T> {
this.inc_weak();
this.inner().inc_weak();
// Make sure we do not create a dangling Weak
debug_assert!(!is_dangling(this.ptr));
Weak { ptr: this.ptr }
@@ -756,7 +763,7 @@
#[inline]
#[stable(feature = "rc_counts", since = "1.15.0")]
pub fn weak_count(this: &Self) -> usize {
this.weak() - 1
this.inner().weak() - 1
}

/// Gets the number of strong (`Rc`) pointers to this allocation.
@@ -774,7 +781,7 @@
#[inline]
#[stable(feature = "rc_counts", since = "1.15.0")]
pub fn strong_count(this: &Self) -> usize {
this.strong()
this.inner().strong()
}

/// Returns `true` if there are no other `Rc` or [`Weak`] pointers to
@@ -844,7 +851,9 @@ impl<T: ?Sized> Rc<T> {
#[inline]
#[unstable(feature = "get_mut_unchecked", issue = "63292")]
pub unsafe fn get_mut_unchecked(this: &mut Self) -> &mut T {
unsafe { &mut this.ptr.as_mut().value }
// We are careful to *not* create a reference covering the "count" fields, as
// this would conflict with accesses to the reference counts (e.g. by `Weak`).
unsafe { &mut (*this.ptr.as_ptr()).value }
}

#[inline]
@@ -931,10 +940,10 @@ impl<T: Clone> Rc<T> {
unsafe {
let mut swap = Rc::new(ptr::read(&this.ptr.as_ref().value));
mem::swap(this, &mut swap);
swap.dec_strong();
swap.inner().dec_strong();
// Remove implicit strong-weak ref (no need to craft a fake
// Weak here -- we know other Weaks can clean up for us)
swap.dec_weak();
swap.inner().dec_weak();
forget(swap);
}
}
@@ -1192,16 +1201,16 @@ unsafe impl<#[may_dangle] T: ?Sized> Drop for Rc<T> {
/// ```
fn drop(&mut self) {
unsafe {
self.dec_strong();
if self.strong() == 0 {
self.inner().dec_strong();
if self.inner().strong() == 0 {
// destroy the contained object
ptr::drop_in_place(self.ptr.as_mut());
ptr::drop_in_place(Self::get_mut_unchecked(self));

// remove the implicit "strong weak" pointer now that we've
// destroyed the contents.
self.dec_weak();
self.inner().dec_weak();

if self.weak() == 0 {
if self.inner().weak() == 0 {
Global.dealloc(self.ptr.cast(), Layout::for_value(self.ptr.as_ref()));
}
}
@@ -1227,7 +1236,7 @@ impl<T: ?Sized> Clone for Rc<T> {
/// ```
#[inline]
fn clone(&self) -> Rc<T> {
self.inc_strong();
self.inner().inc_strong();
Self::from_inner(self.ptr)
}
}
@@ -1851,6 +1860,13 @@ pub(crate) fn is_dangling<T: ?Sized>(ptr: NonNull<T>) -> bool {
address == usize::MAX
}

/// Helper type to allow accessing the reference counts without
/// making any assertions about the data field.
struct WeakInner<'a> {
weak: &'a Cell<usize>,
strong: &'a Cell<usize>,
}

impl<T: ?Sized> Weak<T> {
/// Attempts to upgrade the `Weak` pointer to an [`Rc`], delaying
/// dropping of the inner value if successful.
@@ -1910,11 +1926,21 @@ impl<T: ?Sized> Weak<T> {
.unwrap_or(0)
}

/// Returns `None` when the pointer is dangling and there is no allocated `RcBox`
/// Returns `None` when the pointer is dangling and there is no allocated `RcBox`,
/// (i.e., when this `Weak` was created by `Weak::new`).
#[inline]
fn inner(&self) -> Option<&RcBox<T>> {
if is_dangling(self.ptr) { None } else { Some(unsafe { self.ptr.as_ref() }) }
fn inner(&self) -> Option<WeakInner<'_>> {
if is_dangling(self.ptr) {
None
} else {
// We are careful to *not* create a reference covering the "data" field, as
// the field may be mutated concurrently (for example, if the last `Rc`
// is dropped, the data field will be dropped in-place).
Some(unsafe {
let ptr = self.ptr.as_ptr();
WeakInner { strong: &(*ptr).strong, weak: &(*ptr).weak }
})
}
}

/// Returns `true` if the two `Weak`s point to the same allocation (similar to
@@ -1992,14 +2018,14 @@ impl<T: ?Sized> Drop for Weak<T> {
/// assert!(other_weak_foo.upgrade().is_none());
/// ```
fn drop(&mut self) {
if let Some(inner) = self.inner() {
inner.dec_weak();
// the weak count starts at 1, and will only go to zero if all
// the strong pointers have disappeared.
if inner.weak() == 0 {
unsafe {
Global.dealloc(self.ptr.cast(), Layout::for_value(self.ptr.as_ref()));
}
let inner = if let Some(inner) = self.inner() { inner } else { return };

inner.dec_weak();
// the weak count starts at 1, and will only go to zero if all
// the strong pointers have disappeared.
if inner.weak() == 0 {
unsafe {
Global.dealloc(self.ptr.cast(), Layout::for_value(self.ptr.as_ref()));
}
}
}
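The early-return rewrite of `Weak`'s `Drop` above is behavior-preserving. As a reference point, here is a sketch of the two cases it distinguishes (a dangling `Weak` from `Weak::new` versus one backed by a real allocation), using only the stable public APIs rather than the internals touched by this diff:

```rust
use std::rc::{Rc, Weak};

fn main() {
    // A `Weak` created by `Weak::new` is dangling: there is no allocation,
    // so `inner()` returns `None` and dropping it is a no-op.
    let empty: Weak<i32> = Weak::new();
    assert!(empty.upgrade().is_none());

    // A `Weak` obtained from a live `Rc` keeps the allocation (but not the
    // value) alive; once the last `Rc` is gone, `upgrade` fails.
    let strong = Rc::new(5);
    let weak = Rc::downgrade(&strong);
    assert_eq!(*weak.upgrade().unwrap(), 5);
    drop(strong);
    assert!(weak.upgrade().is_none());
}
```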
@@ -2065,12 +2091,13 @@ impl<T> Default for Weak<T> {
// clone these much in Rust thanks to ownership and move-semantics.

#[doc(hidden)]
trait RcBoxPtr<T: ?Sized> {
fn inner(&self) -> &RcBox<T>;
trait RcInnerPtr {
fn weak_ref(&self) -> &Cell<usize>;
fn strong_ref(&self) -> &Cell<usize>;

#[inline]
fn strong(&self) -> usize {
self.inner().strong.get()
self.strong_ref().get()
}

#[inline]
Expand All @@ -2084,17 +2111,17 @@ trait RcBoxPtr<T: ?Sized> {
if strong == 0 || strong == usize::MAX {
abort();
}
self.inner().strong.set(strong + 1);
self.strong_ref().set(strong + 1);
}

#[inline]
fn dec_strong(&self) {
self.inner().strong.set(self.strong() - 1);
self.strong_ref().set(self.strong() - 1);
}

#[inline]
fn weak(&self) -> usize {
self.inner().weak.get()
self.weak_ref().get()
}

#[inline]
Expand All @@ -2108,26 +2135,36 @@ trait RcBoxPtr<T: ?Sized> {
if weak == 0 || weak == usize::MAX {
abort();
}
self.inner().weak.set(weak + 1);
self.weak_ref().set(weak + 1);
}

#[inline]
fn dec_weak(&self) {
self.inner().weak.set(self.weak() - 1);
self.weak_ref().set(self.weak() - 1);
}
}

impl<T: ?Sized> RcBoxPtr<T> for Rc<T> {
impl<T: ?Sized> RcInnerPtr for RcBox<T> {
#[inline(always)]
fn inner(&self) -> &RcBox<T> {
unsafe { self.ptr.as_ref() }
fn weak_ref(&self) -> &Cell<usize> {
&self.weak
}

#[inline(always)]
fn strong_ref(&self) -> &Cell<usize> {
&self.strong
}
}

impl<T: ?Sized> RcBoxPtr<T> for RcBox<T> {
impl<'a> RcInnerPtr for WeakInner<'a> {
#[inline(always)]
fn inner(&self) -> &RcBox<T> {
self
fn weak_ref(&self) -> &Cell<usize> {
self.weak
}

#[inline(always)]
fn strong_ref(&self) -> &Cell<usize> {
self.strong
}
}
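For reference, the counter movements that these `RcInnerPtr` helpers implement (`inc_strong`, `dec_strong`, `inc_weak`, `dec_weak`) are observable through the stable counting APIs. A small sketch:

```rust
use std::rc::Rc;

fn main() {
    let a = Rc::new(String::from("hello"));
    assert_eq!(Rc::strong_count(&a), 1);

    let b = Rc::clone(&a);     // inc_strong
    assert_eq!(Rc::strong_count(&a), 2);

    let w = Rc::downgrade(&a); // inc_weak
    assert_eq!(Rc::weak_count(&a), 1);

    drop(b);                   // dec_strong
    drop(w);                   // dec_weak
    assert_eq!(Rc::strong_count(&a), 1);
    assert_eq!(Rc::weak_count(&a), 0);
}
```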

4 changes: 2 additions & 2 deletions library/core/src/array/iter.rs
@@ -103,7 +103,7 @@ impl<T, const N: usize> Iterator for IntoIter<T, N> {
// dead now (i.e. do not touch). As `idx` was the start of the
// alive-zone, the alive zone is now `data[alive]` again, restoring
// all invariants.
unsafe { self.data.get_unchecked(idx).read() }
unsafe { self.data.get_unchecked(idx).assume_init_read() }
})
}

@@ -136,7 +136,7 @@ impl<T, const N: usize> DoubleEndedIterator for IntoIter<T, N> {
// dead now (i.e. do not touch). As `idx` was the end of the
// alive-zone, the alive zone is now `data[alive]` again, restoring
// all invariants.
unsafe { self.data.get_unchecked(idx).read() }
unsafe { self.data.get_unchecked(idx).assume_init_read() }
})
}
}
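The rename from `read` to `assume_init_read` here is mechanical. A minimal sketch of the pattern the iterator relies on (reading a slot by value and treating it as dead afterwards), using the public `MaybeUninit` API rather than the iterator internals:

```rust
use std::mem::MaybeUninit;

fn main() {
    let mut slot = MaybeUninit::<String>::uninit();
    slot.write(String::from("alive"));

    // SAFETY: `slot` was just initialized, and we read it only once,
    // treating the slot as uninitialized afterwards (mirroring the
    // alive-zone bookkeeping in `IntoIter`).
    let value = unsafe { slot.assume_init_read() };
    assert_eq!(value, "alive");
    // `slot` must now be considered uninitialized again.
}
```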
1 change: 1 addition & 0 deletions library/core/src/lib.rs
@@ -100,6 +100,7 @@
#![feature(doc_cfg)]
#![feature(doc_spotlight)]
#![feature(duration_consts_2)]
#![feature(duration_saturating_ops)]
#![feature(extern_types)]
#![feature(fundamental)]
#![feature(intrinsics)]
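The new `duration_saturating_ops` gate covers the saturating `Duration` methods added in #76114 (`saturating_add`, `saturating_sub`, `saturating_mul`), which have since been stabilized. A sketch of their semantics: instead of panicking on overflow like the checked `+`/`-`/`*` operators, they clamp at the representable extremes.

```rust
use std::time::Duration;

fn main() {
    let one = Duration::from_secs(1);

    // Clamp at the top instead of panicking on overflow.
    assert_eq!(Duration::MAX.saturating_add(one), Duration::MAX);
    assert_eq!(Duration::MAX.saturating_mul(2), Duration::MAX);

    // Clamp at zero instead of panicking on underflow.
    assert_eq!(Duration::ZERO.saturating_sub(one), Duration::ZERO);
}
```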
49 changes: 40 additions & 9 deletions library/core/src/mem/maybe_uninit.rs
@@ -2,6 +2,7 @@ use crate::any::type_name;
use crate::fmt;
use crate::intrinsics;
use crate::mem::ManuallyDrop;
use crate::ptr;

/// A wrapper type to construct uninitialized instances of `T`.
///
@@ -471,6 +472,8 @@ impl<T> MaybeUninit<T> {
/// *immediate* undefined behavior, but will cause undefined behavior with most
/// safe operations (including dropping it).
///
/// [`Vec<T>`]: ../../std/vec/struct.Vec.html
///
/// # Examples
///
/// Correct usage of this method:
@@ -519,8 +522,8 @@
/// this initialization invariant.
///
/// Moreover, this leaves a copy of the same data behind in the `MaybeUninit<T>`. When using
/// multiple copies of the data (by calling `read` multiple times, or first
/// calling `read` and then [`assume_init`]), it is your responsibility
/// multiple copies of the data (by calling `assume_init_read` multiple times, or first
/// calling `assume_init_read` and then [`assume_init`]), it is your responsibility
/// to ensure that that data may indeed be duplicated.
///
/// [inv]: #initialization-invariant
@@ -536,16 +539,16 @@
///
/// let mut x = MaybeUninit::<u32>::uninit();
/// x.write(13);
/// let x1 = unsafe { x.read() };
/// let x1 = unsafe { x.assume_init_read() };
/// // `u32` is `Copy`, so we may read multiple times.
/// let x2 = unsafe { x.read() };
/// let x2 = unsafe { x.assume_init_read() };
/// assert_eq!(x1, x2);
///
/// let mut x = MaybeUninit::<Option<Vec<u32>>>::uninit();
/// x.write(None);
/// let x1 = unsafe { x.read() };
/// let x1 = unsafe { x.assume_init_read() };
/// // Duplicating a `None` value is okay, so we may read multiple times.
/// let x2 = unsafe { x.read() };
/// let x2 = unsafe { x.assume_init_read() };
/// assert_eq!(x1, x2);
/// ```
///
@@ -557,14 +560,14 @@
///
/// let mut x = MaybeUninit::<Option<Vec<u32>>>::uninit();
/// x.write(Some(vec![0,1,2]));
/// let x1 = unsafe { x.read() };
/// let x2 = unsafe { x.read() };
/// let x1 = unsafe { x.assume_init_read() };
/// let x2 = unsafe { x.assume_init_read() };
/// // We now created two copies of the same vector, leading to a double-free ⚠️ when
/// // they both get dropped!
/// ```
#[unstable(feature = "maybe_uninit_extra", issue = "63567")]
#[inline(always)]
pub unsafe fn read(&self) -> T {
pub unsafe fn assume_init_read(&self) -> T {
// SAFETY: the caller must guarantee that `self` is initialized.
// Reading from `self.as_ptr()` is safe since `self` should be initialized.
unsafe {
@@ -573,6 +576,34 @@
}
}

/// Drops the contained value in place.
///
/// If you have ownership of the `MaybeUninit`, you can use [`assume_init`] instead.
///
/// # Safety
///
/// It is up to the caller to guarantee that the `MaybeUninit<T>` really is
/// in an initialized state. Calling this when the content is not yet fully
/// initialized causes undefined behavior.
///
/// On top of that, all additional invariants of the type `T` must be
/// satisfied, as the `Drop` implementation of `T` (or its members) may
/// rely on this. For example, a `1`-initialized [`Vec<T>`] is considered
/// initialized (under the current implementation; this does not constitute
/// a stable guarantee) because the only requirement the compiler knows
/// about it is that the data pointer must be non-null. Dropping such a
/// `Vec<T>` however will cause undefined behavior.
///
/// [`assume_init`]: MaybeUninit::assume_init
/// [`Vec<T>`]: ../../std/vec/struct.Vec.html
#[unstable(feature = "maybe_uninit_extra", issue = "63567")]
pub unsafe fn assume_init_drop(&mut self) {
// SAFETY: the caller must guarantee that `self` is initialized and
// satisfies all invariants of `T`.
// Dropping the value in place is safe if that is the case.
unsafe { ptr::drop_in_place(self.as_mut_ptr()) }
}

/// Gets a shared reference to the contained value.
///
/// This can be useful when we want to access a `MaybeUninit` that has been
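Taken together, `assume_init_drop` and `assume_init_read` give `MaybeUninit` a complete manual lifecycle: initialize, drop in place, re-initialize, and move out. A sketch using the methods as named in this PR (they were gated behind `maybe_uninit_extra` at the time, and are stable in current Rust):

```rust
use std::mem::MaybeUninit;

fn main() {
    let mut slot = MaybeUninit::<Vec<u32>>::uninit();

    // Initialize in place.
    slot.write(vec![1, 2, 3]);

    // SAFETY: `slot` is initialized and `Vec`'s invariants hold, so the
    // `Drop` impl may run. Afterwards `slot` is uninitialized again.
    unsafe { slot.assume_init_drop() };

    // Reuse the same storage for a fresh value.
    slot.write(vec![4, 5]);

    // SAFETY: initialized again; we take ownership exactly once.
    let v = unsafe { slot.assume_init_read() };
    assert_eq!(v, vec![4, 5]);
}
```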