Add support for 128-bit integers
Amanieu committed Feb 21, 2016
1 parent c447aa8 commit b6aa15a
Showing 1 changed file, text/0000-int128.md, with 42 additions and 0 deletions.
@@ -0,0 +1,42 @@
- Feature Name: int128
- Start Date: 2016-02-21
- RFC PR: (leave this empty)
- Rust Issue: (leave this empty)

# Summary
[summary]: #summary

This RFC adds the `i128` and `u128` types to Rust. Because these types are not available on all platforms, a new target flag (`target_has_int128`) is added to allow users to check whether 128-bit integers are supported. The `i128` and `u128` types are not added to the prelude; they must instead be explicitly imported with `use core::{i128, u128}`.

# Motivation
[motivation]: #motivation

Some algorithms need to work with very large numbers that don't fit in 64 bits, such as certain cryptographic algorithms. One possibility would be to use a BigNum library, but these use heap allocation and tend to have high overhead. LLVM has support for very efficient 128-bit integers, which are exposed by Clang in C as the `__int128` type.

# Detailed design
[design]: #detailed-design

A quick look at Clang's source suggests that 128-bit integers are supported on all 64-bit platforms and on a few 32-bit ones with 64-bit registers (x32 and MIPS n32). To allow users to determine whether 128-bit integers are available, a `target_has_int128` cfg is added. The `i128` and `u128` types are only available when this flag is set.
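
As an illustrative sketch (not part of the RFC text), user code could gate its 128-bit fast path on the proposed cfg; `mul_hi` is a hypothetical helper:

```rust
#[cfg(target_has_int128)]
mod fast {
    use std::u128; // per this RFC, the type must be imported explicitly

    /// High 64 bits of a 64x64 -> 128-bit multiply.
    pub fn mul_hi(a: u64, b: u64) -> u64 {
        ((a as u128 * b as u128) >> 64) as u64
    }
}
```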

The actual `i128` and `u128` types are not added to the Rust prelude since that would break compatibility. Instead they must be explicitly imported with `use core::{i128, u128}` or `use std::{i128, u128}`. This will also catch attempts to use 128-bit integers when they are not supported by the underlying platform since the import will fail if `target_has_int128` is not defined.
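
For illustration, the import itself then doubles as the availability check:

```rust
// On a target where target_has_int128 is not set, this import fails to
// compile, because the i128 and u128 types simply do not exist there.
use std::{i128, u128};
```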

Implementation-wise, this should just be a matter of adding a new primitive type to the compiler and adding trait implementations for `i128`/`u128` in libcore. A new entry will need to be added to target specifications to specify whether the target supports 128-bit integers.
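
As a rough, hypothetical sketch of what the libcore side might look like (the actual implementations would likely go through the existing integer-impl macros rather than standalone impls):

```rust
// Inside libcore (hypothetical sketch): the new primitives need the same
// trait implementations that the existing integer types already have.
#[cfg(target_has_int128)]
impl Default for u128 {
    #[inline]
    fn default() -> u128 { 0 }
}
```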

One possible complication is that primitive types aren't currently part of the prelude; instead, they are added directly to the global namespace by the compiler. The new `i128` and `u128` types will behave differently and will need to be explicitly imported.

Another possible issue is that a `u128` can hold values too large to represent in an `f32`. We need to make sure these conversions don't lead to any `undef`s from LLVM.
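
For example, assuming the explicit-import scheme above, a cast like the following must yield some defined value, perhaps rounding to `f32::INFINITY`; the exact semantics would be settled during implementation:

```rust
use std::u128;

fn main() {
    let big: u128 = !0; // 2^128 - 1, which exceeds f32::MAX (~3.4e38)
    // The cast must produce a defined value; it must never become
    // an LLVM undef.
    let x = big as f32;
    println!("{}", x);
}
```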

# Drawbacks
[drawbacks]: #drawbacks

It adds a type to the language that may or may not be present depending on the target architecture. This could lead to surprises, but because the types must be explicitly imported, code that uses them will fail to compile on unsupported targets rather than misbehave at runtime.

# Alternatives
[alternatives]: #alternatives

There have been several attempts to create `u128`/`i128` wrappers based on two `u64` values, but these can't match the performance of LLVM's native 128-bit integers.
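
A minimal sketch of such a wrapper (names hypothetical) shows why: even a plain addition needs an explicit carry in source, while LLVM's native 128-bit addition compiles to a single `add`/`adc` pair on x86-64:

```rust
// Two-u64 wrapper: the alternative this RFC argues against.
#[derive(Clone, Copy)]
struct U128 {
    hi: u64,
    lo: u64,
}

impl U128 {
    fn wrapping_add(self, other: U128) -> U128 {
        // Add the low halves, then propagate the carry into the high half.
        let (lo, carry) = self.lo.overflowing_add(other.lo);
        let hi = self.hi.wrapping_add(other.hi).wrapping_add(carry as u64);
        U128 { hi: hi, lo: lo }
    }
}
```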

# Unresolved questions
[unresolved]: #unresolved-questions

How should 128-bit literals be handled? The easiest solution would be to limit integer literals to 64 bits, which is what GCC does (no support for `__int128` literals).
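
For illustration, under the GCC-style restriction a 128-bit constant would have to be assembled from 64-bit halves rather than written directly (values here are arbitrary):

```rust
use std::u128;

fn main() {
    // Each literal fits in 64 bits; the full value is built by shifting.
    let x: u128 = ((0x0123_4567_89ab_cdef as u128) << 64) | 0xfedc_ba98_7654_3210;
    println!("{}", x);
}
```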
