Comment by cogman10

7 hours ago

Yeah, this is something Java got right as well. It got "unsigned" wrong, but it got standardizing the primitive sizes right:

byte = 8 bits

short = 16 bits

int = 32 bits

long = 64 bits

float = 32-bit IEEE

double = 64-bit IEEE
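
For what it's worth, here's a minimal, standard-JDK-only sketch of both points: the boxed types expose those spec-fixed widths as SIZE constants, and the missing unsigned types are papered over by the unsigned helper methods (Integer.toUnsignedLong and friends) that arrived in Java 8:

    public class PrimitiveWidths {
        public static void main(String[] args) {
            // Widths are fixed by the Java Language Specification,
            // regardless of the underlying hardware.
            System.out.println(Byte.SIZE);     // 8
            System.out.println(Short.SIZE);    // 16
            System.out.println(Integer.SIZE);  // 32
            System.out.println(Long.SIZE);     // 64
            System.out.println(Float.SIZE);    // 32
            System.out.println(Double.SIZE);   // 64

            // No unsigned primitives; the Java 8 workaround reinterprets
            // the signed bit pattern instead.
            int x = 0xFFFFFFFF;                             // -1 as a signed int
            System.out.println(Integer.toUnsignedLong(x));  // 4294967295
        }
    }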

I like the Rust approach more: usize/isize are the native integer types, and with every other numeric type, you have to mention the size explicitly.

On the C++ side, I sometimes use an alias that contains the word "short" for 32-bit integers. When I use it, I'm explicitly assuming that the numbers are small enough to fit in a smaller-than-usual integer type, and that it's performance-critical enough that the assumption is worth making.

  • <cstdint> has int8_t, uint8_t, int16_t, uint16_t, int32_t, uint32_t, int64_t, and uint64_t. I still go back and forth between uint64_t, size_t, and unsigned int, but am defaulting to uint64_t more and more, even if it doesn't matter.

  • > you have to mention the size explicitly

    It's unbelievably ugly. Every piece of code working with any kind of integer screams "I am hardware-dependent in some way".

    E.g. in a structure representing an automobile, the number of wheels has to be some i8 or i16, which looks ridiculous.

    Why would you take a language in which you can write functional pipelines over collections of objects, and make it look like assembler?

    • If you don't care about the size of your number, just use isize or usize.

      If you do care, then isn't it better to specify it explicitly than to guess and have different compilers disagree on the size?


    • Is it any better calling it an int when it's assumed to be an i32 and 30 of the bits are wasted?

Yep. Pity about getting char/string encoding wrong, though (Java chars are 16 bits).

But it's not alone in that mistake; all the languages invented in that era made the same one (C#, JavaScript, etc.).

  • Java was just unlucky: it standardised its strings at the wrong time, when Unicode code points were still 16 bits. Java was announced in May 1995, and this note from the Unicode history wiki page makes it clear what happened: "In 1996, a surrogate character mechanism was implemented in Unicode 2.0, so that Unicode was no longer restricted to 16 bits. ..."
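
    For a concrete illustration of what that legacy means (the example character here is just an arbitrary non-BMP code point): a Java char is a single 16-bit UTF-16 code unit, so anything outside the Basic Multilingual Plane is stored as a surrogate pair, and length() counts code units rather than characters.

        public class SurrogateDemo {
            public static void main(String[] args) {
                // U+1D11E MUSICAL SYMBOL G CLEF lies outside the BMP,
                // so it takes a surrogate pair: two 16-bit chars.
                String clef = "\uD834\uDD1E";
                System.out.println(Character.SIZE);                        // 16
                System.out.println(clef.length());                         // 2 (UTF-16 code units)
                System.out.println(clef.codePointCount(0, clef.length())); // 1 (Unicode code point)
            }
        }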