Linux kernel booting process, part 3 (github.com/0xax)
82 points by 0xAX on Jan 31, 2015 | hide | past | favorite | 4 comments


Cool article. One thing that's been irritating me lately is how everyone takes this table on faith:

| Type | char | short | int | long | u8 | u16 | u32 | u64 |
|------|------|-------|-----|------|----|-----|-----|-----|
| Size | 1    | 2     | 4   | 8    | 1  | 2   | 4   | 8   |

Here are the corrected tables (the 32-bit one now mostly a relic of history):

32-bit:

| Type | char | short | long | int | u8 | u16 | u32 | u64 |
|------|------|-------|------|-----|----|-----|-----|-----|
| Size | 1    | 2     | 4    | 4   | 1  | 2   | 4   | 8   |

64-bit:

| Type | char | short | long | int | u8 | u16 | u32 | u64 |
|------|------|-------|------|-----|----|-----|-----|-----|
| Size | 1    | 2     | 4    | 8   | 1  | 2   | 4   | 8   |

Instead we traded convention for programmer convenience, so that companies like Apple could mandate 64-bit binaries under the pretense that programmers wouldn't have to do much work to convert from 32-bit (since they mistakenly kept int at 4 bytes).

The fact is that originally long was 4 bytes and int was the native width of your processor (int should be 8 bytes on most CPUs today, since they are 64-bit). Prefixing a type like int with long doubled its width, which created confusion as well.


Changing sizes on different processors is a bad idea: most uses should be fixed-size for ease of portability, and using 64-bit ints as a "default" type isn't always the best for performance.

Also, it's unclear from your post, but to clarify: the C standard guarantees that long can represent every int value, so in practice sizeof(int) <= sizeof(long).


That's debatable. Even in 64-bit mode, x86-64 processors treat 32-bit integers as the default operand size for the majority of integer instructions, requiring an operand-size (REX.W) prefix byte (on top of the larger integer itself) for almost every instruction that operates on a 64-bit integer. If you want to complain to anyone about this, complain to AMD. Really though, 32 bits are enough for the vast majority of array indices and symbolic constants, so why waste memory bandwidth and cache space on 64-bit ints for the sake of consistency?


Nice series. Waiting for an episode or two about Linux kernel on ARM.



