First, I want to apologize because English is not my native language. I'm taking the CS50 Introduction to Computer Science course, and I've come across the concepts of 'endianness' and 'word size'. Even though I think I've understood them fairly well, there's still some confusion.

As far as I know, 'word size' refers to the number of bytes a processor can read from or write to memory in one cycle, the width of the data its instructions operate on at a time, and also the maximum size of a memory address; that is, 4 bytes on 32-bit architectures and 8 bytes on 64-bit architectures. Correct me if I'm wrong about this.
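In case it helps frame the question, here's a tiny C check I put together (just a sketch; as far as I know, sizeof(void *) is only a common proxy for the machine's address width, not something the C standard guarantees to match the "word size"):

    #include <stdio.h>

    int main(void)
    {
        // On typical 32-bit targets both lines print 4; on typical 64-bit
        // Linux/macOS targets (LP64) both print 8. Note that long is
        // still 4 bytes on 64-bit Windows (LLP64).
        printf("sizeof(void *) = %zu\n", sizeof(void *));
        printf("sizeof(long)   = %zu\n", sizeof(long));
        return 0;
    }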

Now, 'endianness' refers to the ordering of the bytes of a multi-byte data type (like an int or a float, not a char) when the processor handles them, either to store or to transmit them. According to some definitions I've read, this concept is tied to the word size. For example, Wikipedia says: "endianness is the ordering or sequencing of bytes of a word of digital data". Big-endian means the most significant byte is placed at the smallest memory address, and little-endian means the least significant byte is placed at the smallest memory address instead.
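To make sure I understand the definition, I wrote a small C sketch that inspects the bytes of a 4-byte value through an unsigned char pointer (which, as far as I know, the language allows for examining an object's representation):

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint32_t value = 0x11223344;   // b3 = 0x11 (MSB) ... b0 = 0x44 (LSB)
        unsigned char *bytes = (unsigned char *) &value;

        // Walk the bytes from the lowest address to the highest.
        for (size_t i = 0; i < sizeof value; i++)
            printf("address + %zu: 0x%02x\n", i, bytes[i]);

        // A little-endian machine prints 0x44 first (LSB at the lowest
        // address); a big-endian machine prints 0x11 first.
        return 0;
    }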

I've seen many examples and diagrams like this one: [diagram: little-endian vs. big-endian byte ordering]

I understand big-endian very well, and little-endian is also clear when the data type being processed is equal to or smaller than the word size. But what happens when it's bigger than the word size? Imagine an 8-byte data type on a 32-bit little-endian architecture (4-byte words). Which way are the bytes actually stored? (I've put a small test program after the diagrams showing how one could check this on an actual machine.)

Ordering #1:

----------------->
lower address to higher address
b7 b6 b5 b4 | b3 b2 b1 b0
word 0      | word 1

Ordering #2:

----------------->
lower address to higher address
b3 b2 b1 b0 | b7 b6 b5 b4
word 0      | word 1
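
This is the check I'd run to see the answer empirically (a sketch, assuming a C compiler and <stdint.h>; whichever byte prints first is the one at the lowest address):

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        // b7 = 0x88 (most significant) ... b0 = 0x11 (least significant)
        uint64_t value = 0x8877665544332211ULL;
        unsigned char *bytes = (unsigned char *) &value;

        for (size_t i = 0; i < sizeof value; i++)
            printf("address + %zu: 0x%02x\n", i, bytes[i]);

        // Printing 0x11, 0x22, ..., 0x88 would mean the least significant
        // byte sits at the lowest address across all 8 bytes, with no
        // regrouping at word boundaries; any other order would reveal a
        // different scheme.
        return 0;
    }

I understand that running this on a 32-bit little-endian machine would show one machine's answer, but I'd like to understand the general rule, not just one output.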

I've found mixed answers to this question, and I'd like to get this concept clear before continuing. Thank you in advance!
