GB or GiB: the clear demonstration that in the early days of computing the experts got a bit carried away. We've all run into that old confusion at some point, even at the movies.
What basically happened is that, as file sizes grew, computer scientists got comfortable and started expressing large values as if they were decimal, rounding them onto the base-10 metric scale.
Later, as file sizes kept growing and the rounding error grew with them, they pulled the GiB out of their sleeve with the sole purpose of not backing down and continuing along the comfortable path: the GiB became the mathematically correct value, and the GB, which was originally the correct one, turned into a mess.
However, the GiB has stayed rather hidden from the public. The GB, because of its arbitrary use, has become an ambiguous value that is computed in base 1024 or base 1000 as each vendor pleases, while the GiB is hardly used at all.
If we express it mathematically, the misunderstandings and ambiguous social conventions that have no place in a purely mathematical discipline like computer science disappear: a MiB is 2^20 bytes, that is 1024^2 = 1,048,576 bytes, while 1,000,000 bytes is just 10^6 bytes, a decimal rounding, no matter how hard the market sells us one as if it were the other.
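To put actual numbers on it, here is a minimal Python sketch (my own illustration, just plain arithmetic) comparing the binary and decimal definitions; notice how the gap widens at every prefix step:

```python
# Binary (IEC) prefixes: powers of 1024
KIB, MIB, GIB = 1024, 1024**2, 1024**3
# Decimal (SI) prefixes: powers of 1000
KB, MB, GB = 1000, 1000**2, 1000**3

print(MIB)              # 1048576 -> 1 MiB in bytes
print(MB)               # 1000000 -> 1 MB in bytes
print(MIB - MB)         # 48576   -> ~4.9% gap at the mega level
print((GIB - GB) / GB)  # ~0.0737 -> ~7.4% gap at the giga level
```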
The funniest thing is when some forum member comes along and asks how you dare to question the engineers, you poor mortal.

PS: Nowadays for Google (and apparently for most people) the iB prefixes (KiB, GiB, etc.) are the mathematically correct ones (1 KiB = 1,024 bytes), and the plain prefixes (kB, GB, etc.) are the commercial ones, worth a round thousand at each step (1 kB = 1,000 B). Remember also that the byte is written with a capital B (kilobyte = kB) and the bit with a lowercase b (kilobit = kb).
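As a rough illustration of the practical effect, the following hypothetical Python helper (my own sketch, not any standard API) formats the same byte count under both conventions; this is exactly how a "500 GB" drive ends up reported as roughly 465 GiB:

```python
def human_readable(num_bytes: int, binary: bool = True) -> str:
    """Format a byte count using IEC (KiB/MiB/GiB) or SI (kB/MB/GB) prefixes."""
    base = 1024 if binary else 1000
    units = ["B", "KiB", "MiB", "GiB", "TiB"] if binary else ["B", "kB", "MB", "GB", "TB"]
    value = float(num_bytes)
    for unit in units:
        if value < base or unit == units[-1]:
            return f"{value:.1f} {unit}"
        value /= base

drive = 500 * 1000**3                       # a "500 GB" drive as the manufacturer counts it
print(human_readable(drive, binary=False))  # 500.0 GB
print(human_readable(drive, binary=True))   # 465.7 GiB
```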
Cheers