Well, I’m only about 9 years behind the times, but that’s because of my classical education.
As any well-educated computer scientist will tell you, most things in computing are measured in powers of two; a natural result of the fact that all of today’s computers are binary-based. And thus it is that a kilobyte is not 1000 bytes but 1024 bytes, because 1024 is 2 to the power 10. A megabyte is 1024 kilobytes and a gigabyte is 1024 of those.
That, at least, is how most of us were brought up, but there was always a certain degree of confusion caused by the fact that marketing types wanted to use 1000 where computer scientists used 1024, because it made their hard drives, flash drives etc look bigger. This is why, when you were sold a 100GB hard drive and took it home and plugged it in, it would look rather smaller: the computer would report the size using a gigabyte about 7% bigger than the marketers’, so it would tell you that you had fewer of them.
Now, back in 1998, the IEC established a new set of prefixes which are unambiguous, and which have names that sound funny to those of us who have got used to thinking that ‘gigabyte’ is a normal sort of word. They are kibi, mebi, gibi etc – the ‘bi’ indicating ‘binary’. So a gibibyte (GiB) is 2^30 bytes, while a gigabyte (GB) is officially 10^9 bytes (a billion bytes), which is somewhat smaller.
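For the numerically inclined, the discrepancy is easy to check for yourself – here’s a quick sketch in Python of the arithmetic behind that ‘about 7%’ figure:

```python
# Binary (IEC) units versus decimal (SI / marketing) units.
GIB = 2**30   # gibibyte: 1,073,741,824 bytes
GB = 10**9    # gigabyte: 1,000,000,000 bytes

# The gibibyte is roughly 7.4% bigger than the gigabyte:
print(GIB / GB)  # 1.073741824

# So a drive sold as "500 GB" shows up as roughly 465.7 GiB
# on a system that reports sizes in binary units:
print(500 * GB / GIB)
```

Note that the gap widens with each prefix: a kibibyte is only 2.4% bigger than a kilobyte, but a tebibyte is nearly 10% bigger than a terabyte, which is why the difference feels so much larger on big drives.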
Most of us have just ignored this change, but some systems – like Apple’s ‘Snow Leopard’ operating system – apparently now report sizes in ‘official’ GB, so your 500GB hard disk will actually look like 500GB when it’s plugged in.
It isn’t really any bigger though. Plus ça change, plus c’est la même chose…