Technically he's right: giga does *only* mean a billion in decimal, and as far as I know it has no official change in meaning when it's applied to bytes. But the binary usage is obviously a convention that any computer user understands.

A standards organization (the IEC) came up with units like gibibytes, mebibytes, and exbibytes, supposedly to avoid the confusion (as you've noted, sometimes deliberate confusion) between the decimal and binary values of these terms.
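To put some numbers on the gap (a quick Python sketch; the 40 GB drive is just a made-up example):

    GIGABYTE = 10**9   # decimal "giga": 1,000,000,000 bytes (what the drive box says)
    GIBIBYTE = 2**30   # binary "gibi":  1,073,741,824 bytes (what the OS counts in)

    drive_bytes = 40 * GIGABYTE          # a drive marketed as "40 GB"
    print(drive_bytes / GIBIBYTE)        # ~37.25, which the OS then reports as "37 GB"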

However, those terms sound really lame and will never be used.

