
Well, for counts 32 bits is enough, unless you count something really small, like every individual byte in something.

And for counts 64 bits should be enough, since it's 18 billion billion.



Assuming you somehow accumulated enough items to fill a 64-bit counter, and got one core of a very fast PC to increment a 64-bit value once per item, it would take over 100 years to count them all. (Assuming you don't just decide to pop -1 into the counter; presumably you'd want to make sure you haven't miscounted during the accumulation phase.)
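A rough back-of-the-envelope sketch of that figure, assuming one core manages about 4 billion increments per second (roughly one per cycle at 4 GHz; that rate is an assumption):

    /* Rough sketch: time to perform 2^64 increments at an assumed
       rate of ~4e9 per second (about one per cycle on a ~4 GHz core). */
    #include <stdio.h>

    int main(void) {
        double increments = 18446744073709551616.0;   /* 2^64 */
        double per_second = 4e9;                      /* assumed rate */
        double seconds = increments / per_second;
        double years = seconds / (365.25 * 24 * 3600);
        printf("about %.0f years\n", years);          /* prints about 146 */
        return 0;
    }

So "over 100 years" holds even before counting any memory traffic.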


Well, for counts 16 bits is enough, unless you count something really small, like every individual byte in something. And for counts 32 bits should be enough, since it's 4 billion.

-- someone somewhere, 20 years ago.


For counts of things you have in memory, 32 bits is usually enough -- even if each thing is just 8 bytes, 2^32 of those objects would require 32 GB.
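Spelled out, just to make the arithmetic explicit (a trivial sketch):

    /* 2^32 objects of 8 bytes each is 2^35 bytes, i.e. 32 GiB. */
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint64_t objects = (uint64_t)1 << 32;   /* 4,294,967,296 objects */
        uint64_t bytes = objects * 8;           /* 8 bytes apiece */
        printf("%llu GiB\n", (unsigned long long)(bytes >> 30));   /* 32 */
        return 0;
    }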


32GB isn't an exceptional amount of memory these days.


Sure -- but it is an exceptional amount of memory to use for tiny objects like those. If you're working with four billion objects at a time, they're probably more substantial than eight bytes.


Or you spent a lot of effort to get them down to 8 bytes or even smaller, to fit as many of them as possible in memory. See use cases like time-series/analytics databases, point clouds, or simulations with many elements...
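For illustration, a minimal sketch of one such 8-byte element (the field layout is hypothetical, not taken from any particular system):

    /* Hypothetical packed 8-byte time-series sample; 2^32 of these
       fit in 32 GiB. Field choices are illustrative only. */
    #include <stdint.h>

    struct sample {
        uint32_t timestamp;   /* seconds since some chosen epoch */
        uint16_t series_id;   /* which series this sample belongs to */
        uint16_t value;       /* quantized measurement */
    };

    _Static_assert(sizeof(struct sample) == 8, "sample should be 8 bytes");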


You mean 4GB.

And yes, 32-bit programs can't use more than 4GB.


No, I mean 32 GB -- 2^32 x 8 bytes. We're discussing 32-bit integers, not pointers.
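To put the integers-vs-pointers distinction in code, a small sketch (it assumes a 64-bit machine with enough RAM for the allocation to actually succeed):

    /* A 32-bit index can name 2^32 objects even though they occupy
       32 GiB, far more than a 32-bit pointer could ever address. */
    #include <stdint.h>
    #include <stdlib.h>

    int main(void) {
        uint64_t count = (uint64_t)1 << 32;                   /* 2^32 objects */
        uint64_t *objects = malloc(count * sizeof *objects);  /* ~32 GiB */
        if (!objects) return 1;                               /* may well fail */

        uint32_t index = UINT32_MAX;      /* a 32-bit count/index is plenty */
        objects[index] = 42;              /* indexes the last object */

        free(objects);
        return 0;
    }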



