I'm guilty of sometimes using big-O notation when discussing amortized average costs as well.
It seems nonsensical to discuss worst-case behavior of hash-based data structures since we can (and should) make it arbitrarily unlikely.
In other words, if O(n) describes behavior in the limit as n goes to infinity, then the probability of actually encountering worst-case behavior on a hash table operation approaches 0 (assuming a non-broken hash function). Near-worst-case behavior becomes similarly improbable.
This is different from many algorithms, like classic quicksort, whose worst case is triggered by inputs that naturally appear in practice.
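To make the contrast concrete, here's a toy sketch (my own illustration, not anyone's production code) of classic quicksort with a first-element pivot. Feed it an already-sorted list, which is exactly the kind of input that shows up in practice, and the partitions are maximally unbalanced, so the work degrades to O(n^2):

```python
# Toy sketch: "classic" quicksort using the first element as the pivot.
# On already-sorted input the partitions are maximally unbalanced,
# so recursion depth is ~n and total comparisons are ~n^2/2.

import sys

def quicksort_first_pivot(xs):
    """Naive, out-of-place quicksort with the first element as the pivot."""
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    smaller = [x for x in rest if x < pivot]    # empty on sorted input
    larger = [x for x in rest if x >= pivot]    # holds n-1 elements on sorted input
    return quicksort_first_pivot(smaller) + [pivot] + quicksort_first_pivot(larger)

if __name__ == "__main__":
    sys.setrecursionlimit(10_000)               # depth grows linearly in this case
    data = list(range(2_000))                   # already sorted: the "natural" bad case
    quicksort_first_pivot(data)
```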
It depends on the application of the hash table, but malicious agents could 'attack' code by purposefully creating collisions if the author isn't careful. An algorithm expected to run in O(1) that instead runs in O(n) could be catastrophic.
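As a hypothetical illustration (a toy demo, not a real exploit), here's what happens if every key an attacker supplies lands in the same bucket: each insert has to probe past all of the earlier colliding keys, so n inserts cost roughly O(n^2) instead of the expected O(n):

```python
# Toy demo of an algorithmic complexity attack on a hash table:
# all-colliding keys turn n inserts into quadratic work.

import time

class GoodKey:
    """A key whose hash is well distributed (stands in for honest input)."""
    def __init__(self, value):
        self.value = value
    def __hash__(self):
        return hash(self.value)
    def __eq__(self, other):
        return isinstance(other, GoodKey) and self.value == other.value

class CollidingKey:
    """A key whose hash is constant (stands in for attacker-chosen input)."""
    def __init__(self, value):
        self.value = value
    def __hash__(self):
        return 42                      # every key collides
    def __eq__(self, other):
        return isinstance(other, CollidingKey) and self.value == other.value

def time_inserts(keys):
    table = {}
    start = time.perf_counter()
    for k in keys:
        table[k] = True
    return time.perf_counter() - start

if __name__ == "__main__":
    n = 2_000
    print(f"well-distributed keys: {time_inserts([GoodKey(i) for i in range(n)]):.4f}s")
    print(f"all-colliding keys   : {time_inserts([CollidingKey(i) for i in range(n)]):.4f}s")
```

This class of attack is, as far as I know, why CPython moved string hashing to a randomized, keyed SipHash variant (PEP 456), so attacker-controlled strings can't be made to collide predictably.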
I'm just pointing out that worst-case time complexity is an issue that can't always be cast aside. Aren't cryptographic hash functions generally slower?
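On the speed question, here's a rough, machine-dependent sketch (treat the numbers as an assumption, not a benchmark) timing SHA-256 from hashlib against Python's built-in hash() on the same short string keys:

```python
# Rough timing sketch: cryptographic SHA-256 vs Python's built-in,
# non-cryptographic hash() over the same short string keys.

import hashlib
import time

def time_builtin_hash(keys):
    start = time.perf_counter()
    for k in keys:
        hash(k)                                  # SipHash-based for str in CPython
    return time.perf_counter() - start

def time_sha256(keys):
    start = time.perf_counter()
    for k in keys:
        hashlib.sha256(k.encode()).digest()      # full cryptographic digest per key
    return time.perf_counter() - start

if __name__ == "__main__":
    keys = [f"key-{i}" for i in range(100_000)]
    print(f"built-in hash(): {time_builtin_hash(keys):.4f}s")
    print(f"sha256         : {time_sha256(keys):.4f}s")
```

My understanding is that the usual answer is yes: a full cryptographic digest per key costs noticeably more than a table-oriented hash, which is why keyed-but-fast functions like SipHash are the common middle ground.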