Adding more states to an electronic system trades robustness and noise immunity for performance.
Think about it this way: as the number of states approaches infinity, you're back to analog computing.
It's a design parameter, not something that lets us break past the limits on computational density, which right now are heat removal and quantum tunneling in transistors.
What do you think would be harder to implement in a fab on a budget in the future? A chip full of large gates that can handle 7 voltage levels stably, or a chip with 5x [1] as many of the smallest gates that physics and logistics allow one to build?
[1] If memory serves, adders, cache access, and a bunch of other logic typically require on the order of n·log₂n gates in binary but n·log₃n in ternary, which means that as the native integer size increases, bases greater than 2 scale better (it's a constant-factor advantage, but a real one).
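The footnote's scaling claim can be sketched numerically. This is a rough back-of-envelope, not a circuit model: it assumes the n·log_b(n) gate-count figure from the footnote, converts bit width to the equivalent number of ternary digits, and deliberately ignores that a stable multi-level gate is larger and slower than a binary one, which is the whole trade-off in question.

```python
import math

def gate_count(n_bits, base):
    """Back-of-envelope gate count for n-bit-equivalent logic in a given base.

    Assumption (from the footnote, not a measured figure): gate count
    scales like digits * log_base(digits), where `digits` is how many
    base-b digits are needed to cover the same value range as n_bits bits.
    """
    digits = math.ceil(n_bits * math.log(2) / math.log(base))
    return digits * (math.log(digits) / math.log(base))

for n in (16, 64, 256):
    b, t = gate_count(n, 2), gate_count(n, 3)
    print(f"{n}-bit equivalent: binary~{b:.0f} gates, ternary~{t:.0f}, ratio {b/t:.2f}")
```

The ratio grows slowly with word size, which is the footnote's point: the advantage of a higher base is real but modest, so it has to be weighed against how much bigger each multi-level gate is.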