My thinking on ways to avoid this scenario is:

- If there is no concrete type that you can infer for the given literal, throw a type error.

- If there is no concrete type that you can infer for the given literal, fall back to a default, e.g. Int.

But this sort of situation is why backtracking during type inference can lead to pathological behavior.
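For concreteness, Haskell ends up doing both of these depending on context: an unconstrained numeric literal falls back to a default (Integer) under the standard defaulting rules, while a literal whose constraints can't be defaulted is rejected with an ambiguous-type error. A minimal sketch:

  module Main where

  -- Fallback in action: the literal 5 has type Num a => a.  Its only
  -- constraints are Num and Show, so GHC's standard defaulting rules
  -- pick Integer and this compiles, evaluating to "5".
  defaulted :: String
  defaulted = show 5

  -- Type error in action: here the intermediate value is constrained
  -- only by Read and Show, neither of which is numeric, so defaulting
  -- does not apply and GHC rejects this with an ambiguous-type error.
  -- ambiguous :: String
  -- ambiguous = show (read "5")

  main :: IO ()
  main = putStrLn defaulted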
