I have been doing a couple of tests with PyTorch allocations; it let me go as high as 120 GB [1] (assuming each individual allocation was small enough) without crashing. The main limitation was the remaining system memory:
htpc@htpc:~% free -h
               total        used        free      shared  buff/cache   available
Mem:           125Gi       123Gi       920Mi        66Mi       1.6Gi       1.4Gi
Swap:           19Gi       4.0Ki        19Gi
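For reference, a minimal sketch of the kind of test I mean: keep allocating small CPU tensors until a target total is reached or PyTorch raises an out-of-memory RuntimeError. The function name, chunk size, and target are mine, not part of any PyTorch API.

```python
import torch

def allocate_in_chunks(target_bytes, chunk_bytes=256 * 1024**2):
    """Allocate float32 CPU tensors of chunk_bytes each until target_bytes
    total, or until the allocator fails (CPU OOM surfaces as RuntimeError)."""
    chunks, allocated = [], 0
    try:
        while allocated < target_bytes:
            # float32 is 4 bytes per element
            t = torch.empty(chunk_bytes // 4, dtype=torch.float32)
            chunks.append(t)
            allocated += t.numel() * t.element_size()
    except RuntimeError:
        pass  # stop at whatever total the system allowed
    return chunks, allocated
```

Calling `allocate_in_chunks(120 * 1024**3)` on a 125 GiB box reproduces the experiment above; note that `torch.empty` may not touch the pages immediately, so actual resident usage can lag behind the requested total until the tensors are written to.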