Spark is technically not Python. We do support PySpark through the relevant decorator, but it's a very niche use case for us.

As for all the other Python packages, including proprietary ones, the FaaS model lets you declare any packages you want in one function-as-node of the pipeline DAG, and a different set in another. Every function is fully isolated, so you can use pandas 1 in one node and pandas 2 in another, or update the Python interpreter only in node X.
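To make the model concrete, here's a minimal sketch of what per-node environment declarations could look like. The `node` decorator and its `packages`/`python` arguments are hypothetical illustrations of the idea, not our real API:

```python
# Hypothetical sketch of per-node isolation in a pipeline DAG.
# The decorator name and its arguments are illustrative, not a real API.

def node(packages=None, python=None):
    """Mark a function as a DAG node with its own isolated environment spec."""
    def wrap(fn):
        fn.packages = packages or {}  # pinned dependencies for this node only
        fn.python = python            # optional interpreter override
        return fn
    return wrap

@node(packages={"pandas": "1.5.3"})
def clean(raw):
    # This node would run in an environment with pandas 1.
    return raw

@node(packages={"pandas": "2.2.0"}, python="3.12")
def aggregate(df):
    # Fully isolated: pandas 2 and a newer interpreter here.
    return df

# Each node carries its own environment spec, independent of its neighbors:
print(clean.packages)    # {'pandas': '1.5.3'}
print(aggregate.python)  # 3.12
```

The point is that dependency resolution happens per function, not per pipeline, so two nodes with conflicting pins never share an environment.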

If you're interested in containerization and FaaS abstractions, this is a good deep dive: https://arxiv.org/pdf/2410.17465

If you're more the practical type, just try a few runs in the public sandbox, which is free even though we are not yet GA.