We managed to speed up our CI to save 168 days of execution time per month (home-assistant.io)
23 points by pyprism on May 14, 2024 | hide | past | favorite | 8 comments


Any time I see engineering time spent on splitting/sharding test suites, I can't help but wonder if access to a single beefier runner (e.g. 64+ CPUs) would have made all that work unnecessary. I also always find the duplicated setup time across shards a bit wasteful on resources.


Large box in the closet has always been a competitive strategy.

You have to keep your closet clean and organized or else everything is bad, but if you can, it works well.


Basically yeah. At my last company we switched from n GitHub runners to two 128-core EPYC boxes and massively (20x?) decreased our build times. A surprising thing is how much time is spent on IO uploading and restoring the cache between steps; all of that is saved if the entire job runs on a single machine.


Every time I see this type of story I can only think about how people get rewarded for heroically/dramatically rescuing bad systems, while those who chose boring/proven technology at an earlier point get no recognition (and in many cases don't even get hired).


They call pytest --collect-only and parse the output before distributing tests across the GitHub Actions Python matrix. I'm a little surprised pytest doesn't offer the ability to cache collection. Then again, a slow collection time may be an issue worth addressing anyway, since developers must suffer it locally too!
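A minimal sketch of the collect-then-shard idea: parse the node IDs that `pytest --collect-only -q` prints, then bucket whole files round-robin into N matrix jobs. The function name and the shard-by-file choice are my assumptions for illustration, not Home Assistant's actual code.

```python
# Hypothetical sketch: split pytest node IDs (one per line from
# `pytest --collect-only -q`) into N shards for a CI matrix.
# shard_tests is an illustrative name, not a real pytest API.
import json
from collections import defaultdict


def shard_tests(node_ids, num_shards):
    """Group node IDs by file, then assign files round-robin to shards,
    so each matrix job runs whole files (keeps expensive fixtures together)."""
    by_file = defaultdict(list)
    for nid in node_ids:
        # A node ID looks like "tests/test_a.py::test_one"
        by_file[nid.split("::", 1)[0]].append(nid)
    shards = [[] for _ in range(num_shards)]
    for i, (_path, ids) in enumerate(sorted(by_file.items())):
        shards[i % num_shards].extend(ids)
    return shards


if __name__ == "__main__":
    collected = [
        "tests/test_a.py::test_one",
        "tests/test_a.py::test_two",
        "tests/test_b.py::test_three",
    ]
    # Emit shards as JSON, e.g. to feed a GitHub Actions matrix input.
    print(json.dumps(shard_tests(collected, 2)))
```

Each job in the matrix would then run `pytest` with only its shard's node IDs, so every runner skips collection of the rest of the suite.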


This could do with more detail. Is it using pytest-xdist?


They link their GitHub pull request with all the changes.


I must be misunderstanding. Every test run was spending 3 _hours_ just discovering which tests to run 9 times?



