
68 GB? Just that? Shouldn't that be some 10-15 1-2 TB SSDs in parallel?


Technology for space lags behind consumer technology by 10-20 years. There are a few reasons for this:

- Long lead time to test and certify hardware.

- Higher reliability requirements (e.g. must work non-stop for 10 years)

- Must be able to operate in a higher radiation environment with little to no cooling. In a vacuum, and in zero gravity, cooling works very differently to how it does on Earth.

- These missions often take a decade or more to come together, and changing requirements throughout that process is hard, costly, and risky, so often they stay the same from the beginning.

Notably, SpaceX is bucking this trend a bit with its avionics, which run on standard Linux machines rather than on specialist hardware or a real-time OS, but its mission lengths are measured in minutes to hours, not decades.


They also simply do not store data locally.

The onboard storage is basically a buffer; they beam everything back to Earth.
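The "storage as buffer" idea can be sketched roughly as follows. This is a hypothetical illustration, not any real flight software: instruments append data frames onboard, and the oldest frames are dropped once capacity is reached, with whatever is buffered drained during the next downlink pass.

```python
from collections import deque

# Hypothetical sketch: onboard storage as a bounded buffer between the
# instruments and the downlink. All names and sizes are illustrative.
class DownlinkBuffer:
    def __init__(self, capacity_frames):
        # deque with maxlen silently discards the oldest frame when full
        self.frames = deque(maxlen=capacity_frames)

    def record(self, frame):
        self.frames.append(frame)  # instrument writes into the buffer

    def downlink(self, max_frames):
        # drain up to max_frames during a communication pass
        sent = []
        for _ in range(min(max_frames, len(self.frames))):
            sent.append(self.frames.popleft())
        return sent

buf = DownlinkBuffer(capacity_frames=4)
for i in range(6):
    buf.record(f"frame-{i}")
# Only the newest 4 frames survived; frame-0 and frame-1 were dropped.
print(buf.downlink(10))  # ['frame-2', 'frame-3', 'frame-4', 'frame-5']
```

The key design point is that capacity only needs to cover the data generated between downlink passes, not the whole mission's output.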


Yep. That means bandwidth is the limiting factor, and that maxes out at 28 megabits per second in ideal conditions. A multi-terabyte array would be a waste.
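A back-of-envelope calculation makes the point, taking the 68 GB buffer and the quoted 28 megabit/s peak rate at face value:

```python
# How long does it take to downlink the whole buffer at peak rate?
BUFFER_BYTES = 68 * 10**9   # 68 GB onboard storage
LINK_BPS = 28 * 10**6       # 28 Mbit/s, ideal conditions

seconds = BUFFER_BYTES * 8 / LINK_BPS
print(f"{seconds / 3600:.1f} hours")  # -> 5.4 hours

# A hypothetical 2 TB array would take days to drain at the same rate:
tb_seconds = 2 * 10**12 * 8 / LINK_BPS
print(f"{tb_seconds / 86400:.1f} days")  # -> 6.6 days
```

So the buffer is already sized to many hours of continuous downlink; extra terabytes could never be flushed before more data piled up.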


Sometimes the smartest people in the world still stink at their short game




