Hacker News

Why does it make sense to have the client do as much as possible so the server does as little as possible?


I can think of a few reasons:

* bandwidth bill.

* rich interactions without back-and-forth server trips.

* Data that lives client-side can also be stored client-side, which enables a more distributed setup and further reduces the bandwidth bill.

* Client hardware is no longer wimpy; why not use the spare CPU cycles on users' laptops, iPhones, iPads, Android devices, etc.?


Because someone else is paying the client's power bill?




