
I know bandwidth is a problem, but isn't battery consumption a bigger one? Whatever we save in bandwidth, we give back in power drain, right?


Not if you have hardware support, i.e. the GPU does the decoding.


I'm not at all familiar with video encoding/decoding or what it entails. Can you explain how hardware support would drastically change the power requirements?


Well, specialized hardware is generally more efficient than general-purpose hardware (see CPU vs. GPU for graphics).

So it might use a little more power than H.264 with hardware acceleration, but probably not all that much.
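For what it's worth, browsers now let you query this directly: the Media Capabilities API reports a powerEfficient flag, which in practice means "a hardware decode path exists for this configuration". A minimal TypeScript sketch; the resolution and bitrate values below are placeholder assumptions, not numbers from this thread:

    // Ask the browser whether 1080p H.264 decoding is power-efficient
    // (i.e. typically hardware-accelerated) on this device.
    const h264Config: MediaDecodingConfiguration = {
      type: "file",
      video: {
        contentType: 'video/mp4; codecs="avc1.42E01E"', // Baseline H.264
        width: 1920,        // placeholder resolution
        height: 1080,
        bitrate: 2_000_000, // placeholder: ~2 Mbit/s
        framerate: 30,
      },
    };

    navigator.mediaCapabilities.decodingInfo(h264Config).then((info) => {
      // powerEfficient is the browser's signal that this decode path
      // won't burn the CPU -- in practice, that a hardware decoder exists.
      console.log(`supported=${info.supported} powerEfficient=${info.powerEfficient}`);
    });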


Sure, specialised hardware tends to be more efficient than general-purpose hardware, but it's not so much "special-purpose hardware is magic" as "the general-purpose hardware can't exploit certain properties of the workload". As far as I understand the GPU's advantage over the CPU (again, graphics is not my area of expertise), it's largely that graphics workloads are massively parallel and the GPU is specifically built to handle massively parallel work. Like I said, I know nearly nothing about video encoding/decoding, so I was hoping someone who does could explain. My only guess for how specialised hardware could dramatically save power on video decoding is that it might be built from massively parallelisable simple operations.
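One rough way to see the hardware-vs-software gap in practice is to run the same Media Capabilities query for a codec that usually has dedicated silicon (H.264) and one that, on many devices, falls back to software decoding (VP9 here, purely as an example). A sketch under that assumption; the codec strings and dimensions are illustrative:

    // Compare reported power efficiency of H.264 vs. VP9 decoding.
    async function comparePowerEfficiency(): Promise<void> {
      const base = { width: 1920, height: 1080, bitrate: 2_000_000, framerate: 30 };
      const codecs: Array<[string, string]> = [
        ["H.264", 'video/mp4; codecs="avc1.42E01E"'],
        ["VP9",   'video/webm; codecs="vp09.00.10.08"'],
      ];
      for (const [name, contentType] of codecs) {
        const info = await navigator.mediaCapabilities.decodingInfo({
          type: "file",
          video: { contentType, ...base },
        });
        // On a device with no VP9 decode block, expect powerEfficient=false
        // for VP9 while H.264 reports true.
        console.log(`${name}: powerEfficient=${info.powerEfficient}`);
      }
    }

    comparePowerEfficiency();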



