The first time you open up the debugger and your v-table pointer is null (presuming you know what a v-table pointer is!), things start to get interesting.
Or there was that time I printf'd a 64-bit number and a random stack variable got corrupted. That was a lot of fun.
Memory protection? Haha. No.
For that matter, my team just came across a bug in our own code a couple of weeks ago: we were performing a DMA memcpy (for which you get to set up your own DMA descriptors yourself, of course) over a bus while simultaneously sending a command packet to the same peripheral that was being DMA'd to.
oops.
Expected things to be put in order for you? Nope. Not unless you implement your own queuing system. (Which we are now doing for that particular component.)
All in all, it is a ton of fun though. I'm loving it. I've watched an entire system get built up, from setting up a clock tree, to init'ing peripherals, to our very own dear while(1). (We actually moved away from while(1); async is where it's at, baby! Callback hell in kilobytes of RAM! oooh yaaaah)
There is certainly nothing easy about it. I have quite a bit of experience with low-level and embedded devices, but hardware has progressed to the point where a Cortex-M3 is quickly becoming the jellybean baseline throwaway uC, and have you seen the datasheet for one of these things?
The amount of stuff they do is incredible, and everything is interconnected in unexpected and intricate ways.
I haven't messed around with x86 enough to know whether it is actually somehow easier than the ARM stuff, but I certainly agree with you that CPUs are damn complicated.
I loaded up the datasheet for the M3 while typing up this reply: 384 pages. Contrast that with the TI 74181 ALU datasheet from the days of old: 18 pages, most of which are detailed diagrams of the pin spacing. The logic diagrams fit on a single page. You could build a simple CPU around one of these chips in a few hours in your basement.
Hardware is only going to get more complicated. At what point does it become so complicated that no one person can reasonably understand how a computer works "under the hood", even at an abstract level?
I wouldn't say it is easier by any means.