For me, web development seems like the exact opposite of desktop development.
In desktop development, you learn the underlying foundations first (the programming language, files, networking, graphics, GUIs, etc), then weave them together to build your application. It's slow, but worth it -- often the concepts are universal. How many times do you have to re-learn file IO?
In web development, it's really easy to build the finished application quickly but not understand any of the foundations. Look at how many "rapid development" frameworks there are out there; most of them are fast to install, and the result is usually awesome. And plugins! There are plugins for everything, and they work so well! But the foundations are intimidating -- especially because there are so many layers of abstraction piled on top of each other. Where would you start? The browser DOM? How JavaScript is interpreted? How your database implements transactions? The more web development I do, the more I realize how freakin' tall the stack is...and how little I really know about it. Kind of scary.
Perhaps I'm misreading your post, but RAD environments for desktop applications have been around for over a decade. I wasn't exactly an early adopter, and I remember dragging and dropping a button onto a form using Delphi 4 under Windows 98. And then you could drag to resize it, and formatting it was so easy a 13-year-old could do it. I didn't have a clue about the Win32 API or the graphics drivers or bitmap buffers or memory buses, and it totally didn't matter.
This is the nature of things. We use our lower-level tools to build higher-level tools that make the technology (or processes like development) more accessible to more people. Could you imagine having to build everything from scratch every time a new job came along?
"Build us a banking system that can be used by our employees to safeguard our customers' deposits and analyze our loaning ability, while providing web-based access to accounts by customers."
"OK. First, we'll need several tons of sand, some raw copper ..."
I think it speaks volumes that the infrastructure design of the web (or perhaps the Internet) spares today's web app developers from worrying about how their bits move from the server CPU to the users' screens.
P.S. I'm terribly disappointed in my fellow posters actually answering the rhetorical question.
What are you defining as file IO? What do you mean by re-learn: do you mean forgetting and having to learn again, or learning some new fact that changes your total understanding of the thing? What has gotten me in the past is learning how some particular language does file IO[1], which might mean learning something new about file IO I didn't know before (but didn't matter for what I wanted to do at the time) that expanded my perception. Though I guess the concepts of "read", "write", "rewrite", and "append" are the same.
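Those four verbs do map pretty cleanly across languages. A minimal sketch in Python (the file name is just an illustration, and "rewrite" here means reopening in write mode, which truncates):

```python
# Write mode ("w") creates the file or truncates an existing one.
with open("demo.txt", "w") as f:
    f.write("first line\n")

# Append mode ("a") adds to the end without touching what's there.
with open("demo.txt", "a") as f:
    f.write("second line\n")

# Read mode ("r") gives the contents back.
with open("demo.txt", "r") as f:
    lines = f.readlines()

print(lines)
```

The mode letters differ by language (fopen's "r"/"w"/"a" in C, flags on open(2), etc.), but the concepts underneath are the same ones.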
I really don't think you get any more universality in desktops than you do in the web world. You might get more in the web world if you restrict your universe to the web world, due to a limited selection of choices on the front-end. (And backend too if you go the traditional route of learning LAMP with PHP on a shared host provider.)
Related to the other comment on RAD systems, even NeXTSTEP in '89-'95 had the WYSIWYG concept applied to everything as one of its major selling points. http://www.youtube.com/watch?v=j02b8Fuz73A Around 23:00 Jobs starts to demo making a GUI application that talks to a backend DB and shows pictures, and you don't need to know SQL or joins or anything.
So I don't think the desktop world has been any better in regard to keeping abstractions at bay or offering universal modes of thought (hey fork()!). The first time I programmed something with tkinter I kept asking "Where's my user-defined main loop? Where's my event handler? Where's my render stage? WTF is this 'pack' method?" Because my prior UI experience on a desktop was with pygame, which defaults to having you control every pixel.
[1]Bash likes its pipes and IO redirection, C likes its file pointers, C++ likes its streams, Java likes those too but also likes Buffer Objects et al., PHP has a nice function to slurp a whole file in as a string (but you can shoot yourself in the foot if you run out of memory), "Python is obvious", "Perl is magic". If I try to load "./file", what's the context of "." and how does my language let me get a known value for it that's relative to my program and platform-independent? What about encrypted folders, different file systems, file encoding? Will the language/OS abstract that for me? Now I want to start doing things asynchronously. Now I want to start talking to web files, now I want to start sending cookies and stuff when I'm talking to web endpoints instead of files, maybe I want to do socket IO now, and this whole trip has taken me through several other realms which seem to me pretty far removed from simple file IO.
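The "./file" question in particular trips people up. A sketch in Python ("data.txt" is made up): "." resolves against the process's current working directory, i.e. wherever the program was launched from, not where the script lives, so anchoring to the script takes an extra step.

```python
from pathlib import Path

# Resolved against os.getcwd() at open time -- depends on how you were launched.
cwd_relative = Path("data.txt")

# __file__ is this script's own path; fall back to "." in a REPL, where it's unset.
script_dir = Path(globals().get("__file__", ".")).resolve().parent
script_relative = script_dir / "data.txt"

print(cwd_relative, script_relative)
```

pathlib also papers over the separator question ("/" vs "\\"), which is one of the platform-independence headaches mentioned above.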
I too find web development annoying for this reason - everything feels so arbitrary. Rather than learning a set of principles you're learning a huge bag of tricks.
The result is that when I forget something, I can't go back and figure it out from something simpler. It also makes it really hard to build abstractions that actually work.
Dunno. Bytes IO or text IO? Evented IO? Lazy IO? Monadic IO? Which platform are you on? Will there be concurrent access? Is your file always a file or is it also a socket?
Yeah, you re-learn file IO quite a bit I'd say. And that's without talking about APIs going from straightforward (and broken) to completely baroque (and still broken).
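The bytes-vs-text split alone is a concrete case: the same file, read both ways, gives you different types, different lengths, and different failure modes. A minimal Python sketch (file name is illustrative):

```python
# One non-ASCII character is enough to see the split.
data = "caf\u00e9\n".encode("utf-8")  # "café\n": 5 characters, 6 bytes

with open("io_demo.bin", "wb") as f:  # bytes IO: raw octets out
    f.write(data)

with open("io_demo.bin", "rb") as f:  # bytes IO: raw octets back
    raw = f.read()

with open("io_demo.bin", "r", encoding="utf-8") as f:  # text IO: decoded str
    text = f.read()

print(len(raw), len(text))  # byte count vs. character count differ
```

Pick the wrong mode (or the wrong encoding) and you get mojibake or a UnicodeDecodeError, which is exactly the kind of re-learning being described.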
Oh sure, sometimes
> it's really easy to [read a file] quickly but not understand any of the foundations.
I don't know from Monadic I/O, but at least a couple of those might appear to the untrained eye to be application I/O rather than actual file I/O, which I would guess has been pretty uniformly oriented around the open(3) idiom by now.
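For what it's worth, that open idiom survives almost unchanged even inside high-level runtimes. A sketch using Python's os module, which wraps those calls nearly verbatim (file name is made up):

```python
import os

# open/write/close, the classic descriptor idiom.
fd = os.open("lowlevel.txt", os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
os.write(fd, b"hello\n")   # write(fd, buf) -> number of bytes written
os.close(fd)

# open/read/close to get it back.
fd = os.open("lowlevel.txt", os.O_RDONLY)
buf = os.read(fd, 1024)    # read(fd, count) -> bytes
os.close(fd)

print(buf)
```

Everything else -- streams, buffered readers, slurp functions -- is layered on top of roughly this.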
I had a course in grad school in which we wrote our own database system, including having to calculate the storage to disk... as in, calculate the sectors, and where you had to break fields to a new sector, and read/write the raw binary data to/from the hard disk.
I have not really had to re-learn file IO since then.
Clearly there are new things to be learned... (do usb sticks even have sectors?)... but having that low-level experience tends to be more than enough for a front-end web developer.
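That exercise can be sketched in a few lines of Python, assuming a 512-byte sector and an invented two-field record layout (both numbers are purely illustrative):

```python
import struct

SECTOR = 512
RECORD = struct.Struct("<i32s")     # id: int32 (4 bytes) + name: 32 raw bytes
PER_SECTOR = SECTOR // RECORD.size  # how many whole records fit per sector

def pack_record(rec_id: int, name: str) -> bytes:
    # Truncate/pad the name to exactly 32 bytes so every record is fixed-width.
    raw = name.encode("utf-8")[:32].ljust(32, b"\x00")
    return RECORD.pack(rec_id, raw)

row = pack_record(7, "alice")
rec_id, raw_name = RECORD.unpack(row)
print(RECORD.size, PER_SECTOR, rec_id, raw_name.rstrip(b"\x00"))
```

Once records are fixed-width, "which sector holds record n" is just integer division, which is the arithmetic the grad-school assignment was really teaching.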