I'm surprised he brought that up; it's a weird point. Because a different part of the brain processes the constructed language, it's no longer language? It's also untrue: we do construct language patterns, poetry for example, where the positional number of a word or a syllable changes its meaning.
That said, his point about the brain being wired to take the less computationally intensive route is a very important insight, one which I think extends beyond genetics to the evolution of all biological processes.
Actually, his point is that our language organ takes the more computationally expensive route, not the easier one. That's the puzzle.
I don't have his books on me, so I may misconvey this point, but IIRC he also mentions regular languages (the kind regexes match) as another example of a computationally "easier" language family that our language organ doesn't pick up. We don't speak arbitrary languages; the space of languages is filtered by genetics.
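To make the regular-languages contrast concrete, here's a minimal sketch (my own illustration, not an example from his books): a purely positional rule of the kind natural languages never use, decided trivially by a regex. The rule itself is made up for illustration.

```python
import re

# Hypothetical "impossible language" rule: a sentence is negated whenever
# its THIRD word is "not", regardless of syntactic structure. This rule is
# regular: a finite-state machine (here, a regex) decides it just by
# counting word positions, with no notion of phrase structure.
POSITIONAL_NEGATION = re.compile(r"^\S+\s+\S+\s+not\b")

def is_negated(sentence: str) -> bool:
    """True iff the third word of the sentence is 'not' (a linear rule)."""
    return bool(POSITIONAL_NEGATION.match(sentence))

print(is_negated("birds can not fly south"))      # True: third word is "not"
print(is_negated("the tired birds can not fly"))  # False: "not" is fifth
```

Natural-language negation, by contrast, attaches to hierarchical structure (which verb phrase it modifies), not to a counted linear position, even though the counting rule is computationally cheaper.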
He delves into this point more deeply in many places and addresses the points you raise with more precision than interviews allow.
I suppose it depends on your definition of 'expensive'. By that I mean: if your computational model is inherently serial, sure, regexes are cheap. If fuzzy matching and rough temporal correlation between processing units turn out to be cheap, perhaps regexes are a ridiculous extravagance on your hardware family.
I suppose you could turn it around: assuming our language processing is optimal (a bit of a leap), you can infer things about our hardware architecture from the languages we parse efficiently.