The explanation I’ve read in the past is that the difference comes from how VLC was originally focused on playing video streamed over a network (hence the name: VideoLAN Client). That was a much rockier experience a couple of decades ago, when the quality and bandwidth of the average connection were lower and the route between client and server could be highly indirect or pass through weak nodes.
In that environment, the goal is to play at all without constantly dropping out or buffering, and that’s more achievable if you fudge accuracy/correctness (which wouldn’t have been attainable under those conditions anyway).
Not sure how true that is, though; it’s not something I’ve ever verified.
This has been the case for a long time, going back to the MPlayer days in the early 2000s.
I'm not really sure why, though, since I believe both largely rely on the same underlying libraries for many codecs.