Your logs may be split across different services, and some of those services may be difficult to access. For example, if you're using a CDN, many hits won't go to your server.
Off-the-shelf analytics solutions have a ton of heuristics and in-house knowledge built up over decades that help deal with inherently janky data. For example, over half of all internet traffic these days is bots, and all modern analytics solutions know how to filter out bot traffic (to an acceptable extent); you would have to re-create that yourself or get heavily skewed data. As another example, inferring geography from IP: Google, Adobe, etc. maintain their own in-house databases for this sort of thing.
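To give a feel for what "re-create that" means, here's a minimal sketch of the kind of bot filtering you'd have to rebuild over a raw access log. The user-agent patterns and the Combined Log Format regex are assumptions for illustration; real vendors use far more signal than this (IP reputation, behavioral heuristics, whether JS actually executed).

    import re

    # Naive bot detection by user agent only; vendors layer many more
    # heuristics on top of something like this.
    BOT_UA = re.compile(r"bot|crawl|spider|slurp|curl|wget|python-requests", re.I)

    # Rough parser for a Combined Log Format line.
    LOG_LINE = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
        r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<ua>[^"]*)"'
    )

    def human_hits(path):
        """Yield parsed log entries whose user agent doesn't look like a bot."""
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                m = LOG_LINE.match(line)
                if m and not BOT_UA.search(m.group("ua")):
                    yield m.groupdict()

Even this toy version shows the problem: every pattern you don't know about leaks into your numbers, and the IP-to-geography lookup would be another database you have to source and keep current.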
Analytics solutions can be configured to send data for interactions that don't involve a page load. There are a lot of interactions that won't put anything into the logfile unless you re-invent tracking beacons. The situation gets more complicated for Single-Page Applications under either approach, but the solutions for off-the-shelf tools are better understood and more broadly applicable.
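As a rough sketch of what "re-invent tracking beacons" entails, below is a bare-bones collection endpoint that clicks, scroll events, or SPA route changes could be POSTed to, since none of those interactions produce a line in the web server's access log on their own. The /collect path and the payload fields are made up for this example.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class CollectHandler(BaseHTTPRequestHandler):
        # Minimal stand-in for a beacon collector: the client-side code
        # would send small JSON events here for anything that isn't a
        # full page load.
        def do_POST(self):
            if self.path != "/collect":
                self.send_response(404)
                self.end_headers()
                return
            length = int(self.headers.get("Content-Length", 0))
            event = json.loads(self.rfile.read(length) or b"{}")
            print("event:", event.get("type"), event.get("page"))
            self.send_response(204)  # beacons don't need a response body
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), CollectHandler).serve_forever()

And that's before you get to batching, retries, sessionization, and storage, which is exactly the plumbing the off-the-shelf tools already handle.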
For another thing, there's cost and ease of access. Google Analytics has a free version. And for businesses, many of the people who want data work in marketing or UX, not the sort of people with the expertise to dig around in logfiles. At the level of effort it would take to build a logfile tool or learn to dig through the data, GA would offer more benefit at lower cost.
For a solo dev or small team running a not-too-complicated website, logfile parsing may make sense. For a large-enough organization, a custom solution that addresses their specific needs may make sense. But aside from a few sweet spots, most off-the-shelf tools are going to deliver more benefit at lower cost. This is especially true of companies that have different teams with different needs and skills, and that benefit from integrations with other tools (like testing or personalization platforms).
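For that solo-dev sweet spot, the logfile approach really can be a one-off script. Here's a rough sketch of a pageview count straight out of an access log, assuming a common/combined format and skipping obvious asset requests; the filename and suffix list are arbitrary.

    from collections import Counter

    # Skip assets so the counts roughly mean "pages viewed", not "requests".
    ASSET_SUFFIXES = (".css", ".js", ".png", ".jpg", ".svg", ".ico", ".woff2")

    def pageview_report(logfile):
        views = Counter()
        with open(logfile, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                try:
                    request = line.split('"')[1]          # e.g. 'GET /about HTTP/1.1'
                    method, url, _ = request.split(" ", 2)
                except (IndexError, ValueError):
                    continue
                if method == "GET" and not url.lower().endswith(ASSET_SUFFIXES):
                    views[url.split("?")[0]] += 1
        return views

    if __name__ == "__main__":
        for url, count in pageview_report("access.log").most_common(20):
            print(f"{count:8d}  {url}")

That's plenty for a small site; it's the bot filtering, geography, non-pageview events, and cross-service stitching from the earlier points that make it stop scaling.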
Also, slightly less related... analytics load time is not always the main reason clicks go unrecorded. Most clicks on ads go through a chain of redirects through a half-dozen different services that set cookies, read cookies, attach data, increment counters, etc. There's a large window in which a user can cancel navigation before a request is ever made to the website. Analytics should run pretty fast, unless there is a long gap between when your server receives the initial request and when the user begins to receive HTML.
I understand the reasons better now, but it's unfortunate and slightly depressing to me that the community hasn't been able to come up with a better solution, particularly as the Google Analytics approach involves giving data up to a 3rd party for processing.
Google has an Analytics product so that website owners can see how much money they make from AdWords, thus encouraging them to spend more money on AdWords. GA is a strategic complement to Google's advertising services, and that is how they derive value from it.
Google does not dip into GA data for their own uses. Post-GDPR this is extra-explicit because they take pains to clarify how much they are not a Controller and only a Processor, but this has always been in their TOS. Frankly, most people's GA implementations are dumpster fires and trying to make use of that data would be more cost and less benefit than punching themselves in the dick.
Google does look at GA data on occasion, but only for the purposes of debugging and for ensuring compliance with the TOS (e.g. they will flag accounts that contain PII).