
> That's kind of the point though: being reasonably sure that a commit contains a tree that the committer had seen at some point, instead of making up history with commits that contain trees that the committer never saw at any point at all.

I don't see how this follows. Merge-heavy histories in my experience tend to be far less bisectable. They have all sorts of "oops, fixup" nonsense going on, precisely because the author did not take the time to get things right the first time.

Any workflow that involves more than one patch accepts poor bisectability as a risk. But the only real solution there is Giant Monolithic Commits, which we all agree is even worse, right?



Yeah if "merge-heavy" means "ship the reflog", I get what you mean.

But if "merge-heavy" means "use merges when it makes sense, use rebase when it makes sense", then you can get a nice history with `git log --first-parent` that groups related commits together, and also a nice history with `git log --cherry` that shows what the "always-rebase-never-merge" dogmatic people want.

If for a particular project merges happen not to make sense because of its specific needs, then so be it, nothing wrong with that. Same with rebases.

Unfortunately this topic is another holy war where the ship-the-reflog dogma fights against the always-rebase-never-merge dogma.

No balance.

> I don't see how this follows. Merge-heavy histories in my experience tend to be far less bisectable. They have all sorts of "oops, fixup" nonsense going on, precisely because the author did not take the time to get things right the first time.

That sounds more like merge-only (a.k.a. "ship the reflog"). Doesn't have to be that way.

Evaluate trade-offs and choose based on that evaluation.

Does adding a new commit have any actual advantage (e.g. easily reverting one or the other) compared to just amending/squashing it, or is it just some developer's own subjective sense of purity?

Does re-ordering the commits have any actual advantage (e.g. change has a smaller context and can be more easily reverted that way) compared to just leaving those commits in that order, or is it just some developer's own subjective sense of aesthetics?

Does using merge commits bring any actual advantage (e.g. the project benefits from being able to bisect on PRs or features as a whole) compared to rebasing (not fast-forwarding), or is it just some developer's own subjective sense of purity?

Does rebasing bring any actual advantage (e.g. each commit is already atomic, fully self-contained, and well tested against the new base, so "grouping" them with a merge commit doesn't make sense) compared to doing a merge-commit, or is it just some developer's own subjective sense of aesthetics?
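As a concrete sketch of the "amend/squash" option from the first question above (the hash `abc1234` and the base `origin/main` are placeholders): record the fix as a fixup of the commit it repairs instead of leaving it as a standalone "oops" commit, then fold it in before publishing.

    # Mark the fix as belonging to an earlier commit
    git commit --fixup=abc1234

    # Squash all fixup commits into their targets (and reorder if needed)
    git rebase -i --autosquash origin/main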

> Any workflow that happens on a number of patches greater than 1 accepts poor bisectability as a risk.

Poor bisectability, or developers putting actual effort into ensuring commits are atomic and testing them.

Bisectability is nice with good rebased commits. Bisectability is nice with good merge commits.

Bisectability is bad when developers don't care about keeping bisectability good.
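One way to put in that effort, sketched with `make test`, `origin/main`, and `v1.2.3` as placeholders: run the test suite against every commit in the series before it lands, and when merges are in play, bisect along the first-parent chain first.

    # Replay the branch onto its target base, running the tests after every commit
    git rebase --exec "make test" origin/main

    # Hunt a regression merge-by-merge first (needs a reasonably recent Git, 2.29+),
    # then descend into the offending merge if necessary
    git bisect start --first-parent
    git bisect bad HEAD
    git bisect good v1.2.3
    git bisect run make test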

> But the only real solution there is Giant Monolithic Commits, which we all agree is even worse, right?

It depends.

Those commits might not be easy to understand, but they sure as hell are easy to revert (more likely than not) if something goes wrong, because they tend to correspond almost 1:1 to GitHub issues (or Jira tickets, or whatever equivalent). The keyword is "almost", because sometimes you get 2 of those for the same issue/ticket/whatever.

But those 2 improperly split (and therefore huge) commits are still easier to revert than a spray of 10 improperly rebased tiny commits where 9 of them are broken (because of what I mention in other comments about people only testing HEAD).
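For the revert case specifically (the commit references below are placeholders): an entire merged feature, or one big commit, comes back out with a single revert, while undoing a sprayed series means reverting a whole range.

    # Revert a merged feature in one go (-m 1 keeps the mainline side of the merge)
    git revert -m 1 <merge-commit>

    # Undoing a series of small commits means reverting the whole range instead
    git revert <oldest-commit>^..<newest-commit>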


Too much text. But what I will say is that the "good" merge workflow you posit really only exists in one place (Linux) and requires a feudal hierarchy of high-value maintainers individually enforcing all the rules through personal virtuosity. I've never seen it scale to a "typical" project run by managers and processes.

Whereas the straightforward "get your stuff rebased into a linear tree" approach tends to work pretty well in practice. The rules are simpler, and easier to audit and enforce.



