• cbarrick@lemmy.world · 7 months ago

    I do wonder if jumping back by twice as much is optimal.

    It is. This is called exponential search, which maintains the O(log(n)) time complexity of binary search.

    (Where n is the distance from HEAD to the commit you are searching for, not the total length of the history.)
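
    For reference, here is a minimal sketch of what that looks like for a bisect-style hunt, assuming a hypothetical is_bad(offset) predicate that tests the commit `offset` steps behind a bad HEAD, and a single good-to-bad transition somewhere in the history:

    ```python
    def find_first_bad_offset(is_bad):
        """Exponential (galloping) search for the commit that introduced a bug.

        is_bad(offset) is a hypothetical predicate: True if the commit `offset`
        steps behind HEAD is bad. Assumes HEAD (offset 0) is bad and that the
        history is monotone: bad up to some offset k, good beyond it.
        """
        # Phase 1: jump back by doubling offsets until we land on a good commit.
        bound = 1
        while is_bad(bound):
            bound *= 2

        # Phase 2: binary search between the last bad offset and the good one.
        bad, good = bound // 2, bound   # is_bad(bad) holds (or bad == 0 == HEAD)
        while good - bad > 1:
            mid = (bad + good) // 2
            if is_bad(mid):
                bad = mid
            else:
                good = mid
        return bad                      # offset of the commit that introduced the bug
    ```

    Both the doubling phase and the bisection phase probe O(log n) commits, so the total number of builds stays logarithmic in how far back the bug was introduced rather than in the full length of the history.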

    • kevincox · edited 7 months ago

      You seem to be talking about binary search, but this is a search with an unbounded end.

      I think the truly optimal approach would be to just take the first commit and bisect from there. But in practice this has downsides:

      1. Bugs that you are tracking down are typically biased towards recent commits.
      2. Older commits may not even be relevant to the bug (e.g. before the feature was introduced).
      3. You may have trouble building older commits for various reasons.

      So I think “optimal” is sort of fuzzy in this context, but I’m sure you could model these variables and derive a strategy that is optimal for a given model. I haven’t gotten around to doing that yet, though.
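
      Not an actual model, but a rough sketch of the kind of comparison such a model could make, assuming (purely for illustration) a 100k-commit history and bug offsets that are exponentially distributed around 50 commits behind HEAD; probes_full_bisect and probes_exponential are my own back-of-the-envelope cost estimates:

      ```python
      import math
      import random

      def probes_full_bisect(history_len):
          # Plain bisect over the whole history: ~log2(history_len) tests,
          # regardless of where the bug actually is.
          return math.ceil(math.log2(history_len))

      def probes_exponential(bug_offset):
          # Doubling jumps to bracket the bug, then bisecting that bracket:
          # roughly 2 * log2(bug_offset) tests.
          return 2 * max(1, math.ceil(math.log2(bug_offset + 1)))

      random.seed(0)
      history_len = 100_000
      # Assumed recency bias: bug offsets drawn from an exponential
      # distribution with a mean of ~50 commits behind HEAD.
      samples = [min(history_len - 1, int(random.expovariate(1 / 50)) + 1)
                 for _ in range(10_000)]
      print("full bisect:", sum(probes_full_bisect(history_len) for _ in samples) / len(samples))
      print("exponential:", sum(probes_exponential(o) for o in samples) / len(samples))
      ```

      Under that (made-up) recency bias the exponential strategy averages noticeably fewer test builds than bisecting the whole history; with a uniform bug distribution the comparison would tilt the other way.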