Basically, where my understanding is at the moment is "what is going on here, and how did this never confuse me before?".
```go
func (r Result) IsIgnored() bool {
	return r&resultInclude == resultInclude
}
```
So the include bit means ignored?
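To make the confusion concrete, here is a stripped-down reproduction of that semantics (with `Result` simplified to a plain int constant, not the actual types from the ignore package):

```go
package main

import "fmt"

// Simplified stand-ins for the real types; the actual Result
// carries more bits than just this one.
type Result int

const resultInclude Result = 1 << 0

// Same logic as the IsIgnored above: the bit named "include"
// is exactly what makes a result count as ignored.
func (r Result) IsIgnored() bool {
	return r&resultInclude == resultInclude
}

func main() {
	plain := resultInclude // pattern without "!" prefix
	negated := Result(0)   // pattern with "!" prefix

	fmt.Println(plain.IsIgnored())   // true: "include" bit set => ignored
	fmt.Println(negated.IsIgnored()) // false
}
```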
```go
func (p Pattern) String() string {
	[...]
	if p.result&resultInclude != resultInclude {
		ret = "!" + ret
	}
	[...]
```
That's consistent with the above, in the sense that no include bit means the `!` prefix, i.e. the file is actually included - so again totally reversed?
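A runnable sketch of that round-trip behaviour, again with hypothetical simplified types (the real `Pattern` also holds the compiled glob and more):

```go
package main

import "fmt"

type Result int

const resultInclude Result = 1 << 0

type Pattern struct {
	pattern string
	result  Result
}

// Same shape as the String() above: only patterns *without*
// the include bit get the "!" prefix printed back.
func (p Pattern) String() string {
	ret := p.pattern
	if p.result&resultInclude != resultInclude {
		ret = "!" + ret
	}
	return ret
}

func main() {
	fmt.Println(Pattern{pattern: "*.tmp", result: resultInclude}) // *.tmp
	fmt.Println(Pattern{pattern: "keep.me", result: 0})           // !keep.me
}
```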
```go
defaultResult := resultInclude
[...]
addPattern := func(line string) error {
	pattern := Pattern{
		result: defaultResult,
	}
	[...]
	for {
		if strings.HasPrefix(line, "!") && !seenPrefix[0] {
			seenPrefix[0] = true
			line = line[1:]
			pattern.result ^= resultInclude
[...]
```
It has since become clear to me why the XOR is used here: given that the default result already has the include bit set, XORing on a `!` prefix clears it, so the include bit is not set exactly when the `!` prefix is present - again consistent.
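A minimal sketch of that parsing step (the `parseLine` helper is my own illustration, not the actual code; the real `addPattern` handles more prefixes):

```go
package main

import (
	"fmt"
	"strings"
)

type Result int

const resultInclude Result = 1 << 0

// parseLine mimics the step above: start from the default (include
// bit set) and XOR the bit away when the line carries a "!" prefix.
func parseLine(line string) (string, Result) {
	result := resultInclude // defaultResult
	if strings.HasPrefix(line, "!") {
		line = line[1:]
		result ^= resultInclude // clears the bit, since it was set
	}
	return line, result
}

func main() {
	_, plain := parseLine("*.tmp")
	_, negated := parseLine("!keep.me")
	fmt.Println(plain&resultInclude == resultInclude)   // true
	fmt.Println(negated&resultInclude == resultInclude) // false
}
```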
Basically my confusion is why `resultInclude` is called that when it is set precisely when there is no `!` prefix. Is there any reason not to do a simple `s/resultInclude/resultIgnore/`?
To my current understanding, this naming led me to implement the ignore recursion wrongly:
```go
m.skipIgnoredDirs = true
for _, p := range patterns {
	if p.result&resultInclude == resultInclude {
		m.skipIgnoredDirs = false
		break
	}
}
```
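Assuming the bit really is set for ignored (non-`!`) patterns, the loop above asks "is any pattern ignored" rather than "is any pattern negated". A hedged sketch of what I believe was intended, with a hypothetical `anyNegated` helper standing in for the loop:

```go
package main

import "fmt"

type Result int

const resultInclude Result = 1 << 0

type Pattern struct{ result Result }

// anyNegated reports whether any pattern carries a "!" prefix,
// i.e. has the include bit *unset* under the naming discussed above.
func anyNegated(patterns []Pattern) bool {
	for _, p := range patterns {
		if p.result&resultInclude != resultInclude {
			return true
		}
	}
	return false
}

func main() {
	onlyIgnores := []Pattern{{resultInclude}, {resultInclude}}
	withNegation := []Pattern{{resultInclude}, {0}} // second one had "!"

	// skipIgnoredDirs should only stay true when nothing is negated,
	// since a "!" pattern may match inside an otherwise ignored dir.
	fmt.Println(!anyNegated(onlyIgnores))  // true: safe to skip
	fmt.Println(!anyNegated(withNegation)) // false: must not skip
}
```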
So this isn't entirely academic, and I'd appreciate some light shed on it. Git tells me @AudriusButkevicius introduced the bitmask and @calmh touched it later too (as did I, without noticing) - I hope someone can enlighten me.