Best flag values to use (tips and good practice)

While learning the flags technique I ran into a problem, so here is an example in C# using an enum:

    [Flags]
    enum PermissionTypes : byte
    {
        None   = 0x0,
        Read   = 0x1,
        Write  = 0x2,
        Modify = 0x4,
        Delete = 0x8,
        Create = 0x10,
        All    = Read | Write | Modify | Delete | Create
    }

To check whether a flag is set, I use:

 if((value & mask) == mask) {...} 
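As a minimal sketch of how this mask check behaves, assuming the PermissionTypes definition above (the helper name HasAll is mine, chosen to avoid clashing with the built-in Enum.HasFlag):

```csharp
using System;

[Flags]
enum PermissionTypes : byte
{
    None = 0x0, Read = 0x1, Write = 0x2,
    Modify = 0x4, Delete = 0x8, Create = 0x10,
    All = Read | Write | Modify | Delete | Create
}

class Program
{
    // Generic mask check: true when every bit of mask is present in value.
    static bool HasAll(PermissionTypes value, PermissionTypes mask)
        => (value & mask) == mask;

    static void Main()
    {
        var perms = PermissionTypes.Read | PermissionTypes.Write;

        Console.WriteLine(HasAll(perms, PermissionTypes.Read));   // True
        Console.WriteLine(HasAll(perms, PermissionTypes.Delete)); // False

        // Built-in equivalent (boxes the enum on older runtimes):
        Console.WriteLine(perms.HasFlag(PermissionTypes.Write));  // True
    }
}
```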

But look at what this check gives for "None" versus "Read":

 Denote x = Current_Permission_Setting. Then:

     x & PermissionTypes.None is always 0 (an AND with zero yields zero),
     x & PermissionTypes.Read is nonzero if and only if x is an odd byte value.


Question: What is an ideal set of flag values that can be used safely?


Help: here is a complete example.

2 answers

As @antlersoft notes in the comments, you cannot use NONE as a bitflag.
The rest of the flags make sense. You need to use powers of 2 to get 1 bit per flag.

In any case, it makes no sense to test for "READ and NONE" together, since NONE means that READ is not set.

The problem you describe with odd values of x is not a real problem. The value will be odd exactly when the 1 bit is set, and if you use your flags consistently, that bit will only be set when READ is true.
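A small sketch of this answer's point, assuming the PermissionTypes enum from the question: the generic mask check degenerates for None (0 == 0 is always true), so test "no permissions" with plain equality instead.

```csharp
using System;

[Flags]
enum PermissionTypes : byte
{
    None = 0x0, Read = 0x1, Write = 0x2,
    Modify = 0x4, Delete = 0x8, Create = 0x10,
    All = Read | Write | Modify | Delete | Create
}

class Program
{
    static void Main()
    {
        var perms = PermissionTypes.Read | PermissionTypes.Write;

        // The mask check always "passes" for None, which is why None
        // cannot be used as a bit flag:
        Console.WriteLine((perms & PermissionTypes.None) == PermissionTypes.None); // True

        // Test for "no permissions at all" with equality instead:
        Console.WriteLine(perms == PermissionTypes.None); // False
    }
}
```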


x & PermissionTypes.Read checks whether the lowest bit of x is set, which by itself does not mean anything, as you found. A programmer can interpret it in several ways:

  • x has the Read flag set, e.g. via x = Read or x = x | Read
  • x is odd (more precisely, it leaves remainder 1 when dividing by 2^1), just as x & 3 gives the remainder when dividing by 2^2.

Flags representing individual bit values are usually used as in the example you showed. The programmer must decide what meaning to assign to each flag.
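The two interpretations above coincide numerically; a quick sketch (the value 13 is just an illustrative example):

```csharp
using System;

class Program
{
    static void Main()
    {
        byte x = 13; // binary 1101: "Read" bit set, and also simply an odd number

        // Interpretation 1: bit 0 is a flag (e.g. Read).
        bool hasLowBit = (x & 0x1) != 0;

        // Interpretation 2: bit 0 is the remainder modulo 2.
        bool isOdd = (x % 2) == 1;

        Console.WriteLine(hasLowBit == isOdd); // True: same bit, two meanings

        // Likewise, x & 3 is the remainder when dividing by 2^2 = 4.
        Console.WriteLine((x & 3) == (x % 4)); // True
    }
}
```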

Another example of using flags is checking whether a negative value was passed to a method taking an integer, by testing the "sign" bit:

    [Flags]
    enum StrangeFlags : uint
    {
        Negative = 0x80000000,
    }

    int x = -1;
    var isNegative = ((StrangeFlags)x & StrangeFlags.Negative) != 0;
