How to represent an integer array using java.util.BitSet?

I need to represent an array of integers using a BitSet. Can someone explain the logic I need for this?

+4
3 answers

You can represent a set of integers using a BitSet, but not an arbitrary array. You will lose information about the order and repetitions.

Basically, set the nth bit of the BitSet if and only if n appears in your set of integers. (Note that BitSet indices must be non-negative, so this only works for non-negative values; BitSet.set throws IndexOutOfBoundsException for a negative index.)

    import java.util.BitSet;

    BitSet bitSet = new BitSet();
    int[] setOfInts = new int[] { /* Your array here */ };
    for (int n : setOfInts) {
        bitSet.set(n);  // mark the value n as present in the set
    }
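To read the set back out (not part of the original answer, just a usage sketch), the standard nextSetBit loop from the BitSet javadoc works; it visits each distinct value in ascending order:

    // Iterate over the distinct values stored in the BitSet.
    for (int n = bitSet.nextSetBit(0); n >= 0; n = bitSet.nextSetBit(n + 1)) {
        System.out.println(n);
        if (n == Integer.MAX_VALUE) {
            break; // avoid overflow in n + 1 (per the BitSet javadoc)
        }
    }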
+4

First thought:
Use BigInteger. Create one with BigInteger.valueOf(value) and call toString(2) to get the binary representation as a String. Then you can build a BitSet from that String (I don't know how to do this without parsing the string, however).

Edit: I did not read the question correctly. This approach would give you an array of BitSets, one per element, not a single BitSet containing the entire array.
I do not know how to fit an array of integers into one BitSet. I think you would need some markup, but how to choose a good delimiter in binary format is a good question.
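For what it's worth, a minimal sketch of that first thought, going through the binary string. The helper name toBitSet is my own, and it assumes a non-negative value (toString(2) prepends a minus sign for negative numbers, which this does not handle):

    import java.math.BigInteger;
    import java.util.BitSet;

    static BitSet toBitSet(int value) {
        String bits = BigInteger.valueOf(value).toString(2); // e.g. 10 -> "1010"
        BitSet bitSet = new BitSet(bits.length());
        for (int i = 0; i < bits.length(); i++) {
            // charAt(0) is the most significant bit, so map it to the highest index
            if (bits.charAt(i) == '1') {
                bitSet.set(bits.length() - 1 - i);
            }
        }
        return bitSet;
    }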

0

I think the logic would be: loop over the integer array, test each bit of each element, and set the corresponding bit in the BitSet, e.g. bitset.set(array_pos * 32 + bit_pos). (The array position must be scaled by the element width, otherwise bits from different elements would collide.) A sketch of this follows below.
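A minimal sketch of this bit-by-bit packing, assuming a fixed width of 32 bits (Integer.SIZE) per element, so that element i occupies bit positions i*32 through i*32+31. The fixed width is also what makes the delimiter question from the previous answer moot, and the method names pack and unpack are my own:

    import java.util.BitSet;

    // Element i of the array occupies bit positions [i * 32, i * 32 + 31].
    static BitSet pack(int[] values) {
        BitSet bitSet = new BitSet(values.length * Integer.SIZE);
        for (int i = 0; i < values.length; i++) {
            for (int bit = 0; bit < Integer.SIZE; bit++) {
                if ((values[i] & (1 << bit)) != 0) {
                    bitSet.set(i * Integer.SIZE + bit);
                }
            }
        }
        return bitSet;
    }

    // The original length must be passed in, because trailing zero bits
    // are not recoverable from the BitSet alone.
    static int[] unpack(BitSet bitSet, int length) {
        int[] values = new int[length];
        for (int i = 0; i < length; i++) {
            for (int bit = 0; bit < Integer.SIZE; bit++) {
                if (bitSet.get(i * Integer.SIZE + bit)) {
                    values[i] |= 1 << bit;
                }
            }
        }
        return values;
    }

Because the encoding is plain two's complement, this round-trips negative values as well, e.g. unpack(pack(new int[] { 3, -7, 42 }), 3) returns the original array.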

0
