Practical applications of confluent persistence

I just read Purely Functional Worst Case Constant Time Catenable Sorted Lists by Brodal et al., and their introduction to the various kinds of persistence in the context of data structures leaves me with an obvious question:

Confluent persistence: all versions can be both queried and updated, and additionally, two versions can be combined to create a new version. Note that in this case it is possible to create a structure of exponential size in polynomial time, by repeatedly concatenating it with itself.

What are the practical applications of being able to create an exponential-size structure in polynomial time by repeatedly concatenating it with itself?
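To make the quoted claim concrete, here is a minimal Python sketch (the names `Cat` and `node_count` are my own, not from the paper) of a persistent sequence whose concatenation shares both arguments: n self-concatenations produce a sequence of logical length 2**n while allocating only n + 1 nodes.

```python
class Cat:
    """Node of a persistent catenable sequence; concatenation shares subtrees."""
    def __init__(self, left=None, right=None):
        self.left, self.right = left, right
        # A leaf counts as one element; an internal node sums its halves.
        self.length = 1 if left is None else left.length + right.length

def node_count(t, seen=None):
    """Distinct nodes actually allocated (shared nodes counted once)."""
    seen = set() if seen is None else seen
    if id(t) in seen:
        return 0
    seen.add(id(t))
    if t.left is None:
        return 1
    return 1 + node_count(t.left, seen) + node_count(t.right, seen)

s = Cat()                      # a single element
for _ in range(50):
    s = Cat(s, s)              # concatenate the version with itself

print(s.length)                # 2**50 = 1125899906842624 logical elements
print(node_count(s))           # but only 51 distinct nodes in memory
```

Each concatenation allocates one node and doubles the logical length, which is exactly the exponential-size-in-polynomial-time effect the paper mentions.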

+4
3 answers

Here is one usage example. Imagine a general-purpose "sequence" data type being used as an array in a situation where the array is expected to be sparse (i.e., most elements hold the same value, with a relatively small number of positions set to some other value). If the sequence type has this property, you can build a (possibly very large) array using the technique you mention, and the space and time used will still be reasonable.
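A Python sketch of that sparse-array idea (the names `repeat`, `get`, and `set_` are hypothetical, not any particular library's API): the whole array starts as one repeatedly shared subtree, and each point update copies only the root-to-leaf path, so an array of logical length 2**30 costs only O(log n) new nodes per element that differs from the default.

```python
class Node:
    """Persistent sequence node; both children of a doubled node are shared."""
    def __init__(self, left=None, right=None, value=None):
        self.left, self.right, self.value = left, right, value
        self.length = 1 if left is None else left.length + right.length

def repeat(value, n_doublings):
    """Sequence of 2**n_doublings equal elements, using n_doublings + 1 nodes."""
    t = Node(value=value)
    for _ in range(n_doublings):
        t = Node(t, t)            # both halves share the same subtree
    return t

def get(t, i):
    if t.left is None:
        return t.value
    if i < t.left.length:
        return get(t.left, i)
    return get(t.right, i - t.left.length)

def set_(t, i, v):
    """Path-copying update: returns a new version, copies O(log n) nodes."""
    if t.left is None:
        return Node(value=v)
    if i < t.left.length:
        return Node(set_(t.left, i, v), t.right)
    return Node(t.left, set_(t.right, i - t.left.length, v))

arr = repeat(0, 30)               # logical length 2**30, only 31 nodes
arr2 = set_(arr, 12345, 7)        # new version; ~31 freshly copied nodes
print(get(arr2, 12345), get(arr2, 12346), get(arr, 12345))  # 7 0 0
```

Note that the old version `arr` is still fully usable after the update, so this is persistent in exactly the sense the question asks about.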

Of course, you could design a special-purpose data type just for sparse arrays, and it would probably be somewhat more efficient in space and time than the general-purpose one. But the point is that the general-purpose data type is well enough suited to this usage pattern that you might not even need to build the special-purpose one.

(Admittedly, this example is about confluent persistence in general, not about the "sorted lists" of the paper. But then, the paper makes that remark about confluent persistence in general too, not specifically about its own data structure.)

+11

I have not read the paper, but given the definition of confluent persistence you quoted, I can relate it to a data structure such as the binomial heap, where the merge operation runs in logarithmic time and is well suited to union-based algorithms. A binomial tree of order k has 2^k nodes, so the trees grow exponentially, a binomial heap can contain several of them, and any two heaps can be merged in polynomial (indeed logarithmic) time. Performing union operations is the best application I can think of for such data structures.
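A minimal, non-destructive binomial-heap merge in Python (my own sketch, not code from the paper): `link` builds a new tree rather than mutating its arguments, so both input heaps remain valid versions after the merge, which is the confluent flavour of persistence this answer alludes to.

```python
class Tree:
    """Binomial tree: a tree of order k has exactly 2**k nodes."""
    def __init__(self, key, order=0, children=()):
        self.key, self.order, self.children = key, order, tuple(children)

def link(a, b):
    """Link two trees of equal order into one of order + 1 (min-heap order)."""
    if b.key < a.key:
        a, b = b, a
    return Tree(a.key, a.order + 1, a.children + (b,))  # inputs untouched

def merge(h1, h2):
    """Merge two heaps (lists of trees) with O(log n) links, like binary addition."""
    buckets, stack = {}, list(h1) + list(h2)
    while stack:
        t = stack.pop()
        if t.order in buckets:
            stack.append(link(buckets.pop(t.order), t))  # "carry" the linked tree
        else:
            buckets[t.order] = t
    return [buckets[k] for k in sorted(buckets)]

h = []
for k in [5, 3, 9, 2, 7]:
    h = merge(h, [Tree(k)])    # insert = merge with a singleton heap
print(min(t.key for t in h))   # the minimum sits at one of the roots: 2
```

Since nothing is mutated, two old heap versions can be merged into a new version while both originals stay queryable, matching the definition quoted in the question.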

+1

Your question suggests the paper claims this property is an advantage of data structures with this kind of persistence. Looking at the paper, I don't think that is what the authors intended; it is merely a somewhat counterintuitive property of such data structures that they wanted to point out for the reader's interest and amusement.

As a further note, the exponential size the authors talk about is, I think, the "mathematical" size (the number of nodes in an abstract sense), not the size in memory: it would be impossible to access or write an exponential number of memory locations in polynomial time!

0
