Can I add a large number of DOM nodes without freezing the browser?

I have a web page on my site that displays a table, reloads the underlying XML data every 10 seconds (using XmlHttpRequest), and then updates the table to show the user any additions or deletions. To do this, the JavaScript function first clears all the rows from the table and then adds a new row for each data item.

Recently, I came across several memory leaks in Internet Explorer caused by this destroy-and-recreate DOM code (most of them related to circular references between JavaScript objects and DOM objects, and to the JavaScript library we use silently keeping a reference to every JS object created with new Element(...) until the page is unloaded).

Having solved the memory problems, we have now uncovered a CPU problem: when the user has a large amount of data to view (100+ data items, meaning 100+ <tr> nodes to create, plus all the table cells for each column), the process ties up the CPU until Internet Explorer prompts the user:

Stop running this script?
A script on this page is causing Internet Explorer to run slowly. If it continues to run, your computer might become unresponsive.

It seems that when the code runs, creating the rows and cells for 100+ data items drives CPU usage up and makes the function take "too long" to run (from IE's point of view), which leads IE to show this warning to the user. I also noticed that while the "refresh table" function is running over 100 rows, IE does not repaint the table's contents until the function completes (presumably because the JS interpreter is using 100% of the CPU for that period).

So my question is: is there a way, from JavaScript in the browser, to pause the script and let the DOM re-render? If not, are there strategies for creating a large number of DOM nodes without making the browser stutter?

One approach I can think of is to handle the "update table" logic asynchronously: as soon as the Ajax call that reloads the XML data completes, put the data into an array, then set up a function (using setInterval()) that processes one element of the array at a time. But that feels like reinventing threads in a JavaScript environment, which seems very complicated (for example, what if another Ajax response arrives while I'm still recreating the table's DOM nodes?), etc.
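For what it's worth, here is a minimal sketch of that idea. The names (onXmlLoaded, pendingRows, dataBody, item.name) are made up for illustration, not from my actual code; when a new Ajax response arrives, it simply replaces the queue, so a half-finished render is abandoned rather than mixed with the new data (clearing the old rows is omitted here):

    // Rough sketch; names are illustrative only.
    var pendingRows = [];
    var renderTimer = null;

    function onXmlLoaded(items) {          // called once the Ajax reload has been parsed
      pendingRows = items.slice(0);        // replace the queue: abandon any half-finished render
      if (renderTimer === null) {
        renderTimer = setInterval(renderNextRow, 1);
      }
    }

    function renderNextRow() {
      if (pendingRows.length === 0) {      // queue drained: stop the timer
        clearInterval(renderTimer);
        renderTimer = null;
        return;
      }
      var item = pendingRows.shift();
      var tr = document.createElement('tr');
      var td = document.createElement('td');
      td.appendChild(document.createTextNode(item.name));   // item.name is a made-up field
      tr.appendChild(td);
      document.getElementById('dataBody').appendChild(tr);  // dataBody is a <tbody>
    }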


Update: I just wanted to explain why I accepted RoBurg's answer. While doing some testing, I found that the new Element() method of my framework (I use mootools) is about twice as slow as the traditional document.createElement() in IE7. I ran a test that creates 1000 <span>s and appends them to a <div>: using new Element() takes about 1800 ms in IE7 (running in Virtual PC), while the traditional method takes about 800 ms.

My test also showed an even faster method, at least for a simple test like mine: using DocumentFragments as described by John Resig. Running the same test on the same IE7 machine took 247 ms, a 9x improvement over my original approach!
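For reference, the DocumentFragment variant of the test looked roughly like this (a sketch, not the exact benchmark code; testDiv is a made-up id):

    // Build all 1000 spans into a detached fragment, then insert once.
    var container = document.getElementById('testDiv');
    var frag = document.createDocumentFragment();
    for (var i = 0; i < 1000; i++) {
      var span = document.createElement('span');
      span.appendChild(document.createTextNode('item ' + i));
      frag.appendChild(span);              // no live-DOM work yet
    }
    container.appendChild(frag);           // a single insertion into the live document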

+1
5 answers

100 <tr>s really isn't that many... are you still using that framework's new Element()? That might be the cause.

You should check the speed of new Element() vs document.createElement() vs .innerHTML

Also try building the tree in memory first, then appending it to the document at the end.

Finally, watch that you aren't reading .length (or similar properties) more often than you need to.
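For example, something like this (just an illustration): re-reading a live collection's length on every pass tends to be more expensive in old IE than caching it once.

    var cells = document.getElementsByTagName('td');   // a live collection
    // Slower: cells.length is re-read from the live DOM on every iteration
    for (var i = 0; i < cells.length; i++) {
      // ...
    }
    // Faster: read it once into a local variable
    for (var j = 0, n = cells.length; j < n; j++) {
      // ...
    }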

+6

I've had similar problems rendering table rows of complex data, so there may be something not quite right with your code. How does it behave in Firefox? It's worth checking a few different browsers.

Are you attaching to onPropertyChange anywhere? That is a really dangerous event that has caused me serious headaches. Do you use CSS selectors anywhere? They are known to be slow.

+1

You can build a string representation of the rows and assign it to the node's innerHTML.
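Something like this, for example (items and the ids are made up; note that older IE treats innerHTML on <table>/<tbody>/<tr> as read-only, so the string here rebuilds the whole table inside a wrapper <div>):

    var html = ['<table><tbody>'];
    for (var i = 0; i < items.length; i++) {
      // values should be HTML-escaped before being concatenated
      html.push('<tr><td>', items[i].name, '</td><td>', items[i].value, '</td></tr>');
    }
    html.push('</tbody></table>');
    document.getElementById('tableWrap').innerHTML = html.join('');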

+1

You can cloneNode(false) the repeating element and then clone that inside your loop instead of creating a new element every time.
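Roughly like this (items and dataBody are illustrative):

    // Create the prototype elements once, then clone them in the loop.
    var trTemplate = document.createElement('tr');
    var tdTemplate = document.createElement('td');
    var tbody = document.getElementById('dataBody');

    for (var i = 0; i < items.length; i++) {
      var row = trTemplate.cloneNode(false);     // shallow clone: an empty <tr>
      var cell = tdTemplate.cloneNode(false);    // shallow clone: an empty <td>
      cell.appendChild(document.createTextNode(items[i].name));
      row.appendChild(cell);
      tbody.appendChild(row);
    }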

0

Avoid using one big loop to render all the data in a single step. Look at breaking the data into smaller chunks.

Process each smaller chunk with a while loop. After a chunk finishes, kick off the next one with setTimeout (a delay of 1 ms is enough) to give the browser a "breather".

You could even skip the while loop altogether and just use setTimeout.

Using setTimeout will slow the overall rendering down slightly, but you should no longer get the warning message.

Another thing: don't add each element to the document individually. Look at creating a new tbody, adding the new rows to that tbody, and then adding the tbody to the table (the sketch below combines this with the chunked setTimeout approach).
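A minimal sketch putting those two ideas together (CHUNK_SIZE, items and the row fields are illustrative, and removing the old tbody is omitted):

    var CHUNK_SIZE = 25;                           // rows rendered per timer callback

    function renderTable(items, table) {
      var tbody = document.createElement('tbody'); // detached: rows are built off-document
      var index = 0;

      function buildRow(item) {
        var tr = document.createElement('tr');
        var td = document.createElement('td');
        td.appendChild(document.createTextNode(item.name));
        tr.appendChild(td);
        return tr;
      }

      function renderChunk() {
        var end = Math.min(index + CHUNK_SIZE, items.length);
        while (index < end) {                      // one small chunk per timer callback
          tbody.appendChild(buildRow(items[index]));
          index++;
        }
        if (index < items.length) {
          setTimeout(renderChunk, 1);              // yield so the browser gets a breather
        } else {
          table.appendChild(tbody);                // attach the finished tbody in one step
        }
      }

      renderChunk();
    }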

There are many other things that can make an application slow; it can be fun tracking them all down.

There's a good write-up of what triggers that warning here: http://ajaxian.com/archives/what-causes-the-long-running-script-warning

0
