Optimizing React Table Rendering By 160x !!!

React is “generally” a performant framework. Notice the quotes on “generally”: at times in React land you do feel very limited, because all the re-rendering leads to performance issues when you are trying to create a large, complicated component with a lot of moving parts and data being manipulated left and right. Creating a performant UI in all of that becomes a challenge, and if you don’t have the right tools and knowledge you can quickly shoot yourself in the foot. But for the most part React is pretty quick and reliable, hence the popularity.

The Issue

One of the issues I faced a while back when I was working at my company was that I was supposed to render a table with a very large number of rows and columns. For example, let’s assume we are working with a 2000 x 200 (rows x cols) table.

Now if we calculate how many DOM nodes are needed for a table of that size, it comes out to around 400,000 DOM nodes.
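As a quick sanity check, counting just one DOM node per cell (the real number is higher once you add row wrappers and whatever markup lives inside each cell):

```javascript
// One node per cell alone, for a 2000 x 200 table:
const rows = 2000;
const cols = 200;
console.log(rows * cols); // 400000 cell nodes, before <tr> wrappers or cell contents
```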

What are DOM nodes?
Your web browser parses HTML documents and builds a Document Object Model (DOM) that it can easily understand and manage. In this format, it can be easily understood — and modified — by a scripting language such as JavaScript.

Google recommends that the DOM:

Have fewer than 1,500 nodes in total
Have a maximum depth of 32 nodes
Have no parent node with more than 60 child nodes

Now, since we were dealing with a roughly 400,000-node DOM tree, you can imagine what would be going on.

SURPRISE SURPRISE!!

Well, for those who are new to reading Chrome DevTools memory graphs, let me explain: the graph above shows the non-stop growth of the heap size (blue line), and the green line shows the growth of the number of DOM nodes. Both show that the app was well over the safe limits, which were somewhere around 1,500 nodes and a max depth of 32.

The result of all that was that our application, which usually runs fine with 100 MB to 200 MB of RAM in Chrome, required a full 4.9 GB to 5.2 GB !!! just to render that one page with the table.

I was barely able to record the memory usage, because most of the time the page would just crash and wouldn’t let me do anything with it.

Why so much memory for DOM nodes?

Well, everything and anything in your system requires some amount of memory to run. In the case of HTML DOM nodes, the average size of a node depends on the average number of bytes each one uses to hold its content, such as UTF-8 text, attribute names and values, or cached information.

Imagine a smartphone with 4 GB of RAM that typically uses 3 GB for standard operations, leaving about 1 GB of memory for the Document Object Model (DOM). With that budget, we can estimate the worst-case number of nodes from the average memory used per node:

2 bytes per character for 40 characters of inner text per node
2 bytes per character for 4 attribute values of 10 characters each
1 byte per character for 4 attribute names of 4 characters each
160 bytes for the C/C++ node overhead

In this case N(worst_case), the worst-case max number of nodes, is:

N(worst_case) = 1,024 x 1,024 x 1,024 / (2 x 40 + 2 x 4 x 10 + 1 x 4 x 4 + 160)
             = 1,073,741,824 / 336
             ≈ 3,195,660 nodes
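The same back-of-the-envelope estimate as a snippet (the byte counts are the assumptions listed above, from the Stack Overflow thread, not measured values):

```javascript
// Per-node size estimate from the assumptions above:
const textBytes = 2 * 40;          // 2 bytes/char, ~40 chars of inner text
const attrValueBytes = 2 * 4 * 10; // 4 attribute values, ~10 chars each
const attrNameBytes = 1 * 4 * 4;   // 4 attribute names, ~4 chars each
const nativeOverhead = 160;        // C/C++ node overhead
const bytesPerNode = textBytes + attrValueBytes + attrNameBytes + nativeOverhead;

// With a 1 GB DOM budget:
const maxNodes = (1024 ** 3) / bytesPerNode;
console.log(bytesPerNode, Math.floor(maxNodes)); // 336 bytes/node, ~3,195,660 nodes
```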

I found an amazing thread on Stack Overflow which explains this in detail; the above estimates are also taken from there: https://stackoverflow.com/questions/42590269/safe-maximum-amount-of-nodes-in-the-dom

Anyways, processing all this also took a lot of time. For the 2000 x 200 table it took about 34.97 secs (total time spent calculating and rendering, where rendering was the main bottleneck) plus 14.31 secs (time spent by my system, an M1 MacBook Air, rendering the initial list). So a total of 49.28 secs !!! just to see the table load and then crash 🙁

Some more stats that support my points, for you to take a look at.

Open to discussing more on these in comments 🙂

The Solution – Virtualization.

No, not the kind of virtualization you run on your Windows system to install Ubuntu and get a feel for what an actual operating system is like.

But there is a technique of partially rendering things in the DOM, only rendering the nodes which are visible to the end user. Let’s say a user can see only 10 rows and 10 columns at a time; then we only render that much, which is pretty easy and fast for the system to do.

If the user scrolls down or sideways, we render more rows and columns at runtime while the user is scrolling, and keep destroying the old nodes that are no longer on the user’s screen. That way we maintain a safe number of nodes for the browser to display and also don’t overshoot the memory.
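The core of that idea fits in one function. Here is a minimal sketch, assuming a fixed row height (the `overscan` of a few extra rows is a common trick to avoid blank flashes during fast scrolling):

```javascript
// Given the scroll offset, compute which rows actually need DOM nodes.
function visibleRange(scrollTop, viewportHeight, rowHeight, rowCount, overscan = 3) {
  const first = Math.floor(scrollTop / rowHeight);
  const last = Math.ceil((scrollTop + viewportHeight) / rowHeight) - 1;
  return {
    start: Math.max(0, first - overscan),
    end: Math.min(rowCount - 1, last + overscan),
  };
}

// 2000 rows of 35px each, a 350px viewport scrolled to row 100:
console.log(visibleRange(3500, 350, 35, 2000)); // → { start: 97, end: 112 }
```

So out of 2000 rows, only ~16 ever exist in the DOM at once; the same calculation is applied to columns for a grid.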

For the people who are wondering yes! While virtualization may require additional compute resources compared to rendering all items at once, the benefits it provides in terms of performance, scalability, memory efficiency, and user experience often make it the preferred choice, especially for applications dealing with large or dynamic datasets.

And for doing that in React there is a good set of libraries we can use. One of them is react-window.

So an example would look something like this:
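A minimal sketch using react-window’s FixedSizeList (the row content here is a placeholder, not our real table cells):

```jsx
import { FixedSizeList } from 'react-window';

// Each visible row receives an inline `style` from react-window that
// absolutely positions it inside the scroll container.
const Row = ({ index, style }) => (
  <div style={style}>Row {index}</div>
);

const BigList = () => (
  <FixedSizeList
    height={400}     // visible viewport height in px
    width={600}
    itemCount={2000} // total rows, but only the visible ones get DOM nodes
    itemSize={35}    // fixed row height in px
  >
    {Row}
  </FixedSizeList>
);
```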

The package will give us a set of APIs and components to work with which are pretty easy to use.

In our case we had a more complex requirement, where I had to freeze certain columns and rows while scrolling. Also, the above example shows a list; when you render a bigger table you should use VariableSizeGrid, which treats your rows and columns as individual cells in a matrix and renders only those cells which are visible to the user.
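A sketch of the grid variant (the cell content and column widths here are made-up placeholders; the frozen rows/columns from our requirement are not shown, as that takes extra work such as synced overlay grids):

```jsx
import { VariableSizeGrid } from 'react-window';

// One render function per cell; only on-screen cells are mounted.
const Cell = ({ columnIndex, rowIndex, style }) => (
  <div style={style}>
    r{rowIndex}, c{columnIndex}
  </div>
);

const BigTable = () => (
  <VariableSizeGrid
    rowCount={2000}
    columnCount={200}
    rowHeight={() => 35}                     // px per row
    columnWidth={i => (i === 0 ? 160 : 100)} // e.g. a wider first column
    height={400}                             // viewport size in px
    width={800}
  >
    {Cell}
  </VariableSizeGrid>
);
```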

I can’t release the full code for obvious reasons, as it is being used in prod.

After using virtualization, the page loading time went from 49 secs to 300 ms, minus the network delay, which depends on your internet.

So we saw a huge jump in performance, with the load time improving by about 160x !!!
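Where does 160x come from? Just the two load times measured above:

```javascript
// Numbers from the measurements in this article:
const beforeMs = 49.28 * 1000; // ~49.28 s initial load with the full table
const afterMs = 300;           // ~300 ms with virtualization
console.log(Math.round(beforeMs / afterMs)); // 164, i.e. roughly a 160x improvement
```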

Well, that was it really. Some minor changes were made, like using useMemo and useCallback to optimise a few more things, but most of the performance was gained by using this one simple concept.
Btw, did you know mobile development also uses the same concept to display large feeds in your social media apps? This is a really cool technique and I had a lot of fun integrating it with our project. Hopefully someone else who is struggling with the same issue will find this article useful; and even if you are not, it’s good to have a new weapon in your arsenal. You never know when you will need it 🙂

References

https://www.npmjs.com/package/react-window
https://stackoverflow.com/questions/3486239/maximum-number-of-divs-allowed-in-web-page
https://stackoverflow.com/questions/42590269/safe-maximum-amount-of-nodes-in-the-dom
https://developer.chrome.com/docs/lighthouse/performance/dom-size/
https://web.dev/articles/virtualize-long-lists-react-window

PS – I don’t take credit for any of the above info; I learned and implemented most of it from the internet.
