See folks, this is what I like in a reply. Valid counterpoints, links to supporting information, links to actual projects and codebases. Is that really so much harder than stamping one's feet and screaming "wah wah, is not" or "you used a movie reference I don't know, you must be a racist".
In particular that list of things that trigger reflows (but not necessarily paints -- there's a difference) is very handy. Bookmarked as that's important stuff indeed.
Though a LOT of it reeks of using JS for things that are none of scripting's business anymore; at least for anything more than a class swap... or if it is JavaScript's business, it should be Canvas's job. I guess MAYBE if you're working with SVG, given that its elements are DOM nodes, it could be an issue, but really one shouldn't be creating/removing nodes at the same time as manipulating or reading information like positioning/offsets/scroll.
I actually wonder if I've just been subconsciously avoiding combining the two out of some sort of instinct, as the properties listed there I'd NEVER use at the same time as node creation/manipulation as that's render. Natural habit of separating concerns?
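On that separation instinct: browsers defer layout work until something forces it, and reading a property like offsetHeight right after a style write is exactly such a force. Here's a minimal sketch of why the habit pays off, using a mock element that counts forced layouts -- the mock is mine and purely illustrative; in a real browser the actual element and its offsetHeight getter play these roles:

```javascript
// Illustrative mock: every style write dirties layout, and every layout
// read while dirty forces a synchronous recalculate (a "forced reflow").
function makeMockElement() {
  return {
    height: 10,
    dirty: false,
    forcedLayouts: 0,
    setHeight(px) { this.height = px; this.dirty = true; }, // a style write
    get offsetHeight() {                                    // a layout read
      if (this.dirty) { this.forcedLayouts++; this.dirty = false; }
      return this.height;
    }
  };
}

// Interleaved read/write -- every read after a write forces a layout.
const thrashed = makeMockElement();
for (let i = 0; i < 100; i++) {
  thrashed.setHeight(i);
  void thrashed.offsetHeight; // forced layout, every single pass
}

// Separated phases -- all writes first, one read at the end.
const batched = makeMockElement();
for (let i = 0; i < 100; i++) batched.setHeight(i);
void batched.offsetHeight;    // one forced layout total

console.log(thrashed.forcedLayouts, batched.forcedLayouts); // 100 1
```

Same writes, same final state; the only difference is whether reads are allowed to interrupt the write phase.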
And yeah, tens of thousands of rows is "absurd", but that's the point of benching and testing: you need absurd loads to stress things and make the cracks show.
Though those "vanilla" scripts are so far off from how I'd solve that problem. Laugh is, I bet my way would be roughly the same speed, maybe 5% slower at most, but be half the code, saving you time elsewhere.
In particular, both your approaches' use of node cloning and then walking or nested access to set values can trigger so long a recalculate-and-layout that a script taking twice as long to actually execute could come out even; between JIT compilation and the render, you end up right back where you started.
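For contrast, this is roughly the shape I mean: build each row once with createElement and textContent, collect the rows in a fragment, and touch the live table exactly once. A hedged sketch, not the benchmark's actual code -- the tiny stub document below only exists so the sketch runs outside a browser; in a page you'd pass the real document.

```javascript
// Stub stand-in for the browser DOM so this runs anywhere (illustrative only).
const stubDocument = {
  createElement: (tag) => ({
    tag, children: [], textContent: "",
    appendChild(c) { this.children.push(c); return c; }
  }),
  createDocumentFragment() { return this.createElement("#fragment"); }
};

// Build all rows off-DOM; no cloning, no walking back into cloned trees.
function buildRows(doc, data) {
  const frag = doc.createDocumentFragment();
  for (const { id, label } of data) {
    const tr = doc.createElement("tr");
    tr.appendChild(doc.createElement("td")).textContent = String(id);
    tr.appendChild(doc.createElement("td")).textContent = label;
    frag.appendChild(tr);
  }
  return frag; // caller does tbody.appendChild(frag): one insertion, one reflow
}

const frag = buildRows(stubDocument, [
  { id: 1, label: "alpha" },
  { id: 2, label: "beta" }
]);
console.log(frag.children.length); // 2
```

Because nothing is attached to the live document until the single appendChild at the end, the browser has no reason to recalculate layout mid-build.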
But that's the problem with any case of trying to optimize and bench. 1) is it worth the loss of code clarity, 2) is it worth the extra code, 3) are we sick of playing whack-a-mole yet?
'Cause whenever you think you've got some grand new optimization, it invariably costs you time elsewhere.
Though all those demos "bother me" with the lack of a noscript message, scripting-only elements in the markup, endless pointless "DIV for nothing", and the garbage presentational class junk bootcrap suckers people with...
But worse is that massive event handler filled with conditionals, instead of just making the buttons from the scripting and giving them their own handlers. If one were to bench smaller, "more realistic" data sets, you might find that whilst you optimized for the "loop", you're wasting as much time on comparisons and "jumps" you wouldn't need if you just trapped the buttons instead of your "main". Especially in cases like those delete buttons on the rows, where event.currentTarget.parentNode.parentNode would be a hell of a lot simpler than all those "arrays for nothing" you're generating.
Since I would have no such arrays present: this.rows being redundant to the live tbody.rows, the "store" nonsense being redundant to tbody > tr.cells, etc, etc.
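Here's a sketch of what "trapping the buttons" looks like -- an assumed shape, not the demo's actual code, and the makeNode stub is mine so it runs outside a browser. Each delete button gets its own handler and walks button -> td -> tr via parentNode, so no id-to-row lookup array ever has to be built or kept in sync.

```javascript
// Minimal stub DOM node (illustrative stand-in for real elements).
function makeNode(tag) {
  return {
    tag, parentNode: null, children: [],
    appendChild(c) { c.parentNode = this; this.children.push(c); return c; },
    removeChild(c) { this.children.splice(this.children.indexOf(c), 1); }
  };
}

const tbody = makeNode("tbody");
for (let i = 0; i < 3; i++) {
  const tr = tbody.appendChild(makeNode("tr"));
  const td = tr.appendChild(makeNode("td"));
  const button = td.appendChild(makeNode("button"));
  // Each button owns its handler: no conditionals, no row arrays.
  button.onclick = (event) => {
    const row = event.currentTarget.parentNode.parentNode; // button -> td -> tr
    row.parentNode.removeChild(row);
  };
}

// Simulate a click on the second row's delete button.
const secondButton = tbody.children[1].children[0].children[0];
secondButton.onclick({ currentTarget: secondButton });
console.log(tbody.children.length); // 2
```

The row's identity lives where it already lives -- in the tree -- so deleting it is one parentNode walk and one removeChild, with nothing else to update.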
Also not sure why you have the non-functioning / broken anchors in there. Is that as a fallback for keyboard navigation?
Seriously though, to me that's 10k of JavaScript doing around 4k's job (+/- 256 bytes). I'd be shocked if, without the bootcrap, the HTML+CSS+Scripting for something that simple would break 8k total.
And it would be a lot clearer and easier to maintain in the process. Though I'm quickly coming to the conclusion that I have a different definition of the word "easy" from the normies. No shock when things like the HTML specification can't even use words like "empty" or "void" properly, or mainstream programmers don't even know what the word "closure" means and abuse it when what they should be saying is ENclosure.