JS web frameworks benchmark – Round 3

It’s been a few months since I published Round 2 of my JavaScript web frameworks performance comparison. In JavaScript land, months translate to years in other ecosystems, so a round 3 is more than justified.

Here’s what’s new:

  • Added a pure JavaScript implementation (“vanillajs”) to serve as a baseline for the benchmarks.
  • Added cycle.js v6 and v7. What a difference the new version makes!
  • Added inferno.js. This small framework made writing a faster vanillajs version challenging.
  • Updated all frameworks to their current version.
  • New benchmarks: append 10,000 rows, remove all rows, swap two rows.
  • Added two benchmarks that measure memory consumption directly after loading the page and after 1,000 rows have been added to the table (a minimal sketch of such a memory reading follows this list).
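For reference, here is a minimal sketch of how such a memory reading could be taken in Chrome, using the non-standard performance.memory API; addRows is a purely hypothetical helper, and the actual benchmark harness may collect its numbers differently:

    // Minimal sketch, Chrome only: performance.memory is non-standard.
    // Values are quantized unless Chrome is started with --enable-precise-memory-info.
    // addRows() is a hypothetical helper that fills the table.
    function usedHeapMB() {
      return performance.memory.usedJSHeapSize / (1024 * 1024);
    }

    const afterLoad = usedHeapMB();   // directly after loading the page
    addRows(1000);                    // add 1,000 rows to the table
    const withRows = usedHeapMB();
    console.log('after load: ' + afterLoad.toFixed(1) + ' MB, with 1,000 rows: ' + withRows.toFixed(1) + ' MB');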

The result table for Chrome 51 on my MacBook Pro 15 reveals some surprising results (click to enlarge):

[Result table: benchmark results for Chrome 51 on a MacBook Pro 15]

(Update: cycle.js v7 was originally reported with an average slowdown factor of 2, but I had forgotten to remove a logging statement. Without it, the result improves to a factor of 1.8. The table has been updated.)

  • Vanillajs is fastest, but not by much, and only through hard work (a sketch of the kind of hand-tuned DOM code this requires follows this list).
  • Inferno.js is an incredibly fast framework and only 1.3 times slower than vanillajs, which is simply amazing.
  • plastiq comes in as the second-fastest framework, just 1.5 times slower, followed by vidom.
  • React.js appears to have a major performance regression in the clear-rows benchmarks. (Clearing all rows for the third time is much faster. Two benchmarks point to that regression, which is a bit unfair to react.js.)
  • Cycle.js v6 was by far the slowest; v7 changes that completely and leaves the group of slow frameworks, which now consists of ember, mithril and ractive.
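To give an idea of the kind of hand-tuned DOM code the vanillajs baseline relies on, here is a rough sketch of appending rows via a document fragment and swapping two rows in place. buildRow and the data format are illustrative only, not the actual benchmark implementation:

    // Rough sketch of hand-optimized DOM manipulation for a vanillajs baseline.
    // buildRow() is a hypothetical helper that returns a <tr> element for one item.
    function appendRows(tbody, data) {
      // Collect all new rows in a fragment so the live table is touched only once.
      const fragment = document.createDocumentFragment();
      for (const item of data) {
        fragment.appendChild(buildRow(item));
      }
      tbody.appendChild(fragment);
    }

    function swapRows(tbody, i, j) {
      // Swap two <tr> elements in place instead of re-rendering the whole table.
      const rows = tbody.children;
      const a = rows[Math.min(i, j)];
      const b = rows[Math.max(i, j)];
      const afterB = b.nextSibling;
      tbody.insertBefore(b, a);       // move the later row in front of the earlier one
      tbody.insertBefore(a, afterB);  // move the earlier row into the later row's old slot
    }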

All source code can be found in my GitHub repository. Contributions are always welcome (ember and aurelia are looking for some loving care, but I have to admit that I have no feelings for them, and I’d like to see a version of cycle.js with isolates).

Thanks a lot to Baptiste Augrain for contributing additional tests and frameworks!

 

JS web frameworks benchmark – Round 2

Here’s a follow-up to my last blog post since there’s good reason for an update:

  • It turned out that ember performs better with key="identifier" in the #each helper.
  • One of the vue.js creators contacted me and told me that they fixed vue.js in response to my benchmark.
  • Two other react-like libraries were added: Preact and react-lite.

Here’s the new measurement showing the duration in milliseconds:

[Results graph: benchmark durations in milliseconds]

(Click the image to see the graph in full size)

The duration is measured from the beginning of the DOM click event to the end of repainting and thus includes JavaScript execution, rendering and layout. If you want to read more about it, please check the corresponding blog entry. All tests are run via Selenium; multiple samples are taken and averaged after removing the worst measurement. The full description of the benchmarks can be found in round 1.
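As a rough sketch of that averaging step, assuming the raw samples have already been collected as an array of millisecond durations (the actual harness code differs in the details):

    // Rough sketch: average the samples after dropping the single worst
    // (slowest) measurement. Assumes samples is an array of durations in ms.
    function averageWithoutWorst(samples) {
      const sorted = samples.slice().sort((a, b) => a - b);
      const kept = sorted.slice(0, -1);               // drop the largest value
      return kept.reduce((sum, x) => sum + x, 0) / kept.length;
    }

    averageWithoutWorst([120, 118, 160, 122, 119]);   // -> 119.75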

And the new conclusions:

  • Ember got much faster for “partial update” (from 164 msecs to 58 msecs), but is still quite slow for creating or updating 1000 rows.
  • Vue.js improved a lot for the “update 1000 rows (hot)” benchmark (from 435 msecs to 260 msecs).
  • Preact leaves quite a good impression. It’s a lot faster than react for create 1000 rows and update 1000 rows and not much slower for the rest.
  • Almost the same can be said about react-lite, though its performance for “remove row” is rather weak.

All in all, angular 2, vidom and preact especially impress with their performance. Still, aurelia feels much faster in the browser than in the Selenium tests (except for startup duration, which might be my fault for not using the bundler correctly). And yes, vue.js is now pretty close to the fastest frameworks.

You can view the benchmarks in the browser and find all source code on GitHub.

Benchmarking JS-Frontend Frameworks

I must admit that this post came a bit unexpectedly. I just wanted to compare the performance of a few JavaScript frontend frameworks, and that’s where all the mess started.

What to measure

Before running a benchmark, one should be clear about what to measure. In this case I wanted to know which framework is faster for a few test cases. I knew which test cases and which frameworks, but it was left unclear what faster actually means. Let’s take a look at a Chrome timeline:

Continue reading “Benchmarking JS-Frontend Frameworks”