Consider a high-traffic website whose pages contain parts that rarely change, like the header, the footer and some sidebars: why should you regenerate all of those parts at every request?
ESI solves this kind of problem, but it requires a network round trip, as you need, at the very least, to hit the reverse proxy, which then handles the composition of a resource with its sub-resources.
As I stated earlier, this is not an optimal approach for every use case, so you should definitely try to use local caches (your users' browsers) to scale better.
HInclude fits perfectly in this context, as you only need to include the JS and add a namespace declaration to your documents:
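A minimal setup might look like this (the path to hinclude.js is my assumption; the `hx` namespace is the one documented by the library):

```html
<html xmlns:hx="http://purl.org/NET/hinclude">
  <head>
    <script src="/hinclude.js" type="text/javascript"></script>
  </head>
```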
Whenever you need to aggregate data from a sub-resource you only need to add an hinclude tag:
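For example, to pull in a header fragment from a hypothetical /header.php endpoint:

```html
<hx:include src="/header.php"></hx:include>
```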
or whatever – and it provides a nice behaviour when the sub-request generates an error (status codes different from 200): it adds an hinclude_$statusCode class to the tag.
A dummy benchmark
I provide here a benchmark, a simple and silly one, as you should already be able to understand the power of HInclude.
First of all, let's create a simple response which aggregates header and footer directly from PHP, as we are used to doing:
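A sketch of what such a page could look like (file names and markup are my assumptions):

```php
<?php // classic.php: the page is fully assembled server-side ?>
<!DOCTYPE html>
<html>
  <head>
    <title>Classic server-side aggregation</title>
  </head>
  <body>
    <?php include 'header.php'; ?>
    <div id="body">
      <p>Main content of the page.</p>
    </div>
    <?php include 'footer.php'; ?>
  </body>
</html>
```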
and then the header and footer files:
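Both fragments can be sketched the same way, with usleep() standing in for real work (the 200ms figure matches the delay discussed below; the markup itself is my assumption):

```php
<?php
// header.php: simulate ~200ms of PHP execution time
usleep(200000);
?>
<div id="header">
  <h1>My high-traffic website</h1>
</div>
```

```php
<?php
// footer.php: same artificial delay as the header
usleep(200000);
?>
<div id="footer">
  <p>Copyright notice, links, and so on.</p>
</div>
```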
Bear in mind that I use an artificial delay to simulate some PHP code execution: 200ms seems a reasonable amount of time, inspired by one of our production applications. I took a look at Chrome's timeline bar to get an idea of the average time spent rendering this resource, and it easily exceeds 400ms, as the two 200ms delays run one after the other.
To try HInclude, just create a new page:
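A possible shape for it, deferring the aggregation of header and footer to the browser (file names and markup are, again, my assumptions):

```php
<?php // hinclude.php: the browser fetches header and footer on its own ?>
<!DOCTYPE html>
<html xmlns:hx="http://purl.org/NET/hinclude">
  <head>
    <title>Client-side aggregation with HInclude</title>
    <script src="/hinclude.js" type="text/javascript"></script>
  </head>
  <body>
    <hx:include src="/header.php"></hx:include>
    <div id="body">
      <p>Main content of the page.</p>
    </div>
    <hx:include src="/footer.php"></hx:include>
  </body>
</html>
```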
and add, in footer.php, a caching header, which lets HInclude take advantage of the browser's cache:
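Something like the following; the max-age value is my assumption, and note that header() must be called before any output is sent:

```php
<?php
// footer.php: let the browser cache this fragment for 5 minutes
header('Cache-Control: max-age=300, public');
usleep(200000);
?>
<div id="footer">
  <p>Copyright notice, links, and so on.</p>
</div>
```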
For the first user request it will take ~220ms to render the whole page: this is already a good gain, as we are requesting header and footer in parallel. But when you retrieve the page a second time, performance improves dramatically, down to ~40/50ms: that is, basically, a 90% performance gain. You should be aware that the biggest share of the load time ought to be spent within the main body of the page, which I simply ignored in this example; still, shaving almost half a second off each pageview is a great goal to achieve.
As pointed out by other people on Twitter, HInclude has a few drawbacks – think about SEO – but you should be able to use it for content that doesn't play a major role in your SEO strategy (e.g. never use HInclude to retrieve the body of a blog post).