
Maurits van Rees: PLOG Tuesday evening

Laurence Rowe: Cache invalidation

Sharing experience from a non-Plone project. Cache invalidation is always hard. Pages may contain snippets from other pages or from other content items. When such an item is updated, it is hard to invalidate all the pages that may contain a snippet of it.

ZODB caches the objects. Zope stores the transactions. A transaction has a list of the objects that were changed. Stale versions of those objects can then be removed from the cache.
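
A minimal sketch of that principle, assuming a FileStorage on disk and a plain dict standing in for the cache (both assumptions, not the talk's code): walk the transactions since a given transaction id and drop the cached copy of every object they changed.

from ZODB.FileStorage import FileStorage

def invalidate_changed(cache, path="Data.fs", start_tid=None):
    """Drop cached copies of every object changed since start_tid.

    `cache` is any mapping keyed by oid; a plain dict stands in here
    for whatever object cache is being kept.
    """
    storage = FileStorage(path, read_only=True)
    try:
        # Walk the transaction records; each one lists the data
        # records (and so the oids) it wrote.
        for txn in storage.iterator(start=start_tid):
            for record in txn:
                cache.pop(record.oid, None)
    finally:
        storage.close()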

I wondered if I could do the same for pages cached in Varnish or in Elasticsearch. I tried this in the experimental.depends package. The first simple approach kind of worked, but new items that should end up on a page, and had never been there before, did not get inserted.
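
experimental.depends is not shown here, so the following is only a generic sketch of the dependency-tracking idea, not that package's API: a registry maps a content item's UID to the paths of cached pages that rendered a snippet of it, and an update triggers PURGE requests to Varnish for those paths. The registry, the function names and the Varnish address are all assumptions.

from collections import defaultdict
import requests

VARNISH = "http://localhost:6081"  # assumed Varnish address

# item UID -> set of page paths that rendered a snippet of that item
dependencies = defaultdict(set)

def record_dependency(item_uid, page_path):
    """Called while rendering a page: remember which item it used."""
    dependencies[item_uid].add(page_path)

def purge_item(item_uid):
    """Called after an item changes: purge every page that used it."""
    for page_path in dependencies.pop(item_uid, set()):
        # Varnish is typically configured to accept PURGE requests.
        requests.request("PURGE", VARNISH + page_path, timeout=5)

The failure mode from the talk is visible in this sketch: a brand-new item has no entry in the dependency registry yet, so nothing is purged even though some listing page should now include it.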

I looked at the Darcs version control system, which does similar things.

A folder contents page that shows the first five items needs to take all items in the folder into account for possible inclusion: a new or re-sorted item can push its way into the top five.
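
One hedged way to express that, reusing the hypothetical registry from the previous sketch: make the listing page depend on the folder itself rather than on the items that happened to be rendered, so that any change to the folder's membership or ordering invalidates it.

def record_listing_dependency(folder_path, page_path):
    """Depend on the container, not on the five items currently shown."""
    record_dependency("folder:" + folder_path, page_path)

def on_folder_membership_change(folder_path):
    """Adding, removing or reordering any item purges the listing page."""
    purge_item("folder:" + folder_path)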

I have an asynchronous indexing process that updates the information for anything that needs to be invalidated. It produces a materialized view of the data, allowing faceted search across a deep join.
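
A rough sketch of such a process, assuming the elasticsearch-py 8.x client and a hypothetical load_with_relations() helper that follows the joins and returns one flat document; the worker drains a queue of changed ids and re-indexes the flattened result, which is the materialized view that faceted search runs against.

import queue

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed address
to_reindex = queue.Queue()  # ids of content whose dependencies changed

def load_with_relations(doc_id):
    """Hypothetical helper: follow the joins in the primary store and
    return one flat, denormalized document."""
    return {
        "id": doc_id,
        "title": "...",
        "tags": ["..."],
        "author_name": "...",
    }

def index_worker():
    """Asynchronous worker: re-index everything that was invalidated."""
    while True:
        doc_id = to_reindex.get()
        # The denormalized document is the materialized view; facets
        # come straight from its flattened fields.
        es.index(index="pages", id=doc_id, document=load_with_relations(doc_id))
        to_reindex.task_done()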

I think this principle could be extended to Plone. It would allow indexing into a page cache like Varnish, or a search cache like Elasticsearch.

Varnish helps make often-visited pages fast, but it does not help with the long tail of pages that are hardly ever visited.

