Here at mobiForge we’ve been broadly positive towards Google’s AMP since its launch in October 2015. It’s by no means the only way to achieve the wholesome goal of fast-loading web pages, but the web has benefitted from having a well-documented format that, when adhered to, delivers excellent performance. Google now reports that the average load time of AMP pages across their entire index is about 700ms, a very impressive result by any measure. The Washington Post states that their average AMP page load time is a stellar 400ms.
But a slight initial queasiness about AMP has morphed into a sense of distant foreboding as time brings clarity to Google’s overall direction. Our change in attitude is not related to the standard itself, but rather the treatment of AMP pages by Google in its general search results.
In July this year Google announced that AMP links would begin to show up in general search results (previously AMP links were surfaced only in news-related searches). These links are labelled with a lightning bolt and an AMP tag. This sounds great: the difference that AMP makes to casual web browsing is transformative. Pages appear to load almost instantly, reducing the friction of a click to almost nothing, and the labelling ensures that users can see which pages are likely to load quickly.
So why the sense of foreboding? The issue is not the labelling of fast-loading pages. Rather, it is the approach of favouring a Google-approved method of achieving a fast-loading page. Other pages may load just as quickly but if they’re not built with AMP they won’t be called out. Lean sites that jump through hoops to maximise performance but that don’t use AMP will not get the preferential treatment. It would be far more equitable if reliably fast-loading sites were called out regardless of their underlying technology.
To be fair to Google, one of the big benefits of AMP is that it removes many of the potential causes of loading delays, which makes it easier to stand over the fast-loading claim that the lightning bolt implies. The AMP specification also includes provision for caching resources to enable quick delivery, further reducing load-time variability. But none of these reasons is sufficient to favour a particular technology: the result should be the metric, not the means of achieving it. Surely it is not beyond Google to build up a picture of the average load time of a given page and label links accordingly? Google has already stated that load time factors into their ranking decisions, so they are clearly measuring load times to some extent already. Furthermore, while AMP is a very strong indicator of page-load performance, it is by no means a guarantee: an AMP page can still link to unnecessarily large resources.
There is another slightly disturbing aspect to all of this. The AMP specification explicitly states that by using AMP you are making your content available for caching:
“By using the AMP format, content producers are making the content in AMP files available to be cached by third parties.”
Any provider can cache AMP documents. Google makes their AMP cache freely available to everyone. AMP caching need not affect your analytics—the major analytics platforms are already widely supported by AMP through the <amp-analytics> tag. So content providers get free caching while retaining their traffic insight and users get fast-loading pages—what’s the problem? The problem is that, even though AMP caching is open to all, Google’s dominance of the search market means that most AMP pages will inevitably be served by Google. This takes a load off content provider platforms but ultimately means that yet more knowledge of the world’s web traffic accrues to Google, further bolstering their dominance. Yes, Google already see the clicks to external links in search results but actually serving the resulting traffic allows for more information gathering to take place, and builds uncomfortable dependencies. Your content is now being served by Google, the same company that monetizes your content and reports your traffic levels. How can you ensure that the content is being served as originally authored, and that the traffic is being reported truthfully?
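For illustration, a minimal wiring of Google Analytics into an AMP page via `<amp-analytics>` looks roughly like this (a sketch based on the AMP documentation; `UA-XXXXX-Y` is a placeholder account ID):

```html
<!-- In the document head: load the amp-analytics component -->
<script async custom-element="amp-analytics"
    src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script>

<!-- In the body: report a pageview to Google Analytics.
     UA-XXXXX-Y is a placeholder account ID. -->
<amp-analytics type="googleanalytics">
  <script type="application/json">
  {
    "vars": { "account": "UA-XXXXX-Y" },
    "triggers": {
      "trackPageview": { "on": "visible", "request": "pageview" }
    }
  }
  </script>
</amp-analytics>
```

Because the beacon fires from the user’s browser, pageviews are reported to your analytics account even when the document itself is served from a third-party AMP cache.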
Ultimately at this point AMP still feels good, not in a wholesome way, but rather in a Brave New World kind of way. In this world PageRank is Soma, rationed by World Controllers in The World State. The stability of this world is based on citizens accepting their rank and a discouragement of critical thinking. Don’t meddle with the PageRank so kindly provided to you by the benevolent Controllers!
Are we collectively creeping, little-by-little, into a walled garden? Perhaps we’re already there. Even as I write this my CMS is sternly advising me on focus keywords, their density and the page URL structure. A “painless, amusement-sodden and stress-free consensus” version of our free and open web, a web where Google is judge, jury and executioner of content, is a walled garden in all but name. Caution is advised.
UPDATE 16/9/2016: Matt Shull made many of the same points in his Medium post earlier this year.