Many moons ago I would write web applications using technologies like PHP or Python to directly serve web content – often using templating engines to more easily handle data display. These apps would thus ship server-side rendered plain HTML, along with a mix of browser-native forms and AJAX (sometimes with jQuery) for some interactivity and more seamless data submission. I’d use additional lightly-sprinkled JavaScript for other UI niceties.
Over the past ten years or so, this all changed: my web apps – including those developed as part of my work and nearly all of my side projects – switched to using front-end frameworks, like React or Vue, to create single-page apps (SPAs) that instead solely consume JSON APIs. The ease of adding front-end dependencies using npm or yarn, advancements in modular and “importable” JavaScript, and browser-native capabilities like the Fetch API’s json() function encouraged this further, and the ecosystem continued to grow ever larger. Front-end code would run entirely independently of the back-end, the two communicating (often) only via JSON, and with both “ends” developing their own disparate sets of logic – to the extent that progressive web application front-ends could leverage service workers and run entirely offline.
JSON APIs only ingest and emit the application data; browser-based consumers then need their own substantial codebases concerned purely with modelling the data semantics, displaying it to the user, and allowing it to be mutated through messages back up to the server. The concepts of front-end-only, back-end-only, and “full stack” developers became popular, though these role titles now seem to be very much out of fashion.
Some efforts, like Meteor, and later Next.js, Nuxt and their ilk, attempt to bridge the gap by bringing some of the actual rendering (or hydrating) back to the server – they often refer to themselves as “full-stack frameworks” – but even then we were usually still shipping tons of pretty opaque JavaScript to the browser, to the extent that we (or maybe just me!) relied purely on vendor documentation without fully understanding exactly what code was being sent and executed, when and how, or what was actually going on behind the scenes.
For me, part of the shift to building front-end experiences entirely de-coupled from the back-end, and often served from entirely different places (yet implicitly linked by the JSON sent over the wire), was about accommodating future plans that might (definitely) never come to fruition: “this service might need a mobile app one day and I should construct the endpoint signatures and responses accordingly”, or “I may want to eventually publish this API spec publicly for consumption by third parties”. In reality, if these things ever happened, I’d want to tidy the APIs up with nicer status codes and messaging, and would likely develop a whole new set of endpoints anyway. When building large and complex web applications in this way, it’s hard to keep all the internally-needed endpoints sufficiently squeaky clean to support simultaneous public third-party consumption.
A year or two of sustained development on such a de-coupled web service, with a constantly evolving UI and feature set, inevitably results in extensive drift from the original API intentions: seemingly-random extra attributes appear in response payloads (to serve small features the front-end needs), and entirely new endpoints emerge that are so specific to a particular minor feature that they can’t be re-used anywhere anyway. Such APIs gradually become more specific and less generally useful; does this then diminish the value of the data API – and application de-coupling – as a whole?
Separating the front-end static assets (mostly large JS bundles) so that they’re served by a dedicated (and cheap) CDN, like CloudFront or Cloudflare, became attractive: the server component could remain solely concerned with receiving and transmitting lighter pure-data payloads, achieving a higher scale:cost ratio. But if we don’t need to ship heavy bundles for our React app at all, do we still need this extra layer of separation? It adds setup overhead, extra complexity in deployment and client caching, and the problem of “out of date” UIs from long-lived browser sessions trying to talk to the latest deployed version of our API.
There’s also a human element to this complication: new company staff or project contributors working on the “front-end” must first learn the API schema (for example, via OpenAPI or Swagger) before they understand the real semantics behind specific fields and know how to actually use the data. Though, in reality, front-end technical product managers would likely try to solve this by recommending a set of well-commented, descriptive TypeScript types used to unmarshal the actual data sent over the wire from the API – even more front-end-only code!
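Such types end up re-documenting, on the front-end, semantics the server already knows. A minimal sketch – the endpoint, field names, and comments below are invented for illustration:

```typescript
// Hypothetical response shape for GET /api/users/:id -- each doc comment
// re-states semantics that otherwise exist only implicitly in the API.
interface User {
  id: number;
  /** Display name, shown in the header bar. */
  name: string;
  /** ISO-8601 timestamp of the last successful login, or null if none. */
  lastLoginAt: string | null;
}

// Unmarshal the raw wire payload into the documented shape -- note this
// is a plain cast, so nothing actually validates that the server agrees.
const payload = '{"id": 7, "name": "Ada", "lastLoginAt": null}';
const user = JSON.parse(payload) as User;
```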
None of this is to say that React or Vue are themselves particularly heavy, but there are enough implicit dependencies and “industry standard” recommendations – you need a router, a state-management package, the latest component library – that by the time you’ve started down the SPA route, adding an extra top-level dependency to handle your single datetime input is just the tip of the iceberg.
And don’t get me wrong – I’ve been writing and strongly supporting apps in this way for many years because I enjoy it; these approaches and frameworks take away a lot of the pain in crafting dynamic and responsive experiences, and the community support and package ecosystems enable development of such apps at breakneck pace. But just recently I have been thinking more and more about this direction and the distance we’ve come from pure server-rendered applications. Are we still reaping the benefits of de-coupled applications, or have things gone too far? Who knows?
I was keen to understand the current landscape a little better. I went down a bit of a rabbit hole, starting with the (arguably, yet understandably, opinionated) essays on the htmx website, and then on to a number of articles and blog posts, some of which I link to below, discussing what “true” REST APIs could (should?) look like. As I read further and talked to others, I resonated strongly with the discourse and became more and more convinced that just-as-rich and just-as-dynamic pure web applications can be created without requiring fully de-coupled, solely data-driven APIs, and without the complications brought about by invasive front-end frameworks. I mentally mapped out the more complex components of some of the apps I’d worked on recently, and could immediately see how simpler, lighter-weight technologies could meet the dynamic requirements where needed.
I also quickly realised that whilst the JSON APIs I’d been working on seem RESTful, they’re actually a long way from being so. If anything, they have moved further from true REST than the “simpler” apps I’d built with more fundamental and transparent technologies a decade ago. It began to dawn on me that unnecessary data APIs represent a lose-lose situation: the front-end misses out on the instruction and expressiveness (in terms of information display and the actions available to the user) of pure HTML rendered directly to the browser, and the back-end has to keep mutating and “drifting” in order to meet the ever-evolving needs of the user experience’s new features.
(To me, there’s also a significant impact on security and the risk of accidental over-projection of sensitive fields into JSON responses as APIs attempt to become more useful or generic, but this could be a whole post on its own.)
What I like about the concept of “hypermedia APIs” – in which the server responds with traditional user-navigable hypermedia (in this/my case, hypertext) – is that they more explicitly dictate the data, its meaning, and its presentation and layout, without needing front-end-specific code to understand it. Links can be provided, enabling users to continue to navigate the application without a priori knowledge of the API “schema”. Technologies like htmx (and Unpoly) can be used to request HTML partials from the server, allowing the browser to simply and efficiently update and render parts of the interface accordingly, providing SPA-like dynamic experiences without the larger front-end codebases to maintain.
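To make that concrete, here is a minimal htmx sketch – the endpoint and markup are hypothetical, but the `hx-get`, `hx-target`, and `hx-swap` attributes are htmx’s standard mechanism for fetching and swapping in an HTML partial:

```html
<!-- Server-rendered list; the id is the swap target. -->
<div id="contact-list">
  <!-- ...rows rendered by the server... -->
</div>

<!-- On click, htmx issues GET /contacts?page=2 (assumed endpoint) and
     swaps the returned HTML partial into #contact-list -- no JSON,
     no client-side templating, no front-end data model. -->
<button hx-get="/contacts?page=2"
        hx-target="#contact-list"
        hx-swap="innerHTML">
  Next page
</button>
```

The server simply renders the same partial it would use for a full page load, so the “API” stays in lockstep with the UI by construction.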
I am excited to continue this journey of (re-)discovery of web applications. I have a few projects in mind to develop some ideas on (perhaps also using Alpine.js for some additional light interactivity where needed), with the aim of building fast and powerful experiences with lighter and “closer to the metal” technologies.
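For the kind of light interactivity Alpine.js covers, a small declarative sketch is enough – the markup and labels below are illustrative, using Alpine’s `x-data`, `x-show`, and `@click` directives:

```html
<!-- Hypothetical dropdown: x-data holds local component state,
     @click toggles it, and x-show conditionally displays the panel.
     No build step or bundler required. -->
<div x-data="{ open: false }">
  <button @click="open = !open">Menu</button>
  <ul x-show="open">
    <li>Profile</li>
    <li>Sign out</li>
  </ul>
</div>
```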
If, like me, you’re interested in reading more about this space, I can certainly recommend starting out with the following articles and blog posts and continuing to explore from there.
- REST APIs must be hypertext-driven (Roy T. Fielding, 2008)
- Don’t Build A General Purpose API To Power Your Own Front End (Max Chernyak, 2021)
- The API Churn/Security Trade-off (Intercooler.js, 2016)
- A Real World React -> htmx Port (htmx website, 2022) – a story of how and why David Guillot converted a production web app from React to htmx.