Sunday 5 June 2016

Towards a progressive REST API model

"Although REST interaction is two-way, the large-grain data flows of hypermedia interaction can each be processed like a data-flow network, with filter components selectively applied to the data stream in order to transform the content as it passes"
Roy Fielding's dissertation

This is the last post of my 5-piece API story "You don't manage the API, the API manages you". In my first real full-time job, in a startup founded around 2004, I had the role of both CIO and CMO. Back then, outsiders were not only laughing at this combination, investors directly took it as an example of why the organization would be immature and not able to survive. It did, though, and prospered for a long time. Fast forward 10 years and McKinsey, Deloitte, Accenture and Adobe are promoting the CIO/CMO model and, more than that, design thinking as culture and blueprint. And CMO and CFO get closer through programmatic content and advertising. From my own experience in startups, but also in consulting and service design, I know this model is not only about making the company customer-centric and product-centric, but about focusing the culture on maximum value (outcomes) instead of cost savings. It's about having a common goal, and allowing fast feedback for the product or service of the organization. This fast feedback requires fast, agile changes, a lean cycle, a risk culture and truly collaborative service (or product) design - I don't like the singularity, but it seems that cybernetics has won.

What does that mean for our APIs? Can we really still revolve around a relatively static* hyperlink model that does not embody this feedback?


*) I am aware of solutions to this, like Gateway APIs, Content Negotiation and Redirects, but the core issue of no common ontology still exists, and DDD alone cannot solve it.


Funnily enough, one of the most hyped silver bullets of the day, Microservices, is being advertised as the solution for owning Micro-Moments along a customer journey. With the customer not only moving faster, but moving at different speeds and jumping between channels, the organization has to do the same.

What I like about the quote above from the original REST source is that it describes the REST idea so well, like a DDD flow, which is a superb concept. That works nicely as long as all dependencies and variants are universally known and managed, and state can be contained. Nowadays, however, such dependencies are seen as too much risk (in Fairbanks' sense) and not lean enough. A nice example is Java 9, which is right now being decomposed, jigsawed, but you can take Microservices and reactive architecture as examples as well. I feel we need a progressive REST, a REST 2.0, that rethinks its tradeoffs and comes up with new primitives for a cybernetic, yet resilient, web in constant flux.

A progressive micro-state model for REST


Conversational interfaces break from the document metaphor: relationships are more emergent, more real-time and event-based, more content curation happens, quality and risk are invariants of APIs, resilience is key, and systems change faster than data. In such a world we need to decouple resource representations from their state in the overall system and rethink the stateless tradeoff - we need to arrive at micro-state.
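
To make micro-state a bit more concrete, here is a minimal TypeScript sketch of how I picture it: link activations become intents that are folded, Redux-style, into a small server-held state. All names here (Intent, MicroState, the token and scope fields) are hypothetical illustrations, not an existing library.

// Hypothetical sketch: micro-state as a small, server-held state that is
// rebuilt by folding a log of intents, Redux-style. All names are made up.

type Intent =
  | { type: "follow"; rel: string; target: string }    // activate a link relation
  | { type: "filter"; field: string; value: string }   // narrow a short-lived view
  | { type: "paginate"; page: number };

interface MicroState {
  token: string;                   // opaque handle, separate from any resource UID
  scope: string[];                 // the context the client is allowed to act in
  view: Record<string, unknown>;   // the current short-lived representation
}

// Pure reducer: replaying the intent log always yields the same micro-state,
// so the representation is decoupled from the resources it was derived from.
function reduce(state: MicroState, intent: Intent): MicroState {
  switch (intent.type) {
    case "follow":
      return { ...state, view: { ...state.view, location: intent.target } };
    case "filter":
      return { ...state, view: { ...state.view, [intent.field]: intent.value } };
    case "paginate":
      return { ...state, view: { ...state.view, page: intent.page } };
  }
}

const replay = (initial: MicroState, log: Intent[]): MicroState =>
  log.reduce(reduce, initial);

The representation a client sees is just a fold over the intent log; the system can change around it without the resources themselves having to carry that state.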

One solution to this problem could be Event Sourcing: not seeing hyperlinks as static references to what are effectively actions, but as descriptions of actions, like intents in more contemporary data-flow models such as Redux and Reactive Streams, combined with a server-side state. This would remove the need for meta formats like HAL and for most versioning, because the Link header would only contain very basic references to directories, accessible via an OPTIONS request. Those references would assume that the API is semantically declared somewhere outside, and that a developer knows how to handle these semantics.

New headers could be used to declare context; one idea would be to reuse the "scope" and "state" that OAuth2/OpenID already have. They could reference a state on the server that is not a resource, yet can be queried by GET. URIs would only address entities by UID where there actually is one, while it would be perfectly fine to use GET/HEAD to obtain short-lived representations of state, such as paginated lists, rather than minting fake UIDs for GET requests. Even better would be if GET allowed request bodies, to get around cases where URLs are too long, too brittle, uncacheable or not secure.

The only problem with such a state - not a session - would be that it lives on the server and is not transferable, although it is usable across channels; hence PATCH would have to be used to modify it and to send intents as messages in an Event Sourcing system. It could even be synchronized, CouchDB style, between multiple applications, using a type of request that actually patches and versions cross-domain (in both the DDD and the HTTP sense) state.
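
As a rough sketch of how such an exchange could look on the wire, here is a client-side TypeScript example using fetch. The /directory and /state paths, the X-Context-Scope and X-Context-State headers and the intent payload are assumptions made up for illustration, not part of any existing API.

// Hypothetical client-side sketch of the proposed exchange, using fetch.
// The /directory and /state paths, the X-Context-* headers and the intent
// payload are illustrative assumptions, not an existing API.

const base = "https://api.example.org";

async function explore(): Promise<void> {
  // 1. OPTIONS returns only very basic references to directories,
  //    instead of a rich meta format like HAL.
  const dir = await fetch(`${base}/directory`, { method: "OPTIONS" });
  console.log(dir.headers.get("Link")); // e.g. </orders>; rel="directory"

  // 2. GET/HEAD for a short-lived representation of server-side state
  //    (a paginated list) without minting a fake UID for it; context
  //    travels in headers that reuse OAuth2-style scope/state values.
  const list = await fetch(`${base}/orders?page=2`, {
    headers: {
      "X-Context-Scope": "orders:read",  // assumed header name
      "X-Context-State": "c2Vzc2lvbjQy", // opaque token for server-side state
    },
  });
  console.log(list.status);

  // 3. Links as descriptions of actions: a PATCH carries an intent that is
  //    appended to the event log behind that state, Event Sourcing style.
  await fetch(`${base}/state/c2Vzc2lvbjQy`, {
    method: "PATCH",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ type: "filter", field: "status", value: "open" }),
  });
}

explore().catch(console.error);

One detail worth noting: the Fetch API rejects a request body on GET or HEAD, which is exactly the kind of restriction the request-body idea above runs into today.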

While this would break some of the REST principles, it would make integration much simpler and de-risk dependencies. More importantly, it would force API consumers and producers to think about state earlier, and about what context information to submit. It would emphasise the need for sensible content-types and clear relationships, rather than meta-formats and machine-to-machine descriptions that only fulfill silver-bullet and AI dreams. It would also remove pointless philosophical discussions about idempotency and replace them with the real user requirements of a constantly changing, emergent system that needs to remain accessible to humans, machines and REST dogmatists.
