October 12, 2010

Artifact Ownership or When Not to Use REST

Here is an indicator for situations in which applying REST is not appropriate:
When all artifacts affected by a possible change are owned by the same project (e.g., stored in the same source code control repository), REST is not a suitable style.

An example of this is an application that contains a database component for its private use. Usually the database schema is stored in the repository together with the source code. If you need to change one, it is easy to change the other (from a developer-coordination point of view).

The general takeaway is that artifact ownership can serve as an indicator of how appropriate a candidate style is.

(Consider how much the Unix command line benefits from its uniform pipe-and-filter style: completely decentralized developers can contribute components (grep, awk, sed, less, sort, ...) without even having to agree on how the components talk to each other.) The artifacts that make up the Unix toolbox are maintained by many different parties all over the world.
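A minimal sketch of that uniform interface in action: each of these independently developed tools reads lines from stdin and writes lines to stdout, so they compose into a pipeline without their authors ever having coordinated. (The word list here is just illustrative data.)

```shell
# Find the most frequent word in a stream of words, one per line.
# sort groups duplicates, uniq -c counts them, sort -rn ranks by
# count, head keeps the top entry -- four tools, four ownership
# domains, one shared convention: the byte stream.
printf 'apple\nbanana\napple\ncherry\napple\nbanana\n' \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -n 1
```

The pipeline prints the count and the word (e.g. `3 apple`); swapping any stage for a different implementation requires no change to the others.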


  1. Good point. Good old SOA mainstream, in an otherwise pretty horrible document called the OASIS SOA Reference Model, calls this being "under the control of different ownership domains" - and I agree, in most cases it's a design smell if you apply a style that's intended for loose coupling in situations where this is just a YAGNI aspect.

  2. I should probably have also said that WS-* isn't appropriate in the mentioned case (application with private database) either. But then...it never is anyway :-)

  3. Imagine I would like to create an application that consists of a client component, maybe known as a local knowledge base, and a server component, maybe known as a global knowledge base.

    User agents interact mainly with the local knowledge base, which forwards processing steps as needed to the global knowledge base. The global knowledge base profits from information integration that is achieved by consuming other information providers.

    One goal is that the user agents should always get preprocessed, personalized pieces of information. Another goal is that the user of that user agent should be able to add, modify, and delete pieces of information.

    However, it should not only be possible for these changes to be pushed back into the global knowledge base. Furthermore, the global knowledge base should be able to push changes back to the information providers from which it initially consumed this information.

    So, the whole application should be able to easily close the information-flow life cycle, and the user should not need to propagate changes via the interface of the "original" information provider (here, the information provider from which the global knowledge base consumed the information).

    I would say the two main components of this application interact somewhat like intelligent intermediaries. To the user agent, though, it could look as if the local knowledge base were always the origin server (which shouldn't prevent the user agent from showing provenance information about the consumed information providers, etc.).

    Following the first statement in your post, I would conclude that the development of such an application can be guided by trying to follow the REST constraints, right? What do you think?