I just finished backing up my old blog from somewhere on the web I won’t mention, and will republish these posts over the next few days, for historical reasons rather than for their quality. Also, expect a new look and feel for the HTML side of things (isn’t gray way too sad?).
Jeroen says in the comments that Microsoft is watching what the industry does in AOP… But to be honest, Microsoft was one of the first in the industry to provide interception and decoration of code, with its COM+ Services. You can ask Clemens about his experience hooking into the COM+ pipeline! (I was looking for posts about it; I’m not sure he ever put on the net what he explained at a conference I attended in Paris.)
Moreover, I think it’s fair to say that an interception layer has already been implemented in the .NET world, through the TransparentProxy / RealProxy pair. So what prevents it from being used more widely? It’s as slow as it gets, and it needs a context of its own. Some people (I’m not naming names, as I can’t assert it for sure, but I have my opinion on the matter) pushed to have call contexts instead of object-instance contexts, with the result that it won’t be optimized before .NET 3 or 4, maybe (according to this guy, who should know what he’s talking about). And that’s a shame.
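The interception idea itself is easy to sketch outside .NET. Here is a minimal Python analogue of the TransparentProxy / RealProxy pattern: a proxy object that intercepts every method call on its way to the real instance. The Account class and the call-logging hook are purely illustrative, not anything from the .NET API.

```python
class InterceptingProxy:
    """Forwards attribute access to a wrapped target and records each
    method call -- a rough analogue of the TransparentProxy/RealProxy pair."""

    def __init__(self, target):
        self._target = target
        self.calls = []  # interception log

    def __getattr__(self, name):
        attr = getattr(self._target, name)
        if callable(attr):
            def intercepted(*args, **kwargs):
                self.calls.append(name)   # pre-call hook fires here
                return attr(*args, **kwargs)  # forward to the real object
            return intercepted
        return attr


class Account:
    """Illustrative target object."""
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount
        return self.balance


proxy = InterceptingProxy(Account())
proxy.deposit(10)
proxy.deposit(5)
print(proxy.calls)            # every call went through the interceptor
print(proxy._target.balance)  # and still reached the real instance
```

The cost is visible even here: every call pays an extra attribute lookup and a closure allocation, which is the in-miniature version of why the real proxy pair is slow.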
My brain has been wandering lately, around how feeds and syndication can play with the whole service-oriented paradigm and the GXA stack.
There are really two worlds in the blogging community: The simplistics and the technologists.
For the simplistics (yes, the word is mine, invented by shuffling letters around into something that looks readable; I should file a patent for that word), RSS works and blogs work because they are technically simple (remember: Really Simple Syndication). A feed is easily written, easily read (even without an aggregator), and doesn’t require much.
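To be fair to the simplistics, a feed really is trivial to produce and consume. A sketch with nothing but the Python standard library (the feed content is made up):

```python
import xml.etree.ElementTree as ET

# A complete, valid RSS 2.0 feed fits in a dozen lines.
feed = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example blog</title>
    <link>http://example.org/</link>
    <item>
      <title>Hello world</title>
      <link>http://example.org/hello</link>
      <description>First post.</description>
    </item>
  </channel>
</rss>"""

# And consuming it is one parse plus one loop.
root = ET.fromstring(feed)
titles = [item.findtext("title") for item in root.iter("item")]
print(titles)
```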
For the technologists, blog-related technologies are limited, and the kind of functionality we could bring to people is inherently capped by the technology we’re using. Scoble says constantly that RSS is so much better than the web. The reality is that syndication still relies on a very web-like interaction; only the client side of things you see and use is simpler and more productive.
You could’ve guessed by now that I’m in the second category. But is the first category really right anyway? Let’s look at the technologies around the RSS format:
· MetaWeblog API. XML-RPC based, which I don’t consider a simple format, and which has its share of interoperability issues.
· Trackback, Ping… Not complex in the way they work, but still complex concepts to adopt from a web perspective.
· Vocabulary, which includes pings, trackbacks, permalinks, etc. Is this vocabulary simple? Does my mother understand any of it?
· Are we leveraging the underlying tools? REST-based APIs are not used, and neither is HTTP content negotiation. And we’re, once again, stuck with HTTP. It might be a cockroach, but you know what, I wouldn’t ride one in a competition.
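To illustrate the MetaWeblog point above: even the simplest posting call, once marshaled as XML-RPC, is anything but “really simple.” A sketch using Python’s standard xmlrpc.client to marshal a metaWeblog.newPost request; the blog id, credentials, and post struct are placeholders, and no server is contacted:

```python
import xmlrpc.client

# Illustrative post content; metaWeblog.newPost takes
# (blogid, username, password, struct, publish).
post = {"title": "Hello", "description": "First post."}

payload = xmlrpc.client.dumps(
    ("myblog", "user", "secret", post, True),
    "metaWeblog.newPost",
)

# The marshaled payload wraps every value in <value>/<struct>/<member>
# layers -- verbose for humans and a source of interop quirks between stacks.
print(payload)
```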
We’re seeing a big paradigm shift (once again) to services and SOAP. Is RSS leveraged? It surely could be: you can always tuck arbitrary XML inside a SOAP envelope. So why the title, am I advocating RSS now? Hear me out, my friend: that’s not my point at all. The actual format of the <item>? I couldn’t care less. RSS is not what matters. With XSLT widely available, at most I would prefer a format with full type fidelity that I can turn into whatever the subscriber consumes (remember, content negotiation…).
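Tucking an RSS <item> into a SOAP envelope really does take only a few lines. A sketch with Python’s standard library; the item content is made up, and this shows only plain SOAP 1.1 framing, no WS-* headers:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"  # SOAP 1.1 envelope namespace
ET.register_namespace("soap", SOAP_NS)

envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")

# An RSS <item> carried as arbitrary XML inside the SOAP body.
item = ET.SubElement(body, "item")
ET.SubElement(item, "title").text = "Hello world"
ET.SubElement(item, "link").text = "http://example.org/hello"

xml = ET.tostring(envelope, encoding="unicode")
print(xml)
```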
I am advocating ATOM not as a file format, but as a process that looks at the overall infrastructure and defines best practices, just as Trackback, Ping, and other ad-hoc technologies developed: not through boards, not through committees, but through open work and interoperability testing. What should ATOM do?
· Leverage WS-Security to provide authentication of a feed author. If I receive content from Scoble, that’s one thing. If I receive a stock alert from Bloomberg, that’s another thing.
· Leverage WS-Eventing et al. Having clouds is alright, but please, XML-RPC is dead; for god’s sake let it REST in peace.
· Leverage MTOM. Enclosures are good, but MTOM solves the same problem in a service oriented way.
· Leverage WS-Federation for comments, and find a way to solve comment spam.
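What WS-Security buys you in the first bullet is verifiable authorship: a Bloomberg stock alert you can actually trust came from Bloomberg. As a toy stand-in for that idea (real WS-Security uses XML Signature over SOAP headers, not a bare HMAC, and the shared key here is entirely made up):

```python
import hashlib
import hmac

# Hypothetical key the author and subscriber share out of band.
AUTHOR_KEY = b"illustrative-shared-secret"


def sign_item(item_xml: str) -> str:
    """Produce a signature over the item so the author can be verified."""
    return hmac.new(AUTHOR_KEY, item_xml.encode(), hashlib.sha256).hexdigest()


def verify_item(item_xml: str, signature: str) -> bool:
    """Accept the item only if the signature matches the claimed author."""
    return hmac.compare_digest(sign_item(item_xml), signature)


item = "<item><title>Stock alert</title></item>"
sig = sign_item(item)
print(verify_item(item, sig))        # authentic item verifies
print(verify_item(item + " ", sig))  # any tampering fails verification
```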
We have the underlying pipe technology; what we lack is the proper binding to a new service-oriented world for blogs and content management in general.
I can hear people screaming: so many technologies to handle, this is going to be overly complex and won’t ever get implemented. Remember, every market consolidates, every market ends up with fewer players, and competition is driven by how much flexibility you can provide your users. You don’t write the API for MTOM or WS-Security; it’s already there (OK, not MTOM yet, but you get the point), just as you don’t reimplement the full XML-RPC stack each time you need it in one of your projects. The effort does give you things the current state of technology cannot:
· Authentication of the author of an item, a trackback, a comment…
· Real push technology
· Discoverability of new blogs in a virtual URL space
· No ties to HTTP in the technology, reuse of the idea of bindings
· Standards standards standards
· Ship more than just text with enclosures. Leverage your architecture as a content-delivery system. Just imagine if NewsGator sent updates to its software in a standard, push-based way, with proper authentication.
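The “real push” item above is essentially the observer pattern applied to feeds: subscribers register once and are notified of new items, instead of every aggregator polling on a timer. A minimal in-process sketch of that shape (WS-Eventing formalizes the same idea over SOAP, across machines):

```python
class FeedPublisher:
    """Notifies registered subscribers when a new item is published."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        # A subscriber hands over a callback instead of polling the feed.
        self.subscribers.append(callback)

    def publish(self, item):
        # New content is pushed to every subscriber as it appears.
        for notify in self.subscribers:
            notify(item)


received = []
pub = FeedPublisher()
pub.subscribe(received.append)
pub.publish("<item><title>New post</title></item>")
print(received)
```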
ATOM is not pushing these ideas forward (and they were the ones originally discussed); it ends up mistaking RSS for an architecture when it is only a format.