I have the good fortune to work as CSO for Talis, an innovative UK software company, at one of the most exciting times for software and the internet. I thought I would share some of the ideas and insights I am finding exciting at the moment.
The platform is dead. Long live the platform
It seems to me that as we pass into the era of Web 2.0, the software platform as we know it today will cease to have significant commercial value.
The principal reason is that the internet and Web 2.0 are enabling a move from code sharing to instance sharing for software platforms, causing the existing network effect mechanism for platforms to fail.
The good news is that we can expect new platform models to emerge, based on the properties of sharing a single, persistent online instance rather than sharing code across multiple isolated instances (e.g. Windows).
Some companies have already hooked into aspects of this new model. eBay and Amazon as platforms have it; Google as a platform does not. I am of course talking about the architecture of participation becoming the principal network effect mechanism for Web 2.0 platforms. That is, if the actions of users contribute to the shared state of the platform (through whichever platform application they may use) in such a way as to enhance the experience of other users, then there is a strong network effect based upon participation.
It is important to note that the forces enabling this new model are also undoing the previous model.
Here's why (IMHO).
Platforms
Over the past 10-15 years, Microsoft demonstrated both the enormous intrinsic and commercial value of software platforms. We have seen the battle for control of the platform played out across many segments of the software industry and layers of the software stack (Oracle vs Sybase at the database layer, Windows vs OS/2 for operating systems, IBM WebSphere vs BEA WebLogic for application servers, Symbian etc.).
The return on capital invested simply dwarfed other software models, and so platform leadership became an article of faith for many software companies as the one true strategy for growth. The amazing value creation is principally driven by two forces: reuse and the network effect.
Reuse: every application built on a platform is saved from having to make the investment to build the features that the platform provides. This massively lowers the cost of production (and therefore the capital invested) for application developers.
Software libraries and components do this too, yet they are not considered platforms. The difference between a software platform and a software library is the network effect, or ecosystem.
Network Effect: each application built for a particular platform increases the value of having that platform and, by extension, of every other application that already uses the platform. So the more applications for a platform, the greater the value of the platform. For a platform owner with a model that can extract commercial value from the platform's massive intrinsic value, the return on capital is a function of the investment that OTHER people have made. Or, put another way, they achieve a return on capital NOT invested by them. Pretty sweet.
But the real question should be: what causes the network effect in platforms? What is the mechanism by which the investment of application developer A increases the value of the platform and of application B?
Does that same mechanism hold in the world of Web 2.0? My belief is NO, it doesn't. And that will have a profound effect on the strategy of software companies over the next ten years. In fact, we are already seeing it.
Traditional cause of the platform network effect
It was the dependency on users having purchased and installed the platform in order to use its applications.
Your choice of purchase defined which applications you could use; naturally, the platform with the better range and quality of software is the more valuable (just as in the games console industry).
Web 2.0 removes the need for user purchase of the platform
As functionality moves off the user's machine and into a standards-based cloud, the user's choice of application platform effectively disappears. By definition, a Web 2.0 platform's API is web-based and implementation-neutral.
Consider the Google search APIs. Whether there is one application or a hundred built on them, the value of the platform is not much enhanced; those applications do not add anything to each other. No network effect. From an ecosystem point of view, Web 2.0 APIs are much more like software libraries than platforms.
Web 2.0 platform network effect
But Web 2.0 platforms have a new trick that traditional platforms don't have. They can easily present one shared instance, i.e. one shared state, to all the different users of all the different applications. This allows the actions of one user using application A of the platform to enhance the experience of another user using application B of the platform. This is the architecture of participation. It is easy to see how both eBay and Amazon increase the power of their content-based network effect through open-access APIs. It is also easy to see how this doesn't work for Google; the end user of a search app typically can't affect the shared state of the platform.
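To make the distinction concrete, here is a minimal sketch in Python. Every name and piece of data in it is invented for illustration; it is nobody's real API. The first class shares one instance across all applications, the second is stateless:

    # SharedStatePlatform: every application reads and writes ONE shared
    # instance, so one user's action can improve another user's experience.
    class SharedStatePlatform:
        def __init__(self):
            self.reviews = {}  # shared state: item -> list of review texts

        def add_review(self, item, text):
            # A write from ANY application enriches the shared instance.
            self.reviews.setdefault(item, []).append(text)

        def get_reviews(self, item):
            # A read from ANY application benefits from all prior writes.
            return self.reviews.get(item, [])

    # ReadOnlySearchAPI: each call is stateless. One application or a
    # hundred, the platform is left no more valuable than before.
    class ReadOnlySearchAPI:
        def search(self, query):
            return ["result for " + query]

    platform = SharedStatePlatform()
    platform.add_review("widget", "Great widget!")  # a user of application A
    print(platform.get_reviews("widget"))           # a user of application B benefits

The point is not the code but the shape: in the first case the value of the platform grows with every user action; in the second it doesn't.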
Open Source Network Effect
Developers still need traditional software platforms though.
So Web 2.0 platforms allow sharing of the platform's state, whereas traditional platforms allowed read-only sharing of the code.
There is a way that traditional platforms can drive a network effect: by allowing participation in the shared code of the platform. They can let users contribute to the code. This immediately drives a whole new network effect, which hugely increases the intrinsic value of the platform. Unfortunately for the existing platform vendors, nobody wants to submit code that somebody else will make money from. So this can only be done through open source. Linux is hugely valuable, but nobody can make $billions from its commercial sale, at least not directly.
Interestingly, as more open source code is created, it becomes easier to remix that code and create yet more open source software. The more general the software, the more valuable it is for it to have an open source incarnation; i.e. platforms are the natural target for open source, as we have seen with Linux, MySQL, JBoss etc.
So, for all the reasons above, I am pretty sure that as Web 2.0 progresses we will see the rise of a different type of platform, and the existing platform players will have a very hard time holding onto any serious returns.
Long live the platform of participation.


Web Services and the Innovator's Dilemma
Web 2.0 is a vision of the web where content and functions can be remixed and reused to create new content or new applications. Web services and the semantic web are two of the key enablers for this vision, but two rival approaches to each appear to be emerging. Why is that? Which is best?
Web Services
SOAP & WSDL - opens up a new vista of possibilities by solving some of the really hard problems (WS-this, that and the other), but requires expertise and new infrastructure, e.g. toolkits and app servers, to manage the complexity. Unsurprisingly, the app server vendors are driving these new standards in enterprise software.
REST - opens up a new vista of possibilities by making it very easy to use web application APIs, so new audiences can get involved, and it doesn't require much in the way of changes to the existing software stack. This is largely being driven by a very different community from the enterprise web services crowd.
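To illustrate the gap, here is a sketch of a complete REST client in a few lines of Python. The endpoint and parameters are invented for illustration (Amazon's real REST API differs in its details); the SOAP equivalent would need a WSDL, a toolkit-generated stub and a SOAP envelope before the first call:

    # A complete REST "client" is just an HTTP GET: no WSDL, no toolkit,
    # no generated stubs. The endpoint and parameters below are invented.
    from urllib.request import urlopen
    from urllib.parse import urlencode

    params = urlencode({"Operation": "ItemSearch", "Keywords": "web 2.0"})
    url = "http://api.example.com/search?" + params

    with urlopen(url) as response:
        xml = response.read().decode("utf-8")  # plain XML, readable by eye
    print(xml)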
Semantic Web
RDF & OWL - opens up new possibilities by solving some really hard problems. Requires expertise and therefore tooling and new infrastructure: a new query language, data storage, parsers etc. Driven by standards bodies like the W3C.
XHTML & Microformats - opens up new possibilities by lowering the barrier to participation for producers and consumers. Uses existing technology and can be hand-crafted, i.e. it disintermediates the expert.
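As a sketch of how low that barrier is: the hCard class names below (vcard, fn, url) are the real microformat vocabulary, but the person is invented and the regex "parser" is a deliberately crude illustration, not production code. No triple store, no OWL reasoner, just ordinary markup:

    import re

    # An hCard is just ordinary XHTML with agreed class names, so any
    # web author can hand-craft semantics today.
    hcard = """
    <div class="vcard">
      <span class="fn">Jane Example</span>
      <a class="url" href="http://example.com/">example.com</a>
    </div>
    """

    # A consumer needs no new infrastructure; even a crude pattern
    # match recovers the name.
    name = re.search(r'class="fn">([^<]+)<', hcard).group(1)
    print(name)  # -> Jane Example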
It seems to me that the difference in complexity and cost between the approaches is actually a symptom of something deeper.
SOAP web services are trying to go beyond what expert developers could already do with RMI, DCOM etc.
By its nature it must compete with what is already possible: mission-critical software systems that are trusted, secure, reliable and accountable, and that typically have a high cost of failure. Most of these developers could not buy into a new way of working if it meant going backwards in any of those critical areas.
Similarly, RDF & OWL are trying to go beyond what expert developers can do with semantics in XML today.
If you are familiar with "The Innovator's Dilemma" by Clayton M. Christensen, you may recognise this as the classic description of sustaining innovation. It must be better than what went before because it competes along the same dimensions with the same audience.
Christensen also describes what he terms "disruptive innovation", of which one type is the low-end disruption. This is where a technically inferior innovation radically reduces the barrier to entry (be that skill, cost or location), thereby allowing an audience that was previously excluded to participate. It competes on new dimensions with a new audience.
This massive new audience is currently excluded from the traditional solution, so for them the disruptive innovation only has to compete against being better than nothing.
So disruptive innovation allows a new, less skilled community to participate and do new kinds of things. Almost by definition this community is larger than the community of experts, i.e. it is the long tail.
If we consider both REST and microformats, we see that neither is technically as good as web services and RDF. But both are significantly easier, with lower skill and cost barriers for producer and consumer alike. And sure enough, Amazon are finding that the vast majority of the users of their platform use the REST APIs.
Software standards have always had a massive network effect: what good is a standard if nobody else uses it? This makes the size of the community around any standard or approach hugely important. The pace of innovation is also deeply linked to the size of the community that can innovate. Consider the number of web authors (including bloggers) who can probably get their heads around REST and microformats. It is vastly larger than the community of hard-core software developers on the planet.
Christensen describes, with many examples, how low-end disruptions rapidly become better and better until the complex high-end solutions are pushed off the map.
It wouldn't be the first time that an innovation became the de facto standard outside the corporate firewall and eventually became good enough to be adopted by the enterprise.
Am I saying that web services and RDF are doomed? The truth is I have no idea, but I doubt it. Experts create these new solutions because they are needed to solve genuinely difficult problems. But, on the other hand, vastly more innovation is likely when ordinary people gain the ability to do what is a "solved problem" for the expert. I would put money on Web 2.0 emerging first from the ordinary web user rather than from the software experts.
Microformats:
http://www.microformats.com/
http://www.tantek.com/presentations/2004etech/realworldsemanticspres.html

