Why isn't anyone plugging into cloud plugfest events anymore?

An intrinsic property of the innovation cycle is time: it takes time for innovations to settle in. In their current state, Cloud Interoperability Plugfests are facing serious challenges to their relevance.

The Cloud Plugfest Initiative, with which CloudWATCH2 collaborates, does not collect user interaction statistics beyond what Mailchimp’s free subscription tier offers; in particular, event registration and participation figures are not collected in any cohesive way. A historic analysis, and an extrapolation of the trajectory into the future, is therefore not possible.

This makes it difficult to measure the success of individual meetings, let alone the impact of plugfests as such, even though CloudWATCH2 did collect participation information for the three testing events it organised (the first of which had to be cancelled, see above). It is questionable whether the current plugfest format is still relevant. Participation levels at both the second and the third plugfest were negligible, and the stark difference in outcomes compared with the traditional plugfest in its heyday, with its high participation, is sobering when assessing the success of contemporary events.

Cloud Plugfest 10, for example, co-located with the EGI Technical Conference 2013 in Madrid, featured three days of workshops and hands-on testing, with between 30 and 50 attendees on any of the three days; recent plugfests, by contrast, saw fewer than 10 participants at each event.

The reasons behind this observation cannot be established conclusively, but several conjectures serve as plausible explanations.

 

Conjecture 1: Active development vs. maintenance.

Looking simply at the chronology of events, Cloud Plugfest 10 took place in autumn 2013, and the more recent plugfests over the course of 2016. In 2013, standards such as OCCI and CDMI, which define technical cloud interfaces, were still relatively new (OCCI 1.1 was published in 2011), and implementations were rare and immature.

Fast-forward three years: presuming continued interest in and demand for standards-based implementations, one would expect those implementations to have matured in that time, along with near-complete standards coverage and interoperability. The need for interoperability testing and developer implementation guidance that existed in 2013 would naturally have subsided by 2016, which would explain the decline in event participation.
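To make concrete the kind of testing these events were built around, the sketch below probes the OCCI discovery interface of a cloud endpoint and lists the category definitions it advertises, roughly the first interoperability check exchanged at a plugfest table. It is an illustrative sketch only; the endpoint URL and the helper function are hypothetical and not part of any official plugfest tooling.

    # Minimal interoperability probe (illustrative only; endpoint URL is a placeholder).
    # OCCI 1.1 exposes a discovery ("query") interface at /-/ ; in the text/plain
    # rendering, each advertised kind or mixin appears as a "Category:" line.
    import requests

    OCCI_ENDPOINT = "https://cloud.example.org:8787"  # hypothetical OCCI endpoint

    def list_occi_categories(endpoint):
        response = requests.get(endpoint + "/-/",
                                headers={"Accept": "text/plain"},
                                timeout=10)
        response.raise_for_status()
        return [line for line in response.text.splitlines()
                if line.startswith("Category:")]

    if __name__ == "__main__":
        for category in list_occi_categories(OCCI_ENDPOINT):
            print(category)

At a plugfest, two implementers would run probes of this kind against each other's endpoints and compare the advertised categories with the behaviour their clients actually expect.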

 

Conjecture 2: Correlation of event participation with project funding.

From a European perspective, the heyday of cloud plugfests correlated with the funding of three major projects under the EC FP7 programme, which ran from 2007 to 2013, with the projects themselves running well into 2016. These three major projects were:

  • EGI-InSPIRE
  • EMI (European Middleware Initiative)
  • IGE (Initiative for Globus in Europe)

Together, the three projects involved nearly all EU member states, as well as Norway and Switzerland; the EGI-InSPIRE project in particular covered almost all member countries.

All three projects received significant funding from the EC (35%, 48% and 63% funding for EGI-InSPIRE, EMI and IGE, respectively), continuing the EGEE series of projects funded by the EC in the preceding years. When EGI-InSPIRE initiated the cloud-related activities in this ecosystem in September 2011 as a federation of cloud infrastructures (the EGI Federated Cloud, built on standardised interfaces such as OCCI, CDMI, OVF, GLUE, Usage Records and others), activities in standards conformance and interoperability testing in the European academic cloud landscape increased sharply, also affecting ancillary projects such as OpenNebula, GRNET’s Okeanos project and many more with connections and collaborations in the EGI community.

Correlating the sparse historic information that is available with the runtime and funding of the projects mentioned above, the second half of the EGI-InSPIRE project, during which the EGI Federated Cloud initiative ramped up, correlates particularly well with the most successful and best-attended Cloud Plugfests.

This leads to a possible conjecture: participants attended Cloud Interoperability Plugfests simply because EC project funding was available to cover the costs. Without that funding, attendance may have been considered a lower priority.

 

Conjecture 3: Lack of incentives for service providers to implement standards.

Industry operates on a fairly simple principle: spend as little money as possible for as much revenue as possible. Although simplified, this serves well in explaining some of the mechanisms underlying this conjecture. If existing services generate revenue over and above the cost of sales (the cost of supply in the case of products), then this represents an appropriate response to an existing demand, in a relatively stable equilibrium.

 

In such a scenario, signing off on an expense to implement a particular standard that the demand side has not asked for represents a highly speculative cost that is difficult to justify, unless the standard is being implemented internally in order to improve the cost of supply and thereby increase the organisation’s profit margin. This scenario can be observed time and again, and industry standards and best practices for service operations and implementation emerge as a direct corollary of it. As Sebastian Kirsch of Google Zurich put it at the International Industry-Academia Workshop on Cloud Reliability and Resilience hosted by EITDigital and Huawei Europe (recollected from memory): “Standardise, standardise, standardise!” What he meant, however, was not standardisation at the public interface level, but internal standardisation, to improve reliability and resilience and thus lower the cost of service in terms of service incidents, outages, and software errors.

Alternatively, a scenario of rising demand for standardisation at the service interface level may help service providers justify the expense of implementing previously disregarded standards in two ways: (a) through direct sponsorship of the implementation, in the manner of project funding, or (b) by exposing the lack of standards support as a weakness of, and a threat to, their own offer compared to the competition.

While alternative (a) is quite straightforward in terms of cost-benefit analysis (in plain terms: “Pay me to implement the standard!”) within a customised software services business model, alternative (b) activates competitive mechanics: in a SWOT analysis, an organisation may regard rising demand for standards implementations as a weakness on the technical level (“Demand requires support for standards which our products do not provide”) and as a threat to business sustainability on the financial level (“Our services would be outcompeted, and our revenue from those services may therefore diminish”).

 

Conjecture 4: EC projects have an intrinsically different perception of security.

Standards such as ISO 27001 are considered an industry baseline. EC projects, however, seem to be regarded as incubators of technical innovation and therefore focus on the technical maturity of their outputs. Perhaps correlating with conjecture 3 above, EC projects thus seem to operate on the presumption that they do not have to integrate customer demand and customer orientation (i.e. market readiness) into their project plans and activities: while H2020 Research and Innovation project proposals are written with customer demand and needs in mind, these considerations seem to be insufficiently carried through into the project outputs and results.

 

Conjecture 5: The cadence of innovation, particularly disruptive innovation, may have become too fast.

Referring back to the Wardley Mapping methodology, especially the cycle of “Peace, War, and Wonder” (see above), an intrinsic property of this cycle, and of the cycle of innovation and standardisation, is time: it takes time for innovations to settle in and turn into products (or services), and finally into commodities (or utilities).

But what if the frequency of innovation, especially disruptive innovation, becomes too high, cutting deeply into the time needed for innovations to mature and set the scene for standardisation to occur?

There are signals that this might be the case, for example:

  • The business models and business strategies of Uber, AirBnB, Facebook and Google are under serious scrutiny or threat, the most recent example being the revocation of Uber’s licence to operate in London.
  • These companies are increasingly regarded not as tech companies but as companies with a classic business model that simply makes aggressive use of technology while “dodging” the regulations of the relevant sector: Uber in ride-hailing services, AirBnB in hospitality, Google and Facebook in news and media publishing.

Leaders of large-scale IT firms are at least beginning to think about the pace of change, the pace of innovation, and its impact on society.