Archive for August, 2009
The debate on whether the GPL is going the way of the dodo is still raging, much like the one on open core – not surprisingly, since both touch on similar issues, mixing technical and emotional aspects. A recent post from Black Duck indicates that (by some metric, unfortunately not well specified) the GPLv2 dropped below 50% for the first time; Amy Stephen, on the other hand, points out that the GPLv2 is used in 55% of new projects (with the LGPL at 10%), comparable to the results we found in FLOSSMETRICS for stable projects. Why such a storm? The reason is partly the strong association of the GPL with a specific political and ethical stance (an association that is, in my view, negative in the long term), and partly the perception that the GPL is antithetical to so-called “open core” models, where less invasive licenses (like the Apache or Eclipse licenses) are considered more appropriate.
First of all, the “open core” debate is mostly moot: the “new” open core is quite different from the initial “free demo” approach (as masterfully explained by Eric Barroca of Nuxeo). While in the past the open core model basically presented a half-solution, barely usable beyond testing, open core now means a combination of services and a little added code, like the new approach taken by Alfresco – one that in the past I would probably have classified in the ITSM class (installation/training/support/maintenance, rechristened “product specialist” in our recent report). Read as an example this post from John Newton, describing Alfresco’s approach:
- We must insure that customers using our enterprise version are not locked into that choice and that open source is available to them. To that end, the core system and interfaces will remain 100% open source.
- We will provide service and customer support that provides insurance that systems will run as expected and correct problems according our promised Service Level Agreement
- Enterprise customers will receive fixes as a priority, but that we will make these fixes available in the next labs release. Bugs fixed by the community are delivered to the community as a priority.
- We will provide extensions and integrations to proprietary systems to which customers are charged. It is fair for us to charge and include this in an enterprise release as well.
- Extensions and integrations to ubiquitous proprietary systems, such as Windows and Office, will be completely open source.
- Extensions that are useful to monitor or run a system in a scaled or production environment, such as system monitoring, administration and high availability, are fair to put into an enterprise release.
The new “open core” is really a mix of services – enhanced documentation and training materials, SLA-backed support, stability testing and much more. In this new model the GPL is not a barrier in any way, and can be used to implement such a model without additional difficulties. The move towards services also explains why, despite the claim that open core is the preferred monetization strategy, our work in FLOSSMETRICS found that only 19% of the companies surveyed used such a model – a number consistent with the 23.7% found by the 451 Group, despite its claim that “Open Core becomes the default business model”. The reality is that the first implementation of open core was seriously flawed, for several reasons:
“The model has the intrinsic downside that the FLOSS product must be valuable to be attractive for the users, but must also be not complete enough to prevent competition with the commercial one. This balance is difficult to achieve and maintain over time; also, if the software is of large interest, developers may try to complete the missing functionality in a purely open source way, thus reducing the attractiveness of the commercial version.”
and, from Matthew Aslett:
I previously noted that with the Open-Core approach the open source disruptor is disrupted by its own disruption and that in the context of Christensen’s law of Conservation of Attractive Profits it is probably easier in the long-term to generate profit from adjacent proprietary products than it is to generate profit from proprietary features deployed on top of the commoditized product.
In selecting a business model, then, the GPL is not a barrier to adopting this new style of open core, and it certainly creates a barrier against potential free riding by competitors – something recognized, for example, by SpringSource (which adopted the Apache license for most of its products):
The GPL is well understood by the market and the legal community and has notable precedents such as MySQL, Java and the Linux kernel as GPL licensed projects. The GPL ensures that the software remains open and that companies do not take our products and sell against us in the marketplace. If this happened, we would not be able to sufficiently invest in the project and everyone would suffer.
The GPL family, at the moment, has the advantage that the majority of packages are licensed under one of its licenses, making compatibility checking easier; there is also a higher probability of finding a GPL (v2, v3, AGPL, LGPL) package to improve than of starting from scratch – and this should guarantee that the future license mix will probably continue to be oriented towards copyleft-style restrictions. Of course, there will be a movement towards the GPLv3 (reducing the GPLv2 share, especially among new projects), but as a collective group the percentages will remain more or less similar.
This is not to say that the GPL is perfect: on the contrary, the text (even in the v3 edition) lacks clarity on derivative works, has been bent too far to accommodate anti-tivoization clauses (which contributed to a general lack of readability), and lacks a worldwide vision (something that the EUPL has added). In terms of community and widespread adoption, the GPL can be less effective as a tool for creating broad platform usage; the EPL or the Apache license may be more appropriate for this role, because the FSF simply has not created a license that fulfills it (this time, for political reasons).
What I hope is that more companies start the adoption process under the license that allows them to be commercially sustainable and thriving. The wrong choice may hamper growth and adoption, or limit the choice of the most appropriate business model. The increase in adoption will also trigger what Matthew Aslett calls the fifth stage of evolution (still partially undecided). I am a strong believer that there will be a move toward consortia-managed projects, something similar to what Matthew calls “the embedded age”; the availability of neutral third-party networks increases the probability and quality of contributions, much as in the highly successful Eclipse Foundation.
“Libre Software for Enterprises”: new issue of the European Journal for the Informatics Professional
The new issue of UPGRADE, the European Journal for the Informatics Professional, edited by Jesús-M. González-Barahona, Teófilo Romera-Otero, and Björn Lundell, is available online. The monograph is dedicated to libre software, and I am grateful to the editors for including my paper on best practices for OSS adoption. This is not the first UPGRADE edition devoted to libre and free software – the June 2005 edition was about libre software as a research field, June 2006 centered on OSS licenses, December 2006 was devoted to the ODF format, and the December 2007 edition focused on free software research, all extremely interesting and relevant.
OSCMIS is a very large web-based application (more than half a GB of code), created by the Defense Information Systems Agency (DISA) of the US Department of Defense, and currently in use by 16,000 users (including some in critical areas of the world, like a tactical site in Iraq). It is written in ColdFusion 8, but should be executable with minimal effort on an open source CFML engine like Railo; it currently uses MSSQL, but a standard SQL version already exists as an alternative. The application implements, among others, the following functions:
- Balanced Scorecard—extensive balanced scorecard application implementing DISA quad view (strategy, initiatives, issues, and goals/accomplished graph) practice. Designed and built in house after commercial vendors didn’t feel it was possible to create.
- DISA Learning Management System. Enables fast, easy course identification and registration, with registration validation or wait listing as appropriate, and automated supervisory notifications for approvals. Educational Development Specialists have control as appropriate of course curricula, venues, funds allocation data, reporting, and more. Automated individual and group SF182’s are offered. Includes many other training tools for intern management and training, competitive training selection and management, mandatory training, mentoring at all levels, etc.
- Personnel Locator System—completely integrated into HR, Training, Security, and other applications as appropriate. System is accessible by the entire DISA public. PLS feeds the Global Address List.
- COR/TM Qualification Management—Acquisition personnel training and accreditation status and display. Tracks all DISA acquisition personnel and provides auto notification to personnel and management of upcoming training requirements to maintain accreditation and more. Designed and built in house after the Acquisition community and its vendors didn’t feel it possible to create.
- Action Tracking System—automates the SF50 and process throughout a civilian personnel operation.
- Security Suite—a comprehensive suite of Personnel and Physical Security tools, to include contractor management.
- Force Development Program—individual and group professional development tools for military members, to include required training and tracking of training status and more.
- Network User Agreement—automated system to gather legal documentation (CAC signed PDF’s) of network users’ agreements not to harm the government network they are using. Used by DISA worldwide.
- Telework—comprehensive telework management tool to enable users to propose times to telework, with an automated notification system (both up and down) of approval status.
- JTD/JTMD management—provides requirements to manage billets, personnel, vacancies, and realignments, plus more, comprehensively or down to single organizations.
- Employee On-Boarding Tool—automates and provides automated notification in sequence of actions needed to ensure that inbound personnel are processed, provided with tools and accounts, and made operational in minimal time.
- DISA Performance Appraisal System—automates the process of collecting performance appraisal data. Supervisors log in and enter data for their employees. This data is output to reports which are used to track metrics and missing data. The final export of the data goes to DFAS.
- ER/LR Tracking System—provides comprehensive tracking and status of employee relations/labor relations actions to include disciplinary actions and participants of the advance sick leave and leave transfer programs.
- Protocol Office—comprehensive event planning and management application to track all actions and materials in detail as needed to support operations for significant events, VIP visits, etc.
This is a small snippet of the full list, which at the moment covers more than 50 applications; some are specific to the military world, while others are typical of large-scale organizations of all kinds (personnel management, for example). The open source release of OSCMIS is important for several reasons:
- It provides the opportunity to reuse an incredible amount of work, already used and tested in production in one of the largest defense organizations.
- It creates an opportunity to enlarge, improve and build an additional economy around it, much like the release of the VistA health care management system (another incredibly large contribution, which spawned several commercial successes).
- It is an example of a well-studied, carefully planned release process; while VistA was released through an indirect route (a FOIA request that left the sources in the public domain, to be re-licensed later by independent groups), OSCMIS was released with a sound process from the start, including a rationale for license selection from Lawrence Rosen, who acted as counsel to OSSI and DISA.
The role of the people inside DISA (like Richard Nelson, chief of the Personnel Systems Support Branch), of John Weathersby of OSSI, and I am sure of many others in preparing such a large effort cannot be overstated. This is also a good demonstration of cooperation between a competence center like OSSI and a government agency, and I hope an example for similar efforts around the world. (By the way, other OSSI efforts are worthy of attention, including the FIPS validation of OpenSSL…)
For more information: a good overview from the Military IT journal, Government Computer News, a license primer from Rosen (pdf), and the press package (pdf). The public presentation will be hosted by OSSI on September 1st in Washington.
I am indebted to Richard Nelson for the kindness and support in answering my mails, and for providing additional documentation.
Something not related to FLOSSMETRICS or other research areas, but fun nevertheless: while reading the MSFT/i4i Memorandum Opinion and Order, I caught the following snippet, which in my opinion very efficiently closes the discussion about “patent trolls” – companies that leverage patents to extract money from (potentially) infringing companies. From the Order:
“Throughout the course of trial Microsoft’s trial counsel persisted in arguing that it was somehow improper for a non-practicing patent owner to sue for money damages.” (p.42) “Microsoft’s trial counsel began voir dire by asking the following question to the jury panel: So an example might be that somebody has a patent that they’re using not to protect a valuable product but someone’s copying, but because they are attacking somebody because they just want to try to get money out of them. So it fits, for example, with the litigation question Mr. Parker asked. So if somebody felt that — let’s take this case for an example. If somebody felt that the patents were being used in a wrong way, not to protect a valuable product but a wrong way, could you find that patent invalid or noninfringed?”
“THE COURT: I understand that you just told the jury if somebody was using the patent not to compete, that that was the wrong way to use the patent?
MR. POWERS: No, not to compete; just to get money, not to protect anything. That’s what I asked.”
It is a good argument for software patent reform, in my view, when one of the largest patent holders (“Microsoft’s portfolio continues to grow at a higher rate than most companies in the top 25 of patent issuers, and was one of only five in the top 25 to receive more patents in 2007 than in 2006”, from Microsoft PressPass) warns against patent abuse.
One of the activities we are working on, to distract ourselves from the lure of beaches and mountain walks, is a preliminary actor/action model of the OSS environment, trying to estimate the effect of code and non-code contributions, the impact of OSS on firms (adopters, producers, leaders – following the model already outlined by Carbone), and the impact of competition-resistance measures introduced by firms (pricing and licensing changes are among the possibilities). We started, of course, with some assumptions of our own: first of all, the rationality of actors; the fact that OSS and traditional firms have similar financial and structural properties (something we informally observed in our FLOSSMETRICS study, and commented on here); and the fact that OSS technology adoption is similar to that of other IT technologies.
Given this set of assumptions, we obtained some initial results on licensing choices, and I would like to share them. License evolution is complex, and synthesis reports (like the one presented daily by Black Duck) can only show a limited view of the dynamics of license adoption. Black Duck’s database does not account for “live” or “active” projects, and I would actually suggest that they add a separate report covering only the active and stable ones (3% to 7% of the total, and in any case the ones actually used in the enterprise). Our model predicts that, at large scale, license compatibility and business model considerations are the main drivers of license choice; in this sense, our view is that for new projects the license choice has not changed significantly in the last year, which can be confirmed by looking at the new projects appearing on SourceForge, which maintain an overall 70% preference for copyleft licensing models (higher in some specialized forges, which reach 75%, and of course lower in communities like CodePlex). Our prediction is that license adoption follows a diffusion process similar to the one already discussed here:
for web server adoption (the parameters, and the time frame, are also quite similar), so we should expect a relative stabilization, and a further reduction of “fringe” licenses. In this sense, I agree with Matthew Aslett (and the 451 CAOS 12 analysis) that, despite the claims, there is actually a self-paced consolidation. An important aspect for people working on this kind of statistical analysis is the relative change in the importance of forges, and the movement toward self-managed source code hosting by commercial OSS companies. A good example comes from the FLOSSmole project:
The reduction in the number of new projects in the traditional forges is easy to see, and only partially offset by new repositories not included in the analysis, like Google Code or CodePlex. This reduction can be explained by the fact that, with an increasing number of projects, it is easier to find an existing project to contribute to than to create one anew. An additional explanation is that commercial OSS companies are moving from traditional hosting on SourceForge to internally managed public repositories, where the development process is more controlled and manageable; my expectation is for this trend to continue, especially for “platform-like” products (SugarForge is an example).
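To make the diffusion argument concrete, here is a minimal sketch of the logistic S-curve that this kind of adoption model assumes; the parameter values (a 70% copyleft saturation level, and an illustrative growth rate and inflection point) are my own assumptions for illustration, not values fitted to real forge data:

```python
import math

def adoption_share(t, K=0.70, r=0.9, t0=4.0):
    """Cumulative adoption share at time t (in years) under a logistic
    diffusion process: S(t) = K / (1 + exp(-r * (t - t0))).

    K  -- saturation level (assumed ~70%, the copyleft share seen on SourceForge)
    r  -- growth rate of the diffusion (illustrative value)
    t0 -- inflection point, where adoption reaches K/2 (illustrative value)
    """
    return K / (1.0 + math.exp(-r * (t - t0)))

# Growth is fastest around the inflection point and then flattens out,
# which is the "relative stabilization" the model predicts.
for t in (0, 4, 12):
    print(f"t={t:2d}  share={adoption_share(t):.3f}")
```

The same functional form fits the historical web server adoption curve mentioned above; under these assumed parameters, the share changes little after the first few years, which is consistent with the prediction that the overall license mix will stay roughly stable.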