Distributions
The value of specs
The value of design specifications ("specs") for open-source projects is something of an open question. Some projects, with the Linux kernel perhaps being the most prominent, eschew specs in favor of code. Other projects, such as various OpenStack sub-projects, have a fairly heavyweight process that requires specs for most proposed features. In a recent openstack-dev discussion, the value of requiring specs for the Nova compute component was called into question.
Nova manages the compute resources for an OpenStack cloud. Those resources are in the form of different kinds of virtual machines (VMs) from hypervisors such as KVM, Xen, VMware, and Hyper-V or from container technology like LXC or Docker.
That discussion started with a June 24 post from Nikola Đipanov that was rather negative about the whole spec process. Đipanov cited a few examples he had encountered just that week where he felt the spec process had gone wrong. A big part of the problem, he noted, is that Nova is so large and tightly coupled that a heavyweight process has been put in place to, effectively, slow or stop changes from being made. It is the tight coupling that needs to be addressed, he argued, but the process itself is preventing that.
But Daniel Berrange and others did not see things quite that way, with Berrange pointing out that the situation was far worse before the Nova project adopted specs.
New OpenStack features always require a blueprint in Launchpad, but many components, including Nova, have adopted a requirement that most features need specs based on a project-specific template. Blueprints are typically a much simpler statement of the problem to be solved, while specs require a great deal more detail, including design information, use cases, impacts, and more. In addition, new features are only approved for a single six-month development cycle; if they spill over into the next cycle, they must be re-reviewed and approved again.
While specs have made things much better, there are still a number of problems with the process, Berrange said. It is too rigid and bureaucratic, and too many features are being pushed into the spec process that could simply be handled with just a blueprint. Also, tying the spec review and approval schedule to that of the overall development cycle is counterproductive: "We should be willing to accept and review specs at any point in any cycle, and once approved they should remain valid for a prolonged period of time - not require us to go through re-review every new dev cycle as again that's just creating extra burden." In addition, as Đipanov also noted, there is only a subset of the core team (which is "already faaaar too small", Berrange said) that can approve specs, which creates further bottlenecks.
Others strongly agreed that specs have made things better, but some questioned whether the Gerrit code-review tool was the best mechanism for reviewing specs. One of the reasons for requiring specs (and placing them into Git repositories so they could be reviewed via Gerrit) was the inability to comment on blueprints in Launchpad. But code-review tools foster a line-by-line approach, which, as Technical Committee chair Thierry Carrez noted, is not optimal for reviewing specs.
Part of the problem is that the spec template is overkill for many features, Carrez said. It would be better to start small and build more into a spec as it gets reviewed:
You *can* do this with Gerrit: discourage detail review + encourage idea review, and start small and develop the document in future patchsets as-needed. It's just not really encouraging that behavior for the job, and the overhead for simple features still means we can't track smallish features with it. As we introduce new tools we might switch the "feature approval" process to something else. In the mean time, my suggestion would be to use smaller templates, start small and go into details only if needed, and discourage nitpicking -1s.
Đipanov agreed with Carrez's ideas, and suggested that investigating other tools might be in order. On the other hand, Kyle Mestery noted that the Neutron networking component had recently switched from a heavyweight spec-based process to one that uses "request for enhancement" (RFE) bugs instead. The reasons behind the switch, as outlined in a blog post from Mestery, sound rather similar to the complaints heard in the Nova thread. So far, that switch is working out well, Mestery said.
The RFE process was also championed by Adam Young. He strongly agreed with Đipanov that Gerrit was not the proper tool for the job and suggested that keeping the documentation with the code (and keeping them both in sync) would avoid "bike shedding about Database schemas". But Berrange said that hearkened back to the days before specs for Nova, which "really didn't work at all - code reviews are too late in the workflow to start discussions around the design, as people are already invested in dev work at that point and get very upset when you then tell them to throw away their work".
But Young is fairly adamant that the spec process is holding back progress in the code.
On the other hand, the spec process is not necessarily the real bottleneck, as James Bottomley pointed out; review bandwidth will not magically increase simply by removing specs from the process. He is concerned that precious reviewer time may be wasted on things that should already have been accepted (because they are an obvious bug fix, say) or rejected (for bogus code). Reducing the number of reviews required to get to a resolution is the way to stretch review resources.
There is another, possibly overlooked, advantage to the spec process that Tim Bell raised: it allows operators and other users without Python knowledge to "give input on the overall approach being taken". If commenting is left until code review time, it leaves out those who aren't able to read the code—and who may have important thoughts based on running OpenStack in production. Đipanov acknowledged that, but is still concerned about the weight of the process for many of the features proposed for Nova.
In a summary post, Đipanov outlined the positives and negatives with regard to the Nova process that had emerged from the discussion. The strident "specs don't work" attitude from his initial post is replaced with a more even-handed view. The post also makes some concrete suggestions for moving forward.
Full-blown specs should not be required from the outset, he suggested. Instead, a simpler blueprint that is mirrored into the repository could be used and a spec should only be created if multiple core team members (or a larger number of contributors) request one (by making a negative vote on the blueprint). In addition, feature approval should not necessarily expire when a release is made—expiration should strictly be for the specific features that require it. Lastly, new tools should be considered that would facilitate a more nimble process, perhaps along the lines of what Carrez described.
At some level, this is a struggle between those of a more "agile" mindset and those who are more process-oriented. It seems that there is broad agreement that improvements are needed to the current Nova development process, but where and how those changes will come is not yet clear. The OpenStack project, though, has multiple components, each with its own process, that can be studied to see what works and what doesn't—and why. Beyond that, sub-projects like Nova can also look at the wider free-software world for ideas. A bit of observation and iteration is likely all that is required to find some useful improvements to the Nova development process.
Brief items
Distribution quotes of the week
openSUSE Leap 42.x
We felt that Leap, with reference to motion, i.e. how the distribution moves forward, provides a nice contrast to Tumbleweed. It also represents that we are taking a leap to get there.
Happy 2nd Epoch CoreOS Linux
CoreOS celebrates its second birthday with an alpha release. "Two years ago we started this journey with a vision of improving the consistency, deployment speed and security of server infrastructure. In this time we have kicked off a rethinking of how server OSes are designed and used."
Distribution News
Debian GNU/Linux
Debian to switch back to ffmpeg
After nearly a year of consideration, the Debian project has decided to switch back to the ffmpeg multimedia library at the expense of its fork libav. See this wiki page for a summary of the current reasoning behind the switch.
Preparing for GCC 5/libstdc++6 (and GCC 6)
Matthias Klose notes that GCC 5 will soon be the default compiler in Debian sid. "Compared to earlier version bumps, the switch to GCC 5 is a bit more complicated because libstdc++6 sees a few ABI incompatibilities, partially depending on the C++ standard version used for the builds. For some C++11 language requirements, changes on some core C++ classes are needed, resulting in an ABI change."
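The ABI split is visible at the source level through a libstdc++ configuration macro. As a minimal sketch (not taken from Klose's announcement), the program below uses _GLIBCXX_USE_CXX11_ABI, the macro GCC 5's libstdc++ uses to choose between the old std::string/std::list layouts and the new C++11-conforming ones; the program itself is purely illustrative.

    // Minimal sketch: report which libstdc++ ABI this translation unit
    // was built against.  _GLIBCXX_USE_CXX11_ABI is defined by libstdc++
    // headers in GCC 5 and later; it is absent on older releases.
    #include <iostream>
    #include <string>

    int main() {
    #if defined(_GLIBCXX_USE_CXX11_ABI) && _GLIBCXX_USE_CXX11_ABI
        // New ABI (the GCC 5 default): std::string is
        // std::__cxx11::basic_string, which cannot be reference-counted,
        // as C++11 requires.
        std::cout << "built against the new C++11-conforming ABI\n";
    #else
        // Old ABI: the pre-GCC-5 layouts, kept for compatibility.
        std::cout << "built against the old ABI\n";
    #endif
        return 0;
    }

Compiling with "g++ -D_GLIBCXX_USE_CXX11_ABI=0" forces the old layout; mixing objects built against different layouts is what makes the library transition in the archive necessary.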
Ubuntu family
Ubuntu 14.10 (Utopic Unicorn) reaches End of Life
Ubuntu has announced that version 14.10 (Utopic Unicorn) will reach its end of support on July 23. The supported upgrade path is via Ubuntu 15.04.
Newsletters and articles of interest
Distribution newsletters
- DistroWatch Weekly, Issue 617 (July 6)
- Ubuntu Weekly Newsletter, Issue 424 (July 5)
Page editor: Rebecca Sobol