
Thursday, November 30, 2006

Travelling at the speed of Wikipedia

Upon reading the New Scientist article about the Antikythera mechanism's purpose, it occurred to me to wonder whether it was mentioned in Wikipedia. In the time that it took for me to:
  • locate the appropriate article,
  • notice that it went straight from the abacus to Schickard's mechanical calculators without mentioning the Antikythera mechanism and
  • work out how to add a discussion section to suggest adding a reference to the Antikythera mechanism,
someone else had come along and added a paragraph on the mechanism to the article itself!

Amazing.

Links for 2006-11-30

Wednesday, November 29, 2006

Links for 2006-11-29

Tuesday, November 28, 2006

Sun's Simon Phipps in London, 20-Dec

(Simon is Sun's Chief Open Source Officer)

The BCS Open Source Specialist Group is arranging a talk by Simon on, you guessed it, opening Java. I'll be there.

Links for 2006-11-28

  • How cocaine is made. The best anti-drug video ever? I'm not so sure. Plenty of people eat sausages and chicken "nuggets" despite knowing what they are. Still, the amount of petrol/gasoline used is remarkable.
  • Makezine's Open source gift guide. Cool! (Thanks to Jedd for the link.)
  • Mark Cuban finally posts something link-worthy :-) An analysis of which of a local newspaper's customers are most strongly bound to it (i.e. local display ad buyers, vs. national display ad buyers - which Google etc. have on their radar - and classifieds buyers - which Craigslist has already grabbed) and how to derive new benefit from it. Interesting.

Saturday, November 25, 2006

Thresher's word-of-mouth campaign

Of particular interest to UK residents planning to stock up on wine/champagne for Christmas: Threshers is offering 40% off purchases between November 30 and December 10 to anyone who presents this coupon. There is no mainstream advertising for this; it appears to be aimed particularly at those "in the know".

(Note that the link is to Stormhoek, a South African wine maker which has been successfully building its UK brand with Hugh Macleod's help, almost entirely on his Hughtrain's markets-as-conversations basis. It sounds as though Stormhoek is gradually persuading Threshers of the value of the approach. Well done Hugh!)

A repeat of the Starbucks fiasco (free coffees for friends and family, which got out of hand and led to the cancellation of the offer) seems unlikely; this is "only" a 40% reduction, and given that it can't be used in combination with the existing 3-for-2 offer (a 33.3% reduction), which covers most/all of their wine anyway, it really does look like a promotion rather than a giveaway. Still, if you're aiming to stock up for Christmas anyway, and Threshers is convenient to you (my only local bottlo is, in fact, a Threshers), an extra 6-7% off is not to be sneezed at.

Enjoy.

(via gapingvoid)

UPDATE 2006-12-02: Hugh wrote the Hughtrain, not the Cluetrain...

Links for 2006-11-25

  • The Stunt Train SEO Marketing Manifesto, a potted summary of the SEO school of thought; not so much how to tweak your web pages to get better rankings, but how to structure your business/activity as a whole to that end. (via gapingvoid)
  • Rodrigo Dauster's Elusive Customer Manifesto, which points out in not so many words that ignoring the cues and information that customers provide continually, in favour of intrusively asking for the same information (surveys, call centre scripts for common transactions, polls, ...), is dumb, and a blog post, Brands should learn from my wife, which offers some motivating examples. (via gapingvoid)
  • Complex life on Earth may have been spawned by an ecological disaster which, unlike previous ones, happened to occur when more complex species with higher resource requirements were ready to appear (and the removal of competition from simpler species paved the way). This reminds me of ideas about the sudden increases in personal power (and what we'd now call human rights) following the Black Death in medieval Europe; that the sudden reduction in population allowed former serfs to acquire productive land cheaply.
  • Litvinenko appears to have had radioactive polonium in his system (Po-210, whose half-life is 138 days, rather than decades or millennia for other isotopes). This is a little odd; pre-mortem, his doctors had already publicly eliminated heavy metals, in particular thallium.

Friday, November 24, 2006

Links for 2006-11-24

  • Bruce Perens petitions Novell to renegotiate its deal with Microsoft.
  • The 800-bed development in Perisher gains approval. (The article says "approved a $112 million concept plan"; it is not clear whether this is a formal DA or more an agreement in principle, but it would suggest that the development is now [near-]certain.)
  • A group of scientists establishes a series of connections between gases released from a fissure in Iceland in 1783/4, an unusually cool northern winter (through increased albedo), lowered water levels in the Nile and possibly even drought in India. This work provides a predictive model for the consequences of eruptions at northern latitudes, a counterpart to existing models for the northern consequences of tropical eruptions.
  • The laws of simplicity. (Life simplification, etc.)
  • DRM is dead, sort of (son of DRM?): "the new model for us is partnership. It always was, I’m just not sure we got it". (via Boing Boing)

Wednesday, November 22, 2006

On Novell's open letter

I've just posted this to the SCLUG list:

On Tue, 2006-11-21 at 23:05 +0000, ed wrote:

> personally i'm in mixed minds about novell now. i cannot decide if their
> intentions are good or evil.
>
> http://www.novell.com/linux/microsoft/community_open_letter.html
>
> from the above they want to make it appear that their ideas are good and
> pure.

Novell is not in a position where it can admit "it is our intention to
co-operate with Microsoft in destroying its only serious platform
competitor, thereby helping Microsoft to re-establish its effective (and
unlawful) monopoly". Consequently, no useful information about Novell's
actual intentions can be gleaned from Novell's stated intentions.

> am i right in thinking that the gpl now prevents Novell from
> distributing gpl'd code, since they have an alliance with a company now
> that is taking/trying to take action against gpl'd code distributors...
> all seems like a big mess now.

RTFL!

Note in particular clause 7. If Microsoft is maneuvered into actually
suing someone for patent infringement over use of non-Novell Linux, and
wins (so it becomes established in law that running Linux does require
licenses on Microsoft patents), then Novell will have to abort (or at
least suspend) its Linux business overnight:

GPLv2> For example, if a patent
GPLv2> license would not permit royalty-free redistribution
GPLv2> of the Program by all those who receive copies
GPLv2> directly or indirectly through you, then the only way
GPLv2> you could satisfy both it and this License would be to
GPLv2> refrain entirely from distribution of the Program.

Note that this is a constraint on Novell's activities, not a constraint
on its licensees' activities; consequently Novell can't escape this
particular constraint by indemnifying its licensees (customers). This is
different from the kind of prohibition (triggered by co-operating with
an attacker) that you suggest; measures of that sort are being discussed
in GPLv3 for protection against DRM and patents.

It is conceivable that Microsoft is planning to do this to Novell, but I
suspect that it's more likely that Microsoft is aiming to repeat the FUD
campaign that it has already waged under the SCO banner (which requires
that the game be played for an extended period; the FUD being more
valuable than victories in court). It will, no doubt, successfully FUD
at least some customers into choosing Windows over Linux.

Note also that it may in fact be pure FUD, that Microsoft may have no
patents that a court will uphold against Linux distributors.

Some thoughts on the open letter:

Novell> Our interest in signing this agreement was to
Novell> secure interoperability and joint sales agreements,
Novell> but Microsoft asked that we cooperate on patents as
Novell> well, and so a patent cooperation agreement was
Novell> included as a part of the deal.

This is a non sequitur. To see why, imagine that it had said "but
Microsoft asked that we cooperate on large-scale armed robbery as well,
and so the hiring of a few hundred mercenaries was included as a part of
the deal". It is not enough to state that a term was included at the
other party's request; by virtue of acceding to this request (and
signing the deal) Novell has itself implicitly asserted that this is a
reasonable term, despite any spin that it's now trying to put on it.

Novell> We disagree with the recent statements made by
Novell> Microsoft on the topic of Linux and patents


Oh dear, how astonishing. Perhaps Novell's lawyers are so inept that
they neglected to even form an opinion on, and/or discuss with
Microsoft, the possibility of patent infringement in distributing Linux.


- Raz

Monday, November 20, 2006

Calvin and Hobbes on creativity

Can't find the cartoon, however:
Calvin: “You can't just turn on creativity like a faucet. You have to be in the right mood.”
Hobbes: “What mood is that?”
Calvin: “Last-minute panic.”
- Bill Watterson
Indeed.

UPDATE 2007-02-02: Thanks Rado, the comic is here.

Friday, November 17, 2006

Sun, Java and GPLv2

I just posted a comment on Cringely's pulpit which (a) ended up being a little long and (b) had its formatting dropped. Here is what it was supposed to look like:

> Because this isn't your father's GPL,
> that's why. Sun put Java under GPL v.2,
> which gives the original licensor some
> unique rights.

Gnu GPLv2 is now 15 years old and covers substantially _ALL_ software that's under any version of the Gnu GPL (indeed, I had trouble locating a copy of the text of GPLv1). This has been the case since before the term "open source" entered wide use.

(It's possible that you're thinking about GPLv3 but (a) it's still a draft and (b) it's even more hostile to the sort of co-opting that you describe.)

What Sun's actually done, and what almost no company before them has done, is to bend over backwards to do this right. They've resisted the siren-song of corporate counsel who feel the need to FUD their employer into paying them to invent entirely new legalese, which doesn't interoperate with anyone else's legalese. (My own failure to convince Zawinski that a GPL dual-license was a good thing for Mozilla still smarts; it meant that for the first couple of years of the Mozilla project (until dual-licensing took place, after Zawinski quit), Gnome developers were shut out completely. This experience has perhaps biased me, but to see a major corporate source drop done right is fantastic.)

Further, note that Sun hasn't merely pinned the tail on a politically-correct GPLv2 donkey, they've gone through this in excruciating detail to get it just right. Instead of taking the "obvious" LGPLv2.1 option for the libraries, they've taken note of the existing practice by other open-source Java projects and adopted GPLv2 with "the classpath exception". With respect to the transition period for their own libraries (they hold outright copyrights in the compiler and VM, but the libraries contain encumbrances which will take time to remove, so they've not made a library release yet), they've worked with the Software Freedom Law Center to craft a specific exemption that stops applications built atop the standard APIs becoming GPLv2-encumbered when shipped _with_ the open-source Sun VM (under GPLv2) and the closed Sun libraries.

The legal groundwork that they've done is exemplary; it's really, really impressive. Someone inside Sun has asked some open-source/free-software advocates how it _should_ be done, and then listened very closely to the answer(s).

> Say you extend Java, under GPLv2 the way to
> give your improvements to the world is by
> giving them back to Sun.

No, your obligation is to make the complete source for the modified work available to your own licensees under the terms of the GPLv2. Note that Sun will not accept contributions into their _own_ source tree unless you sign a contribution agreement granting them what amounts to co-ownership. (You lose no rights, other than the right to sue Sun for using your work; Sun gains equal rights.) This is actually weaker than what the FSF itself requires for contributions to the GNU project (they require outright assignment). In Sun's case it allows them to solve the sticky problem of continuing to service their paying customers for whom GPLv2 is a non-option (primarily embedded software developers). I wish I'd known about this variant of dual-licensing in 1998!

(Note also that Sun has the good sense/fortune to own almost the entire JRE+JDK outright; Mozilla had far more encumbrances in 1998, so the GPLv2 was never going to work as the only option. They ended up releasing an eviscerated source tree anyway (it wouldn't even build), but this is as much about how widespread the encumbrances were as about licensing limitations. This takes nothing away from the brilliance of what Sun has actually done.)

> Or, if you'd like to keep those changes to
> yourself, it requires negotiating a non-GPL
> license with Sun, which means you'll have to
> PAY Sun to USE YOUR OWN CODE.

Here copyright and property concepts get a little muddled (hence Stallman's railing against the term "intellectual property"). If you keep your changes to yourself, or only distribute them within your own organisation, then the GPL does not oblige you to do anything. Indeed it cannot: as a naked license (not a contract/agreement), it cannot impose obligations unless you are performing an act (e.g. distribution) that is controlled by copyright law. Note that "use" is _not_ controlled by copyright law, and therefore not restricted by the GPL.

On the other hand, if you want to take Sun's source, create a derivative work and distribute it without GPL obligations then, yes, Sun offers an option which involves money changing hands. Note that the difference with other projects using the GPL (Linux, GNU toolchain, EMACS, ...) is that there is _no_ legal way to distribute your derivative work without GPL obligations (most projects' copyrights are too widespread to get unanimous consent (not hypothetical; it's actually been tried with the Linux kernel) and the FSF would point-blank refuse), so Sun is providing an additional option (for a fee), not extracting monies for activities that would otherwise be free of charge.

> Under GPLv2 Sun benefits significantly more
> than it would have under the original GPL.

No, GPLv2 was essentially a cleanup of GPLv1.

> Sun controls the code, it controls forking,

No, it doesn't. Anyone is free to fork it tomorrow. Seriously.

> and anyone who wants a special deal has to pay.

True, this option exists. For most projects using GPLv2, anyone who wants a special deal simply can't get one. This plays nicely into the hands of proprietary software developers. Sun's approach avoids this competitive exposure.

> For a product that was generally given away,
> anyway, going with GPLv2 will probably make
> Sun more money -- probably a LOT more money
> -- than the company would have made by
> keeping the source closed.

Perhaps. The JRE+JDK was a loss-leader from the outset; the intent was to get adoption as widespread as possible so that they could sell related products and services. Sadly, they were obsessively focussed on preventing forks, which meant no open-source licensing, which severely curtailed reach amongst their largest natural constituency (developers with horizons wider than "we use it because it comes from Microsoft"). Sun has at last realised this error, realised that trademark law makes it possible to prevent forks from creating confusion, perhaps even realised that the ability to fork is a good thing, not a bad thing (Stallman himself opposed the EGCS changes to gcc, until that group forked gcc, refined the approach and convinced everyone that it was a better approach; Stallman finally relented, so EGCS is now known as gcc version 3).

So, big picture, yes, if opening the most widely used Java implementation (a) releases a lot of pent up demand (there was so much that there are already open-source implementations of most of the JRE+JDK) and, (b) more importantly, leads to the embrace of Java by a lot of open-source developers who have to date carefully avoided it because of Sun's stance, then yes, the platform's presence will enlarge and Sun's related-products-and-services revenue will increase substantially. This is good for Sun, good for the open-source community and good for the free-software movement. In fact it's good for just about everyone except Microsoft.

Thursday, November 16, 2006

Links for 2006-11-16

Tuesday, November 14, 2006

perl: warning: Setting locale failed.

{{ Another "I've fixed this before, but had trouble remembering how, so here it is for posterity" post. }}

I periodically encounter this, particularly in chroots:
perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
LANGUAGE = "en_AU:en_US:en",
LC_ALL = (unset),
LANG = "en_AU"
are supported and installed on your system.
perl: warning: Falling back to the standard locale ("C").
The cause is that /etc/environment asks for locales that have not been generated on the system:
LANGUAGE="en_AU:en_US:en"
LANG=en_AU
The fix is to run
# dpkg-reconfigure locales
and to turn on some useful locales (en_(AU|GB|US) with various UTF-8/8859-1/8859-15 character sets in my case).

Monday, November 13, 2006

Links for 2006-11-13

  • A new study suggests that space elevators may have a problem for human passengers: the Van Allen radiation belts. Not only is the radiation considerably stronger at the equator than elsewhere but, more importantly, anticipated space elevator speeds (think high-speed train, just vertical) are about a hundredth of rocket speeds, thus increasing single-trip radiation exposure to potentially lethal levels.

Sun's J2SE under Gnu GPLv2, TODAY!

I can't help having an "I'll believe it when I see it" response to this news but, allegedly, just seven hours from now Sun will make a partial j2se source release under the Gnu GPLv2. A live webcast is scheduled for 10:30am PT (5:30pm UTC) today and according to Tim Bray:

Unmodified GPL2 for our SE, ME, and EE code. GPL2 + Classpath exception for the SE libraries. Javac and HotSpot and JavaHelp code drops today. The libraries to follow, with pain expected fighting through the encumbrances. Governance TBD, but external committers are a design goal. No short-term changes in the TCK or JCP.

I wonder whether Sun will be a little more reasonable about Java trademarks than Mozilla has been about Firefox (think Iceweasel).

(via Scobleizer)

Friday, November 10, 2006

Links for 2006-11-10

  • A group of researchers has defined Jitterbugs: keyboard loggers that use modulated keystroke delays to provide a covert channel through which to "exfiltrate" logged information to remote eavesdroppers on services that process keystrokes (ssh, telnet, IM, ...). (via Schneier on Security) A toy sketch of the timing channel follows.
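
To make the timing-channel idea concrete, here is a toy simulation of my own devising (not the paper's actual scheme): the bug stretches the gap before a keystroke to signal a 1 bit, and the eavesdropper recovers the bits by thresholding the gaps it observes. The figures are invented and chosen only so the example is deterministic.

import random

EXTRA_MS = 30       # extra delay added before a keystroke to signal a 1 bit
THRESHOLD_MS = 105  # decoder threshold on the observed inter-keystroke gap

def embed(bits, natural_gaps_ms):
    # bug side: stretch the gap before each keystroke to encode one bit
    return [gap + (EXTRA_MS if bit else 0) for bit, gap in zip(bits, natural_gaps_ms)]

def extract(observed_gaps_ms):
    # eavesdropper side: any gap over the threshold is read as a 1
    return [1 if gap > THRESHOLD_MS else 0 for gap in observed_gaps_ms]

secret = [1, 0, 1, 1, 0, 0, 1, 0]
gaps = [random.uniform(80, 100) for _ in secret]  # plausible "natural" gaps, in ms
assert extract(embed(secret, gaps)) == secret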

Valuing YouTube, back of the envelope

As a result of discussions on related subjects with various friends recently, I've just done some rough calculations, with an astonishing (or not...) result: Google's purchase price is within reason.

Assumptions:
  • For simplicity, treat Google as a proxy for all of YouTube's customers, so YouTube sells all of its inventory to Google with a standard 4-ads-per-page setup (I'm only counting video viewing pages, although these are most of the pages on YouTube).
  • Each video is no more than 10MB in size (YouTube rules).
  • YouTube buys its bandwidth for an average of $0.07/GB (a somewhat plausible figure that I've heard bandied about for buying at the volumes that YouTube uses).
  • Ad clickthroughs occur, on average, once per thousand impressions (informed guess).
  • 100 000 000 video downloads per day (BBC).
  • Clickthroughs earn a dollar each (arbitrary guess, but I suspect that it's conservative).
  • 5% is a reasonable economic discounting rate (close to current bank rates and inflation).
Calculations:
  • 1000MB/10MB tells us 100 downloads per GB,
  • with 4 ad impressions per video download, this tells us 400 ad impressions per GB,
  • with a 1/1000 clickthrough rate, this tells us 0.4 clickthroughs per GB,
  • with $1 revenue per clickthrough, this tells us $0.40 revenue per GB, against a bandwidth cost of $0.07, for a GP of $0.33/GB. (That's a staggering 471% markup or, more usefully, an 82.5% gross margin.)
WOW!

So, what is this in dollars?
  • 100 000 000 downloads per day / 100 downloads per GB tells us 1 000 000 GB per day (a quadrillion bytes / day, or an average of 93Gb/s),
  • with a GP of $0.33 per GB, this tells us $330 000 GP per day (note that, in my simplified model, this means that Google was paying YouTube around $400 000 per day prior to the acquisition; no wonder they were interested in acquiring!)
  • This is about $120 450 000 per year,
  • which, when divided by 5%, gives us a net present value of $2.409 billion.
If Google insists upon a discount for investor risk of about a third, this puts the offer price at about $1.6 billion, which is almost exactly what happened.
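
For anyone who wants to poke at the guesses, here is the same back-of-the-envelope arithmetic as a small Python sketch (the variable names and print formatting are mine; change any assumption and re-run):

downloads_per_day = 100_000_000   # BBC figure
video_size_mb = 10                # YouTube's per-video limit
ads_per_page = 4
clickthrough_rate = 1 / 1000      # one click per thousand impressions
revenue_per_click = 1.00          # dollars, arbitrary guess
bandwidth_cost_per_gb = 0.07      # dollars
discount_rate = 0.05              # economic discounting rate
risk_discount = 1 / 3             # discount for investor risk

downloads_per_gb = 1000 / video_size_mb                                  # 100
gb_per_day = downloads_per_day / downloads_per_gb                        # 1,000,000 GB/day
revenue_per_gb = ads_per_page * clickthrough_rate * revenue_per_click    # $0.40
gp_per_gb = revenue_per_gb - bandwidth_cost_per_gb                       # $0.33
gp_per_year = gp_per_gb * gb_per_day * 365                               # ~$120m
npv = gp_per_year / discount_rate                                        # ~$2.4bn
offer = npv * (1 - risk_discount)                                        # ~$1.6bn

print(f"GP ${gp_per_gb:.2f}/GB, ${gp_per_gb * gb_per_day:,.0f}/day")
print(f"NPV ${npv / 1e9:.3f} billion, risk-discounted offer ${offer / 1e9:.2f} billion")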

I didn't make these numbers up to get this result; I simply made obvious guesses and happened to land within 3% of the actual sale price. Lest anyone assume I am claiming a particular expertise here, most of the numbers really are guesses and I don't expect that any of them are accurate to within 3%. Consequently the result was just a fluke, but the reality is somewhere nearby, and it at least suggests that Google's purchase price wasn't all that crazy.

About the assumptions:
  • YouTube was/is presumably selling some inventory to higher paying advertisers (than Google), typically when someone was mounting a huge campaign (as I write, Virgin appears to have bought almost all of the current inventory, at least for UK users). It is also likely that, in some cases, Google wasn't willing to buy, or was bidding lower so some other advertising network was able to place ads. It is even possible that some views attracted no pay-per-click advertising at all. I suspect that "Google as proxy" is actually pretty close to the mark.
  • The average size for videos will actually be smaller than 10MB, which means that these figures underestimate profits.
  • The bandwidth figure is very difficult to substantiate as (a) it's commercially sensitive and (b) they'll be buying bandwidth in many different ways and many different places from many different organisations. Fortunately, on these figures, YouTube remains [somewhat] profitable all the way up to a little over five times this estimate, so unless I'm a long way out, there's some real profit here.
  • The one-clickthrough-per-1000-impressions figure comes from an industry insider, and not just an offhand "oh, it's about 1000:1", but actual figures from a day's operations, which came out at just above that figure.
  • The $1/clickthrough stems from my own experience. I have no idea what the whole-of-market figures are (but as for the bandwidth figures, note that the margin is so huge that this figure can drop by more than 75% and still leave a profit).
  • The 5% discounting rate is right on the Bank of England's rate, marginally under the US Fed rate and marginally above the ECB rate. I'd say it's about right, but if you want to use the US Fed rate (5.25%), then this will take about 5% off the valuation.
  • The "discount for investor risk" is pulled out of the air; I really don't know what discount Google would seek, particularly given the need to placate the RIAA, but 1/3 seems like a reasonable starting point.

Thursday, November 09, 2006

Nostalgia...

I recently came across a handful of post-project-review materials for one of the projects that I worked on in Germany in 2001. Among them were a draft of a paper that I co-wrote and an internal presentation that I gave, called "Enfinity: The Good, the Bad and the Ugly", which reviewed our experience of using the Enfinity e-Commerce "platform" to implement a wine marketplace (xWine) for Italian winemakers and German retailers. The audience was the company's Frankfurt-based programmers/developers, who had had little/no contact with Enfinity and, with one exception, no involvement in the xWine project. Both the paper and the presentation were intended for internal distribution only, but as
  • none of the German company that I worked for (Integra GmbH), its French parent (Integra SA) or the customer (xWine) still exists,
  • none of this material is the implemented system itself and
  • Intershop's Enfinity is now at release 6 (we used release 2, which was their first Java release and was more or less a case study in Second System Syndrome) and therefore presumably no longer exhibits any of the described problems,
I feel comfortable with wider distribution five years on.

Much of the prose in the draft of the paper was written by my then-colleague Gregor Klinke, whose English is considerably better than my German. The draft is well and truly readable despite some of the, ah, improved grammar.

The slides that I prepared for the presentation were rather minimalist. One of the participants did work out that what I was using was a web-browser in fullscreen mode (showing no chrome at all) rather than, say, PowerPoint with the corporate template. It has been interesting to watch, over the last year or so, the growing backlash against using cluttered corporate PowerPoint templates, or in fact, using PowerPoint at all. I am no longer alone...

It is amazing (well, for me) to watch the video (379MB) five years on; I can't quite believe that my hair was so long at the time. It's sad that there wasn't a microphone closer to me (what was used was the built-in microphone on the camera, which was several metres away) and that the recording ended abruptly a couple of minutes before the Q&A ended, but it's adequate. If you, dear reader, do decide to watch it, then it's worth knowing that the only person in the room who did not have recent experience in implementing e-Commerce systems was my boss, Jürgen. He was also very much in favour of the use of Enfinity. He starts the presentation sitting on my left but, by virtue of taking a phone call during the presentation, ends up on my right (actually to the left side of the projection screen, from the camera's perspective). Naturally it was while he was out that I was explaining why his favourite part of Enfinity (a flowchart editor) was, in reality, not so helpful.

I also found a draft of some sections that I wrote for IPM, the Integra Project Methodology. Most of it is a repeat of the above, but there was also a little on some mis-steps in sizing the project team; here is an excerpt:

For the xWine project, Integra Germany assembled a team of twelve developers, two of them front end specialists, ten to handle the back end. Of the ten, fewer than half had substantial previous programming experience and only one had ever worked in a programming team of more than four people. In hindsight, this was always going to be a difficult proposition, but some of the more serious difficulties that this created could have been avoided with a more careful assessment of the team's makeup.

Three phases of the project are interesting here:

(Prelude: Six weeks into development, the development team was restructured, shrunk to ten people (seven of them backend developers, two of those seven were experienced developers) and all moved to a single city; a single room in fact.)

  1. During the first week after this move it was observed that the less experienced developers in the team were frequently having trouble completing the tasks that they had been assigned, and were spending a very large amount of time stuck on problems that, with help from someone more experienced, could be solved in minutes. A "get help fast" rule was introduced. This increased productivity for the less experienced developers but promptly made it hard for the people who were continually being asked for help to get any work done at all.

  2. A happy medium was sought in which people needing help were to spend at least some time trying to fix the problem before interrupting someone else. We operated in this mode for another five or six weeks. This was still frustrating and difficult for the more experienced developers in the group, who took to working odd hours, not attending meetings, etc. in order to gain some amount of uninterrupted time in which to get creative work done.

  3. At thirteen weeks, for contractual reasons, the development team shrank again to five (one front end, two experienced back end, one inexperienced back end and a technical-project-manager/developer) for the final four weeks of the project. During this period, output went up dramatically. The number of check-ins is a very coarse measure (and, to be fair, we were late in the project, so we were making lots of small changes rather than creating new files, and we were working longer than normal hours), but the number of Java file check-ins per developer per day went from about seven to about seventeen. The output of the team as a whole went up by about 25-30%.

The primary cause of this large shift appears to have been that the team's more experienced developers were more frequently able to work without interruption or distractions for longer periods of time.

Not apparent from the above is the impact that this situation had on the quality of the software that was developed. Several parts of the system required components to be implemented that were particularly complex, particularly abstract or both. These were typically beyond the reach of the team's less experienced developers, but required long periods of uninterrupted focus for the team's more experienced developers to produce. During the second phase listed above, work of this type was frequently delayed for weeks. This in turn led to large amounts of code being written that worked around the absence of these key components, which rendered the system as a whole larger, more complex, less consistent, less robust and less maintainable than it would otherwise have been.

The lesson that we have drawn from this experience is that, whilst it is tempting to bulk up a project team with less experienced developers, both from a revenue standpoint and for the opportunity for less experienced developers to learn a great deal, the results can be very damaging to the project itself. Certainly, having experienced developers outnumbered two or three to one by less experienced developers was excessive. We suspect that a ratio of about 1:1 is about right. This

  • allows the team's more experienced developers time to create the more complex parts of a solution,

  • still provides plenty of scope and support for less experienced developers to learn while remaining productive and

  • avoids the costs associated with paying experienced developers to handle even the simplest parts of the implementation of a system.

Wednesday, November 08, 2006

Links for 2006-11-09

Friday, November 03, 2006

Links for 2006-11-07

e^iπ

XKCD succinctly describes what was my initial response when I was first shown this remarkable piece of mathematics. Of course this was just after the lecturer had led us by the nose through a 20-page derivation from first principles; my next response was to assume that I had made some monumental error...
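
For the record, and with none of the 20 pages, the identity drops straight out of Euler's formula once that formula is taken on trust:

e^{i\theta} = \cos\theta + i\sin\theta
\;\Rightarrow\;
e^{i\pi} = \cos\pi + i\sin\pi = -1,
\quad\text{i.e.}\quad
e^{i\pi} + 1 = 0.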

Thursday, November 02, 2006

Test-driven programming, extrapolated

Many agile methodologies offer test-driven programming (write the unit tests first, then write the code), the premise being that [adequate] tests largely define what the code should achieve and that several things go better if what an artefact is supposed to achieve is [somewhat] defined before that artefact is created (I'm simplifying, obviously).
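
As a minimal sketch of that premise in Python (the function name and the toy pricing rule are invented purely for illustration, and pytest-style test discovery is assumed; this is just the shape of the workflow, not a prescription):

# test_pricing.py - the tests are written first and describe the behaviour we want
def test_bulk_discount_applies_at_a_dozen():
    assert price_with_discount(quantity=12, unit_price=10.0) == 108.0  # 10% off

def test_no_discount_below_a_dozen():
    assert price_with_discount(quantity=2, unit_price=10.0) == 20.0

# Only once the tests exist is the implementation written, to make them pass:
def price_with_discount(quantity, unit_price):
    total = quantity * unit_price
    return total * 0.9 if quantity >= 12 else total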

Well, what if we took this thinking a little further:
  • What we are to test should at some level be a consequence of how the user(s) will use the software. Perhaps we should write the user manual before we write the tests.
  • What the user manual documents will need to match what the user will experience in using the software; even though we're writing the user manual rather early, we'll still need some screen mockups, workflows, key use cases, etc. created first.
OK, so far this is not so counter-intuitive for a developer accustomed to working agile, except that I've moved the user-manual forward somewhat. What if we went one step further?
  • The user manual describes the user experience that the organisation will be aiming to sell / promote to customers / users. Perhaps before we describe the user experience that we are aiming to sell / promote, we should produce the marketing communication materials (press releases, brochures, ...) that we'll actually be using for that selling and promoting.
Crazy, you think? Amazon doesn't.
Once we have gone through the process of creating the press release, faq, mockups, and user manuals, it is amazing how much clearer it is what you are planning to build. We'll have a suite of documents that we can use to explain the new product to other teams within Amazon. We know at that point that the whole team has a shared vision on what product we are going to build.
Even this is not far from typical practice; the key features of a new release are typically agreed across all involved teams very early in the process. However, producing something as concrete as a press release and the supporting marcom material not only provides a crystal clear vision but also (a) allows scope creep to be stamped out fast and (b) forces important [customer-visible] items that come up during development to be stated where all can see them, on an "it'll be stamped out otherwise" basis.

I like it.

Dr Adams, or, How I Learned to Stop Worrying and Love Voting Machines

Scott Adams (of Dilbert fame) has a rant on voting machine fraud:
Now don’t get me wrong – there’s a 100% chance that the voting machines will get hacked and all future elections will be rigged. But that doesn’t mean we’ll get a worse government. It probably means that the choice of the next American president will be taken out of the hands of deep-pocket, autofellating, corporate shitbags and put it into the hands of some teenager in Finland. How is that not an improvement?
I'm not entirely certain that I agree, but...

(via Cardboard Spaceship)

snapshot.debian.net, historical .debs

A couple of years ago I had a need to get at some older .debs of something that had previously been uploaded to sid, had been superseded and was not included in any release; I therefore assumed that it had disappeared into the æther. At the time someone directed me to a site that kept all old versions of all builds of all Debian packages for pretty much exactly this purpose; I solved my immediate problem and promptly forgot about it.

Although this site lives in Debian's namespace, it does not appear to be too well advertised, so this is another of those "for when I next need to remember it" posts.

(Thanks to Sean Furey for reminding me this time around.)

Fingers and Holes in a Shaken Cornstarch Solution

Wild, particularly the bit at the end.

(Thanks to Colin McCormack for the link.)