
21 Jun 2012

Rails test coverage: sometimes 100% is just right

DHH, the éminence grise of the Ruby on Rails world, took a swipe at the test-first cult with his provocative article "Testing like the TSA", saying in effect that 100% test coverage is mad, bad, crazy behaviour, worthless, and an overall affront to good taste and a crime against humanity. [I paraphrase.] Since we enforce 100% code coverage at all points through our development process, I want to explain how this does not necessarily make us time-wasting, genital-fondling idiots, how the needs of our business drive our quality strategy, and how this pays off for us.

Quality Stamp

At Sage our customers demand and deserve the best we can deliver. We are very quality focused because we build accounting solutions in which getting the right answer matters a great deal: perhaps some customers don't care about quality, but ours demonstrably do. Perhaps in some cases time-to-market is much more important than reliability or maintainability: it is a business decision, and there is no one-size-fits-all answer. However, if you're building for the future and want to avoid years of functional paralysis and a costly rewrite, building an application on a solid quality foundation makes a lot of economic sense.

Write less code

The most effective way to maintain 100% test coverage is by writing less code. We refactor like crazy, and we refactor our tests just as much as our code. We don't repeat ourselves. We spend time creating the right abstractions and evolving them. Having 100% test coverage makes it much easier for us to do this: it is a virtuous cycle.

We've been doing Rails development at Sage for five years now, and we've learned a few lessons. Even if you're writing unit tests with 100% code coverage, you're doing it wrong if:

  • Generators are used to build untested code (e.g. using the default Rails scaffolds to build controllers and views)
  • Partials are the most sophisticated method of generating views, and they look like PHP or ASP
  • The tests are harder to understand than the code

green-refactor-red

What is the alternative? Well, if all of the controllers and views look pretty much the same, factor them out. The Rails generators create enormous amounts of crappy, unmaintainable boilerplate code – every bit as much as a Visual Studio wizard. On the other hand, if the controllers and views are each completely different and unique flowers, is it for a good reason or is the code just a mess? Chances are, if the code looks like a mess, so does the app.
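As a sketch of what "factor them out" can look like: one base class defines the repeated actions, and each controller becomes a single declaration. This is plain Ruby with hypothetical names (`ResourceController`, `NoteStore`), not our actual code – a Rails version would inherit from ApplicationController.

```ruby
# One base class that generates the standard actions for any model-like
# class, instead of repeating scaffold boilerplate per controller.
class ResourceController
  def self.resource(model)
    define_method(:index) { model.all }
    define_method(:show)  { |id| model.find(id) }
  end
end

# A stand-in for a model, just to make the sketch runnable.
class NoteStore
  RECORDS = { 1 => "first note", 2 => "second note" }
  def self.all;      RECORDS.values;    end
  def self.find(id); RECORDS.fetch(id); end
end

# A new controller is now one declaration.
class NotesController < ResourceController
  resource NoteStore
end
```

With the shared behaviour factored out, one set of tests covers every controller built this way, which is a big part of how full coverage stays cheap.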

In my experience it's also basically useless to attempt to retrofit unit test coverage onto a project that doesn't have it: the tests that get written after the fact are written to pass, and they rarely help much. I haven't yet seen a project that could be rescued from this situation.

Whom do you trust?

When DHH says that the use of ActiveRecord associations, validations, and scopes (basic Rails infrastructure) shouldn't be tested, he's claiming that Rails is never wrong: not now, not in the future, not ever. It's his choice to make that promise, but it would be irresponsible of us to believe it:

  • Rails changes all of the time. Sometimes there are even bugs! (Crazy talk, I know!) But ActiveRecord associations and scopes are complex and ornery, and can easily be broken indirectly (through a change elsewhere in the code).
  • Because we operate on the Internet, new security risks and fixes appear constantly: zero day attacks are real. We need to react to these threats quickly, and being able to prepare and deploy new versions of our apps based on updated components immediately is crucial. Having a robust test suite makes it much cheaper and less stressful to implement these changes, which drives down technical debt and makes development more responsive, and oh yeah, helps prevent a costly rewrite.
  • We use components that extend and complement the behaviour of Rails. DHH calls out the example of testing validations as particularly useless. Well, what about when the validation methods change in a Rails upgrade? Or you want to adopt a new plugin that changes core Rails behaviour? Or you want to refactor an application to move validation to a more useful place? In all of those cases the tests on validation code would be useful.

Often this means a function in a spec mirroring a function in a model (but with enough difference in naming and syntax to be truly maddening). Yes, this feels stupid sometimes, but it is a very cheap insurance policy, and sometimes it pays off.
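The mirroring pattern can be sketched with a plain Ruby object and Minitest (bundled with Ruby) standing in for an ActiveRecord model and its spec. `Invoice` is a hypothetical model; in Rails the check below would be `validates :total, presence: true`.

```ruby
require "minitest/autorun"

# Hypothetical model with a presence-style validation.
class Invoice
  attr_accessor :total

  def valid?
    !total.nil?
  end
end

# The test restates the validation almost line for line. It feels
# redundant today, but it fails loudly the day an upgrade or plugin
# quietly changes how the validation behaves.
class InvoiceTest < Minitest::Test
  def test_invalid_without_total
    refute Invoice.new.valid?
  end

  def test_valid_with_total
    invoice = Invoice.new
    invoice.total = 100
    assert invoice.valid?
  end
end
```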

Time split

Coffee mug reading 'I ♥ Spreadsheets'

DHH says that you shouldn't be spending more than 1/3 of your time writing tests. This leads to a question: how are you characterizing your time? Is the person doing the implementation also the person making design decisions? If you are doing behaviour-driven development you are actually vetting the requirements at the time you write the tests, so is it a good idea to skip that part and move on to the coding? If you spend time refactoring tests to speed up the test process, should that be counted? Should the time spent writing tests before fixing bugs be counted? Have you decided to outsource quality to a bunch of manual testers? What is your deployment model? I'm reluctant to put a cap on the time writing tests. I find this metric as useful as dictating the time spent typing vs. reading, or the amount of time thinking vs. talking: my answer is not yours, and the end result is what matters.

Risk assessment

We enforce 100% test coverage because it ensures that no important line of code goes completely untested. One can decide to write tests for "important" code and ignore the "unimportant" code, but unfortunately a line of code only becomes "important" after it has failed and caused a major outage and data loss. Oops!
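Wiring up that enforcement is cheap. The post doesn't name a tool, but as one common Ruby option, SimpleCov can fail the build whenever coverage dips below the threshold:

```ruby
# spec/spec_helper.rb -- a sketch assuming SimpleCov (our actual tooling
# isn't named in the post). Must be required before the application code.
require "simplecov"

SimpleCov.start "rails" do
  # Fail the test run when total coverage drops below 100%.
  minimum_coverage 100
end
```

With a gate like this, an untested line is a broken build, not a judgment call.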

Road sign: reality check ahead

DHH avers that the criticality and likelihood of a mistake should be considered before deciding to write a test about something. However, this ignores the third criterion: cost. Is it cheaper to spend time deciding the criticality and likelihood of writing vs. ignoring tests for every single line of code, or is it cheaper to just write the stupid test and be done with it? Given the cost of doing a detailed long-term risk analysis on every line of code, does anybody ever really do it, or is the entire argument just an elaborate cop-out? The answer gets a lot clearer once you elect to write a lot less code, and it gets easier once you resign yourself to learning a new skill and changing your behaviour.

Closing

Code coverage is a great way to measure the amount of exposure you have to future changes, and depending on your business, it might be necessary to have 100% coverage. A highly respected figure speaking ex cathedra can be very wrong when it comes to the choices you need to make, and sometimes it shows. 100% code coverage may seem like an impossible goal, especially if you've never seen it done. I'm here to tell you it's not impossible: it's how we work, and in our case it makes a lot of sense.

17 Apr 2012

Homogeneous web development: Meteor, Derby, Firebase and the portents of doom


A variety of new web frameworks are being cooked up that allow you to write one set of seamless code for the client and server.  It's a problem that has haunted the web development community since the dawn of JavaScript and the DOM.  One approach is to basically define the database operations on the client.  Does that sound like a good idea, or does that sound like a great idea?

Meteor exposes the MongoDB API directly on the client to work on automatically-synced data subsets. What could possibly go wrong?  Let's name the project after a flaming ball of rock and find out for sure!

Is client-side MVC too confusing? Is Node.js too immature? Let's combine them and see what happens!  (It remains to be seen whether Derby is named after a hipster hat or a county fair event.)

"We have a full security system in the works that will allow you to control read and write access on individual locations in Firebase on a per-user basis. However, it’s not ready for widespread use yet, so right now all data in Firebase is publicly accessible. Please keep this in mind when building apps! Please contact us if you need security or want to be one of the first to try out the new system." *

Despite my scornful tone, I'm actually very optimistic about these technologies and very hopeful that at least one of them will ultimately be successful.  I'm also really happy that I'm not going to be the first person trying to build an application on this stuff. Given the theme of the project names, it's fair to say that most early adopters will get burned.


* Yes, that's a direct quote.

8 Nov 2011

AGPL revisited: how MongoDB licensing differs from MySQL

Now that the Affero General Public License (AGPL3) is actually being used by successful projects, I'm looking at it again. Specifically, MongoDB is AGPL3 licensed, and it is being used for commercial applications. But how?!? I thought the AGPL was complete communism, and that's what excited me so much about it - one touch of the brush, and the whole batch of milk is stained vermillion, and your entire enterprise now belongs to Richard Stallman so he can use it to fund GNU HURD.

The AGPL actually has some pretty fixed boundaries:
A compilation of a covered work with other separate and independent works, which are not by their nature extensions of the covered work, and which are not combined with it such as to form a larger program, in or on a volume of a storage or distribution medium, is called an "aggregate" if the compilation and its resulting copyright are not used to limit the access or legal rights of the compilation's users beyond what the individual works permit. Inclusion of a covered work in an aggregate does not cause this License to apply to the other parts of the aggregate.
Upon reflection, the AGPL isn't as restrictive as I once thought. Let's take what I consider to be the most successful GPL (v2) product: MySQL*, and consider what would have happened if it had been released under AGPL instead. Since Amazon used MySQL code to build RDS, under the AGPL Amazon would be forced to release the code they use to provide the RDS service. They would not be forced to release the code for Amazon.com** however: that would clearly be outside the boundaries set out in AGPL.

Also consider that Facebook uses MySQL internally, with something like 4000 MySQL databases to power much of their site, and they've made many changes to MySQL in order to make that possible, some of which they've made public. If MySQL had been AGPL-licensed, they would have been required to make those changes publicly available under the same license.

Google is also reportedly one of the largest users of MySQL, and in a similar spirit they have released some of their tools. However, they released these tools under the more permissive Apache 2.0 license: if MySQL had been released under AGPL3, Google would most likely have been forced to release these tools under AGPL3 as well.*** And now that Google is offering Google Cloud SQL built on GPL-licensed MySQL, they don't have to share their work as they would if MySQL were AGPL3-licensed.

All of this to say: if you want to use MongoDB to power a web app, have fun: the boundaries within the AGPL3 are there to help you, and probably won't require you to hand over your code to every visitor. However, if you see MongoDB and think "hey, that's cool, I'm going to offer a web service with the MongoDB API and become a cloud provider of NoSQL data storage, just like Amazon SimpleDB" then you will have made a derivative work, and you'll have to share those changes with the world under AGPL3.

Finally, IANAL, not in any jurisdiction, and if you base your legal strategy on lay analyses found on personal blogs, then sadly you're not alone and you're in very risky company. Best of luck, however, in finding a copyright attorney who will dig through these issues for you and give you an opinion for less than $500k.



* The Linux kernel is more widely used than MySQL, but it's so mixed up with other licences that it can't just be GPL anymore, not honestly - and the copyrights are owned by so many different people that nobody can claim ownership. MySQL, on the other hand, was always extremely diligent about maintaining ownership of every line of code they include in their distribution (which made acquisition by Sun and Oracle all the more attractive).
** ... that is, provided Amazon.com was built using MySQL, which it isn't AFAIK.
*** They could still licence their code any other way they want, as they own it, but they'd be required to license it under AGPL3.

26 Nov 2010

OSX as a Ruby on Rails dev environment: Package Managers

A lot of Rails developers like to use OSX as their development platform. Although everybody hosts Rails apps on Linux (or Solaris under duress) lots of people love OSX for its productivity, clean interface, and most importantly, its typography.

However, as some have noted, setting up Rails on a Mac is hardly a frictionless process. Unlike Linux distros, OSX has no built-in package manager; you get your version of OSX and you get your patches, and you'd better like what you get, because every app is going to be updated when Apple or the vendor feels like updating it. This is the same as the Windows world, and it's ugly.

So a couple of efforts have stepped in to fill this void: MacPorts and Homebrew. Neither of these is going to feel like a complete solution if you're used to a package manager like APT or YUM, but they do at least automate the installation process for various open source packages. After all, when you want wget there's no reason you should have to find the website.

I'll start with MacPorts since that came first. MacPorts was inspired by BSD Ports; it is built in Tcl and C and contains a very complete set of available packages. It is quite popular and is the venerable incumbent. And personally, I hate it. I've had my OSX install ruined twice while using MacPorts, just by installing system updates; although I obviously did something wrong, it just isn't a robust solution. If MacPorts is the solution, I don't want to hear the question.

Another alternative is Homebrew, a new Ruby-based system developed on GitHub. It has been around for less than two years, and it's a very active project with a lot of contributors. It stresses extensibility, and lots of recipes have been written to support various packages - predictably, those most popular with Rails developers. Although I don't think it solves the brittleness problem MacPorts suffers from (it doesn't address operating system component and library version dependency issues), it is very actively developed, focused on the Rails world, and easily customizable to meet individual needs.

So, although you're probably not going to get set up with a Rails development environment with OSX as quickly as you would on Ubuntu (despite Ruby being included in Xcode), there are good solutions to keep you from pulling your hair all the way out. Which will bring you to the point where you can enjoy and appreciate the kerning on the fonts in TextMate as you write your Rails code.

7 Sept 2010

Whether the test or the code is more important depends solely on who has to write the tests

In the world of open source development, automated tests are like gold. They're the glue that makes it easy to maintain projects with hundreds of collaborators. When they don't exist, code dies, and nobody knows about it - that would be a bad thing, so preventing it is job one. Unless of course, it means you actually have to write tests for your code, in which case it's delegated just as far down the food chain as possible. And nothing's further down the food chain than a paying customer who's already paid you.

Let's say you paid the author/owner of an open source project to add support for something you need. Let's just say that it's something you need, but that would be useful to him/her as well as others. And let's just say that s/he puts that code in his/her distribution. And you pay him/her for his/her time and effort. All is good.

Then, about six months later, you discover a bug in his/her library, in the very code that you paid him/her to write. You fix the code, and that's a good thing, because you really need it to work. It really should have worked in the first place, but oh well. Shit happens, right?

So then let's say that the code is all hosted on Github [because this hypothetical case happened in 2009/2010 and anything worthwhile is being hosted on Github], that you branched the main project, made your change, and committed it. Then you send a pull request to the project maintainer explaining the situation. Beautiful, this is exactly how open source is supposed to work. Git is wonderful, Github is fantastic, and everything just works because of it.

And you get the answer back from a minion of the author that you paid: "well, no, we can't accept this change, because you see, there is no test that was broken in the first place, and no new test has been written to prove that this change is a good one. So go write a test and then we'll think about it."

Which strikes you as a bit odd, because hypothetically, if s/he wrote the code in the first place, and s/he/they is/are such [a] holy motherfucking test-first code ninja[s], s/he would never write a line of code without tests for it. Except the evidence is in the code that never worked in the first place (despite your having, er, paid for it).

So now you have to maintain your own branch of this stuff in perpetuity, because they have rules, you see, and standards, and these rules and standards say that they won't accept changes that don't fix tests. Oh, and by not accepting your fix they're actually hurting your reputation, because the fact that the stuff they wrote for you doesn't work with their library might look like it's your fault, not theirs. But you paid them, and it's all over now. Unless you want to try to write a test for them, which they'll consider accepting.

What you might expect from the maintainer would be an apology, a gracious acceptance of the fix, and for him/her to write the test s/he should have written in the first place (if that's what makes him/her so goddamn happy).

Purely hypothetically speaking. I mean, it would be totally inappropriate to name names if this actually happened.

11 Jul 2010

Adobe Flash: Just because Steve Jobs says it's bad doesn't mean it's good

Steve Jobs' Thoughts on Flash memo was controversial to say the least. Yes, he was hypocritical and self-serving (as usual), but he certainly wasn't wrong.

Adobe wants everyone to treat Flash as if it is an open standard, but they haven't made it open source. They made some parts of it open source, but not the parts that matter - and as a result, developers are constantly left wondering which platforms are going to work.

@cyanogen on Twitter: Also, Flash is not going to run on your G1/Magic. At least not the official Adobe version. Ever.
@cyanogen on Twitter: Flash doesn't work because it uses a native (non-portable) library which uses ARMv7 instructions. It can't run on older processors.
As a friend said, "Apple seems just as evil as Microsoft, just not as successful. And Jobs seems even more evil than Bill Gates. Certainly a bigger bastard." I totally question Steve Jobs' motives in wanting to crush Flash, but I don't think Adobe deserves a great deal of sympathy.

18 Jan 2010

Rogers tells HTC Dream users to turn off GPS or 911 calls won't go through

On January 15 I received an SMS message from Rogers telling me I'd better disable GPS on my phone or I wouldn't be able to make 911 calls. This is the latest chapter in the unhappy saga of the HTC Dream on Rogers.
Rogers/Fido service message: URGENT 911 Calls: Please disable GPS location on your HTC Dream device to ensure all 911 calls complete. HTC is urgently working on a software upgrade and we will provide details shortly so you can re-enable GPS.

Instructions: Select Menu - Select Settings - Select Location - Uncheck Enable GPS Satellite

First Rogers announces that they're not providing any more upgrades to the software on this platform. Then they announce that they'll upgrade Dream users to the HTC Magic for free (well, with a contract extension). Then the damn thing just doesn't work. Ah, the joys of early adoption...

I just want an Android device with a keyboard. Is that too much to ask?

13 Jun 2009

Vancouver's Open Data, Open Standards, Open Source and the Vancouver Public Library

Vancouver has adopted a policy of Open Data, Open Standards, Open Source and I'm really excited about it. David Ascher presented on the topic at Open Web Vancouver 2009 and pointed out that if we don't engage the city and use this data it will go nowhere.

The Vancouver Public Library is one of my favourite places. I love libraries, I love books, but the library here in Vancouver is a really special library for me. So I've been thinking of ways that the library could share data so that I could build applications to make the library more interesting and more valuable to the people of the city.

Here's some data I'd like to have:
  • Books on order

    I'd like to know what new books are currently on order, but not available. I want a preview of coming attractions.

  • Most unpopular books

    What doesn't get checked out? What's likely to get sold in the next round of disposal, ahem, book sale?

  • Most popular books

    What's everybody reading?

  • Top 100 sites for library patrons

    What are the most popular sites browsed from the library? I'd like to be able to contrast this with the most popular sites according to Alexa. That should help tell the library what sorts of services patrons need.

These are things that I could mash up into interesting applications, such as presenting a unified view of new popular books on Amazon and which ones are in the library, or popular in the local community.

2 Jun 2008

Tools by tools no longer cool

For a while there I thought that Microsoft was going to take everybody down with Visual Studio Team System. They'd take their superior IDE and debugging environment, add testing and fix their crappy version control system, and they'd own the world. "Nobody else will be able to deliver everything in one package," I thought. "They'll undercut everybody else until they own the landscape, and then they'll milk us like the clueless cows we are."

I even chose Perforce for a version control system. I looked at CVS and decided it was crap; Subversion was still not there, and everything else was just not good enough. "Microsoft uses Perforce," I thought, "and how wrong could they be?" (At that point I was still in fear and awe of Microsoft. Hell, I even thought Longhorn was going to rule the world.)

How different the world is suddenly. Yes, Microsoft has a beautiful IDE that permits you to smoothly debug Windows software. But who can afford to run web software on Windows? It is simply murder on a business model. And desktop software on Vista? Yeah, right. As a result, Team System is terribly quaint all of a sudden. Trac, Subversion (or Git if you're really cool), and Basecamp are really all you need for web development, so why would you bother administering a SQL Server database and a domain controller and an Exchange server and a project server and a Team System server, and buying CALs for all of the above along with the hardware to run it -- all for tens of thousands of dollars? And if you want to do truly distributed development between a core team, external contractors, or even (gasp) a wide community, Team System won't even do it. And there's the rub: that's the way software is built today.

Yesterday I saw an ad for Perforce: they're giving away a 2-user version, "No questions asked." Whoop-tee-doo, who cares. They can't even give that away. Microsoft versus Borland versus IBM was like a tyrannosaurus fighting a triceratops and a pterodactyl. It just doesn't matter.

22 Apr 2008

Affero GPL: fear and loathing, or ignorance and indifference?

A friend of mine tried to set up a project on SourceForge using the Affero GPL. Six months later they got back to him and added the AGPL to their (loooong) list of supported licenses. To date, Google Code doesn't support the AGPL. Granted, Google may not be particularly interested in the AGPL from a business standpoint (they want to make money off hosting and one-to-many-many-many services). SourceForge seems to have gotten over its queasiness. Does Google have something against the AGPL, or do they just not think about it?

18 Apr 2008

AIR delays the inevitable

I've made a bit of peace with AIR since my post earlier. The release notes actually do mention how to install it, and twhirl does work (pretty much, anyhow).

The ridiculousness of blowing away OOo does underscore that folks at Adobe don't work on Linux – doubtless they only really work on Mac and Windows. The fact that they never noticed they had murdered OOo proves it. They don't really understand free software (big surprise, they make their money selling Photoshop discs) and their AIR platform is a roach motel built to extend the life of their Flash development suite. Yes, certain segments are open source: big whoop, as long as the desktop runtime is closed, they can maintain their control. It's all about control. So no matter how shiny and quick and pretty they make AIR, it remains another boring proprietary trap.

Are people really going to fall for it again? Maybe, but they won't do it for long. Competing platforms will race AIR to the bottom, giving away as much as possible to gain developer mind share. Even Microsoft will eventually give away everything on Silverlight in a desperate bid to regain some sort of relevance in the world of developers under the age of 40. Gnash is catching up with Flash. Can Adobe keep running? Yes, but the question is how far.

So eventually AIR won't slag your Linux machine when you install it, and Adobe will have its cross-platform runtime dream. It'll even work, pretty much (those operating system-specific chrome APIs will be the death of them, I swear). But there is no long-term reason the development community will want to paint itself into this corner: there are just too many attractive options. They're going to have to bow to free software sooner or later, and by delaying the inevitable they're missing an opportunity to gather early, unanimous support.

14 Apr 2008

Squeezebox mash note

A couple of months ago I ordered a new home music player, the Squeezebox Duet. I was a little reluctant because it wasn't cheap (nearly $500 by the time you include shipping, import duties, etc) and it requires server software, but I liked the look of it and I liked the fact that the server software was open source. I didn't really look too deeply: I just thought "shiny!" and dug out my credit card. Besides, my bro-in-law loves his old Squeezebox, so I figured it'd be good.

The real deal in our living room.

But zOMG, I was totally unprepared for what I got: more than meets the eye. This thing is incredible – a real example of what a great product a company can build when it totally understands how to use open source software to build a platform and a committed community of users and developers. Everything about it is cool. Yes, I knew I was essentially buying an iPod-style remote control for my stereo which would hook to my network and suck all of my music off my computer (actually, my NAS). But it really is a platform.

First, the hardware is very slick. It is well designed, feels nice, looks very chic, and has tons of room to grow. They built in so many extras that the platform can grow for a long time just catching up with the hardware: 3-axis accelerometer, USB adapter and SD flash slot on the remote, WiFi all over the place... simply spectacular. You could build a bunch of cool things on this platform.

Second, the software is quite capable. It was easy to install (on Ubuntu, I just had to add their repository and the package). Unfortunately, the latest version of the server software, SqueezeCenter, has not yet been adapted to run on the NSLU2 I purchased for the express purpose, but it is doing okay on my four-year-old laptop. Besides, I might be able to put it directly on my NAS200 soon.

Screenshot of Jive, the remote software, running on my Ubuntu desktop.

Third, they're working quite actively on the software, and they walk the talk when it comes to full disclosure: the whole stack is open. They not only release all of their code, they give you a real toolchain and support in making changes. Unlike other devices which manufacturers churn out and abandon, they've recently released an update to the remote control firmware that addresses issues with seeking through music lists, making it much more friendly and responsive. The remote control software also builds and runs on desktop Linux, OSX, and even ghetto Windows (a true tour de force), which makes it easy to build and debug your add-in modules (in Lua: how very nerd chic) or your own custom firmware. It also just gives you a nice remote control to run on your laptop. The open source firmware makes it possible for a community of hackers to come up with endless cool applications for all of that tasty overengineering that went into the remote.

But here's the clincher: I had an issue with my album images not showing up. I've been compulsively tagging my music collection and applying album images to make the lists look pretty, but the images weren't showing up properly – even worse, the server was sending corrupt images to the browser, breaking the page layout and making it all look weird. So I started looking at it, and reported the problem on their bug-tracking system. One of their engineers fixed it fourteen hours later; I grabbed the in-development build the next day and my issue was fixed. Twenty-four-hour turnaround.

That is not only great service, it helps them build and take advantage of a community of expert users. Unlike some companies which persecute people who fix their problems, at least one part of Logitech has it figured out.

13 Apr 2008

Speech synthesis on Ubuntu

Text-to-speech (TTS) has been around for a couple of decades, and it keeps getting better. There are a bunch of really fun untapped applications for it, combining RSS, filters (like Pipes), podcasts, telephony, and hidden speakers.

Under Linux there is a nice package available called Festival. To get started, grab an appropriate package, such as:
  • festvox-hi-nsk (Hindi male)
  • festvox-kallpc16k (American English male)
  • festvox-rablpc16k (British English male)
  • festvox-mr-nsk (Marathi male)
  • festvox-suopuhe-lj (Finnish female)
  • festvox-suopuhe-mv (Finnish male)
  • festvox-te-nsk (Telugu male)
Too bad you can't get a female speaker except in Finnish. (I had never heard of the Indian languages Marathi and Telugu, and I consider myself a language buff... sigh.)

The results are pretty good. Here's how to use it from the command line:
text2wave text-file.txt -o audio-file.wav

For extra fun, use pidgin-festival to turn incoming instant messages into speech (use festival-gaim if you haven't made the jump to Hardy Heron yet).
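The RSS-plus-TTS combination mentioned above can be sketched in a few lines of Ruby: pull headlines out of a feed with the stdlib `rss` parser and hand the text to Festival's text2wave. The feed content and file names here are placeholders, and actually producing audio assumes Festival is installed.

```ruby
require "rss"
require "shellwords"

# Extract item titles from an RSS 2.0 document (validation off).
def headlines(rss_xml)
  RSS::Parser.parse(rss_xml, false).items.map { |item| item.title.to_s }
end

# Build the text2wave invocation (same command as above, shell-escaped).
def tts_command(text_path, wav_path)
  "text2wave #{Shellwords.escape(text_path)} -o #{Shellwords.escape(wav_path)}"
end

# Placeholder feed, standing in for a real fetched document.
feed = '<?xml version="1.0"?><rss version="2.0"><channel>' \
       '<title>News</title><link>http://example.org</link><description>demo</description>' \
       '<item><title>Hello, speech synthesis</title></item></channel></rss>'

puts headlines(feed).join(". ")
puts tts_command("news.txt", "news.wav")
```

Run the printed command after writing the headlines to news.txt, and the morning's feed becomes a WAV file ready for a podcast or a hidden speaker.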

20 Feb 2008

Why is this man smiling?

Richard M. Stallman is smiling because he is winning. His vision for free software is taking serious hold. It is possible to live a productive digital life using only free software, and I pretty much do (damned "free" Flash player).

When he wrote that the prospect of charging money for software was "a crime against humanity", at first I thought it was just baseless, humorous hyperbole, but upon reflection I came up with a different perspective on it. Software allows for automation. Human effort is expended to create software that solves a particular problem, and the human is compensated for that. But should the method for achieving that result become "property"? And whose "property" will it be? Almost certainly, that "property" will belong to a corporation. And corporations are not human: they are legal "people" (emancipated by the U.S. Supreme Court), but they are immortal, amoral, and their true role in our society is incompletely understood. Without the mechanisms provided through free software, and because of the structure of copyright (dictated by corporations), that "property" will almost certainly not be made available to anyone but other corporations able to pay for it. Being immortal, corporations enjoy many advantages humans do not, and humans cannot reasonably compete with them in certain areas.

Stallman is a clever guy. He figured out how to do a jiu-jitsu move on copyright to give people an economic model to put software in the hands of other people and organizations (yes, even other corporations). He invented copyleft, a way of enforcing the continued sharing of software, and did it in such a way that corporations couldn't cut it down without sawing off their own arms: copyleft is based on copyright. Without his invention, there would be no practical alternatives to corporate tools made by corporate tools.

11 Feb 2008

Superior software

In preparation for the imminent arrival of a new toy, I set upon installing open source firmware for our router. I bought the router, a Linksys WRT54GS, specifically because it is "hackable": it runs Linux and has a nice selection of software adapted for it. I was a little anxious, knowing that I'd have to quickly configure it to get the internet back online, and hoping that I wouldn't screw it up and have to run to the store to buy a different router.

I crossed my fingers, did a rosary, held my breath, and installed the firmware. It worked, and not only did it work: it maintained all of my previous settings and worked exactly as it had before, but now with a huge number of extra features and configuration options. What a nice surprise: in the past, installing firmware always trashed my configuration – even when it came from the manufacturer. The extra features are really cool, and the configuration tools are much better.

Okay, so I like open source, big deal. What really surprised me here was that open source produced a better product than the manufacturer supplies. Hardware manufacturers haven't really embraced open source software yet, except as invisible utilities hidden behind a sealed plastic case (source code redistributed only begrudgingly). Yet the open source firmware for routers (such as X-Wrt) and mp3 players (Rockbox) is arguably much better than the stuff the manufacturers put on the devices in the first place. In fact, software for devices like residential routers is deliberately crippled to protect the lucrative, inflated business device market.

One of these days a hardware brand, someone like VTech or D-Link, is going to decide to take full advantage of these tools and contribute to their development. They'll save a lot of costs, sure, but more importantly they'll sell to people who want to do whatever they please with their devices.

29 Oct 2007

BlackBerry support in Ubuntu Gutsy

To charge or backup your BlackBerry device under Ubuntu Gutsy Gibbon, there is a nice little GPL package named barry put together by some nice folks in Ontario. It isn't part of the Ubuntu software catalog yet, so here are some steps to get this up and running. (Note: the authors give instructions on how to build it from source – take your pick.)

First, you'll need to install an updated version of libopensync0 (0.22). Add the following lines to your /etc/apt/sources.list:

#opensync
deb http://opensync.gforge.punktart.de/repo/opensync-0.21/ etch main
deb-src http://opensync.gforge.punktart.de/repo/opensync-0.21/ etch main
So, go get that along with some other prerequisites:

sudo apt-get update
sudo apt-get install libopensync0 libglademm-2.4-1c2a libtar
Then, download and install barry.

wget "http://downloads.sourceforge.net/barry/libbarry_0.9-1_i386.deb?modtime=1192146928&big_mirror=0"
wget "http://downloads.sourceforge.net/barry/barry-util_0.9-1_i386.deb?modtime=1192146873&big_mirror=0"
wget "http://downloads.sourceforge.net/barry/barrybackup-gui_0.9-1_i386.deb?modtime=1192146747&big_mirror=0"
wget "http://downloads.sourceforge.net/barry/libbarry-dev_0.9-1_i386.deb?modtime=1192146953&big_mirror=0"
wget "http://downloads.sourceforge.net/barry/libopensync-plugin-barry_0.9-1_i386.deb?modtime=1192147004&big_mirror=0"
sudo dpkg -i libbarry_0.9-1_i386.deb
sudo dpkg -i barry-util_0.9-1_i386.deb
sudo dpkg -i barrybackup-gui_0.9-1_i386.deb
sudo dpkg -i libbarry-dev_0.9-1_i386.deb
sudo dpkg -i libopensync-plugin-barry_0.9-1_i386.deb

Note: you probably don't need the libbarry-dev package unless you plan to develop against the library, but I installed it anyhow; it won't hurt anything. The libopensync-plugin-barry package should enable some sort of integration with your local copy of Evolution or Thunderbird, but I haven't tried it.

After all that, you can plug your BlackBerry into the USB port and it will finally charge. These packages don't put any entries in the Gnome menu system, but you can run "barrybackup", which will let you back up all of your data (and restore it, if you want).
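If things don't seem to work, a quick sanity check is to confirm the tools actually landed on your PATH (a small sketch: btool ships in barry-util and barrybackup in barrybackup-gui, per the packages above; the /tmp/barry-check.txt scratch file is my own invention):

```shell
#!/bin/sh
# Verify the barry command-line and GUI tools are installed,
# logging the results to a scratch file as we go.
for tool in btool barrybackup; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "$tool: found"
    else
        echo "$tool: missing (re-run the dpkg commands above)"
    fi
done | tee /tmp/barry-check.txt
```

If both report "found", plug in the device and fire up barrybackup.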

24 Sept 2007

Defending his turf

Slashdot referenced an article by Derek Sivers today: "7 reasons I switched back to PHP after 2 years on Rails". CmdrTaco managed to turn a case study on a failed project into an opportunity to badmouth Rails.

I didn't see any condemnation of Rails in the article (quite the contrary, in fact). It was more of a case study for Joel Spolsky's classic position: don't rewrite from scratch. As Joel puts it:
They did it by making the single worst strategic mistake that any software company can make:

They decided to rewrite the code from scratch.
Instead, Joel advocates doing what Sivers wound up doing: refactor and fix the existing code.

But of course that wouldn't fit CmdrTaco's corporate agenda: to promote LAMP over a growing upstart, as PHP loses market share month after month to the not-very-comparable ASP.Net (mostly encumbered) and the comparable Ruby on Rails (also free). So he passed along the grossly misrepresentative summary of the article and came up with a similarly misleading title. I guess I expected more from CmdrTaco (a.k.a. Rob Malda): he was once an insurgent in the revolution, but he's over thirty now, and therefore not to be trusted.

23 May 2007

Sun gets it

Sun CEO Jonathan Schwartz gets it: software patents are bad juju, and he's promising to use his patent portfolio to protect Red Hat and Ubuntu from the depredations of the Beast. Here's a guy that understands the business, and although his company has seen some hard times since the party ended back at the beginning of the century, Sun is taking risks and trying hard to ride the free software commodification wave. The earlier you bend to the inevitable the more likely you are to survive. Sun is showing admirable flexibility; sometimes a sharp downturn gives you license to take bold steps that a gradual decline does not.

16 May 2007

Microsoft: Patent Troll

Desperation is never sexy. It is sad (if unsurprising) to see that Microsoft has stooped to becoming a lowly patent troll. They've been moving in this direction for a while: starting their patent portfolio as a purely defensive measure, then trying to intimidate the Samba project, then financing SCO to be their stooge in a battle against GNU/Linux. Until now, though, they had only used their patent portfolio as a FUD tool. Now they've turned a corner and decided to monetize their paperwork, shaking down big scaredycat organizations. Then they decided to subvert Novell. Now they're setting the stage to take the campaign wider and try to scare individuals.

It's a sad state of affairs for the once-proud company that released great products like Windows for Workgroups and Excel. No, Microsoft was never really innovative, but they were at least competently derivative: at one time they could take somebody else's concept and improve upon it. Remember how they took on Netware and destroyed it (with a better product)? Today all they seem capable of doing is screwing up their products and taking out their frustrations on their customers.

Microsoft is apparently taking theatrical cues from Joseph McCarthy. "I have a list here of 235 patents the communists are infringing." Of course they won't say what they are, because if they do, those patents will be immediately challenged with prior art and worked around by the open source community. Plus, their competitors will unleash hell with their own patent portfolios.

The funny thing is that if Microsoft still had decent prospects it wouldn't dare resort to this sort of two-bit shakedown operation. Microsoft made an empire by appropriating the good ideas of others and incorporating them into its products (then using cross-subsidy and vendor lock-in to exterminate the competition, the tools of a convicted monopolist). By doing so they added value and met the needs of the end user. But they seem incapable of doing that anymore, so now they resort to intimidation. It's a sad end to a once-proud company.

1 May 2007

Microsoft follows Adobe to open source

Microsoft has (not quite yet) announced that it will release source for Silverlight, following Adobe's recent move with Flex. Since Microsoft only plans to support Windows and Mac for the runtime, Adobe has a slight advantage with the huge, lucrative Linux market [har]. Adobe's big advantage is that they are two years ahead and are not Microsoft. Microsoft's big advantage is [er, wait, give me a minute, I'm sure I'll think of something... oh!] they 0wn your Windows box and can nuke the Flash player from orbit with Windows Update [diabolical laughter here]. They can't really do that, though... it would piss off their customers, and I guess they care (though they have a hard time showing it).