Continuous Integration


Simply ship. Every time.

It’s been a hard day’s night

02/23/2007

The impending 2.0.0.2/1.5.0.10 release has been heretofore painful in a way reminiscent of the 2.0.0.1/1.5.0.9 release, but interestingly enough, it’s been so for completely different reasons.
I’m currently at the office, signing bits and pushing them out to the mirror farm.
It’s been an interesting release from a build/release standpoint: I fried my brain doing the first half of the release; rhelmer graciously saved my (and, by extension, our collective) ass(es) by stepping in last weekend to finish up another round of release candidates (which turned out to be _RELEASE candidates), but then pulled the same burnout trick I had. So I swapped back in for him to finish it all up. There have been a bunch of handoffs of a sort we’ve never really done before, but it’s seemingly worked thus far.
I’ve done late nights before, as most of the Mozilla project is used to, but not as consistently during any release as I have during this one, except 1.5.0.2. Maybe.
Anyway, tonight will be very different from the morning I was here working sometime around the RC1 cycle and fell asleep, only to be awoken by the finance guys coming in promptly at 8:30 am.
And it’ll be different than the night I had to come in to do some binary signing, and “accidentally” had a party no one was supposed to know about, which, incidentally, “leaked” the top-secret information of what the place where I spend all my time building and releasing looks like.1
Yes, this morning will be different because, armed with a pillow and snuggly blanket from home, I’m totally prepared to snag a quality 2.5 hours of shuteye, just to ensure maximal freshness for release activities.


I’ve learned my lesson.
I never thought I’d still be having slumber parties at this age.
But I must admit, I’m actually kinda happy that I still am.
______________________
1 Number one comment from people walking by? “You don’t, by chance, like planes… do you?”

Eine Identität, ein Netz, eine Firma.

02/19/2007

Having spent most of my weekend moving1, I haven’t had much time to miss the fact that I don’t yet have Innerwebs at the New Hotness.2
Being the geek that I am, I of course called Comcast to set up a deactivate/reactivate service appointment, and they promptly showed up on moving day. But then, after a rousing twenty-minute round of “coax cable hunt,” followed by bothering my [new] downstairs neighbor to find out why the cable goes into her balcony but does not pop out of the floor of mine, I got some sordid tale involving remodeling-gone-wrong-three-years-ago-and-the-cable-was-broken-after-that-and-you’ll-have-to-drill-through-concrete-and-that-involves-talking-to-the-homeowners’-association-but-they-only-meet-once-every-three-decades.
Or something like that.
Not such a huge deal since I’ve been busy packing, unpacking, and repacking. But since I decided to take it easy tonight, I really started jonesing for the Series of Tubes.
On a lark, I decided to see if there were any open Whee-fee access points. I’m still in Mountain View, so I wasn’t too surprised when my laptop associated with an access point called “GoogleWiFi.”
I had never used it before, but knowing Google and their penchant for recording-every-single-thing-any-user-ever-searches-for, I wasn’t very surprised when they wanted me to log in with my Gmail account before I could do anything.
In some sense, while it kinda grates on me, I admit that I can’t complain about it too much. I’ll also openly say: I’m pretty impressed with the quality of the service3, and they don’t seem to filter out things like outbound ssh and such.4
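(For what it’s worth, the shunting footnote 4 alludes to takes exactly one command with OpenSSH’s dynamic forwarding; the hostname below is hypothetical, but the flag is real:

$ ssh -D 1080 preed@some-trusted-box.example.com

Point the browser’s SOCKS proxy at localhost:1080, and all of your web traffic rides the encrypted tunnel instead of the open Mountain View airwaves.)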
(It also makes me wonder how many Mountain View-ites are completely basking in Google-backed electromagnetic radiation, since I have no idea where the Google access point is, but I’m writing this from my bed in my new apartment, and I’m getting an average of 44 msec ping times to the default gateway with about four percent packet loss.)
But none of that is actually what I was thinking about when I started writing this.
No, what prompted this particular episode of blahginess was the blatant [and somewhat unsettling] realization of how much I happen to rely on a particular entity (to say nothing of that entity being a publicly traded corporation) to provide me with a chunk of my daily Internet experience.
This realization took form after I clicked on someone’s Picasa web album, linked to from a blog post indexed by Google Reader, all served up by [some AP claiming to be] GoogleWiFi.
As I retraced the steps of my quick check-in this evening, I realized I’d spent most of it on Google properties: Gmail, Reader, Google News.
And I don’t know how I feel about that.5
Either way, it now makes the people who say there will be a “shadow GoogleNet” that we’ll all be using in five-ish years sound less… insane.
But, until the homeowners’ association can get its butt in gear to get stuff fixed6, I’m pleasantly surprised such a service exists and actually works… and will gladly use it to let you all know that I’m somewhat troubled by the thoughts I’m having while mulling over its privacy- and future-of-the-Internet implications as I do so.
____________________
1 One almost-down, five to go
2 As rhelmer would tell you, this has been my new catch phrase over the last week. “New Vista-Signing Hotness.” “New BuildBot Hotness.” Etc. Yes, it’s pretty 1995-ish.
3 Which is to say it actually works, and is usable
4 If I really cared about the privacy implications, I could shunt all of my web traffic through an ssh proxy (as sketched above). I won’t, however, be using this particular link to do any Interwebs banking.
5 Of course, it can’t be very great and/or squee-u-lar, since I took the time to write this…
6 That’s not me being purposefully vague; I don’t think anyone really knows what the problem is, and thus any solutions are still scary and mysterious.

Head in the Clouds, Charlie

02/13/2007

My IFR clearance—a concept which I’ve been meaning to devote an entire bloggity-blog post to—was issued by air traffic control as follows:

Cessna three-two-three-romeo-foxtrot is cleared to the Napa County Airport via: on takeoff, right turn, heading zero-six-zero within one nautical mile of the airport, radar vectors IMPLY intersection, Victor one-oh-seven, Oakland, Victor one-ninety-five, CROIT intersection, Victor one-oh-eight, Scaggs Island, direct. Climb and maintain three thousand; expect five thousand, five minutes after departure. Departure frequency is one-two-one-point-three, squawk zero-three-two-three.

It took me about ten minutes to set up the departure.
Everything went mostly fine… until my vacuum pump failed.
Then it got interesting.

Read More

Head in the Clouds, Bravo

02/11/2007

The FAA, in its infinite wisdom—and, for once, I’m actually not being sarcastic while referring to the concepts of “the FAA” and “wisdom”—allows you to log up to twenty of your required forty hours of in-IFR-conditions training in a sim. This tends to help out, since flying a sim is around $50/hour (depending on the sim) and flying a real plane is $130+/hour (depending on the plane).
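A quick back-of-the-envelope with those rates: doing the full twenty allowable hours in the sim instead of the plane saves at least 20 × ($130 − $50) = $1,600.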
As such, I’ve been putting some amount of sim time in lately, and I think I’ve begun to discover an interesting secret of IFR flying.
Its application(s) to Real Life™, if any, are left as an exercise for the reader.

Read More

An Open Letter, Charlie

02/09/2007

Before I present my open letter, some context is necessary.
From #foxymonkies:

00:23 <@Ryan> @preed-scotch: Drinking at the office? :0

00:24 <@preed-scotch> Ryan: hell yeah.

00:24 <@preed-scotch> I’m stuck here until 4 am.

00:24 <@preed-scotch> What would *you* be doing if you were stuck here until 4 am?

00:24 <@Ryan> ouch

00:24 <@preed-scotch> so

00:24 <@preed-scotch> I’ve got me some Aqua Teen Hunger Force DVDs

00:24 <@preed-scotch> and three bottles of scotch

00:24 <@preed-scotch> and three hours to kill.

00:25 <@preed-scotch> time to get tore up.

00:25 <@preed-scotch> followed immediately by signing Win32 builds.

Now then…
Dear Cygwin,
Please kindly FOaD.
Love,
preed

An Open Letter, Bravo

02/05/2007

Dear Firefox Community,
I need your help here: how do you best pimp Firefox in three minutes?
Slashdot recently posed this question in regard to Linux, and at the time, I mostly ignored it, because… well… I didn’t find it particularly relevant.
But then, in a coffee shop this weekend, I was getting my morning cup o’ awakeness, and the woman behind me in line started up a conversation: “Oh, I have Firefox installed! But I was going to delete it. I like your shirt though…”
I smiled and said “No! You should start using it! We just released a new version.”
She asked “Well… why should I use Firefox?”
I first said “Well, because it’s more secure than Internet Explorer,” and then quoted the stats: IE was insecure for “like hundreds of days last year,” while Firefox was insecure for “like a week or something.”
She said “Wow, that’s really cool,” but then seemed underimpressed.
I then said “It’s also one of those ‘feel-good’ things, you know; Firefox is built and supported by a community of people, and we work really hard to build a browser with only users and their online experience in mind.”
“Yeah, that’s true,” she said, and then added, “I originally downloaded it because Safari wouldn’t display something I wanted to see.”
Before I could really respond, it was time to order my coffee, and then, since she was behind me in line, we both got distracted and never got to finish the conversation.
So, my question to you: if you only had two to three minutes to talk up Firefox and get someone to keep it on their machine, how would you explain why it’s the best browser around to someone who doesn’t care about (or maybe even understand) “attack surfaces” and “days of exposure,” and who gets “The Community” in the abstract, but… not enough to make it relevant to their personal life?
Which aspects of the ‘fox we all know and love so well would you focus on in 150 seconds?
Aaand… go!
Sincerely,
Your Friendly Build Engineer
P.S. I’ve also been trying to figure out whether or not she was flirting with me, but… that’s another post altogether.

“Your mission, should you choose to accept it…”

02/02/2007

Mozilla IT (thanks Aravind and Justin!) recently archived to tape a huge set of builds from the mozilla.org FTP staging server.1
These aren’t just the builds; the tape also holds the original build artifacts, from the original tinderboxen, through what was released (and probably [hopefully?!] is still available) on the FTP server.


I asked IT to make three copies of this particular backup tape: one to store with the rest of our backups, one for the Build Team to keep offsite2, and one to keep significantly offsite.
To achieve the last requirement, Beltzner, who happened to be in town this week, will be helping me to find a safe, shady spot in the Toronto office for this little bit of Mozilla Project history.
_________________
1 Thus reclaiming a bunch of space so we could keep… releasing software.
2 Which is likely to translate to “somewhere in my apartment.”

A Spoon Full of Triage…

02/02/2007

It’s a sour medicine, but if the list is ever going to become… realistic, it has to happen.
I, too, joined Coop and rhelmer in making my bug list reflect reality this evening. Now I only have five bugs!!
What ever will I do with all my free time?!
I think the weirdest part of going through my bug list was finding bugs I had actually already fixed, just never RESOLVED. D’oh!
Going through the queue also prompted me to update one of everyone’s favorite bugs (with good news, even!) and file a new, necessary bug (it’s that time of year… although I hear from the ReedBot it’s a dup.)
I don’t expect our bug list will reflect reality in the short term… but I think we can get there in the medium term. It will require being realistic about it, and it will involve gnashing of teeth in some cases, but… it’ll be nice to look at an open bug list that doesn’t lie.

Head in the Clouds, Alpha

02/01/2007

I had a couple of people ask me about a recent post.
Their inquiry basically amounted to “Woah… are you ok?”
Truth be told, I was appreciating the picture on a “sometimes you can’t quite see where you’re going, but if you have the right tools, you’ll probably be OK”-level, not an apocalyptic “OMG We’re all going to die!”-level.
Plus, as I said, I really enjoy the composition.

***

Speaking of “IFR ahead,” as the blogosphere has become a permanent fixture of the InnerWebs, people have started diarying their experiences while earning various airplane-related ratings.
I didn’t do any such thing for my private pilot’s license, probably because I had so many things going on in my life when I started, and I dragged it out for so long… but I thought I’d try and write a bit about my experience getting my IFR ticket.
(I’ll always make these mostly-extended posts, since they’re not particularly Mozilla-specific.)

Read More

Downplaying the “Distributed” Dogma

01/30/2007

Benjamin recently wrote about the current state of our effort to try to import our CVS repository to… something from this century.

His conclusion is spot on, although I think it… minimizes the head-banging he and I have been going through for a couple of weeks. My original characterization didn’t turn out to be far from the truth, it seems… except, it’s me and my quad-core-P4-with-4-gigs-of-RAM sitting there, bloodied and bruised on the floor, not ClearCase. ;-)

I was somewhat surprised by the number of responses to Benjamin’s post that seemingly amounted to “Can’t you just use Subversion? Subversion works. And if you want distributed, use SVK.”

Well, the first issue with that is cvs2svn1 doesn’t seem to import the Mozilla CVS tree anymore: it hits the error that Hg tends to hit2, and while completely dying is arguably more correct, the approach bzr and cvs2svn 1.3.0 take—annotating and ignoring the error so the import can actually continue—is much more satisfying.

The second issue is that the march towards a distributed version control system really isn’t about a distributed version control system; it’s about using a tool that supports merging algorithms that weren’t invented in the ’80s, back when you never did branches anyway, because branching was annoyingly difficult with the tools of the time.

During the original discussion, the main issue that limited Subversion’s advancement in the race was that it doesn’t support any better merging functionality or techniques than its predecessor: it requires external tools to record which merges have been performed, and the actual merge algorithms are the old ones we all love and/or hate.
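To make that concrete, here’s roughly what the manual bookkeeping looks like (revision numbers and URLs invented for illustration): you replay a revision range by hand and record what you merged in the commit message, because the tool won’t remember it for you.

$ svn merge -r 1000:1100 http://svn.example.com/repos/trunk
$ svn commit -m "Merged trunk r1000:1100 into my-branch"

Forget which ranges you’ve already merged, and you get to re-resolve the same conflicts all over again; external tools like svnmerge.py exist precisely to automate that note-keeping.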

Now don’t get me wrong: I use Subversion for all my personal stuff and I like it. I think it’s a great improvement over CVS (which I used for years and imported from) and in many (most?) cases, I would recommend it.

But when you’re going to be doing the kind of “agile”3, disruptive, reconstructive work that Mozilla 2.0 requires, you need, at a minimum, a tool that makes branching and merging easy. SVN does work for me (and for lots of other people and projects) because I’m not faced with, say, renaming nsIFrame::GetPresContext, a task where a branch makes a lot of sense, since it involves hundreds of renames.
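For contrast, here’s a sketch of how a tree-wide rename like that might flow in, say, Mercurial using a named branch (the branch and commit names are mine, purely for illustration):

$ hg branch getprescontext-rename
$ # ...do the hundreds of renames, building and testing as you go...
$ hg commit -m "Rename nsIFrame::GetPresContext throughout the tree"
$ hg update default
$ hg merge getprescontext-rename
$ hg commit -m "Merge the GetPresContext rename"

The point isn’t the specific commands; it’s that the tool tracks what has been merged, so landing the branch is one operation instead of an archaeology project.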

I contend that it’s not so much that we require (or necessarily even want) a “distributed” version control system. In fact, as a counterexample, Perforce is a [closed-source] centralized VCS that has a lot of great features, including merging primitives that are awesome. AccuRev is another (although I’ve never personally used it).

We just happen to be focused on “distributed” VCSs because those are the only open source offerings that have merging facilities that handle complicated situations and get the merging stuff right. This is likely because a distributed version control system isn’t worth anything if you can’t merge your work back in easily and [more importantly] reliably.

I’ll concede, of course, that once you have things like offline diff/commit and easy patch sharing among peers, all built into and tracked by the VCS, that’s (possibly addictive) icing on the cake.
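A sketch of that icing, again with Mercurial (the file name and commit message are invented):

$ hg commit -m "Fix the tinderbox burning"   # commits locally; no network required
$ hg export tip > fix.patch                  # wraps the change up as a mailable patch
$ hg import fix.patch                        # which a peer can apply, metadata intact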

But it’s not about the “distributed” part. It’s about the capable-merging part.4

Breaking code apart is easy. Putting it back together is hard.

We want and need a tool that intrinsically expects, is designed to handle, and expertly supports the latter.

_____________________
1 As of 1.5.0
2 Which amounts to deleting files from branches on which they don’t [yet] exist.
3 I hate using that [buzz] word.
4 Coincidentally, Joel recently blogged about version control systems and large teams, and it seems the Windows team uses a model very similar to that of the 2.6 kernel developers, and possibly similar to what we’ll end up using. It seems that easy branching (which is easy) and easy merging (which is hard) is the only real way to scale a development project into the thousands.
