
lsf 2007, FAST

Monday morning Sara ran off to catch her plane, and I went downstairs to the Linux Storage and Filesystem workshop. It ran for two days, Monday and Tuesday. I was attracted to it because I have a poor tolerance for talks--I just can't imagine a more deathly boring and inefficient way to transfer knowledge than talking to a large group in front of a bunch of PowerPoint slides. The workshop had shorter time slots, often with several speakers grouped together, and a smaller attendee list.

And, sure enough, people did ask lots of questions and argue a bit during the talks, which I thought was good. I still found it a little tiresome after a while, but I got to meet a few people and have some useful conversations.

The rest of the week is the FAST conference, which is all talks with big audiences and not so interesting to me, so I'm just sitting outside, catching up on a bit of work and reading the abstracts so I have something to talk to people about after they come out of the talks....

The first two papers were about studies of hard drive failures, with a couple of results that were new to me at least--I hadn't realized that hard drives tend to fail at a more uniform rate over their lifetime than the typical "bathtub curve" would suggest. But I guess it makes sense given that they're made of complex moving parts, not just electronics.

The third talk presented data from a huge study of filesystems on PCs at Microsoft. The sampling design seemed totally bonkers to me--they sent a mass email to employees, who were entered into some kind of drawing. The response rate was 22 percent, but they ended up with tens of thousands of filesystems. So they saw lots of filesystems, but does that kind of response rate actually allow any inferences about the whole population? Seems like a mistaken emphasis on quantity over quality. What do I know.