Russell's Blog

New. Improved. Stays crunchy in milk.

A new Prometheus

Posted by Russell on December 08, 2012 at 2:33 a.m.
One year ago, UC Davis law student Megan Glanville was killed a stone's throw from my front door. She was crossing the street for a morning run. It was foggy. The driver didn't see her.

Since then, the intersection where she died has been redesigned. It is now a three-way stop with modern LED lighting, and watching over the scene is a new flashing red beacon.

This sort of infrastructure is easy to take for granted. As a Commissioner for the City of Davis, I suppose I pay closer attention to these things than most people do. I've paid particular attention to this little piece of city infrastructure because I pass through it several times a day.

Something has changed there since the red beacon went up. Up and down the boulevard, for almost a mile, there are crossings to access the bicycle path. Drivers now stop and let me cross. They never did that before. I am not exaggerating when I say that wherever the beacon's light falls, the feel of the street has changed. It's no longer the tail end of a lonely country road. It's a neighborhood street, and people act accordingly.

I would like to think that drivers feel the significance of the flashing beacon. I would like to think that they have noticed that the intersection has been redesigned. I would like to think that they know that Megan Glanville died there. In all likelihood, they are oblivious to these things. They stop and smile and wave me through anyway.


Good design matters. That's why.

The rocky ledge runs far into the sea,
And on its outer point, some miles away,
The Lighthouse lifts its massive masonry,
A pillar of fire by night, of cloud by day.

Even at this distance I can see the tides,
Upheaving, break unheard along its base,
A speechless wrath, that rises and subsides
In the white lip and tremor of the face.

And as the evening darkens, lo! how bright,
Through the deep purple of the twilight air,
Beams forth the sudden radiance of its light
With strange, unearthly splendor in the glare!

Not one alone; from each projecting cape
And perilous reef along the ocean's verge,
Starts into life a dim, gigantic shape,
Holding its lantern o'er the restless surge.

Like the great giant Christopher it stands
Upon the brink of the tempestuous wave,
Wading far out among the rocks and sands,
The night-o'ertaken mariner to save.

-- The Lighthouse, Henry Wadsworth Longfellow

On a superficial level, a flashing red beacon is a utilitarian thing. If you look more closely, you will see that it is also a thing of beauty. It is an avatar of the compulsion we all feel to protect, to warn, to guide. The humble beacon is one of the better angels of our nature, sculpted with massive limbs of galvanized steel and eyes of electrically exuberant gallium phosphide. It sends our message out into the world, again, and again, and again.

be careful

be careful

be careful

be careful

be careful


Trouble with SoftSerial on the Arduino Leonardo

Posted by Russell on May 25, 2012 at 12:05 p.m.
While I was wandering around at Maker Faire last weekend, I heard someone say, "Woah, is this the Leonardo?" And lo, there was a handful of Arduino Leonardo boards lined up on a shelf for sale. I instantly grabbed one, and bundled it home to play with it.

The Leonardo is Arduino's latest board, announced last September. It uses the Atmega32u4 chip, which has onboard USB. This has two important implications: first, the Leonardo costs less than the Uno, and second, it can operate in any USB mode. That means people can make Human Interface Devices (HID), like mice and keyboards and printers, with Arduino, and present them to the host using the standard USB interfaces for those devices. You can build things that don't need to talk via serial, and instead use the host's built-in drivers for mice and printers and whatnot. This is a big step forward for Open Hardware.
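Just to make the HID idea concrete, here is the sort of thing that becomes possible. This is a sketch of my own devising, not production code: it assumes the Keyboard routines the Arduino IDE provides for USB-capable boards, and the button on pin 2 is an arbitrary choice.

```cpp
#include <Keyboard.h>  // USB HID support; only works on boards with onboard USB

const int buttonPin = 2;  // arbitrary choice of input pin

void setup() {
  pinMode( buttonPin, INPUT_PULLUP );  // button wired between pin 2 and ground
  Keyboard.begin();                    // enumerate as a USB keyboard
}

void loop() {
  // While the button is held, the host sees ordinary keystrokes --
  // no custom driver, no serial port.
  if( digitalRead( buttonPin ) == LOW ) {
    Keyboard.print( "hello from the Leonardo" );
    delay( 500 );
  }
}
```

To the host operating system, this is indistinguishable from a real keyboard.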

Anyway, I'm developing a little remote environmental data logger to use for part of my dissertation project, and I thought I'd see if I could use the Leonardo board in my design. I'm using the Arduino board to talk to an Atlas Scientific pH stamp, which communicates by serial. It works fine on the Uno with SoftwareSerial (formerly known as NewSoftSerial, until it was beamed up into the Arduino Core mothership).

Unfortunately, it didn't go so well on the Leo. The board can send commands to the pH stamp, but doesn't receive anything. I swapped in an FTDI for the pH stamp, and confirmed that the Leonardo is indeed sending data, but it didn't seem to be able to receive any characters I sent back. I tried moving the rx line to each of the digital pins, and had no luck. Here is my test program :

#include <SoftwareSerial.h>

#define rxPin 2
#define txPin 3

SoftwareSerial mySerial( rxPin, txPin );

byte i;
byte startup = 0;

void setup() {
  mySerial.begin( 38400  );
  Serial.begin(   9600   );
}

void loop() {
  if( startup == 0 ) {             // begin startup
    for( i = 1; i <= 2; i++ ) {
      delay( 1000 );
      mySerial.print( "l0\r" );    // turn the LED off
      delay( 1000 );
      mySerial.print( "l1\r" );    // turn the LED on
    }
    startup = 1;                   // don't re-enter
  }                                // end startup

  Serial.println( "taking reading..." );
  mySerial.print( "r\r" );
  Serial.println( mySerial.available() );
  delay( 1000 );                   // give the stamp time to answer
}
On the Uno, I see the number increasing as the read buffer fills up :
taking reading...
taking reading...
taking reading...
taking reading...
taking reading...
On the Leo, it seems that nothing ever gets added to the read buffer, no matter how many characters I send over from the FTDI or which pins I use for the rx line :
taking reading...
taking reading...
taking reading...
taking reading...
taking reading...
taking reading...
I really wanted to see if I was crazy here, but I'm one of the first people among the General Public to get their hands on a Leonardo board. So, I started talking with Ken Jordan (he goes by Xark) on #arduino on Freenode, who has a similar board, the Atmega32u4 Breakout+. It's based on the same chip as the Leonardo, but it has different pinouts and a different bootloader. He flashed the Leonardo bootloader onto his board, and worked out the following pin mapping :
Arduino 1.0.1     Adafruit         ATMEL
digitalWrite pin  atmega32u4+ pin  AVR pin function(s)
----------------  ---------------  ------------------
D0                D2               PD2 (#INT2/RXD1)
D1                D3               PD3 (#INT3/TXD1)
D2                D1               PD1 (#INT1/SDA)
D3#               D0               PD0 (#INT0/OC0B)
D4/A6             D4               PD4 (ICP1/ADC8)
D5#               C6               PC6 (OC3A/#OC4A)
D6#/A7            D7               PD7 (T0/OC4D/ADC10)
D7                E6 (LED)         PE6 (INT6/AIN0)
D8/A8             B4               PB4 (PCINT4/ADC11)
D9#/A9            B5               PB5 (OC1A/PCINT5/#OC4B/ADC12)
D10#/A10          B6               PB6 (OC1B/PCINT6/OC4B/ADC13)
D11#              B7               PB7 (OC0A/OC1C/PCINT7/#RTS)
D12/A11           D6               PD6 (T1/#OC4D/ADC9)
D13#  (LED)       C7               PC7 (ICP3/CLK0/OC4A)
D14   (MISO)      B3               PB3 (PDO/MISO/PCINT3)
D15   (SCK)       B1               PB1 (SCLK/PCINT1)
D16   (MOSI)      B2               PB2 (PDI/MOSI/PCINT2)
D17   (RXLED)     B0               PB0 (SS/PCINT0)
D18/A0            F7               PF7 (ADC7/TDI)
D19/A1            F6               PF6 (ADC6/TDO)
D20/A2            F5               PF5 (ADC5/TMS)
D21/A3            F4               PF4 (ADC4/TCK)
D22/A4            F1               PF1 (ADC1)
D23/A5            F0               PF0 (ADC0)
-     (TXLED)     D5               PD5 (XCK1/#CTS)
-     (HWB)       -  (HWB)         PE2 (#HWB)
This was derived from the ATmega 32U4-Arduino Pin Mapping and ATMEL's datasheet for the ATmega32U4 chip. Once that was worked out, he flashed my test program onto his board, and also found that SoftwareSerial could transmit fine, but couldn't receive anything.

Ken rummaged around a little more, and had this to say :

The SoftSerial seems to use PCINT0-3 so there seems to me a minor problem in Leo-land in that only PCINT0 appears to be supported (and it is on "funky" output for RXLED). Hopefully I am just misunderstanding something (but it may be the interrupt remap table is incorrect for Leo).
Then he disappeared for a little while, and came back with :
I have confirmed my suspicion. When I disassemble SoftSerial.cpp.o I can see that only __vector_9 is compiled (i.e., one of 4 #ifdefs for PCINT0-3) and the interrupt vector 10 is PCINT0 (0 is reset vector so offset by one makes sense). So, unless you hook serial to RXLED pin of CPU I don't believe it will work with the current libs.

Also I believe the Leo page is just wrong when it says pins 2 & 3 support pin change interrupts (I think this was copied from Uno but it is incorrect, the only (exposed) pins are D8 D9 D10 and D11 that support PCINT according to the ATMEL datasheet (and these are PCINT 4-7 not the ones in the interrupt mapping table AFAICT).

I believe this is where I can stop worrying that I'd be wasting the time of the core Arduino developers, and say quod erat demonstrandum: it is a bug in SoftwareSerial. Hopefully they can update the Arduino IDE before the board hits wider distribution.

Update : So, it turns out that this is a known limitation of the Leonardo. David Mellis looked into it, and left this comment :

You're right that the Leonardo only has one pin change interrupt, meaning that the software serial receive doesn't work on every pin. You should, however, be able to use pins 8 to 11 (inclusive) as receive pins for software serial. Additionally, the SPI pins (MISO, SCK, MOSI) available on the ICSP header and addressable from the Arduino software as pins 14, 15, and 16 should work.
He is, of course, correct. I'm not sure why my testing didn't work on pins 8-11, but they do indeed work fine. Unfortunately, this means that the Leonardo is not compatible with a number of cool shields. The Arduino SoftSerial Library Reference documentation has been updated with a more detailed list of limitations.
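If you're stuck in the same spot, the practical upshot is simply to move the receive line to one of those pins. Here is a minimal version of my test, reworked accordingly (the pin choices are mine, and 38400 baud is just what the pH stamp wants) :

```cpp
#include <SoftwareSerial.h>

// On the Leonardo, only pins 8-11 (and the SPI pins, 14-16) have pin
// change interrupts, so the SoftwareSerial rx pin must be one of them.
#define rxPin 8   // pin change interrupt capable; receive works here
#define txPin 9   // transmit worked on any pin all along

SoftwareSerial mySerial( rxPin, txPin );

void setup() {
  mySerial.begin( 38400 );
  Serial.begin( 9600 );
}

void loop() {
  mySerial.print( "r\r" );            // ask the pH stamp for a reading
  delay( 1000 );                      // give it time to answer
  while( mySerial.available() )
    Serial.write( mySerial.read() );  // echo the reply to the host
}
```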

Moving forward by stopping

Posted by Russell on April 02, 2012 at 4:06 p.m.
Just three weeks after I was sworn in for my term on the City of Davis Safety and Parking Advisory Commission, UC Davis law student Megan Glanville was killed just a few dozen feet from my doorstep. She was out jogging on a foggy morning, and a truck coming into town from the county road ran her down in the crosswalk. I never knew Megan, but her death deeply upsets me.

I've been worrying about pedestrian and bike safety ever since my little sister was nearly killed by a careless driver.

I find it extremely frustrating that most people do not look beyond the (usually imagined) behavior of the people involved in an accident like the one that almost killed my sister, or that did kill Megan Glanville. Either they identify with the frustrating experience of driving, and blame the victim, or they side with the law, and place the responsibility at the feet of the operator of the more dangerous vehicle. I will always side with the person who suffered more, but both views are myopic. When someone has been killed in an accident, the question of who was more "right" in that sliver of time is irrelevant. It is worse than irrelevant; it is an insult to the lives of all the people affected.

There are other, far more urgent questions that need to be raised. If you see a problem, the first question you should always ask is, "In what way am I responsible for this?" We are all bound together by bonds of mutual responsibility, and nothing happens among people, good or bad, for which each of us is not in some sense responsible. That is what words like "society," "community," and "civilization" mean. They describe the fact that the bonds that link us together are fundamentally inescapable. There is such a thing as integrity, but there is no such thing as self-reliance. Interdependence is the very essence of what makes us human. And so, if you see something that upsets you, the first thing you should look at is your own role in causing it. Through our choices, we were all present on the morning that George Souza killed Megan Glanville. You. Me. Everyone. We all had a hand in it.

Clearly, we failed. You failed. I failed. Someone is dead as a consequence of that failure.

So, let us set aside the choices of George Souza and Megan Glanville, and look at the choices we made that contributed to this terrible thing. They are easy enough to see :

This is the crosswalk where Megan was killed, which is part of a system of roads that belong to the City of Davis. The arrow on the yellow sign is pointing almost directly at the spot. The laws that govern the design of the road are a kaleidoscopic fugue of local, county, state, federal and international regulations. Within that often contradictory matrix of statutes, the city government has a small keyhole of authority within which it may choose what the road looks like and how it works.

From an engineering point of view, it's pretty clear what the problem is. The road on the left is just a stone's throw from the border of the city. Beyond the border, it is a wide county road that cuts a nearly straight line for miles among orchards and farms. When it crosses into the city, this road suddenly plunges into a dense residential neighborhood with no transition whatsoever. The intersection where Megan was killed is the very first intersection an eastbound driver encounters in the City of Davis. So, drivers come in from the county road going at county road speeds, and roar through this intersection where people are trying to cross to the bike path that parallels the road. Add a little darkness and a bit of fog, and the accident was basically inevitable.

Why was this intersection designed this way? I don't know. According to the laws and statutes that regulate its engineering, there is nothing particularly wrong with it. But then again, houses that catch fire and burn people alive inside are often built to code. Compliance with the law is not enough. Only thoughtful design can keep people safe, and the absence of that thoughtfulness killed someone.

So, who is to blame? The legislators who wrote the statutes describing how intersections should be designed? The engineers whose designs were constrained by those statutes? The City of Davis Public Works Department that built and maintained it? Surely, some of the responsibility falls to them. But not very much. If you've ever driven, walked or bicycled through the intersection of Lake and Russell, then a great deal of the responsibility falls on you. If you've ever felt uncomfortable or unsafe while passing through it, then you knew someone would get hurt there sooner or later.

The Council Chambers are open to the public. The meetings and agendas are available weeks in advance for all to see. You can even submit your concerns in writing if you don't have time to come to the meetings. In other words, you had the reason and the means to get this fixed, or at least play a part in getting it fixed, before Megan Glanville was killed. I share in this responsibility; I serve on the commission charged with advising the City Council on these things, and I did not raise this issue either. And I use this intersection several times a day. And I always feel unsafe. It is my fault too.

So, here is what is going to happen. The City Council was asked, and agreed, to take steps to prevent anyone else from getting killed. The proposed changes will add stop signs on Russell Boulevard in both directions, a blinking red light in case drivers don't see the stop signs in the fog, and four new street lights for better illumination overall. It will cost about $20,000.

This is a much better design. It's impossible to know if it would have saved Megan's life had it been in place in December, but it seems likely that it would have. I strongly support it.

Roads are not natural phenomena. They are public infrastructure, and they are designed and built and maintained in exactly the way the public asks them to be. Let's try to do a better job of holding up our end of that conversation.

Ultimaker EM leakage

Posted by Russell on February 01, 2012 at 4:06 p.m.
Steven Lucero, the machinist for Biomedical Engineering, just finished building his own (well, the college's) Ultimaker 3D printer. BME is in the process of setting up a pretty awesome rapid prototyping facility, with an Objet Eden 260, a MakerBot and now an Ultimaker. Steven is also setting up a little electronics lab to go with the 3D printers. A few weeks ago, he asked me for a shopping list of items that would be useful for electronics hacking, which I was delighted to provide.

One of the things I had to put on my shopping list for Steven was a basic oscilloscope. It's amazing how useful these things are. I've wanted my own o-scope for a long time, and so I happened to have a bunch of low-cost o-scopes bookmarked. Steven ended up buying exactly the one I would have bought for myself, which is sold by Sparkfun (they're sold out right now, unfortunately).

I wanted to see if I liked the software, so while Steven was printing something on his Ultimaker, I put two of the probes next to the X and Y stepper motors to measure the EM leakage.

Like most modern digital oscilloscopes, this one lets you freeze the trace and save the data as a CSV file onto a USB key. So, here's what the EM leakage from the stepper motors looks like during a print :

Cool, huh?

Teaching, week 2

Posted by Russell on January 28, 2012 at 1:50 a.m.
For reasons I don't fully comprehend myself, I decided it would be perfectly normal and reasonable to invent a class in a peculiar new topic, and then try to convince the University to let me offer it as a course. To my surprise (and horror), they let me do it. Thus was born Robotics for Laboratory Applications, under the auspices of UC Davis Biomedical Engineering and the good graces of Marc Facciotti.

Seven students signed up. Two of them I already knew because they were on the UC Davis iGEM team, which shared our laboratory space during the year they worked on the project that won them Best Foundational Advance. I'm still getting to know the other five, but so far I'm impressed with them. UC Davis has some pretty brilliant undergraduates.

This is my first time teaching in an official (although perhaps not quite formal) capacity, and it's kind of interesting to see how things look from the other side. When a professor distributes a handout, for example, it doesn't seem like a big deal. However, it's kind of surprising how long it takes to print, collate and staple seven copies of everything. I definitely underestimated that today, and barely had their safety information sheets and IT policy documents ready in time. OK, I didn't have them ready in time, but fortunately one of the students hadn't finished eating his lunch, and this gave me an excuse to disappear for a minute.

The purpose of the class is to design a "minimally invasive" extension for our 3D printer that will allow us to use it as a general purpose laboratory robot. Friday's class was devoted to narrowing down the scope of the project to focus on a single function. We kicked around a lot of cool ideas, but didn't quite settle on a single one yet. I've set next Friday as the deadline for reaching a consensus.

One of the functions we might implement is a pipetting robot. We were wondering how well this would work. Just to illustrate the idea, I suggested they just give it a try.

Yes, that is just a pipetter taped to the hot end of our 3D printer. With two pieces of masking tape. To our surprise, I was able to maneuver the pipetter into a tip, seat the tip and position it over a small bottle-cap full of water. Operating the plunger manually, it worked.

We were not at all expecting to be able to get a good seal between the pipetter and the tip, but it worked just fine. I tried it a couple of times after the class, with different tips and pipetters, and didn't have any problem. Very encouraging, in terms of feasibility.

Say it with us : Another open letter to the Chancellor

Posted by Russell on November 23, 2011 at 2:05 a.m.
Dear Chancellor Katehi,

I know this joins a growing list of open letters addressed to you, but you will find that this one really is addressed to you, rather than at you.

On Saturday, I signed the petition for your resignation. After this evening's townhall meeting, I withdrew my signature. You've restored some of my confidence in your ability to lead this campus, although reservations remain. The way I see it, you have two choices: lead, or resign. I would prefer that you lead.

Unfortunately, it seems that you are not getting the best advice in that regard. I offer these thoughts in the hope that they will point the way.

Sometimes, it is necessary to break a small rule in order to protect a more important rule. Civil disobedience is not disregard for rules in general; it is a statement about the relative importance of two rules that are, or have become, contradictory. The Civil Rights movement broke many local ordinances and state laws, but it did so in order to push the country into compliance with the Fifth and Fourteenth Amendments of the United States Constitution. This is a proud and honorable part of American history, and has been a model for tremendous positive change around the world.

The Occupy protests belong to this tradition; they are peacefully but deliberately breaking a small law, in this case, ordinances against camping, in order to protect the country from the existential threat of economic nihilism. UC Davis was occupied by its students because they object to its destruction.

You spoke powerfully this evening about the burning of universities and libraries in Greece. "No one has the right to destroy the public's property," is how I think you put it. I am absolutely in agreement. I believe it is indeed your duty as chancellor to protect this campus, because there are indeed anarchists who are very eager to burn it down. However, the anarchists who threaten UC Davis are not camping on the quad tonight. They are in Sacramento and Washington D.C.

One of the most difficult problems in politics is building coalitions. You are always divided from your natural allies by social boundaries. Race, religion, age, region, class and gender create boundaries to mutual comprehension. Professors are separated from students by the roles each must play in the classroom, and administrators are separated from both professors and students by billowing layers of university bureaucracy.

You have fought for this campus in Sacramento and Washington, and you have seen just how frustrating and lonely it is to fight for public education these days. You clearly understand how important places like UC Davis are for the future of this country. But on Friday, your officers used chemical weapons on people who were engaged in the very same fight.

So, let me make a suggestion. Don't just apologize. Don't just seek to heal and muddle forward. The panels, investigations, reports and meetings you discussed this evening are all important positive things, and by all means proceed with them. However, you must know that they will not save this campus from the fire that is coming.

Instead, look across the divide and recognize Occupy UC Davis for what it is; the banner of your true allies. They are the infantry in the battle to save public education. Join them. Next time you walk the halls of Congress to fight for this campus, don't just bring a few token students. Bring an army.

The past few days have made it painfully clear that while you can be quite articulate, you are not a skilled politician or tactician. This sits in stark contrast to the students, who have frankly run circles around you. If you're willing to fight for us, your errors can and will be forgiven. However, if you are going to be an effective advocate for this campus, you will need to employ the skills you have, and find trustworthy friends to help when you are out of your element. An alliance of the Chancellor's office and the student movement would be much greater than the sum of its parts, and would certainly be more productive than continuing to antagonize one another. If you want that alliance, you are going to have to ask for it in the language of mass protest; with symbols, not words.

You will find that there is plenty of room on the quad for another tent. There could be no symbol more unambiguous than joining the students in committing this trivial infraction. The world is watching.

Whose university? Our university. Say it with us, Chancellor.

Russell Neches
Graduate student, Microbiology
UC Davis

These are not the microbes you are looking for

Posted by Russell on October 04, 2011 at 6:45 p.m.
A few months ago, I tweeted, "I've been working in a microbiology lab for two years, and just realized we don't actually have a microscope. Huh."

Jack Gilbert and some other people proceeded to give me grief for what I intended as an interesting observation about the current state of the art in microbiology. So, I decided to remedy the situation. Evidently, we do have a microscope; I just didn't know where it was.

Here are some cool things I found by randomly poking around in some of my samples from Borax Lake. The first thing I found is probably some kind of diatom from the sediment of the little hot spring just north of Borax Lake. I'm not looking for diatoms, but it looks really, really cool.

Here they are at 100x magnification.

This is somewhat less cool-looking, but is probably what I'm actually looking for. In the little bubble of water surrounding the granule in the center, there were a couple little rods hopping around. No clue what they are, but they're there, doing what they do.

3D printing update

Posted by Russell on October 04, 2011 at 12:19 a.m.
I've been working a bit on the software that generates my 96-well dilution plate. I have a new version that cuts the plastic use by about 80% and print time by about the same. Also, it now prints with the wells upside-down on the build platform, which should help cut down contamination during the printing process.
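For reference, the geometry any such generator starts from is just the standard microplate grid. This little function is an illustration in C++ rather than a piece of the actual generator, but the 9 mm well pitch is the ANSI/SLAS microplate standard, so a printed plate that honors it stays compatible with ordinary titer-plate hardware :

```cpp
#include <array>
#include <utility>

// Well centers for an 8 x 12 (96-well) plate, with well A1 at the
// origin. The 9 mm pitch is the ANSI/SLAS microplate standard; the
// generator then just subtracts a well shape at each center.
constexpr double kPitchMm = 9.0;

std::array<std::pair<double, double>, 96> well_centers() {
    std::array<std::pair<double, double>, 96> centers{};
    for (int row = 0; row < 8; ++row)        // rows A..H
        for (int col = 0; col < 12; ++col)   // columns 1..12
            centers[row * 12 + col] = { col * kPitchMm, row * kPitchMm };
    return centers;
}
```

On this grid, A1 sits at the origin and H12 at (99 mm, 63 mm); skirt and well dimensions are where the real design work is.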

Things to do :

  • I'm going to try cutting the plastic use even more by adding a skirt around the plate (like a normal titer plate), and adjusting the outer height of each well.
  • Add a fill-line to each well.
  • Raise the well edges a little more, and add drain-holes between wells to prevent spillage between wells and to make filling easier.
  • Add embossed row and column labels.
  • Add an embossed text area for user notations (e.g., for which sample group is this plate calibrated).
Hmm. I might pull this off yet.

Also, if you are interested in this stuff, UC Davis Biomedical Engineering made me instructor of a variable unit class (graded P/NP) called "Research internship in robotics for the laboratory" for Winter 2012. Sign up for BIM192, sec 2 (the CRN is 24791).

Borax Lake : Sample collection and processing

Posted by Russell on October 01, 2011 at 1:57 a.m.
I've admired Rosie Redfield's lab-notebook-as-blog from the moment I started reading it, and I've been looking for an excuse to (steal? pilfer? abscond with?) adapt her idea. However, most of the work I've been doing up to this point has been computational, and try as I might, I can't make myself keep a lab notebook for programming. That's what things like GitHub are for. I've started actually doing some of the laboratory and field work for my thesis project, and so I finally have an excuse to do some open lab and field notebook blogging.

Rosie might be amused that I'm starting off with some work I'm doing at an arsenic-heavy lake, although Borax Lake is known more for boron than arsenic. If I bugger up my assays, I just hope she'll get on my case in the comments before I submit anything for publication.

I will write more about this as I go along, but my goal for my thesis project is to try to get an idea about the modality of microbial migration. Specifically, I want to know if microbial taxa, when they colonize a new environment, arrive individually or as an existing consortium. I hope to find out by reconstructing population structures from metagenomic samples from widely dispersed but ecologically similar environments.

I learned about Borax Lake from Robert Mariner of the US Geological Survey, who was kind enough to respond to my emails and patiently discuss his survey results over the course of several lengthy telephone calls. He also volunteered a lot of useful information, such as which sites have rattlesnakes and where they are likely to be found, and helped enormously in the search for and selection of sampling sites. Without this help, I probably would have had to give up on this project as I originally imagined it.

Borax Lake was one of many thousands of sites across the American West surveyed over the course of a 40-year USGS project led by Ivan Barnes and Robert Mariner to study the chemistry and isotopic composition of mineral springs. Extensive analysis of Borax Lake water was conducted in August of 1972 by John Rapp, and again in July of 1991.

Dr. Mariner pointed out that Borax Lake is administered by the Nature Conservancy, and a little bit of Googling and emailing got me in touch with Jay Kerby, the Southeast Oregon Project Manager for the Nature Conservancy. Jay was very helpful, and walked me through the process of obtaining sampling permits for my project.

Before I talk about Borax Lake, I need to say that it is absolutely essential that you obtain explicit, written permission before collecting samples. As scientists, we've got to get this stuff right if we want to avoid stuff like this. The fact that some researchers did not (for whatever reason) obtain permission to use the cells they used to make important discoveries, or did not cooperate in good faith with the originators of those cell lines, has made it much more difficult for me to do my own research. Kary Mullis, if you're reading this, thanks for PCR (really), but...

Anyway, I'm not sure if Borax Lake itself is going to be a good candidate for my project (it has very unusual chemistry), but it is surrounded by ephemeral pools of brine that may be good analogs to coastal salt ponds. You could think of this as island biogeography, but inverted; I'm looking for islands of ocean isolated by oceans of land.

The lake itself has a very peculiar mineralized ledge a few inches above the shore. The water has been precipitating an extremely hard material for a very long time. I tried to collect a small amount of it to examine in the lab, and discovered that it is as hard as concrete. Even with the aid of a hammer, I couldn't dislodge any small pieces. I didn't want to damage the ledge itself by taking a larger piece, so I left without any samples of the precipitated material. Borax Lake is sitting atop a thirty-foot-high pedestal of this stuff.

The Nature Conservancy has been working on some plans to make the site more accessible, but I don't imagine it will get many visitors. It is way off the beaten path. Next time I visit, I'm going to bring a truck. The lake is off of a very lonely state road, up several miles on unpaved, unmarked fire roads, followed by a few miles of ATV tracks. A horse would probably be the ideal way of getting there, but my trusty little Toyota still managed.

This is one of the hot springs just north of Borax Lake. The first record I have of it is from May 1957, by D.E. White of the USGS. It was visited again in June 1973 by Robert Mariner, again in September 1976 by Robert Mariner and Bill Evans, and next in July 1991 by Robert Mariner. I measured a surface temperature of 65°C. To my surprise, I saw a couple of Borax Lake chub swimming around near the cooler (but not much cooler) periphery.

I took four kinds of samples : Unprocessed water samples in 500ml bottles, unprocessed sediment samples in 50ml conical tubes, processed water samples for environmental DNA in Sterivex filters, and processed sediment samples for environmental DNA using Zymo's Xpedition Soil/Fecal miniprep kits. I divided the unprocessed samples between the freezer and the 37° room, and I'll save my notes on the filtered water samples for another article.

One of the unusual things about the Xpedition miniprep kit is that the first spin column is not a DNA binding column; it's more like a crap-catcher. So, you are supposed to keep the flow-through, not discard it as you would with a DNA binding column. John got a little ahead of himself, and discarded the flow-through from four columns before he realized the protocol was different from, well, just about all of the other DNA extraction mini-preps on the market. Fortunately, I collected many extra samples. Also, when I split the work between John and myself, I split up the samples into evens and odds, so that neither of us would be working on all of one group of replicates.

This led to an important lesson : Do not discard the lysis tubes after you've removed the supernatant. It occurred to me that the wreckage of beads, muck and buffer at the bottom of the spent tubes was probably full of DNA, so I added 500 μL of molecular-grade water, vortexed them, and put them back into the centrifuge at 10,000g for a minute, and spun the supernatant through the orange-capped columns. Two of the four yielded plenty of DNA. I'd probably have gotten more if I'd used lysis buffer instead of water, and the bead-beater instead of the vortexer.

I'm still not totally sure what rationale to apply for the last step. The Xpedition miniprep lets you elute the DNA with anywhere from 10 to 100 μL of buffer. If you use less elution buffer, you get less total DNA, but the DNA you get will be at higher concentration. Elute with more, and you get more DNA, but at lower concentration. The actual amount of DNA can vary over four orders of magnitude, and so guessing right is very helpful. But... impossible. I decided to elute in 50 μL, and that seems to have worked OK for my purposes.
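As a back-of-the-envelope way to look at that trade-off, here's a toy model. The saturating recovery curve (and its half-volume parameter) is purely my assumption for illustration, not Zymo's published chemistry:

```python
# Toy model of the elution trade-off: recovery (the fraction of bound
# DNA that comes off the column) rises with elution volume, but
# concentration falls as the same DNA spreads into more buffer.
# The recovery curve is an assumption, chosen only to show the shape
# of the trade-off.

def eluted(total_ng, volume_ul, half_volume_ul=15.0):
    """Return (ng recovered, ng/uL) for a given elution volume,
    assuming a saturating recovery curve V / (V + half_volume)."""
    recovery = volume_ul / (volume_ul + half_volume_ul)
    mass = total_ng * recovery
    return mass, mass / volume_ul

for v in (10, 50, 100):
    mass, conc = eluted(2000.0, v)
    print(f"{v:>3} uL -> {mass:6.0f} ng total at {conc:5.1f} ng/uL")
```

Under any model of this general shape, eluting small gets you concentration at the cost of total yield, which is why 50 μL is a reasonable hedge when you can't guess the input.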

I then measured the DNA concentration in a Qubit fluorometer with Invitrogen's Quant-iT high sensitivity assay for dsDNA. Because this requires me to go one-by-one, this is not how I would like to quantify my samples in the future. But, for thirty-seven samples, it was easy enough.

Sample μg/mL Source Description
1 0.723 Don Edwards Wildlife Refuge Salt crystals from site A23
2 0.582 Don Edwards Wildlife Refuge Salt crystals from site A23
3 0.531 Don Edwards Wildlife Refuge Salt crystals from site A23
4 0.824 Don Edwards Wildlife Refuge Salt crystals from site A23
5 0.209 Don Edwards Wildlife Refuge Salt crystals from site A23
6 27.8 Don Edwards Wildlife Refuge Mat community from site A23
7 - Don Edwards Wildlife Refuge Mat community from site A23 (field processing failed)
8 2.57 Borax Lake Sediment (poor collection)
9 44.9 Borax Lake Sediment
10 44.9 Borax Lake Sediment
11 25.1 Borax Lake Sediment
12 47.1 Borax Lake Sediment
13 48.2 Borax Lake Sediment
14 41.2 Borax Lake Sediment
15 41.2 Borax Lake Sediment
16 40.9 Borax Lake Sediment
17 26.1 Borax Lake Sediment
18 31.6 Borax Lake Sediment
19 - Borax Lake Sediment (lost during extraction)
20 81.6 Borax Lake Sediment
21 35.6 Borax Lake Sediment
22 35.9 Borax Lake Sediment
23 - Borax Lake Mat community from hot spring
24 - Borax Lake Mat community from hot spring
25 3.13 Borax Lake Mat community from hot spring
26 44.0 Borax Lake Mat community from hot spring
27 - Borax Lake Mat community from hot spring (salvaged sample)
28 - Borax Lake Mat community from hot spring
29 2.76 Borax Lake Mat community from hot spring (salvaged sample)
30 - Borax Lake Mat community from hot spring
31 - Borax Lake Mat community from hot spring (salvaged sample)
32 - Borax Lake Mat community from hot spring
33 0.72 Borax Lake Mineralized mat community from hot spring
34 22.8 Borax Lake Mineralized mat community from hot spring
35 100 Borax Lake Mineralized mat community from hot spring
36 29.2 Borax Lake Mineralized mat community from hot spring
37 39.2 Borax Lake Mineralized mat community from hot spring (salvaged sample)

For the 19 samples that had DNA concentrations above about 20 μg/mL, I ran a gel to check the size distribution. It looks like the Zymo miniprep performed about as well as they claimed; most of the fragments seem to be between 5 and 10 kilobases, with a fair amount of DNA in fragments larger than 10 kilobases.

I only need about a picogram of input DNA for each transposase tagmentation library, and I only need fragments bigger than about 3 kilobases. So, this process exceeds my absurdly modest requirements by a lot.

I should mention that Anna-Louise Reysenbach graciously lent me a pH probe to use in the field after mine turned out to be dead as a doornail. Issac Wagner, a postdoc in her lab, spent a couple of hours helping me get their field probe calibrated with my meter. Unfortunately, their probe turned out to be in only somewhat better condition than mine, and Anna-Louise asked that I leave it in Portland rather than risk taking bad data with it. I drove directly from Borax Lake to the UC Davis Genome Center in about seven hours, and immediately took the chemical measurements on our benchtop pH meter. It didn't work out, but I still greatly appreciate the help from Anna-Louise and Issac! (Also, thanks goes to my little sister Anna, who has to take Portland's MAX over to Portland State to return the ailing pH probe.)

Spoon to Bench: A field DNA processing gadget review

Posted by Russell on September 30, 2011 at 1:24 a.m.
In my previous article, I outlined my plans for sequencing a very large number of metagenomes. Assuming that works, there's also the problem of actually getting the samples in the first place. Aaron Darling likes to begin the story of metagenomics by saying, "It all begins with a spoon..."

So, how do you get the microbes from the spoon to the laboratory?

One of the things I learned from my experience in Kamchatka was just how tricky collecting samples in the field really is. From lining up permissions and paperwork, to dealing with cantankerous Customs officials, to avoiding getting mauled by bears, the trip from the spoon to the bench is fraught with difficulties. If you mess it up, you either don't get to do any science or you'll end up doing science on spoiled samples.

And then there is the DNA extraction. My lab mate Jenna published a paper last year where she created synthetic communities from cultured cells, and then examined how closely metagenomic sequencing reproduced that community. She found that the community representation was heavily skewed, and that the DNA extraction methodology was critically important. Because it was very difficult to know how well the extraction process was going to work on hot spring sediment, Albert Colman's group basically brought every DNA extraction kit they could lay hands on to Kamchatka. Also, they brought a whole lab with them; a 900-watt BioSpec bead beater (that almost killed our generator), a centrifuge, mini-fuge, a brace of pipetters, gloves, tips, tubes, tube racks, and a lab technician to run the show (see my Uzon Day Four post to see a little of that; also, most of the heavy crates in the photos).

Albert, Bo and Sarah really did an excellent job pulling all of this together, but it was hard. Watching them (and helping them where I could) got me to think very carefully about how I want to conduct my field research. One thing is for sure; as much as I respect our BioSpec bead beater, I am not going to carry it into the field. Period. In fact, if I can possibly manage it, I am going to restrict my supplies and equipment to what I can carry in a daypack.

I'm still working on how I will do water sampling, but I think I might have found a solution to sediment sampling at the ASM meeting in New Orleans. Zymo Research just came out with a line of DNA extraction kits designed specifically for field collection. The idea is pretty straightforward; they combined a DNA stabilization buffer with a cell lysis buffer, and made a portable, battery-operated bead beater to go with it.

It's super cool, but I hemmed and hawed for a few months after ASM. I was a little suspicious of my own judgement; the system includes a cool gadget, and so of course I wanted it. I spent a month reading protocols and tinkering around before I finally decided that if the system works the way Zymo claims, it's just about the best thing for my purposes. What clinched it was re-reading Jenna's paper, which clearly shows the importance of thorough cell disruption.

So, I finally decided that I had to give it a try, and that's what this article is about. If you like, you can think of it as a parody of the tedious gadget reviews on Gizmodo and Engadget, with maybe a dollop or two of Anandtech's penchant for brain-liquefying detail.

I guess this wouldn't be proper gadget review unless I started with a meticulous series of photos documenting the unboxing. So, uh, here are the boxes.

The big one contains the sample processor, and the two smaller ones contain 50 DNA extraction mini-preps each. I'm going to leave the mini-prep kits sealed for now, since I'm going to use them for my field work. Zymo provides two DNA extraction mini-kits with the sample processor, so I'm going to use those to test out the system.

Underneath the documentation (directions are for suckers) and the mini-kits, there is the sample processor, a charging station, a 12 volt lithium ion battery pack, and an international power adapter. They also provide some little disks, which I think are for use with conical tubes (they recommend using skirted tubes, since conical tubes can shatter), and a couple of pairs of earplugs.

The earplugs turned out to be... prescient.

The sample processor itself is a modified Craftsman Hammerhead Auto Hammer. Upside? I can buy extra batteries from Sears! Downside? Seeing the $71.99 pricetag from Sears really makes Zymo's $900 pricetag hurt. Our super-powerful bench-top BioSpec bead beater is only about twice that.

When I asked, Zymo said that they've actually modified some of the internals of the Craftsman tool, but this might have just been to discourage me from traipsing off to the hardware store to buy some PVC pipe fittings and a hacksaw. Experience tells me, though, that I could easily fritter away $800 worth of time replicating their engineering. OK, $700. It's a really nice international power adapter.

I was a little disappointed to note that the Craftsman part is made in China. Not that I have anything against things being made in China, but I was under the impression that Craftsman was an American brand. It's a little like discovering that a jar of authentic-seeming salsa is made in New Jersey, or something. I'm sure they make perfectly good salsa in New Jersey. Nevertheless, I have a deep-seated belief that salsa should be made in a Southwestern state by grandmothers who each know five hundred thousand unique salsa recipes, and Craftsman tools should be made in Pennsylvania or West Virginia by guys who wear blue overalls and carry their lunches in pails.

OK, so maybe I do have something against everything being manufactured in China. While using the sample processor in the lab, it suddenly made a very loud click that I hadn't heard before. When I looked carefully, I noticed that there was a piece of metal debris caught in the motor vent. It seems to be made out of aluminum (it's not ferromagnetic). My guess is that this is debris from the manufacturing process, not a broken part of the device. I shook out two other smaller pieces, but lost them before I could photograph them. It looks like the three pieces are part of a square. Most likely this is the remains of an improperly handled punch-out, like a metal version of a paper chad. As you can see, it got kicked around inside the motor housing until it was ejected into the vent. I think Craftsman (or their subcontractor) should get the blame for this, rather than Zymo.

Here is the soil/fecal mini-kit. Each prep uses three sets of spin columns. The bead bashing tubes, as they are labeled, are in the upper right, along with two tubes of lysis/stabilization buffer and a tube of elution buffer.

The protocol says to add the sample first, and then add 750 μL of lysis/stabilization buffer, and then bead-beat. But... then you would have to bring a p1000 and tips along with you. No thanks. The sample tubes and the beads had better be chemically stable, or they'd wreck everything. So, I aliquoted the buffer into the bead tubes before leaving the lab, and left the p1000 behind. Zymo includes some very fancy spin columns with this kit; they have their own caps, and little nubs on the flowthrough channels that you need to snap off before you use the columns. I've not encountered anything quite like these.

The final step of the kit includes these green-capped columns that are pre-filled with buffer. I wasn't expecting any liquid to be in them, and so of course I spilled the first one on my foot. Don't do that.

So, I took a little miniature field expedition to the exotic environs of the Putah Creek Riparian Reserve to try this out. It didn't take long to find a place that promised to have plenty of microbes.

Here's a soil sample before processing.

I processed some of these samples for 45 seconds (the directions recommend a minimum of 30 seconds). Usually it seems to work fine, but occasionally the tube explodes and splatters mud and buffer all over the inside of the lysis chamber.

The exploding tube problem appears to be caused by grit preventing the threads from closing correctly. In other words, it was my fault. Be extra careful to get the dirt actually inside the tube. Here's what it's supposed to look like.

After processing, the samples are noticeably warm. If you are going to process for much longer than 45 seconds, I suggest you stop and let the sample cool for a few minutes before continuing.

Here are the yields I measured for the mini-kit preps (minus the tube that exploded), eluted into 100 μL of buffer.

Source Yield
Potted plant 80.6 μg/mL
River muck 0.669 μg/mL
River muck 1.13 μg/mL
River muck 0.595 μg/mL
I messed up the extraction protocol a little bit (and I used too much elution buffer at the end), but still got enough DNA to work with. Not too shabby for a first try.

I decided I had to throw these samples and DNA away because I don't actually have permission to use samples collected on UC Davis's campus. That's also why I'm not showing a gel.

How to sequence 10,000 metagenomes with a 3D printer

Posted by Russell on September 19, 2011 at 1:15 a.m.
For my thesis project, one of the things I would like to do is sequence many different samples, perhaps on the order of several hundred or thousand. It's easy enough to build sequencing libraries these days, at least with Illumina, anyway. Obviously, doing a couple of hundred lanes of Illumina sequencing would be ridiculous (not even Jonathan Eisen is that nice to his graduate students), and so I'll be using several barcoded samples pooled into each lane. The barcoding chemistry itself was fairly tedious, until people started doing transposon-based library construction.

A transposon is a little piece of DNA that copies itself around inside the genome of an organism, via an enzyme called transposase. Here's what the genetic element looks like :

Transposase binds the element at the inverted repeats on either end, and coils it into a loop. Then it cuts the DNA at the inverted repeats, and the complex floats away. It leaves complementary overhanging ends in the chromosome, which are usually repaired by DNA polymerase and DNA ligase (DNA gets broken surprisingly frequently in the normal workaday life of a cell; that's why DNA repair mechanisms are so important). When it's complexed to DNA, transposase grabs the DNA like this :

The transposase we're using (Tn5) is a homodimer; the two subunits are in dark and light blue. The inverted repeats (red) are bound to the complex at the interfaces between the subunits. The pink loop is the DNA that gets cut and pasted.

The complex then floats around in the cell until the transposase recognizes an integration site somewhere else in the genome. It then cleaves the DNA and inserts the payload into the break. DNA ligase then comes along and fixes the backbones. You can see why this kind of transposon is also called a cut-and-paste transposon.

The reason these are interesting for library construction is that you can prepare a transposon complex where the loop of payload DNA is broken. When the transposon integrates, it pastes in a gap. If you add a lot of transposons that aren't too choosy about their binding sites, they will chop up your target DNA. Fragmentation is one of the steps needed for sequencing library construction. What's nice about transposons is that when you use them to chop up your target DNA, they leave the two halves of their payload stuck onto the ends.

If you stuck your sequencing adapters on there, the fragmentation process also includes adapter ligation. If you added barcodes along with the sequencing adapters, the reaction combines almost all of the library construction into a single digest. Epicentre whimsically named this process "tagmentation." Get it?

However, there's still a fly in this ointment. The distribution of transposon insertions is a function of the relative concentrations of charged transposon complexes to target DNA, and DNA extraction, even from seemingly identical samples, can have highly variable yields. So, it's very important to control the input concentrations and reaction volumes during the digest. This is fairly easy if you're only making a dozen or so libraries, but what if you want to make ten thousand of them?

Measuring DNA concentrations of lots of samples is relatively easy, and there are lots of ways of doing it. We have a plate reader that can do this by fluorescence on titer plates with 1536 wells, or we could (ab)use the qPCR machine to give us DNA concentrations on 384 well titer plates. There are other ways, too.

However you quantify the DNA concentrations, you have to dilute each sample to the desired concentration before you can start the tagmentation process. If you get the concentrations wrong, the library comes out funny.
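The dilution itself is just C1V1 = C2V2. A hypothetical helper (my own sketch, not part of any kit protocol) for working out how much diluent each quantified sample needs might look like:

```python
def diluent_volume(stock_ng_ul, target_ng_ul, dna_ul):
    """Volume of water/buffer to add so that dna_ul of stock ends up
    at target_ng_ul.  From C1*V1 = C2*(V1 + V_diluent):
    V_diluent = V1 * (C1/C2 - 1)."""
    if stock_ng_ul < target_ng_ul:
        raise ValueError("sample is already below the target concentration")
    return dna_ul * (stock_ng_ul / target_ng_ul - 1.0)

# e.g. 5 uL of a 44.9 ng/uL prep brought down to 2.5 ng/uL
print(diluent_volume(44.9, 2.5, 5.0))  # ~84.8 uL of diluent
```

The catch, as the post says, is not the arithmetic but doing this accurately for thousands of wells at once.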

A few dozen library constructions call for hours of tedious work at the bench. I've gotten better at wetlab stuff since my first rotation, and the transposon-based library construction helps a lot, but staking my Ph.D. on reliably powering through lots of molecular biology would be a bad idea. Some people might not blink an eye at this, but as soon as I find myself repeating something four or five times, my computer science upbringing starts whispering there has got to be a better way in my ear. And lo, there is indeed a better way.

Hundreds or thousands of library constructions would call for a robotic liquid handling machine. I spent some time researching these things, and I'm not impressed. The hardware is nice, but programming the protocols involves wading into a morass of crumbling, poorly maintained closed source software, expensive vendor support contracts, and a lot of debugging and down-time. Oh, and they're terrifyingly expensive, and can be kind of dangerous.

Dispensing water into titer plates doesn't seem like a very challenging robotics application, so I thought about building my own robot. It would probably be about the same amount of work as ordering, programming and debugging one of the commercial robots, and it would be more fun.

But, robots are just such a mainframe-ish solution. If there is one thing my dad taught me, it's that a lot of little machines working in concert will beat the stuffing out of a single big machine. The trick is figuring out how to organize and coordinate lots of little machines. The key to this problem is to do lots and lots of little reactions in parallel; the coordination requires lots of precise dilutions simultaneously. Getting this part right would crack the whole thing wide open, allowing you to easily do more reactions than you probably even want.

So. I'm going to make my own custom microtiter plates, just for the dilution. This satisfies the coordination criterion, and allows me to treat a plate-load of reactions identically. If each well has the right volume for the dilution, I can just fill all the wells up to the top, pipette in the same volume of raw DNA with a multichannel pipetter, let the DNA mix a little, and all the wells will be at equal concentration. Then I pipette that into the tagmentation reaction, and I'm done. With a good multichannel pipetter, I can do 384 reactions about as easily as I could do one.

All that's necessary is a 3D printer, and the ability to procedurally generate CAD/CAM files from the measured DNA concentrations. As it happens, this is really easy, thanks to a little Python library called SolidPython :

These are the wells of a 96-well plate with randomly chosen volumes for each well.
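Turning a target dilution volume into a printable well is just cylinder geometry (1 μL = 1 mm³). A sketch; the well diameter below is my rough guess at a standard 96-well round-bottom well, an assumption rather than a measured spec:

```python
import math
import random

def well_depth_mm(volume_ul, diameter_mm=6.85):
    """Depth (mm) of a cylindrical well holding volume_ul.
    Uses 1 uL = 1 mm^3; the default diameter is an assumed,
    approximate 96-well dimension."""
    radius = diameter_mm / 2.0
    return volume_ul / (math.pi * radius ** 2)

# demo: 96 wells with randomly chosen dilution volumes
random.seed(0)
volumes = [random.uniform(50, 200) for _ in range(96)]
depths = [well_depth_mm(v) for v in volumes]
print(f"depths range from {min(depths):.1f} to {max(depths):.1f} mm")
```

A real model would use the plate's actual well profile (round-bottomed, tapered), but the per-well computation is this simple at heart.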

One of the things I'm worried about is contamination. 3D printers are not really designed for making sterile parts. So, what I've done here is design a mold, and I'm going to cast the plate itself in PDMS silicone elastomer. PDMS is easy to cast, and it has the nice property of being extremely durable once it's set. And, even better, when exposed to UV, the surface depolymerizes and turns into, essentially, ordinary glass. I can autoclave the heck out of it, blast it with UV, and indulge in all manner of molecular paranoia.

If I can figure out a way to reliably sterilize thermoplastic, I'll skip the business with the PDMS casting, and simply print microtiter plates directly, like this :

By the way, I used the dimensions of a Corning round-bottom 96 well microplate. You can download the model from my account on Thingiverse.

So, I ordered a personal 3D printer. It looks like the hottest Open Source personal 3D printer right now, and the only one with a build volume larger than a titer plate, is the Ultimaker. I'd have really liked to have gone with MakerBot Industries' Thing-o-Matic, but the build volume is just a skosh too small. Come on, guys! Just a few more millimeters? Please?

Unfortunately, the Ultimaker has a four to six week lead time, so I have to wait for a while before ours arrives. At the suggestion of Ian Holmes, I headed off to Noisebridge, a hackerspace in San Francisco's Mission District where they have a couple of 3D printers available for people to use. The machines are Cupcake CNCs, MakerBot's first kit. The ones at Noisebridge are... well, let's just say they are well-loved. The one I used had to be re-calibrated before it would go. 3D printers are pretty straightforward machines when it comes down to it, so it only took me a couple of minutes of poking around at it to figure out how to make the right adjustments. Then, it worked like a charm!

As you can see, I was a bit conservative about the design, since I wasn't sure how good the print quality would be (especially after my cack-handed ministrations).

I'm experimenting with PDMS casting now, but I'm going to try some tests to see how thoroughly I can clean thermoplastic with UV. I'd really like to just order up a nice 384 well plate, and get right to it!

Anyway, I need to thank (or perhaps blame) Aaron Darling for getting me interested in transposon-based library construction, and for pointing out their significance to me.

New Equipment Thursday

Posted by Russell on August 25, 2011 at 7:40 p.m.
My vacuum desiccator arrived today, and so naturally I put it to productive use. You know. For science.

Haw! This thing is cool.

Sneak Pique

Posted by Russell on July 13, 2011 at 3:41 a.m.
I'm about to release a new piece of Open Source software; it's a fast, fully automated and very accurate analysis package for doing ChIP-seq with bacteria and archaea. I'm doing my best to avoid the sundry annoyances of bioinformatics software; it uses standard, widely used file formats, it generates error messages that might actually help the user figure out what's wrong, and I've designed the internals to be easily hackable.

I have two problems, though.

Most of the people who will be interested in using this software are microbiologists and systems biologists, not computer people. At the moment, the software is a python package that depends on scipy, numpy and pysam. I used setuptools, wrote tests with nose, and hosted it on github. If I were going to distribute it to experienced Linux users, it's basically done. However, installing these dependencies on Windows and MacOS is a showstopper for most people -- scipy in particular. So, how would you suggest distributing it to MacOS and Windows users?

The second problem is... it's kind of ugly. I wrote the GUI in Tk, which is not particularly great in the looks department. Should I bother creating native toolkit GUIs? Would that make people significantly more comfortable? Or would it be a waste of time?

Questions of microbial ecology

Posted by Russell on April 27, 2011 at 10:50 p.m.
When the first environmental sequencing projects were conducted, the genetic breadth present within an environmental sample so far outstripped the available sequencing capacity at the time that it was only possible to obtain a tiny slice of the genetic material present. This gave researchers two choices; either target a particular gene, or go fishing. Both approaches have been extremely fruitful. Targeted studies of ribosomal RNA led to the discovery of the archaea, among other important accomplishments. The "fishing" approach (which has a shorter history) has also led to exciting discoveries. If you do a literature search for your favorite enzyme with the word "novel," it's quite likely that most of the recent publications will involve some kind of metagenomic survey.

As the cost of sequencing continues to plummet, a third approach to environmental sequencing has suddenly become possible: Exhaustive sequencing. It should be possible not only to survey the entire genomes of the organisms present (although assembling them is another story), but also to survey the population-level variability of the organisms present. This is a rather unprecedented development. Microbial communities have suddenly gone from being among the most challenging ecologies to study, with only a handful of observable characters, to offering a spectacularly detailed quantitative picture.

Here is an example from one of my datasets :

This is a small region in the genome of Roseiflexus castenholzii. I have mapped reads from an environmental sample to the reference genome, yielding an average coverage of about 190x. If you look closely at the column in the middle (position 12519 in the genome, in case you care), you can see clear evidence of a single nucleotide polymorphism in this population of this organism.

As it happens, this coordinate falls in what appears to be an intergenic region, between a phospholipid/glycerol acyltransferase gene on the forward strand to the left and a glycosyl transferase gene on the reverse strand to the right. The two versions appear with roughly equal frequency in the data. For this organism, I've found single nucleotide polymorphisms at thousands of sites. There are also insertions and deletions, and probably rearrangements.
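Once the reads are mapped, calling a rough allele frequency at one column is simple. A toy sketch, using a made-up pileup column shaped like the roughly 50/50 site described above (the real analysis would of course come from the actual alignment, with quality filtering):

```python
from collections import Counter

def allele_frequencies(bases):
    """Fraction of each base observed at a single alignment column."""
    counts = Counter(bases)
    total = sum(counts.values())
    return {base: n / total for base, n in counts.items()}

# invented pileup column: ~190x coverage, two alleles at ~50/50
column = "A" * 96 + "G" * 94
freqs = allele_frequencies(column)
print(freqs)
```

At 190x coverage, even a variant at a few percent frequency shows up as several reads, which is what makes the population-level picture possible.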

In this ecosystem, I'm able to get between 50x and 300x coverage for almost every taxon present. This should make it possible to see variants that make up only a percent or two of their respective taxon's population. With data like this, it should be possible to do some really beautiful ecology!

For example, suppose one wanted to see if a community obeys the island biogeography model. One could measure the theory's three parameters, immigration, emigration and extinction, by comparing the arrivals and disappearances of variants between the "mainland" and the "island" over time. The ability to examine variants within taxa should make these measurements very sensitive. Additionally, because these are genomic characters, it should be possible to control for the effects of selection (to some extent) by leveraging our knowledge of their genomic context. The 12519th nucleotide of the R. castenholzii genome is perhaps a good example of a character that is unlikely to be under selection because it happens to sit downstream from both flanking genes.1

So, here is my question to you : What ecological model or process would you be most excited to see studied in this way?

1 Well, actually I haven't looked at this site in detail, so I'm not sure if one would or wouldn't reasonably expect it to be under selection. My hunch is that it is less likely to be under stringent selection than most other sites. I'm basing this hunch on eyeballing the distance of this locus from where I think RNA polymerase would be ejected on either side, and that both transcripts terminate into its neighborhood. My point is that it should be possible to have some idea of how selection might operate on a particular locus based on its genomic context. One should take this with the usual grain of salt that accompanies inferences drawn solely from models. A better example would be a polymorphism among synonymous codons, but I wasn't able to find one in a hurry.

Bioengineering side project

Posted by Russell on February 22, 2011 at 5:10 p.m.
I've been working on a little bioengineering side project, and I just finished putting together a working version of the firmware. It'll probably take some refinement, but I've managed to get the microcontroller to do what I need it to do -- measure visible light irradiance over a wide range of intensities.

This is the light intensity in microwatts per square centimeter measured at about a 0.3 second resolution. I haven't done any of the actual bio- part of the bioengineering, so for the moment the light curve is the beginnings of a sunset at Mishka's cafe.

I'll post more about this once I have the prototype working. Next up, 3D printing!

A sequencer of our own

Posted by Russell on January 27, 2011 at 4:12 p.m.
We just finished running our new GS Jr. gene sequencer for the first time. It produced 115,698 shotgun reads of our E. coli. Here is the read length histogram :

And the GC content histogram :

This was our first time going through the shotgun library protocol, which is pretty involved. For example, we're going to have to be more careful next time when we load the picotiter plate. We got a few bubbles trapped in there. It's kind of funny how obvious the bubbles are in the raw fluorescence images (this is an A, around cycle 200) :

I've uploaded the FASTA file and the qual file, in case you want to try to assemble your own E. coli genome.
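If you grab the FASTA file, a read-length histogram like the one above takes only a few lines of Python (the 50 bp bin width here is an arbitrary choice of mine):

```python
from collections import Counter

def read_length_histogram(fasta_lines, bin_width=50):
    """Histogram of read lengths from FASTA text, binned by bin_width."""
    hist = Counter()
    length = 0
    for line in fasta_lines:
        line = line.strip()
        if line.startswith(">"):
            if length:
                hist[(length // bin_width) * bin_width] += 1
            length = 0
        else:
            length += len(line)
    if length:  # don't forget the final record
        hist[(length // bin_width) * bin_width] += 1
    return hist

# tiny invented example with two reads (400 bp and 440 bp)
fasta = [">r1", "ACGT" * 100, ">r2", "ACGTACGT" * 55]
print(sorted(read_length_histogram(fasta).items()))
```

Pointing the same function at the real file is just a matter of passing `open("reads.fasta")` instead of the toy list.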

Fun with de Bruijn graphs

Posted by Russell on October 29, 2010 at 4:34 a.m.
One of the projects I'm working on right now involves searching for better approaches to assembling short-read metagenomic data. Many of the popular short read assembly algorithms rely on a mathematical object called a de Bruijn graph. I wanted to play around with these things without having to rummage around in the guts of a real assembler. Real assemblers have to be designed with speed and memory conservation in mind -- or, at least they ought to be. So, I decided to write my own. My implementation is written in pure Python, so it's probably not going to win any points for speed (I may add some optimization later). However, it is pretty useful if all you want is to tinker around with de Bruijn graphs.
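The core of the idea fits in a few lines of pure Python. This is my own minimal sketch, not the implementation described above: nodes are (k-1)-mers, edges are k-mers, and both the sequence and its reverse complement contribute edges:

```python
from collections import defaultdict

COMPLEMENT = str.maketrans("acgt", "tgca")

def revcomp(seq):
    """Reverse complement of a lowercase DNA string."""
    return seq.translate(COMPLEMENT)[::-1]

def debruijn(seq, k):
    """De Bruijn graph of seq (and its reverse complement) in k-mer
    space: each k-mer contributes an edge from its (k-1)-mer prefix
    to its (k-1)-mer suffix."""
    graph = defaultdict(set)
    for s in (seq, revcomp(seq)):
        for i in range(len(s) - k + 1):
            kmer = s[i:i + k]
            graph[kmer[:-1]].add(kmer[1:])
    return graph

g = debruijn("gggctagcgtttaagttcga", 4)
for node in sorted(g):
    print(node, "->", ", ".join(sorted(g[node])))
```

With a short sequence and small k the graph is just two unbranched paths; repeats and sequencing errors are what tangle it up in real data.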

Anyway, here is the de Bruijn graph for the sequence gggctagcgtttaagttcga projected into 4-mer space :

This is the de Bruijn graph in 32-mer space for a longer sequence (it happens to be a 16S rRNA sequence for a newly discovered, soon-to-be-announced species of Archaea).

It looks like a big scribble because it's folded up to fit into the viewing box. Topologically, it's actually just two long strands; one for the forward sequence, and one for its reverse complement. There are only four termini, and if you follow them around the scribble, you won't find any branching.

New ceramics

Posted by Russell on May 21, 2010 at 7:36 p.m.
I just retrieved my latest ceramics from the studio today, and I'm really happy with how they turned out. These are the first four of a series of twelve pieces glazed the same way, with a turquoise underglaze, a vibrant blue topglaze, and a buff-mix claybody with an aluminum oxide treatment.

This is my favorite piece, and the largest thing I've finished so far. It holds about 1400 ml. It's a little big to eat out of, but a little too small to really use as a serving bowl. I think I will use it mostly as decoration, but it could make a nice serving basin for two people, or maybe for serving a side dish, or something like that.

Now I have six more things to finish before the quarter is over!

What Google knows

Posted by Russell on April 28, 2010 at 11:26 a.m.
After six months of using Google Latitude, I've amassed about 7108 location updates, or about 38 a day. It would probably be a lot more if I hadn't managed on occasion to break the GPS or automatic updating by fiddling with the software.

It's actually quite useful to have this data, especially if it's correlated with some richer information. For example, I've consulted the data to answer questions like, "Where was that awesome sandwich place I ate at last month?" It's also extremely useful to be able to share this data with Google because it allows me to quickly cross-reference location coordinates with Google's database of businesses and addresses. You can also download your complete location history in one giant blob (just ignore the warning that the History map only displays 500 datapoints, and download the KML file). Once you have the KML file, you can do whatever you want with it. For example, I uploaded mine to Indiemapper to map my wanderings for the last six months (Indiemapper is cool, but I quickly found that this dataset is really much too big for a Flash-based web application).
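If you'd rather slice the data yourself, the standard library is enough to pull the raw coordinates out of the KML file. A sketch, assuming the stock KML 2.2 <coordinates> elements (the two points below are stand-ins for the real history) :

```python
import xml.etree.ElementTree as ET

KML_NS = "{http://www.opengis.net/kml/2.2}"

def kml_points(kml_text):
    """Extract (lon, lat) pairs from every <coordinates> element."""
    root = ET.fromstring(kml_text)
    points = []
    for node in root.iter(KML_NS + "coordinates"):
        # KML packs whitespace-separated lon,lat,alt triples into one element.
        for triple in node.text.split():
            lon, lat, _alt = triple.split(",")
            points.append((float(lon), float(lat)))
    return points

sample = """<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark><Point>
    <coordinates>-121.74,38.54,0 -122.27,37.87,0</coordinates>
  </Point></Placemark>
</kml>"""
print(kml_points(sample))   # [(-121.74, 38.54), (-122.27, 37.87)]
```

From there it's easy to dump the points into any mapping tool that isn't choked by the size of the dataset.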

Not surprisingly, I spent most of my time in California, mostly in Davis and the Bay Area, with a few trips to Los Angeles via I-5, the Coast Starlight, and the San Joaquin (the density of points along those routes is indicative of the data service along the way).

The national map shows my trip to visit my dad's family in New Jersey and Massachusetts, as well as a layover in Denver that I'd completely forgotten about.

I have somewhat mixed feelings about this dataset. On one hand, it's very useful to have, and sharing it with my friends and with Google makes it more useful still. It's also cool to have this sort of quantitative insight into my recent past so easily accessible. On the other hand, I'm not particularly happy with the idea that Google controls this data. I chose the word controls deliberately. I don't mind that they have the data -- after all, I did give it to them. As far as I know, Google has been a good citizen when it comes to keeping personal location data confidential. The Latitude documentation makes their policy pretty clear :


Google Location History is an opt-in feature that you must explicitly enable for the Google Account you use with Google Latitude. Until you opt in to Location History, no Latitude location history beyond your most recently updated location if you aren't hiding is stored for your account. Your location history can only be viewed when you're signed in to your Google Account.

You may delete your location history by individual location, date range, or entire history. Keep in mind that disabling Location History will stop storing your locations from that point forward but will not remove existing history already stored for your Google Account.


If I delete my history, does Google keep a copy or can I recover it?

No. When you delete any part of your location history, it is deleted completely and permanently within 24 hours. Neither you nor Google can recover your deleted location history.

So, that's what they'll do with it, and I'm happy with that. What bothers me is this: Who owns this data?

This question leads directly to one of the most scorchingly controversial questions you could ask for, and there are profound legal, social, economic and moral outcomes riding on how we answer it. This isn't just about figuring out what coffee shops I like. If you want to see how high the stakes go, buy one of 23andMe's DNA tests. You're giving them access to perhaps the most personal dataset imaginable. In fairness, 23andMe has a very strong confidentiality policy.

But therein lies the problem -- it's a policy. Ambiguous or fungible confidentiality policies are at the heart of an increasing number of lawsuits and public snarls. For example, there is the case of the blood samples taken from the Havasupai Indians for use in diabetes research that turned up in research on schizophrenia. The tribe felt insulted and misled, and sued Arizona State University (the case was recently settled, with the tribe prevailing on practically every item).

You can't mention informed consent and not revisit HeLa, the first immortal human cells known to science. HeLa was cultured from a tissue biopsy from Henrietta Lacks and shared among thousands of researchers -- even sold as a commercial product -- making her and her family among the most studied humans in medical history. The biopsy, the culturing, the sharing and the research all happened without her knowledge or consent, or the knowledge or consent of her family.

And, of course, there is Facebook -- again. Their new "Instant Personalization" feature amounts to sharing information about personal relationships and cultural tastes with commercial partners on an opt-out basis. Unsurprisingly, people are pissed off.

Some types of data are specifically protected by statute. If you hire a lawyer, the data you share with them is protected by attorney-client privilege, and cannot be disclosed even by court order. Conversations with a psychiatrist are legally confidential under all but a handful of specifically described circumstances. Information you disclose to the Census cannot be used for any purpose other than the Census. Nevertheless, there are many types of data that have essentially no statutory confidentiality requirements, and these types of data are becoming more abundant, more detailed, and more valuable.

While I appreciate Google's promises, I'm disturbed that the only thing protecting my data is the goodwill of a company. While a company might be full of lots of good people, public companies are always punished for altruistic behavior sooner or later. There is always a constituency of assholes among shareholders who believe that the only profitable company is a mean company, and they'll sue to get their way. Managers must be very mindful of this fact as they navigate the ever-changing markets, and so altruistic behavior in a public company can never be relied upon.

We cannot rely on thoughtful policies, ethical researchers or altruistic companies to keep our data under our control. The data we generate in the course of our daily lives is too valuable, and the incentives for abuse are overwhelming. I believe we should go back to the original question -- who owns this data? -- and answer it. The only justifiable answer is that the person described by the data owns the data, and may dictate the terms under which the data may be used.

People who want the data -- advertisers, researchers, statisticians, public servants -- fear that relinquishing their claim on this data will mean that they will lose it. I strongly disagree. I believe that people will share more freely if they know they can change their mind, and that the law will back them up.


The EFF put together a very sad timeline of Facebook's privacy policies as they've evolved from 2005 to now. They conclude, depressingly :
Viewed together, the successive policies tell a clear story. Facebook originally earned its core base of users by offering them simple and powerful controls over their personal information. As Facebook grew larger and became more important, it could have chosen to maintain or improve those controls. Instead, it's slowly but surely helped itself — and its advertising and business partners — to more and more of its users' information, while limiting the users' options to control their own information.

Comcast melts in the rain

Posted by Russell on April 20, 2010 at 10:57 p.m.
For reasons I do not wish to fathom, my internet connection from home sucks whenever it rains. When I try to imagine why this might be the case, it calls to mind some truly horrifying images of what might be going on in Comcast's wiring closets.

How much does it suck? Well, here is a histogram of 200 ping times from my house to a machine at UC Davis, about 3000 feet from my front door. For comparison, I simultaneously collected 200 pings from my colo machine, which is 3000 miles away in Boston. The inbound and outbound packets from the colo go over Level3, so I've labeled it thusly.
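Collecting a histogram like that takes nothing more than the standard library. A sketch that buckets round-trip times scraped from ordinary ping output (the regex assumes the usual Linux ping format; the sample lines are made up) :

```python
import re
from collections import Counter

def latency_histogram(ping_output, bucket_ms=10):
    """Count ping round-trip times into bucket_ms-wide bins."""
    times = [float(m) for m in re.findall(r"time=([\d.]+) ms", ping_output)]
    return Counter(int(t // bucket_ms) * bucket_ms for t in times)

sample = """64 bytes from host: icmp_seq=1 ttl=52 time=9.31 ms
64 bytes from host: icmp_seq=2 ttl=52 time=152.7 ms
64 bytes from host: icmp_seq=3 ttl=52 time=11.0 ms"""
print(latency_histogram(sample))   # one ping each in the 0, 10 and 150 ms bins
```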

Now, I wouldn't really expect a residential cable modem connection to measure up very well against a colocated server in terms of latency, but this isn't just a failure to measure up. This is just a regular old fashioned failure.

What ticks me off the most is that I pay $636 a year for this crap, and that my only alternative is AT&T DSL. I'd rather shave my tongue with a used bayonet than see a penny of my income fall into the hands of AT&T. Why does broadband suck in America?

I believe, Sir, that I may with safety take it for granted that the effect of monopoly generally is to make articles scarce, to make them dear, and to make them bad.
- Thomas Babington Macaulay

A desirable extinction

Posted by Russell on March 25, 2010 at 3:19 p.m.
Some weeks ago, Buzz (my cat) escaped out my front door while I was carrying my bicycle into the apartment. For ten or twenty minutes, he romped through the ivy and bushes around my apartment while I followed him around rattling a bag of cat treats. Eventually, he let me pick him up and take him back inside. Naturally, he picked up a few fleas. Naturally, they have multiplied.

Oddly, the fleas don't seem to like Neil very much, nor do they like me. It's just poor Buzz that's beset by the nasty little critters.

Figure 1: A flea.

As it happens, I've been thinking about endogenous metrics for estimating the sampling quality of an environmental shotgun sequencing dataset, and Buzz's little problem presented an opportunity to play with a simplified problem. So, I have decided to make Buzz, or rather his fleas, into a small experiment in ecology. I am going to try to see if I can drive them into extinction.

Now, this is normally what a pet owner does when they discover their pet has contracted some sort of annoying parasite, but I decided to take a more quantitative approach.

Figure 2: A cat.

It's simple enough to count fleas on a cat, if the cat is willing to cooperate. Buzz loves the flea comb, and will gleefully hop onto the coffee table and wait to be combed if you show it to him. So, in the interest of science, I convinced my roommate to count the number of passes I made with the flea comb and how many fleas I captured (posterity will remember your efforts, Mehdi). Using his tally, I plotted the cumulative number of passes versus the cumulative number of fleas.

Figure 3: Fleas captured

As expected, it became somewhat more difficult to capture the next flea as more fleas were captured, suggesting a depletion curve. The value of the asymptote should be the actual number of fleas on Buzz at the time, and reaching that number would imply local extinction for the fleas. Of course, there are probably other fleas lurking about that would recolonize Buzz. In principle, if I were to repeat the exercise frequently enough, Buzz would become a sink for fleas, and their migration to his fur would gradually deplete them from the environment.

There are a couple of different ways to model the impact of the combing on the flea population, with various advantages and disadvantages. All we really want to do here is to estimate the value of the asymptote, and so a simple model is probably sufficient. I showed this data to my friend Sharon Shewmake, an economics graduate student. Sharon, after editorializing on the endeavor ("Ew."), suggested this very simple model.

Assume that Buzz is not going to sit still long enough for the fleas to reproduce, that no more fleas will migrate to his fur, and that the fleas already on his fur are going to stay put unless captured. Thus, there is a fixed initial population which only changes as a result of capturing fleas. Next, we assume that any given flea is equally likely to be captured on a single pass of the comb. So, the expectation value for the number of fleas captured on a single pass is the product of the current population and the probability of capturing a flea :

c = pN

where N is the population of fleas and p is the probability of any particular flea being captured on a single pass. One could tart this up a bit by modeling it as a stochastic process and executing a bunch of Monte Carlo trials until the outcomes converge, but that seems like overkill for a simple single variable problem like this. We will put up with the intellectual inconvenience of capturing fractional fleas.

This is a little easier to see if we let N represent the number of fleas remaining on the cat, rather than the number of fleas captured :

N(k+1) = N(k) - pN(k)

If we stretch our credulity far enough to imagine this as a continuous function, we can express it as a differential equation :

dN/dt = -pN

Sorry if this bothers you. Not only are we extracting fractional fleas, but we are now modeling the combing process as a sort of flea-killing-combine continuously mowing its way through the fur. This is a model, so you shouldn't be surprised to find massless rope and spherical cows. Anyway, it has a nice easy solution.

Well, what the heck. This is a decaying function, so let's pluck a minus sign out of the exponential factor, and maybe tack on a scale factor for the initial population :

N(t) = N0 e^(-pt)

While we're at it, why don't we go back to letting the function stand for the number of fleas captured, rather than the fleas on the cat :

C(t) = N0 (1 - e^(-pt))

This gives us a nice function to use for a regression. A little help from scipy, and we find that the initial population is estimated at 39.7 fleas, and the decay factor is 0.011.
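Those fitted numbers are easy to sanity-check against the rest of the post. A sketch with the fitted values plugged in (the variable names, and the Monte Carlo comparison at the end, are mine) :

```python
import math
import random

# Fitted values from the regression.
N0 = 39.7     # estimated initial flea population
lam = 0.011   # per-pass capture probability ("decay factor")

def expected_captured(k):
    """Model prediction: cumulative fleas captured after k comb passes."""
    return N0 * (1.0 - math.exp(-lam * k))

# The model predicts about 34 captures in the 173 passes actually made...
print(round(expected_captured(173)))   # -> 34

# ...and that roughly 400 passes are needed before the expected number of
# fleas still on the cat drops below one half.
passes_needed = math.log(N0 / 0.5) / lam
print(math.ceil(passes_needed))        # -> 398

# The stochastic version dismissed above as overkill: comb a population of
# 40 fleas, each captured with probability lam on each pass.
random.seed(0)
def monte_carlo(passes, n=40, p=lam):
    remaining, captured = n, 0
    for _ in range(passes):
        caught = sum(1 for _ in range(remaining) if random.random() < p)
        remaining, captured = remaining - caught, captured + caught
    return captured

mean = sum(monte_carlo(173) for _ in range(1000)) / 1000
print(round(mean))   # agrees closely with the closed-form prediction
```

The deterministic shortcut and the Monte Carlo trials land on the same answer, which is why the simple model is good enough here.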

Figure 4: Flea population

I captured 34 fleas, so that means I missed about five or six. In order to be reasonably confident that I'd captured all 39 fleas, I would have had to continue for about 400 passes with the comb, instead of 173. Buzz is a patient cat, but he started to lose interest around 120 passes, and had to be fetched back onto the coffee table a few times during the last 50 passes. My guess is that 400 passes would require some kind of sedative. On the other hand, he does seem to like Guinness, so there may be something to that.

Science has been served. I'm going to the pet store to buy some flea collars.


Posted by Russell on March 13, 2010 at 3:56 p.m.
A few weeks ago, my dad sent me this really nice espresso machine to cheer me up. Actually, he sent it to me because it was his birthday. He's a pretty awesome dad that way -- I only sent him a book.

I'm still getting the hang of getting a decent pull of espresso out of it. I've found that my burr grinder doesn't quite go fine enough for espresso, so I'm going to have to take it apart and see if I can adjust the grinding wheels so they're closer together. Anyway, here is my latest effort :

TJ's is coming to Davis

Posted by Russell on March 07, 2010 at 9:04 p.m.
Somehow this escaped my notice. Trader Joe's is building a new store in Davis! And right across the street from me, no less. Hurry up and bring the good eats, TJ's.


Posted by Russell on March 03, 2010 at 12:13 p.m.
To my surprise, I learned this weekend that Davis has a lively population of burrowing owls, Athene cunicularia. How cool is that?

I took some nice shots, but Jonathan has a 300mm zoom, and I don't.

They're one of the few species of owl that is active during the day, though I think these guys were only awake to watch various chattering bipeds on the hiking trail. They seem comfortable with people getting within about 30 feet of their burrows, so you can get pretty close. If you go any closer, they start to do the "I don't like you" dance. If you ignore the display and keep getting closer, I'm not sure if they would run into their burrows or have at you with their claws and beaks. Owls will mess you up, even these little guys. At least they're polite enough to warn you, so heed the warning.

UC Davis, meet the internet.

Posted by Russell on January 12, 2010 at 1:42 a.m.
One of the wonderful things about web applications is that they can be available 24-7. They can sit there and quietly do their jobs -- taking orders, billing credit cards, assigning work orders, or whatever -- even when the office is empty. That's one of the main reasons one would go to the trouble of putting a web front end on something in the first place.

So, the question of the evening is, does the UC Davis registrar know this?

Um... no.

This is a completely automated process. If you do this during the daytime, it just goes ahead and populates a table in whatever chthonic legacy database system is swaddled in this blob of early 1990's vintage web programming. It's not like having the office open at the time actually helps.

Attention shoppers! It's 4:45 Central Time, and the store will be closing for the day in 15 minutes! Please complete your order before the site is disconnected for the evening. We will open again tomorrow at 8:30 A.M. Thank you for shopping at!

On the upside, at least it doesn't complain that my browser isn't supported. Yay.

A review: Sasha's Soup Club

Posted by Russell on December 06, 2009 at 11:05 p.m.
One of the cool things about living in Davis is the amazing number of unique creative enterprises and projects people open up to the public. I suppose Davis is a comfortable place for people to try things out.

If you live in Davis, you should try Sasha's Soup Club. After a few weeks of envying the tasty lunches my labmates were enjoying, I joined the mailing list. I just received my allotment of Leek and Potato soup, delivered by Sasha herself.

What can I say? It's damn good soup, exactly as described. The flavor of the potatoes and leeks both stand out nicely. Whatever else is in it, the other flavors are there to make a nice background.

I get nervous about making things with so few flavors. When I aim to make a simple soup, it will usually end up with six or seven different ingredients with strong flavors. If one of them comes out a little weak, you can still enjoy the others.

My general approach to hobbies is massive over-engineering. This is why the computer desk I built for my mother is rated for 7200 pounds (I tested it by stacking dead tractor engine blocks on top of it). I know that I'll never make a living as a chef or as a furniture builder. But if I build something, goddamnit, it's not going to fall down. So, when I make soup, or a sandwich, or a salad, I keep adding ingredients that I'm sure will taste good until something in my head says, "Yup, it'll hold."

It's greatly reassuring to me that there are people who know how to make awesome things with simple economy. I know I can make potato leek soup myself; I made some just last week. It was good, but then again, anything would be good if you loaded it up with enough garlic, onions, cheese, olive oil, peppercorns and sea salt. I wouldn't have had the confidence to make this soup.

Now, the only problem is not eating it all before I have a chance to gloat over it at lunch.

First lab rotation

Posted by Russell on November 26, 2009 at 4:24 p.m.
Now that I have a free moment between chopping potatoes and mashing them, I figure I should post the paper and talk I wrote summarizing my first lab rotation.

I tried to make the paper look like a PNAS article, but alas, their LaTeX template leaves much to be desired. I like how the talk turned out a little better, thanks to the wonderful Beamer package for LaTeX.

Speaking of science

Posted by Russell on November 26, 2009 at 10:38 a.m.
One of the projects I've been working on this semester is designing a teaching unit for the UC Davis introductory biology curriculum. The introductory biology courses are required for a whole bunch of different majors at UC Davis, so all three of them are offered every quarter. It's a huge effort, and thousands of students take these classes every year. This is part of a teaching seminar led by Scott Dawson (don't bother Googling for him -- you'll find the wrong guy). The idea behind Scott's seminar is to take the half-dozen or so concepts that the BIS2A, 2B and 2C students struggle with, design teaching units aimed at those topics, and then try them out on volunteers. My topic is stochastic processes, which has been great fun. The project isn't finished yet, so I'll save the details for later.

One of the other issues we've been addressing in the seminar is how scientists relate to non-scientists. This is, for obvious reasons, an essential teaching skill. Even if they hope to be scientists someday, students are not scientists. If you don't find a way to talk with them about science, then you're wasting their money and their time.

The idea that the educator is largely responsible for the success (or failure) of the student hasn't really seeped into higher education, although it's been the standard thinking in primary and secondary education for decades. Not all elementary school teachers are good at what they do, but it is generally agreed that if they are good, the results will be seen in the subsequent success of their students. In higher education, things don't really work this way.

The most often cited reason for poor instruction at the college level is that many professors consider teaching secondary to their research. While this is clearly true in many cases, teaching in higher education doesn't just suffer from playing second fiddle to research. Many, many professors (even whole departments) who take teaching seriously are nevertheless not very good at it.

There are two causes, both of which are systemic problems. First of all, people who teach at the college level are usually not trained as teachers. Many (most?) professors have no education training whatsoever. Yet, even if you have natural skills, teaching isn't something you can do effectively without at least a little theory and training.

The result is that most of the teaching in colleges is done by amateurs and autodidacts. In contrast, at the primary and secondary level, teaching has been a job for trained professionals since the turn of the last century.

The second problem, which is partly a symptom of the first, is regular old-fashioned chauvinism. It is the responsibility of the student to learn, but many professors fail to see how they fit into this. This might be acceptable at a private, endowment-supported institution, but such places are exceptions. The Harvards and Oxfords of the world are free to treat their students however they like, but public institutions are ultimately responsible to the taxpayers. The taxpayers support such institutions for two reasons; to conduct research, and to educate their kids. Sink-or-swim pedagogy is a dereliction of duty.

This is a problem that extends far beyond the classroom. I was listening to NPR on the drive down to Los Angeles, and caught a story on All Things Considered about the reception of Darwin's On the Origin of Species. Some extracts :

"That fraction of people who figured that they could and should keep more or less up to date with what was happening in geology, in botany, in zoology, even in physics and mathematics is a much bigger fraction than it is today," says Steven Shapin, a Franklin L. Ford Professor of the History of Science at Harvard University.


"We hear about scientific findings," says Shapin. "But the proportion that can evaluate them and follow along with them, as opposed to hearing about them, is very, very small."

Shapin says that since people can't be completely conversant with the relevant science, "They're looking for an answer to the question, 'Who can we rely on? Who's speaking the truth? Who can we trust?' "

I think the good professor is missing the point. The problem is not simply that science has gotten more complicated and technical. It is true that there is more of it, and that it moves faster. The reason I don't buy Dr. Shapin's argument is that this is not at all unique to science. Everything moves faster and is more technical now than in 1859, and people seem to cope just fine.

The problem is that scientists do not spend enough time talking with the general public. Only a small minority of scientists take the trouble to arrange their findings in a form digestible by a lay audience, as Darwin did. When they do, it is almost never cutting-edge research that fills the pages. Very few scientists go on television or the radio. The practice today is to bring research to the lay audience only when it is neatly tied up (or the research community feels that it is, anyway). There are those who do otherwise, but there is a negative stigma to it; scientists who announce their findings with press releases instead of peer-reviewed papers are usually regarded with suspicion.

Darwin's target audience for Origin -- the typical educated Briton in 1859 -- would not have had much of an advantage over the average American in 2009. A Victorian gentleman would probably have had better handwriting and more patience for trudging through elliptical turns of phrase than an American high school graduate, but I don't think he would have had much of an advantage when it came to comprehending an unfamiliar scientific topic. The advantage Darwin's audience had was that it had Darwin.

When a good teacher notices that a student is failing to learn something, they will look first at their own teaching methodology for the problem. The same goes for scientists; when the general public doesn't understand or care about a scientific topic, a good scientist should look first at how they are publicizing their work. If the public doesn't think your research is important, then either you aren't explaining it well enough, or maybe it actually isn't very interesting.


Posted by Russell on October 24, 2009 at 5:58 p.m.
Looks like I got my gene to grow in E. coli!

The colonies that didn't get the plasmid I'm using to carry MXAN7396 turn blue when grown with X-gal (bromo-chloro-indolyl-galactopyranoside). The ones that got the plasmid don't.



Posted by Russell on October 21, 2009 at 4:04 a.m.
I finally got through the double-PCR phase of the protocol without wrecking something. Yay!

My hacked up version of the gene is getting snipped up with everyone's favorite restriction enzymes (BamHI and EcoRI). Then I get to splice it into a plasmid, and electroporate the plasmids into some cells, and maybe they will do something interesting.

The cloning blues

Posted by Russell on October 19, 2009 at 8:43 p.m.
I've been doing my first laboratory rotation in Mitch Singer's lab, and trying to learn what people are actually doing when they publish these spiffy experimental results. So far, I've mostly been wrecking things. Fun disasters of the week :
  • Wrecked a DNA extraction by grabbing the wrong pipetter and putting 300 microliters into a tube instead of 3.
  • Misread an illegible label and used butanol instead of ethanol, destroyed second attempt at the aforementioned DNA extraction.
  • Dropped the wrong tube in the trash, screwed up the third attempt at the aforementioned DNA extraction.
  • Kept a gel on the UV bench too long while trying to chop out little cubes with a razor blade, annihilated all the DNA, and screwed up fourth attempt at aforementioned DNA extraction.
  • The PCR cycler didn't close correctly, and my reaction tubes evaporated; screwing up fifth attempt at aforementioned DNA extraction. (At least this one wasn't my fault.)
I'm now spending the evening in the lab running everything over again, for the sixth time. Yays!

I'm definitely sticking to informatics -- that part of the rotation is going pretty well. I'm just not cut out for benchwork.

SmartMeter data from PG&E

Posted by Russell on October 05, 2009 at 1 a.m.
PG&E is no angel, but it deserves credit for making what looks like a good-faith effort to get California off of carbon. It's serious enough to have quit the US Chamber of Commerce to distance itself from the group's extremist views on climate change.

PG&E still owns six coal burning power plants, curiously located in Florida, New Jersey and Pennsylvania (presumably it uses them to swap power with other generators). It generates about 46% of its electricity from hydroelectric dams.

Rucker Creek dam, a small PG&E facility in Nevada County

One of the more interesting projects PG&E is undertaking is improving the resolution of its demand monitoring using SmartMeters. There is a lot of hype about the "Smart Grid," but basically it boils down to realtime use monitors, like these :

that are wired up to report the data somewhere. It's basically an off-the-shelf Tweet-A-Watt.

According to the PG&E web site, they are using SmartSynch meters, which use TCP/IP over some kind of wireless network. It's difficult to find information about the hardware itself, probably on account of the assorted idiots wetting their pants about people h4X0ring their refrigerators (actually, I don't know if Bill Mullins is an idiot, but his article about smart meters is depressingly typical).

Yes, it is possible for a bad person to break into your PG&E account to obtain this data.1 But so what? Power meters are inductively coupled to the circuit they measure. They can look, but they cannot touch. IOActive, a security research firm, claims that they can break into certain smart meters and "cut off power." I suppose we are meant to construe this as "cut off power to the house," but that isn't what power meters do. That is what those huge knife switches, with the lock-out-tag-out rings, are for. I'm skeptical that a certified electrician would work on a residential circuit with a computer controlled on-off switch. I certainly wouldn't. What "cut off power" probably means is that they can shut down the microcontroller, and stop the meter from collecting or reporting data. We're left to speculate, though, because the report is confidential. I speculate that they are hyping a buffer overflow exploit to gain as much attention as possible.

Nobody is going to h4x0r your refrigerator and reprogram it to be an E. coli chemostat. If you are worried about your personal data floating around on the big bad internets, your worries are better directed at your bank and your health insurance provider. The bad guys don't care that you left your bathroom light on all night last Thursday; they just want the routing number for your savings account.

While the data isn't very valuable for nefarious purposes, it is extremely valuable in the noble (if mundane) pursuit of frugality. Here's what PG&E shows you if you've been upgraded to a smart meter :

Having the graphs is neat, but the usability of the site is poor. Fortunately, they let you download the data as CSV files, although you have to go a week at a time. It's all very 1995. Happily, Google is working on a real-time data browser tool called Power Meter which will make this a lot nicer. For now, I just wish I had an XML-RPC interface.
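Stitching the week-at-a-time exports back together is straightforward with the csv module. A sketch -- the column names and row format here are assumptions for illustration, not PG&E's actual headers :

```python
import csv
from collections import defaultdict
from io import StringIO

def hourly_usage(csv_text):
    """Sum kWh per hour-of-day across however many weekly exports you concatenate."""
    totals = defaultdict(float)
    for row in csv.DictReader(StringIO(csv_text)):
        hour = int(row["Time"].split(":")[0])   # "22:00" -> 22
        totals[hour] += float(row["kWh"])
    return dict(totals)

# Hypothetical rows in the assumed format.
sample = ("Date,Time,kWh\n"
          "2009-09-29,22:00,0.41\n"
          "2009-09-30,22:00,0.05\n"
          "2009-09-30,23:00,0.07\n")
print({h: round(v, 2) for h, v in hourly_usage(sample).items()})
# {22: 0.46, 23: 0.07}
```

An aggregate like this is exactly how the late-night dip during the Granlibakken trip shows up.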

I've already learned something from this data. On the 29th and 30th, I was at the Granlibakken conference center for the UC Davis Host Microbe Interaction conference. Those days show dramatically less power use between about 22:00 and 2:00, which is when I'm usually hacking at my desktop machine. One more reason to start thinking about replacing this behemoth.

1. Actually, it's stupidly easy to gain access to someone's PG&E account if you have their account number. Just create a new web account, type in the account number, and there you go! Now you can really fuck with them by paying their bill, which is about all you can do with a PG&E account.

Premises regrettably lacks belfry, cave

Posted by Russell on September 18, 2009 at 1:25 a.m.
I was going out to the garden to wash off the cat-litter-scooper this evening, and I noticed a tiny bat clinging to my screen door. It was hanging out about knee level, and I was worried my cats would take an interest in it. So, I tried to shoo it away, but it just chirped at me and clung onto the screen harder.

I thought maybe it was hurt (or worse, sick), so I captured it in a plastic bowl to observe. It didn't do anything to evade capture, and allowed itself to be sort of gently scooped up by the edge of the bowl. It walked around a little and chirped, but didn't try to escape.

Since it didn't seem to be interested in flying around the apartment, I transferred it to the lid of the bowl, where it allowed itself to be photographed. I put a bead of water near its nose, which it prodded a little but didn't seem to drink.

I brought it outside again to see if I could get it to fly away. I held the lid out over a soft patch of ground and lowered it quickly; it spread its wings but didn't fly. I tried a few more times, and got it to fly as far as the fence. Finally, some tapping on the fence convinced it to flap away.

Does anyone know if this is normal behavior for this kind of bat?

A busy month

Posted by Russell on September 09, 2009 at 8:06 p.m.
This summer has turned out a bit like a chocolate bar in the sun; it started out nice and neat, but ended up squished into the bottom of the wrapper.

Right now, I'm working with Andrey Kislyuk on our little piece of the DARPA FunBio project. We're in the middle of a two-week code sprint, so I'll save that for a later post.

I also moved to a new apartment, and that didn't go nearly as smoothly as it could have. The guy we subleased from was in the process of buying a house, and the loan underwriter decided to yank back the money after he'd closed escrow (or was in escrow, or something). Evidently they wanted a sworn affidavit from the gardener that he was contracted to take care of the grounds. Anyway, the upshot was that instead of a nice leisurely move, he got stuck in the apartment for three weeks longer than he expected, and I was homeless for a week. Fortunately, one of the staff scientists in our lab was generous enough to let me stay at his apartment. Neil and Buzz got to learn about stairs, which they evidently adore.

Over Labor Day weekend, I went with Srijak and some of his friends from San Diego on a day hike at Lassen Volcanic National Park. I've always loved California, but it's nice to be reminded from time to time exactly why I love this place so much.

Because it's awesome.

Summertime things

Posted by Russell on July 30, 2009 at 12:47 a.m.
Here are a couple of random cool things that aren't quite enough for a post on their own, so I'll stick them together.

It's been absurdly hot in Davis. Since the summer started, I've lost about nine days of productivity on account of my brain being too hot to function. By the time I get to the heavily air conditioned Genome Center building, I spend the rest of the day wanting to stick my head in a bucket of ice water.

Happily, the evenings tend to be very pleasant. And no, I'm not going to take Nate Silver up on his challenge. Good on you, Nate.

On Saturday, the Mondavi Center hosted Dengue Fever for a free concert on the quad. They are really great live! Chhom Nimol got all the little kids in the audience to come up on stage and dance. It was a great show.

My labmate Lizzy just adopted an adorable rescue puppy of unknown origin named Dweezil. He is very sweet, and already very well adapted to life with humans. He seems to love everybody, but Lizzy especially.

Meanwhile, my own rescue animals continue to puzzle me. Why does Buzz like to sleep behind my monitor? It's hot, and the cutter on the tape dispenser keeps poking him in the head and causing him to emit annoyed grumbling noises and squirm around. There are lots of comfy places he could sleep, but he likes this spot for some reason.

What the hell?

Posted by Russell on July 25, 2009 at 5:17 a.m.
A guy in my apartment building just fell down the stairs right in front of my bedroom window. I put my glasses on and saw a puddle of blood slicking the concrete walkway, about ten feet from my pillow. About half the blood was coming out of his forehead, the other half from a seep in his shirt where his collarbone is.

I went outside to see if he was moving. He wasn't. He didn't respond when I spoke to him. So, I did the logical thing -- I grabbed my phone and I called 911.

And it fucking crashed. So, I tried again, and it crashed again. I was in the process of ripping out the SIM card and charging up my old phone when the Davis 911 dispatcher called back. The good news is that the EMTs were fast. As soon as the dispatcher hung up, I stepped out to the street to wait for them, and I could already see the lights coming up the street.

So, listen here Google, T-Mobile and HTC: FUCK YOU. Fix your shit.

OMG snake.

Posted by Russell on July 04, 2009 at 2:22 p.m.
After it started to cool down yesterday, I went for a bike ride around the research farms, and I saw what I thought was a seam in the bike path. It wasn't until I was almost on top of it that I noticed that it was the shadow of a snake crossing the path! I ran over the poor guy before I could hit the brakes.

Fortunately, the snake survived, and went slithering into his hidey hole in the roots of one of the huge trees that line the bike path. I used a stick to touch the end of his tail to make sure his spine wasn't broken, and he reacted in about the way you would expect a not-run-over snake to react.

Sorry this isn't a very good picture. Since T-Mobile pushed out the Android Cupcake upgrade, my phone has been ridiculously, pathetically slow. It took almost a minute and a half to get the camera application open and snap a picture. By that time, the snake had spent 30 seconds slithering around on the bike path checking itself out (which would have been an awesome shot), and then gone about 20 feet into the grass. Boo Android! Fix your shit!

The snake was about four feet long and about the width of two fingers. The head was sort of bullet-shaped, as opposed to shovel-shaped, so it's probably not a viper. My guess is garter snake.

Bike safety column in print

Posted by Russell on June 19, 2009 at 4:55 p.m.
My article on bike safety is in print in the Davis Enterprise! I was invited to write this column as a follow-up to my Davis Crash Map article and Anna's accident in 2007.

It's not up on their web site yet. I'll update with a link once they post it.

:: update ::

Here is the text of the article :

The Davis Enterprise: June 19, 2009
Davis Bicycles! column #20
Title: When road design gets personal
Author: Russell Neches

Two years ago my little sister was riding her bicycle to a friend’s house. A woman was driving home from work. They met when the car hit Anna at 30 mph.

Before I go further, Anna is OK.

The weeks following the accident were hard. Aphasia, hematoma, and dental prosthesis became a regular part of family conversation. It was a month before we were sure she would get better.

Anna lives in Norman, Oklahoma. Norman is a lot like Davis; it’s roughly the same size, population and distance from the state capital. Norman hosts a big university and encourages bicycling.

After the accident, I desperately wanted someone to take responsibility. At first, I blamed Anna for not being more careful. Then I read the police report, and blamed the driver. But when I visited Norman and stood by the splashes of dried blood on the asphalt, I found I couldn’t blame either of them. The blame belonged to the road itself.

In sharp contrast to Davis, Norman has some of the sloppiest road design in America. The road where the accident happened has no curb, no sidewalk, no lane markings, no lights, and no center divider. The street is a smear of asphalt that informally fades into gravel and scrubby grass on its way to becoming front yard. This wasn’t some lonely country road. It happened downtown, right next to the University of Oklahoma. The equivalent spot in Davis might be about Seventh and E Streets. Until Anna’s face slammed into the windshield, the driver had no way of knowing for sure that she was driving on the wrong side of the road.

Davis does a pretty good job when it comes to road design. Even out amongst the farms, most of the roads have reflectorized lines to mark the center and shoulders. This isn’t because paint is cheaper in California. It’s because public officials have found that the lines help people be safer drivers.

With Anna’s final round of reconstructive surgery still in the works, I hope I can be forgiven for being preoccupied with bicycle safety. I’m a scientist. When scientists get worried, we go back to the data. Mapping the last couple of years of Davis accident reports indicates that the biggest problem spot in our town is the much-debated Fifth Street corridor.

It has been proposed to transform the stretch of Fifth Street north of downtown from a higher-speed four-lane road with frequent stops into a lower-speed two-lane road with center turn pockets. The design would look somewhat like B Street does now. I was surprised to learn that the two roads carry about the same amount of traffic.

Not everyone likes the idea, and some warn that slowing traffic may result in congestion. This must be taken seriously, and so detailed computer models have been constructed. The models show that the proposed design would actually increase throughput and reduce congestion somewhat.

This counterintuitive result is something with which I have personal experience. I grew up in Los Angeles, the poster city for congestion. It got that way because people tried to solve congestion problems by adding lanes. What we got for our billions of dollars was even worse congestion. LA has more acreage under roads than under destinations, and yet it is still asphyxiated.

Roads are ancient technology. Roman engineers would find California’s freeways impressive, but would learn little from them. But even ancient technology can be improved. We didn’t get from swinging stone axes to landing robots on Mars by refusing to try new things. Lane reduction has been tried in other cities, with great results for safety and efficiency.

The proposed Fifth Street design sounds like something worth trying. It will make Davis a safer, more efficient place to walk, bike and drive. Repainting and installing different signals is part of the normal process of maintaining and improving roads. The proposal would simply guide this process. If it doesn’t work, the city has more paint. My family learned the hard way just how important lines of paint really are.

I’ve made an interactive map displaying the last couple of years of Davis accident data. I hope it will inspire you to think about how our roads are designed, how those designs succeed, and how they can be improved.

— Russell Neches is a microbiology graduate student at UC Davis. He has commuted to school and work through Los Angeles, New York and Boston on various vehicles including bikes, cars, trains, subways and on foot.

:: update 2 ::

Here is the direct link to the article on the Davis Enterprise website :

Another solar transit

Posted by Russell on June 03, 2009 at 6:01 p.m.
Now that she's all graduated and stuff, Mimi came out to Davis for my birthday this week. She took me to this place in Napa called, of all things, Ubuntu. Not only is it a nice word in Zulu, but it's also a Linux distribution, a restaurant and a yoga studio. Anyway, they were the runner up for best new restaurant of 2008 in the New York Times Food & Wine section.

It was delicious, but kind of difficult to describe. Evidently, the chef walks out into his garden each morning, peers at the ripening ingredients, and invents the day's menu based on what's ready to eat. Neat!

Anyway, Mimi is here in Davis until Friday. We're having a great time biking around town and doing Davis-y things.

Bike safety in Davis

Posted by Russell on April 25, 2009 at 11:18 p.m.
I've been tinkering with a little data visualization applet for looking at bicycle crash data in Davis, and I thought this map might be interesting to people. This image was generated with Google Maps and a heatmap overlay generated with a gheat tile server.

This is for 168 bicycle accidents that happened between 2004 and 2006. I have a lot more data, but 95% of the work in this little project involves parsing and renormalizing it. Evidently, police reports are not written with data processing in mind! I suppose that makes perfect sense. An officer at the scene of an accident probably has things on her mind besides generating a nice, easy to parse data point for future analysis. The priority seems to be completeness, rather than consistency. My parsing code, for example, has to be able to correctly detect and calculate distances measured in units of "feeet".
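The flavor of that normalization work can be sketched in a few lines of Python. Everything here is hypothetical: the unit table, the misspellings it tolerates, and the free-text format are stand-ins for whatever the actual police reports contain.

```python
import re

# Spellings (and misspellings) of units seen in free-text distance fields,
# mapped to a conversion factor into feet. These variants are illustrative.
UNIT_TO_FEET = {
    "feet": 1.0, "feeet": 1.0, "ft": 1.0, "foot": 1.0,
    "yards": 3.0, "yd": 3.0,
    "meters": 3.28084, "m": 3.28084,
}

def distance_in_feet(text):
    """Parse a free-text distance like '25 feeet' into feet, or None."""
    match = re.match(r"\s*([\d.]+)\s*([A-Za-z]+)", text)
    if not match:
        return None
    value, unit = float(match.group(1)), match.group(2).lower()
    if unit not in UNIT_TO_FEET:
        return None
    return value * UNIT_TO_FEET[unit]

print(distance_in_feet("25 feeet"))  # 25.0
```

Returning `None` for anything unrecognized, rather than guessing, is the point: with inconsistent source data you want the parser to flag records for a human rather than silently mangle them.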

I'll release the applet here once I make an interface for it (and get the rest of the data imported). Stay tuned.

Awesome police report

Posted by Russell on April 24, 2009 at 3:23 a.m.
I'm not sure if I'm glad or sorry to have missed this conversation.

Fun with My Tracks, an accident, and Biking in Davis

Posted by Russell on April 21, 2009 at 5:40 a.m.
I was biking home today, and I decided to take a detour to enjoy the warm evening (and to avoid the not enjoyable warm apartment). About half way around the Davis Bike Loop, I remembered that I wanted to try out My Tracks. Here's the result :

After wandering off the Bike Loop a bit, I decided to head home. I was biking down Russell Blvd., and I witnessed a very scary car accident. The accident happened where I stopped recording the track, at the red marker. A guy in a cherried-out lifted F-150 was sitting at the traffic light (that's the point where I turned around). When the light turned green, he floored it. According to the other witnesses, he was racing with someone, or trying to catch someone who had cut him off. I couldn't see the other car because it was behind his gigantic stupid truck.

What I did see, though, was that he accelerated continuously until he reached the next intersection (the red marker), where he had a head-on collision with a girl in a 1990's Honda Civic trying to make a left turn. His engine was deafeningly loud even a block away, and I heard it roaring and down-shifting right up until the crash.

Looking at the damage to her car, it looked like he basically ran it over. The lift kit on the truck put his undercarriage about level with her roof, and there were even little ladders installed to climb up to the doors. After he ran over the Civic, he swerved around a bit, jumped the median, sideswiped a small SUV in the oncoming traffic, spun 180 degrees, and snapped his axle. When the axle snapped, I heard his engine redline for half a second and then cut.

Happily, nobody was hurt. The girl in the Civic was pretty much petrified, though. She was convinced that the accident was her fault because she didn't get out of the way.

I told her this was nonsense; the truck was going more than double the speed limit, and I'm pretty sure he didn't have his lights on (it was dusk, but not completely dark yet). She asked me about five times, "How much do you think it will cost to fix?" I told her, "Cost you? Nothing. He was committing maybe a dozen moving violations, and probably racing someone. His insurance company will probably be so happy not to have to pay medical bills that they will buy you a whole new car."

Maybe she could have been a little swifter completing her turn, but it's a busy street and there is a lot of pedestrian and bicycle traffic (it parallels a bike path). Making a quick turn is probably not a good idea. Or, maybe she could have waited until this asshole passed, but, as I pointed out, he was going maybe 50 or 60 in a 30 zone, and accelerating. She timed her turn right for reasonable traffic flow, but didn't account for total maniacs among the oncoming traffic. It would have been difficult to judge when he would reach the intersection she was turning through.


As it turns out, Davis has been thinking about redesigning this stretch of Russell Blvd. for several years. If you look at the proposed design, it would have made this accident impossible or unlikely. You can't race on a one lane road, and a landscaped median would have prevented the second collision.

HOWTO: Repair a broken Brompton chain tensioner

Posted by Russell on April 13, 2009 at 12:56 a.m.
A little while ago, I was riding home on my wonderful Brompton bicycle, and the chain tensioner suddenly disengaged. That had never happened before, and I discovered that the gear on the chain tensioner had completely smashed to bits. Here's what it looked like :

After puzzling about it for a while, I think I understand what happened. I use the same chain oil on my Brompton that I use on my racing bike. The "oil" is actually a mixture of a heavy lubricant in a volatile solvent. The solvent evaporates after coating the chain, and dissolves whatever gunk has accumulated. I think the solvent damaged the plastic. I've seen this happen with some plastics when they come into contact with gasoline. The gasoline dissolves the plasticizing agents, and leaves behind an open matrix of molecules, like a very, very fine sponge. The open matrix has a huge surface area and oxidizes rapidly. Your nice flexible plastic turns into something hard and crumbly, like a stale cookie.

That's what I think happened here. The remaining bits of plastic are still relatively flexible, but the bits that broke off have turned into a powdery mess.

The guy who sold me my bike offered to let me buy an idler wheel off of one of the bikes in his stock, but I didn't want another plastic gear. Here is what I built :

I bought a standard anodized aluminum derailleur gear from a local bike shop, and attached it to the Brompton chain tensioner arm with a few pennies worth of standard hardware. The new idler wheel (gear? cog?) slides along a little stainless steel tube I picked up at the hardware store and cut to length. This gives it enough play to allow for easy shifting. The tube has just the right tolerance to allow the gear to spin very easily, but not wobble.

Here's the exploded view :

From top to bottom, the parts are :

  • regular old bolt
  • two washers
  • stainless tube
  • gear
  • another washer
  • lock washer
  • nut
The gear is part of a Shift Biscuits pulley set (though probably any pulley set would work), and the steel tube is 7.94mm x 7.4mm (stock No. 7117) cut to length with a hacksaw (this took a while). The rest of the hardware is standard hardware store stuff.

I had to saw off the plastic axle tube on the chain tensioner arm because it would have prevented the idler wheel from sliding into the right position for the outer gear. I chose a bolt with a hex-head that fit snugly into the socket on the tensioner arm (similar to the bolts on the toggles for locking the frame in place). Once the nut is tightened against the lock washer, the axle is extremely rigid. The gear slips across the tube with almost zero play.

The shifting action is actually much smoother than it was with the plastic gear, and the bike seems to make a little less noise than I remember (that could be my imagination).

Yay! I've got my bike back, and without another dorky plastic gear, too. Neat!

Warm spell

Posted by Russell on January 13, 2009 at 8:52 p.m.
I was biking to the gym yesterday, and it occurred to me that since it was so beautiful outside, I may as well just keep biking. So, instead of sitting on the stationary bike, I did a three and a half hour ride down some random county road. It was awesome, even with my funny little commuter bike.

Along the way I passed this pathway planted with olive trees through the middle of one of the UC Davis research farms.

Note to self: Plant more olive trees.

Second quarter at Davis

Posted by Russell on January 09, 2009 at 2:41 a.m.
This quarter, I'm taking :
  • Mathematical Methods : Laplace transforms, Fourier transforms, Green's functions, and their applications to partial differential equations.
  • Quantum Mechanics: Again. For the heck of it.
  • Numerical Methods: Analysis of the performance, stability and error propagation of numerical algorithms in finite precision systems.
So far, it's way more fun than last quarter. Still, I'm disappointed that the molecular structures course I wanted to take conflicts with numerical methods. Hopefully I can take it next quarter.

The preliminary exam for mathematical methods is in the middle of finals week at the end of this quarter. That is going to suck.

This One for That One

Posted by Russell on November 04, 2008 at 1:01 p.m.
A lot of people fought long and hard to give me the right to a secret ballot. This time around, though, I'm happy to show it to anyone who cares to look.

I like this ballot system much better than the InkaVote thing they have in LA, and much better than any kind of computerized bullshit. I spent four years using computers to design fusion reactors, but I sure as hell don't trust them with an election. Pen and paper, thanks.

First actual week of grad school

Posted by Russell on September 24, 2008 at 1:28 a.m.
When it rains, it pours! I've been hanging around all summer wondering what to do with myself, and now suddenly there is a huge explosion of activity. After a four day expedition to Davis to sign paperwork at Graduate Studies, register for classes, look for a job, look for an apartment, and get a sense of what it will be like to live here, I drove back to Los Angeles for my cousin Nathalie's wedding (Sunday) and my great aunt's 88th birthday (Monday). Both were beautiful. Then I packed my car and drove back to Davis this afternoon.

Tomorrow, I have a job interview and tour for an on-campus job working in one of the Department of Entomology greenhouses. Not quite as good as a TA position, but it would pay for rent and get me outside and moving around on a regular basis.

I still haven't found a place to live yet, but a very nice fellow from my department is letting me crash on his living room floor until I do. Classes were supposed to start Thursday, but evidently the professor for that course isn't here yet.