The GeoCities Cage at Exodus Communications


The following pictures were taken with my [cheap] digital camera at the GeoCities cage in Exodus Communications' "Wyatt" data center on 5/7/1999.


This is what the inside of the cage looks like. Nearly all of the machines in the cage are Sun Ultra 2s. Attached to nearly all of these boxes are dual sets of MTI JBODs ("Just a Bunch Of Disks"), either 2500s or 8300s. The newer JBODs (the 8300s) make up the majority of the storage arrays in the cage.

Nearly all of the (15+?) aisles in the cage look identical to this; there are nine shelves per rack -- each rack holds three U2/dual-JBOD clusters.

As you can see, there are raised floors throughout the entire data center. At the far end of this aisle, you can see the chain-link fence which separates our cage from Exodus's other customer cage(s).


The hardware is arranged in the racks so that the machines are installed face-to-face and back-to-back. This is a view of an aisle of machines that are back-to-back. Holes in the raised flooring and in the ceiling above pipe cooled air through all of the hosts in the racks.

Most of the cables you see in this picture are SCSI, network (front and back end), console, and power. With roughly 120 clusters and 12TB of storage space, lots of cable is needed. Although it looks like nothing but spaghetti, it's actually fairly well managed...
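
(As a rough sanity check on those numbers: if nearly every one of the ~120 clusters has two JBODs of twelve 9GB drives, that's about 120 x 2 x 12 x 9GB = ~26TB of raw disk, which the RAID-10 mirroring described below cuts roughly in half -- right around the 12TB quoted above, assuming that figure is usable, mirrored space.)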


These are closer shots of some of the U2/JBOD clusters. These are MTI 8300 JBOD units, housing twelve 9GB drives each. They are striped and mirrored in a software RAID-10 configuration using Veritas Volume Manager.
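
For the curious, building one of these volumes with Volume Manager looks roughly like the sketch below. The disk group name, volume name, size, and column count are made up for illustration -- the real configuration varies from cluster to cluster.

    # Create a striped-and-mirrored (RAID-10 style) volume across the JBOD disks
    vxassist -g webdg make webvol 50g layout=mirror-stripe ncol=6 nmirror=2

    # Sanity-check the resulting plex/subdisk layout
    vxprint -g webdg -ht webvol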

Since early 1998, GeoCities's site architecture has been such that if one of the web servers goes down, the data contained on its local disks (the JBODs, right) becomes unavailable to the outside world. We have begun moving toward high-capacity, high-availability "Network Appliance" hardware file servers (left), which can be mounted by any number of web servers via NFS. While this does not completely remove the "Single Point of Failure" from the site, average uptime has improved thanks to the fault tolerance built into the NetApp filers, such as redundant power supplies, data striping with parity checking (RAID-4), and the "Write Anywhere File Layout" ("WAFL") filesystem.
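
A filer mount on one of the web servers looks something like this (the filer name, volume, and mount point below are invented for the example):

    # /etc/vfstab -- mount the NetApp volume over NFS at boot
    # device           fsck  mount point   type  pass  boot  options
    netapp1:/vol/web   -     /export/web   nfs   -     yes   rw,bg,hard,intr

    # or, by hand:
    mount -F nfs -o rw,bg,hard,intr netapp1:/vol/web /export/web

Since the same volume can be mounted read/write on many hosts at once, losing a single web server no longer takes its slice of member data offline with it.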

Each GeoCities server is multi-homed, so it exists on two LANs. One is the "public" (or "front-end") connection, which is used to serve internet requests to the outside world. Each machine also has access (through a second NIC) to a "private" (or "back-end") network, which is used for inter-machine communication -- primarily NFS traffic. There are four Cisco Catalyst 5500 switches installed in the GeoCities cage, which handle all of the network traffic (both public and private) for the site. As you can imagine (and see from these pictures), this equates to a whole bunch of ethernet cables. Cable management gets increasingly difficult each time a new box is added to the mix.
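
On the Solaris side, dual-homing a host is mostly a matter of a second interface definition. A rough sketch -- the interface names, hostnames, and addresses here are invented for the example, not our real numbering:

    # /etc/hostname.hme0 -- public ("front-end") interface
    www12

    # /etc/hostname.hme1 -- private ("back-end") interface, NFS and admin traffic
    www12-be

    # /etc/hosts
    192.0.2.12   www12       # front-end (example address only)
    10.1.1.12    www12-be    # back-end (RFC 1918 private space)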



However, when there's time to plan and it's done right, it can also be quite nice. This is the other extreme, and it's a testament to the anality of the Exodus staff. Is this some of the most beautiful cable management you've ever seen, or what??!!?

The above picture shows some of the newer patch panels, which connect the hosts to the Cat 5500 switches.

The above-right picture shows two of GeoCities's newest switches, Geo-C and Geo-D.

To the right, you can see a rear view of some of the patch panels. The blue bundles of ethernet cable run up to the tops of the racks, then out across the cage (in 8"-thick bundles) to the individual patch panels in each rack.


Silicon Valley was built on top of five major fault lines, so earthquake preparedness is a big deal in the area. This shot was taken looking straight up from within one of the rack aisles. The two thick metal bars are welded to the tops of the racks and prevent them from moving independently of each other. Then there are four 1/2"-thick steel rods that bolt them all together and prevent any particular rack from swaying too far one way or the other.

If you want more pictures, please let me know.