zombies ahead: studying web site stagnation.

A redesign, redevelopment or refresh (pick your term) is the easiest time to take stock of your Web site's level of activity and atrophy. OK, ideally you could/should do this continuously, but then we should also work out regularly and eat only healthy foods … but I digress.

We're in the process of changing content management systems and, along the way, developing a new taxonomy for our sprawling and mostly decentralized Web site. Before tackling how to categorize and organize the content, I picked up my torch, machete and compass and wandered into the jungle that is our current site. I looked at 206 site accounts (which range from dozens of pages to single standalones) and decided to start by culling the dead. Since a previous CMS software update accidentally republished all our pages, even long-abandoned pages list 7/9/07 as their most recent update, so we had to delve deeper.

Figure 1: A Web site that never was.

What I found were bodies. Lots of dead bodies. At last count, 35 of the 206 site accounts were either abandoned, never started or rendered obsolete. That's 17 percent, or more than 1 out of every 6 accounts: virtual ghost ships plying the Web's waves.

Some involved programs or ad hoc projects that went away. Other accounts existed but never published anything. Many were pages and accounts abandoned because a department or office decided to start over. However, the pages left to founder on the high seas are just as easy to find via Google and other search methods as the updated pages and accounts. Aye, cap'n, there's the rub.

Figure 2: A ghost ship on the high seas.

Around 90 percent of those accounts reside outside of our office, and we preach: update your sites, cull your dead pages, and so on. But in a decentralized system, many users are not actively thinking about updating pages, whether because of workload (it's just one of many tasks in their jobs) or turnover, much less about decommissioning expired content. And we're too busy to stay on top of tens of thousands of pages. Then you wake up one day and BAM, 1 out of every 6 pages is a zombie, eating your Web site's brains.

To be fair, user-friendliness is a key factor. Our CMS had four separate editable regions for most pages: title, subtitle, body and right column. Editing any one of those sections involved five steps: check the section out, perform the edit, save, confirm and publish. Throw in having to choose a template and then create a title and metadata, and creating most pages involved 22 steps. Small wonder some overloaded page admins didn't want to work on the Web site. EDIT: I realized that if you avoid micropublishing, you can get it down to 16 steps. But still.

Of course, at this point all we can do is learn. And know that creating a system that is better for people who have to maintain the pages will create a more active and updated Web site that is better for those who visit it.

5 Comments

Filed under Web


  1. Interesting discoveries you’ve made there. I’m afraid we’d find the same proportion of dead sites on our decentralized web presence. I’d be interested in hearing about how you plan to share this information with the campus — or if you plan to use it only for internal planning within your team (and with your devoted readership).

  2. Dave

    You almost need a system that notifies the admins of changes to individual pages. It’s an inverse strategy, since it doesn’t tell you what hasn’t changed, but it could point you in the right direction.

    Alternately, some kind of internal reporting system that lists all sections or pages and lists their last-updated timestamp would provide at-a-glance information on who’s publishing and who’s piloting the SS Ghost Ship. Export the list to a spreadsheet and sort by date, and you’ll know.
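    Dave's reporting idea can be sketched in a few lines. This is a hypothetical example, not anything from our CMS: it assumes pages live as files whose modification times track publishing (a real CMS would query its own database instead), and `SITE_ROOT` is a made-up path.

    ```python
    import csv
    import os
    import time

    SITE_ROOT = "/var/www/site"  # hypothetical document root

    def last_updated_report(root):
        """Walk the site tree and collect (page, last-modified date) pairs."""
        rows = []
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                if name.endswith((".html", ".htm")):
                    path = os.path.join(dirpath, name)
                    mtime = os.path.getmtime(path)
                    rows.append((path, time.strftime("%Y-%m-%d", time.localtime(mtime))))
        # Oldest first: the top of the list is the SS Ghost Ship
        rows.sort(key=lambda row: row[1])
        return rows

    def export_csv(rows, outfile):
        """Write the report as a spreadsheet-friendly CSV."""
        with open(outfile, "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow(["page", "last_updated"])
            writer.writerows(rows)

    if __name__ == "__main__":
        if os.path.isdir(SITE_ROOT):
            export_csv(last_updated_report(SITE_ROOT), "last_updated.csv")
    ```

    Sorting by date string works because `%Y-%m-%d` sorts lexicographically; opening the CSV in a spreadsheet gives exactly the at-a-glance view described above.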

  3. We’re on the verge of a CMS upgrade ourselves, and I have started to take a similar inventory of sites. I’m hopeful that we won’t have as many problems, since we’re more of a centralized environment, but I’m sure we’ll have our fair share of zombies hiding out there.

    One thing we’re implementing in the new CMS is pretty much what Dave suggested – not only will we have a reminder for page owners to check/update their pages, but also a list of page/site updates as they happen – so we can see who the frequent updaters are, and who doesn’t touch their pages as frequently.

  4. insidetimshead

    ANDREW: Right now I'm sharing the information anecdotally, and I'll probably mention it in a training session tomorrow on migration. The situation is simple on the surface but has complicated dynamics, so I'm still trying to figure out how to disseminate it more widely.

    DAVE: A good idea, but with 10,000+ pages (or more; I've stopped counting), just that part of the tracking would prove Herculean. Maybe we can figure out some kind of threshold or self-notification if a site hasn't been touched in a year. Or two. Or 10.
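    That threshold idea could be sketched as a small check run per site account rather than per page, which keeps the Herculean part manageable. Again a hypothetical example: it assumes each account is a directory of files whose modification times reflect updates, and the 365-day cutoff is just an illustration.

    ```python
    import os
    import time

    STALE_DAYS = 365  # hypothetical threshold: untouched for a year

    def newest_mtime(site_dir):
        """Most recent modification time anywhere under one site account."""
        newest = 0.0
        for dirpath, _dirnames, filenames in os.walk(site_dir):
            for name in filenames:
                newest = max(newest, os.path.getmtime(os.path.join(dirpath, name)))
        return newest

    def stale_sites(accounts_root, stale_days=STALE_DAYS):
        """Return the site accounts whose newest page is older than the threshold."""
        cutoff = time.time() - stale_days * 86400
        stale = []
        for entry in sorted(os.listdir(accounts_root)):
            site_dir = os.path.join(accounts_root, entry)
            if os.path.isdir(site_dir) and newest_mtime(site_dir) < cutoff:
                stale.append(entry)
        return stale
    ```

    Run from cron once a month against the accounts directory, the output is just the ghost-ship list: 206 checks instead of 10,000+, and nothing to eyeball until a site actually crosses the threshold.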

    JD: I am darned impressed with the Hamilton Web presence, so I reckon your upgrade will make it sparkle even more. As I mentioned to Dave, the sheer scale of this out-of-control nuclear octopus could challenge tracking attempts, but maybe there's something we could engineer that keeps us posted on particular types of activity.

  5. We did a major overhaul and reorganization of site navigation and content starting about 3 years ago, then went to a CMS. Between the two transitions we were able to prune a lot of underbrush; that is, on OUR site servers.

    What we couldn't do anything about was all the cached versions available via Google search. Every so often people ignore the subliminal message: if you follow a link and it says the page isn't there, that means the page isn't there. They back up, choose the cached version, and then send an inquiry about a degree we no longer offer or a conference that took place 7 years ago.

    Sigh.

    Any suggestions? It’s not a daily annoyance, but I don’t like creating a negative user experience and I’d like to spare staff having to deal with these irrelevant requests for information. People occasionally get mad, but we can’t singlehandedly clean up the whole WWW.

    I might add that our pruning was made easier by the fact that we’ve had centralized administration of Web content, headquartered in the Communications office, since 1998 when I started working here. That was a charge from my boss when I was hired and I’ve been glad ever since.

    This doesn't mean you won't find a ghost ship floating here or there; it just makes it a lot easier to roll out a new design or restructure content.

    I also got to hire a part-time content writer because there was no way the distributed content-creation system was giving us fresh, readable content on a regular basis, so we did a big rewrite over time. (I know, I know: cries of envy. Control AND staff resources. We're at 3.8 FTE/5 heads for all internal/external communications for the whole campus, if that makes you feel better.)

    @BarbChamberlain
    Director of Communications and Public Affairs
    Washington State University Spokane
    @WSUSpokane
    http://www.spokane.wsu.edu
