As the election drew near, many political and stats junkies (like me) became fans of Nate Silver, aka @fivethirtyeight, the shrewd political number-cruncher and blogger for the New York Times. His way of aggregating the most reliable presidential polls into megapolls, and factoring in those polls’ historical accuracies, was considered by some to be as revolutionary as bringing “Moneyball” — the use of undervalued stats — to baseball.
Like anyone who develops a following, Silver soon drew his share of detractors. Newsmen, pundits and politicians alike scoffed at his methodology, and Silver tended to respond with an unrivaled grasp of statistics. Even as the news networks hyped the election as anyone’s game last week, Silver said his estimations “represent powerful evidence against the idea that the race is a ‘tossup.’ A tossup race isn’t likely to produce 19 leads for one candidate and one for the other — any more than a fair coin is likely to come up heads 19 times and tails just once in 20 tosses.” And, yes, unless Florida reverses course, he will have called 50 of 50 states correctly. That he even inspired the briefly popular Is Nate Silver A Witch? website says something about his crossover success.
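Silver’s coin analogy is easy to check for yourself. A quick back-of-the-envelope calculation (mine, not from his post) shows just how unlikely a fair coin is to behave that way:

```python
from math import comb

# Probability that a fair coin comes up heads at least 19 times in 20
# tosses: P(19 heads) + P(20 heads), out of 2^20 equally likely outcomes.
tosses = 20
p_19_or_more = sum(comb(tosses, k) for k in (19, 20)) / 2 ** tosses

print(f"{p_19_or_more:.6f}")  # roughly 0.00002 — about 1 in 50,000
```

If the race really were a tossup, a 19-to-1 split in poll leads would be about a one-in-50,000 fluke, which is Silver’s point.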
But let’s forget politics for a moment (please!); what’s impressive here is the rise of analytics writ large. Silver succeeded through a keen understanding of statistics, a willingness to discard dubious assumptions and an eagerness to innovate. In higher education, we always talk about working smarter, not harder, and trying innovative things … then everyone rushes to “best practices,” well-plowed ground and research (like that on “Millennials”) based on questionable assumptions.
It all starts with data. Working with the web and social media gives us access to a wealth of analytics and metrics via Google and other services. But as Silver cautions, it’s about looking for the right data, not necessarily the most obvious or easiest. Avinash Kaushik, perhaps one of the top experts in web analytics, jokes that “hits” is short for “how idiots track success” … i.e. the raw count of hits to your website tells you only surface information. Instead, he says, look at things like bounce rate (the percentage of visitors who view one page and immediately leave), average number of pages per visit and what paths and tasks users complete while on your site.
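Those two metrics are simple to compute once you think of a visit as the list of pages it touched. A minimal sketch, with made-up visit data standing in for a real analytics export:

```python
# Sketch: bounce rate and pages per visit from raw visit paths.
# The visits below are invented for illustration; real figures would
# come out of your analytics tool.
visits = [
    ["/"],                              # one page, then gone: a bounce
    ["/", "/academics"],
    ["/", "/admissions", "/apply"],
    ["/news"],                          # another bounce
]

bounces = sum(1 for pages in visits if len(pages) == 1)
bounce_rate = bounces / len(visits)
pages_per_visit = sum(len(pages) for pages in visits) / len(visits)

print(f"bounce rate: {bounce_rate:.0%}")          # 50%
print(f"pages per visit: {pages_per_visit:.2f}")  # 1.75
```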
Google’s In-Page Analytics (seen above) is one of my favorite tools for seeing where visitors go after hitting a page. Those orange tags are click-through percentages, which you can roll over for exact numbers. I frequently check our home page with this tool to see what is and isn’t working, and regularly review other key pages. It’s interesting to see that sometimes swapping out a picture or changing wording can have a real impact on click rates. Among the most basic tips:
- Pics of students work better than anything else. (Except maybe sunsets, but that’s a whole other story.)
- Pics of logos and/or clip art are virtually useless. The only logo anyone ever clicks is the Oswego logo at the top left to get back to the home page.
- Don’t overpromise or mislead with link names. I’ve seen pages where users think they are getting one thing because of a page name, only to realize the info they seek is not there. In cases like these, a user is more likely to leave our site entirely than go back. (We’ve seen this fixed by merely changing a link or page name.)
- If your page has an embedded video but a very low average time on page, it’s pretty clear that video isn’t getting watched much. You can correlate with YouTube views — there’s a chance they’re watching it on YouTube — but you can often spot a dog quickly. This also ties into our data that shows videos about students and/or made by students tend to do much better than any other videos.
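That last tip can be turned into a rough rule of thumb: compare average time on page against the video’s running time. A sketch of that check (the threshold and the numbers are hypothetical, not our actual cutoffs):

```python
# Rough heuristic: if visitors spend only a small fraction of the video's
# length on the page, the embedded video probably isn't being watched there.
def likely_watched(avg_time_on_page_s, video_length_s, threshold=0.5):
    """Hypothetical cutoff: call it watched if the average stay is at
    least half the video's running time."""
    return avg_time_on_page_s >= threshold * video_length_s

print(likely_watched(35, 180))   # False — 35 seconds on a 3-minute video
print(likely_watched(150, 180))  # True
```

It won’t catch viewers who watch on YouTube instead, which is why correlating with YouTube view counts is still worth doing.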
Another great Google Analytics feature is event tracking, which lets you see microtrends. With our new megadropdown headers and Popular Links, developer Rick Buck inserted Google event tracking code to get a finer picture of who clicks where. The Academics part of the header rules, as it does in breakout tracking. This underscores our longtime push that good academic content and information architecture remain key to a college website’s success.
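Conceptually, event tracking just records a labeled row per click — Google Analytics uses a category, an action and a label — and the breakout is a tally over those labels. A minimal sketch with invented sample events (these are not our actual tracking codes):

```python
from collections import Counter

# Each event mirrors Google Analytics' (category, action, label) triple.
# The sample data here is made up for illustration.
events = [
    ("header", "click", "Academics"),
    ("header", "click", "Academics"),
    ("header", "click", "Admissions"),
    ("popular-links", "click", "Academics"),
    ("header", "click", "Athletics"),
]

by_label = Counter(label for _category, _action, label in events)
print(by_label.most_common(1))  # [('Academics', 3)]
```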
In addition to looking small, we look big. We recently completed our third month of compiling, filing and sharing a monthly web and social media analytics report, which has provided clues about what works and what doesn’t. We will learn even more as we add and hone various measurements and see trends in longer spans of data.
On a related note, look long-term and resist changing things too quickly. Silver’s data worked because he had large sample sizes. You need to track a page for at least a month (maybe more) to collect a large enough sample to judge user activity; a day or two is too small a sample to glean a full picture.
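The sample-size point can be made concrete with the standard error of an observed click-through rate, which shrinks with the square root of the number of visits. The visit counts below are hypothetical, standing in for a couple of days versus a month of traffic:

```python
from math import sqrt

def standard_error(p, n):
    """Standard error of an observed proportion p measured over n visits."""
    return sqrt(p * (1 - p) / n)

ctr = 0.10  # an observed 10% click-through rate
for n in (200, 6000):  # roughly two days vs. a month of visits
    se = standard_error(ctr, n)
    print(f"n={n}: {ctr:.0%} ± {1.96 * se:.1%} (95% interval)")
```

With 200 visits the 95% interval on that 10% rate is roughly ±4 points; with 6,000 visits it tightens to under ±1 point, which is the difference between guessing and knowing whether a change worked.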
Some colleges are showing a need and desire to invest in data. Ithaca College, for example, recently hired Colleen Clark as a full-time marketing analyst, and Colleen describes what that entails in this interview with Karine Joly of Higher Ed Experts. Not all colleges are in a position to hire full-time web analysts, but institutions should ensure that at least one person (probably more) in their organization has enough training, knowledge and — importantly — time to look at stats and trends.
Because as Nate Silver showed with this election, relying on conventional wisdom and erratic statistics gets you results that are only as good as their flawed data. The more data you have, the better you understand it, the more effectively you implement what it shows, the higher the chances you can start achieving some real wins … whatever you do.