When less is more: A/B test insights

Last year we re-focused the admissions area of wayne.edu to reduce the depth of the site (from five levels deep to two) and bring more resources to the front page. As a result, the landing page highlights the six most popular audiences and their navigation items.

Each audience has its own landing page, with the same navigation plus unique elements that speak to that audience. We found these audience landing pages were being visited less frequently than before the redesign.

Our theory

Initially, we thought users were clicking a link below their audience heading and getting directly to the page they needed. We decided to test this theory using a tool called Hotjar to record where users clicked on the page.

Admissions homepage clickmap

Although each audience’s entire menu was listed below its heading, the top three items were by far the most popular.

Revised hypothesis

We decided to test whether removing the least popular items made it clearer to users that they could visit an audience landing page. If so, would they be more likely to visit these pages by clicking the audience heading or the ellipsis at the bottom of the link list?

“A” variant of the page

Admissions A variant

“B” variant of the page

Admissions B variant

Winner: Shorter list of links with the ellipses

After running the experiment for two weeks, Google found a statistically significant winning version.
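One common way A/B tools judge significance is a two-proportion z-test on the click-through rates of the two variants. A minimal sketch in Python, using hypothetical counts rather than the actual experiment numbers:

```python
# Hedged sketch of a two-proportion z-test, the classic check behind
# "statistically significant winner" claims. Counts are hypothetical.
from math import sqrt, erfc

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Return (z, two-sided p-value) comparing conversion rates A vs. B."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided normal tail probability
    return z, p_value

z, p = two_proportion_z(120, 4000, 165, 4000)  # hypothetical A vs. B counts
print(f"z = {z:.2f}, p = {p:.4f}")
```

With counts like these, p comes out well under 0.05, the conventional threshold for declaring a winner.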

Admissions experiment results

Headers or last-item ellipsis?

Admissions header vs. ellipsis

Of the clicks to the audience landing pages, the headers yielded ~2.5 percent of the clicks while the ellipses yielded ~1 percent.

Insights

The takeaway from this experiment is that it’s possible to go too far when reducing the depth of a website. Having everything accessible from the homepage may be good if you’re familiar with all the options, but it can be overwhelming for unfamiliar users.

Keeping a website as flat as possible while limiting the number of choices at each step yields the most interactions. It also allows for refined, contextual content that reinforces a user’s decision on each page of their journey.

[Video] Web Workshop: Intro. to Google Analytics

Last week I presented a workshop on Google Analytics. Since many schools/colleges/departments use the tool to track Web visitors, I thought it would be a good opportunity to get them in a room and explain the features and power of the system.

The workshop covered the following topics:

  • Setting it up on a site/multiple sites
  • Account/Property/View management
  • Intelligence Events
  • Real time
  • Audience overview/behavior
  • Technology/browser/mobile
  • Acquisition/referrers/search/campaigns/social
  • Behavior/visitor flow/site speed
  • Events/tracking/forms
  • Goals

Since a handful of people could not make the workshop, I recorded it. The audio is not ideal, but it will do.

Next workshop:

The August workshop will be on social media content strategy. The date/time is still being determined, but it will be posted here when it is confirmed.

The social Web beyond Facebook & Twitter – Instagram edition

Recently Wayne State was featured on College Recruiter’s list of top 10 colleges on Instagram. It got me thinking about the importance of micro communities. A lot of schools follow every new shiny thing that comes their way. That approach gets people to think they are “leading edge” but six to twelve months later, when the community or the internal resources dry up, the school is left with wasted resources that could have been used to build a more solid and engaged community.

I initially posted about our first 48 hours on Instagram as a litmus test. Since then we have kept up continuous growth in followers and interactions. More important for us is the ability to connect with students, alumni and the community on a personal level. Looking back at the last six months on Instagram has allowed us to validate its ability to accomplish that very goal.

Our approach

We started the journey by “looking into the pond” during the first 48 hours, which turned into “getting our feet wet” over the first six months. During that time we’ve kept Instagram isolated from our other social networks. This was by design: we didn’t want to set up false expectations. The photos we posted were meant to grow the Instagram community and nothing else. If the community could stand on its own, we knew promoting it in other places would only accelerate its growth.

Growth of followers vs engagement per photo

The graph above shows our growth in followers (blue) over the past six months from 0 to 1,000. The lines in green are photos we posted and their “interactions”. We consider an interaction a “like” or a comment on a photo.

Looking at the graph, we were able to develop some insights about our reach. Instagram is an interaction-based medium; if you don’t post, people don’t notice you. So keeping a constant stream of photos is important to the growth of the community (duh). The second insight is that we suffered from the same “shiny thing” syndrome we were trying to avoid: lots of photos, interaction and growth initially, then after three months we dropped off. Although we never lost followers, we didn’t grow at the rate we should have been.

The real secret to gaining Instagram followers

Since we were not promoting our account beyond the network itself, the only way to “advertise” that we were part of the community was to actually be a part of the community. This may seem like a novel concept to some, but it is by far the first thing overlooked when resources are tight. “Let’s just push out content” is heard too often around meeting room tables. Our secret isn’t a secret at all: the largest factor in our follower growth didn’t come from our photos, it came from us liking, commenting and following others.

Listening and engaging when appropriate had by far the largest impact on our follower growth. We consider the photos we post a secondary benefit of being on Instagram. Students are tagging us or geo-locating photos around campus at a rate of one every fifteen minutes. That is far more content than we could ever, or would ever, want to post.

Where we go from here

From here is the long road of supporting and interacting. That includes:

  • Integrating Instagram into our social dashboard (Socialy)
  • Promoting the community photos on Digital Signage
  • Driving more traffic to our newly launched profile page
  • Involving the campus community in our posts
  • Continuing to find those things that connect students and alumni back to campus

View Wayne State University’s profile on Instagram

May 2012 Commencement Wrap-up

In years past I’ve done wrap-up posts about Commencement communication and the live stream (OK, I guess only one made it public). I wanted to start this tradition to ensure we have a historical record of statistics and lessons learned.

Below was our homepage and live streaming page on commencement day. We decided not to take over the entire homepage but instead use the standard banner area to promote the event. The homepage typically drives the most traffic to our live events page, but this year was different: 95 percent of visitors came directly to the live streaming page. Let’s dive into why that was the case this year.

A few differences this year

Typically we do everything in-house; this year we decided to move commencement off campus to Ford Field, home of the Detroit Lions. We also decided that instead of using Ustream we would use a vendor to provide streaming services. With that came the need to find an interactive chat system that met our needs. We chose Chatroll because it lets guests participate openly and allows people to log in with their Twitter or Facebook accounts. Plus it offers the ability to moderate the chat if needed.

Physical Event

The May commencement was a single ceremony, 2,600+ graduates with ~20,000 total guests. This was a big event so we knew the live stream would be popular. The event also took place from 7-10 pm on a Monday. Typically the event is smaller and takes place during the day on a weekend.

Total Viewers & Chat

On commencement day we had 5,579 page views and 3,087 total users watch the stream. This made the live stream page the third most popular page on wayne.edu that day. We were able to put up an archive of the stream right away, which kept the page popular for days after the event. In total we had 4,239 unique pageviews from 100+ countries, and one-third of all viewers were from outside the US. We knew having the stream available was important to international students, since that feedback has been consistent year after year.

Above is a typical screen for each of our moderators, who were checking the live stream from a remote location to make sure the video and audio were up at all times. They were tasked with reading through the entire public chat looking for anything suspicious. Then there was the additional monitoring of email, Twitter, Instagram and, of course, a backchannel on Campfire so we could discuss all the strings we were pulling in real time.

Chatroll made it really easy, much easier than Ustream’s IRC client, to mix in promotional messages, give users links to share on their social networks, paste links in the chat for all to see, and even ban certain words. I’ll say the interface and features of Chatroll are far superior to Ustream’s, but at the same time there is a cost associated with it. We estimated around 500 chatters based on previous years; apparently we hit that 500 in the first 20 minutes. What we didn’t know was that anyone simply viewing the chat is considered “online”. Obviously we had more than 500 people viewing the live page at a time, which basically closed the chat to those first 500.

In the end we had 289 active users who posted a total of 3,281 messages. 74 signed in through Facebook, 12 through Twitter. 24 percent of users who signed in with Facebook “recommended” the event to their friends. 92 percent of signed-in Twitter users tweeted about the event through the chat. Although not a lot of people used their existing social accounts, it was nice to see the ones who did take action to spread the event.

Traffic Sources

This year we had a lot more people promoting and pushing commencement material out (because it was a combined ceremony), so we were not able to get a fine-grained picture of which medium drove the most traffic.

By looking at our traffic sources, the one thing that struck me as interesting was the number of people who landed on the page by searching some variation of “wayne.edu/live”. Apparently it was the most distributed URL for offline material, which caused a lot of search traffic (not to mention direct traffic).

 

Social & Photos

Surprisingly, our students decided that Instagram was the place to post photos this year. Of all the photo sharing services it was by far the most popular. We ended up favoriting all the commencement shots we could find, 65+ in total. That was fewer than we were expecting, but it seemed like most people were just tweeting instead of sharing photos. In total we saw 600+ Twitter mentions during commencement. That is in line with the ratio of our followers to total enrollment: 8,000 (Twitter followers) / 33,000 (enrollment) = ~24 percent, and 600 (mentions) / 2,200 (graduates) = ~27 percent. Facebook, on the other hand, saw far less activity during the event.

Here is a snapshot of some of our Instagram favorites:

Commencement in 2 minutes – timelapse

Lessons Learned

From the Web communications perspective we learned a lot this year. The first lesson is that we should have mandated that everyone use a single URL for promotion, and that URL should have been wayne.edu. The reason is two-fold: first, we would have been able to customize the page for the event and include “extra” context that might have enticed an outsider to learn more about the university. Second, it would have reduced the number of searches for “wayne.edu/live”. It’s an unnatural URL that most students and family members are not used to visiting.

Secondly, chat is crucial, especially the ability to sign in as a guest. If we had known the 500-chatter limit included people who were simply viewing the chat window and not signed in, we would have handled it differently. We probably would have shown a screenshot of the chat window with a button to chat; once clicked, it would have loaded the chat window in its place. This would have given the more interested chatters the ability to join in.

Lastly, we learned that knowing the program beforehand is crucial. We knew a little about who was speaking and the general format, but when the event started late and then ran long, the online audience started to get a little antsy. In total we only had to ban six people from the chat for causing a disruption and continuously swearing. But it’s the little details that matter: which schools will be walking across the stage at what approximate time, who is talking at any given moment, and some of the history of the event and why it’s so structured.

Overall the event went really well and the live stream gave friends and family who couldn’t make it the ability to be part of the ceremony.

An archive of the event is currently up at http://wayne.edu/live/

Having a little fun with server maintenance

This weekend the Computing & Information Technology (C&IT) department upgraded the power in the data center. An explanation of the reason for the upgrade can be found at the ProfTech blog. In addition, C&IT has an announcement of which services were affected on their website.

How this affected the Wayne State Web

In short, this meant every website hosted in the main Web server environment would be down, shutting our visitors out for 10 hours. This included the homepage, admissions website, application, events calendar, API, and content management system, just to name a few.

The down time user experience

It’s never a good experience when you click on a dead link. C&IT brought us into the loop early, and we tried to come up with a plan to keep the servers online during the maintenance period. Unfortunately the entire datastore would be down, and moving a read-only version to our off-campus environment would take longer than the maintenance period itself. We decided a single maintenance page made the most sense for the time of day and the number of visitors it would affect.

We designed the page based on the promotional images we used around campus to warn students/staff about the maintenance. We kept it simple and gave the user some calls to action directing them to additional information. Below is a screenshot of the page.

The maintenance screen

Scheduled Maintenance Screenshot

“Have a little fun”

If you notice, on the page there is a second link to “have a little fun”. We wanted to give anyone who was unfortunate enough to land on the dead-end page something to do to pass the time, and show a little personality.

One of our former developers, Nick West, was playing around with JavaScript and gaming a while back and came up with this exploding W page. We passed it around internally for a while but never had a chance to use it publicly. We thought this would be a perfect opportunity. With his permission, we added it to our maintenance page to see how many people found it.

Scheduled Maintenance Have Fun Screenshot

During the maintenance period only ~9% of visitors clicked through to the “have a little fun” page, spending an average of two minutes there. We expected about this percentage and time on page, but as I explain below, the actual number of visitors was quite a bit lower than we expected.

Lessons learned

The goal was to have this maintenance page come up whenever any page was accessed, with a “503” (the server is temporarily unavailable) response and a “Retry-After” header so Google and other search engines didn’t index the temporary page. Coincidentally, an article was posted on SEOmoz just days before our downtime that outlines the best practices for handling maintenance situations.
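The intended behavior boils down to a few lines of Apache configuration. A minimal sketch, assuming mod_rewrite and mod_headers are enabled; the paths and retry window are illustrative, not our exact rules:

```
# Serve the maintenance page for every request with a 503 status
# so search engines know the outage is temporary.
ErrorDocument 503 /maintenance/index.html

RewriteEngine On
# Let the maintenance page itself (and its assets) through untouched
RewriteCond %{REQUEST_URI} !^/maintenance/ [NC]
RewriteRule .* - [R=503,L]

# Tell crawlers when to come back (in seconds)
Header always set Retry-After "3600"
```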

Everything seemed to be planned well for the maintenance but we encountered two issues which prevented us from analyzing the downtime completely.

  1. The .htaccess file didn’t get included in the files that were sent to C&IT. (completely my fault for not checking)
  2. The Google Analytics account did not have the “full domain” filter enabled. (again, an oversight on my part)

Because of these two issues the maintenance screen was only displayed on the homepage of each domain, not on every single file accessed. That cut out a major chunk (roughly 95%) of the traffic to our server and potentially hurt some page ranks. And the oversight of the “full domain” filter in Google Analytics prevented us from seeing exactly where the traffic was from; seeing just “/” and “/w/” gives us absolutely no insight.

Test, test, then retest

In the end we were glad to have at least some explanation of the downtime up, but because of these two issues I don’t have much insight to share here. I am taking this situation as another reminder of the importance of testing and retesting. Having a dry run of any IT or Web related activity may take a little extra time, but in the end it will produce the best results.

View the maintenance page: http://wayne.edu/maintenance/

Have a little fun: http://wayne.edu/maintenance/w/

User Testing: You are not your users

A majority of people assume a Web page is just a digital piece of paper, but in reality it is a single step in an entire experience. I will use the illustration below to show how every page is connected to another. The illustration can be looked at in two ways. Most people within an organization tend to think a visitor travels from the inside (homepage) out to the edges. But in reality the visitor is more likely to start at a random spot within the system and then figure out what their next step should be. They don’t have a heads-up display (like in a video game) that they can pull up at any time to see where they are in relation to everything else. It is up to the information architect and the designer to give the visitor visual cues and signposts to orient them within the first ten seconds.

The MaRS website as a graph

Every site is unique

Since no two sites have the same goals and end-user needs, the only way to optimize your site is to look through the eyes of your users. For us, insights often come when we aren’t looking for them. We have been trying to optimize our current students page for some time now. It’s taken us a little longer than expected to figure out, but for a good reason.

I wanted to share our experience with everyone, not as a how-to, but as an insight into a process I think every Web worker should be aware of.

Passive user testing

There are two ways to test your site’s effectiveness. Formal user testing is when you recruit a specific type of user and have them complete pre-defined tasks, or watch them use your site in a controlled environment. Passive user testing is when you watch users in their native environment, unaware their actions are being analyzed. Both have their pros and cons, but both are necessary for a well-rounded analysis of a website. I am going to focus on passive testing in this post because it’s something everyone should be doing all the time.

Everyone knows Google Analytics is the most popular way to analyze your users passively. But you shouldn’t stop there; GA can only tell you so much. Figuring out your visitors’ motivations and goals takes time and experience, and being aware of where your users are going is just the first step. You can’t look at one GA report and know what is right and wrong about your site; it takes analysis over the course of a few months with many different tools.

Motivations of a current student

We knew the “Current Students” menu item was the second most clicked menu item on our homepage (“Directory” is the most clicked). We had a hunch about the motivations of current students, but we had to know for sure. We set up CrazyEgg on the current students page to see where they were clicking. CrazyEgg tracks both “active” clicks on links and clicks on things that are not links. We knew students were looking to get to resources as quickly as possible: Pipeline, Calendar, Email, Blackboard and Course Schedule.

What we discovered is that “Pipeline” was by far the most popular link. This stumped us a little, because we have a direct login to Pipeline, Blackboard and Email right in the header of every page on wayne.edu, so users don’t have to click through to find those links. Obviously not enough students knew about it.

Give the user a hint

So we decided to give current students a hint about the drawer and the login ability, to see if we could change behavior and drive more traffic through the form instead of clicking the link and waiting for the login page. We changed the page to drop the drawer down for five seconds and then back up, to “preview” for users what is hidden in the header.

As you can see from the heat maps above, the “hint” didn’t change user behavior significantly. Only 3 percent of visitors changed their behavior and used the form once they knew it was there. We ran the tests for an equal amount of time on the same days of the week to ensure we were getting as close to the same population as possible.

Don’t hide important elements

Going back to the drawing board, we decided to re-organize the entire page and put the login form right in front of the user. We knew students wanted to log in to these services, and we hated that they were going through so many steps to do it.

Above is the re-aligned current students page. It has almost all the same information, just re-organized. We did drop the news and events, because they were getting less than 1 percent of the clicks on the page, and replaced them with some of our social media activity to make students aware of where we are. All the links are in alphabetical order above the fold for the easiest access. Previously we had the links split up based on perceived importance.

We tested the page yet again with CrazyEgg. Success! 20 percent of visitors used the form to log in directly to the service of their choice. We were happy and about to call it a day, but then we noticed something interesting. The “Current Students” menu item was being clicked by 5 percent (257 clicks) of visitors. Looking into it further, we determined it was being clicked not only by users who entered directly on the current students page but also by people who came from our homepage.

Why would users click on a menu item right after they had clicked it to get to the page?

Orienting your visitors

Regardless of where your visitors come from, they should be able to orient themselves within two seconds of viewing a page. We noticed that on the re-aligned page we had moved the page title down, and the menu item wasn’t being highlighted to show a “selected” state. So we decided to test making the menu item selected, to see if it increased users’ confidence that they were on the page they expected to land on, the one specific to current students.

What do you know, it worked! I didn’t think it would have the impact it did, but with the menu item selected only 1 percent (59 clicks) clicked on it. In addition, the login form gained another 1 percent of visitors using it.

You are not your users

Time and time again I have to remind everyone making Web decisions that they are not the primary users of their site. As in the illustration at the beginning of this post, there are two ways to view the same information: inside out or outside in. The more you understand the way your users view your site, the more you can understand their motivations and make things easier for them.

User testing isn’t an exact science nor is there a magic formula or tool to use. It takes persistence, patience and insight, but in the end the time spent is worth it.

A/B Testing: Schedule a Campus Tour

We are always on the hunt to optimize our websites and make changes on a consistent basis. I like to call the process “Micro Redesigns,” and I have been talking about it more and more this year. A lot of the examples I use are from Wayne State. We are lucky enough to control the complete user experience from the bottom up, so we have a lot of opportunity to play. And by play I mean optimize, optimize, optimize.

What is a micro redesign?

It is taking small, deliberate steps to reach a larger goal (which may not be 100 percent apparent at the time).

No perfect website

No matter how long you work on a site or how many people you test it on, there is always room for improvement. For example, our homepage has felt the same for the last few years but has actually changed a lot. Instead of doing a sweeping redesign every two years, we decided to focus on one piece of the site every two months and optimize it. As of right now, two years after launch, only 20 percent of the homepage is the same as the original look. We have increased the usability of every piece of the site without interrupting any stakeholder’s use of the page. It isn’t just the homepage, either; sub pages have been changing also. More about that in the next week or so.

Schedule a tour example

This last month we set out to improve the number of conversions on our “Schedule a Tour” link on the wayne.edu homepage. We knew the link was getting traffic, but when we tested our homepage with prospective students, asking them to start on the homepage and schedule a tour, they would completely overlook it. We decided to test making it more visual. Below is a comparison between the initial and the proposed visual treatment.

The original (left) design features text and a link to schedule a tour now. The proposed (right) design features a map image with a marker similar to Google Maps with a smaller amount of text.

We decided to put the two versions up against each other for a month to see which one performed better. We had our idea of which one would win, but we had to have the data to back it up.

The results

Launched: June 22, 2011 | Completed: August 9, 2011

After the first week it was pretty clear which version was performing better, but we made sure to run the test for at least a month before making a final decision. Just over a month, 1 million visits and 1,000 clicks later, the results were pretty conclusive: the newer, more visual schedule-a-tour button resulted in a 67 percent increase in click-throughs to the schedule-a-tour form.
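For reference, the lift figure is just the difference in click-through rates over the baseline rate. A quick sketch with hypothetical counts chosen to roughly match the totals above (about 1 million visits and 1,000 clicks, split evenly between variants):

```python
# Hypothetical counts, not the actual experiment data.
views_a = views_b = 500_000
clicks_a, clicks_b = 375, 625  # original text link vs. visual map button

rate_a = clicks_a / views_a
rate_b = clicks_b / views_b
lift = (rate_b - rate_a) / rate_a  # relative improvement over baseline
print(f"click-through lift: {lift:.0%}")  # prints "click-through lift: 67%"
```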

Completed a reservation

 

I try to follow my own advice as often as possible, but sometimes, for whatever reason, it doesn’t happen. This was one of those times. I can’t stress enough: test your configuration to ensure you can follow a user through the complete funnel, so you can determine exactly which of your changes had the most overall impact.

Analysis

One could easily say that giving anything more color or space to click would yield more clicks. But we think these results show something deeper. I would agree that in some sense more color and space yield more clicks, but in this case I think it was what we added. The initial design didn’t connect the label with what the user was about to do. Adding a visual map and a marker people were already familiar with (from Google Maps) made an immediate connection with an existing construct.

If you look at the graph above, you will see the goal to complete a visit is set up correctly and recording; what isn’t recorded is which page, or which version of the “Schedule a Tour” button, the user came from before completing the goal. Google is just supplying “(entrance)” as the referral page, and that isn’t helpful at all.

One note on the graph above: it shows only the tours completed coming from our homepage, not a total of all the tours coming in from the entire website.

It’s important to have that source page, because although more people clicked on the more graphical version of the button, if they didn’t ultimately convert it doesn’t matter how many people clicked it. We need to find which version produced the best ROI overall, not just in the micro sense.

Dream big, think small

In the end it comes down to improving the overall experience for your visitors. Don’t lose sight of the fact that those numbers are not a mass of people coming in hordes, but individuals arriving one by one, each with a goal in mind and no tolerance for getting the run-around.

An insight into the May Commencement Web traffic

Last Thursday and Friday we held our May Commencement. The ceremony was split into five events spaced throughout the two days. We streamed the events live as we have in previous years, and I wanted to give a little insight into the number of viewers we received.

Physical Attendance

To give a little perspective, we had 3,500 graduates across all five ceremonies, and the attendance was as follows:

  • 2,350 Ceremony 1
  • 1,200 Ceremony 2
  • 2,300 Ceremony 3
  • 2,350 Ceremony 4
  • 1,900 Ceremony 5

~10,000 total friends and family attended the physical event on campus.

Promotion

We used our typical tried-and-true channels to promote the live stream. There were prominent links on the Commencement website for months while students were getting ready for the event. We also placed a banner on wayne.edu to direct traffic to the live stream on the day of the event, and the commencement committee sent out an email that day with a link to watch live. Unfortunately, this year the email link wasn’t tagged in a way that let us track the exact click-through rate, so we just have an estimate. Lastly, we promoted the event socially on Facebook and Twitter.
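For future events, the fix is to tag the email link with Google Analytics campaign parameters so the click-through rate shows up exactly in the reports. A minimal sketch; the parameter values are illustrative, not tags we actually used:

```python
# Build a campaign-tagged URL for the commencement email.
# The utm_* values below are hypothetical examples.
from urllib.parse import urlencode

base = "http://wayne.edu/live/"
params = {
    "utm_source": "commencement-email",  # where the link lives
    "utm_medium": "email",               # the channel
    "utm_campaign": "may-commencement",  # the promotion
}
tagged = base + "?" + urlencode(params)
print(tagged)
# http://wayne.edu/live/?utm_source=commencement-email&utm_medium=email&utm_campaign=may-commencement
```

Anyone clicking the tagged link then appears in GA under that campaign instead of being lumped into “direct” traffic.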

A breakdown of the traffic sources is below. The diagram is pretty crude, but it illustrates that promoting the event socially, which we thought would bring the most traffic, actually did not. From what we could track, viewers primarily came in through email and the commencement website. There are still ~9,000 “direct” visits to the stream page that are unaccounted for; we are still looking into where those people came from.

Traffic Sources

  • ~9,000 unaccounted for direct visitors
  • 3,863 referrals from the Commencement Website
  • ~1,500 referrals from email
  • 560 referrals from wayne.edu banner
  • 512 click throughs from Facebook
  • 449 click throughs from Twitter

Total Viewers

  • 3,510 Total unique viewers for all ceremonies
  • 1,693 Viewers for commencement 1
  • 1,327 Viewers for commencement 2 & 3
  • 909 Viewers for commencement 4
  • 873 Viewers for commencement 5

Live Chat

Like all our live streamed events, we opened the chat up to everyone. We moderate it for profanity and people causing issues, but other than that the community does a good job moderating itself. Here is a breakdown of the number of comments for each ceremony. For some reason the 3rd ceremony didn’t record the chat, but we were watching and it was in line with ceremonies 4 and 5.

  • 577 Comments on commencement 1
  • 345 Comments on commencement 2
  • 152 Comments on commencement 4
  • 204 Comments on commencement 5

What’s next?

This week Ustream introduced “Ustream on Facebook”. This is going to introduce a whole new audience for our events. We try to drive traffic to a branded page so visitors can learn more about WSU if they are interested. We installed the Ustream app on our Facebook page and will be testing it out with our next live event.

One major thing they changed was the replacement of IRC chat with Facebook chat. This is going to introduce a new variable, since it requires a Facebook account to chat. We don’t like the idea of having two chat interfaces, so we will probably be migrating the wayne.edu/live chat to Facebook.

The second major shift is the reduction in analytics we can pull about the people who land on the Facebook page compared to the traditional Wayne State page.

The good news is that more people will be exposed to the events going on around campus. Figuring out where to focus our promotion is not a bad problem to have if exposure is already high. Only time will tell which method works best for our audience.

Design Decisions: “Apply Now” button

Last week Wayne State was mentioned in the Chronicle of Higher Education for our increased traffic to the admissions application from wayne.edu. I wanted to break down a little more about what we did and why.

We work very closely with the admissions office and talk regularly with some of the students who field calls to our general phone number. This is something we do on a regular basis with any site we oversee; it is important for us to identify problems we can solve on the web before they reach a person and consume staff time. One issue that came up was application deadlines: prospective students would call and simply ask when they had to apply. Wayne State has a rolling admissions schedule, so a student can apply and be accepted at any time, but each semester has a cutoff date. In addition, since the schools, colleges, and departments control the content on their own websites, the level of information can vary. We decided to tackle this challenge on the homepage.
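As an illustration of the underlying logic, surfacing the right date under rolling admissions just means picking the next cutoff that hasn’t passed. A minimal sketch (the semester names and dates below are hypothetical, not Wayne State’s actual deadlines):

```python
from datetime import date

# Hypothetical per-semester cutoff dates for illustration only.
deadlines = {
    "Fall 2011": date(2011, 8, 19),
    "Winter 2012": date(2011, 12, 16),
}

def next_deadline(today, deadlines):
    """Return the next upcoming (semester, cutoff) pair, or None if all have passed."""
    upcoming = [(d, s) for s, d in deadlines.items() if d >= today]
    if not upcoming:
        return None
    cutoff, semester = min(upcoming)  # earliest remaining cutoff
    return semester, cutoff
```

A homepage widget could call this once per page load and render the returned semester and date next to the apply link.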

First Try

We decided to experiment with a few ways of adding the due date to the homepage. This would not only answer the prospective-student question but also give visitors a sense of urgency if they were considering applying. What we came up with is shown on the right. It was a good first step, but we quickly got some important feedback: the graduate programs each have their own deadlines, so we couldn’t publish a single date. At the time we were okay with publishing just the next upcoming semester for graduate programs.

Refinement

After seeing the page day in and day out for a few weeks, we started to notice redundancy, with the “Application” and “Apply” labels sitting right on top of each other. So we refined the style to put a little more emphasis on the upcoming semester and less on the actual due date.

We also noticed the “Giving” menu item was being seriously overlooked. Although it was much larger, the type and positioning didn’t make it look clickable. Here is the breakdown of click-throughs year over year.

The 52% decrease had us worried, so we knew we had to at least bring the link back to its normal state. It was a chilling reminder that with A/B testing it is important to change only one thing at a time. Otherwise you don’t know what outside consequences you might cause.

Current Design

The current design focuses on the upcoming semester a visitor would be applying for, which leads visually into the “Apply” wording. We also moved the “Giving” link back up to see if it made any difference. You can see the currently live design on the right. The results so far have been very good: the “Giving” menu item returned to its normal click-through rate, and overall we saw a 30% increase in click-throughs over the original design.

A breakdown of the clicks from Sept 13 – Dec 21, 2010 to Dec 22 – March 31, 2011 can be seen below. This is the previous 100 days and the 100 days before that.

Year over year increase

This change had us excited: it meant we were on the right path to enticing visitors to take action, and we had a way to track it. Looking deeper into our analytics, though, we realized we were on to something even larger. Below are the year-over-year stats for the “Apply” link and the new style. The comparison is a little harder because the single link was now split into two, but what we found was that in the same period a year prior, the “Apply” click-through rate had actually decreased 17%. That means that compared with the previous year, we increased the click-through rate by an astonishing 62%.
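Since the single link is now two links, a fair year-over-year comparison has to combine clicks on both new links before computing the change. A minimal sketch of that arithmetic, using made-up click counts (not our real figures):

```python
def pct_change(old, new):
    """Percentage change from old to new."""
    return (new - old) / old * 100

# Hypothetical click counts for illustration only.
last_year = 10_000            # the single "Apply" link a year ago
this_year = 7_000 + 6_000     # link now split: undergraduate + graduate

# Sum both new links before comparing against the old single link.
change = pct_change(last_year, this_year)
```

The same helper works for comparing any two periods, as long as the click totals on each side cover the same set of links.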

Below you can see a break down of the year over year change:

Looking to the future

We are now starting to plan our next revision of the “Apply Now” buttons to see if we can push that increase even further. Our next step is to clean up the information a little to make it even clearer. Here is an overview of the options we came up with: the leftmost image is the original version for reference, the second is the current design, and the rest are new.

We realized both undergraduate and graduate admissions promote the same semester at the same time, so we don’t need to display that label twice. Since we are combining the labels, we will probably need to remove the separator line so they share the same context. We haven’t decided 100% which option to implement, but we will be testing them when we do.

Lessons Learned

Making changes isn’t always going to improve results, and improving one result isn’t necessarily going to improve the entire system. We learned that it is important to stick to one change at a time, then measure and refine. Not all changes will be earth-shattering, and you have to accept that a change may impact the user experience in ways you didn’t intend.
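When measuring one change at a time, it also helps to check whether an observed click-through difference is real or just noise. A minimal sketch of a two-proportion z-test, one common way to do this (the numbers below are hypothetical, and this is not necessarily the tool the team used):

```python
import math

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Z-statistic for the difference between two click-through rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se

# Hypothetical counts: |z| > 1.96 indicates significance at the 95% level.
z = two_proportion_z(520, 40_000, 680, 40_000)
```

This also explains why gathering proper results takes time: with small view counts, the standard error is large and even a sizable CTR difference won’t clear the significance threshold.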

We also learned that gathering proper results takes time, at least a month to see a clear picture. It may be tough to put up with the opinions of a few people who don’t like what you are trying, but in the end it is all about how the change resonates with the end user. They are the only ones who matter.

Lastly, we learned it is important to try something. There are always opportunities for growth, and far too often we hear “let’s tackle that in the next redesign.” Decide on a micro goal, figure out a way to measure it, and implement it. It is better to have tried and failed than never to have tried in the first place.

It's a jungle out there, why we interact socially

Sometimes I get caught up coding something totally inspired by another project. It not only gives me an excuse to keep programming but also lets me stay flexible and create tools that add value to the university.

Today, 37signals announced they would start publicly displaying their “Smile Ratings”. That got me thinking: we already rate all our Twitter mentions as “Happy”, “Indifferent”, or “Sad” in much the same way, so why not display that information similarly?

So I took an hour and this is what I came up with:
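The post doesn’t include the script itself, but the core of a smile-ratings display is just aggregating rated mentions into percentages. A hypothetical sketch, assuming the ratings are available as a simple list of labels (the function name and sample data are mine, not from the original tool):

```python
from collections import Counter

def smile_breakdown(ratings):
    """Return each rating's share of total mentions, as percentages."""
    counts = Counter(ratings)
    total = len(ratings)
    return {mood: round(100 * n / total, 1) for mood, n in counts.items()}

# Hypothetical sample of rated Twitter mentions.
sample = ["Happy", "Happy", "Indifferent", "Sad", "Happy", "Indifferent"]
shares = smile_breakdown(sample)
```

The resulting dictionary maps directly onto a badge or bar-style widget, which is roughly the scale of a one-hour build like the one described.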
