[Friday Links] The Social Version

Our Friday web design links go web-only this week. Enjoy.

Wayne State goes Twittering

Branching out past the confines of our university homepage means working with public APIs and services. We started yesterday with Twitter, the first of many social and external sites to which we will be pushing information.

We chose Twitter mainly because of its simplicity. The idea of building a full-blown Facebook application with no street cred seemed unnerving to us. Sticking to short titles and links gives us the ability to talk about anything Wayne State without being locked into a certain user action or having to maintain a code base that maybe 10 people interact with.

Items will be posted as they come up; we are not setting a schedule or pattern. We don’t have any followers at the moment, but we have not announced the account’s existence anywhere other than here. We hope it will become a tool that people interested in the university can use to keep up with Wayne State while keeping up with their friends. We will also be posting deadlines and important dates as they approach.
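Since our posts are just short titles plus a link, the formatting logic is simple enough to sketch. This is a hypothetical helper, not our actual posting code; the 140-character cap is Twitter’s message limit:

```python
# Sketch: compose a tweet from a title and a link, respecting
# Twitter's 140-character limit. Hypothetical helper, not the
# code we actually run.
TWEET_LIMIT = 140

def compose_tweet(title, url):
    """Build 'title url', truncating the title with '...' if needed."""
    room = TWEET_LIMIT - len(url) - 1  # 1 character for the space
    if len(title) > room:
        title = title[:room - 3].rstrip() + '...'
    return title + ' ' + url

print(compose_tweet('Fall registration opens', 'http://wayne.edu/r/register'))
```

Anything longer than the limit gets its title trimmed, so the link always survives intact.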

Follow Us

Optimizing high traffic pages with web standards

The first days of a semester are always big for students and for hits to the Wayne State homepage: 40,500 visitors the first day and 36,500 the second.

We are lucky that our hosting environment is set up to handle this amount of traffic. That is not to say we don’t have to optimize the site; far from it. The inherited version of the homepage weighed in at 400k total, 25k just for the HTML, and over 40 http requests. It was a bad situation.

Recently we moved the homepage to the main server and rebuilt the site from scratch, giving us ~120k total weight, 6k of HTML and just 15 http requests (including Google Analytics). This optimization didn’t happen overnight; it took some time to get the numbers down this low. Here is what we did.

  1. Scrapped the original table based layout for CSS and XHTML.
    This cut the page weight by 70% right off the bat.

  2. Optimized all images and combined where necessary.
    Removed ~10 http requests and saved 50% in file size.

  3. Used Yahoo’s YSlow rules to give us a goal to reach.
    We knew it was not possible to get an “A” grade since we do not have a CDN, but we did our best in all the other categories.

  4. Combined and minified all JavaScript and CSS.
    Saved ~5 http requests and 40% in file size.

  5. Configured far-future Expires headers and ETags.
    A repeat visitor with a primed cache only needs to grab ~50kb and 2-3 http requests, depending on the rotating panel.
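Step 5 looks roughly like this in Apache terms. This is a sketch using mod_expires (assuming the module is enabled); the content types and lifetimes here are illustrative, not our exact configuration:

```apache
# Sketch: far-future Expires headers via mod_expires.
ExpiresActive On
ExpiresByType image/png  "access plus 1 year"
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType text/css   "access plus 1 year"
ExpiresByType application/x-javascript "access plus 1 year"

# Simplify ETags so they validate consistently.
FileETag MTime Size
```

With headers like these, a returning browser can reuse cached assets without even asking the server about them.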

Back-end changes also helped a lot with server load. A few things reduced the processing time for each http request.

  1. Removed all DB connections.
    All dynamic data is updated via a cron job that writes static HTML, which dramatically reduced the per-page-load impact on the server.

  2. Reduced the number of PHP includes.
    Combined as many files as possible and used full paths for includes to reduce the number of files the server has to access per page load.

  3. Took the homepage out of the PHP framework and wrote custom functions.
    This further reduced the overhead of each page load to just the essentials.
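The cron approach from step 1 can be sketched like this. It is a minimal Python illustration; our actual job is PHP and pulls from the real data sources, so the names here are hypothetical:

```python
# Sketch: a cron job renders dynamic data to a static HTML fragment
# once, so page loads just read a file instead of opening a DB
# connection. events() stands in for the real data source.
import os
import tempfile

def events():
    # Placeholder for the real query (DB, feed, etc.)
    return ['Classes begin', 'Registration deadline']

def render_fragment(path):
    items = ''.join('<li>%s</li>' % e for e in events())
    html = '<ul>' + items + '</ul>'
    # Write to a temp file and rename, so a page load never
    # sees a half-written fragment.
    tmp = path + '.tmp'
    with open(tmp, 'w') as f:
        f.write(html)
    os.replace(tmp, path)
    return html

fragment = os.path.join(tempfile.gettempdir(), 'events.html')
print(render_fragment(fragment))
```

The page itself then just includes the pre-rendered file, which is the cheapest possible work per request.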

Overall we have had a great response to the decreased load times and the standardization of the page. Response times during high-traffic periods like the start of classes and early registration have stayed so low that the complaints we used to receive about slow page loading have disappeared.

It took about two months to get everything in place and tested, but it was well worth it. We now have a standards-compliant site with very little impact on the server environment. Keeping it simple and eliminating any wasteful overhead definitely contributed to the project’s success.

Code Update: PHPSimpl 0.8.4

The newest version of PHPSimpl has been released and we have updated the code on the production web server. The release brings some much-needed features to the framework and a few bug fixes.

We know this is not a widely used framework; we take an active role in its development and feature set. (We also track its download rate :-X) But we use it for a few specific reasons, and knowing we are probably not the only ones in this situation, we have outlined them below.

  1. We work in a shared server environment where we don’t have root access to the server.
  2. We do not have direct control when the server software gets updated.
  3. We are called upon to produce sites or applications in a rapid manner.
  4. We control so many sites that installing the framework for each would be inefficient.
  5. We use the same functions and methods throughout many sites.
  6. We have more than one developer building and maintaining sites, consistency determines our efficiency.

The update notification can be found at the unofficial PHPSimpl site, and we have included some useful URLs below. We are always here to help, so if you have any questions about the framework or its implementation, just fire away.

Upcoming Feature: Auto Suggest

We are testing a new feature on the WSU homepage. It is designed to allow quick access to the most common WSU sites right from the search field.

It works by taking the characters typed into the search box on the homepage and comparing them with the site index. It finds the top 5 matches and displays them directly below the field, where you can scroll through the list with the keyboard arrows or click a link with the mouse. This gives direct access to these pages without having to sift through the search results page.
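The matching itself can be sketched in a few lines. The index entries below are made up for the example, and the production matcher is more forgiving than a plain prefix test:

```python
# Sketch: return the top 5 site-index entries matching what the
# user has typed so far. SITE_INDEX entries are hypothetical.
SITE_INDEX = [
    'Admissions', 'Advising', 'Athletics', 'Blackboard',
    'Bookstore', 'Calendar', 'Financial Aid', 'Registration',
]

def suggest(typed, index=SITE_INDEX, limit=5):
    typed = typed.lower()
    return [s for s in index if s.lower().startswith(typed)][:limit]

print(suggest('a'))
```

In the real feature this runs as the user types, with the results rendered under the search box.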

Testing this new feature is pretty easy. It is not enabled by default yet, but it will be soon. You can enable it for your computer by following the URL below:


It will stay active until you turn it off by following this URL:


We will be testing this feature for two weeks before rolling it out to everyone. We would love to hear your thoughts and any difficulties you have with the feature. You can send questions or comments to wcs@wayne.edu. We listen to all concerns and comments and are always looking to improve the user experience here at Wayne State University.

[Friday Links] The Writing Edition

Google Reader has been one of my best friends for a while now, and it’s a great way to keep up with industry leaders and trends. I subscribe to >100 feeds and scan them each day for interesting articles; I read a lot of them. :-/ Once in a while I come across a few worth sharing with the marketing staff to keep them up to speed with the industry.

I send them out on Fridays, and they go to developers, designers, writers and an AVP. Each week it seems more people want to be on the distribution list, so I decided to publish the list for all to read. Being Friday, it seemed like the best time to start, so here we go.

Redirection with ModRewrite and GET variables

Yesterday I started working on the redirects in the .htaccess for the homepage. The quick links on the homepage are redirected using ModRewrite, then pushed through a PHP file that logs the action. This lets us monitor their usage so we can make sure the best possible links are available in this drop-down. Here’s the code we were using in the .htaccess:

RewriteRule ^r/(.+)$ http://wayne.edu/r.php?url=$1

It was working well with normal URLs, but when a URL with a query string was passed, the query string was being truncated. That left the forwarding PHP with a URL missing its query string, which then forwarded to the wrong location. After looking deeper into the options available with ModRewrite, I found the [QSA] flag. It appends the original query string onto the rewritten URL, which let me modify the PHP file that manages the forwarding so it reattaches the missing query string to the end of the URL. Now the quick links forward properly, even with complicated URLs. Here’s that same RewriteRule with the flag:

RewriteRule ^r/(.+)$ http://wayne.edu/r.php?url=$1 [QSA]

The PHP file gained this code, which reattaches the query string to the end of the URL being passed in.

$url = $_GET['url'];
$pieces = array();
foreach($_GET as $key => $item){
  if($key != 'url'){
    $pieces[] = $key . ($item != '' ? '=' . $item : '');
  }
}
if(sizeof($pieces) > 0){
  $url .= '?' . implode('&', $pieces);
}

Getting down to business

Over the past two months we have been slowing down our client work in an effort to focus on the university’s main homepage and central university tools. It being the second day of work, things seem to be going as planned: we squashed a few bugs on the homepage and are discussing additional features for the child pages. All of the pages are essentially going to be mini portals, giving the user aggregated information relevant to the page’s content, kind of like Google AdSense but not for profit.

Cleaning up the sloppiness left over from 2007 is also underway. We are starting with our file server, which stores all the active projects, archives and tools. Archiving all the old material and updating all the current projects has been tedious but necessary.

In addition, our server move is almost complete; just two sites are left before cwis-1 is repurposed. Unfortunately they are high-traffic sites: the Public Relations site and the Board of Governors site. Getting those off the old server means we will no longer have to support two versions of our CMS and can stick to updating one in a single environment.

New year, New endeavors.

With the new year comes new endeavors. 2008 has started and we are back to work; here are a few things we are bringing with us.

  1. Welcoming Nick West to the team! Nick has worked as a student assistant for about a year and has been hired full time.
  2. A blog dedicated to the daily work of the web communications team at Wayne State University.
  3. The chance to be more open about what we do here and how we do it, turning the web development struggle into a resource for all to learn from.