Web Support Blog

January 27, 2007

Outsource systems, so you can put more focus on innovation

Filed under: General, Search, Web Analytics — Chris @ 10:51 pm

In a recent article, Jon Brodkin of NetworkWorld.com discusses a comment from Google’s Dave Girouard on how “insane” it is that companies spend 75% to 80% of their budgets on maintenance instead of innovation. Consider companies like Google, or even Amazon.com and eBay: they used innovative technologies to drive their businesses. So why don’t others learn from that?

In my current organization, we have solved part of this problem through outsourcing, while other systems struggle because too many owners are responsible for different parts of them. First, let me share two short examples of successful outsourcing – a web analytics tool and a customer survey tool.

We use Omniture’s SiteCatalyst as our tool of choice to perform web analytics. Omniture takes care of the data collection and management – an area that has been a problem in organizations that I have been a part of in the past. In my last organization, log files needed to be joined from the various proxy servers and made available to a WebTrends installation. Occasionally the IT organization would change something in the proxy servers, and we might lose a day or two of data (until the problem was found and resolved).

Of course we are not completely free of responsibility with SiteCatalyst. We are responsible for tagging our web pages, which then communicate web activity back to the Omniture environment. If we make a mistake in our tagging, or forget to add tags to a new web page, we miss that data. The other thing we do is send additional data to Omniture, via a batch load, that further classifies our data. For example, the page tag will send a product id; then, through the batch load, we send a product name, which makes reports much easier to create and read.
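To illustrate the batch-load idea, here is a minimal sketch of building the kind of classification file that maps a raw page-tag value (a product id) to a friendlier label (a product name) for reporting. The product ids, names, and the plain tab-delimited layout are made up for illustration; the vendor's actual upload format differs.

```python
import csv

# Hypothetical product catalog; in practice this would come from an
# internal database rather than being hard-coded.
PRODUCTS = {
    "P-1001": "Laser Printer 5000",
    "P-1002": "Inkjet Printer 200",
    "P-1003": "All-in-One Scanner",
}

def write_classification_file(path, products):
    """Write a tab-delimited key/name mapping for the batch load.

    Illustrates classifying a raw page-tag value (product id) with a
    friendlier label (product name); not the vendor's real format.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f, delimiter="\t")
        writer.writerow(["product_id", "product_name"])
        for pid, name in sorted(products.items()):
            writer.writerow([pid, name])

write_classification_file("product_classifications.tsv", PRODUCTS)
```

The point is simply that the page tag stays small (an id), while the richer descriptive data arrives separately in bulk.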

The other outsourced example, our survey tool, is similar to Omniture SiteCatalyst, as the vendor manages the data for us. In both cases, we free up resources that would otherwise have to manage the data and the servers that host it. Of course in both cases, there are times where we want to join that data with data from other systems. Depending on the complexity, we can either load data into the vendor system, or extract it from the vendor system to ours. We typically would only do that for ad-hoc data analysis, as routine reporting has the required data transfers automated.

So that’s the good news. But when it comes to search, it is quite the opposite. First, you could outsource your search engine, your entire web site, or perhaps just the knowledge management portion. For example, RightNow Technologies provides the knowledge base infrastructure for companies such as IBM, Nvidia, and RealNetworks.

In my organization, and in many others, search itself takes many different specialized skills to make it work.

  1. Managing the network and security access
  2. Maintaining the OS on the servers
  3. Maintaining the search application, including indexing, and its custom code
  4. Developing and maintaining the user interface, including passing the appropriate data to the engine
  5. Developing and managing the ontology and dictionary rules
  6. Maintaining content (e.g., ensuring it has the appropriate meta tags)

If we were to outsource, we could have someone else worry about the first three (1 – 3), but the last three (4 – 6) will always require people from our organization (4 could be outsourced, if we outsourced the web site development and maintenance). The missing element on the list is someone to provide oversight; someone to keep all the various parties in contact with each other. The same resource would also be required if you outsource, though again the first three would be replaced by managing a vendor relationship.

So whether we are talking about outsourcing security, as Mr. Brodkin mentions in his article, or applications within your environment, your company can free up resources for innovation, while paying experts to maintain a system for you.


January 15, 2007

Web Analyst Recommendation

Filed under: Web Analytics — Chris @ 12:56 pm

My organization has been working with a Web Analyst in order to accelerate learning that otherwise would be very time consuming. Like many organizations, we do not have a mature toolset (it is coming), so understanding what the data is telling us is difficult.

Based on work she did with us about a year ago, we asked Debora Geary of Fireweed Analytics to come back and work with us. Debora was able to merge and join disparate data sources and uncover some valuable information. For example, we could not identify visitors who viewed our site without logging in — she matched IP addresses with previous logins and helped us realize that most of those visitors had logged in previously.
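For those curious what that kind of matching looks like, here is a minimal sketch. The IP addresses, field names, and record layout are all hypothetical, and a real analysis would of course be messier (shared and dynamic IPs, for a start).

```python
# Match the IP addresses of anonymous (not-logged-in) visits against
# IPs seen in earlier logged-in sessions.

def classify_anonymous_visits(anonymous_ips, login_records):
    """Split anonymous visit IPs into previously-logged-in vs. new."""
    known_ips = {rec["ip"] for rec in login_records}
    returning = [ip for ip in anonymous_ips if ip in known_ips]
    new = [ip for ip in anonymous_ips if ip not in known_ips]
    return returning, new

# Made-up sample data.
logins = [{"ip": "10.0.0.5", "user": "alice"},
          {"ip": "10.0.0.9", "user": "bob"}]
anon = ["10.0.0.5", "10.0.0.7", "10.0.0.9"]

returning, new = classify_anonymous_visits(anon, logins)
# Here, two of the three anonymous visits match a previous login.
```

The design choice is the same one Debora made: join two data sources on the only key they share, then reason about the overlap.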

We also learned that most of the users who did not log in had run a search, but never clicked through on any of the results. Another valuable exercise was joining our customer survey data with our web data. Not surprisingly, the intentions reported in the survey did not always align with behavior on the site.

Don’t just take my word for it: read Debora’s recent article Warming Up to Analytics. This five-part series, which she completed on January 4th, 2007, provides a lot of insight into how she approaches her work. You can also catch her posting on the Web Analytics Forum on Yahoo.

So unless you are already an expert in the web analytics field, give Debora a call — you will be an expert when she’s done.

October 11, 2006

Web Analytics Wednesday | Web Analytics Demystified

Filed under: Web Analytics — Chris @ 8:41 pm

I had a great opportunity tonight to meet Eric T. Peterson, the author of Web Analytics Demystified, and several other web analytics colleagues. Eric has coordinated Web Analytics Wednesday all over the globe — this is a time for people to meet locally and discuss web analytics. I am just fortunate enough to live in Portland where Eric, WebTrends, and many Intel folks do.

I did not meet anyone with customer support experience like myself, but I definitely met many smart and enjoyable people. I was very impressed with how welcome I felt, and how engaged everyone was in conversation. Kudos to Eric for putting this together — I wish I had attended one of these earlier. Be sure to check out Web Analytics Wednesday in your area.

Oh, and I got a copy of Web Analytics Demystified, which I will read and give a review here in the future. I also happened to order another of Eric’s books today, Web Site Measurement Hacks — looks like I have some reading to do.

Content Effectiveness and Functionalism: Update and Misc.

Filed under: Content Effectiveness, General, Web Analytics — Chris @ 4:43 am

Since I posted my whitepaper on Content Effectiveness and Functionalism, I have been working through the next step: applying it. In doing so, I discovered that I had not always applied the right terms in some cases. The good news is that with Functionalism, the terms are not the important part; the methodology is. So I have updated the paper with a slight variation to a few of the terms, but the KPIs did not change. If you go to the original post, you will get the updated version, or you can get it here.

As I said, the key to Functionalism is the methodology, not the terms. In evaluating your site, whether you are focusing on content effectiveness like me or on another aspect, consider the purpose of each page in your site. A common method is to use a funnel and a fallout report. When you examine your fallout report, what actions do you take? Well, if you used Functionalism, you would have KPIs for the page where you lost your users, and therefore you would have some idea as to where the page was failing.

———————————————————————

On a completely different subject, this weekend I posted on my other blog two keyboard shortcut guides. There is one guide for Windows XP Keyboard Shortcuts and the other is for Firefox Keyboard Shortcuts. The guides can be printed double-sided and folded in three (tri-fold), and they make a handy desk reference until you learn the shortcuts. Follow the links above to get your own copy or go to my blog Skimming the Cream Off the Top.

October 5, 2006

Assumption: The Content Exit Page

Filed under: Content Effectiveness, Web Analytics — Chris @ 4:55 pm

I made some assumptions in my whitepaper on Content Effectiveness and Functionalism, so I want to discuss them over the next few weeks. The first assumption is that most users of your site will exit after reading (or viewing) the information they need, or when they give up. This assumption means that the best way to validate whether your content was effective is to look at what the customer did after reading it. This is a model I have followed for years, and it was recently supported by Elliott Masie.

As Mr. Masie pointed out in his discussion of Fingertip Knowledge (see my post: Fingertip Knowledge: Learning-on-Demand), people are more and more learning as they need it. They are not reading ahead and learning what they may need later; rather, they are searching for an answer once they run into an issue. Likewise, once they find the answer, they go back to their original task. Take for example a user who sends a document to their laser printer, and it jams. The user was not trying to learn all about the printer and specifically about jams; the user was working on a document and wanted a printed version. Therefore the user will go to a support site, learn how to correct the jam, and then return to their real work.

You can apply this same concept to other support situations too. Take the example of a user who is writing code and runs into a problem — perhaps a syntax issue. He or she goes to Google, enters related terms, and filters through the results. Once the user finds the answer, he or she does not continue this process; the user returns to the original task of writing code. So, from the perspective of the support site, the last transaction with the user is reading a piece of content.

If you are still with me, it is easy to then apply Functionalism to measure how effective an existing piece of content is (our Explainer/Converter page). Divide the number of exits from the content by the number of times the content has been viewed, and multiply by 100 to get a percentage. The higher the percentage, the more effective the content. For example, if you have 1000 views and 10 exits, the rating is 1%; likewise, 1000 views with 100 exits is 10%. This is an easy way to identify valuable content. Remember to also use the Exit Propensity concept: look across all your content and identify your worst offenders. I would encourage you to consider some weighting too — i.e., apply the formula to your most frequently viewed content instead of all your content.
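Here is a quick sketch of that rating in code. The content ids, view and exit counts, and the minimum-view threshold are all made-up values for illustration; the weighting idea is implemented simply by skipping rarely viewed content.

```python
def effectiveness(views, exits):
    """Exits / views * 100 -- higher means more users left satisfied."""
    return exits / views * 100

# Hypothetical per-content statistics.
content_stats = {
    "fix-paper-jam":   {"views": 1000, "exits": 100},
    "install-driver":  {"views": 1000, "exits": 10},
    "rare-error-code": {"views": 20,   "exits": 1},  # too few views
}

MIN_VIEWS = 100  # weighting: only rate frequently viewed content

ratings = {
    cid: effectiveness(s["views"], s["exits"])
    for cid, s in content_stats.items()
    if s["views"] >= MIN_VIEWS
}

# Exit Propensity in practice: worst offenders (lowest rating) first.
worst_first = sorted(ratings, key=ratings.get)
```

Sorting ascending surfaces the least effective content, which is where the improvement effort should go.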

Back to our formula (Exits / Page Views * 100): if the percentage is low, then you need to look at other measures. First, consider the causes. If users are viewing the content, the title and description were likely compelling enough for them to click through. So I would consider these possible causes:

  1. Mismatch between title and description and the actual content
  2. Content is outdated
  3. Content is incomplete
  4. Content is not written at the users’ level, or is too complex

All of these symptoms are difficult to diagnose without having an expert evaluate the content — which can be expensive. Therefore, if you have a lot of content like this, it may pay off to look at your creation process. (We will save this discussion for another time.) That being said, there are some clues to look for. If users refine searches and look at other content on your site after viewing this content, it may be a sign of a mismatch or incomplete content. If users tend to spend a long time with the content, then it is likely at the wrong level or too complex. Through process of elimination, you can conclude that the remaining content likely fits the outdated category.
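For illustration only, these rules of thumb could be sketched as a simple classifier. The signal names and the five-minute threshold are my own assumptions, not measured values; a real diagnosis would weigh more evidence than this.

```python
def diagnose(refined_search_after, viewed_more_content_after,
             avg_time_on_page_sec):
    """Suggest a likely cause for an ineffective piece of content.

    Signals (all hypothetical):
      refined_search_after      -- users searched again after viewing
      viewed_more_content_after -- users kept browsing other content
      avg_time_on_page_sec      -- average dwell time on the content
    """
    if refined_search_after or viewed_more_content_after:
        return "mismatch or incomplete content"
    if avg_time_on_page_sec > 300:  # arbitrary 5-minute threshold
        return "wrong level or too complex"
    # Process of elimination: nothing else points anywhere.
    return "possibly outdated"
```

The "possibly outdated" fallback mirrors the process-of-elimination argument above: it is the diagnosis you reach when the behavioral signals are silent.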

Enough for today… I think this gives you a lot to consider about how this could fit into your organization. Again, I will continue to address the assumptions in the whitepaper over the next few weeks. Later, I will dig deeper into the analysis and diagnostic issues, so we can make corrections based on what the data is telling us.

October 1, 2006

Content Effectiveness and Functionalism

Filed under: Content Effectiveness, Web Analytics — Chris @ 9:07 pm

My entire career has been in customer support, with the last 11 years specifically in the web support space. Naturally everyone has asked me, “How do you identify a web site fix as an equivalent to a phone fix?” When I tell them, “Unless the user tells you that your site fixed their problem, you really cannot tell,” the next question is, “Then how do you know the site is working?” Well, it takes some assumptions: (1) if your content is right (timely, accurate, and relevant), then your users are solving their problems. And (2) customers will leave when they have solved their problem or have gotten tired of looking for the solution. If you expand on assumption (2) by watching where customers exit your site, the number of searches made, and the content pages viewed, you can get a really good picture of what is working and what is not.

With these assumptions in mind, I set out to try and identify the content effectiveness on my site – as that is what drives customer success. In this quest, I looked for others who are already doing this, others who have a different view, or others who have a solution for identifying a self-support success. It seems the web analytics vendors have yet to work in this space and the call center organizations are still stuck on trying to find the phone fix equivalent. In other words, I have not been successful in my quest.

Recently though I found a whitepaper, Functionalism: A New Approach to Web Analytics on Gary Angel’s blog, SEMAngel. Gary is the President of SEMphonic, a company that has over 10 years of experience in the analytics space. To make a long story short, I was able to take the methodology from the whitepaper and apply it to content effectiveness. The whitepaper I wrote, Content Effectiveness and Functionalism, discusses how you can use Functionalism to identify the health of your content, and where to focus your efforts to improve your content (its effectiveness).

Below is a short list of basic assumptions I made in writing the paper.

1. Your support set is segmented by product. (It can also be segmented by language and localized by region.)
2. Your primary support site goal is to provide self-support.
3. The majority of your users take the Fingertip Knowledge approach. (Most users use your site to solve an immediate issue or problem, not for proactive learning.)

In future posts I will discuss the methodology in further detail, answer questions, and drill down to the next level of the methodology.

September 22, 2006

Fingertip Knowledge: Learning-on-Demand

Filed under: Learning, Search, Web Analytics — Chris @ 5:48 am

I recently listened to a podcast from Elliott Masie, Fingertip Knowledge: Learning in a “Flatter” World. Mr. Masie introduces a concept called fingertip knowledge. He uses it in the context of learning, but it is a concept I recognize — I have called it learning-on-demand. It is the idea that I won’t try to learn everything I might need to know; I will use Google to find what I need, when I need it.

This also translates to our support web sites. As I have indicated in my prior blogs, users will come to your site to learn a specific piece of information to solve a problem. Your users are employing the concept of fingertip knowledge more and more every day.

This idea is why I suggest that the best way to determine success on your web support site is to look for the users who left your site immediately after reading a content (knowledge) item. Once you have fulfilled their learning-on-demand need, they will return to their work. Remember, as a provider of support information, customers come to your site because your product or service failed — once the problem is solved, they will go back to using your product to complete their real work.

September 21, 2006

Measuring Web Support Success, part 2

Filed under: Web Analytics — Chris @ 7:59 pm

As I mentioned previously, you need to think about the delivery of your content as much as the content itself. Moving on from that rather basic notion, consider “What is the purpose of your customers’ visit?” and “Why are they coming to your site?” The likely answer is that they had a problem, and they need your help to solve it.

Now, some customers will know, or believe they know, the answer — they just need the latest driver or patch. Their behavior may look a little different, but the intention is still the same: solve their problem. Excluding customers who come to your support site to learn (which will vary by product and industry), the ideal support experience is straightforward, and therefore its success is easy to measure.

If I were selling a product or service, once I collected the customer’s credit card number, I would count that as a success. In the support world, once your customer reads the answer to his or her problem, you have been successful — a conversion. If you buy into that idea, then this should be fairly straightforward: the last content item your customer read on your site must have been the solution to his or her problem. Why else would they keep looking for an answer if they had already found it? (Continuing to browse could be the learning exception.)

So, assuming you have a low rate of customers who visit your site purely to browse and learn, you can conclude that the last item viewed was the conversion content item. Okay, but what if the customer never found a solution, but looked at many content items? Again, what was the last item viewed on your site? A content item? A search result page? A page to route to additional content? If the last page viewed on your site was a content item, then I would say your customer very likely solved their problem. If the last page was anything but a content page, then I would say they did not.
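The last-page heuristic is simple enough to sketch in a few lines. The page-type labels are hypothetical; in a real analytics tool you would map each URL to a page type before applying the rule.

```python
# Pages that count as content (knowledge) items -- illustrative labels.
CONTENT_PAGES = {"article", "faq", "knowledge_item"}

def visit_succeeded(page_types_in_order):
    """True if the last page viewed in the session was a content page."""
    if not page_types_in_order:
        return False
    return page_types_in_order[-1] in CONTENT_PAGES

# A visitor who searched, browsed results, and exited on an article:
assert visit_succeeded(["home", "search_results", "article"])
# A visitor who gave up on a search results page:
assert not visit_succeeded(["home", "search_results"])
```

Applied across all sessions, this one rule yields the success rate the post argues for, with the stated caveat about pure browsers and learners.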

What do you think? I will continue to expand on this subject in future blogs…

September 18, 2006

Measuring Web Support Success

Filed under: Web Analytics — Chris @ 4:13 am

How does your company know if its web support is successful? Do more visitors mean it is better? Can you correlate increased web success to a reduction in support calls to your call center? Most companies do not know if their web support is successful, nor can they correlate it to a reduction in support calls.

I have recently started a new assignment in which I will define our organization’s content effectiveness strategy. The basic idea is that better content = more successful customers. When I am done, we should be able to identify our content health and which parts of the content need the most attention.

There are two points worth expanding on from the prior paragraph: content health and which part of the content needs the most attention. First of all, content health should put information in the hands of managers who need to decide how to best allocate precious resources. Today, most organizations only know how to answer the phone — it is the only area where they can measure the value. These managers need more information so they understand when it is better to invest in content and when it is okay not to.

Hopefully, the second part is obvious. The fact is, there is more to support content than just writing a few articles or FAQs. Just remember this for now — no matter how good the content is, if users cannot find it, then it is useless.
