Posts Tagged ‘Good Web Practices’

Fun With Web Metrics

September 13th, 2010

My friend Bob has put up a blog focusing on issues of Web Metrics, and he's got some interesting things to say: "Conventional Wisdom Frequently Isn't".

Central to that post, and touched on in several of his other posts, is the notion that the metrics people so often think are universally important are often antithetical to what the users of an individual site actually want.

So, if a site is chasing the wrong metric, how successful is it in meeting a need? If the site doesn't meet users' needs, they won't be back; they won't link to you, recommend you, or buy things from you or your advertisers. That assumes the goals are commercial, of course. If they're non-commercial, how good a service is it? How likely are donations?

Bob’s insights into metrics are a useful read.

Go…shoo…read…bring back some Ben and Jerry’s when you’re done. 😉


‘Tweet This’ Installed After Review

September 11th, 2010

Tweet This seems a much more civilized way to let users Tweet and Facebook your WordPress posts than Share This, which includes a lot of tracking functionality.

Again, while I can't claim to be a highly trained codemonkey, I have learned from my very quickly fixed (less than an hour!) faux pas with Share This and have now added Tweet This to this site.

After reviewing the code, I found that Tweet This appears to send nothing to or via any third-party sites and, unless configured to use a third-party URL shortener, includes no traffic logging, click-through tracking, or similar functionality. It doesn't appear to load any external .js either.

So, unless somebody educates me to the contrary, I’m going to call this ‘safe’ and leave it deployed. I hope you find it useful.

– Jon


URL shorteners, the problems, the value, a solution?

September 10th, 2010

UPDATE: Gruber explains his method and identifies a pitfall: "What I didn't foresee was the tremendous amount of software out there that does not properly parse non-ASCII characters in URLs, particularly IDN domain names." In retrospect, I should have been more curious and tested his cagey domain-name branding. It worked for me, so I just assumed. I would have tested it had I implemented it, but as a user it worked, so what did I care? Anyway, Gruber explains more here:

****UPDATE AGAIN 10.6.10**** Worth a read, but I distill it thus:

Don’t rely on a business relationship where you don’t have the support of a legal system you can participate in. Obviously, we live in a global economy so what matters is the word ‘rely’. Participation != reliance.


URL shorteners are extremely common tools used to make links shorter, better able to fit within artificially constrained text fields, and, in some cases, more human-readable. They can also have other, arguably indirect, benefits in some cases, including click-through tracking.

One major potential problem, of course, is link rot, which can be a staggering problem if your chosen provider folds up shop entirely.

One common inducement to shorten URLs is Twitter and the 140-character limit on the 'microblogging' site's posts. While an argument can be made that imposing a 140-character limit enforces a tone, there's no logical reason Twitter couldn't accept URLs shortened by the most basic method, an embedded HTML link: an HREF with your own descriptive text. Twitter would probably claim that disallowing this is a security advantage, but such a claim would ring hollow, since URL shorteners themselves obfuscate the real target of a link in a way that's potentially worse.
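For reference, the 'most basic method' above is just a plain anchor tag; this hypothetical example (made-up URL and text) shows how descriptive text stands in for a long address:

```html
<!-- The reader sees "the full write-up"; the browser status bar shows the real target. -->
<a href="https://example.com/2010/09/some/very/long/article-path">the full write-up</a>
```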

By now, some of the problems with these tools should be obvious:

  • The provider of the short-to-long translation can fold up shop, instantly rotting all your links.
  • The user can't hover over the link and look in the browser's status bar to see what the actual target is.
  • Users seeing different shortened versions of the same link don't get the visual cue of the 'followed' color change in their browser for links they've already seen.
  • They do nothing to encourage site maintainers to strive for more accessible, better-indexable, human-readable URL schemes.
  • They insert yet another intermediary able to track users' browsing habits.
  • They are, in many cases, unnecessary layers of complexity.
  • They can pose problems for content publishers wanting to be methodical about managing their own link rot. (Do you try to 'fix' expired shortened URLs? Can you, when you don't control them?)
  • They have SEO implications.
  • If you need to shorten URLs on your intranet for, say, easier tracking, you wouldn't want to expose (or possibly couldn't expose) the URLs to a cloud-based provider, for security reasons.

So, let’s for the moment, set aside all the good reasons we should be hammering on Facebook, Twitter and others to exclude the characters used for posting HREFs and the links themselves from the character counts and accept, albeit grudgingly, that there could be good reasons to implement URL shortening mechanisms. How should we do it?

We should be hosting our own URL shortening mechanisms on our own sites.

John Gruber of Daring Fireball has this down to a science. If you follow Daring Fireball on Twitter, you'll notice every link he posts looks like http://✪df.ws/… These links are URLs he's shortened on a server hosted in the .ws domain, and they hit short tease pages in a directory called /linked. His little tease pages don't always do the best job of telling the user what they'll get with a click, but that's part of Gruber's editorial voice and is clearly deliberate. The mechanics of the user experience, however, are extremely solid:

  1. The shortened URL appears in the Twitter feed (or on another microblogging site, or via another method).
  2. The shortened URL is uniquely branded as a Daring Fireball link (issues with scaling his branding method notwithstanding).
  3. The user chooses to follow the link based on other content in the 'tweet'.
  4. The user is presented with a 'tease page' on Daring Fireball.
  5. The user can, unless Gruber is being cute on his tease page, make an informed choice about whether to fully follow the link.
  6. Gruber can track the response to the tweet and, if he wanted to be a right bastard about it, could send outgoing clicks from his tease page through a click-through server script.
  7. Gruber can scan the contents of his /linked directory to manage link rot on his own site.
  8. If he's methodical about keeping track of which URLs he posts (and I bet he is), basic log analysis of referrers on requests to files in his /linked directory will tell him where he's picking up traffic. Some more complicated analytics could tell him where people have propagated his links.

I happen to usually agree with Gruber, always enjoy reading his stuff, and, frankly, take a certain vicarious pleasure in his occasional willingness to be a bit brash. I also usually like the results when he gets cute on his tease pages. But no matter what you think of Gruber and Daring Fireball, his method is the basis of a really solid approach to managing a site and integrating it with Twitter, Facebook, and other mechanisms of 'syndication'. It puts him back in a position where he is editorially engaging with his users on his own site, after using the visibility he earned elsewhere.

It's pretty slick and, done methodically with a transparent editorial policy, this approach could do a lot to fix what's wrong with the current willy-nilly grabs for presence on Twitter, Facebook, etc. by content companies who are now, almost always, just diluting their own brands. I do bristle a bit that Gruber hosts his URL shortener in a .ws domain (the country-code domain for Western Samoa; I suspect it's because it was an easy way for him to register a domain branded with his "✪" character, UTF-8 E2 9C AA, in his very short '✪df' domain name, but I don't actually know why he chose .ws). That's just my old-school leanings when it comes to top-level domains, though.

So, the solution (and maybe Gruber's already built it, or maybe it's one of these: 10 Free Scripts to Create Your Own URL Shortening Service) is to host your own URL-shortening mechanism on your own site. You can implement your shortened URLs using "The Full Gruber" and, with some good practices of your own, (mostly) mitigate every last one of the problems cited above, except maybe some of the SEO issues. Those you could work around with other architecture and editorial choices, and they would be offset by traffic to, and the well-organized placement of, your tease pages.

No, of course, the Gruberization of link-shortening practice isn't nearly as clean, consistent, and user-friendly as actual direct links to content with meaningful descriptions, but, since we've let Twitter, Facebook, etc. co-opt our control of our own presence on the net, for now this is a more than decent workaround.

Now, as usual, the comments I get will say "But Jon, it's not easy or convenient enough," and, for most people, yeah, that's probably true. Hell, I expect to bleed from the forehead trying to work out how to automate it, or I'll just end up stupidly editing my .htaccess file by hand for every link and maintaining some static structured reference file manually, or I'll find somebody to help me set it up here, because this is hardly turn-key. The point is: what should a smart company or organization do to maintain the best, most direct connection to their audience, rather than (usually self-destructively) trying to exploit Twitter, Facebook, etc. to attract one?
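For what it's worth, the "stupidly manual" .htaccess version really is just one line per posted link. A hypothetical sketch, assuming Apache with mod_alias and made-up slugs and paths:

```apache
# Hand-maintained short links: one line per posted URL.
# /s/abc12 is the short path; the tease page lives under /linked/.
# "temp" issues a 302, so the short path can be repointed later.
Redirect temp /s/abc12 /linked/abc12.html
Redirect temp /s/def34 /linked/def34.html
```

Tedious, yes, but every piece stays under your own roof, which is the whole point.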

Ok, so, who wants to work with me for no money to help me set all this up for my sites and publish a how-to and/or open source package that we can convince DreamHost to make a ‘One Click Install’?

Anybody feeling that charitable? 😉

– Jon
