Nik's Technology Blog

Travels through programming, networks, and computers

SEO: Eleven Link Building Recommendations

A major and time-consuming part of search engine optimisation (SEO) is link building. If you own a website of any sort, you’ve undoubtedly received spam emails from dubious companies offering to provide you with SEO services.

Assuming you decide to outsource your SEO activities and you’ve chosen a reputable company to provide this service (preferably one that hasn’t spammed you), how do you ensure the link building effort will pay off and give you greater prominence in the top search engines?

Here’s a list of eleven link building recommendations.

  1. Only get links from similarly themed organisations and websites
  2. Make the anchor text of the link relevant to the landing page and vary it on each link obtained, e.g. don’t use your website name as the link text every time
  3. Try to get deep links into your website, not just links to your homepage
  4. Favour text links over image links
  5. Avoid getting links on purpose-built 'link' pages
  6. Avoid reciprocal link schemes
  7. Avoid black-hat techniques such as hidden links or links from sites built specifically for gaining PageRank
  8. Favour links from pages with decent PageRank
  9. Remember that links carrying rel="nofollow" do not count towards inbound links (see the sketch after this list)
  10. Make sure pages that contain your link aren’t excluded via robots.txt or robots meta tags
  11. Avoid links which look like they have been bought, as Google can penalise these
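
On point 9, it's worth checking how pages actually link to you. Here's a minimal jQuery sketch you could run in a browser console on a page that links to your site; jQuery being present on that page and the example.com domain are both assumptions:

```javascript
// List every link to your domain and whether it is nofollow'd.
// "example.com" is a placeholder for your own domain.
$('a[href*="example.com"]').each(function () {
  var rel = $(this).attr('rel') || '';
  console.log(this.href, /nofollow/i.test(rel) ? 'nofollow (passes no PageRank)' : 'followed');
});
```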

 

Thanks to Adam for his help putting this list together.  Image used under Creative Commons by saintbob.


Learning jQuery 1.3 - Book Review

My first exposure to jQuery was using other developers' plugins to create animation effects such as sliders and accordion menus.
The highly refactored and compressed production code isn't the easiest to read and understand, especially if you want to alter the code to any great extent.
After reading a few tutorials, I thought I'd buy a book and get more involved with the jQuery library.

As an ASP.NET developer used to coding with IntelliSense, I was pleased to find that jQuery has been incorporated into Visual Studio to make development easier.
I browsed through the jQuery books on Amazon and opted to buy "Learning jQuery 1.3" by Jonathan Chaffer and Karl Swedberg after reading the user reviews.

I've now read most of the book and can highly recommend it.  The book assumes the reader has good HTML and CSS knowledge as well as familiarity with JavaScript and the DOM, but this enables it to move quickly on to doing useful, everyday tasks with jQuery.

The first six chapters of the book explore the jQuery library in a series of tutorials and examples focusing on core jQuery components.  Chapters 7 to 9 look at real-world problems and show how jQuery can provide solutions to them, and the final two chapters cover using and developing jQuery plugins.

Web developers should be aware of the web accessibility and SEO issues that come with client-side scripting, and it is good to see the book highlighting the concepts of progressive enhancement and graceful degradation where appropriate.

"the inherent danger in making certain functionality, visual appeal, or textual information available only to those with web browsers capable of (and enabled for) using JavaScript.  Important information should be accessible to all, not just people who happen to be using the right software." - Learning jQuery 1.3,  page 94

After a brief introduction to the world of jQuery, what it does and how it came about, the book moves quickly on to selectors, which are a fundamental part of how jQuery selects elements from the DOM.  It also covers jQuery's chaining capability, which, coming from other programming languages, looks odd at the outset but quickly proves to be a very powerful technique.
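
As a quick illustration of selectors and chaining (the element IDs and class names here are my own assumptions, not the book's):

```javascript
// Select list items with class "special" inside #menu, then chain
// further methods: each call returns the same jQuery object.
$('#menu li.special')
  .addClass('highlight')
  .fadeIn('slow');
```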

The authors then move on to talk about events.  What I particularly like about the way jQuery handles events is that the behavioural code can be cleanly separated from the HTML mark-up, without having to litter tags with onclick and onload attributes.

The examples show how to add functionality on top of your HTML by binding events to elements on the page, which when triggered cause jQuery to modify the HTML and bring the page to life.  Techniques are introduced by example, then slowly refactored and improved while new jQuery methods are introduced along the way, which makes them a breeze to follow and learn.
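
A minimal sketch of this unobtrusive style, with hypothetical element IDs:

```javascript
// No onclick attribute in the HTML; the behaviour is bound here instead.
$(document).ready(function () {
  $('#toggle-button').click(function () {
    $('#details').toggle(); // show or hide the details panel
    return false;           // suppress the default link behaviour
  });
});
```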

The fourth chapter covers effects such as fading in and out and custom animations, and jumps straight in with a useful example of how text size can be increased on the fly for ease of reading.  The chapter's introduction also makes an important point about the usability of effects.

jQuery effects "can also provide important usability enhancements that help orient the user when there is some change on a page (especially common in AJAX applications)."- Learning jQuery 1.3,  page 67
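
The text-resizing example works along these lines; this is my own sketch with assumed element IDs and scaling factor, not the book's code:

```javascript
// Grow the article text by 20% each time the button is clicked.
$('#text-larger').click(function () {
  var size = parseFloat($('#article').css('fontSize')); // e.g. "16px" -> 16
  $('#article').css('fontSize', size * 1.2 + 'px');
});
```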

Chapter 5 is all about DOM manipulation and covers jQuery's many insertion methods, such as copying and cloning parts of the page, which it demonstrates with another useful example: dynamically creating CSS-styled pull quotes from a page of text to attract a reader's attention.
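
The pull-quote technique boils down to cloning text into a new, styled element; a rough sketch (the class names are assumptions):

```javascript
// Copy the text of each marked span into a floated pull-quote box
// inserted just before the original element.
$('span.pullable').each(function () {
  var quote = $('<div class="pull-quote"></div>').text($(this).text());
  $(this).before(quote);
});
```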

AJAX is the next topic, which interested me enough to create a little tool that loads an XML RSS feed and creates a blog category list from the data.
The chapter covers the various options for loading partial data from the server, including appending a snippet of HTML into the page, JSON and XML, and how to choose the most appropriate method.
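
My category-list tool followed the XML route, which looks roughly like this (the feed URL and target list are assumptions):

```javascript
// Fetch an RSS feed and append each item's title to a list.
$.ajax({
  url: 'blog-feed.xml',
  dataType: 'xml',
  success: function (xml) {
    $(xml).find('item').each(function () {
      var title = $(this).find('title').text();
      $('#categories').append($('<li></li>').text(title));
    });
  }
});
```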

Table manipulation is next on the agenda: the book discusses how to sort table data without page refreshes using AJAX, as well as client-side sorting, filtering and pagination.
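
Client-side sorting reduces to pulling the rows out, sorting them in JavaScript, and re-appending them; a minimal sketch with an assumed table structure:

```javascript
// Clicking a header sorts the table's rows by that column's text.
$('#data th').click(function () {
  var column = $('#data th').index(this);
  var rows = $('#data tbody tr').get();
  rows.sort(function (a, b) {
    var keyA = $(a).children('td').eq(column).text();
    var keyB = $(b).children('td').eq(column).text();
    return keyA < keyB ? -1 : keyA > keyB ? 1 : 0;
  });
  $.each(rows, function (index, row) {
    $('#data tbody').append(row); // re-append rows in sorted order
  });
});
```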

Chapter 8 delves into forms, using progressive enhancement to improve their appearance and behaviour.  It also covers AJAX auto-completion, as well as taking an in-depth look at shopping carts.
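
The progressive-enhancement idea is to layer client-side checks on top of a form that still validates on the server; a sketch with assumed IDs and a deliberately simple email pattern:

```javascript
// Block submission with an inline message when the email looks invalid;
// the server-side validation remains the real gatekeeper.
$('#contact-form').submit(function () {
  var email = $.trim($('#email').val());
  if (!/^\S+@\S+\.\S+$/.test(email)) {
    $('#email-error').text('Please enter a valid email address.').show();
    return false;
  }
  return true;
});
```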

Shufflers and rotators are next; the book starts out by building a headline news-feed rotator which gets its headlines from an RSS feed, as typically used by blogs.  It also covers carousels, image shufflers and image enlargement.
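
At its core a rotator is just a timer fading between items; a bare-bones sketch (the list ID and timing are assumptions, and all but the first headline are presumed hidden initially):

```javascript
// Fade out the current headline, then fade in the next, every 5 seconds.
var current = 0;
setInterval(function () {
  var headlines = $('#headlines li');
  headlines.eq(current).fadeOut('fast', function () {
    current = (current + 1) % headlines.length;
    headlines.eq(current).fadeIn('fast');
  });
}, 5000);
```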

Chapters 10 and 11 examine the plugin architecture of jQuery and demonstrate how to use plugins and build your own.  I successfully produced my first jQuery plugin from reading this book.  You can check out my tag cloud plugin and read about how I originally built it before turning it into a plugin that other developers can use.
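
The basic plugin pattern the book teaches looks like this; the plugin name and behaviour here are my own toy example:

```javascript
// A minimal, chainable jQuery plugin: returning this.each() lets
// callers keep chaining after the plugin call.
(function ($) {
  $.fn.highlight = function (colour) {
    return this.each(function () {
      $(this).css('backgroundColor', colour || 'yellow');
    });
  };
})(jQuery);

// Usage: $('p.note').highlight('#ffffcc');
```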

UK Reg Encourage Users to Buy Domains for 10 years Citing Google Patent

UK Reg, a domain registrar in the United Kingdom, is using text from one of Google's patent applications to help sell 10-year domain name registrations!
I was quite surprised to see a search engine ranking factor being used as a marketing angle to sell domain registrations.

Here's a screenshot from the site:

[Screenshot: UK Reg - Secure your domain]
Clicking on the "Google patent application" link produces a pop-up which quotes the following sentence from Google's patent application, with a link to the application in full:

"Certain signals may be used to distinguish between illegitimate and legitimate domains. For example, domains can be renewed up to a period of 10 years. Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain and, thus, the documents associated therewith."

Google Paid Link Policing and other more Democratic Ranking Methods

Google's Webmaster Help Center explains the company's policy on paid links and encourages people to report them. Here's a snippet from Google's statement:

"Buying or selling links that pass PageRank is in violation of Google's webmaster guidelines and can negatively impact a site's ranking in search results.

Not all paid links violate our guidelines. Buying and selling links is a normal part of the economy of the web when done for advertising purposes, and not for manipulation of search results. Links purchased for advertising should be designated as such."

Google essentially wants websites to designate paid links with rel="nofollow" on the anchor tags, so that link juice, or PageRank, is not passed on to the website that bought the link. The rel="nofollow" attribute was originally conceived to stop comment spam on blogs and discussion boards, but its use has now spread to the policing of paid links.
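
In markup terms, a designated paid link simply carries the attribute, e.g. <a href="http://www.example.com/" rel="nofollow">Sponsor</a> (example.com being a placeholder), which tells crawlers not to treat the link as an editorial vote.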

I understand the difficulties Google and the other search engines must have in determining when to pass link juice between websites, but leaving the webmaster in control of this is like asking Google to start ranking sites by meta keywords again.

I'm beginning to believe the future of web search lies in the democratic nature of StumbleUpon, Digg and other social bookmarking services (like del.icio.us and my favourite, ma.gnolia), whereby users vote for, tag and bookmark sites. Surely a combination of popularity and search algorithm is the way forward?

Updated: Shortly after I posted this blog entry, Google was spotted testing Digg-style voting buttons on its results pages!

Updated: Matt Cutts and Maile Ohye published a post on the Official Google Webmaster Central Blog on 1 Dec 2007 intended to clarify Google's stance on paid links.

Book Review: Prioritising Web Usability

I can't recommend this book enough. A lot of the topics covered in this book are common sense. As a Web developer or designer you may think you already create very usable sites, but even if this is true, and you are a true usability guru, a lot of the facts and statistics in this book are useful for backing up your views and getting your point across to clients who insist on functionality that you know full well will break usability conventions and potentially harm their finished Web site.

The book is for people who have business goals for their Web sites or the Web sites they work on. This includes:

  • E-commerce sites
  • Corporate sites
  • News sites
  • Non-profit organisations
  • Government agencies

If you are trying to get users to accomplish something when they visit your site, then you should be concerned about usability.

This book contains the results of many studies into how people behave on the Internet and consequently what makes Web sites succeed or fail.

This book alone is not enough to ensure your site will be the most usable it can be, but it is crammed full of tips and real world examples of what to do, and what not to do when it comes to designing Web user interfaces, writing Web copy and planning your Information Architecture. Ideally you will need to perform usability testing as well, but the information in this book will significantly help in improving your Web site.

The book begins by explaining how people use the Web and how to optimise your site accordingly. It explains how users use search engines to find answers to problems, and how to improve your site to cash in on these users.

Nielsen and Loranger then revisit the usability problems they found back in 1994 and discover which significant usability issues are still relevant today, including bugbears such as:

  • Breaking the back button
  • Pop-ups
  • Flash
  • Uncertain clickability
  • Plug-ins
  • Splash pages

The fourth chapter helps you prioritise your Web site usability issues and decide what to tackle first. They do this by categorising usability problems by severity, frequency, impact and persistence.

Site search engines, their user interfaces and their results pages (SERPs) are covered next, including a brief introduction to Search Engine Optimisation.

Chapters 6 and 7 concern navigation, information architecture, readability and legibility. This is followed by a chapter on how important it is to write specifically for the Web, summarising key points and using simple language. The importance of knowing your audience and how people skim-read articles on the Web is discussed, as is the use of marketing slogans and hype.

The following chapter is geared towards your e-commerce goals: how to provide good product information and win consumer confidence in your site and product in order to increase and promote sales.

The penultimate chapter looks at presentation and users' expectations, while the final chapter in the book is all about balancing technology with people's needs. This covers the use of multimedia content such as videos and the use of familiar interface conventions in Web design.

Prioritizing Web Usability
By Jakob Nielsen and Hoa Loranger
Published by New Riders
ISBN 0-321-35031-6

 

Migrating from ASP to ASP.NET 2.0

I've pretty much finished migrating my personal website from classic ASP to ASP.NET 2.0. In the end I decided to keep certain pages on ASP (Active Server Pages) technology (more on that in a moment), but the majority of the pages have been migrated.

Minimise PageRank Loss

While I wanted to bring my site up to date, I also didn't want to lose too much Google PageRank in the process or break people's bookmarks and RSS blog subscriptions. The reason the pages have to change URLs is that ASP.NET pages use the .aspx extension, compared to ASP's three-letter .asp extension. So my portfolio.asp page, for example, has become portfolio.aspx.

Analysing what can be Migrated

My blog area uses Google's Blogger as a CMS, so this area hasn't had to change. Prior to using Blogger I had built my own blog engine, and those pages have remained as they are.

The most popular part of my site is my Cisco CCNA section. Apart from the new menu page, its pages have half-decent PageRank and a few also have DMOZ entries, so they have had to remain ASP too.

Using 301 Permanent Redirects

All the other pages, however, have been migrated. When you now visit one of those old pages (from SERPs or old links) you'll get HTTP 301 redirected to the new ASP.NET page. Because I'm on a shared server with no access to IIS (Internet Information Server), I essentially had to hard-code ASP 301 redirects into all the ASP pages that have moved, redirecting users to the new versions.
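
I haven't reproduced my exact pages here, but a hard-coded 301 at the top of an old page looks something like this sketch; classic ASP supports JScript as well as VBScript, and the target URL is a placeholder:

```javascript
<%@ Language="JScript" %>
<%
// Old portfolio.asp: permanently redirect visitors (and search engine
// crawlers) to the new .aspx version of the page.
Response.Status = "301 Moved Permanently";
Response.AddHeader("Location", "http://www.example.com/portfolio.aspx");
Response.End();
%>
```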

Update Robots.txt

The next step in the process was to add those old ASP pages to my robots.txt file and log in to the Google Webmaster console to expedite the removal of the old pages from the Google index using the URL Removal tool. If you haven't already accessed Webmaster Tools, I highly recommend you log in and verify your site.
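
The robots.txt entries themselves are straightforward; a hypothetical example with placeholder paths:

```
User-agent: *
Disallow: /portfolio.asp
Disallow: /contact.asp
```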

Spidering as a Final Check

Next, I made sure all navigation menus and links to the old pages under my control were pointing to the new versions. This meant updating my Blogger template and republishing, updating my old ASP navigation include files, and crawling my site with Xenu's Link Sleuth to check for any links I had missed.

Conclusion

Moving my content over to ASP.NET has been fairly straightforward due to the small number of pages. My Tools and Portfolio pages display data stored in XML files, so it was just a case of using XmlDataSource controls to pull the information onto the pages. My homepage picks up the latest entries in my Blogger Atom feed using XSLT, and my contact form uses basic ASP.NET form and validation controls.

Increased Functionality

While migrating my content I thought I'd use the caching feature built into ASP.NET to let me display my latest ma.gnolia bookmarks on my site, so I ended up creating a Bookmarks page, which fetches my ma.gnolia lite RSS bookmarks XML file either from ma.gnolia.com or from my cache. The cache doesn't hold my data for as long as I stipulate, but I'm assuming this is because I'm on a shared server and the cache is being dropped to free resources.

A Recipe for a Successful Website

Creating a successful website can be a hit-and-miss affair, even before you've thought about attracting users to the site.

There are countless websites, books and consultants dedicated to designing websites, web programming and development, search engine optimisation, usability and accessibility. However, even if you have an expert specialising in each field on your development team, you will have to make difficult decisions along the way which can have a fundamental impact on how your site will perform. This won't be much of a surprise to anyone in the business.

Web Designer's Role

Web designers, for instance, need to create designs that have a high level of usability as well as being visually appealing, sticking to standard methods of navigation with a colour scheme that conforms to accessibility guidelines. Good guidelines and specifications are needed from the client and the other members of the team before work commences, so the designer knows the scope of the project.

Web Developer's Role

Web developers need to be aware of search engine optimisation techniques and accessibility practices while building pages based on the chosen design. The choice of development environment is also important. Will your website scale when your user numbers increase? Will your choice of platform and IDE have negative impacts on your SEO and usability efforts? Dreamweaver and other WYSIWYG editors are renowned for adding reams of unnecessary HTML to your pages, and Visual Studio's standard web controls also introduce problems with large VIEWSTATE tags and lots of pointless nested tables.

Search Engine Marketer's Role

Search engine marketers need to make sure that they stick to "white hat" methodologies. They will need to have an intimate understanding of your business or product(s) in order to perform keyword research.
In the push to get good search rankings, usability can be affected by excessive keyword stuffing, which can make paragraphs of text unreadable.

Usability and Website Testing

Usability evangelists, meanwhile, will restrain the designers and developers from using browser plug-in technologies, any form of cutting-edge design, or navigational elements that deviate from what Web users expect. Badly positioned advertisements can also affect the usability of the site, especially the Flash pop-out kind.

The Client

If the client insists on items of functionality and design that will have a detrimental effect on the site, such as a splash page or a non-standard navigation tool, he or she will need to be educated on the impact of implementing such items.

Lots of websites fail to look at website development from one or more of these viewpoints and suffer one way or another, whether through confusing users with un-navigable sites, failing to generate traffic because of poor search engine optimisation, or making people install software to view content.

Creating a successful website requires a lot of different skill sets, though exactly which ones will depend partly on your audience and your individual goals for the project.

Linguistic SEO - Speak Your Customer's Language

When people use a search engine to try and solve a problem, they will most probably enter words that describe the symptoms of the problem they are looking to solve. They may even type a question into the query box. Most of the time, unless the user knows of a particular product that will do the job, they won't search for an "xyz machine" from Acme Inc.

Traditional Marketing Versus Online Marketing

This poses a problem for manufacturers who describe their products with sexy slogans and marketing speak. Search engines index the textual content on your web pages. If there is not much plain English describing what the product will do and what problems it will solve, you are potentially not doing your product website justice.

Keyword Research - Speak Your Customer's Language

This is where you need to do a little homework and research which words users are likely to use in order to find your product and your competitors' products. There are many ways you can do this: Google AdWords has a tool for it, as does WordTracker.
You can use the results of this research to carefully write your product text, tailoring and optimising it for your audience. This is known as linguistic Search Engine Optimisation (SEO).