Nik's Technology Blog

Travels through programming, networks, and computers

Blogger.com has changed their feed syndication

It seems that Blogger changed the type of syndication feed it uses some time during the last month (Jan – Feb 2008). I discovered this when my Atom feed XSLT transformation broke as I published my last post.
I originally wrote XSLT to transform the previous feed format for my homepage blog updates, which assumed the following hierarchy with the Atom namespace:

<feed>...<entry>...</entry></feed>

The latest feed, however, uses both the Atom and openSearch namespaces, with the following structure:

<rss>...<channel>...<item>...</item></channel></rss>

The root node suggests it follows the RSS 2.0 standard while also using the Atom namespace, which is peculiar. Notice the openSearch namespace too:

<rss xmlns:atom='http://www.w3.org/2005/Atom' xmlns:openSearch='http://a9.com/-/spec/opensearchrss/1.0/' version='2.0'>

Here's my updated XSLT to convert the new Blogger.com format.



<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0"
xmlns:atom="http://www.w3.org/2005/Atom"
xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
xmlns:openSearch="http://a9.com/-/spec/opensearchrss/1.0/">

<xsl:output method="xml" indent="yes" omit-xml-declaration="yes"/>
<!-- Wrap the transformed feed items in a div for styling -->
<xsl:template match="channel">
<div id="FeedSnippets">
<xsl:apply-templates select="item" />
</div>
</xsl:template>


<!-- Render each feed item, limiting output to the five most recent posts -->
<xsl:template match="item" name="item">
<xsl:if test="position()&lt;6">
<h4>
<xsl:value-of select="title"/>
</h4>
<p>
<!-- Build a short snippet: end at the first sentence if possible, otherwise truncate the summary -->
<xsl:choose>
<xsl:when test="string-length(substring-before(atom:summary,'. ')) > 0">
<xsl:value-of select="substring-before(atom:summary,'. ')" />...<br />
</xsl:when>
<xsl:when test="string-length(substring-before(atom:summary,'.')) > 0">
<xsl:value-of select="substring-before(atom:summary,'.')" />...<br />
</xsl:when>
<xsl:otherwise>
<xsl:value-of select="substring(atom:summary,0,200)" />...<br />
</xsl:otherwise>
</xsl:choose>
<strong>Read full post: </strong>
<a href="{link}">
<xsl:value-of select="title"/>
</a>
</p>
<hr />
</xsl:if>
</xsl:template>
</xsl:stylesheet>
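
For reference, here's a minimal sketch of one way to apply this stylesheet from an ASP.NET page, using the Xml server control. The control ID (FeedXml), the stylesheet path and the feed URL are placeholders rather than the actual values used on this site.

// Requires: using System; using System.Xml;
// Assumes the page contains <asp:Xml id="FeedXml" runat="server" />
protected void Page_Load(object sender, EventArgs e)
{
    // Load the RSS-flavoured Blogger feed over HTTP
    XmlDocument feed = new XmlDocument();
    feed.Load("http://yourblog.blogspot.com/feeds/posts/default?alt=rss"); // placeholder feed URL

    // Hand the feed and the stylesheet above to the Xml control,
    // which writes the transformed XHTML into the page
    FeedXml.Document = feed;
    FeedXml.TransformSource = "~/xslt/blogger-rss.xslt"; // placeholder path
}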

Product Review: uCertify PrepKit Exam Simulator

I was kindly sent a uCertify PrepKit for review back in December last year for the Microsoft C# .NET 2.0 Web-based Client Development exam (70-528). I'm looking to take the Microsoft MCTS .NET Framework 2.0 Web Applications certification this year, and needed an exam simulator as part of my study.

uCertify start-up logo

I've been so busy lately that it's been difficult to find the time to sit down and put the exam simulator through its paces. Anyhow, I've now spent a good few hours testing my .NET knowledge with this PrepKit, enough to evaluate it with confidence.

The PrepKit features a bank of questions that closely follow the style of those featured in the Microsoft exam. Obviously the PrepKit does not contain real exam questions, but uCertify claim they are "realistic", and they are intended to get you used to the kind of questions you should expect to see when you come to take the real exam.

uCertify PrepKit main menu

The tests in the PrepKit contain between 15 and 40 questions each and you're given 120 minutes to complete each one, but I found that choosing a shorter time and reducing the number of questions I needed to answer allowed me to spend more time using the PrepKit, because I don't often have 2 hours of uninterrupted revision time.

There are two different modes to choose from before starting a test: Learn mode and Test mode. Learn mode gives you feedback on each answer immediately, whereas in Test mode you can only review the answers at the end of the practice test.

uCertify PrepKit test question page

When you complete a test you can review the questions and go back and look at any questions you may have answered incorrectly. You can also choose to re-take just the questions you got wrong. When you re-take the test the multiple choice answers change order to keep you on your toes!
During a test you can pause the timer to take a call, make a coffee and so on, and you can tag, print, review and bookmark questions.

uCertify PrepKit test history page

Every test you take with the PrepKit gets recorded in the Test History section. From here you can go back and review all the practice tests you've taken, review all the questions you got wrong, re-do whole tests or re-do only the questions you got wrong.

Custom tests can also be created, based on your test history or on particular topics, to turn your weaknesses into strengths.

Besides the practice tests, the PrepKit contains study notes, quizzes, tips and flash cards to help you understand the topics.

MS Visual Web Developer 2008 Express Removes Support for Mobile Forms

I recently downloaded Microsoft's new Visual Web Developer 2008 Express Edition, which is a cut-down free version of Visual Studio aimed specifically at ASP.NET web developers. It supersedes the last version (VWD 2005) and adds new functionality.

I didn't remove the old version before installing, and in doing so I noticed, and was able to verify (with the screenshots below), that support for building mobile websites has been removed in the latest version, or at least that's what appears to have happened!

Visual Web Developer 2005 - Add new item dialog box

Visual Web Developer 2008 - Add new item dialog box

Enumerate Available Database Providers in ASP.NET

Using the DbProviderFactories class in ADO.NET you can retrieve a list of the available database provider factories with the GetFactoryClasses method. This can be useful if you host your ASP.NET site on a shared server and don't have access to the web server or its machine.config file.

The GetFactoryClasses method returns a DataTable of the available factory classes; these are the providers that the .NET runtime has access to.

Here's the kind of output you'll get...

Name | Description | InvariantName | AssemblyQualifiedName
Odbc Data Provider | .Net Framework Data Provider for Odbc | System.Data.Odbc | System.Data.Odbc.OdbcFactory, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
OleDb Data Provider | .Net Framework Data Provider for OleDb | System.Data.OleDb | System.Data.OleDb.OleDbFactory, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
OracleClient Data Provider | .Net Framework Data Provider for Oracle | System.Data.OracleClient | System.Data.OracleClient.OracleClientFactory, System.Data.OracleClient, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
SqlClient Data Provider | .Net Framework Data Provider for SqlServer | System.Data.SqlClient | System.Data.SqlClient.SqlClientFactory, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
Here's the code you'll need in your code-behind file.

// Requires: using System.Data; using System.Web.UI.WebControls;
// Retrieve the list of registered ADO.NET provider factories as a DataTable
DataTable providersList = System.Data.Common.DbProviderFactories.GetFactoryClasses();

// Bind the table to a GridView and add it to the page's form to display it
GridView providerGridView = new GridView();
providerGridView.DataSource = providersList;
providerGridView.DataBind();
form1.Controls.Add(providerGridView);
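
Once you know which providers are installed, the InvariantName column from the table above can be passed to DbProviderFactories.GetFactory to create provider-agnostic connections and commands. Here's a minimal sketch; the "MyDb" connection string name and the Customers query are made-up examples.

// Requires: using System.Configuration; using System.Data.Common;
// Create a factory from an invariant name listed by GetFactoryClasses
DbProviderFactory factory = DbProviderFactories.GetFactory("System.Data.SqlClient");

using (DbConnection connection = factory.CreateConnection())
{
    // "MyDb" is a hypothetical <connectionStrings> entry in web.config
    connection.ConnectionString =
        ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString;

    DbCommand command = factory.CreateCommand();
    command.Connection = connection;
    command.CommandText = "SELECT COUNT(*) FROM Customers"; // sample query
    connection.Open();
    object rowCount = command.ExecuteScalar();
}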

Book Review: Prioritising Web Usability

I can't recommend this book enough. A lot of the topics covered in this book are common sense. As a Web developer or designer you may think you create very usable sites already, but even if this is true, and you are a true usability guru, a lot of the facts and statistics in this book are useful for backing up your views, and getting your point across to clients who insist on functionality that you know full well will break usability conventions, and potentially harm their finished Web site.

The book is for people who have business goals for their Web sites or the Web sites they work on. This includes sites such as:

  • E-commerce sites
  • Corporate sites
  • News sites
  • Non-profit organisations
  • Government agencies

If you are trying to get users to accomplish something when they visit your site then you should be concerned about usability.

This book contains the results of many studies into how people behave on the Internet and consequently what makes Web sites succeed or fail.

This book alone is not enough to ensure your site will be the most usable it can be, but it is crammed full of tips and real world examples of what to do, and what not to do when it comes to designing Web user interfaces, writing Web copy and planning your Information Architecture. Ideally you will need to perform usability testing as well, but the information in this book will significantly help in improving your Web site.

The book begins by explaining how people use the Web and how to optimise your site accordingly. It explains how users use search engines to find answers to problems, and how to improve your site to cash in on these users.

Nielsen and Loranger then revisit the usability problems they found back in 1994 and examine which significant usability issues are still relevant today, including bugbears such as:

  • Breaking the back button
  • Pop-ups
  • Flash
  • Uncertain clickability
  • Plug-ins
  • Splash pages

The fourth chapter helps you prioritise your Web site usability issues and decide what to tackle first. The authors do this by categorising usability problems by severity, frequency, impact and persistence.

Site search engines, their user interfaces and their results pages (SERPs) are covered next, including a brief introduction to Search Engine Optimisation.

Chapters 6 and 7 concern navigation, information architecture, readability and legibility. This is followed by a chapter on how important it is to write specifically for the Web, using summaries for key points and simple language. The importance of knowing your audience and how people skim-read articles on the Web is also discussed, as is the use of marketing slogans and hype.

The following chapter is geared towards your e-commerce goals: how to provide good product information and win consumer confidence in your site and product in order to increase sales.

The penultimate chapter looks at presentation and users' expectations, while the final chapter in the book is all about balancing technology with people's needs. This covers the use of multimedia content such as videos and the use of familiar interface conventions in Web design.

Prioritizing Web Usability
By Jakob Nielsen and Hoa Loranger
Published by New Riders
ISBN 0-321-35031-6


Migrating from ASP to ASP.NET 2.0

I've pretty much finished migrating my personal website from classic ASP to ASP.NET 2.0. In the end I decided to keep certain pages on ASP (Active Server Pages) technology (more on that in a moment); the majority of the pages, however, have been migrated.

Minimise Page Rank Loss

While I wanted to bring my site up to date, I also didn't want to lose too much Google Page Rank in the process or break people's bookmarks and RSS blog subscriptions. The reason the pages have to change URLs is that ASP.NET pages use the .aspx extension rather than ASP's three-letter .asp extension. So my portfolio.asp page, for example, has become portfolio.aspx.

Analysing what can be Migrated

My blog area uses Google's Blogger as a CMS, so this area hasn't had to change, although prior to using Blogger I had built my own blog engine, and those pages have remained as they are.

The most popular part of my site is my Cisco CCNA section. Apart from the new menu page, the other pages have half-decent Page Rank and a few pages also have DMOZ entries, so those have had to remain ASP too.

Using 301 Permanent Redirects

All the other pages, however, have been migrated. When you now visit one of the old pages (from SERPs or old links) you'll receive an HTTP 301 redirect to the new ASP.NET page. Because I'm on a shared server with no access to IIS (Internet Information Server), I essentially had to hard-code ASP 301 redirects on all the ASP pages that have moved, redirecting users to the new versions.
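
Classic ASP has no built-in permanent-redirect helper, so each moved page has to set the 301 status and a Location header itself. As a rough illustration of the pattern (shown in ASP.NET C# here; the old .asp pages do the same thing in VBScript, and the URL is a placeholder):

protected void Page_Load(object sender, EventArgs e)
{
    // Tell browsers and search engines the page has moved permanently
    Response.Status = "301 Moved Permanently";
    Response.AddHeader("Location", "http://www.example.com/portfolio.aspx"); // placeholder URL
    Response.End();
}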

Update Robots.txt

The next step in the process was to include those old ASP pages in my robots.txt file and log in to the Google Webmaster console to expedite the removal of those old pages from the Google index using the URL Removal tool. If you haven't already accessed Webmaster Tools, I highly recommend you log in and verify your site.
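
For example, a couple of Disallow lines in robots.txt keep the retired pages from being re-crawled (these paths are placeholders for the pages that actually moved):

User-agent: *
Disallow: /portfolio.asp
Disallow: /tools.asp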

Spidering as a Final Check

Next, I made sure all the navigation menus and links to the old pages under my control were pointing at the new versions. This meant updating my Blogger template and republishing, updating my old ASP navigation include files, and crawling my site with Xenu Link Sleuth to check for any links I had missed.

Conclusion

Moving my content over to ASP.NET has been fairly straightforward due to the small number of pages. My Tools and Portfolio pages display data stored in XML files, so it was just a case of using XmlDataSource controls to pull the information onto the pages. My homepage picks up the latest entries in my Blogger Atom feed using XSLT, and my contact form uses basic ASP.NET form and validation controls.

Increased Functionality

While migrating my content I thought I'd use the caching feature built into ASP.NET to let me display my latest ma.gnolia bookmarks on my site, so I ended up creating a Bookmarks page, which fetches my ma.gnolia lite RSS bookmarks XML file either from ma.gnolia.com or from my cache. The cache doesn't hold my data for as long as I stipulate, but I'm assuming this is because I'm on a shared server and the cache is dropping it to free resources.
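
As a rough sketch of that caching pattern (the cache key, feed URL and 30-minute window below are made-up values, not the ones used on this site), the Bookmarks page only hits ma.gnolia.com when the cached copy has been evicted:

// Requires: using System; using System.Web.Caching; using System.Xml;
// Lives in the Bookmarks page's code-behind, so Cache refers to Page.Cache
private XmlDocument GetBookmarksFeed()
{
    XmlDocument feed = Cache["MagnoliaBookmarks"] as XmlDocument;
    if (feed == null)
    {
        // Nothing cached (or the shared server evicted it), so fetch it again
        feed = new XmlDocument();
        feed.Load("http://ma.gnolia.com/rss/lite/people/yourusername"); // placeholder URL

        Cache.Insert("MagnoliaBookmarks", feed, null,
                     DateTime.Now.AddMinutes(30), Cache.NoSlidingExpiration);
    }
    return feed;
}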

Insert a Blogger Atom Feed into an ASP.NET Web Page

I've recently been busy migrating my homepage (and several other pages) from classic ASP to ASP.NET. My homepage displays the latest 5 posts with a summary and a link to the full blog post.
I eventually found a tutorial using XSLT explaining how to achieve this, after discovering that the XmlDataSource control's XPath support doesn't handle namespaces!
I've tinkered with the XSLT that Arnaud Weil posted in his blog to achieve the following objectives:

  1. Limit the number of posts returned by the transformation.
  2. Show a summary of the post.
  3. Show a summary that tries hard not to cut words in half when generating a snippet.
  4. Produce XHTML valid code.

Here's the source of my XSLT...

<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0"
xmlns:atom="http://www.w3.org/2005/Atom"
xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

<!--<xsl:output method="html"/>-->
<xsl:output method="xml" indent="yes" omit-xml-declaration="yes"/>
<xsl:template match="/atom:feed">
<div id="FeedSnippets">
<xsl:apply-templates select="atom:entry" />
</div>
</xsl:template>


<!-- Render each feed entry, limiting output to the five most recent posts -->
<xsl:template match="atom:entry" name="feed">
<xsl:if test="position()&lt;6">
<h4><xsl:value-of select="atom:title"/></h4>
<p>
<xsl:choose>
<xsl:when test="string-length(substring-before(atom:summary,'. ')) &gt; 0">
<xsl:value-of select="substring-before(atom:summary,'. ')" />...<br />
</xsl:when>
<xsl:when test="string-length(substring-before(atom:summary,'.')) &gt; 0">
<xsl:value-of select="substring-before(atom:summary,'.')" />...<br />
</xsl:when>
<xsl:otherwise>
<xsl:value-of select="substring(atom:summary,0,200)" />...<br />
</xsl:otherwise>
</xsl:choose>
<strong>Read full post: </strong><a href="{atom:link[@rel='alternate']/@href}"><xsl:value-of select="atom:title"/></a></p>
<hr />
</xsl:if>
</xsl:template>
</xsl:stylesheet>
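
Wiring this up in the page is then just a matter of fetching the feed and running the transformation in the code-behind. A minimal sketch, assuming the XSLT is saved as atom-feed.xslt and the output is written into a Literal control named FeedLiteral (the names and the feed URL are placeholders):

// Requires: using System; using System.IO; using System.Xml; using System.Xml.Xsl;
// Assumes the page contains <asp:Literal id="FeedLiteral" runat="server" />
protected void Page_Load(object sender, EventArgs e)
{
    // Compile the stylesheet shown above
    XslCompiledTransform xslt = new XslCompiledTransform();
    xslt.Load(Server.MapPath("~/xslt/atom-feed.xslt")); // placeholder path

    // Pull the Atom feed straight from Blogger and transform it to XHTML
    using (XmlReader feedReader = XmlReader.Create("http://yourblog.blogspot.com/feeds/posts/default")) // placeholder URL
    using (StringWriter html = new StringWriter())
    {
        xslt.Transform(feedReader, null, html);
        FeedLiteral.Text = html.ToString();
    }
}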

Font and Text Styles for Websites

People find reading text on screen much more difficult than reading on paper, but by following some simple guidelines web designers can help make it as painless as possible.

Because web browsers render textual content using the fonts installed on the client machine, the font variation web designers have at their disposal is severely limited.
Most browsers will be able to render the following fonts without reverting to a default alternative:

  • Arial
  • Georgia
  • Verdana
  • Arial Black
  • Times New Roman
  • Trebuchet MS
  • Courier New
  • Impact
  • Comic Sans MS

I personally stick to the top 3 fonts in this list as they are very readable on-screen and are the most professional; avoid Comic Sans MS and Courier New.
It is also possible to specify alternative fonts in your CSS style sheets for the browser to use if your font of choice isn't available.

Once you've chosen a font for a website, use that font throughout to maintain consistency. Websites that use too many fonts sprinkled through their pages look unprofessional. Verdana is the most readable font, even at low point sizes.

Maintaining Readability

To increase a site's readability you should focus your attention on three things:

  1. Choose a good font and a decent readable default font size.
  2. Make sure the text and background contrast in colours is high.
  3. Avoid using all capitals in blocks of text and headings.

Text in Images

Some web designers get around the font-support problems mentioned above by creating a bitmap image of headings, titles and so on. Although this method does allow you to use any font your graphics program supports, including anti-aliased (smooth-edged) fonts, it causes big accessibility and SEO problems. You should only use this technique for a company logo; all other textual information should be actual text in your HTML.

Hyperlink Usability

How often do you visit a website and find yourself unsure where to click and what is clickable?
In the past all links were blue and underlined, but web designers have increasingly found that modern stylesheet functionality (CSS) allows them to change the look and feel of links and experiment with unconventional navigation. Sometimes so much so that users have to mine the pages for links, hovering over the page to find out what is clickable, all for the sake of design.

User Expectations

Hyperlinks need to stand out to users as links: clickable elements that will cause a new page to be requested when clicked.
Users should not have to spend their precious time learning what style the site uses to render hyperlinks by watching for when their mouse cursor turns into a pointing hand while scanning the site.
Web designers need to understand users' expectations.

All links should be underlined at the very least. Any other use of underlined text or blue words should be avoided as these can be confused with links.

IIS 7 and Configuration Delegation on Shared Servers

I've just been watching the IIS 7.0 episode of the .NET Show. One of the exciting new features of IIS 7.0 for people who run their sites in a shared hosting environment is the new delegated, remote administration option.

Essentially this will allow developers who do not have access to IIS on the box to use an IIS client tool to configure their site remotely over HTTP. This obviously relies on the hosting company to offer this functionality.

This has been a major bugbear for developers running their sites on IIS in a shared environment in the past. If set up correctly, it should allow hosting companies to save time and money by delegating some IIS functionality out to site administrators.

I've recently moved hosting companies in order to get ASP.NET and the .NET Framework 2.0, and I think IIS 7.0 will be a big selling point for hosting firms. As far as I'm aware you will need Windows Vista or Longhorn Server to get IIS 7.0, however, so we may not see hosting companies offering this for a while yet.