Nik's Technology Blog

Travels through programming, networks, and computers

Migrating from ASP to ASP.NET 2.0

I've pretty much finished migrating my personal website from classic ASP to ASP.NET 2.0. In the end I decided to keep certain pages on ASP (Active Server Pages) technology (more on that in a moment), but the majority of the pages have been migrated.

Minimise Page Rank Loss

While I wanted to bring my site up-to-date, I also didn't want to lose too much Google Page Rank in the process, or break people's bookmarks and RSS blog subscriptions. The reason the pages have to change URLs is that ASP.NET pages use the .aspx extension, compared to classic ASP's three-letter .asp extension. So my portfolio.asp page, for example, has become portfolio.aspx.

Analysing what can be Migrated

My blog area uses Google's Blogger as a CMS, so this area hasn't had to change, although prior to switching to Blogger I had built my own blog engine, and those pages have remained as they are.

The most popular part of my site is my Cisco CCNA section. Apart from the new menu page, the other pages have half-decent Page Rank and a few pages also have DMOZ entries, so those have had to remain ASP too.

Using 301 Permanent Redirects

All the other pages however have been migrated. When you now visit those old pages (from SERPs or old links) you'll get HTTP 301 redirected to the new ASP.NET pages. Because I'm on a shared server with no access to IIS (Internet Information Services), I essentially had to hard-code 301 redirects into all the ASP pages that have moved, redirecting users to the new versions.
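A hard-coded 301 in classic ASP amounts to a few lines like the sketch below; the domain is just a placeholder, and each old page would point at its own .aspx replacement:

<%
' Permanently redirect this old .asp page to its new .aspx equivalent
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/portfolio.aspx"  ' placeholder URL
Response.End
%>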

Update Robots.txt

The next step in the process was to add those old ASP pages to my robots.txt file and log in to the Google Webmaster console to expedite the removal of the old pages from the Google index using the URL Removal tool. If you haven't already accessed Webmaster Tools, I highly recommend you log in and verify your site.
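The robots.txt entries are just Disallow rules for the retired pages, along these lines (portfolio.asp is the only filename mentioned above; the others are placeholders):

User-agent: *
# Old classic ASP pages that have been replaced by .aspx versions
Disallow: /portfolio.asp
Disallow: /tools.asp
Disallow: /contact.asp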

Spidering as a Final Check

Next, I made sure all navigation menus and links to the old pages under my control were pointing to the new versions. This meant updating my Blogger template and republishing, updating my old ASP navigation include files and crawling my site using XENU link sleuth to check for any I had missed.

Conclusion

Moving my content over to ASP.NET has been fairly straightforward due to the small number of pages. My Tools and Portfolio pages display data stored in XML files, so it was just a case of using XmlDataSource controls to pull the information onto the pages. My homepage picks up the latest entries in my Blogger Atom feed using XSLT, and my contact form uses basic ASP.NET form and validation controls.
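As a rough idea of the XmlDataSource approach (the file name, XPath and Repeater layout below are illustrative rather than my exact markup):

<asp:XmlDataSource ID="ToolsSource" runat="server"
    DataFile="~/App_Data/tools.xml" XPath="/tools/tool" />

<asp:Repeater ID="ToolsList" runat="server" DataSourceID="ToolsSource">
    <ItemTemplate>
        <h4><%# XPath("name") %></h4>
        <p><%# XPath("description") %></p>
    </ItemTemplate>
</asp:Repeater>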

Increased Functionality

While migrating my content I thought I'd use the caching feature built into ASP.NET to display my latest ma.gnolia bookmarks on my site, so I ended up creating a Bookmarks page which fetches my ma.gnolia lite RSS bookmarks XML file, either from ma.gnolia.com or from my cache. The cache doesn't hold my data for as long as I stipulate, but I'm assuming this is because I'm on a shared server and the cache is dropping it to free resources.
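The cache-or-fetch logic behind the Bookmarks page boils down to something like the C# sketch below; the feed URL, cache key and one-hour expiry are illustrative, not my actual values:

// Inside the page's code-behind; System.Xml and System.Web.Caching are assumed
private XmlDocument GetBookmarksFeed()
{
    XmlDocument feed = Cache["MagnoliaBookmarks"] as XmlDocument;
    if (feed == null)
    {
        // Not in the cache (or evicted), so fetch the RSS feed from ma.gnolia again
        feed = new XmlDocument();
        feed.Load("http://ma.gnolia.com/rss/lite/people/example"); // placeholder feed URL
        // Ask ASP.NET to keep it for up to an hour; a busy shared server may drop it sooner
        Cache.Insert("MagnoliaBookmarks", feed, null,
            DateTime.Now.AddHours(1), System.Web.Caching.Cache.NoSlidingExpiration);
    }
    return feed;
}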

Insert a Blogger Atom Feed into an ASP.NET Web Page

I've been busy recently migrating my homepage (and several others) from classic ASP to ASP.NET. My homepage displays the latest five posts, each with a summary and a link to the full blog post.
I eventually found a tutorial explaining how to achieve this using XSLT, after discovering that the XmlDataSource control's XPath property doesn't support namespaces!
I've tinkered with the XSLT that Arnaud Weil posted in his blog to achieve the following objectives:

  1. Limit the number of posts returned by the transformation.
  2. Show a summary of each post.
  3. Try not to cut words in half when generating the snippet.
  4. Produce valid XHTML.

Here's the source of my XSLT...

<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0"
xmlns:atom="http://www.w3.org/2005/Atom"
xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

<!--<xsl:output method="html"/>-->
<xsl:output method="xml" indent="yes" omit-xml-declaration="yes"/>
<xsl:template match="/atom:feed">
<div id="FeedSnippets">
<xsl:apply-templates select="atom:entry" />
</div>
</xsl:template>


<!-- Render only the first five entries in the feed -->
<xsl:template match="atom:entry" name="feed">
<xsl:if test="position()&lt;6">
<h4><xsl:value-of select="atom:title"/></h4>
<p>
<!-- Prefer to end the snippet at the first full stop; otherwise fall back to the first 200 characters -->
<xsl:choose>
<xsl:when test="string-length(substring-before(atom:summary,'. ')) &gt; 0">
<xsl:value-of select="substring-before(atom:summary,'. ')" />...<br />
</xsl:when>
<xsl:when test="string-length(substring-before(atom:summary,'.')) &gt; 0">
<xsl:value-of select="substring-before(atom:summary,'.')" />...<br />
</xsl:when>
<xsl:otherwise>
<xsl:value-of select="substring(atom:summary,0,200)" />...<br />
</xsl:otherwise>
</xsl:choose>
<strong>Read full post: </strong><a href="{atom:link[@rel='alternate']/@href}"><xsl:value-of select="atom:title"/></a></p>
<hr />
</xsl:if>
</xsl:template>
</xsl:stylesheet>
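One way to run this stylesheet over the Atom feed server-side is with XslCompiledTransform; a minimal C# sketch follows, where the feed URL and the "BlogPosts" Literal control are placeholders and error handling is omitted:

// using System.IO; using System.Xml; using System.Xml.Xsl;
XslCompiledTransform transform = new XslCompiledTransform();
transform.Load(Server.MapPath("~/transformAtomFormat.xsl"));

using (XmlReader feed = XmlReader.Create("http://example.blogspot.com/atom.xml")) // placeholder feed URL
using (StringWriter html = new StringWriter())
{
    transform.Transform(feed, null, html);
    BlogPosts.Text = html.ToString(); // e.g. an <asp:Literal> placed where the posts should appear
}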

Font and Text Styles for Websites

People find reading text on screen much more difficult than reading on paper, but by following some simple guidelines web designers can make this as painless as possible.

Because web browsers render textual content using the fonts installed on the client machine, the variety of typefaces web designers have at their disposal is severely limited.
Most browsers will be able to render the following fonts without reverting to a default alternative:

  • Arial
  • Georgia
  • Verdana
  • Arial Black
  • Times New Roman
  • Trebuchet MS
  • Courier New
  • Impact
  • Comic Sans MS

I personally stick to the top three fonts in this list, as they are very readable on screen and look the most professional; avoid Comic Sans MS and Courier New.
It is also possible to specify alternative fonts in your CSS style sheets for the browser to fall back on if your font of choice isn't available.
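A typical font stack in a style sheet might look like this, with progressively more generic fallbacks:

body {
    /* Try Verdana first, then Arial, then whatever sans-serif font is installed */
    font-family: Verdana, Arial, sans-serif;
}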

Once you've chosen a font for a website, use that font throughout to maintain consistency; sites that sprinkle too many fonts through their pages look unprofessional. Verdana in particular remains readable even at small point sizes.

Maintaining Readability

To increase a site's readability you should focus your attention on three things:

  1. Choose a good font and a decent readable default font size.
  2. Make sure the text and background contrast in colours is high.
  3. Avoid using all capitals in blocks of text and headings.

Text in Images

Some web designers get around the font-support problems mentioned above by creating bitmap images of headings, titles and so on. Although this method does allow you to use any font your graphics program supports, including anti-aliased (smooth-edged) fonts, it causes big accessibility and SEO problems. You should only use this technique for a company logo; all other textual information should be actual text in your HTML.

Hyperlink Usability

How often do you visit a website and find yourself unsure where to click and what is clickable?
In the past all links were blue and underlined, but web designers have increasingly found that modern stylesheet functionality (CSS) allows them to change the look and feel of links and experiment with unconventional navigation. Sometimes so much so that users have to mine the pages for links, hovering over the page to find out what is clickable, all for the sake of design.

User Expectations

Hyperlinks need to stand out to users as links: clickable elements that will cause a new page to be requested when clicked.
Users should not have to spend their precious time learning what style the site uses to render hyperlinks by watching for their mouse cursor to turn into a pointing hand while scanning the site.
Web designers need to understand users' expectations.

All links should be underlined at the very least. Any other use of underlined text or blue words should be avoided as these can be confused with links.
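In CSS terms that simply means leaving the default underline on links (and not underlining anything else), for example:

/* Keep links looking like links */
a:link, a:visited {
    text-decoration: underline;
}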

IIS 7 and Configuration Delegation on Shared Servers

I've just been watching the IIS 7.0 episode of the .NET Show. One of the exciting new features of IIS 7.0 for people who run their sites in a shared hosting environment is the new delegated, remote administration option.

Essentially this will allow developers who do not have access to IIS on the box to use an IIS client tool to configure their site remotely over HTTP. This obviously relies on the hosting company to offer this functionality.

This has been a major bugbear in the past for developers running their sites on IIS in a shared environment. If set up correctly, it should allow hosting companies to save time and money by delegating some IIS functionality out to site administrators.

I've recently moved hosting companies in order to get ASP.NET and the .NET Framework 2.0, and I think this will be a big selling point for hosting firms. As far as I'm aware you will need Windows Vista or Longhorn Server to get IIS 7.0, however, so we may not see hosting companies offering this for a while yet.

View Output of an XSLT Transformation using Firefox

XSLT is a powerful method of converting XML into another well-formed XML-based document. You can, for example, transform the RSS syndication format into Atom, or XML into XHTML.

Probably the easiest way to debug your XSLT is to use the Firefox web browser. Make sure you have the Web Developer toolbar (by Chris Pederick) installed.

You'll need to add a stylesheet to the XML document you're transforming using a declaration like this:


<?xml version='1.0' encoding='UTF-8'?>
<?xml-stylesheet type="text/xsl" href="transformAtomFormat.xsl"?>

Then in Windows Explorer right-click the XML document and open it in Firefox. You'll see the rendered output of your transformation in the browser. If you view the page source you'll see your XML source, which isn't much use when you want to see the output source of your XSLT transformation. For this you need to right-click somewhere on the page, find the Web Developer toolbar menu and choose "View Generated Source" from the sub-menu. Now you'll see what Firefox is rendering to the screen.

Find out more about XSLT at W3 Schools

Google Reader 'Shared Items' Widget Broke My Site in IE7

As an avid user of Firefox, I discovered today the dangers of using only one browser (apart from initial testing), especially considering most web users still use Internet Explorer.

I recently converted my site to use Blogger, and I also thought I'd share interesting news articles I read with my visitors by using Google Reader's JavaScript widget.

Google Reader is still in Beta

I placed the JavaScript widget on my home page and my blog pages a while back. I just checked my site in Internet Explorer 7 and was amazed to see that only the template was rendered. There was a distinct lack of content!

I looked at the source HTML, and the content was there. I was puzzled for a few minutes until I realised what the blog page and the home page had in common: the Google Reader JavaScript widget.

I quickly removed them and FTPed the pages, and the problem was resolved. I grabbed the URL the script block was trying to call and tried to download the external JavaScript straight to my browser... it timed out.

It appears that Internet Explorer will get stuck when an external JavaScript file doesn't load, so be aware of this the next time you choose to use an externally hosted JavaScript file. Opera and Firefox didn't seem to have a problem.

A Recipe for a Successful Website

Creating a successful website can be a hit-and-miss affair, even before you've thought about attracting users to the site.

There are countless websites, books and consultants dedicated to designing websites, web programming and development, search engine optimisation, usability and accessibility. However, even if you have an expert specialising in each field on your development team, you will have to make difficult decisions along the way which can have a fundamental impact on how your site performs. This won't be much of a surprise to anyone in the business.

Web Designers' Role

Web designers, for instance, need to create designs that are highly usable as well as visually appealing, sticking to standard methods of navigation and a colour scheme that conforms to accessibility guidelines. Good guidelines and specifications are needed from the client and the other members of the team before work commences, so the designer knows the scope of the project.

Web Developers' Role

Web developers need to be aware of search engine optimisation techniques and accessibility practices while building pages based on the chosen design. The choice of development environment is also important. Will your website scale as your user numbers increase? Will your choice of platform and IDE have a negative impact on your SEO and usability efforts? Dreamweaver and other WYSIWYG editors are renowned for adding reams of unnecessary HTML to your pages, and Visual Studio's standard web controls introduce their own problems with large VIEWSTATE tags and lots of pointless nested tables.

Search Engine Marketers' Role

Search engine marketers need to make sure that they stick to "white hat" methodologies. They will need an intimate understanding of your business or product(s) in order to perform keyword research.
In the push to get good search rankings, usability can suffer from excessive keyword stuffing, which can make paragraphs of text unreadable.

Usability and Website Testing

Usability evangelists, meanwhile, will restrain the designers and developers from using browser plug-in technologies, cutting-edge design, or navigational elements that deviate from what web users expect. Badly positioned advertisements can also affect the usability of the site, especially the Flash pop-out kind.

The Client

If the client insists on items of functionality and design that will have a detrimental effect on the site, such as a splash page or a non-standard navigation tool, he or she will need to be educated about the impact of implementing such items.

Lots of websites fail to look at development from one or more of these viewpoints and suffer one way or another, whether that is through confusing users with un-navigable sites, failing to generate traffic because of poor search engine optimisation, or making people install software to view the content.

Creating a successful website requires a lot of different skill sets, but this will depend partly on your audience and your individual goals for the project.

An Overview of the ASP.NET Cache Object

I've been looking at the ASP.NET Cache object, which makes the older Application object effectively redundant. A useful analogy is a leaky bucket: a bucket in which you can store data that is "expensive" to retrieve from its source every time. Since databases and files are slow in comparison with in-memory data, it makes sense to keep a copy of frequently used data in the cache, which is held in memory.
The bucket is leaky because you can only put so much data in the cache before it fills up and has to drop some. By default it uses a simple method called LRU (Least Recently Used) to decide which data to dispose of. I say by default, because you can, if you wish, make data persist for a period of time, or make it dependent on a file, a database, or other cached data.

The cache is shared by the whole application, however, and is therefore accessible from anywhere in it, so you have to be careful what kind of data you store there. You probably don't want to store user-specific data here!
You also need to check that the data you want is actually in the cache before you use it. It sounds obvious, but if you don't code this correctly, by assigning the cached object to a variable and then checking that variable, you can end up with bugs you can't replicate in a test environment.
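A minimal C# sketch of that pattern (the cache key, DataTable type and helper method are illustrative, and System.Data is assumed):

// Copy the cached item into a local variable first, then test the variable.
// Checking Cache["Products"] and then reading it a second time is a race:
// the item could be evicted between the two accesses on a busy server.
DataTable products = Cache["Products"] as DataTable;
if (products == null)
{
    products = LoadProductsFromDatabase(); // hypothetical helper that hits the database
    Cache.Insert("Products", products);
}
// From here on, only use the local copy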

Data Layer with Business Logic

Deciding when to move business logic to the data layer in a multi-tiered application is tricky, and depends a lot on the RDBMS you are using.
I've been looking at an ASP.NET application that creates seven rows in a database for each "musical performance" when a button is clicked on a web form to rebuild the database.
To begin with, a SQL Server 2005 stored procedure written in VB.NET (though it could easily have been C# with the same results, since all .NET languages compile to IL) was called seven times from a loop in the business layer to create the seven rows. Run this way, the application took about 90 seconds to rebuild a test database.

This logic was then moved into a single stored procedure, still written in VB.NET. Instead of making seven calls to the database each time, we now make one call and execute the loop inside the stored procedure, moving that piece of business logic to the data layer. The database rebuild took about 30 seconds this time.
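In the business layer the change amounts to replacing the seven-iteration loop with a single stored procedure call, roughly like the C# sketch below (the connection string, procedure and parameter names are illustrative):

// One round-trip: the stored procedure now loops over the seven rows itself
// using System.Data; using System.Data.SqlClient;
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("CreatePerformanceRows", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@PerformanceId", performanceId);
    conn.Open();
    cmd.ExecuteNonQuery();
}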

Stored Procedures in Native T-SQL

To try to make the execution faster still, we rewrote the stored procedure in T-SQL, which remains the native language of SQL Server. This shaved another 15 seconds off the execution time, demonstrating the overhead that writing stored procedures in anything other than T-SQL introduces.

This exercise provides a glimpse into how the decisions you make when designing an application radically affect the responsiveness of the system.

Business Logic at the Data Layer

If you are pulling lots of data up to the business layer, performing some processing and then discarding most of it, moving that operation down to the data layer may improve performance. It really depends on the data and the application.
One of the problems with keeping too much application logic in the data layer is version control. It can become a minefield managing lots of complex stored procedures between test and live environments and keeping them in sync with the application.