Nik's Technology Blog

Travels through programming, networks, and computers

Data Layer with Business Logic

Deciding when to move business logic into the data layer of a multi-tiered application is a tricky decision, and depends a lot on the RDBMS you are using.
I've been looking at an ASP.NET application that creates 7 rows in a database for each "musical performance" when a button on a web form is clicked to rebuild a database.
To begin with, a SQL Server 2005 stored procedure written in VB.NET (though it could just as easily have been C# with the same results, since all .NET languages compile to IL) was called from a loop in the business layer 7 times to create the 7 rows. Run this way, the application took about 90 seconds to rebuild a test database.

This logic was then moved into one stored procedure, still written in VB.NET. The result was that instead of making 7 calls to the database each time, we make 1 call and execute the loop there, moving the business logic to the data layer. The database rebuild took about 30 seconds this time.

Stored Procedures in Native T-SQL

To try to make the execution faster still, we decided to rewrite the stored procedure in T-SQL, which is still the native language of SQL Server. This shaved another 15 seconds off the execution time, proving what sort of overhead writing your stored procedures in anything other than T-SQL introduces.
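To give a flavour of the final approach, here is a minimal sketch of a native T-SQL stored procedure that creates all 7 rows in a single database call. The table and column names are hypothetical, not the application's actual schema:

-- Hypothetical sketch: create all 7 rows for a "musical performance"
-- inside one native T-SQL stored procedure, rather than making 7
-- round-trips from the business layer.
CREATE PROCEDURE dbo.CreatePerformanceRows
    @PerformanceId INT
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @i INT;
    SET @i = 1;

    -- The loop now runs at the data layer: one call, seven inserts
    WHILE @i <= 7
    BEGIN
        INSERT INTO dbo.PerformanceRows (PerformanceId, RowNumber)
        VALUES (@PerformanceId, @i);

        SET @i = @i + 1;
    END
END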

This exercise provides a glimpse into how the decisions you make when designing an application radically affect the responsiveness of the system.

Business Logic at the Data Layer

If you are pulling lots of data up to the business layer, performing some processing, and then discarding most of the data at that point, it may improve performance to move that operation down to the data layer, as in the sketch below. It really depends on the data and the application.
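As a simple illustration, here is a hypothetical sketch (the table and column names are invented) of letting the database do the filtering and aggregation, so only a summary row crosses the wire instead of thousands of rows that would mostly be thrown away:

-- Hypothetical sketch: aggregate at the data layer so only one
-- summary row is returned to the business layer.
CREATE PROCEDURE dbo.GetPerformanceCountByVenue
    @VenueId INT
AS
BEGIN
    SET NOCOUNT ON;

    SELECT COUNT(*) AS PerformanceCount
    FROM dbo.Performances
    WHERE VenueId = @VenueId;
END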
One of the problems with storing too much application logic in the data layer is version control. It can become a minefield managing lots of complex stored procedures between test and live environments and keeping them in sync with the application.

When it comes to programming, two heads are better than one!

On the last Learning Tree ASP.NET course I took, I never thought to inquire why we were made to work in pairs sitting in front of one PC workstation. Well, it seems this is not a cost-cutting exercise, which was my first thought. Apparently Learning Tree bases this on research and programming methodologies (namely Extreme Programming, which I may have to look into further at some point). According to the lecturer, the perfect number of programmers crafting a piece of code is 1.8 (some pieces of code are too trivial for two developers to work on simultaneously), but in a learning environment it is 2.2, with the extra help coming from the teacher and other peers in the classroom.

Essentially what he was trying to say was: developers can't think and type at the same time, and the product of pair programming is fewer errors and better-quality, structured code.

Linguistic SEO - Speak Your Customer's Language

When people use a search engine to try to solve a problem, they will most probably enter words that describe the symptoms of the problem they are looking to solve. They may even enter a question into the query box. Most of the time, unless the user knows of a particular product which will do the job, they won't search for an xyz machine from Acme Inc.

Traditional Marketing Versus Online Marketing

This poses a problem for manufacturers who describe their products with sexy slogans and marketing speak. Search engines index the textual content on your web pages. If there is not much plain English describing what the product will do and what problems it will solve, you are potentially not doing your product website justice.

Keyword Research - Speak Your Customer's Language

This is where you need to do a little homework and research into which words users are likely to use to find your product and your competitors' products. There are many ways you can do this; Google AdWords has a tool for this, as does WordTracker.
You can use the results of this research to carefully write your product text, tailoring and optimising it for your audience. This is known as linguistic Search Engine Optimisation (SEO).

The Rise and Fall of User Generated Content?

Another day, another ludicrous allegation about cyberspace. Apparently..."The vast majority of blogs on top social websites contain potentially offensive material."

This was the conclusion of a ScanSafe-commissioned report, which claims that sites such as MySpace, YouTube and Blogger, which are a "hit" among children, can contain porn or adult language. According to the report, 1 in 20 blogs contains a virus or some sort of malicious spyware.

The Problem?

User-generated content is to blame, of course; the nature of how the content is built and edited makes it very difficult to control and regulate.


Even if you were to monitor every post on a website as part of your process, how would you verify whether a particular portion of text, or a Photoshopped image, has violated anyone's copyright or intellectual property?


This is a problem the big search engines have as well, with so many SPAM sites scraping content from other sites, then republishing the resulting mashed content as their own work in order to cash in on affiliate income generated from SERPs. Is Google working on a solution to stem this SPAM?

EU Intellectual Property Ruling

Another potential blow to websites which rely on user-generated content is the European Union ruling on intellectual property, which is making its way through the ratification process. This could see ISPs and website owners being charged for copyright infringements even if the material was posted by users of the site.

The Rel Attribute in HTML

The rel attribute is available for use in a few HTML tags, namely the <link> and <a> anchor tags, but until recently it has been fairly pointless to use, because web browsers did not support the intended functionality of most of the values you could assign to it.

The rel attribute has been around since the HTML 3 specification and defines the relationship between the current document and the document specified in the href attribute of the same tag. If the href attribute is missing from the tag, the rel attribute is ignored.

For example:
<link rel="stylesheet" href="styles.css">

In this example the rel attribute specifies that the href attribute contains the stylesheet for the current document.
This is probably the only recognised and supported use of the rel attribute by modern web browsers and by far the most common use for it to date.
There are other semantic uses for the rel attribute beyond those which a browser might find useful; examples include social networking and describing relationships between people (see http://gmpg.org/xfn/intro). The other use, which has been talked about a lot recently, concerns search engine spiders.
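For instance, XFN (the XHTML Friends Network described at the link above) uses the rel attribute on ordinary anchor tags to describe a human relationship; a hypothetical example (the URL is a placeholder):

<a href="http://jane.example.com/" rel="friend met">Jane's blog</a>

Here the rel value declares that the author of the linked page is a friend whom the linking author has met in person.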

Search Engines and the rel Attribute

Recently Google has played a big part in finding another use for the rel attribute. This time the HTML tag in question was the humble anchor tag.
Google and the other major search engines (MSN and Yahoo!) have a constant battle with SERP SPAM, which clutters their results and makes them less useful. These pages make their way into the top results pages by using black hat SEO methods such as automated comment SPAM, link farms etc.
Rather than adopt a complex algorithm to identify these SPAM links, which increase a target page's search engine vote (sometimes called "PageRank" or "Web Rank"), the search engines have collectively decided that if blogging software, big directories, general links pages etc. use anchor tags with a rel="nofollow" attribute, those links will simply be ignored by search engine spiders, yet still be fully functional for end users.
Of course, using rel="nofollow" does not mean the links are deemed bad in any way; every link in a blog comment will be treated in the same fashion. The webmaster is essentially saying:

"this link was not put here by me, so ignore it and do not pass any "link juice" on to it".

More on nofollow by Search Engine Watch.
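In practice this just means adding the attribute to the anchor tag; a hypothetical comment link (the URL is a placeholder) would be marked up like this:

<a href="http://example.com/" rel="nofollow">a commenter's site</a>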

Putting Webmasters in Control

Putting this kind of control in webmasters' hands hasn't been without controversy. People will always try to experiment with ways of manipulating the intended outcome to favour their own goals, such as using nofollow internally in their site etc. Others have welcomed the move as a way of reducing the problem of spamming.

Dollar - Pound Exchange Rate Hits British Web Publishers

The British Pound broke through the psychological barrier of $2 yesterday due to the relative strength of the British economy. For us Brits this has some advantages, like cheap shopping trips to New York, and some negatives: companies who export goods to the US will suffer because their goods become more expensive to American importers.

It also affects British web publishers who earn money from American companies. Affiliate programs like Google's AdSense, Amazon Associates etc. all pay in US dollars. Some schemes have the option of holding payments, but with the weakening economy in the US this exchange rate might be with us for some time.
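To put some illustrative numbers on it (the figures are hypothetical): a $1,000 AdSense payment converts to £500 at $2.00 to the pound, whereas at $1.75 it would have been roughly £571. That's a drop of around 12.5% in Sterling for exactly the same Dollar earnings.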

Wi-Fi hacking - 2 Cautioned by UK Police

For the first time in the UK, two people have been cautioned by police for accessing wireless broadband connections without permission. Both cases were detected through suspicious behaviour in cars parked in the vicinity, not through electronic means.
Both people were cautioned for dishonestly obtaining an electronic communications service with intent to avoid payment.

Most wireless routers come with Wi-Fi encryption turned off by default, leaving unsavvy users open to this kind of abuse.
Most broadband ISPs' terms and conditions state that you cannot share your broadband connection with your neighbours etc.; therefore all activity on your connection is connected with you.
Due to recent laws, ISPs have to keep records of your Internet activity for a number of years. If unauthorised people are accessing your connection and using it for illegal practices, how would you prove your innocence?

Recently news has come out that anti-piracy companies are monitoring P2P traffic; using a modified version of Shareaza, they automatically send your IP address to your ISP, demanding your details, if they detect that pirated material is being downloaded. Some people have questioned whether an IP address is enough evidence to connect a person with a crime, especially considering these cases of drive-by Wi-Fi hacking.

Is there still a place for site newsletters in the Web 2.0 world?

More and more sites are adopting XML syndication technologies such as RSS and Atom, which users can subscribe to.
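At its simplest, a feed is just an XML document listing your recent content. Here is a minimal, hypothetical RSS 2.0 feed (the URLs and titles are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Nik's Technology Blog</title>
    <link>http://www.example.com/</link>
    <description>Travels through programming, networks, and computers</description>
    <item>
      <title>Is there still a place for site newsletters?</title>
      <link>http://www.example.com/site-newsletters</link>
      <description>RSS versus the email newsletter.</description>
    </item>
  </channel>
</rss>

An RSS reader polls this document periodically and shows the subscriber anything new, which is exactly what makes it a pull technology.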

Pull Technology

Rather than being a push technology like email newsletters, RSS is a pull technology. The subscriber is in full control of the subscription; the publisher does not have a relationship with the subscriber, or need to know their email address. This makes unsubscribing very easy, and because you don't need to supply an email address, you don't need to worry about whether your details will be sold on by unscrupulous companies.

RSS Adoption

XML syndication has been around for five years or so, but in the early days the RSS readers available weren't up to scratch, so it took a while for the technology to gather momentum. Nowadays there are plenty of good readers, such as Bloglines and Google Reader, which are very polished products that support all the major formats.

RSS Advertising

The last nail in the email newsletter's coffin will be the adoption of RSS advertising into the mainstream. Currently Google and Yahoo! are running tests with advertising on these syndication formats. As soon as these products are released, the already strong relationship Google has with publishers will allow it to rapidly make RSS very lucrative for website publishers.

Syndication Analytics

Until recently, publishers syndicating their content via RSS had a hard time analysing their circulation; that's where companies such as FeedBurner have found a niche, and they continue to provide publishers with additional services on top of basic subscription tracking.

Syndication SPAM

Of course, syndicating your content is just another method of publishing. First you had paper, then HTML, now XML. You can't eradicate SPAM with RSS; people can set up SPAM blogs etc., but it's the subscribers who are in control of their subscriptions. So as a publisher you know that the 500 subscribers reported by your RSS analytics product of choice are actively reading your content, or else they'd simply click to unsubscribe from within their RSS reader application. Compare that to a database of registered subscribers dating back several years; are those users viewing your newsletter in their preview pane and pressing delete rather than unsubscribing via an unsubscribe link?

Content is King

The old adage that 'content is king' is truer than ever with RSS syndication. The problem with giving such power to the subscriber is that your content needs to be top-notch in order to keep your subscribers subscribed. Even though there are guidelines specifying opt-out and unsubscribe methods and practices which newsletter senders must adhere to, the fact is that unsubscribing from RSS is far easier and is not reliant on differing geographic data protection laws.

Cybersquatters lose out after Royal break-up

Such was the certainty that Prince William and Kate Middleton were to get married that even high street stores had begun stocking their shelves with commemorative gifts. Now that it's all over for the couple, I feel for all the cybersquatters who bought up all the available Kate and Wills domain names in the hope that they could cash in on what was to be the next Royal wedding. Perhaps it wasn't such easy money after all...

It's not only people in the public eye who fall foul of cybersquatting. Big organisations like Microsoft have similar problems, with squatters cashing in on the brands and trademarks of other companies by duping users who type in slight variations of the real URL into clicking adverts on these sites.

Does Twittering have a place in Business?

Is there supposed to be a point behind Twittering, I asked myself? The Twitter.com site is pretty sparse on describing a particular use for its service apart from "What are you doing now?". Maybe not limiting its boundaries is part of its success?

I'm always willing to try out new technology; I'd describe myself as an early adopter. Now I'm not saying I won't sign up and become a fellow Twitterer in the near future, but from the outside, without experiencing it firsthand, I can't see why anybody would be interested in a rolling commentary of what some other individual is doing right now. I guess if you're into instant messaging (IM) or texting and want to let all your friends and family know what you're up to, broadcast fashion, it might be a powerful tool. I'm guessing adding a Twitter feed to a MySpace.com page would be the best place for this kind of information.

Celebrity Tweets

A celebrity Twitter feed, on the other hand, might be extremely popular in this celebrity-obsessed world we live in. Just imagine the sort of Tweets Paris Hilton would send from her Sidekick cell phone! And the hordes of followers that would subscribe.

What about Twitter use in a business environment?

Blogs, instant messaging and texting have all been adopted by businesses, and they all started out in the consumer space, but what business problems could Twitter solve?

I can see it being used internally inside companies for staff to keep line managers updated on what tasks they are working on. After all, Microsoft, amongst others, has found business uses for IM.
Public relations could be another use, as could musicians and bands keeping their loyal fans up to date while on tour.