Nik's Technology Blog

Travels through programming, networks, and computers

Nike+ SportBand Review

Last year I decided to get healthy and take up running.  Being a gadget lover, I had been researching the iPod nano and Nike+ Sport Kit.

I liked the idea of being able to record my progress, set myself challenges, and map my routes; see http://nikeplus.nike.com/nikeplus/ for more details.

The thing was, I already had an iPod classic, which was too bulky to run with and not compatible with the Nike+ kit.  I didn't really want to fork out for another iPod just to take out for a run, so I was pleased when Nike released the SportBand.  I ordered one and started my training.

That was about a year ago, and I've been using the Nike+ SportBand for all my runs since, clocking up over 300km in that time.

Review

I thought I'd write a quick review of the product to share my experience with the SportBand and the Nike+ website, which forms an integral part of the product, since all your stats are uploaded to the site.

Hardware

The hardware consists of a watch and a shoe adapter.  The shoe adapter is designed to fit into special Nike+ running shoes, which I already had.  If you don't want to purchase Nike+ running shoes, search eBay for "nike+ sensor"; you can buy sensor pouches that fit on your shoelaces instead.

Before each run you have to hold down the big button on the face of the watch to sync the shoe adapter to the watch, then after a few seconds when you are ready to run, you just press the same button again briefly to start and stop the clock.

After using this for a while, you wonder what Nike was thinking when they designed the watch.  First of all, it's not very sturdy: my LED display broke after a few months' use (see photo, left-hand side).  The angle of the screen and the reversed LED display are also not at all easy to read when you are running.  Personally, I would have been willing to pay a little more for a better watch.

Software and Website

When you get back from your run, you simply detach the watch from its strap and plug it into your computer's USB port to upload your run data.  The software driver that you install on your PC also allows you to calibrate your device; however, I found that it's not very accurate, and if you increase your pace you need to recalibrate the device.

The Nike+ website has been produced in Flash and looks visually impressive, but I find it a bit cumbersome to use; personally, I would prefer an HTML website with embedded Flash graphs.

All your runs appear in a bar chart, with a calendar running across the bottom.  When you hover over a run you get more details for that particular run.  If you click on a run you get a timeline for it, with km/mile marker points and your pace at those positions.

You can also map your runs before or after a training session to either gauge how far a route is or to assign certain runs to a particular route.  This is useful so you can see your progress over the same route.

The Nike+ website also has a social element, allowing you to challenge other Nike+ users and run routes others have mapped.  However, the interface isn't as intuitive as it could be.  You can also create widgets to show your training overview on your blog or social profile; take a look at mine on the "About Me" page.  They also provide a Facebook app, but I have never managed to get it to work.

Summary

Nike+ isn't perfect, and I think professional runners should probably look elsewhere, but for people like me who just run to keep fit and don't take it too seriously, it helps keep track of progress and stay motivated.

As a developer, it would also be nice to get access to my run data through an API.  There are ways to do it, but it would be easier if Nike were to publish an SDK or API documentation.  Services such as Twitter have thrived on third-party applications that leverage their API.  What are you waiting for, Nike?

TargetInvocationException - Exception has been thrown by the target of an invocation

This exception isn't very useful because it's a general exception thrown whenever a method invoked through reflection throws an exception, so it took me a while to figure out what the issue was, even though I knew which page was causing the error.

This exception was thrown by an ASP.NET web form which contained a GridView connected to an ObjectDataSource.

The ObjectDataSource references methods in a data access layer class.  These methods then call stored procedures in the MS SQL database. 

The code worked perfectly in my development environment.

I have the SQL database set up so that the database user ASP.NET connects with only has rights to execute the stored procedures it needs.  The database user cannot run commands against the tables directly; this way I limit the attack surface should my application have a weakness that could be exploited.
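As a rough sketch of that set-up (the user and procedure names here are made up for illustration, not the ones from my application):

```sql
-- Hypothetical example: a locked-down database user
CREATE USER WebAppUser FOR LOGIN WebAppLogin;

-- Grant execute on only the stored procedures the application needs
GRANT EXECUTE ON dbo.GetOrders TO WebAppUser;
GRANT EXECUTE ON dbo.GetOrderById TO WebAppUser;

-- Note: no SELECT, INSERT, UPDATE or DELETE is granted on the tables themselves
```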

I double checked the stored procedures were all up-to-date, then double checked all the permissions on the stored procedures, and everything seemed in order, but I still kept getting the stack trace below in Event Viewer:


    Stack trace:    at System.RuntimeMethodHandle._InvokeMethodFast(Object target, Object[] arguments, SignatureStruct& sig, MethodAttributes methodAttributes, RuntimeTypeHandle typeOwner)
   at System.RuntimeMethodHandle.InvokeMethodFast(Object target, Object[] arguments, Signature sig, MethodAttributes methodAttributes, RuntimeTypeHandle typeOwner)
   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture, Boolean skipVisibilityChecks)
   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
   at System.Web.UI.WebControls.ObjectDataSourceView.InvokeMethod(ObjectDataSourceMethod method, Boolean disposeInstance, Object& instance)
   at System.Web.UI.WebControls.ObjectDataSourceView.ExecuteSelect(DataSourceSelectArguments arguments)
   at System.Web.UI.DataSourceView.Select(DataSourceSelectArguments arguments, DataSourceViewSelectCallback callback)
   at System.Web.UI.WebControls.DataBoundControl.PerformSelect()
   at System.Web.UI.WebControls.BaseDataBoundControl.DataBind()
   at System.Web.UI.WebControls.GridView.DataBind()
   at System.Web.UI.WebControls.BaseDataBoundControl.EnsureDataBound()
   at System.Web.UI.WebControls.CompositeDataBoundControl.CreateChildControls()
   at System.Web.UI.Control.EnsureChildControls()
   at System.Web.UI.Control.PreRenderRecursiveInternal()
   at System.Web.UI.Control.PreRenderRecursiveInternal()
   at System.Web.UI.Control.PreRenderRecursiveInternal()
   at System.Web.UI.Control.PreRenderRecursiveInternal()
   at System.Web.UI.Control.PreRenderRecursiveInternal()
   at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)

I then went back through the stored procedures one more time, and it dawned on me that one of them used EXEC sp_executesql, which requires the database user to have, in my case, SELECT permission on the actual table itself.

Once I granted these permissions to the database user, the web form loaded correctly.
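In other words, ownership chaining covers the static statements inside a stored procedure, but dynamic SQL run via sp_executesql is checked against the caller's own table permissions.  The fix was along these lines (names made up for illustration):

```sql
-- The procedure builds its query dynamically, so EXECUTE on the
-- procedure alone is not enough; the user also needs rights on the table
GRANT SELECT ON dbo.Orders TO WebAppUser;
```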

ASP.NET Content Disposition Problem in IE7

I've just spent quite a while debugging a content-disposition problem with Internet Explorer 7; the code works fine in Firefox but causes this error message in IE7:

"Internet Explorer cannot download xxx from xxx."

"Internet Explorer was not able to open this Internet site.  The requested site is either unavailable or cannot be found.  Please try again later."


This was my original snippet of C# code:

Response.Buffer = true;
Response.ClearContent();
Response.ClearHeaders();
Response.ContentType = docToDisplay.Type.ContentType.ToString();
Response.AddHeader("Content-Disposition", "attachment;filename=" + Server.UrlEncode(docToDisplay.FileName));
Response.Cache.SetCacheability(HttpCacheability.NoCache);

Response.BinaryWrite(docToDisplay.FileContent);
Response.End();
Response.Flush();
Response.Close();

Response.Cache.SetCacheability

I eventually figured out that the following line of code was causing the issue.

Response.Cache.SetCacheability(HttpCacheability.NoCache);

I then did a quick search for "Response.Cache.SetCacheability(HttpCacheability.NoCache);" and discovered another developer who had run into the same Content-Disposition issue.  Unfortunately for me, that page wasn't returned when I was searching for the Internet Explorer error message.

This was the response to the post by Microsoft Online Support:

"Yes, the exporting code you provided is standard one and after some further
testing, I think the problem is just caused by the httpheader set by
Response.Cache.SetCacheability(HttpCacheability.No Cache)
I just captured the http messages when setting and not setting the above
"NOCache" option and found that when the http response returned the
Cache-Control: no-cache
header. So we can also reproduce the problem when using the following code:
page_load...
{
Response.CacheControl = "no-cache";
ExportDataGrid(dgSheet,"test.xls");
}
IMO, this should be the clientside browser's behavior against "no-cache"
response with stream content other than the original text/html content. So
would you try avoid setting the CacheAbility or the "Cache-Control" header
to "no-cache" when you'd like to output custom binary file stream?
Thanks,
Steven Cheng
Microsoft Online Support"

After removing the Response.Cache.SetCacheability line, the file downloads correctly in Internet Explorer.
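For reference, the working version is simply the original snippet with the caching line removed (and, while I was there, the redundant calls after Response.End taken out, since End flushes the buffer and stops execution anyway):

```csharp
Response.Buffer = true;
Response.ClearContent();
Response.ClearHeaders();
Response.ContentType = docToDisplay.Type.ContentType.ToString();
Response.AddHeader("Content-Disposition", "attachment;filename=" + Server.UrlEncode(docToDisplay.FileName));
// No Response.Cache.SetCacheability(HttpCacheability.NoCache) here

Response.BinaryWrite(docToDisplay.FileContent);
Response.End(); // flushes the buffer and ends the response
```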

Create a Simple Windows Service to Request a URL at Set Intervals

I needed a simple Windows Service to request a web page at set intervals indefinitely. Windows Services are the best way of doing this as they have the ability to start automatically when the computer boots up and can be paused, stopped and restarted. You can also get them to write events to the Windows Event log.

I found this Windows Service sample tutorial on The Code Project and downloaded the code to familiarise myself with the basics.  The tutorial lacked a timer and the code to request a URL, though, so I had to add that functionality myself.

Visual Studio Standard edition doesn't have a Windows Service template, but you can still create a Windows Service, you just need to do a bit of extra work.

After some research and a bit of coding I added two new methods:

private void ServiceTimer_Tick(object sender, ElapsedEventArgs e)
{
    // Stop the timer while the request runs so ticks can't overlap
    this.timer.Stop();
    DoWork();
    this.timer.Start();
}

void DoWork()
{
    using (WebClient client = new WebClient())
    {
        client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)");

        // Replace with the URL you want to request
        using (Stream data = client.OpenRead("http://www.example.com/"))
        {
        }
    }
}

I overrode the OnStart() method of ServiceBase to enable the timer and start it. I also overrode the OnStop() method to disable the timer.
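Those overrides are only a few lines each; mine looked roughly like this:

```csharp
protected override void OnStart(string[] args)
{
    // Start firing ServiceTimer_Tick at the configured interval
    this.timer.Enabled = true;
    this.timer.Start();
}

protected override void OnStop()
{
    // Stop the timer so no further requests are made
    this.timer.Stop();
    this.timer.Enabled = false;
}
```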

The DoWork() method simply creates an instance of WebClient and reads in the URL you want to request.

Then in the constructor I set the timer interval and added an event handler to raise the ServiceTimer event when the interval elapses. The event handler simply stops the timer, calls the DoWork() method and then restarts the timer.

public static void Main()
{
    ServiceBase.Run(new Service3());
}

public Service3()
{
    InitializeComponent();

    // Interval of 1 hour (3,600,000 milliseconds)
    double interval = 3600000;

    timer = new Timer(interval);
    timer.Elapsed += new ElapsedEventHandler(this.ServiceTimer_Tick);
}

To install the Service you need to publish the project in Visual Studio. Then use InstallUtil.exe following the process below:

  1. Open a Visual Studio .NET Command Prompt
  2. Change to the bin\Debug directory of your project location (bin\Release if you compiled in release mode)
  3. Issue the command InstallUtil.exe MyWindowsService.exe to register the service and have it create the appropriate registry entries
  4. Open the Computer Management console by right clicking on My Computer on the desktop and selecting Manage
  5. In the Services section underneath Services and Applications you should now see your Windows Service included in the list of services
  6. Start your service by right clicking on it and selecting Start

Each time you change your Windows Service you will need to uninstall and reinstall it.  Before uninstalling the service, make sure you close the Services management console.  To uninstall, simply reissue the same InstallUtil command used to register the service with the /u switch added.

e.g. InstallUtil.exe /u MyWindowsService.exe

When you install the service on a server you can find the InstallUtil.exe in the .NET framework folder e.g. C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727


Can we rely on Cloud services to look after our data?

With the current trend moving from desktop applications to web applications, more and more of our data is being stored in the data centres of computer service companies around the world, rather than on our computer hard disks like in the old days.

The benefit of storing data in "the cloud" is that you can access your documents anywhere you have an Internet connection.  However, you have to ask yourself: what happens if my data goes missing?

This month the social bookmarking site Ma.gnolia, which I had been using for two years or so to store all my web bookmarks, suffered a data loss which brought the service down, and it doesn't look like any of their data recovery attempts will be able to resurrect the data or the service.

I've also noticed that occasionally Google Docs has problems retrieving saved documents.

Luckily, I was able to use Ma.gnolia's recovery tool to recover most of my bookmarks from the web cache, and I have since imported them into delicious.com.  Incidents like this should be a reminder to us all that it pays to take care of your important data and not rely on such services to store anything that matters to you.

Of course the hard disk on your computer could fail too, but you all perform regular back ups, so that's not a problem is it... :-)

Since I wrote this post, Gmail has had an outage and TechRadar has posted an article similar to mine.

Stricter Guidelines for the Use of Voucher Codes on Affiliate Websites

From 1st January 2009, there will be stricter guidelines for voucher code use on affiliate websites in the UK.
The Affiliate Marketing Council (AMC), part of the Internet Advertising Bureau, has issued a code of best practice to avoid putting its members' brands at risk.
The following affiliate networks have so far signed up to the new code of conduct.

  • Affiliate Window
  • Affilinet
  • Commission Junction UK
  • DGM-UK
  • TradeDoubler
  • Platform-A’s buy.at
  • Premier Affiliate Network
  • Webgains
  • Zanox

The voucher code best practice press release states that affiliates will need to comply with the following come 1st January 2009:

  • Affiliate publishers are not permitted to use 'Click to Reveal' when no valid or current code is present, including using 'Click to Reveal' to show deals/offers/sales instead of vouchers.
  • Voucher code affiliate publishers must clearly detail the voucher offer that will be revealed by the click.
  • A valid code is defined as a code that has been legitimately issued by a merchant for use online. This code will have an activation date and where necessary a deactivation date.
  • Voucher code directories must contain clear categorisation and separation between deals/offers/sales and discount codes.

I'm not quite sure how this will affect sites such as hotukdeals.com, which links to vouchers posted by members of its forum, since it relies on user-generated content, which is most probably unmoderated.

Authenticate with MailEnable SMTP using ASP.NET 2.0

I've recently been trying to send emails using ASP.NET 2.0 on a web server which uses MailEnable. MailEnable had relaying locked down to avoid opening the server up to spammers, but I wasn't able to authenticate with MailEnable to relay emails to external domains.

I tested that my code was working by temporarily ticking the "Allow relay for local sender addresses" checkbox in the SMTP properties in MailEnable, and my email was relayed successfully.  However, as soon as I unchecked this option I got this error message in the SMTP W3C logs:

503+This+mail+server+requires+authentication+when+attempting+to+send+to+a+non-local+e-mail+address.+Please+check+your+mail+client+settings+or+contact+your+administrator+to+verify+that+the+domain+or+address+is+defined+for+this+server.

I eventually got it working by changing the authentication type from "MailEnable integrated authentication" to "Authenticate against the following username/password" and supplying a username and password to MailEnable.
After restarting the SMTP service in MailEnable, I was able to relay emails using the following C# ASP.NET 2.0 code snippet:


using System.Net;
using System.Net.Mail;

MailAddress source = new MailAddress("admin@localdomain.com", "Server");
MailAddress recipient = new MailAddress("external@mail.com", "External");

MailMessage enquiryMail = new MailMessage(source, recipient);
enquiryMail.Subject = "Test";
enquiryMail.Body = "Test content";
enquiryMail.IsBodyHtml = true;

SmtpClient smtpServer = new SmtpClient("mail.localdomain.com");
smtpServer.Credentials = new NetworkCredential("username", "password", "localdomain.com");
smtpServer.Send(enquiryMail);

Apache Web Development Testing Server Set-up

I've been setting up an Ubuntu Linux machine for web development testing on my local network, but because the machine has only one IP address assigned to it, I wanted a way to serve more than one website without changing the Apache config each time.

One method would be to use a separate folder for each site, but .htaccess rewrites don't work properly that way, and hyperlinks that refer to the root of the site (e.g. /contact.php) can cause issues too.

Anyway, I thought of using different port numbers to distinguish the testing websites.  So one site may be on 192.168.0.1:2000, another on 192.168.0.1:2001, and so on.

You set this up in Ubuntu as follows:

Open a Terminal prompt and type:

sudo gedit /etc/apache2/ports.conf

Enter your password and then for each port number you require add a Listen statement on a new line. So your file should look like:

Listen 80
Listen 2000
Listen 2001

<IfModule mod_ssl.c>
    Listen 443
</IfModule>

Save the file and then go back to the Terminal and type:

sudo gedit /etc/apache2/sites-available/default

In this file you should find a <VirtualHost *> tag followed by various directives.  Copy everything from <VirtualHost *> to </VirtualHost> and paste it on a new line below.
Then rename the first <VirtualHost *> to <VirtualHost 192.168.0.1:80>, where 192.168.0.1 is the IP address of your machine.  Rename the second virtual host to <VirtualHost 192.168.0.1:2000>, and repeat for as many websites as you want to set up.

Then for each Virtual Host you'll need to change the DocumentRoot to the file path to each website on the local machine.
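After these edits, the sites-available/default file contains one block per site, along these lines (the IP address and paths are examples; keep the other directives from your original default vhost):

```apache
<VirtualHost 192.168.0.1:80>
    DocumentRoot /var/www/site1
    # ...remaining directives from the original default vhost...
</VirtualHost>

<VirtualHost 192.168.0.1:2000>
    DocumentRoot /var/www/site2
    # ...same directives, pointing at the second site...
</VirtualHost>
```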

Once this is done you'll need to restart Apache to see if your changes have been successful. To do this type the following into the Terminal:

sudo /etc/init.d/apache2 restart

Freesat for free HD TV in the UK

If you haven't already heard about Freesat then you should look into it.  I'm not referring to the free satellite service from Sky, but to the new not-for-profit organisation set up by the BBC and ITV to help distribute digital TV to areas of the UK where Freeview signals are not strong enough.
If you own a new high-definition-capable flat screen TV, or are looking to buy one, you should also investigate Freesat.  Not only will it allow a greater percentage of the British population to receive digital TV for free, it also carries free high-definition content: currently the BBC HD and ITV HD channels, which are far superior to their standard-definition counterparts.

Essentially you need a satellite dish and a new set-top box or Freesat capable TV to receive the broadcasts. The Freesat service uses the same satellite as Sky, so it is apparently possible to use a Sky dish and plug in a Freesat tuner instead of a Sky box.

According to a sales assistant in my local Richer Sounds every retailer stocking Freesat equipment has to be a registered Freesat installer and they charge a set fee of £80 to install the service for you. This install price is set by Freesat and should be the same for each registered installer.

Panasonic have announced the first TV with a built-in Freesat tuner, which should be launched in time for the Olympic Games, due to be broadcast in HD on the BBC HD channel.

Can Freesat and Sky Co-exist?

What I'm wondering is: can Sky and Freesat be picked up with the same dish simultaneously, if you have a quad LNB on the dish and an additional run of coaxial cable to the Freesat box?

If so, you could have high-definition TV in more than one room without paying Sky's multi-room fee, and you would get to keep Sky in one room so you can still pick up the channels not available on Freesat, like Living TV.

I put this question to the sales guys at the Panasonic stand in the Bluewater shopping centre, who were demonstrating the new Freesat-capable panels.  Although they admitted that you can use a Sky dish to receive Freesat, they weren't sure about the two services co-existing.

If they won't co-exist, maybe they can be switched?  After all, how many people want a second dish stuck to their house?
Personally, I could make do without the garbage US TV shows Sky broadcasts, but it might upset my girlfriend if she's unable to watch her shows :-)

Internet World - eBusiness Legal Tips

It's been a couple of weeks since I attended Internet World at London's Earls Court, and the follow-up emails have started to arrive in my inbox from the exhibitors I exchanged business cards with.
The show ran for three days, but I only managed to attend on the last day (1st May 2008).

During the exhibition, as well as general networking, I attended several presentations about marketing, social media, search and e-commerce.
One particular presentation by Nigel Miller of Fox Williams LLP was about legal tips for safe selling online.
This topic will bore many developers, entrepreneurs and start-ups, because they don't understand or see the importance of legal issues and just want to get their idea, business or product live on the web.
Having seen the potential problems of ignoring legislation first hand, I was particularly interested in what Nigel had to say.

I'm one of those people who tends to read the odd terms and conditions page or End User License Agreement (EULA), and I find that the language these documents are written in doesn't make for easy reading, so I was pleased that this presentation was in simple, plain English.

The presentation was not an exhaustive list of all the legal rules and regulations a website needs to comply with, but it highlighted the areas that are frequently misunderstood or ignored completely.  It focused mainly on UK rules and regulations, such as:

  • Sector specific compliance
  • Web Accessibility compliance
  • Company information which must be on the website
  • Intellectual property and ownership
  • Complying with the Data Protection Act
  • Terms and conditions and disclaimers
  • Pricing errors
  • Distance selling regulations and consumer rights

Nigel's full presentation, entitled "Risky business; legal tips for safe selling online", can be downloaded as a PDF from Fox Williams' ebizlaw™ website.

Nigel Miller is a partner at Fox Williams LLP.