Use this T-SQL script to anonymize a user's data in ASP.NET's Membership database.
declare @email varchar(255)
set @email = 'firstname.lastname@example.org'

-- Clear out any stored profile data (default aspnet_Profile table)
update dbo.aspnet_Profile
set PropertyNames = '', PropertyValuesString = ''
where UserId = (
    select UserId from dbo.aspnet_Membership
    where Email = @email
)

-- Overwrite the membership record itself (default aspnet_Membership table)
update dbo.aspnet_Membership
set Email = 'email@example.com', LoweredEmail = 'email@example.com', IsApproved = 0,
    PasswordQuestion = 'deleted', PasswordAnswer = 'deleted', [Password] = 'deleted'
where UserId = (
    select UserId from dbo.aspnet_Membership
    where Email = @email
)
After our Royal Mail API integration was migrated to the new Pro Shipper API platform in June 2018 it stopped working, and we started getting the following response back from Royal Mail: "The remote server returned an unexpected response: (405) Method Not Allowed.".
We are using the SOAP version of the API with a .NET client originally based on this example integration by povilaspanavas.
It was difficult to find any details on what we needed to do to get the integration working again, as we were under the impression that no changes would be needed on our end. This is the email response we got back from Royal Mail:
We are sorry to hear that your API connection has connection issues after migration to Royal Mail Pro Shipper.
If you are receiving blank API responses or “Authorization Failure” responses from the API, your connection issues can be resolved quickly by following the instructions in the attached document and setting up your API Password in Royal Mail Pro Shipper. Until you do this, your API calls will not work.
- The confirmation email for migration would have been sent to your DMO email address, your API Developer Portal Address and if you completed the migration web form, the email address listed as the contact email address. If you did not receive the email, please contact email@example.com
- Your Royal Mail Pro Shipper Login will be the same login username as your DMO email address.
- If you changed your API Password in Pro Shipper, you need to advise your IT Developer to update the password in your API calls.
Some important notes about the API migration are:-
- If you have coded the API connection in .Net, you will need to update to SOAP 1.2 from SOAP 1.1
- The API responses now have an Upper Case element name for SOAP-ENV instead of lower case
- The API responses no longer return warnings in the integration footer
- The 2D Item ID now returns a hex value instead of a decimal value.
- The Channel ID in the 2D barcode has changed. If you are generating the 21 Character UID from the 2D barcode in your system you will need to update your system not to convert the 2D Item ID to Hex and you will need to update the Channel ID.
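For reference, the SOAP 1.2 change mentioned in the first note above can be made in a .NET WCF client by swapping the generated basicHttpBinding (which speaks SOAP 1.1) for a custom binding. This is a minimal sketch under my own assumptions; the binding name is made up, and this is not Royal Mail's published configuration:

```xml
<!-- app.config sketch: a customBinding that sends SOAP 1.2 over HTTPS.
     "shippingApiSoap12" is a hypothetical name, not Royal Mail's. -->
<bindings>
  <customBinding>
    <binding name="shippingApiSoap12">
      <textMessageEncoding messageVersion="Soap12" />
      <httpsTransport />
    </binding>
  </customBinding>
</bindings>
```

The client endpoint then references this with binding="customBinding" and bindingConfiguration="shippingApiSoap12".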
We made sure the API password was set up correctly in Pro Shipper, but that didn't fix the problem.
After some trial and error I managed to fix our integration by adding an HTTP header to the CreateShipment call, and by removing all the code that tried to read warnings from the integration footer, as these are no longer sent back.
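For anyone stuck on the same 405, this is the general WCF pattern for attaching an extra HTTP header to an outgoing call. It is a fragment rather than a complete program: it assumes a generated `client` proxy and a populated `request` object, and the header name and value are placeholders rather than the actual header we added:

```csharp
// Sketch: attach a custom HTTP header to a single WCF call.
// "X-Custom-Header" and its value are placeholders.
using (new OperationContextScope(client.InnerChannel))
{
    var httpRequest = new HttpRequestMessageProperty();
    httpRequest.Headers["X-Custom-Header"] = "value";
    OperationContext.Current.OutgoingMessageProperties[HttpRequestMessageProperty.Name] = httpRequest;
    var response = client.CreateShipment(request);
}
```

The OperationContextScope limits the header to that one call, which keeps the change out of the rest of the client code.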
We have an existing ASP.NET MVC project which has grown to the point where switching to the Bootstrap framework required a substantial amount of work on the Views. Unfortunately this wasn't a high priority, so we couldn't afford to do it in one go. Instead we had to find a solution that would let us deploy to production using the old Views and Layouts, so we could introduce new functionality mid-way through the Bootstrap conversion.
To overcome this problem and treat the Bootstrap upgrade as a side project, we needed a way to switch from Bootstrap back to the old Views and Layout quickly and easily, and then back again.
This was our solution to the problem:
- We created a Bootstrap MVC _Layout and added it to the solution.
- We created an ActionFilterAttribute class called UseBootstrapLayout which we used to decorate the Actions to denote which Layout the Action method should use to render the View.
- We copied each View we were going to upgrade to Bootstrap, giving the copy the same name with a 2 appended to the filename.
- In _ViewStart.cshtml we added the code (below) to check whether the View name ended with a 2, which allowed us to switch to the Bootstrap Layout if it did.
If we needed to deploy the application before the Bootstrap update was complete, we simply changed the _ViewStart.cshtml code to use the old _Layout. All the other code changes and new Views could stay the same.
/// Changes the view name used by an action method.
/// We are using this to enable us to re-skin the system.
public class UseBootstrapLayout : ActionFilterAttribute
{
    public override void OnActionExecuted(ActionExecutedContext filterContext)
    {
        // Append "2" so MVC resolves the copied Bootstrap view instead
        filterContext.RouteData.Values["action"] = filterContext.RouteData.Values["action"] + "2";
    }
}

The code in _ViewStart.cshtml that picks the Layout:

@{
    var view = HttpContext.Current.Request.RequestContext.RouteData.Values["action"].ToString();
    Layout = view.EndsWith("2")
        ? "~/Views/Shared/NewBootstrapLayout.cshtml"
        : "~/Views/Shared/_Layout.cshtml";
}

And the attribute applied to an Action:

[UseBootstrapLayout]
public ActionResult Index()
{
    return View();
}
It's been a year since we sold our house and moved, and that year has gone so quickly!
I was confident that getting the house marketed on Rightmove and Zoopla would be enough to sell the property, but unfortunately you can't post directly on to these portals unless you are a registered estate agent. All estate agents are now so dependent on these property portals that they tried to create their own, onthemarket.com. However, with its strict membership conditions I believe it is doomed to fail.
So after getting a few agents around for a valuation and lots of research into the best online estate agents we finally bit the bullet and chose an online agent and paid the flat fee up-front. Paying upfront would put most people off, but at £800 it was less than a quarter of the fee of the other agents.
Even though the agent we chose (PurpleBricks) was a hybrid agent, which assigns you a property expert to provide a valuation, take photos and create your brochure, I decided I wanted to take more control: I took the photos myself and wrote the description, to make it more personal.
I let the agent create the floorplan and do the measurements.
The PurpleBricks portal gives you control over scheduling viewings and amending the brochure and photos, and it also lets you see the offers come in, which I found interesting, but definitely not for everyone.
All in all it was a good experience, we sold on our first "open day" and we saved a lot of money over the traditional high street estate agents that pitched to us.
I would also like to thank Perry Power for his free advice on selecting an agent and pricing a property to sell. These free guides are available on his website.
Even though some NAS providers advertise their products as "having" Plex Media Server apps, all but the most expensive and powerful cannot actually perform video transcoding; all they can do is direct play and direct stream content. This is fine if all your devices support the media you have, but it is very restrictive, and the marketing is misleading.
Video transcoding is a very CPU-intensive task, and consumer NAS boxes have low-powered CPUs; they aren't designed for CPU-intensive work.
You can store your media on a NAS and use a PC running Plex Media Server to do all the heavy lifting. However, when you factor in the cost of a NAS and the hard disks, it made more sense to me to use an existing PC, or a new custom-built one, and add the hard disks directly to it. This also avoids the latency of accessing the content from a NAS over the network.
After deciding against using a NAS for Plex, I went about checking what the requirements are to run Plex Media Server on a PC.
My current PC is old and in need of replacement, but perhaps I could upgrade the storage and when I buy a new PC at a later date, the old one can become a dedicated media streaming box?
The Plex website has some guidance on how much CPU power you need to run Plex Media Server. Based on my old CPU's PassMark benchmark score of 3057, I was able to confirm it is capable of handling at least one 1080p stream.
The alternative would have been to build a new custom PC especially for Plex, but I don't have that sort of spare cash at the moment.
I had already installed Plex Media Server on my PC and had been pleasantly surprised by how good the software was, so much so that I now wanted to rip all my DVDs to hard disk. After a few calculations I figured out I would need quite a bit more storage to hold all my ripped DVDs for use with Plex Media Server.
Originally I thought that a NAS device might be the way forward. I could get a four bay device for future expandability, and upon first glance it appeared that PMS would run on selected NAS boxes too. I almost bought a four bay Netgear ReadyNAS, until I read that the CPU in most NAS boxes just isn't up to the task of transcoding video.
That would mean the NAS would just store the content and the PC would need to be switched on to act as the Plex Server. This to me, just didn't seem worth it.
Why not just add some extra hard disks to my PC I thought? But what about the added benefit of RAID you get from a NAS?...
I ended up buying some 4TB WD Red NAS hard disk drives and using Storage Spaces in Windows 10 to set up resiliency, mimicking the RAID you get from a NAS box.
It was probably around 2002 or 2003 that I ripped my whole CD collection to MP3 so I could listen on the move, which I recall was a painful process. So the thought of doing the same with my DVD collection wasn't too appealing.
As this was going to be a painful process I only wanted to do this once, so I spent some time deciding on the video codec and container to use.
Plex works with most codecs and containers (except ISO disc images). This is one of the best things about Plex Media Server: it transcodes the video on-the-fly depending on what the Plex client hardware is able to play. This enables you to watch video on phones, tablets, smart TVs etc. and not have to care whether they play AVI, MP4, MKV and so on.
I also didn't want to re-encode all the DVDs during the process, as this would take too long and potentially degrade the video quality.
The final solution was a two stage process.
- Rip each DVD using DVD Fab into VOB files.
- Use MakeMKV Batch Converter to merge the VOB files into the MKV container.
All that was needed then was a load more hard disk space!
After reading how to sideload a Plex client on a NOW TV box I decided to pick one up during the Black Friday sale for £14.99.
They are basically Roku 3 boxes rebranded as NOW TV (owned by Sky) and feature a limited app store with competing apps such as Netflix and LoveFilm removed.
I am thinking of using Plex Media Server as a way of ditching my Blu Ray player and ripping all my DVDs to stop my kids scratching them to pieces. Plex indexes the movies and downloads the covers and descriptions from internet movie databases.
Because the NOW TV box has a limited app store, Plex is not available to download directly, so it needs to be sideloaded onto the box.
Sideloading the Roku Plex client onto the NOW TV box basically involves accessing the developer part of the box and installing a package file, which takes 5 minutes.
The box is very stable via WiFi and wired Ethernet and is a very good Plex client especially considering the price.
I'll write another post when I get around to ripping my DVD collection to hard disk.
I bought some 7" dual portable NextBase car DVD players to entertain the kids in the car, which are also capable of playing video from a USB stick. However, they are very fussy about the video formats and containers they will play.
From recollection, the manual doesn't mention which formats are supported, so I spent a while searching and experimenting to find a compatible format. I wasn't having much luck until I stumbled on the details tucked away on the NextBase website!
Basically you need to use DivX format.
They suggest using the official DivX converter with the following settings:
- Use the “DivX Home Theatre” option.
- Try to keep to the 720 x 408 dimensions, as this is a 16:9 ratio (or 1.77 ratio) so will fill the screen.
- With the default settings applied, the video bitrate can be left alone, but if you want to adjust it, aim for around 1500 Kbps.
- A frame rate of 25 frames per second will normally result; there is no need to adjust this.
You can also record multiple DivX movies on to disc. Both CD-R (720MB) and DVD (4.7GB) discs are compatible, although some DVD players may require the file extension to be changed to ".avi" in order to play correctly.
I'm a fan of Powerline Ethernet adapters, I've had some Devolo AV 200 adapters running happily for a number of years connecting my desktop PC to the broadband router.
Since moving house and upgrading to Fibre broadband I've had no end of issues with my broadband. The broadband signal kept dropping off and requiring a router reboot.
I originally thought the issue was down to my broadband provider, but I eventually pin-pointed the issue down to the powerline adapters. I could make the broadband connection fail instantly just by trying to download something via the Devolo adapters.
They seem to interfere with my broadband signal. This could possibly be due to the internal wiring in my house: the master socket has been moved, and the extension probably runs near a mains cable. It turns out that VDSL2 operates at up to 17 MHz and HomePlug AV2 operates at 2-80 MHz, so the frequency ranges overlap.
To prevent my broadband connection from dropping off when using the powerline adapters I had to place them several metres away from the fibre broadband router, which partly defeats the object of not having to place Ethernet wires everywhere.
I even tried upgrading my powerline adapters, but the new ones made my broadband connection drop off even when I placed them several metres from my fibre router. The new powerline adapters use the HomePlug AV2 specification, which uses both the neutral and earth cables of the household wiring.
After many hours troubleshooting, reading this post and sending back the new powerline adapters I have now wired my desktop PC to my router with cat5e. The result is a much better internet experience and higher throughput.