Wednesday 13 November 2013

Static Content in IIS 7.5

It's amazing how almost every piece of advice you read about something can be completely, utterly, totally useless. I think it's an endemic problem of developers who only spend time clicking the shiny buttons in the GUI of products without ever actually getting into the guts of a thing to see what is happening.

Anyway, a designer was trying to add .SVG, .ICO and .WOFF files to our website. They worked fine locally, but on our UAT server those files were coming back as 404 errors.

After a little head scratching, I found out that IIS will not serve static files whose extensions lack a registered MIME type, and these extensions were not registered on that server. When looking around for advice on how to switch serving back on for them, I read so many articles where clearly the author never bothered to go near the web.config file.

And of the advice that did cover the web.config file, most of the received wisdom goes along the lines of:

Add these lines to the web.config:

    <system.webServer>
        ...
        <staticContent>
            <mimeMap fileExtension=".woff" mimeType="application/x-font-woff" />
            <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
            <mimeMap fileExtension=".ico" mimeType="image/x-icon" />
        </staticContent>
    </system.webServer>


Which is great in theory, but this actually results in massive numbers of errors on the site as it suddenly fails to load ALL STATIC CONTENT. So no CSS, JS, images, anything, plus a load of odd redirections for other types.

I figured that having these rules in a separate web.config file in our static resources folder might help, so I tried that. Although the site was now generally working again, there were mysterious 500 errors for the .SVG etc. requests.

After a little digging, I found this post on Stack Overflow:

http://stackoverflow.com/questions/13677458/asp-net-mvc-iis-7-5-500-internal-server-error-for-static-content-only/13677506#13677506

This saved me a complete metric fuck-tonne of time.

What all the advice I had read was missing is that you should remove each mapping in staticContent before you add it, or you will get a 500 error that doesn't appear in any log unless you enable the really paranoid tracing in IIS.

So, the proper version of what the file looks like is this, remembering that this is a partial web.config sitting in our static resources folder:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <system.webServer>
        <staticContent>
            <remove fileExtension=".woff" />
            <mimeMap fileExtension=".woff" mimeType="application/x-font-woff" />
            <remove fileExtension=".svg" />
            <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
            <remove fileExtension=".ico" />
            <mimeMap fileExtension=".ico" mimeType="image/x-icon" />
        </staticContent>
    </system.webServer>
</configuration>


Antaris RazorEngine and Mono-3.2.3

After some machinations, I managed to get Antaris RazorEngine working with Mono-3.2.3.

After making and installing Mono from a tarball (http://download.mono-project.com/sources/mono/), I took the latest version of RazorEngine down from GitHub (https://github.com/Antaris/RazorEngine) and compiled it using Mono's xbuild tool. I then took the RazorEngine.dll and System.Web.Razor.dll built in the RazorEngine.Core folder and used those within my applications. I removed all references to System.Web.Helpers, System.Web.Mvc, System.Web.WebPages, WebMatrix, etc., as those would have conflicted.

And RazorEngine is now really horribly strict :( It throws compile exceptions on warnings which is wholly irritating. I'll have to revisit and police every single template and make sure they are compatible.

I also noticed that the new version of xsp4 required me to be in the root folder of the website even though I was setting the root on the command line.

Anyway, glad to see it working again, even if it will be a chore to get it fully compatible. I guess it is balanced out by now being on .Net 4.5 compatible mono.

Update: I've made an adjustment to the RazorEngine code which stops it treating warnings as errors during template compilation.

Modify RazorEngine.Core/Compilation/DirectCompilerServiceBase.cs, Compile method thusly:

            var @params = new CompilerParameters
            {
                GenerateInMemory = true,
                GenerateExecutable = false,
                IncludeDebugInformation = false,
                CompilerOptions = "/target:library /optimize",
                // The two additions: stop warnings being promoted to
                // errors, and silence them entirely.
                TreatWarningsAsErrors = false,
                WarningLevel = 0
            };

Although the default for TreatWarningsAsErrors is false in Mono, I think the default for WarningLevel is -1, which doesn't really mean anything. So this might actually be a Mono porting issue.

Sunday 22 September 2013

Bookmarks for 20130922

Ello. Not done one of these in a while. Here's the shit.

Incredible illustrations, in a very old style. http://danielmartindiaz.com

The one that caught my eye was "Binary Predator", which can be found in an animated form in Warren Ellis's introduction to his short story "Lich House" - http://www.youtube.com/watch?v=RYScwq3EOD0



You should check out TRSST - a secure micro blogging system that recently hit its funding target on Kickstarter.

http://www.trsst.com

This is both terrifying and gorgeous. Poet CJ Allen - "Explaining the Plot of Blade Runner to my Mother who has Alzheimer's".

http://www.forwardartsfoundation.org/poetry/explaining-the-plot-of-blade-runner-to-my-mother-who-has-alzheimers/

Wednesday 14 August 2013

Antaris RazorEngine, Site Layouts

I have been using Antaris RazorEngine (v3) both at work and in the Asura framework used by Xizi. It is a light implementation of Razor that isn't strongly tied to MVC.

That said, documentation has been pretty thin on the ground, especially around what I've been trying to do today: using the Layout functionality from within a file that is effectively in .cshtml format.

So, the problem I am trying to address is doing ASP.Net MasterPage-like things but with isolated Razor templates using RazorEngine.

Certainly there are bits of documentation about how to use Layouts when creating and running templates from code, but I couldn't find any complete examples of how to do it with template files.


The Layout file - mylayout.cshtml


<!DOCTYPE html>
<html>
@RenderSection("Head")
@RenderSection("Body")
</html>
This is a very basic layout comprised of the beginning and end of the markup and two section placeholders for "Head" and "Body".

That's all quite simple, but the problem is that if you are using RazorEngine in the raw way that I am, a template that tries to resolve that layout will fail: the layout is not in the template cache, and I do not specify a resolver delegate that could find it.

Instead, the layout file must be compiled ahead of time. This can be performed by the following which, given an existing pathname for a .cshtml, will compile it into the template cache:


string viewPath = @"C:\code\razor\mylayout.cshtml";
string layoutName = "mylayout";
if (File.Exists(viewPath))
{
    // Resolve by the cache name used at compile time,
    // not by the file path.
    ITemplate template = Razor.Resolve(layoutName);
    if (template == null)
    {
        string templateContents = File.ReadAllText(viewPath);
        Razor.Compile(
            templateContents,
            typeof(IDictionary<string, object>),
            layoutName);
    }
    else
    {
        // already in cache
    }
}
Here, viewPath is the physical path of the mylayout.cshtml file and layoutName is the simple name that this layout will be known by in the template cache. Also, there should be a critical section around the Razor.Resolve and Razor.Compile operations, since the check and the compile are not atomic.

The Razor.Compile operation compiles the contents of the mylayout.cshtml file against a Model object of type IDictionary<string, object>. This type should be repeated in the template file, as shown below.

Once the layout has been cached, any template file can use the layout.


The Template file - layout_test.cshtml

Here we go, then. This is the contents of my template, which is the one I want to use to fill the layout's section placeholders:
@inherits RazorEngine.Templating.TemplateBase<IDictionary<string, object>>
@{   
    this.Layout = @"mylayout";
}

@section Head
{
<head>
    <title>@Model["facebookAppID"].ToString()</title>
</head>
}

@section Body
{
<body>
my body
</body>
}
Firstly, the @inherits statement uses the generic version of RazorEngine's TemplateBase class to define the type that the @Model object will be interpreted as. This type should match whatever you are passing into the RazorEngine.Razor.Compile method when you compile the template. This should also match the type passed into the layout's compile operation.

Next, the setting of this.Layout. This is the name by which the layout's template is known within the RazorEngine cache, so it must match what was used to compile that template. Setting this string activates RazorEngine's ability to process this template while considering the contents of the layout.

After that are two @section definitions. One of them uses the @Model object, treated as an IDictionary, just as an example to prove that the sections have access to @Model.

Note: You can also access @Model from the Layout template, without having to use the @inherits statement.

The template is compiled in much the same way, although in the Asura framework, templates are compiled on application startup and recompiled whenever a template file's last write time is newer than the one most recently recorded.
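For completeness, here is a sketch of compiling the template file and rendering it against the layout, using the same static Razor API as the layout compilation earlier. The paths, names and model values are illustrative, usings are assumed, and the exact Razor.Run overload may vary between RazorEngine v3 builds:

```csharp
// Assumes mylayout.cshtml has already been compiled into the
// cache under the name "mylayout", as described above.
string templatePath = @"C:\code\razor\layout_test.cshtml";
string templateName = "layout_test";

string templateContents = File.ReadAllText(templatePath);
Razor.Compile(
    templateContents,
    typeof(IDictionary<string, object>),
    templateName);

// The model type matches the one declared by @inherits
// in layout_test.cshtml.
var model = new Dictionary<string, object>
{
    { "facebookAppID", "1234567890" }
};

// Because the template sets this.Layout = "mylayout", rendering it
// pulls the layout from the cache and fills in the sections.
string html = Razor.Run(templateName, model);
```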


Conclusion

So, long story short: in order to get the Layout functionality working from a .cshtml-style use of RazorEngine, you have to pre-compile any template you will use as a Layout before it is accessed, because RazorEngine will not do that work for you.

Hope that helps some people.

Monday 29 July 2013

Bookmarks for 20130729

Well this just pulled me out of a terrible hole ... the dreaded "case folding error" in Mercurial. The following advice worked:
http://mercurial.selenic.com/wiki/FixingCaseCollisions

And this seemed useful...

Failing over from CDN to local storage:
http://www.hanselman.com/blog/CDNsFailButYourScriptsDontHaveToFallbackFromCDNToLocalJQuery.aspx

Twitter Bootstrap has a release candidate for version 3. Not sure how I feel about this yet. On the one hand the new stuff looks amazing, on the other, I'm not sure I like the new grid classes. But, worth a look and it is admirable that they are going for "mobile-first", so rolling the responsive stuff into the main build as default:
http://twitter.github.io/bootstrap


Friday 14 June 2013

Bookmarks for 20130614

"No more miracles, loaves and fishes.
Been so busy with the washing of the dishes."

I'm back. Here's some stuff.

Some useful stuff on the blueimp jQuery.UI FileUpload control:

https://github.com/blueimp/jQuery-File-Upload/wiki/How-to-submit-additional-Form-Data

Twitter Bootstrap is just the best bloody thing ever. It feels like cheating when using it to produce what turn out to be really gorgeous, minimalistic, clean web designs.

https://github.com/twitter/bootstrap

Whole bunch of stuff I should have read BEFORE I started writing Xizi:

http://docs.mongodb.org/manual/core/data-modeling/

IPython Notebook looks like a lot of fun, bit like a live, interactive version of One Note maybe?

http://ipython.org/notebook.html


Friday 24 May 2013

Bookmarks for 20130524

Not dead, merely dreaming.

Some useful stuff from today, which allowed me to make a monitor page showing me all the job processors across 12 different platforms. CSS to scale an iframe:

http://stackoverflow.com/questions/4660659/javascript-css-set-firefox-zoom-level-of-iframe

(bearing in mind that the width and height on the iframe will be scaled as well so it must be multiplied accordingly)


A really bloody simple and non-fussy lightbox for Javascript/JQuery - EasyBox:

https://code.google.com/p/easybox/


A great article from Fleet Street Fox on the Woolwich murder and the EDL. Yes, I know it's on The Mirror's site, but it's truly a good article:

http://www.mirror.co.uk/news/uk-news/edl-dont-defend-england-says-1908599


Friday 12 April 2013

Bookmarks for 20130412

Interesting metrics UI stuff
http://ducksboard.com/tour/

At least partly atmosphere-based sci-fi space-fighter game. SHUT UP AND TAKE MY MONEY.
http://www.strikevector.net/


Tuesday 2 April 2013

Bookmarks for 20130401 ish

This gives the phrase "thousand yard stare" a whole new spin. Ridiculous, Linux-assisted-aim rifles.

http://arstechnica.com/gadgets/2013/03/bullseye-from-1000-yards-shooting-the-17000-linux-powered-rifle

If This Then That, a web "channel" automation service. This looks awesome and I'll be totally annoying you all with it soon.

http://ifttt.com

Thursday 28 March 2013

Bookmarks for 20130328

Hogeweyk, a "hyperrealistic" care-village in the Netherlands, offering an environment for dementia patients that closely resembles normal life, but is a carefully maintained illusion - front-stage / back-stage. This is a page about the architecture of the place.

http://www.detail-online.com/architecture/news/dementia-village-de-hogeweyk-in-weesp-019624.html

Short film version of Tim Maughan's excellent "Paintwork" story (augmented reality, graffiti)

http://boingboing.net/2013/03/27/short-sf-film-about-future-aug.html


Thursday 21 March 2013

Bookmarks for 20130321

Instant Healing Gel.  This is so awesome I can't even.

http://www.humansinvent.com/#!/11409/the-gel-that-stops-bleeding-instantly/

Behold, the travesty that is the F-35; feast your eyes on this AlterNet article:

http://www.alternet.org/fail-400-billion-military-jet-cant-fly-cloudy-weather

Also, I had no idea AlterNet were still going. Yay!

Epic snow crystals under electron microscope:

http://twistedsifter.com/2013/03/microscopic-images-of-snow-crystals/


Bookmarks for 20130320 (ish)

After a lot of messing about tonight, finally got some sense out of XmlReader after reading the following codeproject article:

http://www.codeproject.com/Articles/318876/Using-the-XmlReader-class-with-Csharp

Tuesday 19 March 2013

Bookmarks for 20130319


Fantastic article about Sean Smith, aka Vile Rat, Eve Online's greatest diplomat, killed in action in the attack on the US Embassy in Benghazi on September 11th 2012. It explains a little bit about the game and, in some part, about the community that I love. Reading it brought back the initial shock at hearing about the attack and then the ripples it sent through the game's players. Memento mori; Vita brevis.

http://www.playboy.com/playground/view/vile-rat-virtual-world-of-eve-online



And a first look at new Mozilla Firefox dev tools:

http://boingboing.net/2013/03/19/mozilla-foundation-unveils-dev.html


Monday 18 March 2013

Friday 15 March 2013

Bookmarks for 20130315

"Huginn is a system for building agents that perform automated tasks for you online. They can read the web, watch for events, and take actions on your behalf. Huginn's Agents create and consume events, propagating events along a directed event flow graph. Think of it as Yahoo! Pipes plus IFTTT on your own server. You always know who has your data. You do."
 https://github.com/cantino/huginn/

The Lunar Orbiter Image Recovery Project is seeking funds to continue saving the original imagery. Last few days for this!

http://www.rockethub.com/projects/14882-lunar-orbiter-image-recovery-project


Thursday 14 March 2013

Bookmarks for 20130314

"@sup3rmark: Two weeks of no pope: baby cured of HIV, breath test for cancer, salt water found on moon of Jupiter. Day one with pope: Google Reader dies."

That.

De facto list of Google Reader alternatives:

http://gizmodo.com/5990540/8-google-reader-alternatives-that-will-ease-your-rss-pain

Google, you f***ing suck sometimes; and just so you know, this is one of those times.


Wednesday 13 March 2013

Bookmarks for 20130313

Retinal implant allowing blind people to navigate doors, read words. Amazing stuff. (via @realityisbuggy)

http://news.sky.com/story/1063789/bionic-eye-enables-blind-people-to-see

HTML5 seems to be replete with people making the same old mistakes about website UI.

http://econsultancy.com/uk/blog/62335-14-lousy-web-design-trends-that-are-making-a-comeback-due-to-html5

Tuesday 12 March 2013

Mainstream

Metro story today:

"How David Bowie infiltrated our minds with his viral ad campaign"

Did he now?

There seem to be a lot of people like myself who, for whatever reason, have simply opted out of being exposed to mainstream media. This could in some cases be described as a coping mechanism - the individual knows that being immersed in mainstream media is akin to drowning oneself in the constant push of stories, celebrities and happenings that vie for attention from resources which that person may think better directed elsewhere. So the act of avoidance becomes a kind of self-defence against distraction.

I don't have a TV. In truth, I haven't had a TV for many years. Oh, I have certainly watched TV; I think I spent most of my teenage years doing that (when I wasn't playing bass, smoking weed or failing at maths) (those items are vaguely related, by the way). Teenage Fractos can be found on the sofa, lying on one side, the remote held in an outstretched, balanced grip; watching every detail of a programme before discarding it and flicking channel - a remarkably cyclic act in pre-Cable-TV days. At some point, after years of this behaviour, I was distracted by other things and that was the end of that.

The principal reason I avoid TV now is simply the wall-to-wall advertising. That said, I don't think TV has been the same since Horizon went shit, although that was somewhere in the mid-1990s. But there is something else, too, and I think it has to do with not connecting with the attitudes and personalities that present news and media. Programmes have personalities; channels have personalities, and it is my belief that I have found them to be more compatible with my own personality in the past, but no longer.

Now, we can choose our news, we can choose our streams of information. For example, I may not have a TV but I will catch up on Twitter every chance that I get. Within that medium are things of my own choice: channels and conversations that I have either expressed an interest in, in the people behind those words, or in particular flavours of news reporting.

Often I will learn from Twitter of news items before they ever hit the TV or news websites, and indeed some will never actually appear elsewhere. That is important because there is a sense of beating the media at its own game. Media dictates its own presentation pace and content; All will be revealed... right after this commercial, provided the board and the editorial team lets it through! I don't like that and I definitely do not want that. News should not be squandered and we are not all huddled around the radio any more.

When important things happen, news travels faster through some mediums than others. Even if I have turned away from the mainstream, I still feel superluminal.

Bookmarks for 20130312


If you haven't read/heard this, then you really should. Bradley Manning's court statement leaked. Sounds to me like they didn't actually break him. \o/

http://www.guardian.co.uk/commentisfree/2013/mar/12/bradley-manning-tapes-own-words

Another Guardian link, this is Cory Doctorow's piece on why Tim Berners-Lee is wrong about practically endorsing DRM in HTML5 and that the W3C should send packing all those who require bringing DRM to the browser.

http://www.guardian.co.uk/technology/blog/2013/mar/12/tim-berners-lee-drm-cory-doctorow?CMP=twt_fd


Monday 11 March 2013

Saturday 9 March 2013

Spate

http://www.rockpapershotgun.com/2013/03/09/mario-and-dear-esther-walk-into-an-absinthe-bar-spate/

Now I have played Dear Esther, which is less of a game and more of one of those dreams where you're a ghost.

But this looks... mental.

Bookmarks for 20130309

This was pretty cool. Coder breaks down the technique behind Amazon's "instant" sub-menu presentation.

http://bjk5.com/post/44698559168/breaking-down-amazons-mega-dropdown


Friday 8 March 2013

Bookmarks for 20130308

.Net object-oriented question that should be on interview tests

http://stackoverflow.com/questions/1508350/why-does-this-polymorphic-c-sharp-code-print-what-it-does

Amazing video showing retired lab chimps stepping outside for the first time into a more natural habitat. I fully cried.

"@slugnads: ...and see a sky without bars. RT @wiredscience: Video: Retired lab chimps step outside for the first time. ... " - Nadia Drake, reporter for @wiredscience

http://www.wired.com/wiredscience/2013/03/lab-chimps-step-outside/?cid=co6271394

Wednesday 6 March 2013

Bookmarks for 20130306

I've been playing this to death. To Death.

How To Destroy Angels - "Welcome oblivion" - live on Soundcloud now.

https://soundcloud.com/howtodestroyangels/sets/welcome-oblivion-2013


Turns out it's (probably) not a Hadley cell. Lost that bet.

http://news.sciencemag.org/sciencenow/2010/04/saturns-strange-hexagon-recreate.html


Some extra bits for that SqlBulkCopy post: the default behaviours of the operation, which mean I have to correct my article a little bit.

http://msdn.microsoft.com/en-gb/library/system.data.sqlclient.sqlbulkcopyoptions.aspx


Migration to Blogger

The process of moving my Dirty Fire Project blog to Blogger has been, frankly, hair-pullingly annoying.

However, I would like to save for posterity how I managed to get it working.

Importing the articles proved impossible as I couldn't give it an RSS feed and let it do the rest, which was a shame. The utility of that would be immense. So, I ended up copy-pasting work in and then stripping the formatting (after realising that it had kept the background colour of the text the first time I did it). It did give me a chance to go through every article and add edits, fix links and so on (desperately trying to find a silver lining here).

The main problem with migration was the use of a custom domain, or rather, persuading Blogger to accept that I had authority for my existing domain.

What it boils down to is that to make it work I have the following zone entries:
    (my verify key)  CNAME  (my verify token).googlehosted.com.
    www              CNAME  ghs.google.com.
Where (my verify key) is the CNAME key and (my verify token) is the CNAME domain they want you to add in the domain authentication settings.

The critical part was the inclusion of the full stops (trailing dots) after the two CNAME values.

Basically, the first page of instructions, the one on the Settings->Basic->Publishing screen, and critically the one you see when you enter information which hasn't worked (giving the infamous Error 12 / Error 13 messages) - has the wrong information. It advises putting the address in without the full stop.

If I was more clear about how to put DNS records together then I probably would have spotted this sooner. As it is, it took until the next morning when I stumbled across a different page - the one that is linked as "settings instructions" and tailors its information based on what registrar your domain uses - before I noticed that it included a trailing full stop in the domain verification value.

Once that was set up, it all went quite well. The "missing files host" facility works really well, so I can point that back at my old server and it picks up any files that are not present on the new www host.

The site looks a lot better. And I don't have to run a really contrived set of operations to add an article any more. Though, in all fairness, I did *write* that contrived set of operations. My bad :)

EDIT:
I've isolated what the confusion is. The set of instructions that appear when you first type in your custom domain, where the operation fails with an Error 12/13 - THOSE INSTRUCTIONS ARE WRONG. Then, the linked page called "settings instructions" claims that you need a full stop after the name of the verify token (which is correct), but doesn't include one after the "ghs.google.com" address - THOSE INSTRUCTIONS ARE THEREFORE WRONG.
The only page with the correct information on is the "webmaster's verification tool" page which you have to click around to find. The CNAME method verification page in that area has the full stops added to the back of both the www / ghs.google.com and verify token addresses.

Tuesday 5 March 2013

Don't fear the Bulk Copy

While searching for a way to optimise parts of a particular system today, I had managed to get a write to SQL Server (2005, non-local) down to about 0.5 seconds for 121 rows. Not great, but I was prepared to believe that it was working hard as it is a fat table with a dodgy looking schema, with the less said the better about the network in-between.

My boss pointed me at this page: http://msdn.microsoft.com/en-us/library/1y8tb169.aspx which has details on the SqlBulkCopy object in the System.Data library.

To be honest, I was a bit cagey about this. My belief has always been that there is a lower limit of rows before the pros of using a bulk copy operation outweigh the cons. Perhaps that is true, but certainly for a mere 121 rows, it is not an issue at all.

Previously, I had implemented this insert operation using a variety of methods:

  • Injection via an XML parameter, processed by use of @xml.nodes style queries. 
  • Dynamically generated SQL, creating a large INSERT statement with multiple rows using SELECT and UNION ALL. 
  • Individual parameterised INSERT statements. 

Of these, the XML method had the worst performance; SQL Server may have tools to navigate and process XML, but they are definitely not quick, and this was to be expected. The XML method is *marginally* quicker than individual insert statements, but only once the number of rows climbs past, say, 50.

The dynamically generated query looked a bit like this:

INSERT INTO [dbo].[tblData] ( [UserId], [CreateDate], [Value] )
SELECT 1001, '2013-03-04 00:14:30.00', 25
UNION ALL SELECT 1023, '2013-03-04 00:14:30.15', 67
UNION ALL SELECT 1038, '2013-03-04 00:14:30.32', 21
Examining the execution plan showed that it spent most of its time doing a clustered index insert; about 98% of its time, to be exact. Although performance improved between runs (eventually down to 21 milliseconds for the writing of 121 rows), there was still room for improvement. Time for 1000 users in a Parallel.ForEach loop of the entire operation (which included this write) was 00:02:30.

It was hoped that using parameterised INSERT statements would allow SQL Server to cache an execution plan and reuse it. In tests, it does perform faster: time for 1000 users, as above, was 00:01:45.
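For reference, the parameterised approach looks something like this. This is a sketch rather than the production code: the table and column names follow the example later in this post, usings are assumed, and _connectionString and MyData are placeholders:

```csharp
// Prepare one parameterised command and reuse it for every row.
using (var conn = new SqlConnection(_connectionString))
using (var cmd = new SqlCommand(
    "INSERT INTO [dbo].[tblData] ([UserId], [CreateDate], [Value]) " +
    "VALUES (@userId, @createDate, @value)", conn))
{
    cmd.Parameters.Add("@userId", SqlDbType.Int);
    cmd.Parameters.Add("@createDate", SqlDbType.DateTime);
    cmd.Parameters.Add("@value", SqlDbType.Int);

    conn.Open();
    foreach (MyData row in myData)
    {
        // The statement text never changes, so SQL Server can reuse
        // the cached execution plan; only the parameter values vary.
        cmd.Parameters["@userId"].Value = row.UserId;
        cmd.Parameters["@createDate"].Value = row.CreateDate;
        cmd.Parameters["@value"].Value = row.Value;
        cmd.ExecuteNonQuery();
    }
}
```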

So... on to an implementation using SqlBulkCopy.

If you are throwing arbitrary data at it then this entails a little bit of set-up. The WriteToServer method accepts a DataTable object, and this must match the destination table's schema exactly for the bulk copy to work.

(Please note that this is an extremely cut down / Noddy version of the table for brevity.)
public override void Put(int userID, List<MyData> myData)
{
    DataTable dt = new DataTable();
    dt.Columns.Add(new DataColumn("DataId", typeof (System.Int32)));
    dt.Columns.Add(new DataColumn("UserId", typeof(System.Int32)));
    dt.Columns.Add(new DataColumn("CreateDate",typeof(System.DateTime)));
    dt.Columns.Add(new DataColumn("Value",typeof(System.Int32)));
After this, you add the data, row by row, into the table. If you have a nullable field, test HasValue and assign either the Value or DBNull.Value.
foreach (MyData row in myData) {
    DataRow dr = dt.NewRow();
    dr["DataId"] = DBNull.Value;
    dr["UserId"] = row.UserId;
    dr["CreateDate"] = row.CreateDate;
    dr["Value"] = row.Value;
    dt.Rows.Add(dr);
}
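As an aside, the HasValue test mentioned above would look something like this if, hypothetically, Value were a nullable int (it is not in this cut-down example):

```csharp
// A DataRow field cannot take a null int?, so fall back to
// DBNull.Value when the nullable has no value.
dr["Value"] = row.Value.HasValue
    ? (object)row.Value.Value
    : DBNull.Value;
```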
Now we set up the SqlBulkCopy object. I've added two option flags for it - KeepNulls and KeepIdentity. KeepNulls so it will honour the DBNull.Value encountered on some fields, and KeepIdentity so that it leaves the destination table in control of the assignment of row identity. I have included the row ID in the DataTable's columns, set it to DBNull.Value in the rows themselves, but I shall now make sure that it is removed from the column mappings by clearing them and re-adding the columns I require.


CORRECTION:
The KeepIdentity flag does NOT do that. This code only works because I do not include the identity column in the ColumnMappings collection.

using (
    SqlBulkCopy bulkCopy = new SqlBulkCopy(
        _connectionString,
        SqlBulkCopyOptions.KeepNulls | SqlBulkCopyOptions.KeepIdentity))
    {

        bulkCopy.DestinationTableName = "dbo.tblData";
        bulkCopy.ColumnMappings.Clear();
        bulkCopy.ColumnMappings.Add("UserId", "UserId");
        bulkCopy.ColumnMappings.Add("CreateDate", "CreateDate");
        bulkCopy.ColumnMappings.Add("Value", "Value");
Then I can perform the write.
        try
        {
            bulkCopy.WriteToServer(dt);
        }
        catch (Exception)
        {
            throw;
        }
    }
}
The performance of the write, at 121 rows, was 0.025 seconds, instead of 0.5 seconds. Time for 1000 users, in the parallel test mentioned above, was 00:00:45.

This technique is *lightning* fast and totally worth doing for much smaller numbers of rows than I originally thought.

Don't fear the Bulk Copy.


EDIT:
It should be noted that the default behaviour of the WriteToServer command is that the following apply:
  • Table constraints will not be enforced
  • Insert triggers will not fire
  • The operation will use Row locks
This behaviour can be tailored using the SqlBulkCopyOptions enumeration, as detailed here: http://msdn.microsoft.com/en-gb/library/system.data.sqlclient.sqlbulkcopyoptions.aspx
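So, as a sketch, opting back in to constraint checking and triggers just means passing different flags to the constructor used earlier (table name and connection string as in the example above):

```csharp
// Override the defaults: enforce table constraints and fire
// insert triggers during the bulk copy.
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
    _connectionString,
    SqlBulkCopyOptions.CheckConstraints |
    SqlBulkCopyOptions.FireTriggers))
{
    bulkCopy.DestinationTableName = "dbo.tblData";
    bulkCopy.WriteToServer(dt);
}
```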


Bookmarks for 20130305

DesignModo's Flat-UI, a free UI kit based on Bootstrap, JQuery:

EDIT: This is now defunct. GitHub got hit with a DMCA takedown on the software as the guy's previous employer deigned it to be their property.

http://designmodo.github.com/Flat-UI/

TIL that SqlBulkCopy is really awesome:

http://msdn.microsoft.com/en-us/library/1y8tb169.aspx

Saturday 2 March 2013

Wine and DNS

Wake up. Blink. Rehydrate. Discover emailed receipt for two domains purchased sometime on Friday night.

Gulp.

I'd better finish this project then!