random .NET and web development musings

I have lots of calls to jQuery’s focus() method in my sites:

$('input:first').focus();

However, on mobile devices you might not want to call this: focus() will focus the element, but it doesn't open the keyboard. Under certain circumstances you might want to prevent the focus from happening entirely, to give a better user experience.

You can do this with the following code:

$.event.special.focus = {
	trigger: function (e) {
		e.preventDefault();
		return true;
	}
};

You will only want to apply this when your responsive site is being displayed on a mobile device.
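One way to do that (a sketch using a crude user-agent check — the helper name and the UA pattern are my own, adapt them to the devices you actually target) is to only install the override when the browser looks mobile:

```javascript
// Hypothetical helper: crude user-agent sniff for common mobile platforms.
function isMobileUserAgent(ua) {
  return /Android|iPhone|iPad|iPod|Windows Phone|BlackBerry/i.test(ua || '');
}

// Only override focus() triggering when we appear to be on a mobile device.
if (typeof $ !== 'undefined' && isMobileUserAgent(navigator.userAgent)) {
  $.event.special.focus = {
    trigger: function (e) {
      e.preventDefault();
      return true;
    }
  };
}
```

A CSS-media-query check (e.g. via matchMedia) would work just as well if you prefer to key off viewport width rather than user agent.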

Thanks to pushOK on StackOverflow for this jQuery insight.

Check out this blog post:

http://www.pretentiousname.com/timesync/

Essentially you need to create a scheduled task that runs

W32tm.exe /resync

as often as you deem necessary (I have chosen hourly).

Here is an exported scheduled task that you can use:

<?xml version="1.0" encoding="UTF-16"?>
<Task version="1.3" xmlns="http://schemas.microsoft.com/windows/2004/02/mit/task">
  <RegistrationInfo>
    <Date>2013-11-28T09:19:59.2378</Date>
    <Author>OPS-02\Administrator</Author>
  </RegistrationInfo>
  <Triggers>
    <CalendarTrigger>
      <Repetition>
        <Interval>PT1H</Interval>
        <StopAtDurationEnd>false</StopAtDurationEnd>
      </Repetition>
      <StartBoundary>2013-11-28T09:17:40.4914</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
  </Triggers>
  <Principals>
    <Principal id="Author">
      <UserId>S-1-5-19</UserId>
      <RunLevel>HighestAvailable</RunLevel>
    </Principal>
  </Principals>
  <Settings>
    <MultipleInstancesPolicy>IgnoreNew</MultipleInstancesPolicy>
    <DisallowStartIfOnBatteries>false</DisallowStartIfOnBatteries>
    <StopIfGoingOnBatteries>true</StopIfGoingOnBatteries>
    <AllowHardTerminate>true</AllowHardTerminate>
    <StartWhenAvailable>true</StartWhenAvailable>
    <RunOnlyIfNetworkAvailable>true</RunOnlyIfNetworkAvailable>
    <IdleSettings>
      <StopOnIdleEnd>true</StopOnIdleEnd>
      <RestartOnIdle>false</RestartOnIdle>
    </IdleSettings>
    <AllowStartOnDemand>true</AllowStartOnDemand>
    <Enabled>true</Enabled>
    <Hidden>false</Hidden>
    <RunOnlyIfIdle>false</RunOnlyIfIdle>
    <DisallowStartOnRemoteAppSession>false</DisallowStartOnRemoteAppSession>
    <UseUnifiedSchedulingEngine>false</UseUnifiedSchedulingEngine>
    <WakeToRun>false</WakeToRun>
    <ExecutionTimeLimit>P3D</ExecutionTimeLimit>
    <Priority>7</Priority>
  </Settings>
  <Actions Context="Author">
    <Exec>
      <Command>%windir%\system32\sc.exe</Command>
      <Arguments>start w32time task_started</Arguments>
    </Exec>
    <Exec>
      <Command>%windir%\system32\w32tm.exe</Command>
      <Arguments>/resync</Arguments>
    </Exec>
  </Actions>
</Task>

This shall be a dumping ground that I keep updated with useful resources for optimising web sites.

Tools

Blogs

Videos

Tweeps to Follow

Three simple steps. First, extract the contents of the .pfx (the -nodes flag leaves the private key unencrypted):

openssl pkcs12 -in mycert.pfx -out mycert.txt -nodes

Then, to extract your private key:

openssl rsa -in mycert.txt -text -out mycert.key

And your certificate:

openssl x509 -inform PEM -in mycert.txt -out mycert.cer

Bosh.

The other day something happened that caused the insert phase of the SQL Azure Migration Wizard to abort, leaving me with a bunch of local .dat files and no data on my destination server.

To upload the data, you need to run the following command:

bcp database.dbo.tablename in dbo.tablename.dat -n -U username -P password -S remote.server.address.com -b 200 -h"TABLOCK"

You can play with various values for -b, the batch size. I found 200 worked reasonably although I didn’t investigate too much.

Here is a great post on troubleshooting your AWS ELB.

The point that caught me out for about 10 hours today: if your ELB is configured for multiple Availability Zones, it doesn't matter that your assigned instance list contains no instances in some of those AZs; it will still route traffic to those zones, and that traffic gets lost and results in a 503 (or a 504/324).

So: DON'T assign AZs that don't have any in-service instances running.

You want your site to issue far-future cache expiry values for resources like CSS and JS, to reduce bandwidth usage and page load time.

However, when you release new code, you want everyone to receive it as soon as possible.
But how do you achieve that when they all have cached versions that are cache-valid for a week or more?

Here’s what I do.

Create yourself a class such as this:

public static class Cacher
{
	public static readonly string Value;

	static Cacher()
	{
		Value = DateTime.UtcNow.ToString("yyMMddHHmmssfff");
	}
}

Then, change your script and css tags from:

<link rel="Stylesheet" type="text/css" href="/assets/css/all.css" />

to:

<link rel="Stylesheet" type="text/css" href="/assets/css/<%: Cacher.Value %>/all.css" />

You can then use a mod_rewrite/ISAPI_Rewrite rule to remove the value:

RewriteRule ^assets/css/[^/]+/all.css /assets/css/all.css [L,NC]
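The rule simply collapses any cache-buster segment back to the real file. In plain JavaScript the same mapping looks like this (illustrative only — the real work is done by the rewrite rule; the function name is my own):

```javascript
// Map /assets/css/<anything>/all.css back to /assets/css/all.css,
// mirroring the RewriteRule above (case-insensitive, like the NC flag).
function stripCacheBuster(path) {
  return path.replace(/^\/assets\/css\/[^\/]+\/all\.css$/i, '/assets/css/all.css');
}
```

Every distinct Cacher.Value produces a new URL for browsers and caches, yet all of them resolve to the same physical file on the server.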

The reason you want the value in the path and not in the query string is that some caches refuse to cache content for URIs that include query strings, regardless of the cache-control headers.

Alternatively, you could use the current assembly version as the value. It depends on your use case.

Do you want to pull GA data into your site’s admin area? Here’s how to do it as simply as possible.

  1. Read this: https://developers.google.com/analytics/devguides/reporting/core/v3/reference
  2. Then this: https://developers.google.com/accounts/docs/OAuth2WebServer#refresh
  3. Then this: https://developers.google.com/analytics/devguides/reporting/core/dimsmets

Right, now go here: https://code.google.com/apis/console/ and create yourself an App, and a Client ID for a Web Application.

Next, replace the XXXXXX in the below with the Client ID from above:


https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=XXXXXX&redirect_uri=https%3A%2F%2Flocalhost&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fanalytics.readonly&access_type=offline

Visit this in your browser and click “grant access/allow”.

You’ll end up on:

http://localhost/?code=yyyyyy

“yyyyyy” is your “Code”, keep this.

Next, open up Fiddler and issue this request:

POST https://accounts.google.com/o/oauth2/token HTTP/1.1
Host: accounts.google.com
Content-Type: application/x-www-form-urlencoded
Content-Length: ???

code=YYYYY&client_id=XXXXXX&client_secret=ZZZZZ&redirect_uri=https%3A%2F%2Flocalhost&grant_type=authorization_code

Replace YYYYY with your Code, XXXXXX with your Client ID and ZZZZZ with your client secret.

Bang, get a response like this:

{
  "access_token" : "PPPPPPPPP",
  "token_type" : "Bearer",
  "expires_in" : 3600,
  "refresh_token" : "QQQQQQQQ"
}

Store the access_token and refresh_token, you’ll need these.
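If you would rather script the exchange than use Fiddler, the form-encoded body is trivial to build. A sketch (the function name is my own; the values are the same placeholders as above):

```javascript
// Build the application/x-www-form-urlencoded body for the token exchange.
// All parameter values are placeholders -- substitute your own.
function buildTokenRequestBody(code, clientId, clientSecret) {
  return [
    'code=' + encodeURIComponent(code),
    'client_id=' + encodeURIComponent(clientId),
    'client_secret=' + encodeURIComponent(clientSecret),
    'redirect_uri=' + encodeURIComponent('https://localhost'),
    'grant_type=authorization_code'
  ].join('&');
}
```

POST that body to https://accounts.google.com/o/oauth2/token with Content-Type: application/x-www-form-urlencoded, exactly as in the Fiddler request above.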

Now, make yourself a request!

Replace DDDDDDD with the ID of the profile you want to report on (find that here):

GET https://www.googleapis.com/analytics/v3/data/ga?ids=ga:DDDDDDD&metrics=ga:visits&start-date=2012-06-01&end-date=2012-06-25 HTTP/1.1
Host: www.googleapis.com
Content-Type: application/x-www-form-urlencoded
Content-Length: 0
Authorization: Bearer PPPPPPPP

Then, when your token expires, request a new one like this:

POST https://accounts.google.com/o/oauth2/token HTTP/1.1
Host: accounts.google.com
Content-Type: application/x-www-form-urlencoded
Content-Length: ???

refresh_token=QQQQQQQQ&client_id=XXXXXX&client_secret=ZZZZZ&grant_type=refresh_token

Next, send me beer for writing the only tutorial on the ENTIRE INTERNET that explains this process concisely.

Here are some of the techniques I use to optimise my builds. Not all of them will be appropriate for you, and not all of them work together conceptually.
Regardless, these techniques can considerably reduce your build time.

Parallelism

Get MsBuild using all your cores:

/m /p:BuildInParallel=true

you can set

/m:x

to tell it to use x cores if you like.

Reduce Projects

Say you have a project structure like this:

src/MyProj.Core
src/MyProj.Persistence
src/MyProj.Web
src/MyProj.Api

where Core and Persistence are class libraries and Web and Api are web applications.

Do you really need Core and Persistence to be separate assemblies? Do you ever use one independently of the other? Are you really building a highly modular, reusable solution?
There is a huge overhead in firing up the compilation process for each assembly. Keep this to a minimum with as few assemblies as possible.

You might also have the following test projects:

src/MyProj.Core.Tests
src/MyProj.Persistence.Tests
src/MyProj.Web.Tests
src/MyProj.Api.Tests

Why? You can most likely reduce these to a single assembly and use namespaces to divide the various assembly tests.

Inter-Project Dependencies

The biggest waste of time in my previous build mechanism has been redundant building.

Say you have the following dependency tree:

Web
 L- Core
 L- Persistence
     L- Core
	 
Api
 L- Core
 L- Persistence
     L- Core

If you call MsBuild twice, once for Web and once for Api, you will needlessly build Core and Persistence twice.

There are two ways to avoid this, one simple and one complicated.

Complicated – Manage the dependencies yourself

For me, this is a no-go really. It's more effort than it's worth for simple projects, and it's too unmaintainable for large projects. Essentially it involves building each project directly with MsBuild without the ResolveReferences target, then xcopying the artefacts around to each project and fiddling with reference paths. It gets very messy, very fast.

Simple – Build a single project

Option one: Just build your test assembly.

Continuing the same example from above, your dependency graph would look like this:

Tests
 L- Web
     L- Core
     L- Persistence
         L- Core
 L- Api
     L- Core
     L- Persistence
         L- Core

You can then use something like the following msbuild command:

msbuild src/MyProj.Tests/MyProj.Tests.csproj /t:ReBuild;ResolveReferences;PrepareResources;_CopyWebApplication /v:m /p:OutputPath=../../build /m /p:BuildInParallel=true

Note the _CopyWebApplication target, this “publishes” the web apps.

This will result in the following file system structure being created:

build/
build/_PublishedWebsites/
build/_PublishedWebsites/MyProj.Web
build/_PublishedWebsites/MyProj.Api

All your assemblies will be in build/, as well as a normal “published” version of each site under _PublishedWebsites.

You can then call your test runner on these :)

Option two: Build a custom single project

Perhaps you don’t have a single test project to build, or you only want to build a subset of all your projects. In this case, you can make a custom project file, and just build that!



<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
	<ItemGroup>
		<ProjectReference Include="src\MyProj.Web\MyProj.Web.csproj" />
		<ProjectReference Include="src\MyProj.Api\MyProj.Api.csproj" />
	</ItemGroup>
	<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
	<Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" />
</Project>

This way, each project is only built once, with the artefacts reused for each referencing project :)

One thing I’ve found is that sometimes a compiler error is thrown: CSC : fatal error CS2008: No inputs specified. Some of my projects do this and some don’t, and I’ve not been able to identify the difference that causes it.

Regardless, the solution is to include a .cs file (such as AssemblyInfo.cs) in the above project. This does result in an otherwise unwanted assembly being produced, but you can just ignore it. I’ll update this post if/when I find out more.

ILMerge or worse, aspnet_merge

Update: The below doesn’t work, but it will. Working on a patch for ILRepack that will fix this. Stay tuned.

Do you precompile your views with aspnet_compiler? If you do, you probably want to combine the multitude of App_Web_xxxxxx.dll assemblies that get created, to reduce your app’s startup time and its memory footprint. If you use the aspnet_merge tool that comes with the Windows SDK, you’re gonna have a bad time. Use ILRepack instead. It’s like ILMerge, but written against Mono.Cecil, so it’s uber fast.

Say you have your artefacts in build/MyProj.Web, run this:

ilrepack /targetplatform:v4 /parallel /wildcards /log /verbose /lib:build/MyProj.Web/bin /out:build/MyProj.Web/bin/MyProj.Web.Views.dll build/MyProj.Web/bin/App_Web_*.dll

You can even go one step further and merge the assemblies into your web assembly for a single DLL:

ilrepack /targetplatform:v4 /parallel /wildcards /log /verbose /lib:build/MyProj.Web/bin /out:build/MyProj.Web/bin/MyProj.Web.Views.dll build/MyProj.Web/bin/MyProj.Web.dll build/MyProj.Web/bin/App_Web_*.dll

YUI Compressor

Using the Java YUI Compressor? STOP! Use the YUI Compressor MSBuild task instead; you will reduce the time this takes by several orders of magnitude. The Java compressor only accepts one file at a time, which means a new JVM is started for every file you want to compress, and this is slow.

Conclusion

There you have it, lots of ways you can make your slow build process run like lightning!

I simultaneously work on multiple web projects, and have a back catalogue of scores of sites that could need my attention at any time.

I always run my sites locally in a fully fledged IIS site (rather than using Cassini) which means each site needs its own hostname.

Until recently I had been managing this with my hosts file, simply adding a new line:

127.0.0.1  mysite.dev

for each site. However last week I reached breaking point as my hosts file was about 3 pages long.

Enter Velvet. Velvet adds wildcard support to your hosts file by acting as a simple DNS server that you can run standalone or (preferably) as a Windows service.

I have now reduced three pages of hosts entries to two lines:

127.0.0.1  *.dev
127.0.0.1  *.*.dev

Actually, there are a few others, such as wildcard mappings to my colleagues’ machines:

192.168.0.2  *.jim
192.168.0.2  *.*.jim
192.168.0.3  *.bob
192.168.0.3  *.*.bob
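Conceptually, the wildcard matching a tool like Velvet performs is very simple. An illustrative sketch (not Velvet’s actual code; the function name is my own), where `*` matches exactly one hostname label:

```javascript
// Return true if hostname matches a hosts-style wildcard pattern,
// where '*' stands for exactly one dot-free label (so '*.dev' matches
// 'mysite.dev' but not 'a.b.dev', which needs '*.*.dev').
function wildcardMatch(pattern, hostname) {
  var re = new RegExp('^' + pattern.split('.').map(function (part) {
    return part === '*' ? '[^.]+' : part.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
  }).join('\\.') + '$', 'i');
  return re.test(hostname);
}
```

That one-label-per-asterisk behaviour is why the entries above come in pairs: one for `*.dev` and one for `*.*.dev`.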

This means I now rarely need to touch my hosts file, at least not for standard day-to-day project work :)

Ultra time saving win.

Check out the project on github.

Feature suggestions welcome!