Richard Thomason: TechEd Day 3

8 Creating custom LINQ providers - Bart de Smet

This session was a code-heavy explanation of how to go about writing a LINQ provider. The key elements were how to parse the expression tree (by invoking the compiler on it) and the relatively straightforward details of how to recognise constants and operators and translate them into the target language. The next stage is to get hold of some simple samples.
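The core translation idea is language-independent, so here is a minimal sketch of it in JavaScript rather than C#: walk a small expression tree, recognise constants, members and operators, and emit text in the target language. The node shapes and names below are invented for illustration; they are not the .NET expression tree API.

```javascript
// Hypothetical sketch: translate a tiny expression tree into SQL-like text.
// Node shapes ({type, op, left, right, ...}) are invented for illustration.
function translate(node) {
  switch (node.type) {
    case 'constant':
      // Quote string constants for the target language
      return typeof node.value === 'string' ? "'" + node.value + "'" : String(node.value);
    case 'member':
      return node.name; // e.g. a column reference
    case 'binary':
      // Recognise the operator and recurse into both operands
      return '(' + translate(node.left) + ' ' + node.op + ' ' + translate(node.right) + ')';
    default:
      throw new Error('Unsupported node type: ' + node.type);
  }
}

// Example tree for: Age >= 18 AND Name = 'Bart'
var tree = {
  type: 'binary', op: 'AND',
  left: { type: 'binary', op: '>=',
          left: { type: 'member', name: 'Age' },
          right: { type: 'constant', value: 18 } },
  right: { type: 'binary', op: '=',
           left: { type: 'member', name: 'Name' },
           right: { type: 'constant', value: 'Bart' } }
};
```

A real provider does the same walk over compiler-built expression trees, but the recursive recognise-and-emit shape is the part the session emphasised.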

It looks fairly straightforward to write a LINQ provider for Equinox. It is also tempting to think about implementing a LINQ subset in Equinox as an alternative to subtable blocks and select statements as all the building blocks are readily available. Allowing the implementation of custom providers is probably a step too far however...

9 REST: The lightweight alternative to SOAP - Andy Wigley

I've come across REST before and wanted a refresher. This particular talk was using it with Windows Mobile, so I thought I'd see what the professionals get up to, as I have had bad experiences with it as an end user.

Not one single demo worked.

Poor guy. Fortunately the REST part was all slides, so that presentation went fine, and he showed LINQ to XML, which makes parsing the result trivial. Briefly, REST is becoming increasingly popular as a protocol because it is more lightweight than SOAP, using URIs rather than RPCs to access named data items, and standard HTTP verbs for the main read/create/update/delete operations. Many sites offer REST, including Facebook, Flickr, Yahoo and Amazon.
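The URI-plus-verb idea can be sketched in a few lines. The URIs and helper below are invented for illustration; a real client would then issue the request over HTTP.

```javascript
// Sketch of the REST idea: named resources addressed by URI, with the
// standard HTTP verbs covering create/read/update/delete.
// The base URI and helper are invented for illustration.
var BASE = 'http://example.com/customers';

function restRequest(operation, id) {
  var verbs = { create: 'POST', read: 'GET', update: 'PUT', 'delete': 'DELETE' };
  var uri = id === undefined ? BASE : BASE + '/' + id; // resource, not RPC
  return { method: verbs[operation], uri: uri };
}
```

So reading customer 42 is simply a GET to .../customers/42, with no method name or envelope in sight, which is precisely the lightweight appeal over SOAP.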

10 Building cloud based enterprise grade applications - Gianpaolo Carraro

A standing room only session.

This talk assumed familiarity with Azure, which is basically Microsoft Cloud Services; instead of implementing your applications internally, you host them "in the cloud" on virtual machines somewhere on the internet, and take advantage of near-infinite (they say) scalability and pay-as-you-go pricing models.

Cloud-based apps such as Facebook and Flickr are popular with end users but haven't yet made it in the corporate environment. GP introduced VeryBigCorp as his target organisation, and imagined that the marketing department rolls out a 20-user cloud app, then wants to deploy it company-wide. They require single sign-on and integration with mainframe apps and other internal systems; for example, redirection of help desk tickets from the internal system to the cloud service provider.

Firstly, Kerberos etc. doesn't work in the cloud, so what can replace it? Secondly, there need to be different levels of sign-on so that administrators and users (for example) can operate differently. You can use your usual development tools to implement this, but the Azure Services Platform is also available, which contains Service Bus, Access Control, Windows Azure and SDS (SQL Data Services).

Security integration is normally done with a simple username and password; a better solution, however, is claims-based access control. The first stage is authentication, which can use a variety of different means of identification; then there is a mapping from identity to roles, called a claim transformer; and finally there are resources protected by claims. He demoed adding a new employee to applications and groups in AD, then used the Azure Access Control Service to achieve the same things.
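The three-stage pipeline is easier to see in code than in prose. This is a minimal sketch of the shape of it, not the Access Control Service API; all the names, the directory and the role rules are invented for illustration.

```javascript
// Sketch of the claims-based pipeline: authenticate, then transform the
// identity into role claims, then check claims against a resource.
// Directory, role map and rules are invented stand-ins.
function authenticate(credentials, directory) {
  // Stand-in for any identification mechanism (password, token, cert...)
  var user = directory[credentials.username];
  return user && user.password === credentials.password ? credentials.username : null;
}

function claimTransformer(identity, roleMap) {
  // Map the authenticated identity to a set of role claims
  return roleMap[identity] || [];
}

function authorize(claims, requiredClaim) {
  // The resource only ever looks at claims, never at raw credentials
  return claims.indexOf(requiredClaim) !== -1;
}
```

The useful property is the middle stage: swap the claim transformer and the same resources work against AD, a cloud identity provider, or anything else that can authenticate a user.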

Next, how to monitor your cloud application. Amazon Web Services' Health Dashboard is a good example. It needs to provide all the management tools you would expect for an internal application: backup, start, stop, troubleshoot, monitor etc. Cloud apps should be the same as internal ones. GP demoed a cloud app being monitored in the Management Console, just like a normal internal application. The emphasis is on local management of remote services.

Remote applications are much more "black box" than internal ones. You can't back up SQL Server databases or fiddle with IIS settings as you would internally. Furthermore, large organisations need process integration. The Azure Service Bus provides a way for external services to access internal endpoints, allowing information to flow back into the organisation. It uses WCF to reflect existing internal endpoints to the Internet. He impressively demoed sending issues to and from a green-screen AS400 application.

Presumably it's all a mega set of web services that you have to implement to provide this functionality; hopefully this will be covered in the cloud service implementation talk I'm going to later today. SDS didn't come up, but is basically an ADO client with Internet-facing SOAP and REST interfaces.

11 Developing high performance JavaScript and AJAX for Internet Explorer 8 - John Hrvatin

It was the "high performance" aspect of this that I'm interested in, rather than IE8, so I was hoping some of the material would apply generally. This proved to be the case, and John went through a number of performance issues with JS, some new, some old, as well as pointing out improvements made in IE8.

Optimise symbol table resolution. Accessing the DOM is expensive, so avoid it where possible and cache results otherwise. JS and the DOM have separate memory management in versions before IE8, which can lead to circular references and leaks. Do not access DOM arrays directly; instead, copy them locally first. Use DOM methods where possible (e.g. getElementById) rather than generic JS traversal of the DOM.
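The "copy DOM arrays locally" advice looks something like this. Live DOM collections (e.g. the result of getElementsByTagName) are re-evaluated on each access, so copying into a plain array first avoids repeated DOM hits; since there's no browser here, the input array below is just a stand-in for a real NodeList.

```javascript
// Copy a live DOM collection into a plain local array once, then loop
// over the copy. The argument is a stand-in for a real NodeList.
function copyToLocal(collection) {
  var local = [];
  for (var i = 0, n = collection.length; i < n; i++) {
    local[i] = collection[i]; // one DOM hit per element, once only
  }
  return local; // iterate this instead of the live collection
}
```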

Use var to declare local variables, which avoids the interpreter searching the whole scope chain before resolving them. Use local variables to cache writes to DOM items and do one write at the end. Use local variables to cache DOM function pointers. Closely examine loops, as that is where the inefficiencies will be 90% of the time.
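Caching writes locally and committing once can be sketched as below. The element object is a plain stand-in for a real DOM node; with a live node, each innerHTML assignment inside the loop could trigger expensive layout work.

```javascript
// Build the markup in a local variable and do a single DOM write at
// the end. The element argument stands in for a real DOM node.
function renderList(items, element) {
  var html = ''; // local cache for the markup
  for (var i = 0; i < items.length; i++) {
    html += '<li>' + items[i] + '</li>';
  }
  element.innerHTML = html; // one DOM write, not one per item
}
```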

String concatenation is very expensive in nearly all browsers (though not IE8). Instead, use the built-in array functions push() and join().
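The push()/join() pattern for building a large string looks like this; the list-building example is my own illustration.

```javascript
// Accumulate fragments in an array, then join once at the end.
function buildWithJoin(parts) {
  var buffer = [];
  for (var i = 0; i < parts.length; i++) {
    buffer.push('<li>' + parts[i] + '</li>');
  }
  // One allocation for the final string, rather than one per += step
  return buffer.join('');
}
```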

The eval() function is very powerful but very expensive, as it creates a new execution context and parser. For JSON especially, use a third-party JSON library; these typically use regular expressions and so on to do the parsing. IE8 has a new JSON object which exposes the useful methods parse() and stringify(). See also json_parser.js at
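IE8's JSON object matches what later became standard ECMAScript, so the usage sketch below runs in any modern engine; the sample data is invented.

```javascript
// Native JSON methods instead of eval() for untrusted input.
var text = '{"name": "Bart", "sessions": [8, 9]}';
var data = JSON.parse(text);          // safe structured parse, no eval
var roundTrip = JSON.stringify(data); // back to a string
```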

The switch statement is costly for large sets, as it resolves to a sequence of if statements. Consider using a hash table and try/catch instead where appropriate.
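The hash table replacement looks something like this; the commands and handlers are invented for illustration.

```javascript
// Replace a large switch with an object lookup plus a guarded fallback.
var handlers = {
  start: function () { return 'starting'; },
  stop:  function () { return 'stopping'; }
};

function dispatch(command) {
  var handler = handlers[command];
  if (!handler) { throw new Error('Unknown command: ' + command); }
  return handler(); // constant-time lookup instead of sequential ifs
}
```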

HTTP improvements. Put JS in one file at the end of the page. The browser stops rendering when it sees JS and immediately downloads it in case there are embedded document.writes etc. With one file at the end you maximise browser caching and get the page rendered before downloading the script. Similarly, put CSS in one file at the beginning, so the browser has everything it needs to render the page as it loads.

Use HTTP compression, especially for large amounts of text (gzip, Accept-Encoding, Content-Encoding). Use conditional loading where possible via If-Modified-Since/Expires/Max-Age.

MS have a number of tools for script debugging, including some sophisticated debugging tools available at all times in IE8. Press F12 or use the Tools menu. Also check out Fiddler, a network/bandwidth tool at

12 Developing and deploying cloud services - David Aiken

This was a code-heavy session: David demonstrated developing and debugging a cloud application locally, then set up a cloud site, deployed it to a test environment online, and finally moved it to its live environment. While there was a lot of legwork to achieve the desired result and the deployment was achingly slow, no doubt this will improve as the technology matures.

There was a lively discussion on the bus afterwards about the merits of the cloud concept, who exactly would be tempted to use it in anger, and whether they would depend on its security and reliability. The demo used the blob store for persistence, and there was no mention of SQL Server; however, a little further investigation indicates that services may well be the way it goes.