Richard Thomason: TechEd Day 4

Today is a better day all round. My nose is no longer streaming from the cold I contracted on Monday morning, and because we had to get back on the Metro after the UK developer party, I got a few minutes more sleep!

13 Writing a 3D game in 60 minutes - Dirk Primbs

I figured this session would either be mind-melting or illuminating, but the 60-minute angle was encouraging. The XNA game development environment has been well conceived, and gives a straightforward, object-oriented, managed-code interface to hardware-accelerated DirectX game development for Windows and Xbox 360. Furthermore you don't need expensive development hardware; just download the free XNA Game Studio development kit. There are starter kits to get you going; Dirk used elements of an Asteroids-style interface as the basis for his game, and in about an hour implemented a ship, enemies, bullets, movement, collision detection and a simple background. Not bad.

The technical details are pretty much as you would expect for VS OO development in a 3D graphics environment:

Update method: called repeatedly to handle movement and other actions.
Draw method: also called repeatedly, to render the scene.
The Content class handles resources.
Meshes are collections of points.
Effects are rendering descriptions - lighting, position, shadow.
Shaders are written in a compilable language (HLSL) and sent to the graphics card.
Effects are driven by three matrices:
Projection describes the field of view.
View describes the direction of view.
World describes the object's position in the scene.
Moves are handled by Matrix.CreateTranslation.
Intersections are handled by BoundingSpheres (simple and effective).
Backgrounds are textures.
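To make the pattern concrete, here is a minimal sketch of my own of the update/collision idea. XNA itself is C#; this is a TypeScript transliteration, and the class and method names only mirror XNA's Matrix.CreateTranslation and BoundingSphere.Intersects, so treat it as pseudocode rather than the real API.

```typescript
// Sketch of the XNA update/collision pattern, transliterated into TypeScript.
// XNA itself is C#; these classes only mirror the BoundingSphere/Matrix ideas.

interface Vector3 { x: number; y: number; z: number; }

// Two spheres intersect when the distance between their centres is no
// greater than the sum of their radii - which is why the test is so cheap.
class BoundingSphere {
  constructor(public centre: Vector3, public radius: number) {}

  intersects(other: BoundingSphere): boolean {
    const dx = this.centre.x - other.centre.x;
    const dy = this.centre.y - other.centre.y;
    const dz = this.centre.z - other.centre.z;
    return Math.sqrt(dx * dx + dy * dy + dz * dz) <= this.radius + other.radius;
  }
}

class Ship {
  bounds = new BoundingSphere({ x: 0, y: 0, z: 0 }, 1);

  // Update is called once per frame: apply movement (a translation) here.
  update(dt: number, velocity: Vector3): void {
    this.bounds.centre.x += velocity.x * dt;
    this.bounds.centre.y += velocity.y * dt;
    this.bounds.centre.z += velocity.z * dt;
  }
}

// One frame of the loop: update, then test for collisions before drawing.
const ship = new Ship();
const asteroid = new BoundingSphere({ x: 5, y: 0, z: 0 }, 2);
ship.update(1 / 60, { x: 180, y: 0, z: 0 });      // 3 units along x this frame
console.log(ship.bounds.intersects(asteroid));    // true: distance 2 <= 1 + 2
```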

Green Computing through sharing - Pat Helland

I remembered this chap from last year as the one who annoyingly read out verbatim every single word on every single slide, and I prepared to grit my teeth. This year he had improved - only 90% read out.

His talk was aimed at providing a rationale for cloud computing that went beyond the computer industry, and focussed on power consumption, competition for resources, SLAs and possible solutions. Fortunately MS are opening a big datacenter late this year, so he was able to talk about that and avoid the strong impression that MS are 3-4 years behind Google and other players such as Amazon in this whole area; Google has been doing datacenters in anger since at least 2006, and the impressive Google File System was described in a white paper in 2003: labs.google.com/papers/gfs-sosp2003.pdf.

But enough sour grapes already. PH said that in the US about 2% of total power goes on datacenters, which is more than goes on TV. Somewhat less is consumed in Europe (a Google effect?) but the same trend of increasing consumption is there; how can this be managed and possibly reduced? He described two papers on the usage of common resources - the famous Tragedy of the Commons by Garrett Hardin in 1968, and a development of it by Michael Heller in 1998 called the Tragedy of the Anticommons. Hardin noted that when resources are shared they are always over-used, with detrimental effects on both the resource and its users; his example was shepherds grazing their sheep on common land - no individual shepherd has any incentive to graze less or keep fewer sheep.

Heller observed that when ownership is fine-grained, sharing becomes expensive and difficult - think land ownership and patents. Both problems occur when companies use datacenter and other computing resources. Mainframes were expensive, PCs were unscalable and had no redundancy, virtual machines provide some solutions, and lo! The cloud provides further sharing opportunities. A variety of factors make shared virtual machines efficient; mixing application types on servers makes more efficient use of power and throughput.

PH went on to discuss SLAs, and noted that large numbers of low-utilisation VMs are more efficient than small numbers of high-utilisation ones. Amazon, whose page requests typically contact 150 different services (holy shmoly), have extreme requirements for response times, and specify them as percentiles rather than averages, as this mechanism delivers a far better user experience. Typically they require 99.9% of pages to be rendered in less than 400 ms.
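To see why percentiles beat averages, consider a tiny sketch (my own illustration, not anything Amazon showed): with a long tail, the mean can look healthy while the slowest requests blow the SLA.

```typescript
// Why SLAs use percentiles: a long tail barely moves the mean,
// but it is exactly what the slowest users experience.
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const index = Math.min(sorted.length - 1, Math.ceil((p / 100) * sorted.length) - 1);
  return sorted[index];
}

// Nine fast pages and one slow one: the average hides the outlier.
const responseTimesMs = [120, 130, 125, 140, 135, 128, 122, 138, 131, 2400];
const mean = responseTimesMs.reduce((a, b) => a + b, 0) / responseTimesMs.length;

console.log(`mean:  ${mean.toFixed(0)} ms`);                    // ~357 ms - looks fine
console.log(`p99.9: ${percentile(responseTimesMs, 99.9)} ms`);  // 2400 ms - fails a 400 ms SLA
```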

14 Quick integration from SharePoint to your application - Robert Bogue

As I arrived a little late after lunch, this presenter was demonstrating how to make SP go to an embedded link. Never mind, I thought, I'm sure we'll move rapidly on. In due course, after dealing with parameterised URIs, we moved on to how out-of-the-box features such as single sign-on were unusable, how Web Parts were a lot of work, how the demo was in fact not refreshing the application because the cache settings were incorrectly configured, how the Page Viewer Web Part creates many usability issues, how the Business Data Catalog creates an application definition that is difficult to edit manually unless you use the built-in definition editor, how good the SP search facilities were, and how all the details were on his blog and not in his demo. And at this point I had had enough.

I returned to the workdesk, plugged in and took a look at his blog - Robert Bogue [MVP]: http://www.thorprojects.com/blog/default.aspx. The lead article at the time was SharePoint Governance and Pandora's Box.

I guess I shouldn't say that this particular messenger should definitely be shot, as I might be accused of being uncharitable. If this wasn't a public-facing blog I could be more specific, but instead I propose that some of the energy saved by Pat Helland's efforts earlier should be encapsulated and applied for his benefit.

15 Building advanced Virtual Earth applications - Johannes Kebeck

I was excited about this one - the Google API is a stonker, and as VE is actually jolly good, maybe MS have caught up or even overtaken it. Unfortunately Johannes had connectivity issues at the start of his talk, which made the whole thing a little rushed, but he overcame the problem with a lot of energy, excellent knowledge and demos.

Virtual Earth (VE) has been going since 2001, and started as an investigation project into dealing with larger data sets, which later became the TerraServer project. In a 50th birthday speech, Bill outlined a need to be able to see the earth from his PC, including local weather, 3D images and even live content. Seven years and half a dozen acquisitions later, VE has most of these features in some form or other, including several sources of geodata, its own camera system and 9000 cores for obtaining and automatically processing 3D data, real-time weather, and rich and powerful manipulation APIs, as well as a lot of supporting functionality in SQL Server 2008.

In the near future, when privacy issues are overcome - which basically amounts to processing out people and number plates - there will be street views as well as 3D cityscapes. Trees are added by detecting them with IR, matching the type against a geographic library of trees and "planting" them in the places identified by the camera.

Johannes started with a bang and showed, in Popfly, an RSS feed from Reuters and VE in the same mash-up. He then piped the output from the feed through a geoconverter service, which gave him location information, and displayed the result as pushpins in VE with popups containing the stories - no code required. This got a round of applause.

He then showed the Photosynth viewer at National Geographic, using the Sphinx, followed by a demo showing a floor plan of the MS building in Reading side by side with the building in Virtual Earth. Using a technology called MapCruncher, he showed how to link the two images together so that VE understands how to move when you move in the other window - a matter of a few triangulation points and some scaling. This is easy to achieve because VE supports multiple layers (VETileLayer), which can include layers from VE and layers from your own data server.
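For a flavour of the layering API, here is a minimal sketch, assuming the v6 JavaScript control (written as TypeScript, with ambient declarations standing in for the globals the VE script defines; the div id, layer id and tile URL are all made up):

```typescript
// Minimal sketch: adding your own tile layer over the VE base map.
// The declares approximate globals provided by the VE v6 map control.
declare class VETileSourceSpecification {
  constructor(id: string, tileSource: string);
}
declare class VEMap {
  constructor(divId: string);
  LoadMap(): void;
  AddTileLayer(spec: VETileSourceSpecification, visible?: boolean): void;
}

const map = new VEMap('mapDiv');   // 'mapDiv' is an empty <div> on the page
map.LoadMap();

// Hypothetical MapCruncher-style output served from your own data server:
// one 256x256 PNG per tile, with %4 standing in for the tile's quadkey.
const floorPlan = new VETileSourceSpecification(
  'floorplan',
  'http://example.com/tiles/%4.png'
);
map.AddTileLayer(floorPlan, true);
```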

VE can be integrated via JS with pretty much everything. BP has a fabulously named Hurricane Management System which uses MOSS to figure out where the weather is, which staff will be affected and so on. CRM has obvious requirements for sales staff, customer locations, service engineers etc.
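As a taste of that JS integration, here is a minimal sketch of my own (not from the session): create a map, drop a pushpin, give it popup content. Again the declarations approximate the v6 control's globals, and the coordinates and text are placeholders.

```typescript
// Minimal "integrate VE via JS" sketch with the v6 map control.
// The declares approximate globals defined by the VE script include.
declare class VELatLong { constructor(lat: number, lon: number); }
declare class VEShape {
  constructor(type: number, points: VELatLong);
  SetTitle(html: string): void;
  SetDescription(html: string): void;
}
declare const VEShapeType: { readonly Pushpin: number };
declare class VEMap {
  constructor(divId: string);
  LoadMap(center?: VELatLong, zoom?: number): void;
  AddShape(shape: VEShape): void;
}

const map = new VEMap('mapDiv');
map.LoadMap(new VELatLong(51.15, -0.97), 12);   // placeholder: Alton, Hampshire

// One pushpin whose popup a mash-up would fill from its data feed.
const pin = new VEShape(VEShapeType.Pushpin, new VELatLong(51.15, -0.97));
pin.SetTitle('Story headline');
pin.SetDescription('Story text goes here.');
map.AddShape(pin);
```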

He gave some details of the extensive geodata support in SQL Server 2008, including the geometry and geography data types and advanced tessellating index functionality. He then did a demo showing data points sourced from his local machine in VE, and discussed how to display the right number of points on the map, optimising the work required to show the correct number quickly by choosing whether to evaluate on the client or on the server. For apps with fewer than 1000 points, evaluate them all on the client, as there is not much processing overhead - though use a JS array to pass the points to VE if there are more than a few. For fewer than 1500 points, use a ShapeArray. Beyond that, consider client- or server-side clustering.
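Those thresholds amount to a simple decision rule; here is how I read it (the numbers are Johannes's rules of thumb, the function is my own):

```typescript
// Rough decision rule for rendering N points in VE, following the session's
// rules of thumb. The thresholds come from the talk; the code shape is mine.
type PointStrategy =
  | 'client-evaluate-all'   // < 1000 points: filter and evaluate in the browser
  | 'client-shape-array'    // < 1500 points: hand VE a single ShapeArray
  | 'clustering';           // beyond that: cluster on the client or the server

function chooseStrategy(pointCount: number): PointStrategy {
  if (pointCount < 1000) return 'client-evaluate-all';
  if (pointCount < 1500) return 'client-shape-array';
  return 'clustering';
}

console.log(chooseStrategy(250));    // client-evaluate-all
console.log(chooseStrategy(1200));   // client-shape-array
console.log(chooseStrategy(50000));  // clustering
```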

There are a number of data import tools. A good free one is Morten Nielsen's SQL spatial tools. The best commercial one is Spatial ETL.

16 The Windows Vista Bridge: access Vista from managed code - Kate Gregory

This was a very straightforward discussion of how to call unmanaged code from managed code. In addition, the bridge is a way to call Vista features not available in the CLR.

Originally distributed as a sample in the SDK, Vista Bridge 1.3 is now published on Code Gallery, where it is much easier to find than among the SDK samples.
