MVC: Building and persisting view models.

On my recent projects there has been growing demand for richer UIs, with pages that go through a number of smaller steps (partial page updates) before the final submission.

To do this you will probably need to persist the collected data between posts. Here is one method of tackling that whilst keeping to the Post-Redirect-Get pattern.

To persist the data, add these session extension methods:

using System.Collections.Generic;
using System.Web;

namespace MappingUIModels.Extensions
{
    /// <summary>Creation, access and deletion of objects stored in session by key and id.</summary>
    public static class SessionExtensions
    {
        /// <summary>Saves the object into a dictionary in session by key and a default dictionary id of object type name.</summary>
        public static T Save<T>(this T model, string key) where T : class
        {
            return model.Save(key, typeof(T).Name);
        }
        /// <summary>Saves the object into a dictionary in session using the key and id.</summary>
        public static T Save<T>(this T model, string key, string id) where T : class
        {
            var dictionary = FindOrCreateDictionary(key);
            if (dictionary.ContainsKey(id))
                dictionary[id] = model;
            else
                dictionary.Add(id, model);
            return model;
        }

        /// <summary>Retrieves the object from the dictionary in session by key and a default dictionary id of the object type name.</summary>
        public static T Find<T>(this T model, string key) where T : class
        {
            return model.Find<T>(key, typeof(T).Name);
        }
        /// <summary>Retrieves the object from the dictionary in session by key and id.</summary>
        public static T Find<T>(this T model, string key, string id) where T : class
        {
            var dictionary = FindOrCreateDictionary(key);

            if (dictionary.ContainsKey(id))
                return dictionary[id] as T;

            return null;
        }
        /// <summary>Flush session by key.</summary>
        public static T Flush<T>(this T model, string key)
        {
            HttpContext.Current.Session[key] = null;
            return model;
        }
        /// <summary>Retrieves or creates the dictionary in session.</summary>
        private static Dictionary<string, object> FindOrCreateDictionary(string key)
        {
            var dictionary = HttpContext.Current.Session[key] as Dictionary<string, object>;
            if (dictionary == null)
            {
                dictionary = new Dictionary<string, object>();
                HttpContext.Current.Session[key] = dictionary;
            }
            return dictionary;
        }
    }
}

You may want to abstract this further so you can inject the persistence implementation and remove the coupling to HttpContext.
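
For example, a minimal sketch of what that could look like (the ISessionStore name and implementation here are illustrative, not from any framework):

using System.Collections.Generic;
using System.Web;

namespace MappingUIModels.Extensions
{
    /// <summary>Abstraction over the persistence mechanism, so it can be injected and faked in tests.</summary>
    public interface ISessionStore
    {
        void Save(string key, string id, object value);
        object Find(string key, string id);
        void Flush(string key);
    }

    /// <summary>Default implementation backed by HttpContext.Current.Session, mirroring the extension methods above.</summary>
    public class HttpSessionStore : ISessionStore
    {
        public void Save(string key, string id, object value)
        {
            FindOrCreateDictionary(key)[id] = value; // the indexer adds or overwrites
        }

        public object Find(string key, string id)
        {
            object value;
            return FindOrCreateDictionary(key).TryGetValue(id, out value) ? value : null;
        }

        public void Flush(string key)
        {
            HttpContext.Current.Session[key] = null;
        }

        private static Dictionary<string, object> FindOrCreateDictionary(string key)
        {
            var dictionary = HttpContext.Current.Session[key] as Dictionary<string, object>;
            if (dictionary == null)
            {
                dictionary = new Dictionary<string, object>();
                HttpContext.Current.Session[key] = dictionary;
            }
            return dictionary;
        }
    }
}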

And here is a controller that uses it, sticking to the Post-Redirect-Get pattern:

using System.Web.Mvc;
using MappingUIModels.Models;
using MappingUIModels.Extensions;

namespace MappingUIModels.Controllers
{
    /// <summary>Post-Redirect-Get pattern example.</summary>
    [HandleError]
    public class MyController : Controller
    {
        private const string Key = "MyKey123";

        [AcceptVerbs(HttpVerbs.Get)]
        public ActionResult Index()
        {
            return View(new myViewModel().FindAndBuild(Key));          //Build/Show
        }

        [AcceptVerbs(HttpVerbs.Post)]
        public ActionResult Index(myViewModel model)
        {
            if (ModelState.IsValid) model.Description = "Well done.";  //Update
            model.Save(Key);                                           //Save
            return RedirectToAction("Index");                          //Redirect
        }
    }
}

Take note of the FindAndBuild(Key) method on the view model. The view models are now responsible for knowing how to create themselves, similar to MVVM. Again, I've used extension methods in this simple example, and you may want to abstract this further.

And here is the view model along with its builder extension methods:

using MappingUIModels.Extensions;

namespace MappingUIModels.Models
{
    public class myViewModel
    {
        public string Name { get; set; }
        public string Description { get; set; }
    }

    public static class MyViewModelExtensions
    {
        // Returns the model from session if it exists, or builds a new one. Flushing after the
        // find gives one-shot, TempData-like reads; calling Flush on a null result is safe
        // because C# extension methods can be invoked on a null receiver.
        public static myViewModel FindAndBuild(this myViewModel model, string key)
        {
            return model.Find(key).Flush(key) ?? model.Build();
        }

        // Builds the model; services, repositories and mapping extensions can be used here.
        public static myViewModel Build(this myViewModel model)
        {
            model.Name = "Tom & Jerry";
            model.Description = "Cartoon";
            return model;
        }
    }
}
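
To stretch this to a multi-step page, each POST action can update and re-save the same session-backed model before redirecting. A hypothetical second step on the same controller (the action name and which fields it updates are illustrative) might look like this:

        [AcceptVerbs(HttpVerbs.Post)]
        public ActionResult Step2(myViewModel model)
        {
            // Merge this step's input into the model persisted by the previous step.
            var saved = new myViewModel().Find(Key) ?? model;
            saved.Name = model.Name;
            saved.Save(Key);                                           //Save
            return RedirectToAction("Index");                          //Redirect, the Index GET re-finds the model
        }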

Squashing your old SQL scripts into migrations using T4 templates.

I recently looked at pulling some old SQL scripts into a series of versioned migrations; here's how the first cut panned out…

To get started, download MigratorDotNet and the T4 Toolbox.

Install the toolbox, then create a new C# class library project referencing Migrator and Migrator.Framework.

Create an Extensions folder and add the following extension methods for MigratorDotNet:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Reflection;
using Migrator.Framework;
using System.Data.SqlClient;

namespace Datacom.BPS.Migrations.Extensions
{
    public static class MigratorExtensions
    {
        public const string StoredProceduresDirectory = "\\StoredProcedures\\";
        public const string SchemaDirectory = "\\Schema\\";
        public static readonly string ScriptPath = Path.Combine(Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location), "Scripts");

        public static string StoredProceduresPath(string version, string action)
        {
            return ScriptPath + "\\" + version + "\\" + action + StoredProceduresDirectory;
        }

        public static string SchemaPath(string version, string action)
        {
            return ScriptPath + "\\" + version + "\\" + action + SchemaDirectory;
        }

        // http://code.google.com/p/migratordotnet/issues/detail?id=77#c3
        //Database.ExecuteOutsideTransaction("sp_addrole N'bps_Administrator'");
        //Database.ExecuteOutsideTransaction("sp_addrolemember N'bps_Administrator', N'bps_Monitor'");
        public static int ExecuteOutsideTransaction(this ITransformationProvider database, string sql)
        {
            //I guess this is how we get the connection from MigratorDotNet... -Lance
            var connectionString = database.GetCommand().Connection.ConnectionString;
            int result;

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(sql, conn))
            {
                conn.Open();
                result = cmd.ExecuteNonQuery();
            }

            return result;
        }
        public static int ExecuteScriptOutsideTransaction(this ITransformationProvider database, string sqlScript)
        {
            //Same as above, but reads the SQL from a script file first.
            return database.ExecuteOutsideTransaction(File.ReadAllText(ScriptPath + sqlScript));
        }
        // http://code.google.com/p/migratordotnet/issues/detail?id=77#c3
        //Database.ExecuteScriptAsNonQuery(Version.StoredProcedure("AgentEnable.sql"));
        public static void ExecuteScriptAsNonQuery(this ITransformationProvider database, string scriptPath)
        {
            var commands = File.ReadAllText(ScriptPath + scriptPath).Commands();

            foreach (string cmd in commands)
            {
                database.ExecuteNonQuery(cmd.FixIncorrectSyntaxNearGoError());
            }
        }
        public static List<string> Commands(this string script)
        {
            //http://blogs.msdn.com/b/onoj/archive/2008/02/26/incorrect-syntax-near-go-sqlcommand-executenonquery.aspx
            var commands = script.Split(new string[] { "GO\r\n", "GO ", "GO\t" }, StringSplitOptions.RemoveEmptyEntries).ToList();
            return commands;
        }
        private static string FixIncorrectSyntaxNearGoError(this string cmd)
        {
            //Incorrect syntax near 'GO'
            //SQL requires a blank line; haven't quite got this fix working yet, so do nothing for now.
            //(cmd + (char)(10) + (char)(13))
            return cmd;
        }
        public static string Schema(this string path, string schema)
        {
            return path + SchemaDirectory + schema;
        }
        public static string StoredProcedure(this string path, string procedure)
        {
            return path + StoredProceduresDirectory + procedure;
        }

        public static List<string> FindScripts(this string path)
        {
            const string sqlFileSearchPattern = "*.sql";
            return new DirectoryInfo(path).GetFiles(sqlFileSearchPattern).Select(fi => fi.Name).ToList();
        }
        public static void MigrateUpSchemaChangesForVersion(this ITransformationProvider database, string version)
        {
            SchemaPath(version,"Up").FindScripts().ForEach(x => database.ExecuteScriptAsNonQuery(version.Schema(x)));
        }
        public static void MigrateUpStoredProcedureChangesForVersion(this ITransformationProvider database, string version)
        {
            StoredProceduresPath(version,"Up").FindScripts().ForEach(x => database.ExecuteScriptAsNonQuery(version.StoredProcedure(x)));
        }
    }
}

Create a Scripts folder and add your *.sql scripts.

It works by convention. Under your Scripts folder, add a directory for each migration version you need; the template uses the numeric part of the directory name to recognise a version (e.g. PowerCars 1.0.0.0 has the numeric part 1000), and directories without any digits, such as .svn, are skipped. Under each version directory create an Up directory, then optional Schema and StoredProcedures directories, and place all your sql scripts under those.
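
For example, with two versions and hypothetical script names:

Scripts\
    PowerCars 1.0.0.0\
        Up\
            Schema\
                CreateAgentTables.sql
            StoredProcedures\
                AgentEnable.sql
    PowerCars 1.0.1.0\
        Up\
            Schema\
                AddAgentStatusColumn.sql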

The final part is to add the T4 template. Note that once you do, it will run each time you save it, generating the migration classes and including them in the project under the Migrations folder.

Create a Migrations folder and add the T4 template:

<#@ template language="C#" hostspecific="True" debug="True" #>
<#@ output extension="cs" #>
<#@ import namespace="System.IO" #>
<#@ import namespace="System.Text.RegularExpressions" #>
<#@ include file="T4Toolbox.tt" #>
<#
	//Big Thx, lots of great code here http://code.google.com/p/simpledotnet/source/browse/trunk/src-tools/base/src/Templates/T4Toolbox.ttinclude?r=716

	string version, fileName, seqNo;
	int sequence = 1;

	foreach (DirectoryInfo di in new DirectoryInfo(GetProjectPath() + "\\Scripts").GetDirectories("*"))
	{
		version = VersionName(di.Name);
		if (string.IsNullOrEmpty(version.Trim()))
			continue; //Skip directories without numerics as these are probably not versions e.g. .svn
		seqNo = sequence.ToString("000000");
		fileName = seqNo + ".cs";
		GenerateMigration(version,seqNo,di);
		SaveOutput(fileName);
		AddFileToProject(fileName);
		sequence++;
	}
#>
<#+
  static string VersionName(string name)
  {
	//Numerics only
	return Regex.Replace(name, @"[^\d]", "");
  }

  void SaveOutput(string outputFileName)
  {
	  string templateDirectory = Path.GetDirectoryName(Host.TemplateFile);
	  string outputFilePath = Path.Combine(templateDirectory, outputFileName);
	  File.WriteAllText(outputFilePath, this.GenerationEnvironment.ToString());
	  this.GenerationEnvironment.Remove(0, this.GenerationEnvironment.Length);
  }
  static string GetProjectPath()
  {
		EnvDTE.Project project = TransformationContext.Project;
		System.IO.FileInfo info = new System.IO.FileInfo(project.FullName);
		return info.Directory.FullName;
  }
  static EnvDTE.ProjectItem FindProjectItem(EnvDTE.ProjectItems projectItems, string fileName)
  {
		foreach (EnvDTE.ProjectItem projectItem in projectItems)
		{
			if (projectItem.get_FileNames(0) == fileName)
			{
				return projectItem;
			}
		}

		return null;
  }
  void AddFileToProject(string filePath)
  {
		EnvDTE.ProjectItem outputProjectItem = FindProjectItem(TransformationContext.ProjectItem.ProjectItems, filePath);

		if (outputProjectItem == null)
		{
			// Add file to project.
			EnvDTE.ProjectItem projectItem = TransformationContext.Project.ProjectItems.AddFromFile(GetProjectPath() + "\\Migrations\\" + filePath);
		}
		else
		{
			// Check it out
			if (!System.IO.File.Exists(filePath))
			{
				// Does the file need to be checked out?
				EnvDTE.SourceControl sourceControl = outputProjectItem.DTE.SourceControl;
				if (sourceControl.IsItemUnderSCC(filePath) && !sourceControl.IsItemCheckedOut(filePath))
				{
					sourceControl.CheckOutItem(filePath);
				}
			}
		}
	}
#>
<#+
	void GenerateMigration(string version, string seqNo, DirectoryInfo migrationDirectory)
	{
		//Migration number is: YYYYMMDD[sequenceNumber]
		string migrationVersionNumber = DateTime.Today.ToString("yyyyMMdd") + seqNo;
		var nameSpace = "Datacom.BPS.Migrations";
		DirectoryInfo upDir = new DirectoryInfo(migrationDirectory.FullName + "\\Up");
		DirectoryInfo downDir = new DirectoryInfo(migrationDirectory.FullName + "\\Down");
		var schemaPath = migrationDirectory.Name;

#>using System;
using Migrator.Framework;
using Datacom.BPS.Migrations.Extensions;

namespace <#= nameSpace #>
{
	[Migration(<#= migrationVersionNumber #>)]
	public class V<#= seqNo #> : Migration
	{
		public override void Up()
		{
<#+
	if (upDir.Exists)
	{
	   if (new DirectoryInfo(migrationDirectory.FullName + "\\Up\\Schema").Exists)
	   {
#>	        Database.MigrateUpSchemaChangesForVersion("<#= schemaPath #>");
<#+
	   }
	   if (new DirectoryInfo(migrationDirectory.FullName + "\\Up\\StoredProcedures").Exists)
	   {
#>	        Database.MigrateUpStoredProcedureChangesForVersion("<#= schemaPath #>");
<#+
	   }
	}
	else
	{
#>			throw new NotImplementedException();<#+
	}
#>
		}

		public override void Down()
		{
			throw new NotImplementedException();
		}
	}
}
<#+

	}
#>
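
For a version directory containing both Schema and StoredProcedures scripts, the generated file (000001.cs for the first version) comes out like this (the migration number assumes a hypothetical template run on 15 Jan 2011):

using System;
using Migrator.Framework;
using Datacom.BPS.Migrations.Extensions;

namespace Datacom.BPS.Migrations
{
	[Migration(20110115000001)]
	public class V000001 : Migration
	{
		public override void Up()
		{
	        Database.MigrateUpSchemaChangesForVersion("PowerCars 1.0.0.0");
	        Database.MigrateUpStoredProcedureChangesForVersion("PowerCars 1.0.0.0");
		}

		public override void Down()
		{
			throw new NotImplementedException();
		}
	}
}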

Once you have the project compiling and generating your migrations, you will need to run them against your database. So here is an example of running the migrations with NAnt; you can also integrate with MSBuild (see the sketch after the batch file).

Create a NAnt build file, e.g. Scripts\migrations.build (the help-migrations, replace_dbName and compile targets it references are not shown here):

<?xml version="1.0"?>
<project name="migrator" default="help-migrations" basedir=".." xmlns="http://nant.sourceforge.net/release/0.91-alpha2/nant.xsd">
  <description>BPS database migrations script</description>

  <property name="nant.settings.currentframework" value="net-4.0" />

  <!-- DB properties -->
  <property name="db.source" value="DBServerName" overwrite="false"/>
  <property name="db.name" value="DatabaseName" overwrite="false"/>
  <property name="db.trusted" value="true" overwrite="false"/>
  <property name="db.access" value="Integrated Security=true;" />
  <property name="db.connectionstring" value="Data Source=${db.source};Database=${db.name};${db.access}" overwrite="false"/>

   <!-- Migration properties -->
  <property name="migrations.dll" value="bin\Debug\YourMigrationsDLL.dll"/>
  <property name="migrations.project" value="YourMigrationsProject.csproj"/>

  <!-- MSBuild properties -->
  <property name="project.name" value="YourMigrationsProjectName"/>
  <property name="msbuild.path" value="C:\Windows\Microsoft.NET\Framework\v4.0.30319"/>
  <property name="env.config" value="Debug"/>

  <echo message="${db.connectionstring}" />

  <loadtasks assembly="../Lib/migratordotnet/Migrator.NAnt.dll" />

  <target name="migrate" description="Migrate the database" depends="replace_dbName, compile">
    <echo message="${db.connectionstring}" />
		<property name="version" value="-1" overwrite="false" />
	  	<migrate
	    	provider="SqlServer"
	    	connectionstring="${db.connectionstring}"
	    	migrations="${migrations.dll}"
	    	to="${version}" />
  </target> 

</project>

And a batch file, placed next to the build file, to kick it off:

cd ..\..
path
%cd%\Lib\nant-0.91a2\bin\nant.exe -buildfile:YourMigrationsProject\Scripts\migrations.build -D:ReleaseEnvironment=Dev -D:db.trusted=true migrate 1>migrate.log | type migrate.log
start migrate.log
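
For MSBuild, MigratorDotNet ships a Migrator.MSBuild.dll task assembly. From memory the equivalent of the migrate target looks something like this, but verify the task and attribute names (and the assembly path, which here follows the Lib layout used above) against the MigratorDotNet documentation:

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Migrate">
  <UsingTask TaskName="Migrator.MSBuild.Migrate" AssemblyFile="..\Lib\migratordotnet\Migrator.MSBuild.dll" />
  <Target Name="Migrate">
    <Migrate Provider="SqlServer"
             Connectionstring="Data Source=DBServerName;Database=DatabaseName;Integrated Security=true;"
             Migrations="bin\Debug\YourMigrationsDLL.dll"
             To="-1" />
  </Target>
</Project>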

I accept no warranties for this work; it's not production ready, but hopefully of use.

NuGet – Sharing code in Visual Studio 2010

It’s an open source visual studio 2010 extension that lets you include other peoples code in your projects, or your own if you package it.

The extension handles the grunt work of downloading, adding references, and configuring packages you choose to add to your project. You can also easily remove packages you’ve installed.

The extension is installed automatically with MVC 3, or you can get it here.

How do I include a package in my project?

Right-click References and choose a package from the list.

[Screenshot: Add Package Reference dialog]

Or use the Package Manager Console (Ctrl+W, Ctrl+Z). You need PowerShell 2.0 installed, which is part of the Windows Management Framework Core package.

[Screenshot: Package Manager Console]

Try some of these commands:

  • List-Package
  • Add-Package Automapper (or nap Automapper)
  • Update-Package Automapper
  • Remove-Package Automapper
  • New-Package myPackage -Spec myPackage.nuspec

See the help page for a full list of commands and syntax.

Changes and references are kept local to the project you're working with, and all the downloaded files are put in a folder called \packages under your project.

Packages are not automatically updated; you get to control that. In fact, once added, it's transparent to Visual Studio, as if you had added the files and configuration by hand.
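
The record of what has been installed lives in a packages.config file in the project, something like this (the version number is illustrative):

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Automapper" version="1.0.0" />
</packages>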

Where does my list of packages come from?

The list of packages available is taken from feeds you configure as Visual Studio options.

[Screenshot: Package source settings]

Feeds are easy to set up and need no server: just create a directory that can be accessed, place all the packages (*.nupkg files) in it, then add it to the available package sources. It's that simple.

How do I create a package?

You need to create a spec file, then build a package using the New-Package command in the Package Manager Console.

The spec file includes the metadata and dependencies for the package, and you don't need to include a file list if you use the conventions specified in the documentation on building a package (an example layout follows the list):

  • tools – The tools folder of a package will be added to $env:Path (PATH) when the package is installed
  • content – These files will be copied to the site root on install
  • lib – DLLs under this folder will become assembly references on install; the root lib folder is shorthand for targeting any framework
    • lib/40 -> .NET Framework 4.0
    • lib/SL4 -> Silverlight 4.0
    • lib/net20 -> .NET Framework 2.0
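
For example, a hypothetical package laid out using these conventions:

myPackage\
    myPackage.nuspec
    lib\
        net20\
            MyPackage.dll
    content\
        readme.txt
    tools\
        install.ps1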

Spec files (*.nuspec) are XML files, so use your editor of choice.

A minimal spec looks something like this:

<?xml version="1.0"?>
<package>
  <metadata>
    <id>myPackage</id>
    <version>1.0.0</version>
    <authors>Me</authors>
    <description>An example package.</description>
  </metadata>
</package>

To update a config file, e.g. web.config, include a web.config.transform file in the package containing the sections you want to add.
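
For example, a web.config.transform that adds an app setting on install could look like this (the key and value are illustrative):

<configuration>
  <appSettings>
    <add key="myPackageSetting" value="example" />
  </appSettings>
</configuration>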

See the help page for more.
