Archive for March, 2010

For quick and easy prototypes, you’ve got to admire ASP.NET MVC and WCF RIA Services. These approaches may not be perfect out of the box, but they’re structured much better than the old “bind a DataSet to a grid and let it fly” approach of 2003. As easy as these approaches are, I’m always looking for ways to make things easier. I get a lot of bang for my buck by using SQLite as an in-memory database whenever I create a new MVC or RIA Services solution. In fact, I create four SQLite databases with each new solution: one each for application data, test data, membership/role data, and logging/tracing data. Below I describe the techniques I use with each of these databases.

System.Data.SQLite + ORM of Choice
If you’ve never used SQLite with .NET before, you’ll be happy to know that it’s as easy as can be. The System.Data.SQLite open source ADO.NET provider gives you everything you need. The provider is a complete ADO.NET implementation, including full support for the ADO.NET Entity Framework and Visual Studio design-time support – all in a 900 KB assembly. Need support for Visual Studio 2010? Ion123 includes a library compatible with 2010 in this post. So whether you use Entity Framework or NHibernate, just drop in the System.Data.SQLite DLL, create a database, wire your objects up to the ORM and go to town. Data access simply could not be easier.
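To give a feel for how little ceremony is involved, here’s a minimal sketch of using the provider against an in-memory database via plain ADO.NET (the table and column names are just for illustration):

```csharp
using System;
using System.Data.SQLite;

class SQLiteSketch
{
    static void Main()
    {
        // ":memory:" gives you a throwaway database that lives only as long as the connection.
        using (var connection = new SQLiteConnection("Data Source=:memory:;Version=3;"))
        {
            connection.Open();

            using (var command = connection.CreateCommand())
            {
                command.CommandText = "CREATE TABLE Widgets (Id INTEGER PRIMARY KEY, Name TEXT)";
                command.ExecuteNonQuery();

                command.CommandText = "INSERT INTO Widgets (Name) VALUES ('prototype')";
                command.ExecuteNonQuery();

                command.CommandText = "SELECT COUNT(*) FROM Widgets";
                Console.WriteLine(command.ExecuteScalar());
            }
        }
    }
}
```

In practice you’d let your ORM generate the schema and issue the SQL, but the connection string is all SQLite needs to stand up a database.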

SQLite-Backed Testing
There are lots of good reasons to implement proper interfaces and mock objects or stubs for the purposes of testing. Sometimes it’s just easier not to have to deal with it. SQLite-backed testing provides the perfect alternative. You can still create your unit tests, even exercising framework elements and third party libraries that aren’t always the easiest to cover with traditional mocking frameworks. Just plug in a temporary SQLite test database, write your test code just as you’d write your application code, and use one of several mechanisms to purge the data between tests. As usual, Ayende provides the definitive reference on how to do this for NHibernate. I’ve provided code below from my experiences doing this with file-backed databases for Castle ActiveRecord. Find your way to Google for references on how to accomplish this with the Entity Framework.

using Castle.ActiveRecord;
using Castle.ActiveRecord.Framework;
using Castle.ActiveRecord.Framework.Config;
using Gallio.Framework;
using Gallio.Model;
using MbUnit.Framework;
using MyNameSpace.Models;
using System;

namespace MyNameSpace.Tests
{
    public abstract class AbstractBaseTest
    {
        protected SessionScope scope;

        [FixtureSetUp]
        public void InitializeAR()
        {
            // Re-initialize ActiveRecord for each fixture against the SQLite test database.
            ActiveRecordStarter.ResetInitializationFlag();
            IConfigurationSource source = new XmlConfigurationSource("TestConfig.xml");
            ActiveRecordStarter.Initialize(source, typeof(Object1), typeof(Object2));
        }

        [SetUp]
        public virtual void Setup()
        {
            // Purge data between tests, deleting dependent rows (Object2) first.
            Object2.DeleteAll();
            Object1.DeleteAll();
            scope = new SessionScope();
        }

        [TearDown]
        public virtual void TearDown()
        {
            scope.Dispose();
        }

        // Push pending changes to the database mid-test, then start a fresh session.
        public void Flush()
        {
            scope.Flush();
            scope.Dispose();
            scope = new SessionScope();
        }
    }
}

SQLite as a Membership and Role Provider
Both ASP.NET MVC and WCF RIA Services use SQL Server ASP.NET Membership and Role Providers by default. Take SQL Server out of the equation and swap in the custom SQLite Membership and Role Providers and you can use SQLite for your security data as well. Configuration of the custom providers can all be done right in the web.config file, as illustrated below.

<configuration>
  <connectionStrings>
    <add name="MembershipConnection" connectionString="Data Source=C:\Projects\Databases\MyApp_Membership.s3db;Version=3;"/>
  </connectionStrings>
  <system.web>
    <authentication mode="Forms">
      <forms loginUrl="~/Account/LogOn"/>
    </authentication>
    <membership defaultProvider="SQLiteMembershipProvider" userIsOnlineTimeWindow="15">
      <providers>
        <clear/>
        <add name="SQLiteMembershipProvider" type="MyNameSpace.Web.Helpers.SqliteMembershipProvider" connectionStringName="MembershipConnection" applicationName="MyApplication" enablePasswordRetrieval="false" enablePasswordReset="true" requiresQuestionAndAnswer="false" requiresUniqueEmail="true" passwordFormat="Hashed" writeExceptionsToEventLog="true"/>
      </providers>
    </membership>
    <roleManager defaultProvider="SQLiteRoleProvider" enabled="true" cacheRolesInCookie="true" cookieName=".ASPROLES" cookieTimeout="30" cookiePath="/" cookieRequireSSL="false" cookieSlidingExpiration="true" cookieProtection="All">
      <providers>
        <clear/>
        <add name="SQLiteRoleProvider" type="MyNameSpace.Web.Helpers.SQLiteRoleProvider" connectionStringName="MembershipConnection" applicationName="MyApplication" writeExceptionsToEventLog="true"/>
      </providers>
    </roleManager>
  </system.web>
</configuration>

SQLite Logging and Tracing with NLog
I recently covered the integration of NLog with SQLite. A simple configuration file entry and all of your log and trace output can go into a single SQLite database.


One of the things I’m often asked to do for clients is to create an applicability matrix. That is, which technology applies best to which particular challenges in an enterprise? There would seem to be an acute need for this type of clarification in the realm of Microsoft’s service technologies. With the recent releases of the Windows Process Activation Service (WAS) on Windows Server 2008, WCF 3.5 and 4.0, Windows Server AppFabric, BizTalk 2009 and 2010, and Windows Azure AppFabric, the waters of Microsoft’s service and integration technologies are muddy indeed. In this post, I’m going to provide some clarification: explaining what new service and integration offerings are on the way from Microsoft, offering a frame of reference on how I see them applying to enterprise customers, and furnishing references to materials you can use to educate yourself in these technologies.

Let’s start off with a quick tour of Microsoft’s new service and integration offerings. Specifically, I’m going to cover WCF 4.0, Server AppFabric, and Azure AppFabric. In this overview, I’m going to restrict the discussion to technologies that specifically relate to the challenges of traditional large enterprise application integrations. Interesting aspects of Microsoft’s new offerings, such as WCF 4.0 RESTful service support (incorporated from the WCF REST Starter Kit) and AppFabric Caching (formerly known as ‘Velocity’), will not be covered in detail.

Windows Communication Foundation (WCF) 4
This release focuses on ease of use along with new features, such as routing, support for WS-Discovery, and enhancements from the WCF REST Starter Kit.
Key Enterprise Application Features

  • A complete message routing solution that is useful for the following scenarios: redundancy, load balancing, protocol bridging, and versioning.
  • Support for the WS-Discovery protocol, which allows the discovery of services on a network. Support is provided via managed mode, which uses a centralized discovery proxy, and via ad hoc mode, in which the location of the service is broadcast.
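As a taste of the ad hoc mode, a client can probe the local network for compatible endpoints at runtime using the discovery API in System.ServiceModel.Discovery. A rough sketch (the ICalculator contract is hypothetical):

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Discovery;

[ServiceContract]
interface ICalculator
{
    [OperationContract]
    double Add(double x, double y);
}

class DiscoverySketch
{
    static void Main()
    {
        // Broadcast a probe over UDP for any endpoint implementing ICalculator.
        var discoveryClient = new DiscoveryClient(new UdpDiscoveryEndpoint());
        FindResponse response = discoveryClient.Find(new FindCriteria(typeof(ICalculator)));

        // Each match carries the metadata needed to build a client channel.
        foreach (EndpointDiscoveryMetadata endpoint in response.Endpoints)
        {
            Console.WriteLine(endpoint.Address);
        }
    }
}
```

Managed mode works the same way from the client’s perspective, except the probe goes to a centralized discovery proxy instead of being broadcast.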
References

Windows Server AppFabric
The best way to think of Windows Server AppFabric is as a replacement for the COM+ hosting environment. In the same way that WCF unified and replaced web services, remoting, and DCOM, AppFabric is replacing the COM+ hosting environment. Hosting administration, monitoring services, and management tools allow AppFabric to play this role. It also includes workflow persistence and a distributed caching platform.
Key Enterprise Application Features

  • A WAS-based hosting environment, which includes durable workflow hosting. Includes tools for managing, monitoring, and querying in-flight services and workflows.
  • Workflow persistence that allows AppFabric workflows to scale across machines. This includes the ability to monitor long-running workflows.
  • Health monitoring and troubleshooting of running WCF and WF services. High performance instrumentation based on Event Tracing for Windows (ETW) with analytics from a SQL monitoring store leveraging SQL Server Reporting Services (SSRS).
  • Management of services and workflows through the AppFabric dashboard, an extension to the IIS manager. PowerShell cmdlets enable management of services and workflows from the PowerShell console and enable further automation of AppFabric.
References

Windows Azure AppFabric
Branded as the Azure cloud-based version of its Windows Server-based counterpart, Azure AppFabric is perhaps better understood as a parallel service in the cloud. It provides features that Server AppFabric doesn’t, such as cloud-based relay, a service registry, and a service bus buffer. At the same time, several of Server AppFabric’s core features, such as workflow persistence and health monitoring, either aren’t built in or don’t make sense for the cloud-based version. It remains to be seen whether these two products will ever achieve true parity.
Key Enterprise Application Features

  • A relay service that removes the need for point-to-point bindings, instead routing non-transactional calls through the cloud.
  • A service bus registry that provides an ATOM feed of services listening on a particular namespace.
  • A variety of service bindings that represent a rough subset of the WCF bindings. These include a WS-compliant binding as well as a TCP binding that operates in several modes, including a hybrid mode that can promote a connection from a cloud-based relay to a more direct connection.
  • A cloud-based service bus buffer queuing service. It is MSMQ-like and usable by both the client and the server, with the condition that the queues are cloud-based. It allows messages to be stored on the bus for a configurable amount of time, even if the service endpoint is not available.
  • A robust service authentication service, based upon claims-based application-to-application authentication.
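To give a feel for the relay bindings, here’s a rough sketch of exposing a service through the cloud with the TCP relay binding (the service namespace, contract, and path are placeholders; you’ll also need the Microsoft.ServiceBus assembly and its binding extensions registered in your config, plus credentials configured for the relay):

```xml
<system.serviceModel>
  <services>
    <service name="MyNameSpace.EchoService">
      <!-- The sb:// address routes calls through the Azure AppFabric relay
           rather than binding to a local port. -->
      <endpoint address="sb://mynamespace.servicebus.windows.net/EchoService"
                binding="netTcpRelayBinding"
                contract="MyNameSpace.IEchoService" />
    </service>
  </services>
</system.serviceModel>
```

The service still runs wherever you host it; the relay simply gives clients a stable, cloud-reachable address so neither side needs a point-to-point connection.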
References

What I’ve found is that knowledge of these new service and integration offerings alone does not get you to the point where you intuit when to apply them to enterprise application integration challenges. Therefore, I have begun to cluster these technologies together and think about what the best use cases are for each of the respective technologies. The image below represents these clusters, along with the archetype use case and particular features of the clusters’ technologies. This clustering represents a fundamental simplification of reality and doesn’t account for many of the shades of gray. Decisions such as whether workflows are best hosted in WF under AppFabric or under BizTalk are best made by application architects, based upon their knowledge of the organizational, business and technical constraints that impact their applications. That said, these clusters represent what I feel to be sound heuristics for Microsoft service and integration decisions over the next several years.

Microsoft Service Integration Technologies


I’ve been ranting to some colleagues about a particularly useful table that shows the interactions between WCF’s InstanceContextMode and ConcurrencyMode behaviors. I referenced it in a conversation again today and decided that I needed to go hunt down the phantom table so that it haunted me no longer.

I thought the table was in Lowy’s Programming WCF Services, but full attribution goes to Essential Windows Communication Foundation: For .NET Framework 3.5, another of my favorite WCF books. I’ve copied the table below so that I can refer to it by hyperlink forever more.

The table is awesome because it shows you the results of the different ways of combining these two important WCF concurrency settings, including the default combination.  Without some trial and error, it’s not always easy to intuit what the combination of these settings means. This table makes it easy.
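For context, the two settings are combined on a service implementation via the ServiceBehavior attribute. A quick sketch (the service and contract names are hypothetical):

```csharp
using System.ServiceModel;

[ServiceContract]
public interface ICalculator
{
    [OperationContract]
    double Add(double x, double y);
}

// InstanceContextMode controls how many service instances are created
// (PerCall, PerSession, or Single); ConcurrencyMode controls how many
// threads may enter each instance (Single, Reentrant, or Multiple).
// The WCF defaults are PerSession + Single.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerCall,
                 ConcurrencyMode = ConcurrencyMode.Single)]
public class CalculatorService : ICalculator
{
    public double Add(double x, double y)
    {
        return x + y;
    }
}
```

The table spells out what each of the nine combinations actually does at runtime, which is exactly what’s hard to intuit from the attribute alone.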

WCF Context and Concurrency


I’ve had the opportunity to spend the last week or so investigating a system integration challenge involving PowerBuilder and .NET communicating with web services hosted on a mainframe. It’s been an interesting experience that’s enabled me to dive deep into .NET and to learn a bit about where PowerBuilder is at and where it’s heading. My outtakes follow:

  • What’s .NET got to do with it? I was puzzled at first when I was asked to assist with a challenge integrating a PowerBuilder client with mainframe web services. I saw .NET in the middle of the diagram and wondered, “what’s .NET have to do with a PowerBuilder client-to-mainframe integration?”
  • .NET is PowerBuilder’s future. It seems that the future of PowerBuilder is bound to .NET. Check out the diagram below or the PowerBuilder 11.5 (current) and 12.0 (beta) features: support for Code Access Security (CAS), IIS7, WPF and WCF.

  • Why Use PowerBuilder at all? I’ve been asking myself this. PowerBuilder 12 is being touted as a “complete and highly productive .NET development solution”. Uh… isn’t that Visual Studio? Even with legacy compatibility considerations, isn’t it eventually just time to cut the cord? Didn’t we learn anything from VB.NET?
  • Diving Deep. To my final point, as a non-PowerBuilder guy, I’ve been relegated to analyzing the PowerBuilder-to-.NET interaction from the outside in. The process of generating proxies (shown below), first in .NET for a SOAP-based service and then in PowerBuilder to interoperate with .NET, provides lots of opportunities for suboptimal behavior. Process Monitor from Sysinternals really is man’s best friend in these situations. When I compare the traces from the integration attempts with my pure .NET test calls, I see a bunch of opportunities for optimization. Now I need to find someone who understands how PowerBuilder works under the hood so that we can determine how to optimize the .NET invocation.


This is one of those seemingly trite blog entries – unless you’re actually trying to integrate System.Data.SQLite with NLog, in which case it’s invaluable. SQLite and NLog really are the perfect combination for lightweight logging. You avoid the sprawl of file-based logs over time, can execute SQL queries against your logs, and have an absolutely minimal database footprint to deal with. If only you could get the configuration correct…

The NLog documentation provides some hints to get you going in the right direction for database logging. However, no matter how much spelunking I did around the Net, I couldn’t find a definitive answer on how to configure NLog to use SQLite. The configuration file that worked for me can be found below. Exact mileage may vary based upon your project setup, but this should get you 95% of the way there. Happy logging!

<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <targets>
    <target name="File" xsi:type="File" fileName="C:\Logfiles\${shortdate}.nlog.txt"/>
    <target name="Database" xsi:type="Database" keepConnection="false"
            useTransactions="false"
            dbProvider="System.Data.SQLite.SQLiteConnection, System.Data.SQLite, Version=1.0.65.0, Culture=neutral, PublicKeyToken=db937bc2d44ff139"
            connectionString="Data Source=C:\Projects\Databases\Logging.s3db;Version=3;"
            commandText="INSERT into LOGTABLE(Timestamp, Loglevel, Logger, Callsite, Message) values(@Timestamp, @Loglevel, @Logger, @Callsite, @Message)">
      <parameter name="@Timestamp" layout="${longdate}"/>
      <parameter name="@Loglevel" layout="${level:uppercase=true}"/>
      <parameter name="@Logger" layout="${logger}"/>
      <parameter name="@Callsite" layout="${callsite:filename=true}"/>
      <parameter name="@Message" layout="${message}"/>
    </target>
  </targets>
  <rules>
    <logger name="*" minlevel="Debug" writeTo="Database" />
  </rules>
</nlog>
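With that configuration in place, writing to the database is just ordinary NLog usage; each call below ends up as a row in the LOGTABLE defined by the commandText above:

```csharp
using NLog;

class LoggingSketch
{
    // Resolves a logger named after the declaring class.
    private static readonly Logger Logger = LogManager.GetCurrentClassLogger();

    static void Main()
    {
        // Routed to the Database target by the "*" rule at minlevel="Debug".
        Logger.Debug("Starting up");
        Logger.Error("Something went wrong");
    }
}
```

From there, querying your logs is just SQL against the SQLite file.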
