Top 10 Things to Know When Integrating with Line of Business Systems
09 May 11 11:39 PM | wmmihaa

We are once again fortunate to have prominent speakers visiting our user group in Sweden. This time it's Kent Weare and Richard Seroter. Both have been here before, and both have been much appreciated speakers. (BTW, if you are aware of any challenges happening in Sweden at the time of the event, such as an Ironman, a marathon or cage fighting, let me know so we can sign Kent up.)

Kent Weare recently led a team of authors in their production of the book Microsoft BizTalk 2010: Integrating with Line of Business Systems (Packt Publishing, 2011).  This book walks through multiple technologies and how to integrate with them via BizTalk Server 2010.  Join Kent and Richard as they deliver the Top 10 Things to Know When Integrating with Line of Business Systems.
In these sessions, they will walk us through numerous key principles to follow when doing system integration, drawing inspiration from their new book.  These principles will be demonstrated by integrating BizTalk Server 2010 with SharePoint, Windows Azure, SAP and software-as-a-service providers.

Monday the 13th of June, 18:00, at Microsoft's office in Akalla

If you have an opportunity to join us, sign up. If you need help with travel arrangements, such as booking a hotel, let us know by dropping us an email.


PowerShell cmdlet for BizTalk db restore
22 February 11 08:59 AM | wmmihaa | 4 comment(s)

Configuring the backup job for BizTalk is a fairly simple task, while restoring it is a bit more complicated. By default, the BizTalk backup job makes a full backup once a day and a log backup every 15 minutes. When backups are done, a mark is set on each file. This mark is the same across all databases, and should be used to restore all databases to the same point in time, keeping them in a consistent state. Also, by default, all backups are written to the same folder.

The only disaster recovery procedure supported by Microsoft is log-shipping. Nick Heppleston has gone through the trouble of describing this in great detail, and I strongly recommend reading those posts before you choose any other approach.

A few weeks ago, I sent out a question on Twitter, asking whether people used log-shipping or not. I got 24 responses, of which only 4 used log-shipping.

Although log-shipping comes with many advantages, it is expensive since it requires a secondary SQL cluster. Most of the people I asked confirmed this was the main reason they had chosen other alternatives. Since BizTalk doesn't come with any restore scripts or features other than log-shipping, everyone is left to solve this on their own.

If you’re in the same situation, feel free to download this sample. If it doesn’t fit your solution, it might at least be a good starting point.

The sample comes with two cmdlets: Get-Marks and New-RestoreDatabaseFromMark. The first gives you a list of all marks from all log files. The second, as the name implies, restores a database to a specific mark. When doing so, the database is first restored from the last full backup taken before the mark. After that, all log backups made after that full backup are restored in order, and the last log file is only restored up to the specified mark.
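The restore ordering just described can be illustrated with a small conceptual sketch (Python used purely for illustration; this is not the cmdlet's actual implementation, and the backup list and numeric marks are made up):

```python
# Conceptual sketch of restore-to-mark selection (not the actual cmdlet code).
# Each backup is a (type, mark) tuple where marks increase over time.
def plan_restore(backups, target_mark):
    """Return the backup sequence needed to restore to target_mark."""
    # The last full backup taken at or before the target mark.
    fulls = [b for b in backups if b[0] == "full" and b[1] <= target_mark]
    start = max(fulls, key=lambda b: b[1])
    # All log backups taken after that full backup, up to and including the mark.
    logs = [b for b in backups if b[0] == "log" and start[1] < b[1] <= target_mark]
    return [start] + sorted(logs, key=lambda b: b[1])

backups = [("full", 100), ("log", 115), ("log", 130), ("full", 200), ("log", 215)]
print(plan_restore(backups, 130))  # [('full', 100), ('log', 115), ('log', 130)]
```

The last log backup in the returned sequence is the one the cmdlet restores only up to the mark (SQL Server's `STOPATMARK`), which is what keeps all databases consistent with each other.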


The Get-Marks cmdlet queries the backup output folder to retrieve all marks. The mark is part of the name of each backup file:


Each file is made up of the following parts:


E.g. SERVER001_DTA_BizTalkDTADb_Log_BTS_2011_01_18_12_06_50_22.bak
* The instance is only present for non-default instances.
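Assuming the mark always appears at the end of the file name in this shape, extracting it is a simple pattern match (a hypothetical helper sketched in Python for illustration; it is not part of the sample):

```python
import re

def extract_mark(filename):
    """Pull the backup mark (BTS_yyyy_mm_dd_hh_mm_ss_nn) out of a BizTalk
    backup file name. Returns None if the name contains no mark."""
    match = re.search(r"(BTS_\d{4}(?:_\d{2}){6})\.bak$", filename)
    return match.group(1) if match else None

print(extract_mark("SERVER001_DTA_BizTalkDTADb_Log_BTS_2011_01_18_12_06_50_22.bak"))
# BTS_2011_01_18_12_06_50_22
```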

You can use the New-RestoreDatabaseFromMark cmdlet with or without specifying the mark. Leaving the mark empty is equivalent to using the last mark.

The New-RestoreDatabaseFromMark cmdlet is called once per database, so it's easier to create a script that restores all databases together. The sample comes with a RestoreScript.ps1 script file, which could work as a good start:

$backupPath = "X:\BizTalkBackUp";
$dataPath = "E:\SQL Server 2008\MSSQL10.MSSQLSERVER\MSSQL\DATA";
$logPath = "E:\SQL Server 2008\MSSQL10.MSSQLSERVER\MSSQL\DATA";

# Trap any exception thrown in this scope and stop the script.
trap [Exception]
{
    Write-Error $_.Exception.Message;
    break;
}

$mark = Read-Host "Specify mark (use the Get-Marks cmdlet to get all marks, or blank to use the last mark)";

if ($mark.Length -eq 0)
{
    Write-Output "Restoring to last mark...";
}

New-RestoreDatabaseFromMark SSODB $backupPath $dataPath $logPath $mark;
New-RestoreDatabaseFromMark BAMPrimaryImport $backupPath $dataPath $logPath $mark;
New-RestoreDatabaseFromMark BizTalkDTADb $backupPath $dataPath $logPath $mark;
New-RestoreDatabaseFromMark BizTalkMgmtDb $backupPath $dataPath $logPath $mark;
New-RestoreDatabaseFromMark BizTalkMsgBoxDb $backupPath $dataPath $logPath $mark;
New-RestoreDatabaseFromMark BizTalkRuleEngineDb $backupPath $dataPath $logPath $mark;

Write-Host "Done restoring all BizTalk databases" -ForegroundColor Yellow;

The first section of the script defines a set of path variables. In my simple sample, all database files are located in the same folder. This is not good practice, so you will probably have different paths for the data and log files of each database.

To use the sample, open PowerShell and navigate to the sample output folder, e.g.:

PS C:\> cd "C:\Program Files\bLogical.BizTalkManagement"

Before you can use the cmdlets, you need to install them. You can do this using the install script:

PS C:\Program Files\bLogical.BizTalkManagement> .\Install.ps1

After you've installed the snap-ins, you can start using the commands:

PS C:\Program Files\bLogical.BizTalkManagement> Get-Marks "X:\BizTalkBackUp"


PS C:\Program Files\bLogical.BizTalkManagement> .\MyRestoreScript.ps1




Using AppFabric Cache as WF Persistence store
01 February 11 09:00 PM | wmmihaa | 2 comment(s)

Paolo Salvatori and I had this idea about using AppFabric Cache as a workflow persistence provider. As we've been working on this for quite some time now, we're happy to finally share the work with everyone else thinking about how workflow persistence works, or perhaps considering building a provider of their own.

Developing your own persistence provider is not the simplest task you can take on, but it’s perfectly doable. I was surprised by the absence of persistence provider samples on the web. The only helpful one I could find was the MemoryStore, which helped me get started.

The funny thing with this sample was that I spent most of the time trying to figure out why it worked, not why it didn't.

I want to thank Paolo Salvatori, Manu Srivastava and Ruppert Koch from the AppFabric CAT and Product group for helping out and clarifying how the underlying plumbing works. More on that at the end of this post.

What is Workflow Persistence?

Unless we enable persistence, a workflow instance is only stored in memory. If the host goes down, it will take any evidence of your workflow instance with it. The process of persisting a workflow to a durable repository or store is known as dehydration, while restoring it is generally referred to as rehydration. If you are a BizTalk person, you are probably aware that the persistence store for BizTalk is the BizTalkMsgBoxDb database.

Much like BizTalk, Workflow Foundation comes with a SQL persistence store: the SqlWorkflowInstanceStore provider. With WF, however, it's optional: not only can you choose whether to use persistence at all, you are also free to choose other providers, or even build one yourself.

You can enable persistence either through configuration or code. If you're hosting your workflow in IIS/AppFabric, you can even manage persistence through IIS Manager.

Why and when would we use persistence?

You might end up using persistence for different reasons, which might also have an impact on which provider you'd choose. As I'm a "BizTalk guy", I generally think of persistence in terms of 1) a safety net in case something goes wrong, and 2) resource management, off-loading running instances to reduce memory consumption. In most cases you're looking for a safe and robust provider, to make sure you are not going to lose sensitive data along with the internal state of the workflow instance.

AppFabric Cache might not be your first choice if you are looking for a secure and reliable store provider. As Stephen W. Thomas put it: “Never put anything in the Cache that MUST not be lost – it is a non-transactional cache, not a database”.

However, persistence is also about scalability. If you, for example, are using WF to control the page flow of an ASP.NET application running on multiple web servers, you'd need persistence to make sure one server can continue a workflow that was initiated on another server. Furthermore, in any scenario where workflow services are accessed in a session-less manner, it is also important to persist immediately when the workflow becomes idle. Otherwise the second request to the workflow might be redirected to another server while the instance is still active in memory on the first one. We can control this behavior using the WorkflowIdleBehavior.TimeToUnload setting in the web-/app.config:

<sqlWorkflowInstanceStore connectionStringName="persistenceStore" />
<serviceMetadata httpGetEnabled="true"/>
<workflowIdle timeToUnload="00:00:00"/>
<serviceDebug includeExceptionDetailInFaults="true"/>

The default value for the property is 1 minute. Unloading a workflow implies that it is also persisted. If TimeToUnload is set to zero, the workflow instance is persisted and unloaded immediately after the workflow becomes idle. Setting TimeToUnload to TimeSpan.MaxValue effectively disables the unload operation; idle workflow instances are then never unloaded.

In some cases, for example when using WF to control the UI of your application, you may consider using a less reliable provider. In such cases storing your workflow state in AppFabric Cache could be a good fit. In fact, I would not be surprised if Microsoft shipped an AppFabric Cache provider in the near future.

But I won’t lie to you, – I think AppFabric Cache is super cool, and would make up any excuse to play with it!

What is correlation?

First, let's have a look at the sample workflow. It's a very simple sample that simulates the process of working with some sort of document form. The user may save the document and come back later to continue working on its content.


  1. The client calls the workflow service through the CreateDocument method. This causes the runtime to create a new instance of the workflow.
  2. After receiving the new Document, the workflow assigns it to a local variable.
  3. As the response is sent back to the client, the workflow initializes the correlation on the Document.Id (string).
  4. While the workflow waits for the client to send an updated document, it becomes idle and is therefore persisted to the AppFabric Cache store.
  5. The client invokes the UpdateDocument method. As the workflow instance is not in memory, the runtime asks the persistence provider for the serialized workflow and then rehydrates it.
  6. The workflow instance then performs the following actions:
      • updates the local document variable;
      • returns the response;
      • persists its internal state;
      • continues to listen for updates until the user submits "I'm done".

The only "complex" part of this workflow is making sure the second call (and all subsequent calls) is "routed" to the right workflow instance. Keep in mind that there might be any number of concurrent documents being processed in the system. This is solved using WF correlation, and in particular content-based correlation.

If you're a BizTalk guy, this is nothing new; however, when I teach BizTalk classes, people seem to have a bit of trouble understanding the concept. A correlation set is basically a "primary key" for the instance of the workflow. The id is created on the first call to the workflow, and needs to be passed on to the runtime for all subsequent calls. Using WF, this is very simple; all you need to do is set the CorrelationInitializers of, for instance, a Send activity:


This will cause the workflow runtime to create the instanceId (Guid) and add it to the binding context of the client. This way the instanceId is passed along in the SOAP header. Although this might work fine in most situations, there are some limitations: if the client does not have the instanceId, as when different clients are interacting with the same workflow, this won't work. You are therefore better off using the content-based correlation type, the Query correlation initializer. For more information about the different kinds of correlation that WF provides, have a look at Paolo's post.


This correlation type lets you define the id of your workflow, based on the content of the message. But keep in mind it needs to be unique.

So how does a persistence provider work?

The simple answer is: it serializes the workflow along with its metadata to a store. The persistence provider could use any durable medium, such as a file, database or queue. AppFabric Cache does not fall into the category of durable media, but we will ignore Stephen's lame warnings for now, and just concentrate on the sweet coolness of AppFabric Cache.

To begin with, a persistence provider needs to inherit from InstanceStore (in System.Runtime.DurableInstancing), an abstract class with a number of operations you need to override. The most important one is BeginTryCommand. This is a universal operation which will be called by the runtime. With it comes an InstancePersistenceCommand parameter which tells you what the runtime expects you to do. This can be any of several commands, the most common being CreateWorkflowOwnerCommand, LoadWorkflowCommand, LoadWorkflowByInstanceKeyCommand and SaveWorkflowCommand.

(there are more commands, but these will do for now…)


If the last SaveWorkflowCommand has CompleteInstance set to true, this indicates the workflow is done. In my case, this is where I clean up the cache, even though the cache will eventually clean itself up, as I add to the cache using the timeout parameter. You can set this yourself in the web.config:

<bLogicalPersistenceStore cacheName="bLogical" timeout="00:05:00"/>
<serviceMetadata httpGetEnabled="true"/>
<workflowIdle timeToUnload="00:00:01"/>
<serviceDebug includeExceptionDetailInFaults="true"/>

The LoadWorkflowByInstanceKeyCommand is where I got stuck. As I said in the beginning, it worked, I just couldn't figure out why. Below is the code in my LoadWorkflowByInstanceKey method:

private IAsyncResult LoadWorkflowByInstanceKey(InstancePersistenceContext context,
    LoadWorkflowByInstanceKeyCommand command,
    TimeSpan timeout,
    AsyncCallback callback,
    object state)
{
    Key key = _cacheHelper.GetKey(command.LookupInstanceKey);
    Instance instance = _cacheHelper.GetInstance(key.Instance);
    // ...
}

I'm loading the workflow using the LookupInstanceKey value, which is a Guid. Where did that come from? I was expecting my correlation key, the Document.Id. I was starting to believe the correlation key was stored (together with the LookupInstanceKey) somewhere else. I started to trace the message to see if there was anything hidden in the header, but there wasn't. I tested different clients collaborating on the same workflow instance, and that worked too. I even set up an environment with multiple servers – AND IT STILL WORKED! How was this possible?

As I was losing sleep over this, I turned to Paolo Salvatori, hoping he could shed some light on it. Eventually, Manu Srivastava and Ruppert Koch explained how the magic works. It turns out the LookupInstanceKey is a 128-bit hash of the actual content-based key.

Manu Srivastava:

“The Hash == InstanceKey.Value == CorrelationKey == LookupKey. These terms all mean the same thing. InstanceKey.Value contains the value of the hash, which is of type GUID and represents the correlation / lookup key []”

“It is the responsibility of the Provider to store the Workflow Instance, the Correlation Keys *and* the mapping between the two. The LoadWorkflowByInstanceKeyCommand has a LookupKey as an argument. This is the correlation key. This is the key the custom provider implementation must use to identify and then return the Workflow Instance. The hashing algorithm used is irrelevant to the implementation of the Provider. The Provider only has to worry about storing the InstanceKey.Value and retrieving an instance via the InstanceKey.Value. The transformation between Document.Id and the InstanceKey is a WF runtime detail that you do not need to worry about for your implementation; thus, you do not need to worry about the hashing algorithm itself. When you persist an instance, save the InstanceKey.Value and its mapping to InstanceId. When you load an instance, use the LookupKey to find the correct InstanceId and return the WorkflowInstance. That's it. You don't need to go from Document.ID to hash or vice-versa...thats handled at the WF Runtime Layer. Just focus on saving InstanceKey.Values and loading by LookupKey. =) InstanceKey.Value == LookupKey in this particular scenario”

In other words: – Concentrate on the problem, and leave the plumbing to us! –Point taken.
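The division of labor Manu describes, where the runtime hashes the content key and the provider only stores mappings, can be sketched with a toy in-memory store (Python for illustration only; the class and method names are made up and bear no relation to the real InstanceStore API):

```python
import uuid

class ToyInstanceStore:
    """Toy model of a persistence provider's bookkeeping: it stores
    serialized instances plus the mapping from lookup keys to instances."""
    def __init__(self):
        self.instances = {}  # instance id -> serialized workflow state
        self.key_map = {}    # lookup key (hash of content key) -> instance id

    def save(self, instance_id, lookup_keys, state):
        self.instances[instance_id] = state
        for key in lookup_keys:  # remember which keys identify this instance
            self.key_map[key] = instance_id

    def load_by_instance_key(self, lookup_key):
        # The provider never sees Document.Id, only the hashed lookup key.
        return self.instances[self.key_map[lookup_key]]

# The WF runtime (not the provider) turns the content key into a GUID-shaped
# hash; uuid5 merely stands in for that opaque transformation here.
store = ToyInstanceStore()
instance_id = uuid.uuid4()
lookup_key = uuid.uuid5(uuid.NAMESPACE_OID, "Document.Id=42")
store.save(instance_id, [lookup_key], "serialized workflow")
print(store.load_by_instance_key(lookup_key))  # serialized workflow
```

The point of the sketch: the provider only ever maps InstanceKey.Value to InstanceId and back, exactly as Manu describes; how the hash was produced never matters to it.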

Running the sample

Download and install:

Before running the sample, you need to properly configure the environment and in particular you have to create the cache used by the custom Persistence Provider.

To accomplish this task, you can perform the following steps:

1. Download the Windows Server AppFabric.

2. Read Scott Hanselman's post on how to set it up.

3. Open Caching Administration Windows PowerShell through the Start menu, and run the following commands:

  • Start-CacheCluster
  • New-Cache bLogical


4. Download and run the sample.

This is probably a good time to point out that the sample provided with this post is not production ready. However, if you want to take it further, I'd be happy to help out! (</disclaimer>)

Again, many thanks to Paolo Salvatori, Manu Srivastava and Ruppert Koch from the AppFabric CAT and Product group.


My 15 favorite blog posts of 2010
22 December 10 08:41 PM | wmmihaa | 7 comment(s)

I've taken the time to reflect on the blogs I follow, and thought about the impact they've made on my work over the last year. I've also credited posts simply for being interesting or cool, and perhaps useful in the future.

I've written this post as a tribute to those who took their valuable time to share their thoughts and experience with the rest of us.


How to exploit the Text In Row table option to boost BizTalk Server Performance

by Paolo Salvatori

As I've spent much of the last year focusing on BizTalk performance, I found these settings to make a huge difference. Applying the "text in row" table option on all tables storing the actual message not only improved the message throughput but also greatly reduced the CPU utilization.


Creating a bootable VHD the easy way

by Hugo Häggmark

Creating a bootable virtual hard drive is not as easy as it's made out to be! I struggled quite a bit, and as I was complaining about it in the office one day, Hugo said: "I've written a blog post about it…". These posts are great, and it really is the EASY way!


How To Boost Message Transformations Using the XslCompiledTransform - Series

by Paolo Salvatori

Paolo Salvatori's blog should be in every BizTalker's feed subscription. I'm amazed by the detail and effort of his research. In this post he shows how to use the XslCompiledTransform class instead of the traditional XslTransform, which is used by the Transform shape in orchestrations. I found there to be a memory consumption issue that needs to be sorted out. Nevertheless, it's a very interesting concept, and a great series of posts.

WCF-SQL Adapter Table Operations

by Steef-Jan Wiggers

As I was working with the WCF-SQL adapter a couple of months ago, I was really happy to find Steef-Jan's post on the subject. The post shows how to use all the table operations using the "new" SQL adapter. Thanks Steef-Jan, and again, congrats on the MVP award.

Four Questions - Series

by Richard Seroter 

Every time my feed reader notifies me of a new "Four Questions" post, I instantly have to read it. It's always interesting to hear what other people within the community and Microsoft are up to, and Richard's questions are always relevant to what is currently happening in the field. And of course it always ends up with the last question, where Richard's humor and sarcasm come to good use.

I was especially intrigued by the last post where he interviews one of the true heroes of the BizTalk community – Ben Cline.

Mapping in BizTalk 2010: My favorite new features - Series

by Randal van Splunteren

Randal, also a newly decorated MVP, was kind enough to share the features of the new mapper that shipped with BizTalk 2010. If you're a BizTalk dev, I really recommend reading these posts, as you're likely to find some features you didn't already know about.

BizTalk 2010: Musing of the ‘new’ SharePoint 2010 WS Adapter

by Mick Badran

As I was putting together a SharePoint / BizTalk lab using the new SharePoint adapter, I was happy to have found this great post. It saved me lots of time. Thanks Mick!

ShareTalk Integration (SharePoint/BizTalk) series

by Kent Weare

Yet another series of really good posts if you plan to integrate with SharePoint. Despite being a compulsive gambler from Canada, Kent is a great guy who shares a lot of BizTalk experience through his blog. Rumor has it he is also writing a book… And besides his interest in BizTalk, he also shares his thoughts on Windows Phone 7 through his new WP7 blog.

BizTalk Adapter Pack 2.0/SAP Adapter series

by Kent Weare

I'm not currently working with SAP, even though I know I probably will be in the future. By then, I'm sure I'll thank Kent again for taking the time to write these posts.

Benchmark your BizTalk Server (Part 3)

by Ewan Fairweather

If you're a true hard-core BizTalker and think performance is important, then this is a must-read article. Take your time and read it a couple of times, as it's very detailed. Ewan has also been kind enough to share his script files for helping you identify bottlenecks.

Large Message Transfer with WCF-Adapters – Series

by Paolo Salvatori

I find it funny that Paolo sometimes makes an effort to split a topic into two parts. Every one of those posts could easily be split into five parts, and together they make up a small book. Paolo does not write posts – he writes essays!

Transferring large messages is a common challenge in BizTalk. These posts cover the subject in detail, and show how to effectively minimize the use of resources while still transferring large files through BizTalk. The content of these posts was also demonstrated by Ewan Fairweather during his talk at the Swedish BizTalk User Group.

XmlDisassemble in a passthrough pipeline?

by Johan Hedberg

The discovery that BizTalk adds an XML disassembler stage to a passthrough pipeline was an interesting fact, to say the least. Frightening might be a better choice of words. Johan explains under which circumstances this happens. "Funny" enough, I've come across this issue twice this year.

Modernizing BizTalk Server BAM with PowerPivot

by Jesus Rodriguez

I couldn't agree more – BAM is one of my favorite features of BizTalk. With the release of SQL Server 2008 R2 came PowerPivot for SharePoint and Excel, and even though I haven't got around to testing PowerPivot yet, I really find this interesting, and I'm sure I'll get back to the post later on.

Less Virtual, More Machine - Windows 7 and the magic of Boot to VHD

by Scott Hanselman

With Windows 7 came the "boot to VHD" feature. I generally don't want to install anything but the Office suite on my laptop. This is because I tend to try out a lot of CTP releases, along with the fact that I work with different customers, where it's good practice to separate the environments. I solve that by always working in virtual environments. Windows Virtual PC only supports 32-bit guests, and even if I used VMWare or VirtualBox, I would not fully utilize the capacity of my laptop.

I've come across many high-level demos showing the "boot to VHD" feature in action. But it's not as easy as it seems. Every time I need to add a VM to my boot menu, I return to this post.

Nesting Scope shapes more than 19 levels

By Jan Eliasen

The knowledge of this limitation is not likely to come to any good use (for anyone, I hope). But the effort of finding this flaw cannot go unnoticed! Thank you, Jan.

10,000 SEK for a developer!
09 December 10 10:05 PM | wmmihaa

We are in desperate need of developers! If you point me to anyone (yourself included), I'll split the finder's fee of 20,000 SEK with you if we hire that person.

Who are “we”?

Enfo Zystems is a company with a long standing commitment to integration and service orientation. In fact – It’s all we do! We are currently expanding on the Microsoft platform with focus on BizTalk, AppFabric and the cloud offerings from Microsoft.

The commitment and focus of this company led Johan Hedberg and me to join Zystems. We were, and still are, amazed by the dedication of everyone we've met. Everyone, from devs to sales, knows and understands integration and service orientation. We've even had discussions about BAM with our CEO!

What do we offer?

Right now we are looking to set up a delivery center, from which we'll work as a unit, delivering solutions to projects, as opposed to selling consultants per hour. This means you'll be working on-site from our office in Kista, together with your colleagues, delivering solutions to many customers.

Don't know BizTalk? Not a problem; we'll provide you with the necessary education and training. We do require you to have a couple of years' experience with .NET, or with other integration or ESB platforms.

Let me know if you find anyone...

Webcast: Management Tasks Made Simpler in BizTalk Server 2010 – on Channel9
15 October 10 01:39 PM | wmmihaa

On the 29th of September we had yet another fantastic event at the Swedish BizTalk User Group. This time the agenda was the future directions of the Microsoft application platform, which we did together with the Swedish SQL User Group.

Unfortunately only one session was recorded, and that was "Management Tasks Made Simpler in Microsoft BizTalk Server 2010" with me and Paolo Salvatori. The good news is that it has been published on Channel9:



Videos from the BizTalk Conference in Stockholm available on Channel9
15 October 10 10:52 AM | wmmihaa | 3 comment(s)

All the videos from the European BizTalk Conference in Stockholm have been published on Channel9. Thanks to everyone attending, and of course also to Richard, Ewan and Stephen for coming over to Sweden.

All sessions relate to the Applied Architecture Patterns on the Microsoft Platform book that came out just after the conference. It's a great book, and I hope you feel compelled to buy it after you've seen these presentations :)



Day 1 (Sessions from 8th of September)

Title | Description | Speaker
Welcome and Introduction | | Richard Seroter
Choosing The Right Tool in the Application Platform | Discuss the challenge of choosing the right technology for a given situation and present a decision framework for guiding evaluation. | Richard Seroter, Ewan Fairweather & Stephen W. Thomas
Tech Overview: SQL Server | Look at the core components of SQL Server that are used to build applications (e.g. SSIS) and when to use them. | Ewan Fairweather
Tech Overview: BizTalk Server | Discuss what BizTalk is and when to use it. | Richard Seroter
Tech Overview: WCF/WF, Server AppFabric | Highlight key capabilities in WCF and WF and benefits offered by Windows Server AppFabric. | Stephen W. Thomas
Tech Overview: Windows Azure Platform | Discuss Microsoft's cloud offering and best usage scenarios. | Richard Seroter
Pattern #1 – Simple Workflow | Evaluate a scenario that involves aggregating data from multiple sources and presenting a unified response. | Ewan Fairweather

Day 2 (Sessions from 9th of September)

Title | Description | Speaker
Pattern #2 – Content Based Routing | Consider options for effectively transmitting data to multiple systems that perform similar functions. | Richard Seroter
Pattern #3 – Human Workflow with Repair and Resubmit | This video shows using SharePoint 2010 to store customer details, then using Workflow 3.5 to send these details to an AppFabric-hosted Workflow 4.0 workflow service. This workflow service controls the payment collection process and allows for updating information on a user in the same workflow from SharePoint. | Stephen W. Thomas
Pattern #4 – Cross Organization Supply Chain | Evaluate how to build a supply chain to integrate systems in a PO scenario. | Ewan Fairweather
Pattern #5 – Remote Message Broadcast | Demonstrates a scenario where a traditional polling solution is augmented to support real-time updates. | Stephen W. Thomas
Pattern #6 – Complex Event Processing | Addresses click-stream analysis and creating actionable events from user and system behavior. | Richard Seroter
Applied Architecture Patterns on the Microsoft Platform -- The Story Behind the Book
10 October 10 10:11 PM | wmmihaa | 1 comment(s)

All sessions from the European BizTalk Conference will be available on MSDN shortly.

For more information:

How to use the new “FTPS adapter” with BizTalk 2010
26 September 10 11:30 PM | wmmihaa | 1 comment(s)

First of all – there is no new FTPS adapter. The existing FTP adapter has been extended to support FTPS, which is really all you need anyway. However, several publications from Microsoft state that there is a new FTPS adapter, which might be somewhat confusing.

So what is FTPS?

FTPS is FTP over SSL, similar to HTTPS, and support for it was added with IIS 7.5, although it lacks some of the underlying infrastructure. When you visit an HTTPS site, you might be given various warnings if the certificate is not among your trusted ones. If you choose to ignore these warnings (and recommendations), you're given the option of installing the public key in your certificate store. This logic is provided by your browser, and is a helpful way to manage your trusted sites.

The FTPS client (BizTalk in this case), however, requires the public key to be installed manually before it can use it. Once you've installed it, it works like a charm.


Create a Certificate

First you'll need a certificate with a private key. There are several ways to create a certificate, but for the purpose of this sample, I'll create one using the IIS 7.5 manager.

Select the site root in the Connections pane on your left side. In the Action pane on your right, click "Create Self-Signed Certificate".


This will open a new dialog, where you can set a friendly name of your certificate:


When you've clicked Ok, you should be able to see the certificate in the list of Server Certificates. Select your newly created certificate and click "Export" in the Action pane. Set the path and password, and click Ok. This step exports the public key, which we'll use later on.


Enable FTPS on your FTP site

As we now have a certificate created and registered on our server, we can proceed to enable our FTP site to use it. Select your FTP site, and double-click the “FTP SSL Settings” icon on your FTP Home page.


In the SSL Certificate dropdown list, select your certificate. You can choose either to Allow or to Require SSL connections. In this case I’ve chosen to “Require SSL Connections”. Click “Apply” in the Action pane, before you continue.


While on the FTP Home page, you might also want to look over the "FTP Authentication" and "FTP Authorization Rules" settings.

Install the Public Key

Your FTP server is now ready to accept incoming SSL connections, and we can proceed with setting up the client (BizTalk). But first we need to install the public key, which you exported earlier. In my case, I'm logged in as Administrator, which happens to be the same account I use for my host instance. Hopefully this is not the case for you, so you'll need to import the public key as the same account you're running the host instance with. If you are logged in with the service account, simply double-click the certificate and proceed with the default values.

If you’re not logged on as the service account, open a command prompt and write the following:

runas /user:[Domain]\[User] "cmd.exe"

This will open a new command prompt, in which you type the path to where you saved the public key. Hit Enter and proceed through the wizard.

Extract the Thumbprint


Open the Certificate Manager (Start -> Run -> certmgr.msc), browse to the Trusted Root Certification Authorities\Certificates folder, and double-click the certificate. Select the “Details” tab at the top, scroll down to the Thumbprint field, select it, and copy the value.
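If you'd rather not copy the thumbprint from the dialog (it tends to bring hidden whitespace along), you can read it from the store with a small PowerShell fragment. This is just a convenience; the subject filter below is a placeholder you'd replace with your certificate's name:

```powershell
Get-ChildItem Cert:\CurrentUser\Root |
    Where-Object { $_.Subject -like "*[YourCertName]*" } |
    Select-Object Thumbprint
```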


Configure Your BizTalk Port


I assume you are familiar with the normal FTP transport settings, so when you're done configuring the server, port, username and password, paste the thumbprint into the “Client Certificate Hash” property. Set the “Use SSL” property to “true”. Start the port… and you are done!

I’m a superstar and that’s how I do it!


BizTalk Sftp Adapter - New release 1.4 available on CodePlex
21 September 10 02:10 PM | wmmihaa | 1 comment(s)

I've had quite a few requests for proxy support through the years, and eventually someone (no names) got tired of waiting, hired me as a consultant, and forced me to implement it. So I'm happy to say the BizTalk adapter now comes with support for HTTP proxies.

Support has also been added for:

  1. The "%UniversalDateTime%" macro on the Receive Location (rename).
  2. "Leave File" on the Receive Location.
  3. Not throwing 10,000 exceptions to the event log (yes, this is a feature).

Thanks to everyone giving feedback through the CodePlex site.

If you like it, please rate the project (If you don’t like it, you should not feel obligated to do so)

Get it at CodePlex...


You will still need to set the appropriate SSH parameters, such as SSH Host, User name and Remote path. These parameters are sent to the proxy server, which in turn acts as a “man-in-the-middle” and connects to the SSH server using the parameters provided by the adapter.

Use AppFabric Cache to cache your WCF Service response
12 September 10 08:33 PM | wmmihaa | 19 comment(s)

[This post has been updated to support InvokeBegin and InvokeEnd to support BizTalk Receive Locations]

We've just finished the “BizTalk Release Party” in Stockholm, where Stephen W. Thomas, Richard Seroter and Ewan Fairweather held some fantastic sessions. One of the talks was about the AppFabric Cache, formerly known as “Velocity”.

Whenever I come across a new technology I try to put it in some useful context, where I can try it out. This is usually a quite painful process, as I tend to do this long before the technology has reached any sort of mature state. AppFabric Cache is a v1 product and therefore considerably more stable than other products or technologies I've been experimenting with.

While Ewan was presenting the Windows Server AppFabric Cache, I thought it could be interesting to use it for caching outgoing WCF service responses, so that the back-end logic would not be executed if the response was already cached. I'm not sure how useful this scenario really is, but I guess there could be many scenarios where it would be OK not to get the latest version of the information. Some of those situations might be:

  • Where the information is accessed during office hours, and only updated through nightly batches.
  • Where a composite service get bursts of calls that would often be the same.
  • When services expose static data which will seldom change over time.

In my sample I’ll use a fictitious Weather Forecast Service, which I think would qualify as good candidate because:

  • Nobody trusts weather forecasts in the first place, so it doesn't matter if the result is not 100% up to date.
  • The service is frequently called with the same input parameters (Zip code).
  • The service will support a very popular iPhone/Windows Phone 7 application, which in turn will cause an immense load on our back-end systems.



How it works:

WCF: The incoming call from the client is received through the Transport channel, after which it proceeds through an encoder and possibly some other channels before it reaches the Dispatcher.

The Dispatcher is the last step before the request is handed over to the actual service. The Dispatcher is responsible for associating the incoming call with the appropriate operation and then invoking it. 

By creating a custom OperationInvoker you may customize the behavior of how the back-end logic is invoked. 


Windows Server AppFabric Cache provides distributed caching over many servers. It can utilize a cluster of servers that communicate with each other to form a single, unified application cache system. It comes with a decent API, and can be managed using PowerShell.


The Visual Studio solution has three projects:

  • bLogical.CachingExtension – The main project, responsible for the custom behavior.
  • bLogical.WcfWeatherService – A WCF service application, applying the CacheOperationBehavior attribute to indicate that the response of the method should be cached if possible. The service makes a call to a database to pick up the forecast.
  • bLogical.Client – A client tool calling the two services.


The cache implementation:

Using the Windows Server AppFabric Cache is pretty straightforward. I recommend you read Scott Hanselman's post on the subject. The post also covers the installation process.

Basically, you use the DataCache.Put method to add data to the cache, and the DataCache.Get method to retrieve the cached value. Working with the Windows Server AppFabric Cache is similar to using a Hashtable, where you add values along with a key (string) which you can later use to get the value back.


public object Invoke(object instance, object[] inputs, out object[] outputs)
{
    // Serialize all input parameters. The string will be used as the key to the cache.
    string input = GetSerializedKey(inputs);

    // Try to get the return value from the cache.
    object value = this._cacheHelper.Cache.Get(input);

    if (value != null)
    {
        outputs = new object[0];
        return value;
    }

    // Invoke the method.
    value = this._innerOperationInvoker.Invoke(instance, inputs, out outputs);

    // Add the return value to the cache.
    this._cacheHelper.Cache.Put(input, value, new TimeSpan(0, 0, 0, 0, this._timeOut));

    return value;
}


I use the input parameters as the cache key. But as the parameters can be any number of objects, I need to serialize them into an XML string first. I can then use that string as the key, together with the result from the invoked method. As I don't want the value to be cached forever, I also pass in a TimeSpan to indicate the lifetime of the cached object.
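To make the flow concrete, here is a minimal runnable sketch of the same idea in Python (a plain dict with expiry timestamps stands in for the AppFabric cache, and JSON replaces the XML serialization; all names are illustrative, not the actual bLogical.CachingExtension API):

```python
import json
import time

class TtlCache:
    """Stand-in for DataCache: put() stores with a lifetime, get() returns None on miss or expiry."""
    def __init__(self):
        self._store = {}

    def put(self, key, value, ttl_seconds):
        self._store[key] = (value, time.time() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None or time.time() > entry[1]:
            return None
        return entry[0]

cache = TtlCache()
backend_calls = 0

def get_forecast(zip_code):
    # The expensive back-end call we want to avoid repeating.
    global backend_calls
    backend_calls += 1
    return {"zip": zip_code, "forecast": "rain"}

def invoke_cached(inputs, ttl_seconds=600):
    # Serialize all input parameters; the resulting string is the cache key.
    key = json.dumps(inputs, sort_keys=True)
    value = cache.get(key)
    if value is not None:        # cache hit: skip the back end entirely
        return value
    value = get_forecast(*inputs)
    cache.put(key, value, ttl_seconds)  # cache the result with a lifetime
    return value

invoke_cached(["11122"])  # miss: invokes the back end
invoke_cached(["11122"])  # hit: served from the cache
print(backend_calls)      # prints 1
```

The sort_keys argument keeps the key deterministic regardless of dictionary ordering, mirroring why the serialized-parameters string makes a stable cache key.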

Using the caching behavior:

You can use the cache behavior either declaratively in your code:

[ServiceContract]
public interface IWeatherForecastService
{
    [OperationContract]
    [CacheOperationBehavior]
    Forecast GetForecast(string zipCode);
}

…or through configuration:

<behaviors>
  <serviceBehaviors>
    <behavior>
      <serviceMetadata httpGetEnabled="true"/>
      <cachingBehavior timeout="00:10:00" cacheName="bLogical"/>
    </behavior>
  </serviceBehaviors>
</behaviors>

<extensions>
  <behaviorExtensions>
    <add name="cachingBehavior"
         type="bLogical.CachingExtension.CacheElement, bLogical.CachingExtension,
               Version=, Culture=neutral, PublicKeyToken=49c05550fea0c875"/>
  </behaviorExtensions>
</extensions>



I've run two tests: one with and one without caching enabled. I haven't had time to run this in any real test environment, but even though I ran the tests on my local laptop, the results came out pretty clear.

Without Caching (~3 calls/sec):


(Using Visual Studio profiler, I found the database call to take ~200ms)

With Caching (~23 calls/sec):



Download and install:

1. Download the Windows Server AppFabric.

2. Read Scott Hanselman’s post as to how to set it up.

3. Open PowerShell and run the following command to list the available cmdlets:

get-command -module DistributedCacheConfiguration

First you need to grant access to your service account:

Grant-CacheAllowedClientAccount [Your Account]

Continue to create the cache:

New-Cache bLogical

Finally, start the cluster:

Start-CacheCluster
4. Download the sample


Using it with BizTalk:

Follow these steps to use the bLogical.CachingExtensions with BizTalk:

  1. Add a key to the bLogical.CachingExtensions project.
  2. Build the project and add it to the GAC.
  3. Open machine.config and add the extension to the behaviorExtensions section.
  4. Open BizTalk Administration Console
  5. Add a Request/Response ReceivePort > Add a Location
  6. Set the transport to WCF-Custom, and click configure
  7. Set the URI and Binding.
  8. On the Behavior tab, add a ServiceBehavior and select the CacheElement:




BizTalk Performance session at Tech·Ed Europe 2010
28 August 10 11:24 PM | wmmihaa | 4 comment(s)

I just got the news my BizTalk Performance Optimization session was approved for Tech·Ed:

Topic Information

Primary Track
Application Server & Infrastructure

Session Type
Lunchtime Session

Session Title
BizTalk Server Performance: Configuring BizTalk Server for performance

Optimizing and verifying your BizTalk Server installation is not an easy thing to do. The documentation is good but very extensive. This presentation aims to guide you through the most important operations you need to perform in order to boost the performance of BizTalk. The session includes a live demo where these settings are applied, showing how they significantly improve performance.

Major Products or Technologies Covered
Microsoft BizTalk Server 2010, Microsoft SQL Server 2008 R2



Hope to see you there. Let me know you’re coming.

BizTalk Sftp Adapter - New release 1.3.6 available on CodePlex
11 August 10 08:12 PM | wmmihaa

I got great feedback from many users, and have made some updates to the bLogical BizTalk adapter:

Disabling the connection pool (only used for send ports)

Some SSH servers cannot handle the connection pool very well. This can also be a problem if there is a limit on the number of connections a certain user can have. I've therefore been asked to make it possible to disable the connection pool for send ports, and this can now be done by setting the SSH Connection Limit to "0" in the admin console or in BTSNTSvc.exe.config.
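The effect of the setting can be illustrated with a simplified pool sketch (Python, purely illustrative and not the adapter's actual implementation): with a limit of 0, every send opens a fresh connection and closes it on release, so nothing is ever reused and the server never sees pooled idle connections.

```python
class SshConnection:
    opened = 0  # how many real connections have been created

    def __init__(self):
        SshConnection.opened += 1
        self.closed = False

    def close(self):
        self.closed = True

class ConnectionPool:
    def __init__(self, connection_limit):
        self.connection_limit = connection_limit
        self._idle = []

    def acquire(self):
        if self.connection_limit == 0:   # pooling disabled
            return SshConnection()
        return self._idle.pop() if self._idle else SshConnection()

    def release(self, conn):
        if self.connection_limit == 0:
            conn.close()                 # tear down immediately, never reuse
        elif len(self._idle) < self.connection_limit:
            self._idle.append(conn)      # keep it around for the next send
        else:
            conn.close()

pooled = ConnectionPool(connection_limit=5)
a = pooled.acquire(); pooled.release(a)
b = pooled.acquire()                     # same connection, reused
print(a is b)                            # prints True

unpooled = ConnectionPool(connection_limit=0)
c = unpooled.acquire(); unpooled.release(c)
print(c.closed)                          # prints True: closed, never pooled
```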

Logging and tracing

Much more verbose tracing has been added. If you want to save the trace info to a file, you can use a TraceListener (System.Diagnostics):

  1. Enable Tracing in the port configuration.
  2. Open the BTSNTSvc.exe.config (or BTSNTSvc64.exe.config) file and add the following in the Config section:
    <system.diagnostics>
      <trace autoflush="false" indentsize="4">
        <listeners>
          <add name="BTSListener"
               type="System.Diagnostics.TextWriterTraceListener"
               initializeData="bLogical.Shared.Adapters.Sftp.log" />
          <remove name="Default" />
        </listeners>
      </trace>
    </system.diagnostics>

Public key authentication

There was a problem using identity file authentication, as the password was sent as an empty string. This worked for some but not all servers. The new version always sets the password to null, which should work for all servers.


%UniversalDateTime% added to the list of supported macros


  • The temp folder and remote permissions are no longer required properties.
  • Empty files are no longer picked up or processed.


Special thanks to John Vestal at CSC and Antti.

Download the adapter from CodePlex

BizTalk User Group Sweden – Bonus Session (for your manager)
23 July 10 06:34 PM | wmmihaa

As you should know by now, we are hosting a BizTalk Release Party in Stockholm on the 8th-9th of September, featuring Richard Seroter, Stephen W. Thomas and Ewan Fairweather. All the sessions will be technical (level ~300) and target developers/architects. The event is almost full (150 attendees); however, Richard will do a one-hour “bonus session” about the hype around cloud computing, which targets CTOs, CIOs, senior managers and similar roles.

This will be a great event for your Manager, and we urge you to recommend it. The session starts at 13:15 on the 9th of September.

Sign up here:


Welcome to our two days release party of BizTalk 2010
21 June 10 11:48 PM | wmmihaa | 8 comment(s)

As you may already know, we are hosting a fantastic event in Stockholm, as we celebrate the 10-year anniversary of BizTalk as a product along with the release of the new 2010 version.

At the same time, Richard Seroter, Stephen W. Thomas, Ewan Fairweather, Michael Sexton and Rama Ramani are releasing a book titled Applied Architecture Patterns on the Microsoft Platform. The book tackles 13 real-world scenarios and applies a decision framework for deciding the best Microsoft application platform technology for the problem at hand. In each chapter, a use case is outlined, a pattern is identified, multiple candidate architectures are evaluated, and a solution is built based on the best platform technology.

Even though this is not a BizTalk book, it addresses many patterns, techniques and products related to BizTalk. And we thought it’d be a great idea to invite the authors over to the The Swedish BizTalk User Group, and do a TWO day event with 10 sessions! Each session will relate to a chapter in the book, addressing a specific real-world scenario.

Needless to say, this event targets a much broader audience than we normally do, covering the full suite of applicable Microsoft technologies, such as Windows Workflow, Windows Communication Foundation, Windows Server AppFabric, SQL Server Service Broker, SQL Server Integration Services, BizTalk Server, StreamInsight, Windows Azure and more. The goal is to sufficiently explain these technologies and their optimal use cases in a way that helps you make better choices in your solution design.

If you have the opportunity to visit Sweden on the 8th and 9th of September, you are more than welcome to sign up for the event. The event is free, and if you book your flight now, it's still pretty cheap (~€100).

For more information and sign-up:

Let us know you're coming.




    The opinions expressed herein are my own personal opinions and do not represent my employer's view in anyway.