New version of the SFTP Adapter
09 February 14 10:42 PM | wmmihaa

The bLogical SFTP adapter has been around for a while now, since 2008 to be exact. It has been one of the most popular BizTalk-related downloads on CodePlex, and even though BizTalk 2013 now ships with an SFTP adapter, people still use it since it is quite rich in features, such as proxy support and more granular scheduling capabilities.

There has been lots of feedback, most of which I believe I’ve implemented or fixed over the years. However, there was one issue I never got around to fixing: the receive location 'freeze' issue.

Luckily Greg Shap from New Zealand came along, fixed the issue and uploaded a patch on CodePlex. I’ve since added Greg to the project, and he has addressed it along with an impressive list of other fixes and changes:

  • Ensure thread-safe write access to SftpHostFiles.config
  • Resolve a receive location 'freeze' issue where files would stop being picked up until restarting the host instance
  • Resolve a zero length file creation issue
  • Correct a partial file read issue when consuming large files
  • Add X.509 identity certificate support
  • Add TransmitLocation context property schema items to fully support all static send port behaviours on dynamic send port

Download it from Codeplex

I want to take this opportunity to thank Greg for his work, and also apologise for not getting this post up earlier…

Greg started working in IT about 20 years ago in Auckland, New Zealand as a junior developer. Eventually, he gained customer exposure doing on-site system installations and upgrades. Having newly acquired soft skills, platform and development experience, the next natural career move seemed to be system integration. This is the role he has filled for the past 15 or so years in varying capacities. Five years ago he crossed the ditch with his wife to Sydney, Australia, and he now specialises in BizTalk development and implementation.


BizTalk Database Restore cmdlets – Now on Codeplex
09 February 14 10:21 PM | wmmihaa

Since I published the PowerShell cmdlets to restore BizTalk databases, I have received plenty of feedback and suggestions. Lately my new friend and colleague Farhan found use for them and got started fixing a few bugs I had lying around.

Now that it’s all up and running, I thought it would be a good idea to move it to CodePlex.


Thank you Farhan.

Azure Mobile Services API support for Xamarin
01 November 13 05:01 PM | wmmihaa

This might be one of my shorter posts, but I thought it was worth sharing. I’ve been working on a game lately that depends heavily on Azure Mobile Services. Microsoft has been kind enough to provide us with Azure Mobile Services for Xamarin, which can be found here. Sadly though, it does not include support for APIs.

You can do this:

var profiles = App.MobileService.GetTable<UserProfile>();

But you can’t do this:

var myProfile = App.MobileService.InvokeApiAsync<UserProfile>("me");

I find APIs much more useful than working with the tables, so I was a bit bummed out when I found out they weren’t supported. I solved this by simply using HttpClient to make the calls to the APIs. However, that didn’t work too well once authorization was enabled.

It took a bit of work with Fiddler to find out what headers needed to be used. So I updated the calls to include these headers:

HttpContent content = new StringContent(payload);
content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
content.Headers.Add("x-zumo-application", Client.ApplicationKey);
content.Headers.Add("x-zumo-auth", Client.CurrentUser.MobileServiceAuthenticationToken);

var response = await client.PostAsync(url, content);

That did the trick, and I could now call APIs whose permissions were set to something other than Everyone.

But I was still bothered by the lack of support for APIs, since that meant I couldn’t share my code across platforms. Of course I could have used the HttpClient approach above with my Windows Phone clients, but that didn’t seem right.

So I spent some time adding the missing InvokeApi* methods as extension methods. You can download the code here:

To use it, simply download the MobileServiceClientExtensionMethods.cs file and add it to your project. You should update the namespace, and then add it to a using statement in the class where you want to make the call.
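
If you’d rather roll your own, the core of such an extension method looks roughly like this. This is a minimal sketch, not the contents of the downloadable MobileServiceClientExtensionMethods.cs file; it assumes the SDK’s MobileServiceClient exposes ApplicationUri and ApplicationKey (as the versions from that era did) and that Json.NET is available.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.MobileServices;
using Newtonsoft.Json;

public static class MobileServiceClientExtensions
{
    // Sketch of an InvokeApiAsync extension method for custom API endpoints.
    public static async Task<T> InvokeApiAsync<T>(this MobileServiceClient client,
                                                  string apiName, object body = null)
    {
        using (var http = new HttpClient())
        {
            // Custom APIs live under /api/ on the mobile service.
            // ApplicationUri is assumed to be the base address of the service.
            var url = new Uri(client.ApplicationUri, "api/" + apiName);

            // The same headers identified with Fiddler above.
            http.DefaultRequestHeaders.Add("x-zumo-application", client.ApplicationKey);
            if (client.CurrentUser != null)
                http.DefaultRequestHeaders.Add("x-zumo-auth",
                    client.CurrentUser.MobileServiceAuthenticationToken);

            HttpResponseMessage response;
            if (body == null)
            {
                response = await http.GetAsync(url);
            }
            else
            {
                var content = new StringContent(JsonConvert.SerializeObject(body));
                content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
                response = await http.PostAsync(url, content);
            }

            response.EnsureSuccessStatusCode();
            return JsonConvert.DeserializeObject<T>(await response.Content.ReadAsStringAsync());
        }
    }
}

With something like that in place, the call from the beginning of the post becomes await App.MobileService.InvokeApiAsync<UserProfile>("me");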

HTH

Mikael

Securing Azure BizTalk Services using ADFS
01 September 13 02:27 PM | wmmihaa

In all fairness, this post relates to Azure Service Bus in general and not specifically to Windows Azure BizTalk Services (WABS).

When is this relevant?

Azure Service Bus and WABS both rely on Azure Access Control Service (ACS) to authenticate users before calling the service. This type of authentication is often referred to as Federated Security. In most samples of how to interact with Service Bus or WABS, the client first calls ACS (using user name and password) to receive a token, which is later used upon calling the actual service.


Using an ACS Service Identity (service user) makes sense when the client is a system, where you’d probably let each system have its own Service Identity.

But what if the client is a user, such as an employee or customer, perhaps calling in from a mobile device?

In such cases, you’d have to keep the Service Identity stored on the device/application. Apart from the security problems related to someone hacking the (Android) phone and then using the credentials for evil purposes, you can’t do authorization, as every user is using the same credentials.

In such cases, you’d probably want to use some other directory than the one provided in ACS. In most cases Active Directory is likely to be the preferred choice. 

Authentication

Token Based Authentication can be a bit hard to understand, so I’ll try to explain it using an analogy.


Imagine you’re going on a vacation to a different country, and upon facing the scary customs officer, you smile and give him your VISA card. Take my word for it, it won’t work, but the interesting part is why. Despite the anger he or she will direct towards you, it’s not about you at all. In fact, they will never trust YOU!

The problem is that they don’t trust the VISA organization to vouch for your identity.

So we need to go to a trusted authority and ask them to vouch for our identity; in Sweden that would be the local police department. They won’t trust you either, and will ask for some other proof, but more on that later. Let’s just assume all goes well and they issue you a passport.


The passport holds information about you; some of it is public, some may be hidden or encrypted, but more importantly it is signed by a trusted authority.

With a passport in your hand, the officer now trusts your identity (Authentication), and he will proceed to check your criminal records and ask a bunch of questions to determine whether he will allow you to enter the country (Authorization).


Leaving the analogy and getting back to Federated Security, it’s pretty much the same thing. The Swedish police department plays the role of the Issuer (ACS), the customs officer plays the role of the Relying Party Application (WABS or Service Bus), and the passport is of course the Token.

The difference between the analogy and Service Bus authentication is that while customs trusts many different issuers, Windows Azure Service Bus ONLY TRUSTS ACS!

However, ACS can be configured to trust other issuers, also referred to as Identity Providers. If we go back to the analogy and the local police department that issued the passport, why did they trust you? There are actually a number of ways to prove your identity, but let’s assume you’d use an ID. This is sometimes referred to as a chain of trust.

This is where Active Directory comes into play. Active Directory can be configured with Active Directory Federation Services (ADFS). ADFS is an issuer, and although Service Bus does not trust your ADFS, ACS can be configured to.

By adding your ADFS as an Identity Provider in ACS, you can authenticate yourself with ACS using a token provided by ADFS! ACS will then provide you with a new token which can be used to authenticate yourself with Service Bus.


Step 1: Ask ADFS to issue a token for ACS using Active Directory user name and password.

Step 2: Ask ACS to issue a token for Service Bus, using the token issued by ADFS.

Step 3: Call the service using the token provided by ACS.

Authorization

Although it won’t be sufficient for us, it’s worth pointing out that ACS provides some level of authorization, which you can read more about here. ACS cannot, however, provide authorization based on AD group membership, as the group membership claim comes as a single comma-separated string containing all groups:

Eg: CN=Sales,CN=Employees,…,…,…,…,…,…,…,CN=Users,DC=MyDomain,DC=net
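
In plain code, extracting the group names from that string boils down to something like this (a small sketch; the claim value is a made-up example):

using System;
using System.Linq;

class GroupClaimDemo
{
    static void Main()
    {
        // The single group claim value, as delivered by ADFS via ACS.
        var claim = "CN=Sales,CN=Employees,CN=Users,DC=MyDomain,DC=net";

        // Keep only the CN= parts; those are the group (and container) names.
        var groups = claim.Split(',')
                          .Select(part => part.Split('='))
                          .Where(kv => kv[0] == "CN")
                          .Select(kv => kv[1]);

        Console.WriteLine(string.Join(", ", groups)); // Sales, Employees, Users
    }
}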

Before we dive into BizTalk Services, I’d like to point out a couple of “issues” you might encounter. In the case where your service is a REST service (as with WABS), you’ll face two problems, one small and one big. Both problems relate to the fact that Service Bus will remove your token after it has been verified. In other words, the token is not for you!

The little problem can be worked around by adding the token twice: once using the “Authorization” header, and once using a custom header.

The potentially bigger issue relates to the HTTP header itself, or rather its size. On IIS 7 that limit is 16 KB, but it might differ depending on the version of IIS. By propagating the group membership claims from ADFS to the token issued by ACS, the token can get pretty big (depending on the number of groups the user is a member of). Add the token twice, and you might exceed the limit. The good news is that this limit is configurable, but the problem can be difficult to identify, as it might not occur for all users.

Anyway, by adding the token twice we can now evaluate group membership and authorize the user within our service (SB) or itinerary (WABS).

The Setup

The setup involves many steps, some of which I’ll assume you have already done. For instance, it’s outside the scope of this post to set up ADFS and install the WABS SDK.

1. The Itinerary

As the overall sample is complicated enough, we’ll stick with a very simple itinerary.


The itinerary receives the message using a One-Way Bridge and routes it to one of the queues, depending on whether the user is a member of any of the specified groups.

2. Extracting the HTTP Custom Header

As described earlier, we need to add the token twice: once for Service Bus to do authentication, and once for us to do authorization. To do this, double-click the One-Way Bridge to open the bridge configuration.


Select either of the two Enrich stages, and click Property Definition in the Properties window. Add a new property definition, and set the values as follows:


I will call my custom header “AUTH”; if you’d rather use a different name, just change the Identifier to whatever works for you.

3. Create a Message Inspector

Now that we have the token, we need to extract the group claim. To do this we’ll need to create a Message Inspector. 

Follow these steps to create your project: http://msdn.microsoft.com/en-us/library/windowsazure/dn232389.aspx

Here is the code I used:

public class AuthorizationInspector : IMessageInspector 
{
    const string GROUPCLAIM = "http://schemas.xmlsoap.org/claims/Group";

    /// <summary>
    /// The name of the custom HTTP header, Eg. "AUTH"
    /// </summary>
    [PipelinePropertyAttribute(Name = "AuthToken")]
    public string AuthToken { get; set; }
    /// <summary>
    /// A comma separated list of authorized groups, Eg. Sales, Marketing 
    /// </summary>
    [PipelinePropertyAttribute(Name = "InGroups")]
    public string InGroups { get; set; }
        
    /// <summary>
    /// Processes the message
    /// </summary>
    /// <param name="message"></param>
    /// <param name="context"></param>
    /// <returns></returns>
    public Task Execute(IMessage message, IMessageInspectorContext context)
    {
        try
        {
            // Get the incoming SWT token from the promoted property
            var token = message.GetPromotedProperty(AuthToken);

            if (token == null)
                throw new Exception(AuthToken + " token is null");

            var requestedGroups = InGroups.Split(',');
            var claims = GetClaims(token.ToString());

            // Keep only the CN= entries of the group claim
            var groups = (from c in claims[GROUPCLAIM].Split(',')
                          select new GroupEntry(c)).Where(g => g.Type == "CN");

            // True if the user belongs to at least one of the requested groups
            var isInGroup = groups.Any(g => requestedGroups.Contains(g.Name));

            return Task.Factory.StartNew(() =>
            {
                message.Promote("IsInGroup", isInGroup);
            });
        }
        catch (Exception)
        {
            // Rethrow, preserving the original stack trace
            throw;
        }
    }
}

* The GetClaims method and the GroupEntry class are part of the downloadable sample…

Compile your project and go back to your Bridge configuration. Select the Enrich stage again (outer rim) and click On Exit Inspector in the Properties window.

Type the FQN of your Inspector class in the Type field, Eg:

InspectorsLibrary.AuthorizationInspector, InspectorsLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=a1cd373975c6a197

 


Our newly created Inspector has two properties:

  • AuthToken – The name of the custom token we’ll use
  • InGroups – The authorized groups Eg: Sales,Marketing,Finance

Build and deploy your bridge.

4. Configure ADFS to issue tokens for ACS

Open the Azure Management Portal and browse to Active Directory. Select the namespace you’re using for WABS, and select Manage to open the ACS portal. At the bottom of the navigation pane, click Application Integration, and copy the WS-Federation Metadata URI. This document includes the realm(s) of the ACS namespace, along with public keys and other details needed to configure the Relying Party Trust in ADFS.

Next open the ADFS Manager and browse to the Relying Party Trusts node. Right-click the node, select Add Relying Party Trust…, and click the Start button.


Keep the “Import data about…” option, and paste in the ACS WS-Federation Metadata URI. Click Next four times and then Finish (leave the checkbox checked to open the claims configuration).

In the Edit Rule dialog, click Add Rule, and Next. Give it a name and set the Attribute Store to Active Directory.

Select the “Is-Member-Of-DL” attribute and map it to the Groups claim.


Click Ok.

In the ADFS Manager, browse to the “Endpoints” node. Make sure the usernamemixed endpoint is enabled. Also make a note of the URI together with the FederationMetadata URI.

5. Adding ADFS Identity Provider in ACS

Go back to the ACS portal, and click Identity Providers. Click Add and select WS-Federation identity provider and Next.

Give it a name and set the login text (although you will never use it). Paste or type the URI to the ADFS FederationMetadata. I ran into some problems here and needed to copy the file locally and add it using the Browse option. This file includes all configured claims from ADFS (among many other things).

In the Used By section, deselect all applications but your WABS application.

To propagate claims from the ADFS token to the ACS token, we need to do one more step. Click the Relying Party Applications node and select your WABS application. At the bottom of the page, click the selected Rule Group. Click the Generate button and select your new Identity Provider. Click Save.

6. Try it out!

Kudos to you if you made it this far!

If you download the sample, you’ll find the MessageSender project which I originally got from here. I’ve made some modifications to make it work with ADFS. It generally comes down to two methods at the end of the MessageSender class.

  • RequestSamlToken – Uses a WSTrust channel to get the SAML token from ADFS (a rough sketch of this call follows below).
  • RequestSwtWithSamlToken – Similar to the GetAcsToken method, but uses an assertion to pass along the SAML token rather than username/password as with GetAcsToken.
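
For orientation, the ADFS leg of that flow boils down to something like the sketch below. This is not the sample’s exact code; it uses the WS-Trust classes that ship with .NET 4.5, and the endpoint address, realm and credentials are placeholders.

using System.IdentityModel.Protocols.WSTrust;
using System.IdentityModel.Tokens;
using System.ServiceModel;
using System.ServiceModel.Security;

static GenericXmlSecurityToken RequestSamlTokenSketch(string userName, string password)
{
    // Placeholders - use your own ADFS usernamemixed endpoint and ACS realm.
    var adfsEndpoint = "https://adfs.mydomain.net/adfs/services/trust/13/usernamemixed";
    var acsRealm = "https://mynamespace-bts.accesscontrol.windows.net/";

    var binding = new WS2007HttpBinding(SecurityMode.TransportWithMessageCredential);
    binding.Security.Message.ClientCredentialType = MessageCredentialType.UserName;
    binding.Security.Message.EstablishSecurityContext = false;

    var factory = new WSTrustChannelFactory(binding, new EndpointAddress(adfsEndpoint))
    {
        TrustVersion = TrustVersion.WSTrust13
    };
    factory.Credentials.UserName.UserName = userName;
    factory.Credentials.UserName.Password = password;

    var rst = new RequestSecurityToken
    {
        RequestType = RequestTypes.Issue,
        AppliesTo = new EndpointReference(acsRealm),
        KeyType = KeyTypes.Bearer
    };

    // The issued SAML token is then posted to ACS (RequestSwtWithSamlToken)
    // to get the SWT that is finally used against Service Bus / WABS.
    return factory.CreateChannel().Issue(rst) as GenericXmlSecurityToken;
}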

download sample

 

Related links:

Federated Authentication for Azure Service Bus

Azure Service Bus

BizTalk Services

REST Start Kit for BizTalk Server
28 May 12 12:45 AM | wmmihaa

The REST Start Kit for BizTalk Server is about providing support for GET, POST, PUT and DELETE operations, for both Receive and Send ports, together with support for both XML and JSON.

In a former post, I showed how to expose REST services from BizTalk. However, I never got it to work for GET operations and did not cover consuming other REST services through Send Ports. Consuming REST services was, however, greatly covered by Nitin Mehrotra in his post “Invoke ReSTful Web Services with BizTalk Server 2010”. In fact, I’ve included Nitin’s code in this solution to complete the REST Start Kit for BizTalk Server.

Is this post interesting to me?

We are seeing a broader diversity among clients, hosted on various smart devices and running on different runtimes. Common to all of these is that applications are becoming commodities with a relatively short life cycle. These applications need to be developed rapidly, using frameworks that can be reused on other platforms. These and other factors are reasons we see a broad adoption of HTML5 and JavaScript today.

When developing these kinds of applications, REST-based services are the preferred choice by far. Not only because they’re easy to consume using jQuery, but equally important is the fact that REST is lightweight. Bandwidth is not infinite on mobile devices!

Is this important to BizTalk? Yes it is! BizTalk Server is Microsoft’s middleware platform, and as such it should handle REST. Arguably, many web-based applications are built together with the service layer in the same application. But what happens when the client needs to access a back-end system like SAP or Oracle EBS? BizTalk can handle these with ease; however, it can’t bridge them to REST.

I’ve tried my best to make the start kit easy to use. However I do realize that many of the topics mentioned in the post are outside the comfort zone for many. But if you only care about HOW to use it, just skip the “How it works” section of the post.

What is REST?

I assume you already know what REST is, but nevertheless… Representational State Transfer (REST) is a lightweight architectural style introduced by Roy Fielding around the year 2000. REST exposes services over HTTP, and a service can receive its input either through the URI, as in the case of a GET or DELETE operation, or through the payload (POST and PUT).

HTTP verbs are essential to REST, as REST is focused on entities or resources. For instance, you wouldn’t expose a DeleteCustomerOrder method using REST. Instead you’d let the client access your resources using the URI, e.g. http://somedomain.com/Customers(1234)/Orders. If the user were to delete an Order (with an id of 5678), he or she would call http://somedomain.com/Customers(1234)/Orders(5678) using the HTTP DELETE verb. Using the GET verb would return the actual order (given that it was not already deleted). If the resource was deleted (or never created), the user should receive a 404 status code.

POST (and PUT) work a bit differently, as they receive their input through the payload. This behavior is similar to SOAP (which only uses POST); however, REST does not require a specific format (as opposed to SOAP, which uses the SOAP envelope). In fact, you don’t even have to use XML. Arguably, most clients call REST services using the JSON format.

If you don’t know about JSON, it’s a lightweight data format, commonly used by JavaScript and jQuery. Apart from being less verbose than XML, it can be parsed to an object on the client, which makes it easier to navigate (as opposed to using XPath).

As such, REST is all about HTTP, and works very much like the architectural principles of the World Wide Web. This is not an accident, as Roy Fielding was one of the principal authors of the HTTP specification.

Many more things could (and should) be said about REST (and RESTFul services), but this is not the place. However, it’s important to point out the challenges we face using BizTalk:

1. BizTalk requires a Message!

This is somewhat problematic, as all parameters for a GET or DELETE request are passed along with the URI, not the payload. Because of this, we need to intercept the request and create a “real” message using the parameters passed in the URI. Using dynamic URIs reveals yet another challenge, as URIs are generally static in BizTalk. This is where the WebHttpBinding comes into play, as it lets us define a base URI that our service (Receive Location) will listen on, regardless of what follows the base URI, e.g. http://somedomain.com/Customers(1234)

2. BizTalk only accepts POST!

This was the most difficult problem to solve, and I’ll get into more detail on this later. But the basic principle is that upon creating the message described above, it will add the necessary metadata to the message to make BizTalk think this is a POST message.

3. BizTalk only understands XML!

If the consumer sends a GET request to the service, it also passes along information in the HTTP header, letting us know what format is expected. This is done through the “Accept” header, and could be either “application/xml” or “application/json” (there are others, but XML and JSON are the only ones supported by the REST Start Kit).

In the case of a GET request, the outgoing message gets converted from XML to JSON using JSON.Net. If the consumer sends a POST request, the header could state Content-Type: application/json. If that is the case, the incoming request will also be converted.

How it works

To make use of the REST Start Kit for BizTalk, you need to expose your service or consume another REST service using the WCF-Custom adapter with the binding set to WebHttpBinding. In addition to selecting the binding, you also need to add a behavior: BizTalkRESTRequestHandler for the Receive Location and BizTalkRESTTransmitHandler for the Send Port. These behaviors take care of the underlying plumbing needed to make BizTalk accept REST-based messages. This is all you have to do.

Exposing REST services (Receive Locations)

GET and DELETE

Regardless of what HTTP verb is used, BizTalk only exposes one usable method, the “TwoWayMethod”. This is because BizTalk does not really care about the actual message until it hits the pipeline. The TwoWayMethod does, however, require HttpRequestMessageProperty.Method to be “POST”.

In order to send anything but a POST request to BizTalk, we need to intercept the message and either convert it to a POST or create a new message (GET & DELETE) and set HttpRequestMessageProperty.Method to “POST”. This needs to be done in the SelectOperation method of an IDispatchOperationSelector behavior. As we need to use the WebHttpBinding, the behavior I’ve created inherits from WebHttpDispatchOperationSelector.
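
Conceptually, the interception looks something like the sketch below. It is not the actual BizTalkRESTRequestHandler code (which also builds the bizTalkWebHttpRequest message from the URI), just an illustration of where the verb gets rewritten.

using System.ServiceModel.Channels;
using System.ServiceModel.Description;
using System.ServiceModel.Dispatcher;

public class RestToPostOperationSelector : WebHttpDispatchOperationSelector
{
    public RestToPostOperationSelector(ServiceEndpoint endpoint) : base(endpoint) { }

    protected override string SelectOperation(ref Message message, out bool uriMatched)
    {
        // Grab the HTTP properties of the incoming request.
        var httpProp = (HttpRequestMessageProperty)message.Properties[HttpRequestMessageProperty.Name];

        if (httpProp.Method == "GET" || httpProp.Method == "DELETE")
        {
            // ...here the real behavior builds a bizTalkWebHttpRequest message
            // from the URI and replaces the (empty) request body...

            // Make BizTalk believe it received a POST.
            httpProp.Method = "POST";
        }

        uriMatched = true;
        return "TwoWayMethod"; // the single two-way operation BizTalk exposes
    }
}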

As said before, a GET or DELETE request passes its parameters through the URI. You may have a base URI (declared on the Receive Location) set to http://localhost:8080/MyService, while the consumer may call your service using http://localhost:8080/MyService/Bookings/2012/11, expecting all bookings from November 2012. In this case the BizTalkRESTRequestHandler behavior parses the URI and passes a BizTalkWebHttpRequest message to BizTalk that would look like this:

<ns0:bizTalkWebHttpRequest 
  method="GET" 
  xmlns:ns0="http://bLogical.RESTSchemas.BizTalkWebHttpRequest/1.0">
  <ns0:params>
    <ns0:param>2012</ns0:param>
    <ns0:param>11</ns0:param>
  </ns0:params>
</ns0:bizTalkWebHttpRequest>

On the other hand, the user might use a more complex URI, such as http://localhost:8080/MyService/Employees?firstname=Caren&lastname=Smith

In this case you need to instruct the behavior how to parse the URI. If you’ve ever created a REST service using only WCF, you’re probably familiar with the UriTemplate class, which is also used by the BizTalkRESTRequestHandler behavior. In the sample above, the UriTemplate could be:
Employees?firstname={fname}&lastname={lname} 

Which in turn would give you BizTalkWebHttpRequest looking like this:

<ns0:bizTalkWebHttpRequest 
  method="GET" 
  xmlns:ns0="http://bLogical.RESTSchemas.BizTalkWebHttpRequest/1.0">
  <ns0:params>
    <ns0:param name="fname">Caren</ns0:param>
    <ns0:param name="lname">Smith</ns0:param>
  </ns0:params>
</ns0:bizTalkWebHttpRequest>
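
The template matching itself is standard .NET; the following sketch shows what the behavior has to work with when it builds those named params (the URIs are the ones from the example above):

using System;

class UriTemplateDemo
{
    static void Main()
    {
        var baseUri  = new Uri("http://localhost:8080/MyService");
        var template = new UriTemplate("Employees?firstname={fname}&lastname={lname}");

        var request = new Uri("http://localhost:8080/MyService/Employees?firstname=Caren&lastname=Smith");
        var match = template.Match(baseUri, request);

        // Prints FNAME: Caren and LNAME: Smith - these become the named
        // params in the bizTalkWebHttpRequest message.
        foreach (string name in match.BoundVariables.Keys)
            Console.WriteLine("{0}: {1}", name, match.BoundVariables[name]);
    }
}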

Lastly, if the consumer requires an XML-formatted response, the response from BizTalk is sent straight back to the consumer. However, if the user expects JSON, the response needs to be converted. The DispatchOperationSelector behavior described above is only executed on the incoming request, so we need to add another behavior, an IDispatchMessageInspector, to handle the outgoing response. This behavior is called BizTalkRESTResponseHandler and is added by the BizTalkRESTRequestHandler. In that sense it’s internal and never exposed to you. Nevertheless, an IDispatchMessageInspector exposes two methods: AfterReceiveRequest and BeforeSendReply. The incoming header is added to the operation context in the AfterReceiveRequest method, and if the “Accept” header is set to “application/json”, the response is converted to JSON and sent back, along with the WebBodyFormatMessageProperty set to WebContentFormat.Json.

PUT and POST

If the client sends a POST or PUT request with the content type set to “application/xml”, nothing is done but passing the message to BizTalk. Only if the content type is set to “application/json” is the message parsed to XML using JSON.Net. If you expect a message to match a BizTalk-generated schema like this:

<ns0:Tweet xmlns:ns0="http://yourns.Tweet">
  <Author>wmmihaa</Author>
  <Text>XML Rocks!</Text>
</ns0:Tweet>

…the JSON message needs to include the namespace, as in:

{"ns0:Tweet":{"@xmlns:ns0":"http://yourns.Tweet","Author":"wmmihaa","Text":"XML Rocks!"}}

Also, all JSON properties will be parsed as XML elements, not attributes!
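
A quick way to see what JSON.Net makes of such a payload is to run it through JsonConvert.DeserializeXmlNode yourself (a small sketch, assuming the Newtonsoft.Json package is referenced):

using System;
using Newtonsoft.Json;

class JsonXmlDemo
{
    static void Main()
    {
        // Properties prefixed with "@" become XML attributes (here the namespace
        // declaration); everything else becomes an element.
        string json = "{\"ns0:Tweet\":{\"@xmlns:ns0\":\"http://yourns.Tweet\",\"Author\":\"wmmihaa\",\"Text\":\"XML Rocks!\"}}";

        var doc = JsonConvert.DeserializeXmlNode(json);
        Console.WriteLine(doc.OuterXml);
        // <ns0:Tweet xmlns:ns0="http://yourns.Tweet"><Author>wmmihaa</Author><Text>XML Rocks!</Text></ns0:Tweet>
    }
}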

Consuming REST services (Send Ports)

As I said before, I’ve “borrowed” the consuming part from Nitin Mehrotra's sample and added support for POST and PUT. To consume a REST service from BizTalk, the experience is pretty much the same as when exposing a service using a Receive Location. Set the Send Port transport to WCF-Custom and the binding to WebHttpBinding. As with the Receive Location, we also need to add a behavior, this time the BizTalkRESTTransmitHandler. This is an IClientMessageInspector. The actions of this behavior differ depending on the SOAP Action Header, which you state in the WCF-Custom Transport Properties:


GET and DELETE

To consume a REST service you have to pass a BizTalkWebHttpRequest message to the adapter.

<ns0:bizTalkWebHttpRequest 
  method="GET" 
  uriTemplate="/Events/{id}" 
  xmlns:ns0="http://bLogical.RESTSchemas.BizTalkWebHttpRequest/1.0">
  <ns0:params>
    <ns0:param name="id">123</ns0:param>
  </ns0:params>
  <ns0:headers>
    <ns0:header name="Content-Type">application/xml</ns0:header>
  </ns0:headers>
</ns0:bizTalkWebHttpRequest>

The UriTemplate, together with your parameters and the Send Port URI, makes up the actual To URI of the message. For instance, if the Send Port URI is set to http://somedomain.com, the message above would be sent to http://somedomain.com/Events/123.
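
Under the covers this is ordinary UriTemplate binding; roughly like the sketch below (not the adapter's actual code):

using System;
using System.Collections.Specialized;

class BindDemo
{
    static void Main()
    {
        var template = new UriTemplate("/Events/{id}");
        var parameters = new NameValueCollection { { "id", "123" } };

        // The Send Port URI acts as the base address.
        Uri to = template.BindByName(new Uri("http://somedomain.com"), parameters);

        Console.WriteLine(to); // http://somedomain.com/Events/123
    }
}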

The response is currently never parsed, since most REST services will respond with XML if asked to…

POST and PUT

Except for adding the necessary headers, the POST and PUT requests do very little. Keep in mind, though, that you will always post a message to the URI set on the Send Port. You may find it tempting to use the same Send Port for all calls to a particular REST service, and although this will work for GET and DELETE, it won’t work for POST and PUT. This is because the behavior has no knowledge of your intentions other than the payload. I could have solved this using SOAP Action Headers or some promoted property, but I didn’t… (sorry).

How to use it

1. Download the bits from CodePlex.

The solution contains five projects:

bLogical.BizTalk.RESTBehavior – Includes the three behaviors:
- BizTalkRESTRequestHandler
- BizTalkRESTResponseHandler
- BizTalkRESTTransmitHandler
bLogical.BizTalk.RESTBehavior.Setup – Installs the bLogical.BizTalk.RESTBehavior assembly to the Global Assembly Cache
bLogical.RESTSchemas – Includes the BizTalkWebHttpRequest schema
BizTalkSampleApplication – Includes six demo orchestrations and bindings
ExternalSampleApplication – A sample REST service used in the Send Port of the BizTalkSampleApplication

2. Register the behavior

In order to see the behaviors in the Select Behavior Extension dialog, you need to register them. This can be done in different places, such as in machine.config or the BTSNTSvc(64).exe.config file. However, I find it more convenient to set it on the Receive and Send handlers in the BizTalk Administration Console, as you’d otherwise need to update the config files on all BizTalk machines (given you have more than one).

To do this, select the WCF-Custom adapter under Platform Settings –> Adapters in the BizTalk Administration Console. Double-click the Receive Handler and click the Properties button. Then click the Import button and browse to the WCF-Custom.BindingsExtension-ReceiveHandler.config found in both the solution folder and the installation folder. Repeat the steps for the Send handler using the WCF-Custom.BindingsExtension-SendHandler.config file.


If you want to register the behaviors in the machine.config file(s), just copy the behavior extensions to the <behaviorExtensions> element in the appropriate files.

3. Install the BizTalkWebHttpRequest schema

The BizTalkWebHttpRequest is part of the bLogical.RESTSchemas project. Update the Application Name in the project properties, and deploy it to BizTalk.

4. Use the behaviors

Receive

  1. Create a two-way Receive Port and Receive Location
  2. Set the Transport to WCF-Custom
  3. Set the URI to something like http://localhost:9090/MyService, and select the webHttpBinding in the Binding tab
  4. Select the Behavior tab, and right-click the EndpointBehavior node on the left, and select Add extension
  5. Select the BizTalkRESTRequestHandler


6. Set the UriTemplate to match your expected incoming request. You can have multiple UriTemplates delimited using a pipe, e.g.

/Person/id={pId} | /Person/firstname={fname} | /Person/lastname={lname}

This would make your service accept incoming GET or DELETE request like these:

http://localhost:8080/myservice/Person/id=123
http://localhost:8080/myservice/Person/firstname=Caren
http://localhost:8080/myservice/Person/lastname=Smith


Send

  1. Create a new Static Solicit-Response Send Port
  2. Set the Transport to WCF-Custom
  3. Click the Configure button and set the URI to the base URI of the service. Eg http://externalservice/service.svc. Select the webHttpBinding in the Binding tab
  4. Select the Behavior tab, and right-click the EndpointBehavior node on the left, and select Add extension
  5. Select the BizTalkRESTTransmitHandler from the list of extensions

5. Install and use the Sample Application

The BizTalkSampleApplication project has six samples:

ConsumeDELETE – Makes a DELETE request to an external REST service (ExternalSampleApplication)
ConsumeGET – Makes a GET request to an external REST service
ConsumePOST – Makes a POST request to an external REST service
ExposeDELETE – Receives an incoming DELETE request
ExposeGET – Receives an incoming GET request
ExposePOST – Receives an incoming POST request

The BizTalkSampleApplication is deployed to an application with the same name. The binding files are included in the project. Before you try out the Consume* scenarios, make sure you’ve started the ExternalSampleApplication (just right-click the RESTService.svc file and select View in Browser).

After you’ve deployed the solution to BizTalk, you need to redirect all FILE ports to the FILEDROP folder (part of the zip file)

All Consume* samples use the same Receive Location, ReceiveConsumeREST, and submit the message to the RESTService.svc in the ExternalSampleApplication.

To test the ExposeGET sample, you can use the browser and hit http://localhost:9999/Events/id=123 (any number will do).

To test the ExposeDELETE and ExposePOST you’ll need to use some other tool like Fiddler.


To use the sample with Fiddler:

  1. Select the Composer tab
  2. Select the Verb and set the URI
  3. Set the Request Headers (sample file included in the solution)
  4. When testing POST, generate an instance of the Event schema in the BizTalkSampleApplication or copy the content from the Event_output.xml file in the FILEDROP folder.

That’s it. Let me know if you run into any issues. If you find it useful TWEET it!

HTH

//Mikael

Exposing JSON/REST endpoints from BizTalk
07 March 12 02:14 PM | wmmihaa

This post is no longer valid. An update can be found here:

 

Solutions for consuming REST services from BizTalk have been around for a while, and Jon Flanders has an excellent post about it. However, very little has been said about exposing REST endpoints, and even less about using JSON. If you don’t know about JSON, it’s a lightweight data format, commonly used by JavaScript and jQuery. Apart from being less verbose than XML, it can be parsed to an object on the client, which makes it easier to navigate (as opposed to using XPath). This can come to the rescue of UI devs who apparently don’t understand XPath ;)


I haven’t yet been in a situation where I’ve had to expose REST/JSON endpoints from BizTalk, but as Kent Weare was being heckled by Bil Simser (MS Word MVP), I was eager to help out.

 


I began by creating a custom WCF MessageInspector. My plan was to parse the incoming JSON message to an XML message, and also to change the HTTP verb from GET to POST if the client sent a GET request (BizTalk requires POST). As it turns out, the HTTP verb/method cannot be changed in an IDispatchMessageInspector. If it were to be changed, it would have to happen earlier in the channel stack.

Prior to the MessageInspector is the OperationSelector, so I went on to create one implementing the IDispatchOperationSelector interface. After moving the logic from the inspector to the SelectOperation method of the OperationSelector, I ran into a new problem: the method was never called. It seems BizTalk adds its own OperationSelector through its HostFactory. As I wanted to host the Receive Location in an In-Process host (no IIS), making my own HostFactory wouldn’t work either…

I was forced to dig even deeper into the WCF channel stack. The next step was a custom encoder. Luckily I found a sample in the SDK which was pretty easy to use. The only problem was that I couldn’t access the HTTP verb. However, after all this, I was willing to accept that trade-off.

Next up was the serialization and deserialization of JSON. Bil Simser pointed me to the JSON.Net project on CodePlex, which made it very easy:

// Requires: using System.Xml; and using Newtonsoft.Json;
XmlDocument doc = new XmlDocument();
doc.LoadXml(xmlString);

// The last argument (true) omits the root object from the generated JSON.
string jsonString = JsonConvert.SerializeXmlNode(doc, Newtonsoft.Json.Formatting.None, true);

How to use the sample:

  1. Download the sample here
  2. Either run the bLogical.JsonXmlMessageEncoder.Setup.msi or build and add the bLogical.JsonXmlMessageEncoder to the global assembly cache (GAC).
  3. Open BizTalk Administration Console. Browse to Adapters and right-click the WCF-Custom Receive Handler. Select Properties.
  4. Click the Import button, and select the WcfExtensions.config file found in the project.
  5. Deploy the FortisAlberta project to BizTalk.
  6. Import the FortisAlberta.BindingInfo.xml to the FortisAlberta Application
  7. Start the FortisAlberta Application.
  8. Run the WebApplication1 project, and submit an OutageReport.

How to configure a Receive Location manually

  1. Add a Request/Response Receive Port and Location.
  2. Set the transport to WCF-Custom (no need to host it in IIS).
  3. Set the binding to customBinding, and remove the existing binding elements.
  4. Add the "jsonXmlMessageEncoder" and the "http transport" extensions.
  5. Enable the port.
  6. You can use the XmlToJSONConverter that comes with the project to generate the expected JSON format from an XML instance, or use any of the online conversion sites like this one.


How to call the service

<script type="text/javascript">

    jQuery.support.cors = true;
    var jsonRequest = '{"Tweet":{"Author":"wmmihaa","Text":"BizTalk Rock!"}}';

    function ReportOutage() {
        $.ajax({
            type: 'POST',
            url: "http://yourdomain.com/submitTweet",
            data: jsonRequest,
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            success: function (msg) {
                alert(msg);
            },
            error: function (xhr, ajaxOptions, thrownError) {
                alert('error: ' + thrownError);
            }
        });
    }
</script>

Please note the JSON above:

{"Tweet":{"Author":"wmmihaa","Text":"XML Rocks!"}}. This is going to be translated to:

<Tweet><Author>wmmihaa</Author><Text>XML Rocks!</Text></Tweet>

As there is no namespace, you’d need to add one in the receive pipeline. Alternatively, you could add the namespace in the JSON:

{"ns0:Tweet":{"@xmlns:ns0":"http://yourns.Tweet","Author":"wmmihaa","Text":"XML Rocks!"}}

Which would come out as:

<ns0:Tweet xmlns:ns0="http://yourns.Tweet">
  <Author>wmmihaa</Author>
  <Text>XML Rocks!</Text>
</ns0:Tweet>

How to call the service without using parameters

function ReportOutage() {
    $.ajax({
        type: 'POST',
        url: "http://yourdomain.com/submitTweet",
        data: '{}', // empty parameter
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        success: function (msg) {
            alert(msg);
        },
        error: function (xhr, ajaxOptions, thrownError) {
            alert('error: ' + thrownError);
        }
    });
}

Empty parameters are converted to a message that looks like this: <EmptyJsonMessage/>. As you won’t have an equivalent schema in BizTalk, you can’t parse it using an XmlReceive pipeline. If you want to process the message in an orchestration, you’d need to set the message type of the incoming message to System.Xml.XmlDocument.

Restrictions

  1. Does not support HTTP GET.
  2. Does not support Uri parameters, Eg. http://server/Customers?id=16.
  3. The encoder supports both XML and JSON, but not at the same time. It will be restricted to the media type set on the encoder.
  4. In this sample the response cannot be handled in a streaming manner. If the size of the response message is bigger than what is read from the client, this is likely to cause a problem. I haven’t experienced this myself, but if you run into it, contact me and I’ll look into it.

HTH    

Azure Service Bus EAI/EDI December 2011 CTP – Content Based Routing
18 December 11 03:11 PM | wmmihaa

In this blog post we are going to look at how to manage routing in the new Azure Service Bus EAI CTP.

As a scenario, I’m going to send a request for information (RFI) to some of my fellow MVPs. To do that, I’m going to create a One-Way Xml Bridge to receive the messages. After receiving the RFI message, I intend to route it to one of three queues.

1. Create a ServiceBus project

If you haven’t already downloaded the SDK, you can do this here. After you’ve installed the SDK, you can sign in to the labs environment using a Windows Live ID.

Open Visual Studio 2010, and select Create Project. In the list of project templates, select ServiceBus, and Enterprise Application Integration. Give it a name and click Ok.


2. Create a Message Type

Right-click the project and select Add->New Item. At this time there are two types of artifacts you can add: Schemas and Maps. Select Schema and set an appropriate name; in my case I set the name to RFI.xsd. Continue building up your schema. Notice that you don’t have to promote any nodes as you’d have to do in BizTalk.


3. Designing the Bridge

Double-click the BridgeConfiguration.bcs and drag an Xml One-Way Bridge from the toolbox to the canvas. This is going to be the entry point to your process, similar to a Receive Location in BizTalk. Set the name appropriately, and notice the Router Address, which is going to be your endpoint in Azure Service Bus.


4. Add Queues

As stated before, the incoming RFI message is going to be routed to one of the three queues. You might not want your message relayed to a queue, and could therefore use any of the other Destinations, such as Relay or External Service Endpoints. Either way, the principle of routing is the same.

Connect the Bridge with all Destinations.


5. Configure the Bridge

Next we’ll define the incoming message type(s). Double-click your Bridge (ReceiveRFI in my case). Click the plus button in the Message Types stage, select the Message Type you created earlier, and click the arrow button on the right.


6. Enrich the Message

This is the interesting step, where we are going to promote some fields in the payload so that we can route on these in the next step.

First open your schema and select the node you care to use for routing (in my case Receive). Copy the Instance XPath from the Property window.


Then double-click the Bridge, select either of the Enrich stages, and click the Property Definition button in the Properties window. There are two Enrich stages because you might be using a transformation, in which case you might want to promote fields from either the original message or the transformed message.

For more information about transformations, have a look at Kent’s post.


Set the Type to XPath, and paste the XPath expression into the Identifier text box. Select the Message Type and set the name of the property. Finish by setting the data type and clicking the Add button (+). Close the dialog by clicking the Ok button.


7. Set the routing conditions

As you have promoted your property (or properties), you’re now ready to set the Filter Conditions on each of the Connectors. Select one of the connectors and type the filter in the Properties window, e.g. receiver=’SteefJan’ or customerId=1234.


8. Create the Queues

Before we deploy the solution, you need to create the queues. There are several tools for this, but with the samples comes a MessageReceiver project you can use.

Type MessageReceiver.exe <Your namespace> owner <Your issuer key> <Queue name> Create 

After creating the queues, verify that they exist in the portal.


9. Deploy your solution

Right-click the project and select Deploy. Supply the secret.


10. Test the solution

Along with the MessageReceiver tool, you’ll find a MessageSender project as well. Just type:

MessageSender.exe <Your namespace> <Your issuer key> <Your endpoint> <Path to sample file> application/xml

Use the MessageReceiver to get the messages from the queues:


HTH

Using User-defined tables as stored procedure parameter
15 December 11 01:47 PM | wmmihaa

The WCF-SQL adapter provides support for multiple inserts through the Consume Adapter Service feature:


However, sometimes you might want to validate the data on the SQL side before making the insert. For instance, you might have a collection of Customers where some of them already exist in the database and should only be updated. In that case, you’d have to first make a database lookup to determine the state of the Customer and then make either an insert or an update.

In such a case, user-defined table types might be your solution. User-defined table types are similar to ordinary tables, but can be passed in as parameters.

In my sample, I have a Contacts table, and I’m receiving a collection of Persons where some entities are new and some are to be updated.


Create the User-Defined Table Type

The user-defined table type will serve as our contract.

CREATE TYPE [dbo].[InsertContactRequest] AS TABLE
(
    [PersonNo] [varchar](50) NOT NULL,
    [FirstName] [varchar](50) NOT NULL,
    [LastName] [varchar](50) NOT NULL,
    [Phone] [varchar](50) NOT NULL,
    PRIMARY KEY CLUSTERED ([PersonNo] ASC)WITH (IGNORE_DUP_KEY = OFF)
)

Create the Stored Procedure

The stored procedure takes the user-defined table type as a parameter (@insertContactRequest), then updates all existing rows and inserts all new ones.

CREATE PROCEDURE [dbo].[sp_InsertContacts] @insertContactRequest InsertContactRequest READONLY
AS
BEGIN
    
    UPDATE dbo.Contacts 
    SET Phone = r.Phone
    FROM dbo.Contacts c
    JOIN @insertContactRequest r on r.PersonNo = c.PersonNo

    INSERT INTO dbo.Contacts (PersonNo, FirstName, LastName, Phone)
    SELECT r.PersonNo, r.FirstName, r.LastName, r.Phone
    FROM    @insertContactRequest r
    WHERE    r.PersonNo not in(SELECT PersonNo FROM dbo.Contacts)
       
END
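
The rest of this post calls the procedure through the WCF-SQL adapter, but it can be handy to exercise it from plain ADO.NET first to verify the type and procedure. A small test sketch (the connection string and sample row are placeholders):

using System.Data;
using System.Data.SqlClient;

class TvpDemo
{
    static void Main()
    {
        // The DataTable must match the column order of InsertContactRequest.
        var contacts = new DataTable();
        contacts.Columns.Add("PersonNo", typeof(string));
        contacts.Columns.Add("FirstName", typeof(string));
        contacts.Columns.Add("LastName", typeof(string));
        contacts.Columns.Add("Phone", typeof(string));
        contacts.Rows.Add("1234", "Caren", "Smith", "555-1234");

        using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
        using (var cmd = new SqlCommand("[dbo].[sp_InsertContacts]", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;

            // Pass the DataTable as the user-defined table type parameter.
            var p = cmd.Parameters.AddWithValue("@insertContactRequest", contacts);
            p.SqlDbType = SqlDbType.Structured;
            p.TypeName = "dbo.InsertContactRequest";

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}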

Generate BizTalk artefacts

1. In Visual Studio, right-click the BizTalk project and select Add->Add Generated Items. Select Consume Adapter Service.


2. In the Consume Adapter Service dialog, click the Configure button to set the credentials. Click Ok, and then Connect.


3. In the tree-view, select Strongly Typed Procedures, and select your stored procedure in the right pane. Click Add and Ok to generate the schemas.


4. Make your transformation, and complete your solution.


 

Here is the sample source.

HTH

(Kudos Daniel Östberg)

I did it so you don't have to: Connecting to Dynamics CRM Online from BizTalk Server
11 December 11 02:31 PM | wmmihaa

If you’re a consultant like me, you’ve probably got similar calls from some key account manager, as I did yesterday:

KAM: Hi Mikael. I’m just about to close this super big deal, with this super important Customer.
Me: Really! Good for you.
KAM: Yeah, we’re really close. But to wrap it up, I was wondering if you could help me out a bit…
Me: Sure. What do you have in mind?
KAM: Could you come with me to a meeting with the customer on Monday? (this happens on Thursday at 6 PM)
Me (getting suspicious): mmm…What do you want me to do?
KAM: A demo!
Me: Demo of what?
KAM: The customer wants us to show how to integrate Dynamics CRM Online with SAP using BizTalk.
Me: What?
KAM: Yes, yes. The customer wants us to show it live! You know, they want to see you do it…
Me: ARE YOU HIGH? (I didn’t actually say that, but I was thinking it)
Me: It will not happen! I haven't worked with SAP in three or four years. I’ve never worked with CRM Online (or offline for that matter). I’m fully booked tomorrow, and I want to spend the weekend with my family as X-mas is coming up.
KAM: But we need to close this deal…
Me: NO!
KAM: Please…
Me: No way!
(Yada, yada, yada)
Me: Ok, I’ll give it a try (I’M SUCH AN IDIOT!!!!!)

So here it is: How to connect to Dynamics CRM Online from BizTalk Server

To begin with, if you want to integrate with CRM Online, you have two options. Either use an un-typed web-service API or use a tool called CrmSvcUtil.exe to create a proxy class for you. Each of these comes with some challenges and limitations:

Using an un-typed web service can of course be somewhat messy, but the SDK provides you with the schemas you need (more on that later). The biggest challenge, however, is authenticating to the service, as it assumes you’re using Windows Live ID. Authenticating against the service requires an additional four calls to finally get the authentication tokens needed to create the security header. And then you have to figure out a way to add the headers in a pipeline. The steps needed are described by Girish Raja here.

The proxy created using CrmSvcUtil is quite nice, since it’s typed, but of course I can’t use it in a send port. I would have to make the call using the inline-send approach from within an Expression shape in an orchestration, and thereby lose the built-in resend functionality, and more, that ships with BizTalk.

As none of these approaches was acceptable, I began looking for other alternatives. What I really wanted was an authentication behavior that I could add to my WCF-Custom send port.

Building the Custom WCF Behavior

What I needed was a Message Inspector that would build up the security header as Girish Raja did in his sample, and then add that header to the SOAP envelope. This class is called LiveIdAuthenticationMessageInspector and implements IClientMessageInspector. This gives my class two methods: BeforeSendRequest and AfterReceiveReply.

public object BeforeSendRequest(ref System.ServiceModel.Channels.Message request, 
    System.ServiceModel.IClientChannel channel)
{
    string securityHeader = HeaderHelper.GetSecurityHeader(this._username, this._password, this._crmUri);
            
    request.Headers.Add(MessageHeader.CreateHeader("Security",
        WSSecurityUsernameTokenProfileNamespace,
        string.Empty,
        new SecurityHeaderSerializer(securityHeader),true));

    return null;
}

The BeforeSendRequest method is where I can add the security header to the message before it is sent out. From the BeforeSendRequest method I call a helper class that returns the actual header. The GetSecurityHeader method goes through four steps to build up the header:

  1. Get Windows Live Device Credentials
  2. Register Device Credentials and get binaryDAToken
  3. Get Security Token by sending WLID username, password and device binaryDAToken
  4. Build up the security header with the token from previous step.

(This sample does not cache the tokens! I strongly suggest you add some caching logic before you run this in production)

After the header is created, it is added to the request using a custom serializer, as it would otherwise be HTML-encoded.
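
The serializer itself can be as simple as writing the pre-built XML fragment verbatim. The sketch below is a hypothetical stand-in for the SecurityHeaderSerializer class in the sample, just to show the idea:

using System;
using System.Runtime.Serialization;
using System.Xml;

// The header element itself ("Security") is written by MessageHeader.CreateHeader;
// this serializer only emits the inner XML without escaping it.
public class RawXmlHeaderSerializer : XmlObjectSerializer
{
    private readonly string _xml;
    public RawXmlHeaderSerializer(string xml) { _xml = xml; }

    public override void WriteObjectContent(XmlDictionaryWriter writer, object graph)
    {
        writer.WriteRaw(_xml); // emit the fragment as-is
    }

    // Writing the header only needs the content; start/end and reading are unused here.
    public override void WriteStartObject(XmlDictionaryWriter writer, object graph) { }
    public override void WriteEndObject(XmlDictionaryWriter writer) { }
    public override bool IsStartObject(XmlDictionaryReader reader) { throw new NotSupportedException(); }
    public override object ReadObject(XmlDictionaryReader reader, bool verifyObjectName) { throw new NotSupportedException(); }
}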

public void AfterReceiveReply(ref System.ServiceModel.Channels.Message reply, object correlationState)
{
    Trace.WriteLine("[bLogical] LiveIdAuthenticationMessageInspector:AfterReceiveReply called");
    int index = reply.Headers.FindHeader("Security", WSSecurityUsernameTokenProfileNamespace);
    reply.Headers.RemoveAt(index);
}

When BizTalk (or any other WCF client) receives the response, it will throw an exception, as it doesn’t understand the Security header. I might have gotten away with adding the http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd schema to BizTalk, but as I don’t need it, I just remove the header on the way back.

Apart from the Message Inspector, I also added an EndpointBehavior and a BehaviorExtensionElement. The LiveIdAuthenticationBehaviorExtensionElement needs to be registered in the configuration before using the behavior in BizTalk.

After you have registered the behavior, you can focus on the normal BizTalk tasks like building orchestrations and mappings. To get started with consuming the CRM services, have a look at Richard’s post. The only thing I’d like to emphasize is that the correct schemas are part of the SDK (sdk\schemas). After you run the Consume WCF Service Wizard, remove all schemas and replace them with the ones in the SDK. Better yet, put all those schemas in a separate schema project, and reference that project from other projects where you use them.

1. Add the behavior to the Global Assembly Cache

Open up the LiveIdAuthentication project, build and add it to the global assembly cache.

2. Register the behavior in the configuration

(sorry about the formatting).

<extensions>
  <behaviorExtensions>
    <add name="liveIdAuthentication" type="LiveIdAuthentication.LiveIdAuthenticationBehaviorExtensionElement, LiveIdAuthentication, Version=1.0.0.0, Culture=neutral, PublicKeyToken=698ceec8cebc73ae"/>
  </behaviorExtensions>
</extensions>

You can do this either in a config file (machine.config or BTSNTSvc[64].exe.config) or in BizTalk (WCF-Custom Send handler):


I prefer the latter, as I would otherwise need to make the changes on all the servers in the group. Just copy the extension element above into a config file, and import the file from the Transport Properties dialog above. Or you can point to the app.config file in the sample.

3. Add the Endpoint Behavior

Open the send port and click the Configure button to open the WCF Transport Properties. Select the Behavior tab, right-click the EndpointBehavior node, and select Add extension. Select the liveIdAuthentication extension, then set its properties.


 

You’re done.

Use this code as you like, at your own risk. If you make improvements, I’d appreciate it if you notify me. One thing I know could be done better would be to cache the tokens and reuse them for the next call.

 

Download the Dynamics CRM LiveId Authentication Behavior sample here.

HTH

Recordings from Enfo Integration Days are available
28 November 11 07:00 PM | wmmihaa

Every year my employer hosts a two-day event all about integration and service orientation. Around 450 people attended this year, which I assume makes it the biggest event of the year in the integration space, with more than 25 sessions across four tracks. All Microsoft-related sessions were recorded and are now available.

Enjoy, and we hope you join us next year.

 

Enfo Zystems is sponsoring a two-day event focused entirely on integration.
14 September 11 09:59 PM | wmmihaa

Welcome to Integration days 2011!

If you are in the integration space, you’ll find all kinds of interesting and valuable sessions in any of the four tracks; Strategy, Public, Microsoft and IBM. Each track has six sessions with speakers from both Enfo Zystems and other partner organizations such as Microsoft.

On Thursday evening, you’re invited to a dinner with entertainment, which of course will be a great time to meet up with other integration geeks (such as myself…)

And best of all, it’s all free, so sign up now!



The event starts on the 13th of October and covers four tracks, each with six sessions. The Microsoft platform track will cover the following six sessions:

Microsoft BizTalk Server and Microsoft’s Middleware vision

BizTalk Server has been at the center of Microsoft’s Middleware platform for a number of years, to provide a rich set of capabilities for services and integration. AppFabric, both on-premise and on Windows Azure provides additional capabilities as well as some overlapping ones. So what is the strategy here, what is Microsoft up to long term and short term? How will this affect solutions you create and what opportunities will it create for your company? In this session, you will get the answers to these questions.
Presenter: Marcus Gullberg, PM Microsoft Sweden

Microsoft BizTalk Server & Windows Azure AppFabric

Microsoft’s Middleware platform is currently undergoing a change, which in turn offers different solutions with unique capabilities. What is available today, and how can we today make these solutions work together? This session will cover Microsoft BizTalk Server, Windows Server AppFabric and Azure AppFabric, to show how you can extend the reach of your integration platform outside your own domain.
Presenter: Mikael Håkansson, Solution Architect, Enfo Zystems

Windows Azure AppFabric Platform futures

Where is the future of Microsoft’s Middleware platform going? How will we design, build and monitor our solutions in the future? What capabilities will we have in our tool box? These and many other questions will be addressed in this session, which will focus on Microsoft Azure AppFabric Platform and emerging capabilities such as Composite Application, Access Control Center, Caching, ServiceBus Topics & Queues and other enhancements, and Integration.
Presenter: Johan Hedberg, Solution Architect, Enfo Zystems

Using AppFabric Cache to Maximize the Performance of Your Windows Azure and On Premises WCF Applications

Caching is an integral part of an overall scaling strategy. By properly utilizing caching, you can radically increase the number of concurrent users your application can service. Much of the caching information available today focuses only on server-side caching. Server-side caching is important; we will cover it in this session and show concrete techniques to maximize its effectiveness. However, this session will also cover client-side caching techniques. Client-side techniques are often overlooked, in spite of the fact that in order to truly hit extreme scale those techniques are nearly always necessary and often end up being bolted on after the fact. After attending this session, attendees will walk away with the concrete knowledge and code necessary to immediately improve their WCF application performance.
Presenter: Paolo Salvatori and Mikael Håkansson

Deep dive: How to integrate BizTalk Server with Windows Azure Service Bus Messaging

The Windows Azure AppFabric Service Bus and Windows Azure Connect are the foundation for building a new class of distributed and hybrid applications that span the cloud and on-premises environments. The Service Bus offers secure, scalable and highly available connectivity and messaging capabilities at Internet scale. Windows Azure Connect provides a network-level bridge between applications and services running in the cloud and on-premises data centers, making it easier for an organization to migrate existing applications to the cloud by enabling direct IP-based network connectivity with existing on-premises infrastructure. In this session you will see how to integrate these technologies with BizTalk Server to create solid and cloud-ready solutions.
Presenter: Paolo Salvatori, Senior Program Manager Microsoft

Baseline for BizTalk Hands-on

Baseline provides a comprehensive framework that supports the design, development and maintenance of systems integration solutions. In this session we will provide a practical example of how to use the Baseline methodology and tools to refine project requirements into a working BizTalk solution – tested, documented and packaged, ready for deployment in BizTalk Server 2010. In the process we will use Baseline documents and the Baseline Portal to highlight the main strengths of Baseline.
Presenter: Martin Rydman and Mikael Håkansson

Scripts used at my TechEd session (MID309 | Configuring Microsoft BizTalk Server for Performance)
22 May 11 08:52 PM | wmmihaa | 1 comment(s)

Thanks to everyone attending my TechEd session on Thursday. You can find the scripts I was running here (sorry I didn’t put them up earlier). Let me know if you have any trouble.

If you didn’t attend, you can view the session online here.

image

Good luck

//Mikael

Demos from the How to do integration with Office365 and On-Premise Applications at TechEd (MID372-INT)
18 May 11 06:51 PM | wmmihaa | 2 comment(s)

Thanks to everyone attending my session on integration with Office365 and on-prem applications.

All demos can be downloaded from here: http://blogical.se/files/folders/downloads/entry25152.aspx

I recommend you start by downloading the AppFabric SDK CTP, in which you'll find the ClientAccessPolicyPublisher sample I was running in the last demo:   http://www.microsoft.com/downloads/en/details.aspx?FamilyID=d89640fc-c552-446e-aead-b1e0d940f31b

Good luck and let me know if you need any additional help.

Top 10 Things to Know When Integrating with Line of Business Systems
09 May 11 11:39 PM | wmmihaa

We are once again fortunate to have prominent speakers visiting our user group in Sweden. This time it's Kent Weare and Richard Seroter. Both of them have been here before, and both have been much-appreciated speakers. (BTW, if you are aware of any challenges happening in Sweden at the time of the event, such as an Ironman, a marathon or cage fighting, let me know so we can sign Kent up.)

Kent Weare recently led a team of authors in their production of the book Microsoft BizTalk 2010: Integrating with Line of Business Systems (Packt Publishing, 2011).  This book walks through multiple technologies and how to integrate with them via BizTalk Server 2010.  Join Kent and Richard as they deliver the Top 10 Things to Know When Integrating with Line of Business Systems.
In these sessions, they will walk us through numerous key principles to follow when doing system integration and they will draw inspiration from their new book.  These principles will be demonstrated by integrating BizTalk Server 2010 with SharePoint, Windows Azure, SAP and software-as-a-service providers.

Monday the 13th of June, 18:00, at Microsoft's office in Akalla


If you have an opportunity to join us, sign up. If you need help with travel arrangements, such as a hotel, let us know by dropping us an email.


Welcome

PowerShell cmdlet for BizTalk db restore
22 February 11 08:59 AM | wmmihaa | 4 comment(s)

Configuring the backup job for BizTalk is a fairly simple task, while restoring is a bit more complicated. By default, the BizTalk backup job makes a full backup once a day and a log backup every 15 minutes. When a backup is taken, a mark is written to each file. This mark is the same across all databases, and should be used to restore all databases to the same point in time, keeping them in a consistent state. Also, by default, all backups are written to the same folder.

The only supported disaster recovery procedure from Microsoft is log-shipping. Nick Heppleston has gone through the trouble of describing this in great detail, and I strongly recommend reading his posts before you choose any other approach.

A few weeks ago, I sent out a question on Twitter asking whether people used log-shipping or not. I got 24 responses, of which only 4 used log-shipping.

Although log-shipping comes with many advantages, it is still expensive since it requires a secondary SQL cluster. Most of the people I asked confirmed this was the main reason why they had chosen other alternatives. Since BizTalk doesn’t come with any restore scripts/features other than log-shipping, everyone is left to fix this on their own.

If you’re in the same situation, feel free to download this sample. If it doesn’t fit your solution, it might at least be a good starting point.

The sample comes with two cmdlets: Get-Marks and New-RestoreDatabaseFromMark. The first gives you a list of all marks from all log files. The second, as the name implies, restores a database to a specific mark: the database is first restored from the last full backup taken before the mark, then all log backups taken after that full backup are applied in order, and the last log is restored only up to the specified mark.
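
For anyone curious, here is a minimal sketch of what that sequence roughly corresponds to in T-SQL, run from PowerShell with Invoke-Sqlcmd (requires the SQLPS/SqlServer module). The server name, file names and mark value are placeholders, and this is not the cmdlet's actual implementation:

Import-Module SqlServer   # or SQLPS on older installations

$server = "SERVER001"
$mark   = "BTS_2011_01_18_12_06_50_22"

# 1. Restore the most recent full backup taken before the mark, leaving the
#    database in NORECOVERY so that log backups can be applied on top of it.
Invoke-Sqlcmd -ServerInstance $server -Query @"
RESTORE DATABASE BizTalkDTADb
FROM DISK = N'X:\BizTalkBackUp\SERVER001_BizTalkDTADb_Full_<earlier mark>.bak'
WITH NORECOVERY, REPLACE;
"@

# 2. Apply the log backups taken after that full backup in order; the last one
#    is stopped at the mark, which keeps all BizTalk databases consistent.
Invoke-Sqlcmd -ServerInstance $server -Query @"
RESTORE LOG BizTalkDTADb
FROM DISK = N'X:\BizTalkBackUp\SERVER001_BizTalkDTADb_Log_$mark.bak'
WITH STOPATMARK = N'$mark';
"@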

image

The Get-Mark cmdlet queries the backup output folder to retrieve all marks. The mark is part of the name of each backup file:

image

Each file is made up of the following parts:

[Server]_[Instance*]_[Database]_[Full|Log]_[Mark]

E.g. SERVER001_DTA_BizTalkDTADb_Log_BTS_2011_01_18_12_06_50_22.bak
* The instance is only present for non-default instances.
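
To illustrate, this is roughly how the marks could be pulled out of those file names with plain PowerShell (a hedged sketch, not the shipped cmdlet's code):

# List the unique marks found in the log backup file names.
$backupPath = "X:\BizTalkBackUp"

Get-ChildItem -Path $backupPath -Filter "*_Log_*.bak" |
    ForEach-Object {
        # Everything after "_Log_" (minus the .bak extension) is the mark,
        # e.g. BTS_2011_01_18_12_06_50_22
        $_.BaseName -replace '^.*_Log_', ''
    } |
    Sort-Object -Unique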

You can use the New-RestoreDatabaseFromMark cmdlet with or without specifying the mark. Leaving the mark empty is equivalent to using the last mark.

The New-RestoreDatabaseFromMark cmdlet is called once per database, so it's easier to create a script that restores all databases together. The sample comes with a RestoreScript.ps1 script file, which can serve as a good starting point:

$backupPath = "X:\BizTalkBackUp";
$dataPath = "E:\SQL Server 2008\MSSQL10.MSSQLSERVER\MSSQL\DATA";
$logPath = "E:\SQL Server 2008\MSSQL10.MSSQLSERVER\MSSQL\DATA";



$mark = Read-Host "Specify mark (use the Get-Marks cmdlet to get all marks or blank to use last mark)";

if ($mark.Length -eq 0)
{
Write-Output "Restoring to last mark...";
$mark="";
}

New-RestoreDatabaseFromMark SSODB $backupPath $dataPath $logPath $mark;
New-RestoreDatabaseFromMark BAMPrimaryImport $backupPath $dataPath $logPath $mark;
New-RestoreDatabaseFromMark BizTalkDTADb $backupPath $dataPath $logPath $mark;
New-RestoreDatabaseFromMark BizTalkMgmtDb $backupPath $dataPath $logPath $mark;
New-RestoreDatabaseFromMark BizTalkMsgBoxDb $backupPath $dataPath $logPath $mark;
New-RestoreDatabaseFromMark BizTalkRuleEngineDb $backupPath $dataPath $logPath $mark;

trap [Exception]
{
Exit;
}

Write-Output "Done restoring all BizTalk databases" -foregroundcolor "yellow";

The first section of the script defines a set of path variables. In my simple sample, all database files are located in the same folder. This is not good practice, so you will probably have different paths for the data and log files of each database.
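
If so, the calls at the end of the script can simply be given their own paths per database. The folder names below are made up purely for illustration:

# Illustrative only: each database restored to its own data and log directories.
New-RestoreDatabaseFromMark BizTalkMgmtDb $backupPath "E:\Data\BizTalkMgmtDb" "F:\Logs\BizTalkMgmtDb" $mark;
New-RestoreDatabaseFromMark BizTalkMsgBoxDb $backupPath "E:\Data\BizTalkMsgBoxDb" "F:\Logs\BizTalkMsgBoxDb" $mark;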

To use the sample, open PowerShell and navigate to the sample output folder, e.g.:

PS C:\> CD "c:\Program Files\bLogical.BizTalkManagement"

Before you can use the cmdlets, you need to install them. You can do this using the install script:

PS C:\Program Files\bLogical.BizTalkManagement> .\Install.ps1

After you’ve installed the snapins, you can start using the commands:

PS C:\Program Files\bLogical.BizTalkManagement> Get-Mark "X:\BizTalkBackUp"

or

PS C:\Program Files\bLogical.BizTalkManagement> .\MyRestoreScript.ps1

image

Downloads 

HTH
