September 2010 - Posts

Me and BizTalk Server 2010 “on the air”
27 September 10 10:41 PM | Johan Hedberg

With BizTalk Server 2010 going RTM I was briefly interviewed about the release in the latest episode of the Swedish podcast show MSDN Radio. A link to the show is here. It’s in Swedish. I’m on at about 5:20 in, but it has lots of other interesting content as well.

BizTalk Server 2010 RTM notes
27 September 10 12:41 AM | Johan Hedberg

I generally try not to blog about the same thing as countless others do, but being a BizTalk MVP I feel obliged to make some notes about the release of BizTalk Server 2010 as it really is big news.

BizTalk Server 2010 has actually been RTM for a while and available to Volume License customers for a short period of time already. It will become available for general purchase on October 1st – or so the official announcement on the BizTalk Team Blog says. That post does a good job of covering the news in the release, as does the microsoft.com New Features in BizTalk Server 2010 page. For an even more comprehensive write-up download the Microsoft BizTalk Server 2010 Technical Overview white paper.

There are also numerous other MS web pages that have been updated. I’ll try to highlight a few of the things I haven’t seen mentioned so far and provide links to the more important ones:

  • Editions
    • The Development Edition is now a FREE download and the page clearly states that it’s the one to use for development and testing environments, which is a relief since it’s a discussion I’ve been in more than once regarding the use of MSDN licenses.
    • The BizTalk Adapter Pack is no longer licensed separately. You need at least STD for this.
  • Pricing and Licensing Overview
    • BizTalk Server has gotten slightly more expensive for the production environments ($44k ENT, $10k STD).
    • Current limitations with STD are still there: single server, two procs AND, as it seems, limited 64-bit support. The TechNet 64-bit Support pages seem to confirm that STD still has a 64-bit exception compared to the other editions.
      • Update 20100930: Product Group representatives say the documentation is incorrect and Standard DOES SUPPORT 64-bit. Good thing that was included.
      • Update 20101001: It’s proven! Standard supports 64-bit. See blog post here.
    • The AppFabric Connect feature is an installation option. Since I see no real licensing info about it I’m assuming it’s not licensed stand-alone. And since the Adapter Pack pre-requisite that makes some of that magic happen is packaged with STD (and up), I can only assume at this point that using the mapper in production also requires at least STD. I would still argue that it could be well worth it to enable scenarios like this.
  • Pricing and Licensing FAQ
    • There is a running discussion on how long Microsoft can continue to license per socket instead of heading in a slightly different direction as core counts keep growing. Whenever (if ever) that happens, it’s not now. BizTalk continues to be licensed per proc.
    • The ISV Licensing (formerly Runtime Editions) is still there. This is not a very well-known option, but essentially it allows you to package BizTalk as part of your product. The customer can’t use it with any products except your own. It’s hard to get a quote for this as it isn’t publicly listed anywhere, but it’s way cheaper than ordinary licenses. I have a quote, but I don’t want to put it out there. You could probably just call your local MS rep or licensing partner to get your own.
  • System Requirements
    • You should not be fooled by the fact that it says a minimum of 2GB of RAM. If you install BizTalk on a 64-bit system you will want at least 4GB of RAM. And I’m saying that even though Windows Server 2008 (R2) lists 512MB as its minimum system requirement. I suppose no one reading this would want to try that either.
  • BizTalk Developer Center
    • Has been updated with a lot of new content for BizTalk Server 2010 and October’s theme is the BizTalk Server 2010 Launch.
    • There’s also a BizTalk Server 2010 Training Kit available. It contains 6 labs for the Developer and 3 for the Admin that highlight new features. It’s great to see the Admin getting some much needed love on the training side. There are also 6 videos available, weighing in at an 823MB download.
    • A lot of other online content, like the tutorial scenarios, seems to have gotten an overhaul as well (as you would expect), showing off screenshots of the new mapper and other new features.
  • Microsoft BizTalk Server 2010 Help
    • Besides the updated online version there are numerous downloadable documents: the Installation and Upgrade guides as well as the CHM – which, personally, I couldn’t do without.
  • MSDN Subscriptions
    • So far there is no availability through this channel. I would love to see an .iso for the Developer Edition as I’m not too fond of keeping extracted catalogs around, or self-executable zip files. Keep this feed in your feed reader to be notified when it arrives.
      • Update 20101001: Enterprise, Standard, Branch (.iso’s) and Developer (still as .exe) are available at MSDN.
  • Upgrading from Beta
    • Although some people (like Brian Loesgen) seem to have been able to upgrade seamlessly from the beta, I couldn’t. I got an exception saying that the Enterprise version was installed and that I had to uninstall it before installing. Fair enough. My beta version does register as Enterprise in the Add/Remove Programs dialog.
  • Errors and issues
  • Misc new features and updated tools

That ended up being quite a lengthy post. But then again… I’m a superstar and that’s how I do it! I hope it helps.

My BizTalk infrastructure design baseline
20 September 10 09:39 PM | Johan Hedberg | 4 comment(s)

How should I design my environment? What OS version? SQL Edition? BizTalk license? Etc. Etc. I get these questions frequently.

The one true answer to this question is the architect’s favorite - “it depends”. And once you know the requirements - some of the things on which it depends - it will still be closely followed by its companion: “We need to test to know”.

Still, I think everyone has their favorite configuration - one that they then add to or deduct from based on the requirements. I do. This is how it goes.

Servers

Four. Two SQL, Two BizTalk.
Why? High availability for SQL and BizTalk. Load balancing on BizTalk machines.

OS

Windows Server 2008 (R2) Enterprise, on all machines.
Why? Clustering. Plain and simple.

SQL Edition

SQL Server 2008 (R2) Standard, on the SQL boxes.
Why? Of the BizTalk environments I have worked with, only very few have gone beyond two machines, which is the only real limitation of Standard that I care about. I know of the limitations to RTA in BAM and of the performance gains with Enterprise, and sometimes that may be required – but not as a baseline.

SQL Instances

Four SQL Server Instances: BizTalkMgmt, DTA, BAM, MsgBox.
Why? Prepare for and maximize scale-out possibility. Simplify IO division. Help with memory reservation.

SQL IO

Three disks per instance: Data, Log and TempDb. Baseline is 40, 20, 15 GB. Varies with requirements.
SAN if available.
Why? Disks are inexpensive. IO is core to SQL. SQL is core to BizTalk.

BizTalk Edition

BizTalk Server 2009/2010 Enterprise
Why? Number one reason – you want to be able to go beyond a single server, because you want load balancing and high availability.

Virtualization

Not in my baseline.
Why? It could be a requirement with certain customers and it certainly works, but I would recommend physical machines because I think they give more bang for the buck. BizTalk can often be a processor-, memory- and IO-intensive application.

Processor

Quad core for sure, hexa or octo if availability permits. One proc per server is enough with this number of cores as a baseline. Requirements like high-throughput messaging and processing may cause it to rise.
Why? Same as above. Higher ROI with multi-core.

Memory

4 GB minimum for a 64-bit OS, preferably 8GB or more total memory on BizTalk Servers.
At least 16GB on the SQL machines.
Why? Memory is a cheap commodity right now. Not the right place to be cheap.

Summary

The above is in no way thorough. It wasn’t meant to be. There is no “One Truth”. I stress that I call it a Baseline. It was meant to be a brief overview. It’s based on the most common questions and the most common requirements for my customers. Consider your own requirements. Mine might not match yours. “It depends”…

Input

What’s your baseline? Where does it differ? Where have you drawn the same conclusions?

To learn or not to learn - it’s about delivering business value
19 September 10 10:43 PM | Johan Hedberg | 1 comment(s)

For a developer Windows Azure is an opportunity. But it is also an obstacle. It represents a new learning curve, much like the ones presented to us by the .NET Framework over the last couple of years (most notably with WF, WCF and WPF/Silverlight). The nice thing about it, though, is that it’s still .NET (if you prefer). There are new concepts – like “tables”, queues and blobs, web and worker roles, cloud databases and service buses – but it also re-uses the things we have been working with for numerous years, like .NET, SQL, WCF and REST (if you want to).

You might hear that Azure is something that you must learn. You might hear that you are a dinosaur if you don’t embrace the Windows Azure cloud computing paradigm and learn its APIs and Architectural patterns.

Don’t take it literally. Read between the lines and be your own judge based on who you are and what role you hold or want to achieve. In the end it comes down to delivering business value – which often comes down to revenue or cost efficiency.

For the CxO, the cloud should be something to consider. For the architect, Azure should be something to grasp and explore. For the lead dev, Azure should be something to spend time on. For the Joe Devs of the world, Azure is something you should be prepared for, because it might very well be there in your next project, and if it is – you are the one who knows it and excels.

As far as developers embracing Windows Azure goes, I see a lot of parallels with WCF when that launched. Investments were made in marketing it as the new way of developing (in that case primarily services or interprocess communication). At one point developers were told that if they were not among the ones who understood it and did it, they were among the few left behind. Today I see some of the same movement around Azure, and in some cases the same kind of sentiment is brought forward.

I disagree. Instead my sentiment around this is: it depends. Not everyone needs to learn it today. But you will need to learn it eventually. After all, a few years later – who among us would choose asmx web services over WCF today? Things change. Regardless of how you feel about it. Evolution is funny that way.

Because of the development and breadth of the .NET Framework, together with the diverse offerings surrounding it, a wide range of roles is needed. In my opinion the “One Architect” no longer exists. Much the same with the “One Developer”. Instead the roles exist for different areas, products and technologies – in and around .NET. Specialization has become the norm. I believe Azure adds to this.

I give myself the role of architect (within my field). Though I would no sooner take on the task of architecting a Silverlight application than my first pick for onboarding a new member to our integration team would be someone who has been (solely) a Silverlight developer for the last couple of years.

How is Azure still different though? Azure (cloud) will (given time) affect almost all of Microsoft’s (and others’) products and technologies (personal opinion, not quoting an official statement). It’s not just a new specialization - it will affect you regardless of your specialization.

You have to learn. You have to evolve. Why not start today?

Defining “cloud computing” – in my opinion
19 September 10 10:21 AM | Johan Hedberg | 2 comment(s)

As I will begin doing more posts on and around cloud computing in general, and Windows Azure in particular, I’d first like to give my view on when something is cloud and when it is not.

Why? Well, it’s not the first time a word gets status. It gets hot. It gets overused, overloaded and obfuscated. Vendors, consumers, service providers and others might not always agree on what cloud is. It will get slapped on, bolted on or fuzzily added to an existing product or service to make it more “today”. I might not agree. Others in turn may think I’m wrong. It will add to the overall confusion. So, to hopefully help provide clarity (but potentially adding to the confusion), when do I consider something to be “cloud”?

There are a couple of characteristics that I would look for when it comes to cloud.

Elastic

Or on-demand. I would assume that I could scale up and down. At any time. I would assume that the procurement process for another server or piece of the service (accounts, users, databases, etc.) is immediate (or next to it). Same when scaling down. I would expect to be able to manage this elasticity myself.

Elasticity is not “I have a cold stand by server I can bring online”. Elasticity is “I need 10, 20, or 100 new instances and I need them for two days”. The dynamic capacity does not have a defined limit.

Pay-per-use

I would expect the service to use some kind of pay-per-use charge model. How I use the service would be measured. How many hours have I been using it? How many GB of storage? How many MB transferred? How many connections opened? How many customers onboarded? How many users? – That sort of thing.

I would hope not (and this one is on the fence) to have to pay for the underlying software in the form of procurement or running licensing. It would be included in the service. However, this very much applies to Software-as-a-Service (SaaS) or Platform-as-a-Service (PaaS) rather than, say, Infrastructure-as-a-Service (IaaS). In the latter I would of course have to handle licenses myself.

Hardware agnostic

I would expect that I do not need to care about the underlying hardware. I wouldn’t need to know the cost of purchasing it, nor how it is set up or configured. I wouldn’t need to specify how my machine is built. Even if I do choose the size of the machine, it’s not really my machine, and I can change that at any time.

I would expect the environment, as far as the service or servers go, to be fault tolerant. If hardware fails, or some maintenance needs to be performed, I would assume it to be transparent to me and not affect me or my service.

Summary

If someone calls their offering a cloud service, and it does not fulfill these things, I would think twice before considering it a cloud offering. The service offered might be just what I want and need, but I wouldn’t consider it “cloud”. Windows Azure fulfills all of these.

This is not an exhaustive list of what I would expect, nor of the capabilities or limitations of Windows Azure or any other platform, though it is what I would say raises cloud above hosting.

Addendum

The term private cloud is often used by hosting providers that would like to compete (at least marketing-wise) with the bigger cloud offerings like Windows Azure. In my experience, localized as it might be, they fulfill some of the tenets I hold to, but they often fail on things like rapid (and self-serviced) procurement of a new resource (like a server) or on the metered pay-per-use model – where you are often expected to pay for a resource based on a pre-determined period of availability to that resource - like a month, if not a year.

Also, for something in the cloud to be usable for a business that will likely have parts of its operations, but not all of them, in the cloud, I would look for the following:

Secure connectivity

Since the cloud is not on-premise I would expect there to be a solution available for how I, in a secure manner, connect what I almost certainly still have on-premise with what I have off-premise – in the cloud. I would hope for this to be based on some kind of federated security model, and not on leasing a land line, using a VPN, connecting to the existing Active Directory domain or setting up a trust. Though I’ll settle for a tried and proven solution.

Further posts

As I continue with additional Azure posts I’ll try to link back to these pillars, if possible. With pure how-to technology posts that might not always be applicable, but keep these base concepts in mind anyway, so that you architect and build your solutions to support them.

Applied Architecture patterns on the Microsoft platform geographic spread
11 September 10 07:43 PM | Johan Hedberg | 1 comment(s)

[Photo: IMG_2292]

BizTalk User Group Sweden organized an event on the 8th and 9th of September that was marketed as the European BizTalk Conference and featured no fewer than two US BizTalk MVPs and a member of the SQL Server CAT team: Richard Seroter, Stephen W. Thomas and Ewan Fairweather. They delivered an impressive 13 sessions over the course of two days, featuring content from the new book they co-authored – Applied Architecture Patterns on the Microsoft Platform (sample chapter is here). A crowd of roughly 150 people from an amazing 13 countries (among them around 10 MVPs) listened attentively as they delivered patterns, technical insight and lessons learned on topics such as BizTalk Server, Windows Server AppFabric, Windows Azure and SSIS. To top it off, the whole thing was recorded and will be made available through MSDN. People walked away happy, wearing nice giveaway polo shirts. A gang of three was even luckier, as they were picked as the winners of an MSDN Ultimate subscription giveaway. Did I mention it was free?! The URL of the event is here: http://bugs20100908.eventbrite.com/

UPDATE: Presentation material from the event is here (this leads to BizTalk User Group Sweden’s main site, look in the bottom right corner). I will update again with the URL of the recordings once they are available.

Map is courtesy of Eventbrite, which really is a wonderful service, especially if you are a non-profit organization like a user group.

[Image: map of attendee locations]

BizTalk, OpsMgr and Triathlon – What have they got in common?
11 September 10 07:24 PM | Johan Hedberg | 2 comment(s)

Canadian BizTalk MVP Kent Weare.

On the 26th of August the BizTalk User Group Sweden had Kent over to deliver two sessions on System Center Operations Manager and its Management Pack for BizTalk Server. The sessions were separated by a break for networking and food and were delivered to a group of around 75 people. Together with Johan Hedberg (me!) and 35 others he also completed the Stockholm Triathlon, participating in the Microsoft SQL Server Fast Track team (newspaper slideshow (in Swedish) here: http://it24.idg.se/2.2275/1.337297/blott-och-svettigt-for-microsoft) (named persons not featured). The URL of the user group event is here. A blog post by Kent is here. Presentation slides can also be found at the BizTalk User Group Sweden site (look in the lower right corner area).

[Photo: IMG_2264]

How-to: Easily examine the incoming message using tracking
11 September 10 04:48 PM | Johan Hedberg

I got a question the other day, and thought I’d post a very short how-to. The problem the questioner had was this: they wanted to view the incoming document before BizTalk had begun processing it – the sending party “hadn’t changed anything”, yet all of a sudden the message was failing in BizTalk. How can that document easily be made available?

By default, when you look at a port’s Tracking tab, no box is checked.

[Screenshot: clip_image001]

If you try and look at the message details of such a message

[Screenshot: clip_image002]

you will get an error dialogue.

[Screenshot: clip_image004]

However, if I return to my port and check some of those tracking check boxes, in this case “Request message before port processing”,

[Screenshot: clip_image005]

and make sure that the SQL Server Agent job TrackedMessages_Copy_BizTalkMessageBoxDb is running as it should (by default it is enabled and running),

[Screenshot: clip_image006]

then when I send in another message and try to look at its Message Details, I get a dialogue containing the details I am after, including the content of the message.

[Screenshot: clip_image008]

As expected this dialogue may not show some characters correctly, but you can easily go to File… Save Message.

This will create two files: one containing the context properties, and the other the message itself.

[Screenshot: clip_image010]

The message will stay in the database for as long as is configured in the SQL Agent job DTA Purge and Archive (BizTalkDTADb). Live means the service instance still exists (i.e. is suspended) and Hard that it is no longer in BizTalk.
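
If you prefer to script this rather than click through the Administration Console, the ExplorerOM API can flip the same setting. The snippet below is only a minimal sketch under a few assumptions: the connection string and the port name are placeholders, I believe TrackingTypes.BeforeReceivePipeline is the flag behind the “Request message before port processing” check box (verify that before relying on it), and remember that ExplorerOM is 32-bit only, so build a helper like this as x86.

using Microsoft.BizTalk.ExplorerOM; // from Microsoft.BizTalk.ExplorerOM.dll

class EnableBodyTracking
{
    static void Main()
    {
        BtsCatalogExplorer catalog = new BtsCatalogExplorer();
        // Hypothetical management database - adjust server/database to your group.
        catalog.ConnectionString = "Server=.;Database=BizTalkMgmtDb;Integrated Security=SSPI;";

        foreach (ReceivePort port in catalog.ReceivePorts)
        {
            if (port.Name == "MyReceivePort") // hypothetical port name
            {
                // Add body tracking before port processing to whatever
                // tracking the port already has configured.
                port.Tracking |= TrackingTypes.BeforeReceivePipeline;
            }
        }

        catalog.SaveChanges();
    }
}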

XmlDisassemble in a passthrough pipeline?
11 September 10 04:32 PM | Johan Hedberg | 3 comment(s)

So, a couple of weeks ago I opened a case on Connect, and this is just to wrap it up in a blog post for discovery and archive purposes.

Background

We use BAM for infrastructure tracking. We apply a simple tracking profile to all our ports for all our customers in all our environments. BAM handles this like a charm, and takes care of the clean-up, indexing and archiving of this data automatically through schedulable jobs – tunable to customer requirements. Many times BizTalk is used to do file transports without really caring about the content, alongside the more traditional transformation or orchestration work. Other times the requirements might cause me to want to bring a file into BizTalk to handle things like processing, disassembly, splitting or something else in one (or more) orchestrations. To me there is nothing strange about this, and I’m hardly alone in doing it.

The problem

Even though the passthrough pipeline is used, with BAM tracking applied, disassembly of the incoming file will be attempted or performed. This might result in disassembly errors and/or debatching of envelope messages where that was not intended.

If you apply BAM tracking to a port that has the passthrough pipeline on it, the code will create an XML disassembler as part of the BAM tracking process in the passthrough pipeline. If this does not find what it considers XML through its Probe method, things seem fine. Otherwise this can create issues. Two that we have identified are:

  1. If the message passed through the passthrough pipeline matches an envelope schema deployed to BizTalk, it will be debatched down to its first document by the pipeline.
  2. If the document contains faulty XML, pipeline processing will get an exception.
    [Screenshot: the resulting exception]

Why does this happen?

Looking through the code with Reflector gives us some insight. The ReceivePipeline class, the base class of the PassThruReceive class, implements the GetNext member. Towards the end of that method the code below is present:

IBaseMessage ReceivePipeline.GetNext()

if (base.PipelineContext.InvokedFromMessaging)
{
  msg = base.DoBamTracking(msg, Pipeline.BamTrackingMode.BamTrackingOnly);
}

The DoBamTracking method (which resides in the even more generic Pipeline class) decides whether or not BAM tracking is necessary by looking at some context properties (supposedly populated by the fact that I’ve applied a Tracking Profile and mapped it to the port in whose context the pipeline is running). If it is, it then creates an XmlDasmComp component and Probes the message. If Probe returns false, then we are, as stated above, fine. If not… later in the code it will use the (XmlDasmComp) component to run Disassemble followed by its GetNext implementation and return the result. The result: a disassembled message – if it succeeds, that is.

IBaseMessage Pipeline.DoBamTracking(IBaseMessage, BamTrackingMode)

IBaseComponent component = PipelineManager.CreateComponent(
"Microsoft.BizTalk.Component.XmlDasmComp, Microsoft.BizTalk.Pipeline.Components,
Version=3.0.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL"
);
IProbeMessage message2 = (IProbeMessage) component;
…
if (!message2.Probe(this.pipelineContext, msg))
{
    return next;
}
…
IDisassemblerComponent component2 = (IDisassemblerComponent) component;
component2.Disassemble(this.pipelineContext, msg);
next = component2.GetNext(this.pipelineContext);
...
return next;

Steps to reproduce

  1. Deploy an envelope schema and a document schema.
  2. Set up a BAM activity with at least one field.
  3. Connect the BAM element in the tracking profile to something.
  4. Create a receive port with a passthrough pipeline.
  5. Connect the tracking profile to the receive port.
  6. OPT 1) Send a message through with badly formed XML, like a one-line message containing only the text "<TestMessage>" (xmldisassembler exception). OR
    OPT 2) Send a valid envelope with two documents in it through (debatching will occur and leave only the first document).

Code to reproduce

(PassThruProblem.zip contains artifacts built for 2010)
(PassThruPipeline2006.zip contains 2006 (R2)/VS.NET 2005 artifacts.)
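
If you just want to trigger OPT 1 without opening those projects, a throwaway snippet like the one below will do. It’s a minimal sketch under stated assumptions: the drop folder path is hypothetical and should point at whatever folder the FILE receive location on your passthrough port is monitoring.

using System.IO;

class DropBadXmlMessage
{
    static void Main()
    {
        // Hypothetical folder monitored by the FILE receive location on the
        // passthrough receive port that has the tracking profile mapped to it.
        const string dropFolder = @"C:\Temp\PassThruProblem\In";
        Directory.CreateDirectory(dropFolder);

        // A single line of badly formed "XML" - per the repro steps above this
        // is enough to make the BAM-injected xmldisassembler throw.
        File.WriteAllText(Path.Combine(dropFolder, "badmessage.xml"), "<TestMessage>");

        // For OPT 2, drop a valid envelope instance (matching your deployed
        // envelope schema) containing two documents instead.
    }
}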

Versions affected

I have tested this in 2010, 2009, 2006 R2 SP1 CU3 - all have the issue.

Are you experiencing this issue?

If you are having this issue and want it fixed, vote it up on Connect and mark it as reproducible.

Further, we have found that the only way to stop this from happening on a port is to re-create it. Removing the port mapping in the tracking profile is, for some reason, not sufficient.
