Monday, December 12, 2011

Dynamic SMTP Failure: Unknown Error Description

I was sending an email dynamically, from an orchestration.

I set up a nice payload message with nicely distinguished fields containing the to/from, subject and body of the email, so that other parts of my system could send an email just by generating this message.

My message assignment shape had:

EmailSendMessage = EmailReceiveMessage;
EmailSendMessage(SMTP.EmailBodyText) = EmailReceiveMessage.Body;
EmailSendMessage(SMTP.CC) = EmailReceiveMessage.CopyTo;
EmailSendMessage(SMTP.From) = EmailReceiveMessage.From;
EmailSendMessage(SMTP.Subject) = EmailReceiveMessage.Subject;
EmailSendMessage(SMTP.MessagePartsAttachments) = 0;

EmailSendPort(Microsoft.XLANGs.BaseTypes.Address) = "mailto:" + EmailReceiveMessage.SendTo;


The message body itself was not sent, just the context properties, so the type of the message I was constructing didn't really matter; I made it the same type as EmailReceiveMessage. It compiled, and I deployed it.

First Try:


Event Type: Error
Event Source: BizTalk Server 2009
Event Category: (1)
Event ID: 5754
Date: 13/12/2011
Time: 1:16:31 PM
User: N/A
Computer: [computer]
Description:
A message sent to adapter "SMTP" on send port "XXX.Email.Orchestrations_1.0.0.0_XXX.Email.Orchestrations.SendEmail_EmailSendPort_43e93d0db20c465a" with URI "mailto:email@address.com" is suspended.
Error details: Unknown Error Description
MessageId: {B66F52BA-DFF0-4274-B4B2-3B1F51E862E0}
InstanceID: {C7A4CA9E-E606-4D04-9001-D34974B4D971}

For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.


I then checked, and found that you need to set the body text charset:

EmailSendMessage(SMTP.EmailBodyTextCharset) = "UTF-8";


Second try, same error…

Several modifications later I get the message: Unknown Error Description

Which I must say is not a great deal of use when you are trying to figure out what’s wrong.

I then decided I'd make a different message type for sending, and use a transform. I copied and pasted the schema of my send message, changing the target namespace of course.

SAME ERROR…..

I was annoyed, and tried a bunch of things, then EUREKA! …

The send schema which I copied and pasted had the same properties promoted.

This was BAD, it seems: the instant I removed the property promotions, changing nothing else… everything worked….

BAD BAD BAD... schema properties, who would have thought…

DO NOT HAVE PROMOTED PROPERTIES ON YOUR SCHEMA WHEN SENDING TO THE SMTP ADAPTER.

Friday, November 11, 2011

ABA bank payment file format (Australian Bankers Association)

I'm currently working on an application for an unnamed organisation. As part of this, I need to export files in the Australian de facto standard for Electronic Funds Transfer (EFT) files - the ABA format.

I'm using BizTalk of course, and it can handle this weird format; however, all I had was the sample file, which for a fixed-width flat file is not much to go on.

I found the format documented here: http://ddkonline.blogspot.com/2009/01/aba-bank-payment-file-format-australian.html

Apparently the banks have all agreed on this format, which is ancient in both origin and design: a flat file with fixed field lengths.
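To give a feel for just how dated the format is, here's a minimal sketch (in Python, purely for illustration - the `aba_detail` helper and all sample values are hypothetical) of building one fixed-width detail record. The field widths and the "53" pay transaction code follow the commonly published layout, but verify everything against your bank's own documentation:

```python
# Sketch of building one ABA detail record (record type 1).
# Field widths follow the commonly published layout; check them
# against your bank's documentation before relying on this.

def aba_detail(bsb, account, amount_cents, account_title,
               reference, trace_bsb, trace_account, remitter):
    """Build one fixed-width, 120-character ABA detail record."""
    return (
        "1"                                  # record type: detail
        + bsb.ljust(7)                       # BSB, e.g. "083-123"
        + account.rjust(9)                   # account number
        + " "                                # indicator (blank = normal)
        + "53"                               # transaction code: pay credit
        + str(amount_cents).rjust(10, "0")   # amount in cents, zero-filled
        + account_title.ljust(32)            # title of account
        + reference.ljust(18)                # lodgement reference
        + trace_bsb.ljust(7)                 # trace record BSB
        + trace_account.rjust(9)             # trace record account
        + remitter.ljust(16)                 # name of remitter
        + "0".rjust(8, "0")                  # amount of withholding tax
    )

record = aba_detail("083-123", "12345678", 150000, "J SMITH",
                    "SALARY", "083-999", "87654321", "ACME PTY LTD")
assert len(record) == 120   # every ABA record is exactly 120 characters
```

Note there is nothing here but positional padding - no checksum, no signature - which is exactly the problem discussed below.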

If this were refactored into an XML format, it would be much easier to generate and far more flexible, and the banks could expose an interface, via a simple web service, to accept it. All authentication could be done via a secure HTTPS web service, with encryption on the web service.

Let's be clear: this file is used to effect payments from a company's bank account to individuals. It is highly sensitive and needs to be secured.

The file format is not encrypted in any way; it is open, readable, and modifiable. There are no check digits, no certificate of authentication, nor any of the modern features you would expect in such a file.

Exposing a WCF endpoint with certificate authentication, over HTTPS transport security and encryption, would handle some of this requirement; the rest is in the detail of the message itself.

Currently what happens is we output this file to the file system, and then someone picks it up... and processes it...

If the bank opened a WCF or web service endpoint, we could communicate with it securely and send the payment file. Someone could still log onto their secure interface and approve the transfers, but there would be zero chance of someone modifying the file before it got there....

Before we got involved, this was just popped onto the file system somewhere, generated via a different method…

I am not one to mess around when it comes to security, and this smells to me. The banks need to provide an interface; I'll happily build it, secure and flexible enough for all platforms to communicate with.

Sunday, October 2, 2011

Handy Hint: BtsCompile

Are you looking for the build action on an orchestration - you know, BtsCompile? For the uninitiated, it is great for having a BizTalk artefact that's broken or not complete, and having it not form part of the compile. It works for maps, schemas, you name it.

I generally use it to keep unfinished orchestrations out of the build/compile, so they do not break things....

Well, as it happens, in BizTalk 2009 this property can disappear for some reason; however, I have discovered the way to make it re-appear.

Simply copy ANY orchestration in your solution and paste it back; now the BuildAction property appears, not just for that orchestration, but for ALL of them.

It's as if it knows: if you are copying and pasting an existing orchestration, you are probably not going to want it to build right away.

Tuesday, September 6, 2011

Fixed Field Settings on Map

On a map, you can set the value of a destination node to a fixed value from within the mapper, simply by setting the Value property after clicking the destination node on the right-hand-side schema. I've seen people use the string functoids to pre-set a value, but you do not need to do this.



There is a catch I found in BizTalk 2009. If you set the fixed value on a map page other than the first page, it does not get set. Worse yet, if that field on the destination schema happens to be a promoted property, the value is not promoted, and it will fail when you try to use the value inside your orchestration as desired.


You will not know why; the shape will just fail with some weird exception.


Always use the first page.

Wednesday, August 31, 2011

DefaultPipelines.XMLReceive, Attempted to read or write protected memory

I got the error of death.... I mean we were this close to formatting the machine.

Event Type: Error
Event Source: BizTalk Server 2009
Event Category: BizTalk Server 2009
Event ID: 5719
User: N/A
Description:
There was a failure executing the receive pipeline: "Microsoft.BizTalk.DefaultPipelines.XMLReceive, Microsoft.BizTalk.DefaultPipelines, Version=3.0.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" Source: "XML disassembler" Receive Port: "GetReferenceDelta" URI: "mssql://zz11aazz//My?" Reason: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.

I thought that the XML pipeline was broken, so I tried the pass-through...

There was a failure executing the receive pipeline: "Microsoft.BizTalk.DefaultPipelines.PassThruReceive, Microsoft.BizTalk.DefaultPipelines, Version=3.0.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" Source: "Unknown "
Receive Port: "GetMyDelta" URI: "mssql://server/uri" Reason: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.

I can no longer receive ANY messages into BizTalk from SQL...

I tried the file receive, I tried xml and pass-through pipelines, nothing works, and it all fails....

I can no longer receive ANYTHING at ALL into BizTalk.... Error of death...

I looked, I tried updates to the Adapter packs... nothing worked....

I was about to trash my environment and format and start again....

THEN!!! I thought, let me remove all the updates I had applied to BizTalk. I removed one and tested, removed another and tested.

Finally I found the culprit..... The offending update was Cumulative Update 1:

http://support.microsoft.com/kb/2429050

I removed it, and my BizTalk came back to life..... PHEW!

Saturday, July 30, 2011

Encoded Multi Part Messages to SOAP adapter

I had one of those old, non-standards-conforming web services I needed to call. The ones you really hate, and wish they'd just upgrade to WCF... but hey, that's 90% of integration - old, rotten systems... so I persisted.

It did accept XML messages, however it was very fussy: it seemed to employ the good old "no XML parser" approach - more of a text-based, string-manipulation XML decoder - where if you send perfectly valid XML, it will fail on constructs they didn't support. This kind of set-up is... sadly... quite common....

It would not handle XML tags that had no content but had a closing tag.

EG: <salary currency="en-AU"></salary>
It needed to have them encoded differently:
<salary currency="en-AU"/>
Now both of these examples are valid XML, however the good old string parser went the easy route and didn't cater for an end tag. The problem is that you can't exactly tell BizTalk not to emit the end tag; it's going to produce valid XML, and that's what it does....

Furthermore... it also complained about white space in my XML message.... ??? you say???
<salary currency=""></salary> <standout logoid="1111"></standout>
The white space it didn't like was between the tags: </salary>[--- white space ---]<standout>

Now how on earth do you cater for this?

A custom pipeline component to the rescue: just before sending the message, I would "fix up" the XML by removing the end tags and the white space between tags, using my old favourite, C#.NET. So I went to the trouble of making a custom pipeline component, hooked it up to a custom send pipeline, and attached the pipeline to the send port.
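The actual component was written in C#; a rough sketch of the same fix-up logic, shown here in Python purely for illustration (the `fix_up` helper is hypothetical, and a regex approach like this assumes simple markup with no ">" inside attribute values), would be:

```python
import re

def fix_up(xml: str) -> str:
    """Mimic the pipeline fix-up: collapse empty <tag ...></tag> pairs
    into self-closing <tag .../> form, then strip the white space
    runs sitting between adjacent tags."""
    # <salary currency="en-AU"></salary>  ->  <salary currency="en-AU"/>
    xml = re.sub(r"<(\w+)([^>]*)></\1>", r"<\1\2/>", xml)
    # remove white space between a closing '>' and the next '<'
    xml = re.sub(r">\s+<", "><", xml)
    return xml

print(fix_up('<salary currency="en-AU"></salary>   <next>x</next>'))
# -> <salary currency="en-AU"/><next>x</next>
```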

BANG, nothing happens... it still fails for the same reason.

At this point I'm a little annoyed, and more p'd off than I'd care to mention. I spent the next hour trying to figure out why......

Firstly, the web service is a SOAP web service, and I have to use the SOAP adapter because it's a multipart message. I checked with Yossi (http://blog.sabratech.co.uk/2009/08/biztalk-wcf-adapter-and-multipart.html) to discover that even if I went to WCF and fudged a WCF-HTTP port to work, it would not make much difference.

I then tried to pump the message out to the file system, using a send port in the orchestration. I got a message with only the first part. It threw me for a second, until I realised that the FILE adapter does not support multipart messages. Weird, considering all messages are multipart messages - most with one part - but I prevailed.

How do I see what is going to the adapter? I really needed to see the message. I mean, I had tested my component and it was good; it works and does the trick. It's .NET, so of course I have a .TEST project to test it...

How... Ah-ha... You can stop the send port - not unenlist it, but stop it - and the message will still go to the port, but sit there waiting for you.

I could then see the message from the admin console - ALL parts of the message, a much better way to debug messages BTW. I noticed the message part had been encoded: the multipart message takes two strings, the second of which is my XML message. So the message is wrapped in a <string> tag, with the contents... no longer in XML format; they are HTML-encoded. My lovely <salary> tag is now a horrid &lt;salary&gt;, and the XML is hardly readable.... I cringe, it's destroyed my lovely XML...

Now I could see what was going on: before it even gets to my custom pipeline, my XML is encoded. Therefore my pipeline, which is looking for XML, does not find any, and does not work.... UGH!
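In other words, by the time the pipeline saw the part, the markup had been entity-encoded inside the <string> wrapper, so nothing looked like XML any more. A quick illustration (Python here, purely for demonstration - the sample string is made up):

```python
import html

# roughly what the admin console showed inside the <string> wrapper
encoded = "&lt;salary currency=&quot;en-AU&quot;&gt;&lt;/salary&gt;"

# decoding the entities restores the actual markup the pipeline expected
decoded = html.unescape(encoded)
print(decoded)   # -> <salary currency="en-AU"></salary>
```

A fix-up component matching on literal `<` and `>` finds nothing to do in the encoded form, which is exactly why the pipeline appeared to do nothing.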

Interestingly, when I assign the value of the XML to the outbound message, I do it from a message assignment shape - I have the XML in my grasp... So I did a cheeky thing in BizTalk: turn the message into an XmlDocument, which I can turn into text. Luckily I had written the pipeline using a separate class for the tag and white space cleaning, so I could call its methods from the message assignment shape, and process, clean, and massage the XML there.

I then set the XmlDocument back up by loading the NEW XML into it, and set the value of the parameter of the outbound message to the XmlDocument.

The message is now cleaned before it gets encoded and goes to the port. I sent the message to the port and voila - a response message, in XML... which tells me it worked.

This is a bit of a hack and a workaround; it does work, however I would have preferred the pipeline to work.

I note with interest that only the SOAP adapter is able to process this multipart message, because it's an OLD SOAP web service. I did hear talk of scrapping the SOAP adapter; however, multipart messages are VERY common on SOAP web services, and it would be foolish to scrap it. I'd like to see support for multipart messages in the WCF set of adapters, with more backwards-compatible support, so I don't have to use an outdated SOAP adapter.

Thursday, June 16, 2011

Exception: System.EnterpriseServices.TransactionProxyException

I get an exception when configuring BizTalk 2010.

BizTalk application server.
Separate SQL server for the databases.

Exception: System.EnterpriseServices.TransactionProxyException

Looking at http://support.microsoft.com/kb/293799

When a System.EnterpriseServices.TransactionProxyException exception is triggered during a transaction completion, it cannot be caught from other application domains. Instead, you receive a System.Runtime.Serialization.SerializationException exception that resembles the following:

Unhandled Exception: System.Runtime.Serialization.SerializationException: Type 'System.EnterpriseServices.TransactionProxyException' in Assembly 'System.EnterpriseServices, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' is not marked as serializable.

This gave me a hint to the problem, however it was not the solution… or anywhere near it. Be careful following this; I did not.

I saw that the databases were created during the configuration and then deleted, leaving only the management database, in a weird state. So the communication to the database server was fine.

I did some more investigation and found the dtctester tool… great for testing whether my DTC was set up correctly, as this is one thing that BizTalk leverages extensively.

I ran the 'dtctester' tool, and it came up with the following:

Executed: dtctester.exe
DSN: PMS
User Name: *******
Password: ********
tablename= #dtc16032
Creating Temp Table for Testing: #dtc16032
Warning: No Columns in Result Set From Executing: 'create table #dtc16032 (ival int)'
Initializing DTC
Beginning DTC Transaction
Enlisting Connection in Transaction
Error:
SQLSTATE=25S12,Native error=-2147168242,msg='[Microsoft][ODBC SQL Server Driver]
Distributed transaction error'
Error:
SQLSTATE=24000,Native error=0,msg=[Microsoft][ODBC SQL Server Driver]Invalid cur
sor state
Typical Errors in DTC Output When
a. Firewall Has Ports Closed
-OR-
b. Bad WINS/DNS entries
-OR-
c. Misconfigured network
-OR-
d. Misconfigured SQL Server machine that has multiple netcards.
Aborting DTC Transaction
Releasing DTC Interface Pointers
Successfully Released pTransaction Pointer.

Ok, so my problem was DTC….

I found a great response to a similar problem that told me to look at a couple of DTC issues. (http://dbaspot.com/sqlserver-server/215054-msdtc-doesnt-seem-work.html)

http://support.microsoft.com/kb/817064
http://support.microsoft.com/kb/301600

If you are running Win2K3 SP1, you will need to reset the DTC security parameters:

http://support.microsoft.com/kb/899191

Since you are beginning the transaction on a remote host, that host will also need to properly configure its DTC services.

Finally, if the communication must transit any firewalls, then you will need to restrict RPC ports on both Client and Server for Internet Ports, and then authorize these Ports in the firewall ACLs.

http://support.microsoft.com/kb/300083
http://support.microsoft.com/kb/250367/EN-US

I checked the firewall - no problem. I checked the other things, and made a change to the DTC security, not via the method mentioned but via Component Services.

Everyone should know by now to change the security settings under Component Services for the DTC, on both the SQL server and the BizTalk server.

Go to Component Services/My Computer/Distributed Transaction Coordinator/Local DTC

Properties/security

Ensure that everything in here is checked, (Remote admin is optional)

I usually select no Authentication required, and ensure that the network service account is running the service.

I did this, thinking I’ve found the problem…..

Nope same problem re occurs…

I found out that the SQL server was clustered, so I went to the cluster manager, looked at the DTC settings, and opened the Local DTC from there - the same thing; all the settings were correct because I had already changed them.

Then I noticed CLUSTER DTC, expanded that and managed that…. The cluster has its OWN instance of DTC; of course, it does not use the local DTC at all.

I looked at the security settings of the cluster DTC, changed those to match, and voila… it works…. Nice to know when the SQL server is clustered…

Always look at DTC if your BizTalk will not install against a remote SQL server.

Thursday, April 21, 2011

Your WCF-SQL adapter is not coming up after you have installed it in BizTalk?

You installed the WCF LOB Adapter SDK? From the ASDK folder on the installation CD.

You installed the Adapter Pack? From the AdapterPack folder on the installation CD.

You did this: http://soa-thoughts.blogspot.com/2010/08/wcf-sql-adapter-table-operations.html

Which version did you install? x86 or x64? Were they both x86, or both x64?

Still not coming up... What version of BizTalk did you install? x86 or x64?

If you installed x86 BizTalk, then you need to install the x86 version of both of these, and then... go to the BizTalk Administration Console, Platform Settings, Adapters, right-click and select New Adapter...

See this again: http://soa-thoughts.blogspot.com/2010/08/wcf-sql-adapter-table-operations.html

Do you NOW see WCF-SQL adapter there ??

If in doubt, install x86.... Handy hint....

Wednesday, March 30, 2011

Exam 70-595 BizTalk 2010 Exam Released

After much pushing, and some prodding of the people who look after exams at Microsoft, several members of the community got involved with Microsoft to create a BizTalk 2010 exam.

It has now been released... have a look here

The more interesting thing to note: this exam will be quite different from previous exams on this topic, as the community created it - the community that knows the product in depth.

Just look at the main areas:

Configuring a Messaging Architecture (20 percent)
Developing BizTalk Artifacts (20 percent)
Integrating Web Services and Windows Communication Foundation (WCF) Services (14 percent)
Implementing Extended Capabilities (13 percent)
Deploying, Tracking, and Supporting a BizTalk Solution (16 percent)

There is a large focus on architecture and actual development, without forgetting deployment and support of BizTalk solutions - an area that was sadly lacking any real content, yet significantly needed.

I have passed previous exams on BizTalk, and was really pushing for an exam that was not like the others and set the mould for new exams.

Go on, get out there and have a go... here

Sunday, March 20, 2011

To the Cloud......

I have just returned from the MVP summit, held in Redmond each year. There were some highlights and some lowlights, as with each summit. The highlights are always all the cool new stuff, most of which I'm not allowed to talk about publicly - which is great (NOT), as it means I can't say much on the blog. I can only say what has already been announced publicly at PDC: Microsoft is doing the cloud thing, and the next thing off the conveyor belt is composite apps. What's in the box? Wait and see. I will say it's interesting - and then more interesting when you add in the comments of the MVPs present when they told us how interesting it was.

We are talking about the cloud here, and of course I want to run apps on it. I want to run workflow in the cloud; I've wanted this since they had it a few years back and then took it away because it was so limited. It makes sense, in the right scenario.

I also want to access my applications inside the organisation (on premise), provide a rich integration layer to them, to enable my cloud apps to communicate with my on premise systems.

This kind of application is called a hybrid model, and it is/will become very common. I would like to use the same technology in the cloud for the whole integration and workflow as I use to access my on-premise applications. Currently I use BizTalk for my on-premise applications and then have to write something different to enable my cloud applications to do this - hence the term "hybrid": it's using a bit of both. This is currently possible by various means: the Service Bus in Windows Azure, and the new bits that enable BizTalk to expose a port or orchestration on the Service Bus to accept connections. I can then establish a communications pattern into my organisation's "legacy" on-premise applications.

My problem with this approach is detailed in my recent webcast at http://www.cloudcasts.net/Default.aspx?category=BizTalk+Light+and+Easy: my on-premise middleware still needs to exist, and it needs to scale in line with my cloud system. It's not 1:1 - more like 2-3 (cloud instances) to 1 (on-premise) - but it needs to scale, hence I still need to invest in on-premise hardware. However, I don't want to have to scale it; I want to leverage the cloud to scale on demand, and scale back when I don't need it. It's one of the key selling factors for using the cloud.

Whilst I cannot put everything in the cloud - it's never going to happen - I want the option of scaling to the cloud and then scaling back to on-premise when I have low load levels, hence justifying my on-premise costs.

I do have this for websites in the cloud; this is a little more difficult for an integration platform that needs to access legacy systems and is written in a non-cloud friendly way.

I would love to provide this in the cloud, but this is one ask that is some time away, whichever provider you look at. My view is that whoever cracks this will dominate the cloud market.

The rest of the detail will come…. I don't know when, and I can't say how, but it'll come - wait and see, with more announcements coming…. It's how you leverage the cloud to work for you that will make the real difference in adopting a cloud/hybrid model or not.

Integration is a hard enough sell; adding cloud to the mix makes it even harder. I'm not the only one out there trying this on: customers are not buying yet, and the amount of convincing, assurance and explaining needed is staggering.

Sunday, February 6, 2011

BizSpark Event Results

Following the BizSpark event, I wanted to recap the results, and more importantly the criteria for the event.

One of them was the number of Windows Azure investments used. This was perhaps one of the major ones; there were perhaps three top criteria.

Our solution was 100% hosted and built for Windows Azure, it used:

Web Roles x 4, (Two Fully working Websites)
Web Service(s) x 2, including hosting of Windows Workflow in the cloud. A https wcf endpoint for doing credit checks.

We used and had working by the end of the 2 days:

Azure Blob storage
Azure Table Storage
Azure Queue Storage

We even had a Windows Phone 7 application that worked with our application.

Our solution worked end to end, had quite a complex architecture and did what we said it would do.

So we were on the ball with using the technology required, and it worked.

I'm not being biased here, but from a look at the other presentations, our solution used the most Azure features, was actually built in the two days, and was actually running on Windows Azure.

The winning entries had all been developed prior, one of which was not even running on Windows Azure. It did look nice, was much more complete than any of the other solutions, and had several months of development from two people who formerly worked for Microsoft, developing the platform on which the application lived.

Given several of the judges were from Microsoft this did not sit well.

Don't get me wrong, I liked the two entries that came first and second, their solutions and concepts were great.

I think future competitions need to be fair. Given that the majority of people had not developed a Windows Azure application before, and that their presentations were not fully working - as expected - they should be judged on this factor alone. If you want to bring along something you have spent several months developing and put it forward, then you should sit in a different category.

Fair is Fair, this event was not fair and those people who worked on the solutions may not have such a great first experience with Windows Azure.

Friday, February 4, 2011

Windows Azure BizSpark Event

We are here at the BizSpark event, working on a solution that we have only two days to develop. We have to get it done and presented in less than this time.

We are working hard at it, and have two services running, with https endpoints, as the data is personal.

Twitter is working hard with #WABizCamp, to make some noise....

Let’s see how we go, we have a great plan and a lot of work to go.....

Sunday, January 30, 2011

The Future of Middleware?

Today I read a rather interesting and profound statement in the recently released book "BizTalk 2010 Recipes" by Mark Beckner.

It was so interesting that I'd like to share it with you. It speaks of the future, and I tend to agree with it: "in the decade ahead, middleware will be more important and relevant than ever before."

"Why does middleware like this have such staying power? You’d think that newer advances in technology like web services, SOA, and software as a service (SaaS) would render applications much more inherently interoperable and that the pain and complexity of systems integration would be a thing of the past.

The truth is that enterprises of all sizes still experience tremendous cost and complexity when extending and customizing their applications. Given the recent constraints of the economy, IT departments must increasingly find new ways to do more with less, which means finding less expensive ways to develop new capabilities that meet the needs of the business. At the same time, the demands of business users are ever increasing; environments of great predictability and stability have given way to business conditions that are continually changing, with shorter windows of opportunity and greater impacts of globalization and regulation. These factors all put tremendous stress on IT departments to find new ways to bridge the demanding needs of the users and businesses with the reality of their packaged applications.

This leads back to the reason why middleware—certainly not sexy as technologies go—continues to deliver tremendous value to both businesses and IT departments. As the technology’s name suggests, it sits in the middle between the applications you use and the underlying infrastructure; this enables IT departments to continue to innovate at the infrastructure level with shifts like many-core processing, virtualization, and cloud computing. Instead of having to continually rewrite your LOB applications to tap into infrastructure advances, you can depend on middleware to provide a higher level of abstraction, so you can focus your efforts on writing the business logic, not plumbing code. Using middleware also helps future-proof your applications, so that even as you move ahead to the next-generation development tools and platforms (including the current trends toward composite applications and platforms as a service), you can still leverage the existing investments you’ve made over the years.

So, in the decade ahead, middleware will be more important and relevant than ever before. "

Burley Kawasaki

Director of Product Management, Microsoft Corporation