Wednesday, July 23, 2014

MuleSoft dot NET

For those of you who have been keeping your AnyPoint Studio up to date, you may have been pleasantly surprised this week.  The reason?  MuleSoft released two important connectors for customers who leverage the Microsoft platform in their architectures.  (You can read more about the official announcement here.)

More specifically, the two capabilities that were released this week include:

  • .NET Connector
  • MSMQ Connector


The MSMQ Connector is self-explanatory, but what is a .NET Connector?  The .NET Connector allows .NET code to be called from a Mule Flow.

Why are these connectors important? For some, hating Microsoft is a sport, but the reality is that Microsoft continues to be very relevant in the Enterprise.  In case you missed their recent earnings, they made $4.6 billion in net income in their past quarter…yes, that is a ‘b’, and yes, that was for a single quarter of the year.

Many customers continue to use MSMQ.  Sometimes these are custom .NET solutions that use MSMQ to add some durability to their messaging.  Sometimes they are legacy applications in maintenance mode, but not always.  Other use cases include purchasing a COTS (Commercial Off The Shelf) product that has a dependency on MSMQ.

While the MSMQ Connector is a nice addition to the MuleSoft portfolio of Connectors, the .NET Connector is what really gets me excited.  I have been using .NET since the 1.1 release and am very comfortable in Visual Studio.

Many organizations have standardized on building their custom applications in .NET.  I have worked for such companies in the past, and for many of them, programming in another language is a showstopper.  There may be concerns about re-training, interoperability and productivity as a result of introducing new programming languages. Some people may consider this fear mongering, but the reality is that if you have a strong Enterprise Architecture practice, you need to adhere to standards. While some people are willing to introduce many different languages into an environment, others are not.

The AnyPoint Platform, combined with the ability to write any required integration logic in .NET, is very powerful for organizations that want to leverage their .NET skill sets.

How do you invoke .NET code from a Mule Flow? There are many resources being made available as part of this release, so I don’t want to spoil that party (see the conclusion for more resources).  But let me provide a sneak peek. For those of you who may not be familiar with MuleSoft, we have the ability to write Mule Flows.  You can think of these much like a Workflow, or an Orchestration for my BizTalk friends.  On the right-hand side we have our palette, from which we can drag Message Processors or Connectors onto our Mule Flow.


Once our Connector is on our Mule Flow, we can configure it.  We need to provide an Assembly Type, an Assembly Path (which can be relative or absolute), a Scope and a Trust level.  This configuration is considered a Global Element, and we only have to configure it once per .NET assembly.


Next we provide the name of the .NET Method that we want to call.


From there it is business as usual from a .NET perspective.  I can send and receive complex types, JSON, XML documents, etc.
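To make that concrete, here is a minimal sketch of the kind of plain .NET class the connector can invoke (the class, method and type names are mine, purely for illustration):

public class OrderService
{
    // A simple complex type; the connector takes care of marshalling
    // the Mule payload into the parameter and the return value back to the flow.
    public class Order
    {
        public string OrderId { get; set; }
        public decimal Amount { get; set; }
    }

    // Reference this method by name in the .NET Connector configuration.
    public string GetOrderStatus(Order order)
    {
        return "Order " + order.OrderId + " for $" + order.Amount + " is in process";
    }
}

Compile it into an assembly, point the Global Element at that assembly, and the flow can invoke GetOrderStatus much like any other message processor.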


Conclusion

Hopefully this gives you a little taste of what is to come.  I have had the opportunity to work with many Beta customers on this functionality and am very excited with where we are and where we are headed.  What we are releasing now is just the beginning.

Stay tuned for more details on both the MSMQ and .NET Connectors.  Now that these bits are public I am really looking forward to sharing this information with both the Microsoft and MuleSoft communities.

Other resources:

  • Press Release
  • MuleSoft Blog Post including two short video demos and registration link for an upcoming Webinar.

BTW: If this sounds interesting to you, we are hiring!!!

Sunday, June 1, 2014

My MuleSoft Connect 14 Recap

This past week over 1200 attendees descended upon San Francisco to attend Connect and APICon (a sister conference).  If you are interested, the company recap of Connect is available here.  The purpose of this post is to highlight some of my thoughts from the event.

This was my first time attending Connect, and I had a great time chatting with attendees, customers and other Muleys about everything currently going on in the industry.  I also had the opportunity to co-present two sessions. Whenever I speak at a conference like this I generally provide a post-conference write-up, so I figured I would take this opportunity to share some of the details from my sessions.

The Connected Insurer

In this session I shared the stage with Aaron Landgraf (Product Marketing Manager), and we discussed some of the themes playing out in today’s insurance companies.

The first point is that the Insurance landscape is changing:

  • Organizations are trying to differentiate themselves via customer service.  This may include the ability for customers to self-serve via the web or mobile.
  • Omni-channel (or multi-channel) engagements.  No longer do customer interactions occur within one particular channel; they may start over social media, continue over email and conclude through the web.
  • Lastly, there should be a single (or 360-degree) view of the customer.  Regardless of the system being used to service the customer, all contextual information needs to be available within that view.  This information may be pulled from a variety of systems in real time and aggregated while setting the appropriate context.


Another aspect of the presentation was to discuss some of the current challenges within this industry vertical.

  • Customers are looking for improved customer service while organizations are trying to reduce costs.
  • Any new solution will likely need to interface with existing (or legacy) assets.
  • Delivering solutions needs to be done in a very timely manner.


Next I discussed a high-level architecture that addresses some of these challenges: RESTful APIs modeled in RAML, fronting a service orchestration layer hosted by Mule ESB that is capable of orchestrating back-end services.


I also showed an actual implementation of this architecture; more on that solution towards the bottom of the post.


Integrating the Heterogeneous Enterprise

I co-presented the next session with Ken Yagen (VP, Products), who walked us through some trends occurring within the industry, including:

  • 36% of IT’s time is spent maintaining legacy systems, and 67% of global execs said IT will spend the same or more time on these systems.
  • Cloud services are used to augment current services: 65% of CIOs have deployed cloud solutions as a way to bolster existing services.
  • Enterprise mobility: 45% of organizations will spend at least $500K on mobility projects over the next 12 to 18 months.

The reality is that organizations are being forced to integrate heterogeneous environments.  No longer can organizations say we are an “ABC” shop or an “XYZ” shop.  While Enterprise Architectures may push for reducing technical diversity, the increasing adoption of Mobility, SaaS, Cloud and Social platforms is disrupting traditional architectures.  As a result, a flexible yet comprehensive platform is required to address these needs.  At MuleSoft, we feel that our AnyPoint Platform is the answer when integrating heterogeneous environments.


Demo

I put together a demo that addresses many of the challenges and trends discussed across these two presentations, including:

  • Mobility
  • RESTful APIs (modeled in RAML)
  • Service Orchestration via MuleESB
  • Integrating Heterogeneous systems (SQL Server, WCF Services, .NET and SalesForce)
  • Some of the new features of the AnyPoint Platform May 2014 release, including Scatter-Gather, the Web Service Consumer and the enhanced Database connector with DataSense support

Screenshots shown during the demo included the Windows 8 mobile app (built using C#/XAML), the RAML definition, and the AnyPoint Studio – Mule ESB solution (well, part of it – it didn’t fit in one screenshot).

Conclusion

Overall it was a great week with a lot of interesting sessions and conversations. Whether or not you were able to attend, the good news is that the London edition of Connect is right around the corner, running September 24th–26th. You can sign up for more details here.

Saturday, May 17, 2014

Speaking at Connect 2014

On May 27th–29th in San Francisco, MuleSoft is hosting Connect, one of the largest integration events of the year.  There is an impressive list of speakers, including:

  • Greg Schott, CEO MuleSoft
  • Ross Meyercord, CIO Salesforce
  • John Collison, Co-Founder Stripe
  • Ross Mason, Founder MuleSoft
  • Ben Haines, CIO BOX
  • Uri Sarid, CTO MuleSoft

I am fortunate to have the opportunity to participate in the event.  I am speaking in the Insurance track and also co-presenting on Integrating heterogeneous environments.

In the Integrating Heterogeneous Environments session I will be demonstrating some of the investments that MuleSoft has been making in the area of Microsoft integration, and more specifically some of the work we have been doing with WCF and .NET.

Without giving away too many details, there will be a healthy dose of RAML, APIs, SalesForce, .NET, REST/JSON, WCF and Mobile in the demos.


So if you haven’t signed up, there are still some spots available.  You can do so here.

Sunday, April 13, 2014

Learning Mule ESB

 

I recently joined MuleSoft and have had quite a few people ask me how they can get started with the platform.  These people typically have some integration experience on other technology stacks and are curious about the buzz that MuleSoft is creating in the industry.  Instead of copying and pasting from email to email I figured that I would put together a blog post that identifies some beneficial resources for learning Mule ESB.  I will try to keep this post up to date as new material emerges.

Before getting started with any of the learning resources, you will need to download the Mule ESB platform.  MuleSoft provides a free Community Edition that allows you to build and run Mule applications.

In addition to a Community Edition (CE), a commercial product called Enterprise Edition (EE) also exists that provides some additional Enterprise features.

The Bits

Mule ESB Community Edition
– Free download of the community edition of the software including the Mule AnyPoint Studio IDE for developing interfaces for the Mule Platform.  These tools can be run on Windows, Mac and Linux.
 
Tutorials

First 30 Minutes with Mule
– An introduction to the platform and simple walkthrough of your first Mule Application.

First Day with Mule
– Some more concepts are introduced including Message States, Global Elements, Content Based Routing and Connector Tutorials.

First Week with Mule
– Some more advanced concepts are introduced including Application Architecture, Security and Extending Mule.
 
Video Clips/Webcasts

Mule 101: Rapidly Connect Anything, Anywhere – Discover MuleSoft’s DataMapper, 120+ out-of-box connectors, development tools and deployment.

Mule 201: Develop and manage a hybrid integration application – Learn about Legacy Modernization, Service Orchestration and Connectors. Also learn to deploy your Mule Applications and manage/monitor them through the Mule Management Console.

MuleSoft’s YouTube Channel – Find a lot of short demonstrations and promotional material.  Demonstrations include SAP, SalesForce, Marketo, LinkedIn, Amazon S3, Hadoop, Netsuite, Twitter and many more.
 
Books 

These are the books that I have read.  I can confidently say that I learned something from each of them.

Mule ESB Cookbook – I started with this one and found easy-to-follow walkthroughs of common integration scenarios.

Getting Started with Mule Cloud Connect – This book focuses more on the Cloud and SaaS connectors.  It is a good read, but I would suggest getting some more fundamental learning taken care of first, then digging into these topics.

Mule in Action, Second Edition – This is the most comprehensive book of the three.  It gets into a greater level of detail than the cookbook and walks you through some rich examples.
 
Blog Posts

Here are a few walkthroughs that I put together as I began my Mule journey.

Exposing Simple REST Service – As the title suggests, a simple REST Service.

Exposing SQL Server Data as HTTP Endpoint – This post will demonstrate how to expose SQL Operations through HTTP and return responses in JSON format.

Exploring Mule ESB SFTP Adapter – Since I have used the SFTP adapter on other platforms I was curious to take a peek at MuleSoft’s solution.

Twitter Integration – A quick look at the MuleSoft Twitter connector, which allows you to interact with the Twitter API in a very elegant way.  In this example I update my Twitter status via Mule ESB.
 
.Net Resources

On this blog you will, without doubt, find a lot of Microsoft-related content.  MuleSoft is a company driven to connect any system to any device on any platform, and there are activities in the pipeline to better support .NET and other Microsoft products/services like SharePoint, Dynamics, Azure, etc.  With this in mind, I figured I would include a few links that may be of interest to people who are interested in integrating Microsoft technologies.

Connect .NET to anything, anywhere – Whitepaper

.NET Connectivity – Article 

In addition to what you will find in those articles here are some of the ways that Mule ESB integrates with Microsoft technologies.

Mule ESB Anypoint Connectors for Microsoft platforms

  • MSMQ
  • AMQP
  • Active Directory
  • SOAP/WS-* (WCF interoperability)
  • REST (ASP.NET WebAPI interoperability)
  • SharePoint
  • SQL Server
  • Microsoft Dynamics GP
  • Dynamics CRM
  • Dynamics Online
  • Excel/CSV
  • Yammer

 

 

Lastly, I wanted to mention an upcoming event in San Francisco where you will be able to learn more about the .NET investments and other areas of focus for MuleSoft.


Saturday, February 15, 2014

European Tour 2014

As I look at the calendar and see that some important dates are quickly approaching, I thought I had better put together a quick blog post to highlight the events that I will be speaking at in early March.

I will be using the same content at all events but am happy to talk offline about anything that you have seen in this blog or my presentation from Norway this past September.

The title of my session this time around is Exposing Operational Data to Mobile Devices using Windows Azure, and here is the session’s abstract:

In this session Kent will take a real world business scenario from the Power Generation industry. The scenario involves real time data collection, power generation commitments made to market stakeholders and current energy prices. A Power Generation company needs to monitor all of these data points to ensure it is maintaining its commitments to the marketplace. When things do not go as planned, there are often significant penalties at stake. Having real time visibility into these business measures and being notified when the business becomes non-compliant becomes extremely important.
Learn how Windows Azure and many of its building blocks (Azure Service Bus, Azure Mobile Services) and BizTalk Server 2013 can address these requirements and provide Operations people with real time visibility into the state of their business processes.

London – March 3rd and March 4th

The first stop on the tour is London, where I will be speaking at BizTalk360’s BizTalk Summit 2014.  This is a two-day paid conference, which has allowed BizTalk360 to bring in experts from all over the world: speakers from Canada (me), my neighbor the United States, Italy, Norway, Portugal, Belgium, the Netherlands and India, including many Integration MVPs and the product group from Microsoft.

There are still a few tickets available for this event, so I would encourage you to act quickly to avoid disappointment.  This will easily be the biggest Microsoft Integration event in Europe this year, with a lot of new content.


Stockholm – March 5th

After the London event, Steef-Jan Wiggers and I will be jumping on a plane and heading to Stockholm to visit our good friend Johan Hedberg and the Swedish BizTalk User Group.  This will be my third time speaking in Stockholm and fourth time speaking in Scandinavia.  I really enjoy speaking in Stockholm and am very much looking forward to returning to Sweden.  I just really hope that they don’t win the gold medal in men’s hockey at the Olympics; otherwise, I won’t hear the end of it.

I am also not aware of any Triathlons going on in Sweden at this time so I should be safe from participating in any adventure sports.

At this point an Eventbrite is not available, but watch the BizTalk User Group Sweden site or my Twitter handle (@wearsy) for more details.


Netherlands – March 6th

The third and last stop on the tour is the Netherlands, where I will be speaking at the Dutch BizTalk User Group.  Steef-Jan Wiggers will also be speaking, as will René Brauwers.  This will be my second trip to the Netherlands but my first time speaking there. I am very much looking forward to coming back to the region to talk about integration with the community and sample Dutch pancakes, stroopwafels and perhaps a Heineken (or two).

The Eventbrite is available here, and there is no cost for this event.


See you in Europe!

Wednesday, January 1, 2014

2013–Year in Review and looking ahead to 2014

With 2014 now upon us I wanted to take some time to reflect on the past year.  It was an incredible and chaotic year but it was also a lot of fun!  Here are some of the things that I was involved in this past year.

MVP Summits

This year there were two MVP Summits: one in February and another at the end of November.  MVP Summits are great opportunities on a few different levels.  First off, you get to hear what is in the pipeline from the product groups, but you also get to network with your industry peers. I find these conversations incredibly valuable, and the friendships that develop are pretty special.  Over time I have built a worldwide network of so many quality individuals that it is actually mind-blowing.

(Pictures from the February MVP Summit: at the attendee party at CenturyLink Field; dinner with the product group and other MVPs.)

(Pictures from the November Summit: at Lowell’s in Pike Place Market for our annual integration breakfast prior to the Seahawks game; a portion of the Berlin Wall with Steef-Jan on the Microsoft campus; dinner at Moksha, our favourite Indian restaurant in Bellevue; at Steef-Jan’s favorite donut shop in Seattle prior to the BizTalk Summit.)

Speaking

This year I had a lot of good opportunities to speak and share some of the things that I have learned.  My first stop was in Phoenix at the Phoenix Connected Systems Group in early May.

The next stop was in Charlotte, North Carolina where I presented two sessions at the BizTalk Bootcamp event.  This conference was held at the Microsoft Campus in Charlotte.  Special thanks to Mandi Ohlinger for putting it together and getting me involved.


Soon after the Charlotte event I headed to New York City, where I had the opportunity to present at the Microsoft Technology Center (MTC), alongside the product group and some MVPs, to some of Microsoft’s most influential customers.


The next stop on the “circuit” was heading over to Norway to participate in the Bouvet BizTalk Innovation Days conference.  This was my favourite event for a few reasons:

  • I do have some Norwegian heritage, so it was a tremendous opportunity to learn about my ancestors.
  • It was another opportunity to hang out with my MVP buddies from Europe.
  • I don’t think there is a more passionate place on the planet about integration than Scandinavia (Sweden included).  Every time I have spoken there I am completely overwhelmed by the interest in integration in that part of the world.

Special thanks to Tord Glad Nordahl for including me in this event.


After the Norway event I had the opportunity to participate in the 40th annual Berlin Marathon with my good friend Steef-Jan Wiggers. This was the second marathon I have run, and it was a tremendous cultural experience to run in that city.  I also shaved four minutes off my previous time from the Chicago Marathon, so it was a win-win type of experience.


The last speaking engagement was in Calgary in November.  I had the opportunity to speak about Windows Azure Mobile Services, Windows Azure BizTalk Services and SAP integration at the Microsoft Alberta Architect forum.  It was a great opportunity to demonstrate some of these capabilities in Windows Azure to the Calgary community.

Grad School

2013 also saw me returning to school! I completed my undergrad degree around 12 years ago and felt I was ready for some higher education.  I have had many good opportunities for growth in my career, but always felt that it was my technical capabilities that created those leadership and management opportunities.  At times I felt like I didn’t have a solid foundation when it came to running parts of an IT organization, and that I could benefit from additional education.  I don’t ever foresee a time when I am not involved in technology; it is my job, but it is also my hobby. With this in mind I set out to find a program focused on the “management of technology”.  I didn’t want a really technical Master’s program, nor a full-blown business Master’s program; I wanted a blend of the two.  After some investigation I found a program that really suited my needs: Arizona State University’s MSIM (Master of Science in Information Management) through the W.P. Carey School of Business.

In August 2013, I headed down to Tempe, Arizona for student orientation, where the 57 other students in the program and I received detailed information about the program.  We were also assigned to groups of four or five people with whom we will work closely over the course of the 16-month program.  There are two flavors of the program: you can either attend in person at the ASU campus or participate in the online version.  Living in Calgary, I obviously chose the remote program.

One thing that surprised me was the number of people from all over the United States in this program.  There are people from Washington State, Washington DC, Oregon, California, Colorado, New Mexico, Texas, Indiana, New York, Georgia, Vermont, Alabama, Utah and, of course, Arizona. When establishing groups, the school tries to place you with people in the same time zone; my group consists of people from Arizona, which has worked out great so far.  This is a real benefit of the program, as everyone brings a unique set of experiences, which has been really insightful.

I just finished my third course (of 10) and am very pleased with my choice of program.  Don’t get me wrong, it is a lot of work, but I am learning a lot and really enjoying the content of the courses.  The three courses I have taken so far are The Strategic Value of IT, Business Process Design, and Data and Information Management.  My upcoming course is on Managing Enterprise Systems, which I am sure will be very interesting.

If you have any questions about the program feel free to leave your email address in the comments as I am happy to answer any questions that you have.


Books

Unfortunately this list is going to be quite sparse compared to the list that Richard has compiled here, but I did want to point out a few books that I had the opportunity to read this past year.

Microsoft BizTalk ESB Toolkit 2.1

2013 was a slow year for new BizTalk books, in part due to the spike of books in 2012 and also the nature of the BizTalk release cycle. However, we did see the Microsoft BizTalk ESB Toolkit 2.1 book released by Andres Del Rio Benito and Howard Edidin.

This book comes in Packt Publishing’s new shorter format.  Part of the challenge with writing books is that it takes a really long time to get the product out; in recent years Packt has tried to shorten this release cycle, and this book falls into that new category.  The book is approximately 130 pages long and is the most comprehensive guide to the ESB Toolkit available.  I have not seen another resource with as much detailed information about the toolkit.

Within this book you can expect to find 6 chapters that discuss:

  • ESB Toolkit Architecture
  • Itinerary Services
  • ESB Exception Handling
  • ESB Toolkit Web Services
  • ESB Management Portal
  • ESB Toolkit version 2.2 (BizTalk 2013) sneak peek.

If you are doing some work with the ESB Toolkit and are looking for a good resource, then this is a good place to start. (Amazon)


 

The Phoenix Project: A Novel about IT, DevOps and Helping your Business Win

I was made aware of this book via a Scott Gu tweet, and boy, it was worth picking up.  It reads like a novel, but there are a lot of very valuable lessons embedded within it.  It was so relevant to me that I could have sworn I had worked with the author before, because I had experienced so much of what it describes.  If you are new to a leadership role, or are struggling in one, this book will be very beneficial to you. (Amazon)


 

Adventures of an IT Leader

This is a book that I read as part of my ASU Strategic Value of IT course.  It is similar in nature to The Phoenix Project and also reads like a novel.  In this case a business leader has transitioned into a CIO position, and the book takes you through his trials and tribulations, really raising the question: is IT management just about management? (Amazon)


The Opinionated Software Developer: What Twenty-Five Years of Slinging Code Has Taught Me

This was an interesting and quick read describing Shawn Wildermuth’s experiences as a software developer.  I love learning about what other people have encountered in their careers, and this provided excellent insight into Shawn’s. (Amazon)


Hard Facts, Dangerous Half-Truths, and Total Nonsense: Profiting from Evidence-based Management

Another book from my ASU studies, and an interesting one.  It does read more like a textbook, but the authors are very well recognized for their work in the business re-engineering space.  The biggest thing I got out of this book was to not lose sight of evidence-based management. All too often technical folks use their previous experiences to dictate future decisions: a particular method worked at a previous company or client, but taking the same approach to a new company or client gives you no guarantee it will work again.  This book was a good reminder that a person needs to stick to the facts when making decisions and not rely (too much) on what has, or hasn’t, worked in the past. (Amazon)


 

2014

Looking ahead, I expect 2014 to be as chaotic and exciting as 2013.  It has already gotten off to a good start with Microsoft awarding me my seventh consecutive MVP award in the Integration discipline.  I want to thank all of the people working in the product group, the support group and the community teams for their support.  I also want to thank my MVP buddies, who are an amazing bunch of people that I really enjoy learning from.


Also, look for a refresh of the (MCTS): Microsoft BizTalk Server 2010 (70-595) Certification Guide book. No, the exam has not changed, but the book has been updated to include BizTalk 2013 content related to the Microsoft BizTalk 2013 partner competency exam.  I must stress that this book is a refresh, so do not expect 100% (or anywhere near that) new content.

Tuesday, December 10, 2013

BizTalk 2013–Integration with Amazon S3 storage using the WebHttp Adapter

I have recently encountered a requirement where we had to integrate a legacy Document Management system with Amazon S3 in order to support a mobile field-worker application.  The core requirement is that when a document reaches a certain state within the Document Management system, we need to publish the file to an S3 instance where it can be accessed from a mobile device.  We will do so using a RESTful PUT call.

Introduction to Amazon S3 SDK for .Net

Entering this solution I knew very little about Amazon S3.  I did know that it supported REST and therefore felt pretty confident that BizTalk 2013 could integrate with it using the WebHttp adapter.

The first thing that I needed to do was to create a developer account on the Amazon platform. Once I created my account, I downloaded the Amazon S3 SDK for .NET. Since I will be using REST, this SDK is technically not required; however, it includes a beneficial tool called the AWS Toolkit for Microsoft Visual Studio.  Within this toolkit we can manage our various AWS services, including our S3 instance.  We can create, read, update and delete documents using this tool, and we can also use it in our testing to verify that a message has reached S3 successfully.


Another benefit of downloading the SDK is that we can use the managed libraries to manipulate S3 objects and better understand some of the terminology and functionality that is available.  A side benefit is that we can fire up Fiddler while using the SDK and see how Amazon forms its REST calls, under the hood, when communicating with S3.
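As a rough illustration (the bucket name, key and local path are placeholders, and the exact API surface varies by SDK version), uploading a document with the managed libraries looks something like this:

using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

class S3UploadSample
{
    static void Main()
    {
        // The credentials are the Amazon Key ID and Secret Access Key discussed below
        var client = new AmazonS3Client("<your_keyId>", "<your_SecretKey>", RegionEndpoint.USWest2);

        var request = new PutObjectRequest
        {
            BucketName = "<your_bucket>",
            Key = "310531500150800.PDF",              // the resource name used later in this post
            FilePath = @"C:\Temp\310531500150800.PDF" // local document to upload
        };

        // Run this with Fiddler capturing to watch the REST call S3 receives
        client.PutObject(request);
    }
}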

Amazon S3 Accounts

When you sign up for an S3 account you will receive an Amazon Key ID and a Secret Access Key. These are two pieces of data that you will need in order to access your S3 services.  You can think of these credentials much like the ones you use when accessing Windows Azure Services.


BizTalk Solution

To keep this solution as simple as possible for this blog post, I have stripped out some of the original components so that we can focus strictly on what is involved in getting the WebHttp adapter to communicate with Amazon S3.

For the purpose of this blog post the following events will take place:

  1. We will receive a message of type System.Xml.XmlDocument.  Don’t let this mislead you; we can receive pretty much any type of message using this message type, including text documents, images and PDF documents.
  2. We will then construct a new instance of the message that we just received in order to manipulate some adapter context properties. You may now be asking: why do I want to manipulate adapter context properties?  The reason is that since we want to change some of our HTTP header properties at runtime, we need to use a Dynamic Send Port, as identified by Ricardo Marques.


    The most challenging part of this Message Assignment Shape was populating the WCF.HttpHeaders context property.  In C# if you want to populate headers you have a Header collection that you can populate in a very clean manner:

    headers.Add("x-amz-date", httpDate);

    However, when populating this property in BizTalk it isn’t as clean.  You need to construct a single string and append all of the related headers together, separating each header attribute onto a new line by appending “\n”.

    Tip: Don’t try to build this string in a helper method.  The \n characters will be encoded, the equivalent values will not be accepted by Amazon, and that is why I have built this string inside an Expression shape.
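    To illustrate, the Message Assignment shape builds the string roughly like this (a sketch; AmazonS3Helper is a hypothetical wrapper for the helper methods shown later in this post):

    // Build the headers inline so the \n characters survive unencoded
    httpDate = AmazonS3Helper.SetHeaderDate();
    msgS3Request(WCF.HttpHeaders) =
        "x-amz-acl: " + AmazonS3Helper.SetAmzACL() + "\n" +
        "x-amz-storage-class: " + AmazonS3Helper.SetStorageClass() + "\n" +
        "x-amz-date: " + httpDate + "\n" +
        "Authorization: " + AmazonS3Helper.SetHttpAuth(httpDate) + "\n" +
        "Content-Type: application/x-pdf";
    // The Expect and Connection entries seen in the tracked message below are added by the transport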

    After I send a message (that I have tracked in BizTalk), I should see an HTTP header that looks like the following:

    <Property Name="HttpHeaders" Namespace="http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties" Value=

    "x-amz-acl: bucket-owner-full-control
    x-amz-storage-class: STANDARD
    x-amz-date: Tue, 10 Dec 2013 23:25:43 GMT
    Authorization: AWS <AmazonKeyID>:<EncryptedSignature>
    Content-Type: application/x-pdf
    Expect: 100-continue
    Connection: Keep-Alive"/>

    For the meaning of each of these headers I will refer you to the Amazon documentation.  However, the one header that does warrant some additional discussion here is the Authorization header, which is how we authenticate with the S3 service.  Constructing this string requires some additional understanding.  To simplify the population of this value I have created the following helper method, which was adapted from a post on StackOverflow:

    // Requires: using System; using System.Text; using System.Security.Cryptography;
    public static string SetHttpAuth(string httpDate)
    {
        string AWSAccessKeyId = "<your_keyId>";
        string AWSSecretKey = "<your_SecretKey>";

        string canonicalString = "PUT\n\napplication/x-pdf\n\nx-amz-acl:bucket-owner-full-control\nx-amz-date:" + httpDate + "\nx-amz-storage-class:STANDARD\n/<your_bucket>/310531500150800.PDF";

        // now encode the canonical string
        Encoding ae = new UTF8Encoding();
        // create a hashing object, using the secret key as the hash key
        HMACSHA1 signature = new HMACSHA1();
        signature.Key = ae.GetBytes(AWSSecretKey);
        byte[] bytes = ae.GetBytes(canonicalString);
        byte[] moreBytes = signature.ComputeHash(bytes);
        // convert the hash byte array into a base64 encoding
        string encodedCanonical = Convert.ToBase64String(moreBytes);
        // finally, this is the Authorization header.
        string AuthHeader = "AWS " + AWSAccessKeyId + ":" + encodedCanonical;

        return AuthHeader;
    }

    The most important part of this method is the following line(s) of code:

    string canonicalString = "PUT\n\napplication/x-pdf\n\nx-amz-acl:bucket-owner-full-control\nx-amz-date:" + httpDate + "\nx-amz-storage-class:STANDARD\n/<your_bucket>/310531500150800.PDF";
                

    The best way to describe what is occurring is to borrow the following from the Amazon documentation.

    The Signature element is the RFC 2104 HMAC-SHA1 of selected elements from the request, and so the Signature part of the Authorization header will vary from request to request. If the request signature calculated by the system matches the Signature included with the request, the requester will have demonstrated possession of the AWS secret access key. The request will then be processed under the identity, and with the authority, of the developer to whom the key was issued.
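    To make the quote concrete, here is how the canonicalString in the helper method above lines up with Amazon’s string-to-sign layout (my annotation, based on the signature version 2 rules):

    // PUT                                    HTTP verb
    // (blank)                                Content-MD5 – not supplied here
    // application/x-pdf                      Content-Type
    // (blank)                                Date – empty because x-amz-date is supplied
    // x-amz-acl:bucket-owner-full-control    canonicalized x-amz-* headers,
    // x-amz-date:<httpDate>                  lowercase and sorted, one per line
    // x-amz-storage-class:STANDARD
    // /<your_bucket>/310531500150800.PDF     canonicalized resource (bucket + key)
    //
    // Each element above is separated by "\n" in the string that gets signed.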

    Essentially we build up a string that reflects the various aspects of our REST call (headers, date, resource) and then compute an HMAC hash of it using our Amazon secret.  Since Amazon also knows our secret, it can compute the same hash over the request it received and check that the signatures match.  If they do – we are golden.  If not, we can expect an error like the following:

    A message sent to adapter "WCF-WebHttp" on send port "SendToS3" with URI http://<bucketname>.s3-us-west-2.amazonaws.com/ is suspended.
    Error details: System.Net.WebException: The HTTP request was forbidden with client authentication scheme 'Anonymous'.
    <?xml version="1.0" encoding="UTF-8"?>
    <Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message><StringToSignBytes>50 55 54 0a 0a 61 70 70 6c 69 63 61 74 69 6f 6e 2f 78 2d 70 64 66 0a 0a 78 2d 61 6d 7a 2d 61 63 6c 3a 62 75 63 6b 65 74 2d 6f 77 6e 65 72 2d 66 75 6c 6c 2d 63 6f 6e 74 72 20 44 65 63 20 32 30 31 33 20 30 34 3a 35 37 3a 34 35 20 47 4d 54 0a 78 2d 61 6d 7a 2d 73 74 6f 72 61 67 65 2d 63 6c 61 73 73 3a 53 54 41 4e 44 41 52 44 0a 2f 74 72 61 6e 73 61 6c 74 61 70 6f 63 2f 33 31 30 35 33 31 35 30 30 31 35 30 38 30 30 2e 50 44 46</StringToSignBytes><RequestId>6A67D9A7EB007713</RequestId><HostId>BHkl1SCtSdgDUo/aCzmBpPmhSnrpghjA/L78WvpHbBX2f3xDW</HostId><SignatureProvided>SpCC3NpUkL0Z0hE9EI=</SignatureProvided><StringToSign>PUT

    application/x-pdf

    x-amz-acl:bucket-owner-full-control
    x-amz-date:Thu, 05 Dec 2013 04:57:45 GMT
    x-amz-storage-class:STANDARD
    /<bucketname>/310531500150800.PDF</StringToSign><AWSAccessKeyId><your_key></AWSAccessKeyId></Error>

    Tip: Pay attention to these error messages, as they really give you a hint as to what you need to include in your “canonicalString”.  I discounted these error messages early on and didn’t take the time to really understand what Amazon was looking for.

    For completeness I will include the other three helper methods that are being used in the Expression shape.  In my actual solution I pull these values from a configuration store, but for the simplicity of this blog post I have hardcoded them.

    public static string SetAmzACL()
    {
        return "bucket-owner-full-control";
    }

    public static string SetStorageClass()
    {
        return "STANDARD";
    }

    public static string SetHeaderDate()
    {
        // Use GMT time and ensure that it is within 15 minutes of the time on Amazon's servers
        return DateTime.UtcNow.ToString("ddd, dd MMM yyyy HH:mm:ss ") + "GMT";
    }

  3. The next part of the Message Assignment shape is setting the standard context properties for the WebHttp adapter.  Remember, since we are using a Dynamic Send Port, we will not be able to manipulate these values through the BizTalk Admin Console.

    msgS3Request(WCF.BindingType)="WCF-WebHttp";
    msgS3Request(WCF.SecurityMode)="None";
    msgS3Request(WCF.HttpMethodAndUrl) = "PUT";  //Writing to Amazon S3 requires a PUT
    msgS3Request(WCF.OpenTimeout)= "00:10:00";
    msgS3Request(WCF.CloseTimeout)= "00:10:00";
    msgS3Request(WCF.SendTimeout)= "00:10:00";
    msgS3Request(WCF.MaxReceivedMessageSize)= 2147483647;

    Lastly we need to set the URI that we want to send our message to and also specify that we want to use the WCF-WebHttp adapter.

    Port_SendToS3(Microsoft.XLANGs.BaseTypes.Address)="http://<bucketname>.s3-us-west-2.amazonaws.com/310531500150800.PDF";
    Port_SendToS3(Microsoft.XLANGs.BaseTypes.TransportType)="WCF-WebHttp";

    Note: the last part of my URI, 310531500150800.PDF, represents my Resource.  In this case I have hardcoded a file name.  This is obviously something that you want to make dynamic, perhaps using the FILE.ReceivedFileName context property, as sketched below.
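    For example (a sketch; msgIn is a placeholder for the inbound message, and remember that FILE.ReceivedFileName typically carries the full path, so it needs trimming):

    // In the Message Assignment shape: derive the resource name from the received file
    fileName = System.IO.Path.GetFileName(msgIn(FILE.ReceivedFileName));
    Port_SendToS3(Microsoft.XLANGs.BaseTypes.Address) =
        "http://<bucketname>.s3-us-west-2.amazonaws.com/" + fileName;
    // Remember: the same resource name must also appear in the signed canonicalString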

  4. Once we have assembled our S3 message, we send it through our Dynamic Solicit-Response port.  The message that we send to Amazon and receive back is once again of type System.Xml.XmlDocument.
  5. One thing to note: the response that comes back from Amazon won’t actually have a message body (this is in line with REST).  Even though the body is empty, we will still find some valuable context properties.  The two properties of interest are InboundHttpStatusCode and InboundHttpStatusDescription, which we can branch on as sketched below.
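    For instance, a Decide shape can branch on the returned status code before writing the response out (a sketch; msgS3Response is simply my name for the response message, with the status code compared as a string):

    // Decide shape rule: treat anything other than HTTP 200 as a failed upload
    msgS3Response(WCF.InboundHttpStatusCode) == "200"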


  6. The last step in the process is simply to write the Amazon response to disk.  As we learned in the previous point, the message body will be empty, but it still gives me an indicator that the process is working (in a proof-of-concept environment).

Overall the Orchestration is very simple.  The complexity really exists in the Message Assignment shape. 


Testing

Not that watching files move is super exciting, but I have created a quick Vine video that demonstrates a message being consumed by the FILE adapter and then sent off to Amazon S3.

 https://vine.co/v/hQ2WpxgLXhJ

Conclusion

This was a pretty fun, and at times frustrating, solution to put together.  The area that caused me the most grief was easily the Authorization header.  There is some documentation out there related to Amazon “PUT”s, but each call is different depending upon what type of data you are sending and the related headers.  For each header that you add, you really need to include the related value in your “canonicalString”.  You also need to include the complete path to your resource (/bucketname/resource) in this string, even though the convention is a little different in the URI.

Also, it is worth mentioning that /n Software has created a third-party S3 adapter that abstracts away some of the complexity in this solution.  While I have not used this particular /n Software adapter, I have used others and have been happy with the experience. Michael Stephenson has blogged about his experiences with this adapter here.