New feature in the BizTalk Password Manager utility

A couple of years ago, I released a small command-line tool to set the passwords of send ports (primary transport only) and receive locations.

You can find the latest release of this tool at

This allowed me to create scripts that, in the case of a complete application deployment, set all the necessary passwords, reducing manual operations that can often cause errors.

This is just an example that can be extended if necessary; however, it already has several features that can be used as is.

In the release I published a few days ago, I implemented some new features aimed at simplifying, and therefore speeding up, the updating of these scripts.

In particular I added these features:

  • List all the ports that have a user account and a password set
  • Generate a script to set passwords

To list all the ports that have a user account and a password set, simply type the following command:

Here is the result:

The output is made in such a way that it can be easily imported into an Excel spreadsheet for documentation purposes.

To set passwords, it is necessary to execute the “set” command. Being a command-line tool, it is possible to create a batch file that sets all the necessary usernames and passwords. Keeping the script up to date can be a problem, or at least a rather expensive activity, so I added a couple of features to simplify the already difficult deployment adventure.

In this regard, it is possible to execute the “generatescript” command, which generates a file with the necessary commands. The command can be executed as is, in which case the passwords will have a placeholder (“######”), or it can be executed with a JSON mapping file.

To generate the mapping file, run the following command:

The command generates a .json file like the following:

Now you have to replace “null” with the actual password; then you can run the following command:

As a result you will have a “cmd” file with the commands ready to be executed.
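The filling of the mapping file can also be automated. The following is a minimal Python sketch, assuming a (hypothetical) mapping structure that associates each port name with a password value of null; the port names and passwords shown are illustrative only.

```python
import json

# Hypothetical mapping produced by "generatescript": port names mapped to
# passwords, with null meaning "not yet provided".
mapping = json.loads('{"SendPort_FTP_Orders": null, "ReceiveLocation_SFTP_Invoices": null}')

# Replace every null with the real password (here taken from a plain dict;
# in practice you would read them from a vault or prompt for them).
passwords = {
    "SendPort_FTP_Orders": "p4ssw0rd-1",
    "ReceiveLocation_SFTP_Invoices": "p4ssw0rd-2",
}
filled = {port: passwords.get(port, "######") for port in mapping}

print(json.dumps(filled, indent=2))
```

Unknown ports keep the “######” placeholder, so a quick search of the generated file reveals anything still left to fill in.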

How to call a Microsoft Graph API from Azure Logic App

Microsoft Graph is the gateway to data and intelligence in Microsoft 365.

You can use the Microsoft Graph API to build apps for organizations and consumers that interact with the data of millions of users. With Microsoft Graph, you can connect to a wealth of resources, relationships, and intelligence, all through a single endpoint:

For additional information refer to link.

Before using the APIs from an Azure Logic App, some configuration must be done within Azure Active Directory.

Azure Active Directory Setup

Sign in to the Azure portal.

In the left-hand navigation pane, select the Azure Active Directory service.

Select “App registrations” (in preview at this moment) and then select “New registration”.

Enter an application name, select the account type you want to enable, and click on “Register”.

When finished, take note of “Application (client) Id” and “Directory (tenant) Id”.

To give the capability of calling Microsoft Graph API to your Logic App, you have to select the API permissions.

Click on “Add a permission”.

Select “Microsoft Graph”.

Now you have to choose the permission type, Delegated or Application.

In our case, we need to call the API listed below. Microsoft Graph API documentation specifies the permission required for each method.

List All Teams

Clone a Team

Add Group Owner

Add Member

So, we have granted the following permissions.

Permission                  Type
Directory.AccessAsUser.All  Delegated
Directory.ReadWrite.All     Delegated
Directory.ReadWrite.All     Application
Group.ReadWrite.All         Delegated
Group.ReadWrite.All         Application
User.Read                   Delegated
User.Read.All               Delegated
User.Read.All               Application
User.ReadWrite.All          Delegated
User.ReadWrite.All          Application

Once the necessary permissions have been added, you will see the following result. Now you have to delegate the permission. Click on “Grant admin consent for…”

Click on “Yes” to confirm.

As a last step, you need to generate a secret key. Select “Certificates & secrets” and then “New client secret”.

Specify a description and choose the expiration time.

Once the secret has been generated, it is very important that you take note of the secret value. You won’t be able to retrieve it after you leave the page.


To successfully call an API from a Logic App, you need:

  • Application (client) Id
  • Directory (tenant) Id
  • Client secret key
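These same three values can be used from any OAuth2 client, not just a Logic App. The following is a minimal Python sketch of how the Azure AD v2.0 client-credentials token request is composed (the IDs and secret are placeholders; the request is built but not sent):

```python
from urllib.parse import urlencode

# Placeholder values copied from the app registration (hypothetical).
tenant_id = "00000000-0000-0000-0000-000000000000"
client_id = "11111111-1111-1111-1111-111111111111"
client_secret = "<secret value noted at creation time>"

# Azure AD v2.0 client-credentials flow: POST this form to the token endpoint,
# then use the returned access token as a Bearer token against Graph.
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
form = urlencode({
    "client_id": client_id,
    "client_secret": client_secret,
    "scope": "https://graph.microsoft.com/.default",
    "grant_type": "client_credentials",
})
```

In the Logic App itself, the HTTP connector's built-in Active Directory OAuth authentication takes these same values directly, so you normally don't compose this request by hand.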

Logic App Development

Once the Azure Active Directory setup is complete, you can use the HTTP connector to make calls to the Graph API.

My requirement was to clone a team, so I had to first look for the source MS Teams id and then proceed with cloning operation.

The following picture shows the search operation.

This API returns a JSON document from which you can extract the MS Teams id; then you can call the clone operation.

The URI is the following:

@concat('<graph endpoint>', body('Get_Teams_Template_Id'), '/clone')
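The same composition can be sketched outside the Logic App. The Python sketch below builds the clone URI the way the @concat expression does and shows an illustrative request body; the team id and the body's field values are hypothetical, so check the Graph "clone team" documentation for the exact fields your tenant requires.

```python
import json

# Team id as extracted from the "list teams" response (hypothetical value).
team_id = "02bd9fd6-8f93-4758-87c3-1fb73740a315"

# Same composition the @concat expression performs in the Logic App.
clone_uri = "https://graph.microsoft.com/v1.0/teams/" + team_id + "/clone"

# Illustrative body for the clone POST; values here are placeholders.
clone_body = json.dumps({
    "displayName": "Cloned Team",
    "mailNickname": "clonedteam",
    "partsToClone": "apps,tabs,settings,channels,members",
})
```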

Additional notes

Keep in mind that some Graph API operations are not synchronous and may complete with a delay. In particular, when working with application permissions, you may not see the right permissions in Teams until the permission synchronization has occurred.

Configuring BizTalk 2016 in High Availability Config with SQL Server 2016 Always On

This article describes our recent experience with an installation of BizTalk Server 2016 in high availability configuration with SQL Server 2016 Always On. It’s just a small experience, but I hope it can be useful for anyone who wants to install a BizTalk infrastructure in high availability on Microsoft Azure or in a virtual environment where cluster failover is not well accepted by system engineers.

There are several articles dealing with this topic and we highly recommend reading them carefully.

In addition, we suggest you read the article related to Master Secret Server configuration in a cluster.

To design our server infrastructure, we referred to the scenario suggested by the Microsoft article.

Anyway, our implementation is smaller than the examples you can see in the article. This is because you often need a high-availability configuration without having a very high load.

As recommended, we have grouped the BizTalk Server databases into the following four SQL Server instances:

The subdivision into instances is due to the fact that MSDTC on a SQL Server 2016 Always On Availability Group is not able to handle distributed transactions between multiple databases within the same instance. We are all waiting for BizTalk to support SQL Server 2017, which is able to do that.

In this specific case we don’t need ESB toolkit databases and BAM Analysis and Alert, so our final cluster implementation is illustrated in the figure below.

As you can see, we have configured one Listener for each SQL Server instance.

We do not report the installation and configuration of SQL Server as it is extensively described in the articles we referred to.

SQL Server Installed Features

On each server node and node instance we have installed the following services and features:

  • Database Engine Services
    • SQL Server Replication
    • Full-Text and Semantic Extractions for Search
  • Shared Features
    • Client Tools Connectivity
    • Integration Services

DTC Configuration

In this configuration it is not necessary to configure a DTC instance as a cluster resource because a local DTC instance will be used.

To enable Network DTC Access on the BizTalk Server and the SQL Server, follow the steps below:

  • In the Start menu, open “dcomcnfg”.
    • Or, open Administrative Tools, and then select Component Services.
    • Or, open Server Manager, select Tools, and then select Component Services.
  • Expand Component Services, expand Computers, expand My Computer, and expand Distributed Transaction Coordinator.
  • Right-click Local DTC, and select Properties.
  • Go to the Security tab, and check the following:
    • Network DTC Access
    • Allow Inbound
    • Allow Outbound
    • No Authentication Required
  • Select OK. If prompted to restart MS DTC, select Yes.

Enable remote access to COM

Follow the instructions in the following article to enable remote access to COM.

Cluster resource check

From the failover cluster management console, enable the “Prevent failback” option in each resource group created.

This may be your preferred setting because it allows you to control when the failback occurs. For example, you may want to select Prevent failback if you want to take time to troubleshoot or run diagnostics on the failed node before allowing the node to take ownership of the SQL Server again.

Availability Group Options

Make sure Availability Groups are configured with the Per Database DTC Support option.

Also configure “Readable Secondary” equal to “Yes”. Otherwise the SSO service could generate the following error:

Failed to contact the SSO database: The target database, ‘SSODB’, is participating in an availability group and is currently not accessible for queries. Either data movement is suspended or the availability replica is not enabled for read access. To allow read-only access to this and other databases in the availability group, enable read access to one or more secondary availability replicas in the group. For more information, see the ALTER AVAILABILITY GROUP statement in SQL Server Books Online.
Data Source=***;Integrated Security=SSPI;Initial Catalog=SSODB

Error code: 0x800710D9, Unable to read from or write to the database.

Finally, you must check that “Primary” has been selected in the Backup Preferences tab, in the “Where backup should occur” configuration setting.

SSO Setup

After completing the installation of SQL Server, before proceeding with the installation of BizTalk, you need to install the Enterprise Single Sign-On master secret server.

We will follow this article regarding the installation of the clustered SSO service:

We will set up a cluster group that will contain IP address, Network Name and a generic service resource for the SSO service.

Execute the following step on both SQL Server nodes.

Enable only the following components

  • Enterprise Single Sign-On Administration Module
  • Enterprise Single Sign-On Master Secret Server

Execute the following step on first SQL Server node

Execute the following step on second node

To configure SSO in a failover cluster, follow the steps below:

  1. Restart the Enterprise SSO service on the first node
  2. From the command prompt type the following command:

    "<drive>:\Program Files\Common Files\Enterprise Single Sign-On\ssomanage" -updatedb XMLFile

    Where XMLFile is a file with the following content: <sso><globalInfo><secretServer>[SSO NET NAME CLUSTER RES]</secretServer></globalInfo></sso>

  3. Set the “Enterprise Single Sign-On Service” service to manual start type on both servers.
  4. From the failover cluster management console create a Generic Services resource for the SSO service
  5. Restore the master secret on the second cluster node

Configure SSO Database replica

Before starting to configure the SSODB database replica, you have to execute a full backup of the database.

Install BizTalk Server Application Server

Before starting with the BizTalk Server installation, the prerequisites must be installed. Follow the steps described in the Microsoft article.

The article shows a series of mandatory and optional steps. The following is what I have done:

  • Enable Network DTC Access
  • Disable Windows Firewall
  • Disable UAC
  • Enable IIS
  • Run 64-bit BAM portal
  • Install Visual C++ redistributable package
  • Install SQL XML 4
  • SQL Server Integration Services

BizTalk Setup

Execute the following steps on both BizTalk Servers.

Enable all available components

BizTalk Configuration (First Node)

In general, the configuration of the first BizTalk node creates configuration databases while the configuration of the second node is limited to joining the existing farm.

BizTalk Configuration (Second Node)

Final Database Configuration

Database Replica

Now that all the databases have been created it is necessary to proceed with the configuration of the replica within the different SQL Server availability groups. Below are the steps to configure the BizTalkMgmtDb database.

Repeat these steps for each database created by the configuration procedure.

Linked Server & Job

Linked servers and jobs are automatically created by the configuration procedure; however, it is better to verify that they have been created on both nodes.

It is also necessary to modify all the steps of each job to include a check that the running server is the primary replica. So, wrap the original step inside the following code:

if (sys.fn_hadr_is_primary_replica('<DBNAME>') = 1)
BEGIN

    [original step commands]

END
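Since every step of every job needs the same wrapper, a small generator can reduce copy-and-paste mistakes. This is a sketch only; the database name and step command passed in the example are illustrative, and you would paste the generated text back into each job step by hand.

```python
# Template for the primary-replica check that must surround each job step.
WRAPPER = (
    "IF (sys.fn_hadr_is_primary_replica('{db}') = 1)\n"
    "BEGIN\n"
    "{body}\n"
    "END"
)

def wrap_step(db_name: str, original_step: str) -> str:
    """Indent the original step commands and wrap them in the replica check."""
    indented = "\n".join("    " + line for line in original_step.splitlines())
    return WRAPPER.format(db=db_name, body=indented)

# Hypothetical job step command, for illustration only.
print(wrap_step("BizTalkMgmtDb", "EXEC dbo.SomeJobStepProcedure"))
```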


I suggest using the validation script for BizTalk Availability Groups in order to check the configuration; it is a very useful tool.

I want to thank my colleague Paolo Barbieri who helped me to reason during this little adventure and write this article.

BizTalk tools available on GitHub


In some of the articles I wrote, I published part of the tools I and my team use in the integration projects developed with BizTalk Server.

These tools are available on CodePlex, but as many of you know, next December 15, CodePlex will be shut down, so we decided to make our work available on GitHub.


BizTalk Deployment Utility (BizTalkDiff)

BizTalk Deployment Utility is a simple tool that allows you to compare two systems based on BizTalk Server and align environments by copying specific ports from a source system to a target system.

With this application you can check the existence of applications, receive ports, receive locations, send ports and send port groups, and decide to create specific objects on the target system. It is a kind of BizTalk diff.



You can now find this tool at


BizTalk Password Manager

This little tool allows you to set BizTalk send ports and receive locations passwords. You can use it to create automated scripts during deployment operations.

It is just an example that currently manages the FILE, FTP, SFTP, POP3 and WCF-* adapters. Of course, you can access the source code in order to add support for other kinds of adapters.


You can now find this tool at


BizTalk Configuration Reports

BizTalk Configuration Reporting allows you to view and document the current configuration of BizTalk Server in terms of receive ports, send ports and orchestrations.

This tool consists of a database to be placed in the same database instance as BizTalkMgmtDb, and some reports to be installed in an instance of Microsoft SQL Server Reporting Services (SSRS).

The reports, being based on SSRS, display data in real time and allow export in different formats such as Microsoft Word, PDF, Microsoft Excel, TIFF, etc.

This allows you to keep the documentation always up to date.



You can now find this tool at


BizTalk Deployment Utility (BizTalk Diff) is growing

Thanks to my team that helps me face increasingly difficult missions in my work, the BizTalk Diff grows by inserting new features. In particular, I want to thank Matteo Chieli who did much of the work.

You can find the updated version of the tool at address.

In particular, it is now possible to apply some changes in bulk to the configuration:

You can set the following settings:

  • Change send port host and transport options (Retry Count, Retry Interval and Enable routing for failed messages)
  • Change receive location host
  • Change orchestration host
  • Change WCF-based send port timeouts
  • Change WCF-based receive location timeouts

Upload Flat File on Azure SQL Database using Azure Logic App

In this article, I’m going to describe how to upload on-premises flat files on Azure SQL Database using Azure Logic Apps.

Logic Apps provide integration and workflow capabilities in the cloud, including a visual designer to model and automate processes as a series of steps.

In this sample, I also use Azure Service Bus as an entry point to the cloud because the file size is less than 1024 KB. Consider using Azure Storage if you plan to upload files greater than that size.

Initial Consideration

Before you start, consider installing Microsoft Azure Logic Apps Enterprise Integration Tools for Visual Studio 2015 in order to facilitate the creation of schemas and maps.

In addition, you can install Azure Logic Apps Tools for Visual Studio to develop Logic App integrations using Visual Studio.

At this link, you can find several tools for the development of solutions based on Microsoft Azure.

The following picture shows the data flow that I am going to describe.


Here is the checklist:

  • Create Resource Group
  • Create Service Bus Namespace
  • Create Service Bus Queue
  • Create SQL Server
  • Create SQL Database
  • Create SQL Objects (Table, Schema and Procedures)
  • Configure API Connection to Service Bus
  • Configure API Connection to SQL Database
  • Create an Integration Account
  • Upload Schemas and Maps on Integration Account
  • Configure Transform Function (this step should become unnecessary in the near future)
  • Create and Test Logic App integration flow

To complete each step, you must first access the Microsoft Azure portal.

I think it is unnecessary to describe each step in detail, so I will just provide some screenshots.

Create Resource Group


Create Service Bus Namespace


Create Service Bus Queue


Create SQL Server (Logical Server)


Connect to SQL Server and create database objects

Before you can connect to the SQL Database virtual server, you have to configure the Azure firewall to allow the communication.


Use SQL Server Management Studio to create database and tables


Create database
/*
For additional information refer to address
*/
CREATE DATABASE [int-fftosql-sqldb]
(
    EDITION = 'basic',
    SERVICE_OBJECTIVE = 'basic'
);
GO
Create table and procedures



CREATE XML SCHEMA COLLECTION OrderXml AS
N'<xsd:schema xmlns:xsd="">
  <xsd:element name="Root">
    <xsd:complexType mixed="true">
      <xsd:complexContent mixed="true">
        <xsd:restriction base="xsd:anyType">
          <xsd:sequence>
            <xsd:element name="Order" maxOccurs="unbounded">
              <xsd:complexType mixed="true">
                <xsd:complexContent mixed="true">
                  <xsd:restriction base="xsd:anyType">
                    <xsd:sequence>
                      <xsd:element name="Lines" type="OrderDetailType" maxOccurs="unbounded" />
                    </xsd:sequence>
                    <xsd:attribute name="orderDate" type="xsd:string" use="required" />
                    <xsd:attribute name="orderNumber" type="xsd:string" use="required" />
                    <xsd:attribute name="channel" type="xsd:string" use="required" />
                  </xsd:restriction>
                </xsd:complexContent>
              </xsd:complexType>
            </xsd:element>
          </xsd:sequence>
        </xsd:restriction>
      </xsd:complexContent>
    </xsd:complexType>
  </xsd:element>
  <xsd:complexType name="OrderDetailType" mixed="true">
    <xsd:complexContent mixed="true">
      <xsd:restriction base="xsd:anyType">
        <xsd:choice maxOccurs="unbounded">
          <xsd:element name="productName" type="xsd:string" />
          <xsd:element name="quantity" type="xsd:string" />
          <xsd:element name="value" type="xsd:string" />
        </xsd:choice>
      </xsd:restriction>
    </xsd:complexContent>
  </xsd:complexType>
</xsd:schema>'
GO


CREATE TABLE [dbo].[OrderHeader]
(
    [orderId] [uniqueidentifier] NOT NULL,
    [orderNumber] [varchar](20) NOT NULL,
    [orderDate] [datetime] NOT NULL,
    [channel] [varchar](50) NULL,
    PRIMARY KEY CLUSTERED
    (
        [orderId] ASC
    )
)
GO

CREATE TABLE [dbo].[OrderLine]
(
    [orderLineId] [int] IDENTITY(1,1) NOT NULL,
    [orderId] [uniqueidentifier] NOT NULL,
    [productName] [varchar](100) NOT NULL,
    [quantity] [int] NOT NULL,
    [value] [numeric](18, 3) NOT NULL,
    PRIMARY KEY CLUSTERED
    (
        [orderLineId] ASC
    )
)
GO


/*
-------------------------------
EXECUTE OrderInsert N'<Root>
  <Order orderDate="20160101" orderNumber="000001" channel="Internet">
    <Lines>
      <productName>Portable Radio</productName>
      <quantity>10</quantity>
      <value>23.5</value>
    </Lines>
    <Lines>
      <productName>6 red wine bottles</productName>
      <quantity>20</quantity>
      <value>30.9</value>
    </Lines>
  </Order>
</Root>'
-------------------------------
*/
CREATE PROCEDURE [dbo].[OrderInsert]
    @orderStream VARCHAR(MAX)
AS
BEGIN

    -- Fix incoming XML: Logic App adds a "?" char at the beginning of the string
    SET @orderStream = REPLACE(@orderStream, '?<', '<');

    DECLARE @orderXml XML(OrderXml) = CAST(@orderStream AS XML(OrderXml));
    DECLARE @orderId UNIQUEIDENTIFIER = NEWID();

    INSERT INTO OrderHeader
    (
        orderId,
        orderNumber,
        orderDate,
        channel
    )
    SELECT
        @orderId,
        orderNumber = OrderObject.Header.value('./@orderNumber', 'varchar(20)'),
        orderDate = OrderObject.Header.value('./@orderDate', 'datetime'),
        channel = OrderObject.Header.value('./@channel', 'varchar(50)')
    FROM
        @orderXml.nodes('/Root/Order') AS OrderObject(Header);

    INSERT INTO OrderLine
    (
        orderId,
        productName,
        quantity,
        value
    )
    SELECT
        @orderId,
        productName = OrderObject.Lines.value('./productName[1]', 'varchar(100)'),
        quantity = OrderObject.Lines.value('./quantity[1]', 'int'),
        value = OrderObject.Lines.value('./value[1]', 'numeric(18,3)')
    FROM
        @orderXml.nodes('/Root/Order/Lines') AS OrderObject(Lines);

END
GO

You should see something like this.
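The shredding that the stored procedure performs, including the cleanup of the leading “?” that the Logic App adds, can be previewed outside SQL Server with Python's standard library, using the sample payload from the procedure's test comment:

```python
import xml.etree.ElementTree as ET

# Sample payload from the test comment; the leading "?" mimics the artifact
# the Logic App adds and the procedure strips.
payload = ('?<Root><Order orderDate="20160101" orderNumber="000001" channel="Internet">'
           '<Lines><productName>Portable Radio</productName><quantity>10</quantity>'
           '<value>23.5</value></Lines></Order></Root>')

# Same cleanup as: SET @orderStream = REPLACE(@orderStream, '?<', '<');
payload = payload.replace("?<", "<")

root = ET.fromstring(payload)
order = root.find("Order")

# Header columns come from the Order attributes, lines from the Lines elements.
header = (order.get("orderNumber"), order.get("orderDate"), order.get("channel"))
lines = [(l.findtext("productName"), int(l.findtext("quantity")), float(l.findtext("value")))
         for l in order.findall("Lines")]
```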


Create API Connection to Service Bus queue

Before configuring the connection, you need to know the Service Bus connection string.



Create API Connection to SQL Server


Create Integration Account

Integration accounts allow you to manage artifacts such as schemas, maps, certificates, partners and agreements in order to build cloud-based integration apps.

For additional information, refer to


Configure Integration Account


Add source schema (PurchaseOrderFF.xsd)


<?xml version="1.0" encoding="utf-16"?>
<xs:schema xmlns="http://int1demoa1project.Order" xmlns:b="" targetNamespace="http://int1demoa1project.Order" xmlns:xs="">
  <xs:annotation>
    <xs:appinfo>
      <schemaEditorExtension:schemaInfo namespaceAlias="b" extensionClass="Microsoft.Azure.Integration.DesignTools.FlatFileExtension.FlatFileExtension" standardName="Flat File" xmlns:schemaEditorExtension="" />
      <b:schemaInfo standard="Flat File" codepage="1250" xmlns:b="" root_reference="Order" />
    </xs:appinfo>
  </xs:annotation>
  <xs:element name="Order">
    <xs:annotation>
      <xs:appinfo>
        <b:recordInfo structure="delimited" child_delimiter_type="hex" child_delimiter="0xD 0xA" child_order="postfix" />
      </xs:appinfo>
    </xs:annotation>
    <xs:complexType>
      <xs:sequence>
        <xs:element name="Header">
          <xs:annotation>
            <xs:appinfo>
              <b:recordInfo tag_name="H0" structure="delimited" child_delimiter_type="char" child_delimiter=";" child_order="prefix" />
            </xs:appinfo>
          </xs:annotation>
          <xs:complexType>
            <xs:sequence>
              <xs:element name="orderDate" type="xs:string">
                <xs:annotation>
                  <xs:appinfo>
                    <b:fieldInfo justification="left" />
                  </xs:appinfo>
                </xs:annotation>
              </xs:element>
              <xs:element name="orderNumber" type="xs:string">
                <xs:annotation>
                  <xs:appinfo>
                    <b:fieldInfo justification="left" />
                  </xs:appinfo>
                </xs:annotation>
              </xs:element>
              <xs:element name="channel" type="xs:string">
                <xs:annotation>
                  <xs:appinfo>
                    <b:fieldInfo justification="left" />
                  </xs:appinfo>
                </xs:annotation>
              </xs:element>
            </xs:sequence>
          </xs:complexType>
        </xs:element>
        <xs:element maxOccurs="unbounded" name="Detail">
          <xs:annotation>
            <xs:appinfo>
              <b:recordInfo tag_name="D0" structure="delimited" child_delimiter_type="char" child_delimiter=";" child_order="prefix" />
            </xs:appinfo>
          </xs:annotation>
          <xs:complexType>
            <xs:sequence>
              <xs:element name="productName" type="xs:string">
                <xs:annotation>
                  <xs:appinfo>
                    <b:fieldInfo justification="left" />
                  </xs:appinfo>
                </xs:annotation>
              </xs:element>
              <xs:element name="quantity" type="xs:string">
                <xs:annotation>
                  <xs:appinfo>
                    <b:fieldInfo justification="left" />
                  </xs:appinfo>
                </xs:annotation>
              </xs:element>
              <xs:element name="value" type="xs:string">
                <xs:annotation>
                  <xs:appinfo>
                    <b:fieldInfo justification="left" />
                  </xs:appinfo>
                </xs:annotation>
              </xs:element>
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
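The layout this schema describes (CRLF-terminated records, an “H0”-tagged header and “D0”-tagged detail lines, with “;” as the field delimiter) can be sanity-checked against a sample file with a few lines of Python. The sample content here is hypothetical, written only to match the tags and delimiters declared above:

```python
# Hypothetical sample matching the layout: one tagged header record and
# one detail record, fields separated by ';', records ended by CRLF.
sample = "H0;20160101;000001;Internet\r\nD0;Portable Radio;10;23.5\r\n"

header, details = None, []
for record in sample.split("\r\n"):
    if not record:
        continue  # trailing empty record after the final CRLF
    tag, *fields = record.split(";")
    if tag == "H0":
        header = dict(zip(("orderDate", "orderNumber", "channel"), fields))
    elif tag == "D0":
        details.append(dict(zip(("productName", "quantity", "value"), fields)))
```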

Add SQL Server stored procedure schema (PurchaseOrderDB.xsd)


<?xml version="1.0" encoding="utf-16"?>
<xsd:schema xmlns:b="" attributeFormDefault="unqualified" elementFormDefault="unqualified" xmlns:xsd="">
  <xsd:complexType name="OrderDetailType" mixed="true">
    <xsd:choice minOccurs="1" maxOccurs="unbounded">
      <xsd:element minOccurs="1" maxOccurs="1" name="productName" type="xsd:string" />
      <xsd:element minOccurs="1" maxOccurs="1" name="quantity" type="xsd:string" />
      <xsd:element minOccurs="1" maxOccurs="1" name="value" type="xsd:string" />
    </xsd:choice>
  </xsd:complexType>
  <xsd:element name="Root">
    <xsd:complexType mixed="true">
      <xsd:sequence>
        <xsd:element minOccurs="1" maxOccurs="unbounded" name="Order">
          <xsd:complexType mixed="true">
            <xsd:sequence>
              <xsd:element minOccurs="1" maxOccurs="unbounded" name="Lines" type="OrderDetailType" />
            </xsd:sequence>
            <xsd:attribute name="orderDate" type="xsd:string" use="required" />
            <xsd:attribute name="orderNumber" type="xsd:string" use="required" />
            <xsd:attribute name="channel" type="xsd:string" use="required" />
          </xsd:complexType>
        </xsd:element>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>

Add XML Transformation Map



<?xml version="1.0" encoding="utf-16"?>
<xsl:stylesheet xmlns:xsl="" xmlns:msxsl="urn:schemas-microsoft-com:xslt" xmlns:var="" exclude-result-prefixes="msxsl var s0" version="1.0" xmlns:s0="http://int1demoa1project.Order">
  <xsl:output omit-xml-declaration="yes" encoding="windows-1250" version="1.0" indent="no" method="xml" />
  <xsl:template match="/">
    <xsl:apply-templates select="/s0:Order" />
  </xsl:template>
  <xsl:template match="/s0:Order">
    <Root>
      <Order>
        <xsl:attribute name="orderDate">
          <xsl:value-of select="Header/orderDate/text()" />
        </xsl:attribute>
        <xsl:attribute name="orderNumber">
          <xsl:value-of select="Header/orderNumber/text()" />
        </xsl:attribute>
        <xsl:attribute name="channel">
          <xsl:value-of select="Header/channel/text()" />
        </xsl:attribute>
        <xsl:for-each select="Detail">
          <Lines>
            <productName>
              <xsl:value-of select="productName/text()" />
            </productName>
            <quantity>
              <xsl:value-of select="quantity/text()" />
            </quantity>
            <value>
              <xsl:value-of select="value/text()" />
            </value>
            <xsl:value-of select="./text()" />
          </Lines>
        </xsl:for-each>
      </Order>
    </Root>
  </xsl:template>
</xsl:stylesheet>


Temporary operations

Create Storage Account


Create App Service Plan


Create Function App


Add the transformation function

To add the transformation function for the Logic App, follow the instructions at the following link.


Create Logic App


Associate Integration Account to the Logic App


Create a blank solution


Configure trigger



Modify the polling frequency if you need to process the file in near real time.


Add Flat-File decoding to parse the flat file and fill the XSD schema message.



Add the map to transform flat file message to SQL Database XML schema type.


Configure the call to the stored procedure.



Write and run client application

Now you are able to run the client application. To create a sample application, follow the steps below.

1 – Install WindowsAzure.ServiceBus NuGet Package


2 – Get service bus Connection String


3 – Write your code


using System.IO;
using Microsoft.ServiceBus.Messaging;

...
using (FileStream file = File.Open(<FilePath>, FileMode.Open, FileAccess.Read, FileShare.Delete))
{
    // Send
    file.Seek(0, SeekOrigin.Begin);

    var queueClient = QueueClient.CreateFromConnectionString(<ConnectionString>, <QueueName>);
    BrokeredMessage serviceBusMessage = new BrokeredMessage(file);

    queueClient.Send(serviceBusMessage);
}

Upload Dataset on Azure SQL Database using Azure Event Hub and Azure Stream Analytics

There are several ways to upload data to SQL Database, including SSIS, BCP, the APIs, and several others.

For those who work with SQL Server it is natural to use tools that historically belong to the SQL platform, so I wanted to write this article to expose an alternative and to talk about different techniques that apply to different scenarios.

When designing an architecture, you may find yourself at a crossroads, and often you do not know what to expect once a decision is made, or how requirements will change over time.

Azure Event Hubs allows us to decouple who collects the data (the client) from who receives it, allowing you to change the kind of receivers or add additional targets.

For this reason, I consider it the entry point to Microsoft Azure.

The following picture shows the data flow that I am going to describe.

The checklist is as follows:

  • Create Resource Group
  • Create Event Hubs Namespace
  • Create Event Hub Queue
  • Create SQL Server
  • Create SQL Database
  • Create SQL Table
  • Create Stream Analytics Job
  • Configure Input
  • Configure Output
  • Configure Query
  • Start Stream Analytics Job

To complete each step, you must first access the Microsoft Azure portal.

I think it is unnecessary to describe each step in detail, so I will just provide some screenshots.

Create Resource Group

Create Event Hubs Namespace

Create Event Hub

Create SQL Server (logical server)

Connect to SQL Server and create database objects

Before you can connect to SQL Database virtual server, you have to configure Azure firewall to allow the communication.
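You can configure the firewall from the portal, or, as a sketch, with the sp_set_firewall_rule system procedure on the master database (the rule name and IP range below are placeholders):

```sql
-- Run on the master database of the logical server.
-- Rule name and addresses are placeholders: restrict the range in production.
EXECUTE sp_set_firewall_rule
    @name = N'AllowClient',
    @start_ip_address = '0.0.0.0',
    @end_ip_address = '255.255.255.255';
```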

Use SQL Server Management Studio to create database and tables

Create database


For additional information refer to address


CREATE DATABASE [int-dstosql-sqldb]
(
    EDITION = 'basic',
    SERVICE_OBJECTIVE = 'basic'  -- assumed; pick the service objective you need
);


Create table




CREATE TABLE <TableName>  -- placeholder: use your table name
(
    city VARCHAR(50) NOT NULL,
    orderValue NUMERIC(10,2) NOT NULL
);


Create Stream Analytics Job

Create Stream Analytics Job Input

Create Stream Analytics Job output

Define Stream Analytics Job Query

Refer to the link for additional information about Azure Stream Analytics query language.
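As a minimal sketch, assuming the input alias is named ehinput and the output alias sqloutput (your aliases will match what you configured in the previous steps), a pass-through query projecting the two table columns could look like this:

```sql
SELECT
    city,
    orderValue
INTO
    [sqloutput]
FROM
    [ehinput]
```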

Save and start stream analytics job.

Write and run client application

Now you are able to run the client application. To create a sample application, follow the steps below.

1 – Install WindowsAzure.ServiceBus NuGet Package

2 – Get Event Hubs Connection String

3 – Write your code

using System.Data;
using System.Text;
using System.Threading;
using Microsoft.ServiceBus.Messaging;
using Newtonsoft.Json;

DataTable dataTable = <Put here your code to retrieve data>;

var data = JsonConvert.SerializeObject(dataTable);

var eventHubClient = EventHubClient.CreateFromConnectionString(<Event Hub Conn String>, <Queue Name>);

eventHubClient.Send(new EventData(Encoding.UTF8.GetBytes(data)));
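For readers who want to inspect the payload shape without the .NET client, the same kind of event body can be sketched in Python (the rows and field values below are illustrative, mirroring the columns of the table created earlier):

```python
import json

# Rows mirroring the SQL table columns (city, orderValue); values are examples.
rows = [
    {"city": "Rome", "orderValue": 120.50},
    {"city": "Milan", "orderValue": 75.00},
]

# The event body sent to Event Hubs: a UTF-8 encoded JSON array,
# which the Stream Analytics job deserializes back into rows.
payload = json.dumps(rows).encode("utf-8")
```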


Refer to the link for additional information.

BizTalk CRM Lookup functoid is now ready for CRM Online

In recent years I have often integrated Microsoft Dynamics CRM by using BizTalk Server, and as many know, BizTalk Server does not have a native connector for MS CRM.

For this reason, we must work hard to generate canonical schemas, to resolve lookup and Guid type fields, and generally to develop maps.

To increase the productivity of my team I have always tried to develop tools or helpers that automate long and repetitive development tasks, reducing the possibility of generating errors and standardizing approaches to problems.

In recent months, with the cooperation of Nino Crudele and Sandro Pereira, we published several functoids that simplify integration with Dynamics CRM, and in this article I want to describe the work that has been done to also integrate with CRM Online.

When you integrate MS CRM using BizTalk Server, typically you have at least two problems:

  • Transcode the Lookup and Guid type fields
  • Map the input information to the generic schema exposed by CRM

The BizTalk Mapper Extension Utility pack simplifies both tasks by providing specialized functoids.

Today we released a new version where you can find the CRM Lookup functoid with two new features:

  • The ability to define more than one configuration in the SSO. This makes it possible to integrate more than one CRM instance with a single BizTalk Farm
  • Now you can integrate CRM Online. CRM Lookup, for performance reasons, performs the lookup operation by accessing the CRM database in read-only mode. However, CRM Online does not expose its database, so we added the ability to do lookups using the native CRM web services, which are present in both the online and on-premises versions.

The functoid now takes five mandatory input parameters:

  1. SSO Application Name: the connection parameters are stored inside SSO. Here you have to specify the application name where you decided to store the settings. This allows you to have multiple CRM environments that you can integrate;
  2. CRM Guid Field Name: the name of the key field whose value you want to get;
  3. CRM Entity Name: the CRM entity name on which the resolution will be performed;
  4. CRM Input Field Name: imagining you have a “code” as input and you want to know the CRM physical key, this attribute should specify the name of the “code” field;
  5. CRM Input Value: the value from the source system to resolve.

The output of the functoid is a string. Example: 6erg5r-sdrf56-4dfgt5-6ty5r4-456trt


Download and install BizTalk Mapper Extensions UtilityPack from

Download the CRM SDK and register the following assemblies in the GAC:

  • Microsoft.Xrm.Sdk.dll (version 5.0.9910.15)

Create an SSO application by using the SSO Configuration Application MMC Snap-In available at

SQL Server Lookup

Create the following SSO configuration parameters

  • CrmConnectionType: “sql”
  • CrmDatabaseConnectionString: specify the CRM database connection string

Open SQL Server management console, connect to CRM database and execute the script under CRMLookup.sql file in project folder.

Local CRM web services lookup

Create the following SSO configuration parameters

CRM Online web services lookup

Create the following SSO configuration parameters

How-to create an Azure Service Bus namespace in a specific Resource Group

At this moment, the Azure portal does not allow the creation of a Service Bus namespace in a specific Resource Group. You can do it via REST or PowerShell but not from the portal.

In this short article, I will show how to do it without writing lines of code.

  • Create the resource group from the portal (for example “sampleResources”)
  • From the app, navigate to “subscription”, “your subscription name”, “resourceGroups”, and then choose the resource group previously created.
  • Expand “providers” and click on “Show all”
  • Expand “Microsoft.ServiceBus” and then select “namespaces”
  • On the right panel click on “Read\Write” and then on “Create” buttons
  • Specify the name of your Service Bus Namespace (for example “sampleServiceBus”)
  • Paste the following json code


{
  "id": "/subscriptions/SUBSCRIPTION_GUID/resourceGroups/sampleResources/providers/Microsoft.ServiceBus/namespaces/sampleServiceBus",
  "name": "sampleServiceBus",
  "type": "Microsoft.ServiceBus/namespaces",
  "location": "West US",
  "kind": "Messaging",
  "tags": {},
  "properties": {
    "provisioningState": "Succeeded",
    "status": "Active",
    "createdAt": "2016-05-10T12:00:00.000Z",
    "serviceBusEndpoint": "",
    "enabled": true,
    "critical": false,
    "updatedAt": "2016-05-10T12:00:00.000Z",
    "eventHubEnabled": true,
    "namespaceType": "Messaging",
    "messagingSku": 2
  }
}

You should have the following

  • Click on “PUT” button and that’s it!
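For reference, behind the scenes the PUT button issues an ARM REST request along these lines (the api-version value here is an assumption; use the one the tool displays):

```http
PUT https://management.azure.com/subscriptions/SUBSCRIPTION_GUID/resourceGroups/sampleResources/providers/Microsoft.ServiceBus/namespaces/sampleServiceBus?api-version=2015-08-01
Authorization: Bearer <token>
Content-Type: application/json

{ ...the JSON body shown above... }
```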

Integrate SQL Server with Power BI

One of the most common scenarios for those who use Microsoft technologies and intend to publish their data to Power BI is to use SQL Server as the data source.

Power BI provides several ways to access SQL Server information; some need simple configuration and others require development. Below we summarize the different possibilities that you have available:

Power BI has a wide range of connectors to access several types of data sources; these include a connector for Azure SQL Database and one for Azure SQL Data Warehouse.

To access a SQL Server instance, whether it is an Azure IaaS instance or an on-premises instance, you can use Power BI Gateways. There is a Power BI Enterprise Gateway for enterprise deployments and a Power BI Personal Gateway for personal use. You can download it directly from the Power BI portal and install it on a server near your SQL Server.

For additional information about Power BI Gateways refer to

Both of these features support the direct query option, that is, SQL Server is queried when the user accesses the report or dashboard. Typically, you prefer to schedule the frequency of the update; you can schedule updates every few minutes.

If you want to get a dashboard that receives continuous updates, achieving the effect of a live dashboard, you need to perform some simple development.

To update a Power BI dataset you can use different Microsoft Azure functionality:

  • Windows Azure Event Hubs + Windows Azure Stream Analytics
  • Power BI REST APIs


Both techniques need configuration on the Microsoft Azure side and a Windows service that resides on the same network as your SQL Server and is responsible for reading the information and sending it to the cloud.

This approach allows you to make updates with a frequency close to the second.

Using Windows Azure Event Hubs and Microsoft Azure Stream Analytics is simpler, and the Power BI dataset is created automatically, but with this technique you can only insert new information.

The Power BI APIs are a bit more complex but allow several types of operations. For example it is possible to empty the dataset before inserting new information.

However, you must manually create the data set.
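The "add rows" call itself is simple; as a hedged sketch (the helper name and row shape are mine; the endpoint is the public Power BI REST "add rows" route), the request could be assembled like this in Python. The actual POST also requires an Azure AD bearer token in an Authorization header:

```python
import json

# Public Power BI REST API root (v1.0, "myorg" scope).
POWER_BI_API = "https://api.powerbi.com/v1.0/myorg"

def build_add_rows_request(dataset_id, table_name, rows):
    """Build the URL and JSON body for the Power BI 'add rows' call.

    dataset_id: id of the dataset you created manually;
    rows: list of dicts whose keys match the dataset table columns.
    """
    url = f"{POWER_BI_API}/datasets/{dataset_id}/tables/{table_name}/rows"
    body = json.dumps({"rows": rows})
    return url, body

# The request is then POSTed with an Azure AD bearer token, e.g.:
#   requests.post(url, data=body,
#                 headers={"Authorization": "Bearer <token>",
#                          "Content-Type": "application/json"})
```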

You can download a Windows service sample at the address. This simple solution is able to use both techniques. It is enough to apply a configuration to the QueryConfig.json file and write a query file. The project includes a console (DbToPSvc.PowerBi.UI.exe) that allows you to create the dataset and retrieve its id (required in the configuration file).

Below is an example of the QueryConfig.json file.

As I said, before starting the service you need to apply some Microsoft Azure side configurations.

You can follow the article to configure Windows Azure Event Hubs and a Stream Analytics Job.

You have to follow the article before using the Power BI REST API.