BizTalk Deployment Utility (BizTalk Diff) is growing

Thanks to my team, which helps me face increasingly difficult missions in my work, BizTalk Diff keeps growing with new features. In particular, I want to thank Matteo Chieli, who did much of the work.

You can find the updated version of the tool at https://biztalkdeployutil.codeplex.com/.

In particular, it is now possible to apply bulk changes to the configuration. You can change the following settings:

  • Change send port host and transport options (Retry Count, Retry Interval and Enable Routing for failed messages)
  • Change receive location host
  • Change orchestration host
  • Change WCF-based send port timeouts
  • Change WCF-based receive location timeouts


Upload Flat Files to Azure SQL Database using Azure Logic Apps

In this article, I’m going to describe how to upload on-premises flat files to Azure SQL Database using Azure Logic Apps.

Logic Apps provide integration and workflow capabilities in the cloud. They offer a visual designer to model and automate processes as a series of steps.

In this sample, I also use Azure Service Bus as the entry point to the cloud because the file size is less than 1024 KB. Consider using Azure Storage if you plan to upload files larger than that.
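As a rough illustration of this choice on the client side, here is a minimal C# sketch; it assumes the WindowsAzure.ServiceBus and WindowsAzure.Storage NuGet packages, and the connection strings, queue name and container name are placeholders to replace with your own values.

using System.IO;
using Microsoft.ServiceBus.Messaging;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Send small files through the Service Bus queue, larger ones through Blob Storage (sketch)
static void UploadFile(string filePath)
{
    const long maxServiceBusBytes = 1024 * 1024; // the 1024 KB threshold mentioned above

    if (new FileInfo(filePath).Length <= maxServiceBusBytes)
    {
        var queueClient = QueueClient.CreateFromConnectionString("<ServiceBusConnectionString>", "<QueueName>");
        using (FileStream file = File.OpenRead(filePath))
        {
            queueClient.Send(new BrokeredMessage(file));
        }
    }
    else
    {
        var account = CloudStorageAccount.Parse("<StorageConnectionString>");
        CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("<ContainerName>");
        container.CreateIfNotExists();

        using (FileStream file = File.OpenRead(filePath))
        {
            container.GetBlockBlobReference(Path.GetFileName(filePath)).UploadFromStream(file);
        }
    }
}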

Initial Considerations

Before you start, consider installing Microsoft Azure Logic Apps Enterprise Integration Tools for Visual Studio 2015 (https://www.microsoft.com/en-us/download/details.aspx?id=53016) in order to facilitate the creation of schemas and maps.

In addition, you can install Azure Logic Apps Tools for Visual Studio (https://visualstudiogallery.msdn.microsoft.com/e25ad307-46cf-412e-8ba5-5b555d53d2d9) to develop Logic App integrations using Visual Studio.

At this link (https://azure.microsoft.com/en-us/tools/), you can find several tools for the development of solutions based on Microsoft Azure.

The following picture shows the data flow that I am going to describe.

clip_image002

Here is the checklist:

  • Create Resource Group
  • Create Service Bus Namespace
  • Create Service Bus Queue
  • Create SQL Server
  • Create SQL Database
  • Create SQL Objects (Table, Schema and Procedures)
  • Configure API Connection to Service Bus
  • Configure API Connection to SQL Database
  • Create an Integration Account
  • Upload Schemas and Maps on Integration Account
  • Configure Transform Function (this step should become unnecessary in the near future)
  • Create and Test Logic App integration flow

To complete each step, you must first access the Microsoft Azure portal (https://portal.azure.com).

I think it is unnecessary to describe each step in detail, so I will just provide some screenshots.

Create Resource Group

clip_image004

Create Service Bus Namespace

clip_image006

Create Service Bus Queue

clip_image008

Create SQL Server (Logical Server)

clip_image010

Connect to SQL Server and create database objects

Before you can connect to the SQL Database logical server, you have to configure the Azure firewall to allow the communication.

clip_image012

Use SQL Server Management Studio to create database and tables

clip_image014

Create database
/*
For additional information refer to address https://msdn.microsoft.com/en-us/library/dn268335.aspx
*/
CREATE DATABASE [int-fftosql-sqldb]
(
    EDITION = 'basic',
    SERVICE_OBJECTIVE = 'basic'
);
GO

Create table and procedures

 

CREATE XML SCHEMA COLLECTION [dbo].[OrderXml] AS
N'<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <xsd:element name="Root">
    <xsd:complexType mixed="true">
      <xsd:complexContent mixed="true">
        <xsd:restriction base="xsd:anyType">
          <xsd:sequence>
            <xsd:element name="Order" maxOccurs="unbounded">
              <xsd:complexType mixed="true">
                <xsd:complexContent mixed="true">
                  <xsd:restriction base="xsd:anyType">
                    <xsd:sequence>
                      <xsd:element name="Lines" type="OrderDetailType" maxOccurs="unbounded" />
                    </xsd:sequence>
                    <xsd:attribute name="orderDate" type="xsd:string" use="required" />
                    <xsd:attribute name="orderNumber" type="xsd:string" use="required" />
                    <xsd:attribute name="channel" type="xsd:string" use="required" />
                  </xsd:restriction>
                </xsd:complexContent>
              </xsd:complexType>
            </xsd:element>
          </xsd:sequence>
        </xsd:restriction>
      </xsd:complexContent>
    </xsd:complexType>
  </xsd:element>
  <xsd:complexType name="OrderDetailType" mixed="true">
    <xsd:complexContent mixed="true">
      <xsd:restriction base="xsd:anyType">
        <xsd:choice maxOccurs="unbounded">
          <xsd:element name="productName" type="xsd:string" />
          <xsd:element name="quantity" type="xsd:string" />
          <xsd:element name="value" type="xsd:string" />
        </xsd:choice>
      </xsd:restriction>
    </xsd:complexContent>
  </xsd:complexType>
</xsd:schema>'
GO

CREATE TABLE [dbo].[OrderHeader]
(
    [orderId] [uniqueidentifier] NOT NULL,
    [orderNumber] [varchar](20) NOT NULL,
    [orderDate] [datetime] NOT NULL,
    [channel] [varchar](50) NULL,
    CONSTRAINT [PK_OrderHeader] PRIMARY KEY CLUSTERED
    (
        [orderId] ASC
    )
)
GO

CREATE TABLE [dbo].[OrderLine]
(
    [orderLineId] [int] IDENTITY(1,1) NOT NULL,
    [orderId] [uniqueidentifier] NOT NULL,
    [productName] [varchar](100) NOT NULL,
    [quantity] [int] NOT NULL,
    [value] [numeric](18, 3) NOT NULL,
    CONSTRAINT [PK_OrderLine] PRIMARY KEY CLUSTERED
    (
        [orderLineId] ASC
    )
)
GO

/*
TEST STATEMENT:
-------------------------------
EXECUTE OrderInsert N'<Root>
  <Order orderDate="20160101" orderNumber="000001" channel="Internet">
    <Lines>
      <productName>Portable Radio</productName>
      <quantity>10</quantity>
      <value>23.5</value>
    </Lines>
    <Lines>
      <productName>6 red wine bottles</productName>
      <quantity>20</quantity>
      <value>30.9</value>
    </Lines>
  </Order>
</Root>'
-------------------------------
*/
CREATE PROCEDURE [dbo].[OrderInsert]
    @orderStream VARCHAR(MAX)
AS
BEGIN

    -- Fix the incoming XML: the Logic App adds a "?" character at the beginning of the string
    SET @orderStream = REPLACE(@orderStream, '?<', '<');

    DECLARE @orderXml XML(OrderXml) = CAST(@orderStream AS XML(OrderXml));
    DECLARE @orderId UNIQUEIDENTIFIER = NEWID();

    INSERT INTO OrderHeader
    (
        orderId,
        orderNumber,
        orderDate,
        channel
    )
    SELECT
        @orderId,
        orderNumber = OrderObject.Header.value('./@orderNumber', 'varchar(20)'),
        orderDate = OrderObject.Header.value('./@orderDate', 'datetime'),
        channel = OrderObject.Header.value('./@channel', 'varchar(50)')
    FROM
        @orderXml.nodes('/Root/Order') AS OrderObject(Header);

    INSERT INTO OrderLine
    (
        orderId,
        productName,
        quantity,
        value
    )
    SELECT
        @orderId,
        productName = OrderObject.Lines.value('./productName[1]', 'varchar(100)'),
        quantity = OrderObject.Lines.value('./quantity[1]', 'int'),
        value = OrderObject.Lines.value('./value[1]', 'numeric(18,3)')
    FROM
        @orderXml.nodes('/Root/Order/Lines') AS OrderObject(Lines);

END
GO

You should see something like this.

clip_image002[4]

Create API Connection to Service Bus queue

Before configuring the connection, you need to know the Service Bus connection string.

clip_image004[4]

clip_image006[4]

Create API Connection to SQL Server

clip_image008[4]

Create Integration Account

Integration accounts allow you to manage artifacts such as schemas, maps, certificates, partners and agreements in order to build cloud-based integration applications.

For additional information, refer to https://azure.microsoft.com/en-us/documentation/articles/app-service-logic-enterprise-integration-create-integration-account/.

clip_image010[4]

Configure Integration Account

clip_image012[4]

Add source schema (PurchaseOrderFF.xsd)

clip_image014[4]

<?xml version="1.0" encoding="utf-16"?>
<xs:schema xmlns="http://int1demoa1project.Order" xmlns:b="http://schemas.microsoft.com/BizTalk/2003" targetNamespace="http://int1demoa1project.Order" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:annotation>
    <xs:appinfo>
      <schemaEditorExtension:schemaInfo namespaceAlias="b" extensionClass="Microsoft.Azure.Integration.DesignTools.FlatFileExtension.FlatFileExtension" standardName="Flat File" xmlns:schemaEditorExtension="http://schemas.microsoft.com/BizTalk/2003/SchemaEditorExtensions" />
      <b:schemaInfo standard="Flat File" codepage="1250" xmlns:b="http://schemas.microsoft.com/BizTalk/2003" root_reference="Order" />
    </xs:appinfo>
  </xs:annotation>
  <xs:element name="Order">
    <xs:annotation>
      <xs:appinfo>
        <b:recordInfo structure="delimited" child_delimiter_type="hex" child_delimiter="0xD 0xA" child_order="postfix" />
      </xs:appinfo>
    </xs:annotation>
    <xs:complexType>
      <xs:sequence>
        <xs:element name="Header">
          <xs:annotation>
            <xs:appinfo>
              <b:recordInfo tag_name="H0" structure="delimited" child_delimiter_type="char" child_delimiter=";" child_order="prefix" />
            </xs:appinfo>
          </xs:annotation>
          <xs:complexType>
            <xs:sequence>
              <xs:element name="orderDate" type="xs:string">
                <xs:annotation>
                  <xs:appinfo>
                    <b:fieldInfo justification="left" />
                  </xs:appinfo>
                </xs:annotation>
              </xs:element>
              <xs:element name="orderNumber" type="xs:string">
                <xs:annotation>
                  <xs:appinfo>
                    <b:fieldInfo justification="left" />
                  </xs:appinfo>
                </xs:annotation>
              </xs:element>
              <xs:element name="channel" type="xs:string">
                <xs:annotation>
                  <xs:appinfo>
                    <b:fieldInfo justification="left" />
                  </xs:appinfo>
                </xs:annotation>
              </xs:element>
            </xs:sequence>
          </xs:complexType>
        </xs:element>
        <xs:element maxOccurs="unbounded" name="Detail">
          <xs:annotation>
            <xs:appinfo>
              <b:recordInfo tag_name="D0" structure="delimited" child_delimiter_type="char" child_delimiter=";" child_order="prefix" />
            </xs:appinfo>
          </xs:annotation>
          <xs:complexType>
            <xs:sequence>
              <xs:element name="productName" type="xs:string">
                <xs:annotation>
                  <xs:appinfo>
                    <b:fieldInfo justification="left" />
                  </xs:appinfo>
                </xs:annotation>
              </xs:element>
              <xs:element name="quantity" type="xs:string">
                <xs:annotation>
                  <xs:appinfo>
                    <b:fieldInfo justification="left" />
                  </xs:appinfo>
                </xs:annotation>
              </xs:element>
              <xs:element name="value" type="xs:string">
                <xs:annotation>
                  <xs:appinfo>
                    <b:fieldInfo justification="left" />
                  </xs:appinfo>
                </xs:annotation>
              </xs:element>
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
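For reference, given the record definitions above (an H0-tagged header record and repeating D0-tagged detail records, with ";" as field delimiter and CR LF as record terminator), a flat file instance for this schema would look like the following hypothetical sample, which matches the test order used earlier:

H0;20160101;000001;Internet
D0;Portable Radio;10;23.5
D0;6 red wine bottles;20;30.9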

Add SQL Server stored procedure schema (PurchaseOrderDB.xsd)

clip_image002[6]

<?xml version="1.0" encoding="utf-16"?>
<xsd:schema xmlns:b="http://schemas.microsoft.com/BizTalk/2003" attributeFormDefault="unqualified" elementFormDefault="unqualified" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <xsd:complexType name="OrderDetailType" mixed="true">
    <xsd:choice minOccurs="1" maxOccurs="unbounded">
      <xsd:element minOccurs="1" maxOccurs="1" name="productName" type="xsd:string" />
      <xsd:element minOccurs="1" maxOccurs="1" name="quantity" type="xsd:string" />
      <xsd:element minOccurs="1" maxOccurs="1" name="value" type="xsd:string" />
    </xsd:choice>
  </xsd:complexType>
  <xsd:element name="Root">
    <xsd:complexType mixed="true">
      <xsd:sequence>
        <xsd:element minOccurs="1" maxOccurs="unbounded" name="Order">
          <xsd:complexType mixed="true">
            <xsd:sequence>
              <xsd:element minOccurs="1" maxOccurs="unbounded" name="Lines" type="OrderDetailType" />
            </xsd:sequence>
            <xsd:attribute name="orderDate" type="xsd:string" use="required" />
            <xsd:attribute name="orderNumber" type="xsd:string" use="required" />
            <xsd:attribute name="channel" type="xsd:string" use="required" />
          </xsd:complexType>
        </xsd:element>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>

Add XML Transformation Map

clip_image002[8]

clip_image004[6]

<?xml version="1.0" encoding="utf-16"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:msxsl="urn:schemas-microsoft-com:xslt" xmlns:var="http://schemas.microsoft.com/BizTalk/2003/var" exclude-result-prefixes="msxsl var s0" version="1.0" xmlns:s0="http://int1demoa1project.Order">
  <xsl:output omit-xml-declaration="yes" encoding="windows-1250" version="1.0" indent="no" method="xml" />
  <xsl:template match="/">
    <xsl:apply-templates select="/s0:Order" />
  </xsl:template>
  <xsl:template match="/s0:Order">
    <Root>
      <Order>
        <xsl:attribute name="orderDate">
          <xsl:value-of select="Header/orderDate/text()" />
        </xsl:attribute>
        <xsl:attribute name="orderNumber">
          <xsl:value-of select="Header/orderNumber/text()" />
        </xsl:attribute>
        <xsl:attribute name="channel">
          <xsl:value-of select="Header/channel/text()" />
        </xsl:attribute>
        <xsl:for-each select="Detail">
          <Lines>
            <productName>
              <xsl:value-of select="productName/text()" />
            </productName>
            <quantity>
              <xsl:value-of select="quantity/text()" />
            </quantity>
            <value>
              <xsl:value-of select="value/text()" />
            </value>
            <xsl:value-of select="./text()" />
          </Lines>
        </xsl:for-each>
      </Order>
    </Root>
  </xsl:template>
</xsl:stylesheet>

 

Temporary operations

Create Storage Account

clip_image002[10]

Create App Service Plan

clip_image004[8]

Create Function App

clip_image006[6]

Add the transformation function

To add the transformation function for the Logic App, open the following link:

https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2F201-logic-app-transform-function%2Fazuredeploy.json

clip_image008[6]

Create Logic App

clip_image010[6]

Associate Integration Account to the Logic App

clip_image012[6]

Create a blank solution

clip_image014[6]

Configure trigger

clip_image016

clip_image018

Modify the polling frequency if you need to process the file in near real time.

clip_image020

Add a Flat File Decoding action to parse the flat file into an XML message based on the XSD schema.

clip_image022

clip_image024

Add the map to transform the flat file message into the SQL Database XML schema type.

clip_image026

Configure the call to the stored procedure.

clip_image028

clip_image030

Write and run client application

Now you are able to run the client application. To create a sample application, follow the steps below.

1 – Install WindowsAzure.ServiceBus NuGet Package

clip_image032

2 – Get service bus Connection String

clip_image034

3 – Write your code

using System.IO;
using Microsoft.ServiceBus.Messaging;

...

// Open the flat file and send it as the body of a Service Bus message
using (FileStream file = File.Open(<FilePath>, FileMode.Open, FileAccess.Read, FileShare.Delete))
{
    file.Seek(0, SeekOrigin.Begin);

    var queueClient = QueueClient.CreateFromConnectionString(<ConnectionString>, <QueueName>);

    // The message body is the file stream itself
    BrokeredMessage serviceBusMessage = new BrokeredMessage(file);

    queueClient.Send(serviceBusMessage);
}

Upload a Dataset to Azure SQL Database using Azure Event Hubs and Azure Stream Analytics

There are several ways to upload data to SQL Database, including SSIS, BCP, APIs and several others.

For those who work with SQL Server, it is natural to use tools that historically belong to the SQL platform, so I wanted to write this article to present an alternative and to talk about different techniques that apply to different scenarios.

When designing an architecture, you may find yourself at a crossroads, and often you do not know what to expect once a decision is made, or how requirements will change over time.

Azure Event Hubs allows us to decouple who collects the data (the client) from who receives it, letting you change the kind of receiver or add additional targets.

For this reason, I consider it the entry point to Microsoft Azure.

The following picture shows the data flow that I am going to describe.

Here is the checklist:

  • Create Resource Group
  • Create Event Hubs Namespace
  • Create Event Hub
  • Create SQL Server
  • Create SQL Database
  • Create SQL Table
  • Create Stream Analytics Job
  • Configure Input
  • Configure Output
  • Configure Query
  • Start Stream Analytics Job

To complete each step, you must first access the Microsoft Azure portal (https://portal.azure.com).

I think it is unnecessary to describe each step in detail, so I will just provide some screenshots.

Create Resource Group

Create Event Hubs Namespace

Create Event Hub

Create SQL Server (logical server)

Connect to SQL Server and create database objects

Before you can connect to the SQL Database logical server, you have to configure the Azure firewall to allow the communication.

Use SQL Server Management Studio to create database and tables

Create database

/*
For additional information refer to address https://msdn.microsoft.com/en-us/library/dn268335.aspx
*/
CREATE DATABASE [int-dstosql-sqldb]
(
    EDITION = 'basic',
    SERVICE_OBJECTIVE = 'basic'
);
GO

Create table

CREATE TABLE OrdersByCity
(
    entryId INT NOT NULL IDENTITY(1,1) PRIMARY KEY,
    city VARCHAR(50) NOT NULL,
    orderValue NUMERIC(10,2) NOT NULL,
    orderDate DATETIME NOT NULL
);

Create Stream Analytics Job

Create Stream Analytics Job Input

Create Stream Analytics Job output

Define Stream Analytics Job Query

Refer to the link https://msdn.microsoft.com/en-us/library/azure/dn834998.aspx for additional information about Azure Stream Analytics query language.

Save and start the Stream Analytics job.

Write and run client application

Now you are able to run the client application. To create a sample application, follow the steps below.

1 – Install WindowsAzure.ServiceBus NuGet Package

2 – Get Event Hubs Connection String

3 – Write your code

using System.Data;
using System.Text;
using Microsoft.ServiceBus.Messaging;
using Newtonsoft.Json;

...

// Retrieve the data to upload and serialize it as JSON
DataTable dataTable = <Put here your code to retrieve data>;
var data = JsonConvert.SerializeObject(dataTable);

// Send the serialized rows to the Event Hub
var eventHubClient = EventHubClient.CreateFromConnectionString(<Event Hub Conn String>, <Event Hub Name>);
eventHubClient.Send(new EventData(Encoding.UTF8.GetBytes(data)));
eventHubClient.Close();

Refer to the link https://azure.microsoft.com/en-us/documentation/articles/event-hubs-csharp-ephcs-getstarted/ for additional information.

BizTalk CRM Lookup functoid is now ready for CRM Online

In recent years I have often integrated Microsoft Dynamics CRM by using BizTalk Server and, as many know, BizTalk Server does not have a native connector for MS CRM.

For this reason, we must work hard to generate canonical schemas, to resolve lookup-type and Guid-type fields, and generally to develop maps.

To increase the productivity of my team, I have always tried to develop tools or helpers that automate long and repetitive development tasks, reducing the possibility of errors and standardizing approaches to problems.

In recent months, with the cooperation of Nino Crudele and Sandro Pereira, we published several functoids that simplify integration with Dynamics CRM, and in this article I want to describe the work that has been done to also support CRM Online.

When you integrate MS CRM using BizTalk Server, typically you have at least two problems:

  • Transcoding the Lookup and Guid type fields
  • Mapping the input information to the generic schema exposed by CRM

The BizTalk Mapper Extension Utility pack simplifies both tasks by providing specialized functoids.

Today we released a new version where you can find the CRM Lookup functoid with two new features:

  • The ability to define more than one configuration in the SSO. This allows you to integrate more than one CRM instance with a single BizTalk farm
  • Now you can integrate CRM Online. For performance reasons, CRM Lookup performs lookup operations by accessing the CRM database in read-only mode. However, CRM Online does not expose its database, so we added the ability to do lookups using the native CRM web services that are present in both the online and on-premises versions (a sketch of this kind of lookup follows the parameter list below).

The functoid now takes five mandatory input parameters:

  1. SSO Application Name: the connection parameters are stored inside SSO. Here you have to specify the application name where you decided to store the settings. This allows you to integrate multiple CRM environments;
  2. CRM Guid Field Name: the name of the key field whose value you want to get;
  3. CRM Entity Name: the CRM entity name on which the resolution will be performed;
  4. CRM Input Field Name: imagine having a "code" as input and wanting to know the CRM physical key; in this attribute you should specify the name of the "code" field;
  5. CRM Input Value: the value from the source system to resolve.


The output of the functoid is a string, for example: 6erg5r-sdrf56-4dfgt5-6ty5r4-456trt
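To give an idea of what a web-service lookup like this involves, here is a minimal C# sketch based on the CRM SDK (Microsoft.Xrm.Sdk). It is not the functoid's actual implementation; obtaining the IOrganizationService connection (on-premises or Online) is left out because in the functoid it is driven by the SSO configuration, and the parameter names simply mirror the functoid inputs listed above.

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// Resolve the CRM Guid field starting from an input field value (sketch)
static string ResolveCrmLookup(IOrganizationService service, string entityName,
    string guidFieldName, string inputFieldName, string inputValue)
{
    // Query the entity for the record whose input field matches the source value
    var query = new QueryExpression(entityName)
    {
        ColumnSet = new ColumnSet(guidFieldName)
    };
    query.Criteria.AddCondition(inputFieldName, ConditionOperator.Equal, inputValue);

    EntityCollection result = service.RetrieveMultiple(query);
    if (result.Entities.Count == 0)
    {
        return string.Empty; // no match found
    }

    // Return the Guid value of the requested field as a string
    return result.Entities[0].GetAttributeValue<Guid>(guidFieldName).ToString();
}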

Installation

Download and install BizTalk Mapper Extensions UtilityPack from https://btsmapextutilitypack.codeplex.com/.

Download the CRM SDK and register the following assemblies in the GAC:

  • Microsoft.Xrm.Sdk.dll (version 5.0.9910.15)

Create an SSO application by using the SSO Configuration Application MMC Snap-In, available at https://www.microsoft.com/en-us/download/details.aspx?id=14524.

SQL Server Lookup

Create the following SSO configuration parameters

  • CrmConnectionType: “sql”
  • CrmDatabaseConnectionString: specify the CRM database connection string

Open SQL Server Management Studio, connect to the CRM database and execute the script in the CRMLookup.sql file in the project folder.

Local CRM web services lookup

Create the following SSO configuration parameters

CRM Online web services lookup

Create the following SSO configuration parameters

How to create an Azure Service Bus namespace in a specific Resource Group

At this moment, the Azure portal does not allow the creation of a Service Bus namespace in a specific Resource Group. You can do it via REST or PowerShell but not from the portal.

In this short article, I will show how to do it without writing lines of code.

  • Create the resource group from the portal (for example “sampleResources”)
  • From the https://resources.azure.com app, navigate to “subscriptions”, “your subscription name”, “resourceGroups” and then choose the resource group previously created.
  • Expand “providers” and click on “Show all”
  • Expand “Microsoft.ServiceBus” and then select “namespaces”
  • In the right panel, click on the “Read/Write” button and then on the “Create” button
  • Specify the name of your Service Bus Namespace (for example “sampleServiceBus”)
  • Paste the following JSON code (replace SUBSCRIPTION_GUID and the sample values with your own)

 

{
  "id": "/subscriptions/SUBSCRIPTION_GUID/resourceGroups/sampleResources/providers/Microsoft.ServiceBus/namespaces/sampleServiceBus",
  "name": "sampleServiceBus",
  "type": "Microsoft.ServiceBus/namespaces",
  "location": "West US",
  "kind": "Messaging",
  "tags": {},
  "properties": {
    "provisioningState": "Succeeded",
    "status": "Active",
    "createdAt": "2016-05-10T12:00:00.000Z",
    "serviceBusEndpoint": "https://sampleServiceBus.servicebus.windows.net:443/",
    "enabled": true,
    "critical": false,
    "updatedAt": "2016-05-10T12:00:00.000Z",
    "eventHubEnabled": true,
    "namespaceType": "Messaging",
    "messagingSku": 2
  }
}

You should have the following

  • Click on the “PUT” button and that’s it!

Integrate SQL Server with Power BI

One of the most common scenarios for those who use Microsoft technologies and intend to publish their data to Power BI involves SQL Server as the data source.

Power BI provides several ways to access SQL Server information; some need only simple configuration and others require development. Below I summarize the different possibilities available.

Power BI has a wide range of connectors to access several types of data sources; these include a connector for Azure SQL Database and one for Azure SQL Data Warehouse.

To access a SQL Server instance, whether it is an Azure IaaS instance or an on-premises instance, you can use Power BI Gateways. There is a Power BI Enterprise Gateway for enterprise deployments and a Power BI Personal Gateway for personal use. You can download them directly from the Power BI portal and install them on a server near your SQL Server.

For additional information about Power BI Gateways refer to https://powerbi.microsoft.com/en-us/gateway/.

Both these features support the direct query option, where SQL Server is queried when the user accesses the report or dashboard. Typically, though, you prefer to schedule the frequency of the update; you can schedule updates every few minutes.

If you want a dashboard that receives continuous updates, getting the effect of a live dashboard, you need to do some simple development.

To update a Power BI dataset you can use different Microsoft Azure features:

  • Windows Azure Event Hubs + Windows Azure Stream Analytics
  • Power BI REST API

Both techniques need configuration on the Microsoft Azure side and a Windows service that resides in the same network as your SQL Server and is responsible for reading the information and sending it to the cloud.

This approach allows you to make updates with a frequency close to the second.

Using Windows Azure Event Hubs and Microsoft Azure Stream Analytics is simpler and the Power BI dataset is created automatically, but with this technique you can only insert new information.

The Power BI REST APIs are a bit more complex but allow several types of operations; for example, it is possible to empty the dataset before inserting new information.

However, you must create the dataset manually.
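To give an idea of the REST approach, here is a minimal C# sketch (not the code of the sample service mentioned below). It assumes you have already registered a client app in Azure AD and obtained an access token, and that the dataset and its table were created beforehand; the access token, dataset id, table name and rows JSON are placeholders.

using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

// Clear a push dataset table and reload it through the Power BI REST API (sketch)
static async Task RefreshTableAsync(string accessToken, string datasetId, string tableName, string rowsJson)
{
    string url = "https://api.powerbi.com/v1.0/myorg/datasets/" + datasetId + "/tables/" + tableName + "/rows";

    using (var client = new HttpClient())
    {
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        // Empty the dataset table before inserting the new information
        await client.DeleteAsync(url);

        // Add the new rows, e.g. rowsJson = "{ \"rows\": [ { \"city\": \"Milan\", \"orderValue\": 23.5 } ] }"
        var content = new StringContent(rowsJson, Encoding.UTF8, "application/json");
        await client.PostAsync(url, content);
    }
}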

You can download a Windows service sample at http://db2powerbi.codeplex.com/. This simple solution is able to use both techniques. It is enough to apply a configuration to the QueryConfig.json file and write a query file. The project includes a console (DbToPSvc.PowerBi.UI.exe) that allows you to create the dataset and retrieve its id (required in the configuration file).

Below is an example of the QueryConfig.json file.

As I said, before starting the service you need to apply some Microsoft Azure side configurations.

You can follow the article https://azure.microsoft.com/en-us/documentation/articles/stream-analytics-power-bi-dashboard/ to configure Windows Azure Event Hubs and a Stream Analytics Job.

You have to follow the article https://powerbi.microsoft.com/en-us/documentation/powerbi-developer-register-a-client-app/ before using the Power BI REST API.

Manage your BizTalk Passwords

During the deployment of BizTalk applications, it is sometimes necessary to set passwords manually. There are other options, such as preparing a binding file that contains the passwords, but it is not always a quick and easy job.

So I created a console application that easily sets the passwords for send ports and receive locations, so that you can create automatic deployment scripts.

You can get this tool at https://biztalkpassword.codeplex.com/.

It is just an example that currently manages the FILE, FTP, POP3 and WCF* adapters. Of course, you can access the source code in order to manage other kinds of adapters.

This console is very simple to use: simply specify the port type (send port or receive location), the port name, a user and a password.

Following an example of use:

Microsys.EAI.Framework.PasswordManager.exe -set -receive -name:MyReceiveLocation -user:John -password:@Passw0rd1

Microsys.EAI.Framework.PasswordManager.exe -set -send -name:MySendPort -user:John -password:@Passw0rd1

In addition to setting the password, it allows you to view the port property that normally contains the credentials (TransportTypeData).

Following are other examples of use:

Microsys.EAI.Framework.PasswordManager.exe -list -application:[application name]

Microsys.EAI.Framework.PasswordManager.exe -get -receive -name:[receive location name]

Microsys.EAI.Framework.PasswordManager.exe -get -send -name:[send port name]

Microsys.EAI.Framework.PasswordManager.exe -set -receive -name:[receive location name] -user:[username] -password:[password]

Microsys.EAI.Framework.PasswordManager.exe -set -send -name:[send port name] -user:[username] -password:[password]