
Codit Blog

Posted on Thursday, September 4, 2014 3:57 PM

by Glenn Colpaert

Description of an issue that occurs when the WCF.OutboundCustomHeaders property is added to the message context when sending through the BizTalk SB-Messaging (Service Bus) adapter.

For a hybrid scenario I'm currently working on, using on-site WCF services and Azure Service Bus, it was necessary to have the WCF headers from the original call available as brokered message properties in Azure Service Bus.

We created a SB-Messaging port to send the message to Azure Service Bus and added the WCF Namespace (http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties) in the Brokered Message Properties window.

 

We quickly ran into the following issue:

The adapter failed to transmit message going to send port "SpBlog" with URL "sb://coditblogdemo.servicebus.windows.net/demo". It will be retransmitted after the retry interval specified for this Send Port. Details:"System.InvalidOperationException: Envelope Version 'EnvelopeNone ( http://schemas.microsoft.com/ws/2005/05/envelope/none)' does not support adding Message Headers.

At first I thought this error was related to the fact that we had added the WCF namespace in the Brokered Message Properties window, but even after we removed the WCF namespace, the error still occurred.

In fact, we noticed that the error occurs on the SB-Messaging port from the moment the WCF.OutboundCustomHeaders property is present in the context of the message.

Cause

Looking at the stack trace, it seems that the SB-Messaging adapter is built on top of the BizTalk WCF Adapter runtime, which makes perfect sense. The downside is that the SB-Messaging adapter behaves the same as the WCF adapter when it comes to the OutboundCustomHeaders property.
When using the WCF adapter and adding WCF.OutboundCustomHeaders to the context, the value of this property gets added to the SOAP:Header of the outgoing message, and that is exactly what happens with the SB-Messaging adapter as well, even though its messages carry no SOAP envelope (hence the 'EnvelopeNone' error above). Let's call it a hidden feature of the SB-Messaging adapter.

Solution

The solution to this problem was fairly simple and straightforward. We created a custom pipeline component, the 'Context Copier', that copies the value of the OutboundCustomHeaders property to another context property. After assigning the value to the new context property, we write null to the OutboundCustomHeaders context property, which removes OutboundCustomHeaders from the context.

 

Of course, this is only the basic outline of the component. In our 'Context Copier' we added the possibility to specify source and destination lists of properties that need to be copied, but the basic outline below already solves the OutboundCustomHeaders issue when sending to Service Bus.
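The original post showed the component's code as a screenshot; below is a minimal sketch of the core of such a component, assuming a standard Microsoft.BizTalk.Component.Interop.IComponent implementation (the usual IBaseComponent/IComponentUI/IPersistPropertyBag plumbing is omitted). The destination property name and namespace are hypothetical; use whatever custom property schema fits your solution.

using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public class ContextCopier : IComponent
{
    private const string WcfPropertiesNamespace =
        "http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties";

    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Read the OutboundCustomHeaders value from the message context.
        object headers = pInMsg.Context.Read("OutboundCustomHeaders", WcfPropertiesNamespace);

        if (headers != null)
        {
            // Copy the value to another (hypothetical) context property so it
            // stays available, e.g. to promote as a brokered message property.
            pInMsg.Context.Write("OriginalWcfHeaders",
                "http://example.com/custom-properties", headers);

            // Writing null removes OutboundCustomHeaders from the context, so
            // the SB-Messaging adapter no longer tries to add SOAP headers to
            // a message with envelope version 'None'.
            pInMsg.Context.Write("OutboundCustomHeaders", WcfPropertiesNamespace, null);
        }

        return pInMsg;
    }
}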

 

 

After adding this pipeline component to our Send Pipeline, the issue was resolved and we could happily continue integration with Service Bus!

 

Cheers,

Glenn Colpaert


Posted on Friday, August 8, 2014 3:00 PM

by Glenn Colpaert

Sentinet is highly extensible through standard Microsoft .NET, WCF and WIF extensibility points, and the Sentinet API interfaces can be used to extend its possibilities further.


In some previous posts released on this blog, we saw how to build a custom access rule expression and how to leverage WCF extensibility by setting up a virtual service with a custom endpoint behavior.

In this post I would like to explain another extensibility point of Sentinet: Custom Alert Handlers.

Alerts, or Violation Alerts, are triggered in Sentinet Service Agreements when certain configured SLA violations occur. More details about Sentinet Service Agreements can be found here.

Scenario

The main scenario for this blog post is very simple. We will create our own custom alert handler class by implementing the required interface, register the handler in Sentinet and use it as an alert when a certain SLA is violated.

Creating the Custom Alert Handler

We start our implementation of the custom alert handler by creating a new class that implements the IAlertHandler interface. This interface is available through Nevatech.Vsb.Repository.dll.

This interface contains a single method, ProcessAlerts, where you put the logic that handles the alert after the SLA violation has occurred.

[Screenshot: the IAlertHandler interface definition]

One more thing to do before we can start our implementation of the ProcessAlerts method is to add a reference to the Twilio REST API through NuGet. More information about Twilio can be found here.


The final implementation of our custom alert handler is sketched below. We start by initializing some variables needed for Twilio. After that, everything is pretty straightforward: we read our handler configuration. Here we chose a CSV configuration string, but you can perfectly well go for an XML configuration and parse it into an XmlDocument or XDocument.

Once we've read the receivers from the configuration, we create an alert message by concatenating the alert descriptions. After that, we send an SMS message using the Twilio REST API.

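What follows is a minimal sketch of such a handler (the original post showed the code as a screenshot). The ProcessAlerts signature and the Nevatech namespace are assumptions for illustration; consult Nevatech.Vsb.Repository.dll for the actual contract. The Twilio calls use the 2014-era Twilio 3.x helper library from NuGet.

using System;
using System.Collections.Generic;
using Nevatech.Vsb.Repository; // assumed namespace for IAlertHandler
using Twilio;                  // Twilio REST API helper library (NuGet)

// Sketch: a custom alert handler that sends SLA violation alerts by SMS.
// NOTE: the ProcessAlerts signature below is an assumption for illustration.
public class SmsAlertHandler : IAlertHandler
{
    // Twilio account settings (placeholder values).
    private const string AccountSid = "<accountSid>";
    private const string AuthToken = "<authToken>";
    private const string FromNumber = "+15550000000";

    public void ProcessAlerts(string configuration, IEnumerable<string> alertDescriptions)
    {
        // The configuration is the CSV string of receiver phone numbers
        // entered in the 'Default Configuration' field during registration.
        string[] receivers = configuration.Split(',');

        // Build the alert message by concatenating the alert descriptions.
        string message = string.Join(Environment.NewLine, alertDescriptions);

        // Send an SMS to every configured receiver through the Twilio REST API.
        var client = new TwilioRestClient(AccountSid, AuthToken);
        foreach (string receiver in receivers)
        {
            client.SendSmsMessage(FromNumber, receiver.Trim(), message);
        }
    }
}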

Register

The first step in registering your custom alert is to add your DLL(s) to the Sentinet installation folder. This way they can be accessed by the "Nevatech.Vsb.Agent" process, which is responsible for generating the alerts. There is no need to add your DLL(s) to the GAC.

The next step is to register our custom alert in Sentinet itself and assign it to the desired Service Agreement.

In the Sentinet Repository window, click on the Service Agreement, navigate to Alerts and choose Add Alert. The following screen will pop up.

[Screenshot: the Add Alert window]

The next step is clicking the 'Add Custom Alert Action' button, as shown below.

[Screenshot: the 'Add Custom Alert Action' button]

In the following screen we have to add all the necessary parameters to configure our custom alert.

  • Name: The friendly name of the Custom Alert
  • Assembly: The fully qualified assembly name that contains the custom alert
  • Type: The .NET class that implements the IAlertHandler interface
  • Default Configuration: The optional default configuration. In this example I specified a CSV value of different phone numbers. You can access this value inside the class that implements the IAlertHandler interface.

[Screenshot: the Custom Alert Action configuration window]

Confirm the configuration by clicking 'OK'. In the next screen be sure to select your newly configured Custom Alert.

You will end up with the following Configured Violation Alerts.

[Screenshot: the Configured Violation Alerts overview]

Testing

To test this alert, I modified my Service Agreement metrics to a very low value (e.g. 'only 1 call is allowed per minute') so I could easily trigger the alert. After I called my virtual service multiple times per minute, I received the following SMS.

Service agreement "CoditBlogAgreement"  has been violated one or more times. The most recent violation is reported for the time period started on 2014-07-24 18:15:00 (Romance Standard Time).

Conclusion

Sentinet is designed to be extensible in multiple areas of the product. In this post I’ve demonstrated how to create a Custom Alert Handler that will send an SMS when an SLA has been violated.

 

Cheers,

Glenn Colpaert

Categories: .NET BizTalk Sentinet

Posted on Tuesday, July 22, 2014 4:00 PM

by Tom Kerkhove

Integration with SQL Server is not always a walk in the park. Recently I had to integrate dynamic SQL scripts with Typed-Polling, and some of them resulted in empty result sets. In this post I explain the pitfalls and lessons learned from our solution, using a sample scenario.

Recently I noticed some odd behavior while troubleshooting a SQL integration scenario with BizTalk 2010. I was using the WCF-Custom adapter to perform Typed-Polling that executed a stored procedure. This stored procedure used dynamic SQL to fetch the data, because it targets multiple tables with one generic stored procedure.

In this blog post I will tell you how we implemented this scenario, as well as where the adapter failed when polling.
Next to that, I will talk about some "problems" with the receive location and the adapter.
I will finish with some small hints that made our development easier.

  

Scenario

In this simplified scenario we have an application that is polling on two tables called ‘tbl_Shipment_XXX’, where XXX is the name of a warehouse. Each warehouse has a corresponding receive location that polls the data that is marked as ready to be processed.

This is performed by a stored procedure called ‘ExportShipments’, which requires the name of the target warehouse and proceeds in the following steps: lock the data as being processed, export the data to BizTalk and mark it as successfully processed.

[Diagram: the polling scenario with the ExportShipments stored procedure]

 

Creating our polling statement

In our polling statement we will execute our generic stored procedure. This procedure will mark our data as locked, call a function that returns a SELECT statement as an NVARCHAR(MAX), execute that statement and finally mark the data as processed.

CREATE PROCEDURE ExportShipments
	@Warehouse nvarchar(10)
AS
BEGIN
	SET NOCOUNT ON;
 
	-- MARK DATA AS LOCKED
	DECLARE @lockData NVARCHAR(MAX);
	SET @lockData = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 7 WHERE [STATUS_ID] = 1';
	EXEC(@lockData);
 
	-- EXTRACT DATA
	DECLARE @exportData NVARCHAR(MAX);
	SET @exportData = [ShopDB].[dbo].[ComposeExportShipmentSelect] (@Warehouse)
	EXEC(@exportData);
 
	-- MARK DATA AS PROCESSED
	DECLARE @markProcessed NVARCHAR(MAX);
	SET @markProcessed = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 10 WHERE [STATUS_ID] = 7';
	EXEC(@markProcessed);
 
END

In our function we will compose a simple SELECT-statement where we fill in the name of our warehouse.

CREATE FUNCTION ComposeExportShipmentSelect
(
	@Warehouse nvarchar(10)
)
RETURNS NVARCHAR(MAX)
AS
BEGIN
	-- DECLARE SELECT-STRING
	DECLARE @result NVARCHAR(MAX);
 
	-- COMPOSE SELECT STATEMENT
	SET @result = 'SELECT [ID], [ORDER_ID], [WAREHOUSE_FROM], [WAREHOUSE_TO], [QUANTITY] FROM [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] WHERE [STATUS_ID] = 7';
 
	-- RETURN SELECT STRING
	RETURN @result
 
END

I chose to separate the composition of the SELECT statement into a function because we were using a lot of JOINs and UNIONs, and I wanted to keep that logic out of the stored procedure for the sake of code readability.

Although our scenario is very simple, we will use a function to illustrate the problem.

 

Generating the Typed-Polling schema

Now that everything is set up in our database, we are ready to generate our Typed-Polling schema for the WarehouseA polling in our BizTalk project.

1. Right-click on your project and select Add > Add generated items... > Consume Adapter Service

 

2. Select the SqlBinding and click "Configure"

3. Fill in the Server and InitialCatalog and a unique InboundId. The InboundId will be used in the namespace of your schema and in the URI of your receive location. Each receive location requires its own schema. I used "WarehouseA" since I am creating the schema for polling on "tbl_Shipment_WarehouseA".

4. Configure the binding by selecting "TypedPolling" as the InboundOperationType and our stored procedure as the PollingStatement, with parameter "WarehouseA" (see the example after these steps). Note that we are not using ambient transactions.

 

5. Click Connect, select "Service" as the contract type and select "/" as the category. If everything is configured correctly, you can select TypedPolling and click Properties to inspect the metadata. If everything looks good, click Add and a schema will be generated.
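As a concrete example, the PollingStatement referenced in step 4 could simply execute our stored procedure for the warehouse this receive location serves:

EXEC [ShopDB].[dbo].[ExportShipments] @Warehouse = 'WarehouseA'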

 

The wizard will create the following schema with a sample configuration for your receive location.

 

It's all about metadata

The problem here is that executing the dynamic SQL doesn't provide the required metadata to BizTalk in order to successfully generate a schema for that result set.

I solved this by replacing the function 'ComposeExportShipmentSelect' with a new stored procedure called 'ExportShipment_AcquireResults'.
This stored procedure executes the dynamic SQL, inserts the result set into a table variable and returns it to the caller. This tells BizTalk which columns the result set will contain and of what type they are.

CREATE PROCEDURE ExportShipment_AcquireResults
	@Warehouse nvarchar(10)
AS DECLARE @result 
      TABLE(
			  [ID] [INT] NOT NULL,
			  [ORDER_ID] [NVARCHAR](50) NOT NULL,
			  [WAREHOUSE_FROM] [NVARCHAR](20) NOT NULL,
			  [WAREHOUSE_TO] [NVARCHAR](18) NULL,
			  [QUANTITY] [INT] NOT NULL
		   )
BEGIN
	-- DECLARE SELECT-STRING
	DECLARE @select NVARCHAR(MAX);
 
	-- COMPOSE SELECT STATEMENT
	SET @select = 'SELECT [ID], [ORDER_ID], [WAREHOUSE_FROM], [WAREHOUSE_TO], [QUANTITY] FROM [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] WHERE [STATUS_ID] = 7';
 
	-- EXECUTE SELECT AND INSERT INTO RESULT
	INSERT INTO @result
		(
			[ID],
			[ORDER_ID],
			[WAREHOUSE_FROM],
			[WAREHOUSE_TO],
			[QUANTITY]
		)
		EXEC(@select);
 
	-- RETURN RESULT SET
	SELECT * FROM @result;
END

Our generic polling stored procedure now simply executes our new stored procedure.

ALTER PROCEDURE ExportShipments
	@Warehouse nvarchar(10)
AS
BEGIN
	SET NOCOUNT ON;
 
	-- MARK DATA AS LOCKED
	DECLARE @lockData NVARCHAR(MAX);
	SET @lockData = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 7 WHERE [STATUS_ID] = 1';
	EXEC(@lockData);
 
	-- EXTRACT DATA
	EXEC [ShopDB].[dbo].[ExportShipment_AcquireResults] @Warehouse;
 
	-- MARK DATA AS PROCESSED
	DECLARE @markProcessed NVARCHAR(MAX);
	SET @markProcessed = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 10 WHERE [STATUS_ID] = 7';
	EXEC(@markProcessed);
 
END

When we regenerate our schema, the problem should be resolved.
(Note that the new file name includes your InboundId.)

 

Why not use a #tempTable?

You can also achieve this by setting FMTONLY OFF and using a temporary table, but this will not work in every scenario.

CREATE PROCEDURE ExportShipments
	@Warehouse nvarchar(10)
AS
BEGIN
	SET NOCOUNT ON;
 
	-- MARK DATA AS LOCKED
	DECLARE @lockData NVARCHAR(MAX);
	SET @lockData = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 7 WHERE [STATUS_ID] = 1';
	EXEC(@lockData);
 
	-- EXTRACT DATA
	DECLARE @exportData NVARCHAR(MAX);
	SET @exportData = [ShopDB].[dbo].[ComposeExportShipmentSelect] (@Warehouse)
 
	-- CREATE TEMP TABLE
	SET FMTONLY OFF;
	CREATE TABLE #tempTable
		(
			  [ID] [INT] NOT NULL,
			  [ORDER_ID] [NVARCHAR](50) NOT NULL,
			  [WAREHOUSE_FROM] [NVARCHAR](20) NOT NULL,
			  [WAREHOUSE_TO] [NVARCHAR](18) NULL,
			  [QUANTITY] [INT] NOT NULL
		   )
	
	-- INSERT SELECT RESULTS INTO TEMP TABLE
	INSERT INTO #tempTable
		EXEC(@exportData);
	
	SET FMTONLY ON;
 
	-- MARK DATA AS PROCESSED
	DECLARE @markProcessed NVARCHAR(MAX);
	SET @markProcessed = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 10 WHERE [STATUS_ID] = 7';
	EXEC(@markProcessed);
 
	-- RETURN RESULT
	SELECT * FROM #tempTable;
END

We were using transactions on SQL level and TRY/CATCH statements, but this conflicted with FMTONLY and resulted in a severe error in the event log. It also didn't make any sense to us, since we had separated the SELECT composition into a separate function/stored procedure, and the definition of the result set should be defined there.

If you want to read more about #tempTables, I recommend this post.

 

PollDataAvailableStatement & no ambient transactions

With our schema generated, I deployed the application to my machine and started creating the receive locations for the polling on my database. The configuration is pretty easy: no ambient transactions, Typed Polling, our stored procedure as the polling statement, and a SELECT in the PollDataAvailableStatement to check whether we need to run the stored procedure.

Apparently the PollDataAvailableStatement is only used when you enable ambient transactions according to this article.

The problem here is that when you clear out the PollDataAvailableStatement and start your receive location, the location is automatically disabled with an error. The PollDataAvailableStatement seems to be a mandatory field even though it is not being used; I easily fixed it by specifying "SELECT 0".
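For reference, once the statement is actually evaluated (see the next section on ambient transactions), a meaningful check for our WarehouseA table could look like the statement below; the adapter only runs the polling statement when the first column of the first row is greater than zero:

SELECT COUNT(*) FROM [ShopDB].[dbo].[tbl_Shipment_WarehouseA] WHERE [STATUS_ID] = 1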

 

Empty result sets & TimeOutExceptions

Later in the project we had to move from transactions on SQL level to ambient transactions, and from then on our PollDataAvailableStatement was consulted before the stored procedure was run.

In our stored procedure we used the found data to perform cross-references and returned a set of data only when required, so it was possible for the procedure to return no result set at all.

We started experiencing locks on our tables, and TimeOutExceptions occurred without any obvious reason.
It turned out the adapter had a bug: once the adapter has found data, a result set has to be returned to BizTalk; if none is returned, the adapter holds on to its SQL resources and locks the tables.

This issue can be resolved by installing the standalone fix (link) or BizTalk Adapter Pack 2010 CU1 (link).

 

Tips & Tricks

During the development of this scenario I noticed some irregularities when using the wizard. Here are some of the things I ran into:

  • It is good practice to validate the metadata of the TypedPolling operation before adding it. If something is misconfigured or your stored procedure is incorrect, you will receive an error with more information.
    If this is the case, you should click OK instead of clicking the X, because otherwise the wizard will close. The same error might occur when you finalize the wizard; then it is important to click Cancel instead of OK, or the wizard will close automatically.
     
  • If your wizard closes for some reason, it will remember the configuration of the previous run when you restart it, and you can simply connect, select the requested operation and request the metadata or finish the wizard.
    It might occur that this results in a MetadataException saying that the PollingStatement is empty and therefore invalid.
    This is a bug in the wizard where you always need to open the URI configuration, even though it still remembers your configuration. You don't need to change anything; just open and close it and the exception is gone.

Conclusion

In this blog post I highlighted the problem where I was unable to generate a schema for my stored procedure because it only executed the result of my function. I also illustrated how I fixed it, and why I didn't use FMTONLY with a temporary table.

Next to the generation of our schemas, I talked about the problems with the receive location, which required a PollDataAvailableStatement even when it was not used, and about the locking of our tables because our stored procedure wasn't returning a result set. Last but not least, I gave two examples of common irregularities when using the wizard and how you can bypass them.

For me it is important to write your SQL scripts like you write your code: use decent comments, and separate your procedure into sub-procedures and functions according to the separation of concerns.
While writing your scripts everything might seem obvious, but will it still be in a couple of months? And what about your colleagues?
 

All the scripts for this post, incl. DB generation, can be found here.
 

Thank you for reading,

Tom.

Posted on Thursday, July 3, 2014 4:03 PM

by Brecht Vancauwenberghe

A must-read for all BizTalk professionals! In this article I explain how you should maintain your BAM databases. If you don't maintain them and you run into disk-space problems, you won't be able to remove the data in a supported way!

With some luck, your environment was installed by a BizTalk professional and the most critical maintenance tasks were configured at installation. By critical maintenance tasks I mean the BizTalk backup job, DTA Purge & Archive, and so on. If these tasks are not configured, I'm sure your production BizTalk environment will only run smoothly for a couple of days or months, but definitely not for years!

I'm not going to write about basic configuration tasks as most BizTalk professionals should be aware of these. 

At Codit Managed Services our goal is to detect, avoid and solve big and small issues before it's too late. This way we often detect smaller missing configuration tasks: not critical at first, but definitely necessary for the BizTalk environment to keep running smoothly through the years!

 

Purging/archiving the BAM databases:

I'm going to explain how you should maintain the BizTalk BAM databases.

In most environments these maintenance tasks are missing. You will find the necessary information on the internet, but probably only when it's urgent and you really need it.

 

My goal is to provide you with this information in time, so you are able to configure the BAM maintenance tasks without putting your SQL Server or yourself under heavy stress. When it comes to BAM and data purging, it's rather simple: by default, nothing is purged or archived. Yes, even if you have the BizTalk backup and all the other jobs running successfully!

 

I have seen a production environment, running for only two years, where the BAMPrimaryImport database had a size of 90 GB! It takes a lot of time and processing power to purge such a database. To maintain and purge the BAM databases you will need to configure how long you want to keep the BAM data, configure archiving, trigger several SQL SSIS packages, and so on.
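As a sketch of what that configuration involves (the activity name is hypothetical, and you should verify the exact BM.exe syntax against the documentation for your BizTalk version), setting the online window of an activity looks something like this:

bm.exe set-activitywindow -Activity:ShipmentActivity -TimeLength:6 -TimeUnit:Month

After that, the generated data-maintenance SSIS package for the activity (named BAM_DM_<ActivityName>) still has to be scheduled to run regularly, for example from a SQL Server Agent job.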

 

The problem is: purging is configured per activity, so this is a task for the developer and not for the person who installed the BizTalk environment. You will find all the information to do this on the following sites:

http://blogs.biztalk360.com/bam-production-environment-management

http://blogs.msdn.com/b/nabeelp/archive/2013/10/22/sql-script-to-clean-up-old-bamarchive-tables.aspx

http://www.biztalkbill.com/Home/tabid/40/EntryId/103/BizTalk-BAM-Archiving.aspx

http://blogs.msdn.com/b/appfabriccat/archive/2010/02/10/best-practices-for-configuring-bam-data-maintenance-and-cube-update-ssis-packages-in-biztalk-solutions.aspx

http://geekswithblogs.net/andym/archive/2009/05/21/132346.aspx

  

BAM data not being purged immediately?

Something very important that you should be aware of: if you, for example, want to keep a year of data, you will have to wait another year for the data to be purged/archived in a supported way!

That's why it's so important to configure these jobs in time! It's not like the DTA Purge & Archive job, where the data is purged immediately.

You can find more information about this on the following blog: http://www.richardhallgren.com/bam-tracking-data-not-moved-to-bam-archive-database

 

Purging the BAMAlertsApplication database:

I'm rather sure the following maintenance task is not scheduled to clean your BAMAlertsApplication database. I only discovered it myself a couple of days ago! Probably not a lot of people notice this database, because it's rather small. After two years running in production with a small load, it had a size of 8 GB. But that's 8 GB of wasted disk space!

 

If you search the internet for how to clean this database, you will find nothing official from Microsoft.

Credits for how to purge the BAMAlertsApplication database go to Patrick Wellink and his blog post: http://wellink.bloggingabout.net/2011/02/03/millions-of-records-in-the-bamalertsapplication-and-how-to-get-rid-of-them-nsvacuum-to-the-rescue

If you wonder what the NSVacuum stored procedure looks like, you can find it below:

USE [BAMAlertsApplication]
GO

-- The snippet below is the body of the stored procedure; it takes a single
-- parameter (@SecondsToRun) that bounds how long the vacuum may run.
CREATE PROCEDURE [dbo].[NSVacuum]
	@SecondsToRun INT
AS
BEGIN

	DECLARE @QuantumsVacuumed	INT
	DECLARE @QuantumsRemaining	INT
	DECLARE @VacuumStatus		INT
	DECLARE @StartTime			DATETIME

	SET @QuantumsVacuumed = 0
	SET @QuantumsRemaining = 0

	SET @StartTime = GETUTCDATE()

	-- Volunteer to be a deadlock victim
	SET DEADLOCK_PRIORITY LOW

	EXEC @VacuumStatus = [dbo].[NSVacuumCheckTimeAndState] @StartTime, @SecondsToRun

	IF (0 != @VacuumStatus)			-- VacuumStatus 0 == Running
	BEGIN
		GOTO CountAndExit
	END

	DECLARE @CutoffTime			DATETIME
	DECLARE @RetentionAge		INT
	DECLARE @VacuumedAllClasses	BIT

	-- Remember the last run time and null counts
	UPDATE [dbo].[NSVacuumState] SET LastVacuumTime = @StartTime, LastTimeVacuumEventCount = 0, LastTimeVacuumNotificationCount = 0

	-- Get the retention age from the configuration table (there should only be 1 row)
	SELECT TOP 1 @RetentionAge = RetentionAge FROM [dbo].[NSApplicationConfig]

	SET @CutoffTime = DATEADD(second, -@RetentionAge, GETUTCDATE())

	-- Vacuum incomplete event batches
	EXEC [dbo].[NSVacuumEventClasses] @CutoffTime, 1

	-- Mark expired quantums as 'being vacuumed'
	UPDATE	[dbo].[NSQuantum1] SET QuantumStatusCode = 32
	WHERE	(QuantumStatusCode & 64) > 0 AND		-- Marked completed
			(EndTime < @CutoffTime)					-- Old

	DECLARE @QuantumId			INT
	DECLARE @QuantumEndTime		DATETIME

	DECLARE QuantumsCursor CURSOR
	LOCAL READ_ONLY FAST_FORWARD
	FOR
	SELECT	QuantumId, EndTime
	FROM	NSQuantum1 WITH (READUNCOMMITTED)
	WHERE	QuantumStatusCode = 32
	ORDER BY EndTime

	OPEN QuantumsCursor

	-- Do until told otherwise or the time limit expires
	WHILE (1=1)
	BEGIN
		EXEC @VacuumStatus = [dbo].[NSVacuumCheckTimeAndState] @StartTime, @SecondsToRun

		IF (0 != @VacuumStatus)			-- VacuumStatus 0 == Running
		BEGIN
			BREAK
		END

		FETCH NEXT FROM QuantumsCursor INTO @QuantumId, @QuantumEndTime

		IF (@@FETCH_STATUS != 0)
		BEGIN
			SET @VacuumStatus = 2		-- VacuumStatus 2 == Completed
			SET @QuantumsRemaining = 0
			GOTO CloseCursorAndExit
		END

		-- Vacuum the Notifications
		EXEC [dbo].[NSVacuumNotificationClasses] @QuantumId, @VacuumedAllClasses OUTPUT

		EXEC @VacuumStatus = [dbo].[NSVacuumCheckTimeAndState] @StartTime, @SecondsToRun

		IF (0 != @VacuumStatus)
		BEGIN
			BREAK
		END

		-- Vacuum the Events in this quantum
		EXEC [dbo].[NSVacuumEventClasses] @QuantumEndTime, 0

		-- Delete this Quantum from NSQuantums1 if its related records were also deleted
		IF (1 = @VacuumedAllClasses)
		BEGIN
			DELETE [dbo].[NSQuantum1] WHERE QuantumId = @QuantumId

			-- Update the count of quantums vacuumed
			SET @QuantumsVacuumed = @QuantumsVacuumed + 1
		END

		EXEC @VacuumStatus = [dbo].[NSVacuumCheckTimeAndState] @StartTime, @SecondsToRun

		IF (0 != @VacuumStatus)
		BEGIN
			BREAK
		END

	END	-- Main WHILE loop

CloseCursorAndExit:

	CLOSE QuantumsCursor
	DEALLOCATE QuantumsCursor

CountAndExit:

	-- Report progress
	SET @QuantumsRemaining = (SELECT COUNT(*) FROM [dbo].[NSQuantum1] WITH (READUNCOMMITTED) WHERE QuantumStatusCode = 32)

	SELECT	@VacuumStatus AS Status, @QuantumsVacuumed AS QuantumsVacuumed,
			@QuantumsRemaining AS QuantumsRemaining

END -- NSVacuum

You need to schedule the NSVacuum command on your environment. Run it step by step, as it puts a lot of stress on your SQL server.
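A bounded run could then look like this (assuming the procedure header with the @SecondsToRun parameter shown above; the parameter limits how long the vacuum works before it stops and reports its progress):

USE [BAMAlertsApplication]
GO

EXEC [dbo].[NSVacuum] @SecondsToRun = 120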

 

The content of this post is something every BizTalk professional should add to his BizTalk Server installation/deployment procedure!

Categories: BAM BizTalk

Posted on Wednesday, June 25, 2014 11:00 AM

by Glenn Colpaert

This blog post takes a look at what's new in BizTalk Server 2013 R2 for the SB-Messaging adapter.

As discussed yesterday in my blog post about the enhancements to the WCF-WebHttp adapter, the R2 releases of the BizTalk products focus more on compatibility and platform alignment than on shipping major new features/add-ons to the platform.

To give you another overview, the following features were added in the new BizTalk Server 2013 R2 release:

  • Platform Alignment with Visual Studio, SQL Server,…
  • Updates to the SB-Messaging Adapter
  • Updates to the WCF-WebHttp Adapter
  • Updates to the SFTP Adapter
  • Updates to the HL7 Accelerator

This blog post will focus on the updates to the SB-Messaging adapter that were shipped with this new release of Microsoft BizTalk Server.

SB-Messaging Adapter enhancements

With this new version of BizTalk Server the SB-Messaging adapter now also supports SAS (Shared Access Signature) authentication, in addition to ACS (Access Control Service).

Due to this improvement BizTalk Server can now also interact with the on-premise edition of Service Bus that is available through the Windows Azure Pack.

More information on SAS authentication can be found here: http://msdn.microsoft.com/en-us/library/dn170477.aspx

More information on the Windows Azure Pack can be found here: http://www.microsoft.com/en-us/server-cloud/products/windows-azure-pack/default.aspx?nv1if0=1#fbid=nVinI5x6SOz?hashlink=s1section6

The image below is a head-to-head comparison between the ‘old’ SB-Messaging adapter authentication properties window and the ‘new’ version that ships with BizTalk Server 2013 R2.

[Screenshots: the SB-Messaging adapter authentication properties window in BizTalk Server 2013 (left) and 2013 R2 (right)]

Out with the old, in with the new

When we compare the 'Access connection information' from both the Windows Azure Portal (cloud) and the Windows Azure Pack Portal (on-premise), we already see a clear difference in the ways to authenticate to the Service Bus.

You also notice why the update to include SAS was really a "must have" for BizTalk Server.

On the left side you see the cloud portal, which has support for SAS and ACS; on the right side you see the on-premise version of Service Bus, which only supports SAS and Windows authentication.

[Screenshots: access connection information in the Windows Azure Portal (left) and the Windows Azure Pack Portal (right)]

Conclusion

This ‘small’ addition to the SB-Messaging adapter has a great impact on interacting with Service Bus, as we can finally use the full potential of the on-premise version of Service Bus.

We have started implementing this feature at one of our customers, where one of the requirements is to use the on-premise version of Service Bus. The first tests are looking good, and the behavior and way of implementing are the same as when connecting to the Service Bus in the cloud.

However, one downside that I personally find is the lack of a Windows authentication option in the adapter.

 

Happy Service Bus’ing !!

Glenn Colpaert