When Microsoft introduced the DataMart connector in Management Reporter 2012 (aka MR2012) back in 2012 with the RU3 release, they probably didn’t put much thought into the fact that some users run multiple GP companies, up to several dozen databases, on the same SQL instance… Yet at the time, one of the benefits highlighted was the performance enhancement the DataMart was supposed to provide.
When the RU3 release came out, I installed it on my TEST bed GP server, where I already had a copy of MR2012 installed as a simulation environment for upgrades. We only run about a half-dozen GP companies with financial data and two consolidation companies for the eliminations. Though our CFO was never really impressed by the performance of MR since it took over from FRx 6.7 in our company, he wanted to believe that over time the product would improve and become a solid replacement candidate for FRx. Needless to say, it took us about one year of parallel testing of financial reports between MR 2.0 (at the time) and FRx before deciding to dump FRx as the company’s financial reporting tool and make the jump to MR 2012.
Over time the performance of the product improved, and it became more stable and less prone to bugs and various errors of unknown nature. So when the DataMart was introduced, I tested it on my GP test system, expecting the much-promised performance improvement over the “Legacy Connector”, especially when it came to consolidated reports across multiple companies… The first impression was not really ‘impressive’… I would even say that the DataMart was slower than the Legacy connector at completing the generation process and opening MR Viewer. The other drawback was that the initial version of the DM didn’t support Multi-Currency reporting, a feature we absolutely needed given our international business and presence across our various GP companies.
One other drawback I discovered the day RU3 was installed on my TEST GP system was that there was now an MR service permanently connected to all of my GP databases that were “integrated” with MR 2012 through the DataMart connector. Every time I needed to restore one of my production databases to my GP test server, I first had to stop the MR services running in the background to free up access to my SQL databases.
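Before such a restore, you can check which sessions are still holding connections to the company database. This is a minimal sketch using the standard `sys.dm_exec_sessions` DMV; the database name ‘TWO’ is a hypothetical placeholder for one of your GP company databases:

```sql
-- List the sessions currently connected to the database you want to restore.
-- 'TWO' is a placeholder GP company database name; replace it with your own.
SELECT s.session_id,
       s.login_name,
       s.program_name,
       s.host_name
FROM sys.dm_exec_sessions AS s
WHERE s.database_id = DB_ID('TWO');
```

If the MR services show up in the program_name column, you know they still need to be stopped before the restore will go through.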
Another thing I realized pretty soon, since I monitor my various SQL servers in the company very closely and keep an eye on performance to catch processes that could trigger phone calls from end-users, was some unusual activity. The SQL instance on my GP test server was running strange tasks every minute or so that used up a good chunk of CPU, something I wasn’t really comfortable with. The process was hitting about 60-70% of my CPU capacity and lasting about 12-15 seconds every minute, and in some cases the CPU would even hit almost 100%. The I/O stats also showed some 5-6 MB being read and written every minute. Not that it was taking very long, but it was annoying.
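To pin down which statements are behind such CPU spikes, you can query the plan-cache statistics directly. This is a sketch based on the standard `sys.dm_exec_query_stats` DMV, not anything MR-specific:

```sql
-- Top 10 cached statements by total CPU time (worker time is in microseconds).
SELECT TOP (10)
       qs.total_worker_time / 1000 AS total_cpu_ms,
       qs.execution_count,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                ((CASE qs.statement_end_offset
                      WHEN -1 THEN DATALENGTH(st.text)
                      ELSE qs.statement_end_offset
                  END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```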

I tried to get in touch with Microsoft about this, and after a couple of e-mail exchanges I got confirmation that this was “by design” for the new DataMart connector, to allow the refresh of the financial data of the GP companies. I must confess that when I first installed the new release with the DM, I was very disappointed: in my mind ‘DataMart’ sounded like ‘Data Warehouse’, and I thought Microsoft would finally make the jump and build an Analysis Cube for the GP financial data to be used by MR 2012… Boy, was I wrong! After the installation of the new connector, I was the proud owner of yet another flat database called ‘ManagementReporterDM’.
I discussed the performance hit on the SQL server with Microsoft and explained that I found the default refresh interval of 1 minute for the GP financial data quite overkill, especially when you consider that financial reports are mostly about month-end data for past months… usually historical data between your last month-end closing and YTD. So I don’t really see the benefit of refreshing the data that often. I have other HR Analysis Cubes on my SQL server that refresh twice a day, and that’s more than enough…
My main gripe and argument towards Microsoft was that this refresh rate was out of user control: there was nowhere in the MR 2012 Configuration Console where you could act on this setting. In reply, Microsoft sent me a SQL script that I could use (with great care of course, and no liability on their part 🙂 ) to find out which settings are currently active in the MR 2012 configuration database.
/************************************************************************************
This script reads out some of the important MR2012 system parameters that are
scheduled to be refreshed at regular intervals.
Pick the trigger ID that needs to be changed and update its value with the second
statement below.
(c) April Olson, MS, 2014-04-14
************************************************************************************/
SELECT t.id AS TaskID
, t.name
, ts.StateType
, ts.Progress
, ts.LastRunTime
, ts.NextRunTime
, tr.IsEnabled
, tr.Interval
, tr.UnitOfMeasure
, tr.id AS TriggerID
FROM
scheduling.Task t JOIN scheduling.TaskState ts ON t.id = ts.TaskId
JOIN Scheduling.[trigger] tr ON t.TriggerId = tr.id
WHERE TypeId = '55D3F71A-2618-4EAE-9AA6-D48767B974D8';
The result would be something like this:

In this case the value we’re looking for is the TriggerID (the original ‘ID’ field from the trigger table) for the task named ‘GL Transaction to Fact’. You may see several other entries with the ‘Interval’ value set to 1; in my case above, I had already changed many of them to a 5-minute interval. The statement you’ll need to execute to change the refresh interval would look like this:
update scheduling.[trigger] set Interval = 5 where ID = 'A4A77F6F-4310-492F-BA02-953CBB395093';
This will update the trigger schedule table, and you’ll see the difference on your SQL server almost instantly; there is no need to restart your MR services. The UnitOfMeasure value in the table seems to carry different effects: I’ve seen the dates in the NextRunTime field set 24 hours apart from the LastRunTime value while the Interval was still 1. My findings so far are that UoM = 2 is for minutes and UoM = 4 is for days… so maybe 3 = hours…
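To see how the different UnitOfMeasure values are actually used in your own installation, a read-only inspection query against the same trigger table is harmless and gives a quick overview:

```sql
-- Group the scheduled triggers by unit of measure to see which values occur
-- and what interval ranges they carry.
SELECT tr.UnitOfMeasure,
       COUNT(*)         AS TriggerCount,
       MIN(tr.Interval) AS MinInterval,
       MAX(tr.Interval) AS MaxInterval
FROM scheduling.[trigger] AS tr
GROUP BY tr.UnitOfMeasure
ORDER BY tr.UnitOfMeasure;
```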

Try it out, and maybe you’ll be able to find the best settings for your setup. As always, never carry out this kind of tweaking on your production system without having tested it first in a development or test environment. Also, do a full backup of your MR databases in case something goes wrong. I’ve heard from some end-users who have to struggle with some 50-100 GP companies, and the 1-minute refresh cycle is definitely not appropriate for situations like that… the refresh cycle can’t even run to the end before the next cycle is already triggered by the system.
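For the backup step, plain full backups of both MR databases are enough. This is a sketch assuming the default database names (`ManagementReporter` for the configuration database, `ManagementReporterDM` for the DataMart) and a placeholder backup path:

```sql
-- Full backups of the MR configuration and DataMart databases.
-- The disk paths are placeholders; point them at your own backup location.
BACKUP DATABASE ManagementReporter
TO DISK = N'D:\Backup\ManagementReporter.bak' WITH INIT;
BACKUP DATABASE ManagementReporterDM
TO DISK = N'D:\Backup\ManagementReporterDM.bak' WITH INIT;
```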
Read more about the thread in the Dynamics Community here.
I’d warmly suggest that Microsoft consider, in a future release of MR 2012, making these settings accessible in the MR2012 Configuration Console, so that at least the system administrator has control over them.
Until next post…
PS: I’ve created a new ticket on Microsoft Connect that you can vote for here (you need to be registered to vote)
Update (2020-03-30): Since the above link doesn’t work anymore after MS Connect was retired a few years ago, I re-opened a new request on the new Microsoft Ideas site that replaced MS Connect. Please go and vote for this feature to be added to MR 2012 Configuration console.
Thanks for the post, this was very helpful!
Hi Kerry,
I’m glad it helped. I’m going to publish an article related to the DR webinar from the 23rd with the various scripts I use on my SQL server.
This is great information.