Wednesday, June 25, 2014

Dynamics GP takes over 10 minutes to log in or switch companies (aka Don't blame Mekorma MICR)

By Steve Endow

I have had a few calls with a Dynamics GP customer that was having performance issues.

They have two separate Dynamics GP environments for two separate entities.  Environment A runs fine--GP only takes about 15 seconds to log in.  Environment B takes 10 to 12 minutes to load!!!  Environment A has about 160 company databases, and Environment B has about 150 company databases.

When this type of Dynamics GP performance issue occurs, I usually notice that the client misdiagnoses the problem and focuses on possible causes that are based on coincidence rather than analysis or data.

This client had recently installed Mekorma MICR in Environment B.  They believe that the performance issues started after they installed Mekorma MICR, so they were calling Mekorma support and spending hours trying to troubleshoot the issue.  Not surprisingly, they told me that Mekorma was baffled by the performance issues.

Knowing how Mekorma MICR works, and having worked with many companies that use Mekorma with a lot of company databases, I was very skeptical that Mekorma MICR was causing the performance issue in Environment B, especially since Environment A also used Mekorma MICR.  It just didn't make sense.

The contact at the client explained that they have had several internal meetings with DBAs and network engineers analyzing activity and packets, and they claim that they aren't able to observe any indicators of a performance issue.  Yet I can guarantee that having to wait 12 minutes for GP to log in definitively indicates a severe performance issue--I don't care what the CPU utilization stats say.

Environment A is a physical environment, with SQL Server running on a powerful server with 48GB of RAM.  It could probably use more RAM, but performance seems fine.  Environment B is a virtual environment with 24GB of RAM.  Why the difference in RAM?  I've found that for some reason, people think that virtual environments only need a fraction of the RAM that they would normally allocate to a physical machine.  24GB for a SQL Server that is hosting over 150 company databases seems grossly inadequate.  My desktop computer has 32GB of RAM, so I am baffled that a company is looking to conserve RAM by starving their production SQL Server virtual machine.

I recommended that the client try a simple test of allocating 48GB of RAM to the SQL VM in Environment B and restarting the server to allow SQL Server to reallocate its memory usage.

I then thought about checking to see how large the company databases were in both environments.  I had the client run this query to get a list of all databases and their sizes.

with fs as
(
    select database_id, type, size * 8.0 / 1024 size
    from sys.master_files
)
select
    db.name,
    (select sum(size) from fs where type = 0 and fs.database_id = db.database_id) DataFileSizeMB,
    (select sum(size) from fs where type = 1 and fs.database_id = db.database_id) LogFileSizeMB
from sys.databases db

The client sent me the results and one thing jumped out right away.

The tempdb database is 87 GB!  As the Mythbusters would say, "Now THERE'S your problem!"

I don't know for sure that the massive tempdb is the cause, but at the moment it's at the top of my list of suspects.

This Microsoft KB article explains that the easiest way to clear and shrink the tempdb database is to restart the SQL Server service. So I have recommended that the client schedule a time to restart their SQL Server.
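Before (or after) the restart, it may also be worth checking what is actually consuming tempdb.  This query against the sys.dm_db_file_space_usage DMV is a sketch (page counts converted to MB) that breaks reserved space into user objects, internal objects (sorts, hash joins, spools), and the version store:

```sql
-- Sketch: summarize current tempdb space usage by category, in MB
select
    sum(user_object_reserved_page_count) * 8.0 / 1024 UserObjectsMB,
    sum(internal_object_reserved_page_count) * 8.0 / 1024 InternalObjectsMB,
    sum(version_store_reserved_page_count) * 8.0 / 1024 VersionStoreMB,
    sum(unallocated_extent_page_count) * 8.0 / 1024 FreeSpaceMB
from tempdb.sys.dm_db_file_space_usage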

But what would cause the tempdb to become so large?  I speculate that one possible cause might be...wait for it...insufficient memory!  I haven't yet found specific documentation that links the two, but it is mentioned on a few forum threads such as this one:

So I speculate that the root cause is insufficient memory, which causes excessive use of tempdb.  As tempdb grows, its ability to compensate for lack of memory decreases.

I'm waiting to hear back from the client to see what changes they make and whether it improves performance.

12 minutes!!!!  That's just crazy.

Steve Endow is a Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

Tuesday, June 24, 2014

Is eConnect multi-threaded?

By Steve Endow

Just when you think you know something about something, a question comes up that gives you pause because you don't know the answer.

I just fielded a seemingly simple question about eConnect:  Is the eConnect service multi-threaded?

Good question, I didn't know.

I tracked down the GP 2013 eConnect Install Admin Guide and searched for "thread", and what do you know...  There is a configuration option to specify the number of threads for the eConnect service.  Kind of, but maybe not really.

The catch is that the configuration option is listed under the Incoming Service.

eConnect.Threads This key specifies the number of threads that are available
when the service starts. The default value is “0”.

<add key="eConnect.Threads" value="0" />

A value of “0” indicates a single thread will be used. You can set the value to any
number between 0 and 19. A value of 19 makes 20 threads available when the
service starts. A higher number of threads should allow the service to process
documents more quickly.

While that may sound geekishly cool, like turning on eConnect afterburners, there is a good reason why the default value is one thread.  The subsequent warning on page 34 explains:

There are scenarios where increasing the number of threads degrades system performance. If the server cannot support more threads, or custom code added to an eConnect pre or post stored procedures is not thread safe, adding threads may cause unexpected results. Changes to this key’s value require careful evaluation and testing.

So while this indicates that the Incoming Service supports multiple threads, can we then deduce that the main eConnect service supports multiple threads?  I guess I still don't know for sure since I've never bothered to try and develop a multi-threaded eConnect application.  Presumably someone at Microsoft has a definitive answer.

I've developed integrations that have processed millions of transactions and have only had a few situations where multiple threads might have helped to process transactions faster, but it would have required me to write a multi-threaded integration, which probably would have been difficult to justify.

But if you know that you will be running multiple large integrations at the same time and have a limited amount of time to import transactions, it might be worth investigating.  And testing...thoroughly!

Steve Endow is a Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

Tuesday, June 17, 2014

Integration Manager 11 / 2010 Error: Invalid object name SY00100

By Steve Endow

This week I helped someone troubleshoot and refine an Integration Manager script for use on GP 2010.

The integration is for a GL JE import with AA, so I fired up my dedicated AA virtual server.  I found that I didn't have IM installed on the server, so I installed IM 11 for GP 2010 from the 2010 R2 installation files.

I got IM installed and configured and tested the GL AA integration.  When I ran the import, I received this error message:

Beginning integration...
DOC 1 ERROR: System.Data.SqlClient.SqlError: Invalid object name 'SY00100'.
Integration Failed

The SY00100 table stores the name of the system database, a feature introduced with the named system database support in GP 2013.  I'm not an expert on the history of the SY00100 table, but my assumption is that it was added in GP 2013 and does not exist in GP 2010.  My GP 2010 version is 2044, or SP 3, which should be later than my IM install of R2, so it is puzzling.

I do have both GP 2010 and 2013 installed on this server, so I double checked that I'm using IM 2010 and that IM is pointing to my GP 2010 SQL instance.  And SY00100 definitely does not exist in my GP 2010 install.  So why is Integration Manager for GP 2010 looking for a table that only exists in GP 2013?

I'm assuming there is some explanation, which I'm guessing relates to some version discrepancy with my GP 2010 installation, but I don't know the answer at the moment.

Any ideas?

Fortunately I didn't have to get the integration running on my machine--I just had to troubleshoot a script, which I was able to do despite the error.

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

Wednesday, June 11, 2014

GP Customers: Talk to your Dynamics GP partner! Let them know if you are not happy with GP or with their service!

By Steve Endow

I am sure I have written about this at least once before, but today, in the last two hours, I had TWO conversations regarding Dynamics GP customers that are not happy with Dynamics GP and are not terribly pleased with the service their Dynamics GP partner is providing them.

The sad thing is that the customers contacted me before they contacted their GP partner!

Customers:  If you are not happy with GP, or if you are not happy with your Dynamics GP partner, the first step is to give them a call and let them know!  If you aren't satisfied with the service you get in response, by all means, shop for a new partner.  If another partner or two can't get GP to meet your needs, by all means, shop for a new ERP system.

But please don't just call someone up and tell them that you aren't happy with GP or your partner before you give your partner a chance.

Yes, partners should be checking in on their existing customers to maintain communication and make sure the customer is happy, but this rarely happens--we are all busy and often just don't give existing customers as much attention as we should.  I'm certainly guilty of that myself.

But it's very costly to switch partners, and it's incredibly expensive to switch ERP systems, so customers, it is in your best interest to try and work with your existing partner to address any concerns you may have regarding Dynamics GP.

If your partner can't provide the level of service you are seeking, then at least you have tried that option and you can move on to the more costly options next.

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

Dynamics GP crashes after installing AddIn DLL with System.IO.FileLoadException error

By Steve Endow

A client was trying to test some Dynamics GP AddIns on their server, but when they copied the DLL files to the AddIns folder, they could no longer launch GP.  It would crash immediately.

When we clicked on View problem details, it displayed a rather convoluted and generally meaningless pile of technical error information.  Here is an example of the error details:

Faulting application name: Dynamics.exe, version:, time stamp: 0x52784445
Faulting module name: KERNELBASE.dll, version: 6.1.7601.18409, time stamp: 0x53159a86
Exception code: 0xe0434352
Fault offset: 0x0000c42d
Faulting process id: 0x111c
Faulting application start time: 0x01cfacec50a1c5aa
Faulting application path: C:\Program Files (x86)\Microsoft Dynamics\GP2013\Dynamics.exe
Faulting module path: C:\Windows\syswow64\KERNELBASE.dll
Report Id: 91d6dc87-18df-11e4-88f1-001e4f4c0702

Problem signature:
Problem Event Name: CLR20r3
Problem Signature 01: dynamics.exe
Problem Signature 02:
Problem Signature 03: 52784445
Problem Signature 04: mscorlib
Problem Signature 05: 4.0.30319.18444
Problem Signature 06: 52717edc
Problem Signature 07: 26fb
Problem Signature 08: 0
Problem Signature 09: System.IO.FileLoadException
OS Version: 6.1.7601.
Locale ID: 1033
Additional Information 1: 0a9e
Additional Information 2: 0a9e372d3b4ad19135b953a78882e789
Additional Information 3: 0a9e
Additional Information 4: 0a9e372d3b4ad19135b953a78882e789

The only thing that seemed recognizable was this detail:

Problem Signature 09:  System.IO.FileLoadException

So presumably GP or the AddIn couldn't load a file.  While a clue, it doesn't narrow things down very much.  Is that due to a missing dependency?  Permission issue?  .NET version issue?  x64 vs. x86 issue?

After some searching, I found a GP forum post where another user had the same problem.  A very astute forum reader pointed me to Patrick Roth's blog post where he describes the general problem, but he doesn't mention the System.IO.FileLoadException detail.

Patrick explains that he finally tracked the problem to the Windows security feature that automatically blocks certain files when they are downloaded from another source.  I don't know the proper name for this annoying feature, but I call it "Block" / "Unblock".

When an EXE or DLL is downloaded, particularly using a web browser, Windows blocks access to the file until you go unblock it.  In the case of DLLs, it seems the only way to do this is to go into the file Properties and click on Unblock.
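If you have many DLLs to unblock, clicking through the Properties dialog for each one gets old quickly.  On systems with PowerShell 3.0 or later, the Unblock-File cmdlet can reportedly do the same thing in bulk (the path below is just an example; adjust it to your GP install):

```powershell
# Unblock every DLL in the GP AddIns folder (example path; adjust to your install)
Get-ChildItem "C:\Program Files (x86)\Microsoft Dynamics\GP2013\AddIns\*.dll" |
    Unblock-File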

This feature is insidious for software developers who have to distribute EXE and DLL files.  When the file is blocked, it is done at such a low level that you have very little information to track down the problem.  And with the growing use of services like OneDrive, DropBox, Box, etc., I speculate that downloaded customizations will run into this issue more frequently.

Anyway, many thanks to Patrick Roth for posting his observations, and for user Mary on the forums who pointed me in the right direction!

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

Friday, June 6, 2014

In praise of source code control: Do you use a source code repository for your Dynamics GP customizations?

By Steve Endow

In February 2012, I installed VisualSVN for source code control.

For years I had just been zipping up my entire projects, archiving copies with the date, time, and version in the file names.  It was simple, low tech, and actually worked well.  The only downside is that I would end up with dozens and dozens of zip files over time and would have to back up those archives.  By 2012, I finally gave in and decided to implement some type of source code control.

I had looked into Source Safe, but it was ancient, so I looked into its replacement, Team Foundation Server.  While I have heard some good things about TFS, it seemed like total overkill for me.  I didn't need lifecycle management--only source code control, so I kept looking.

While looking into open source options, I found VisualSVN for Windows, which looked like a great fit.  It integrates seamlessly with Visual Studio and uses TortoiseSVN to work with Subversion on Windows, so it's perfect for managing my .NET projects.  I've been using it for over two years now, and it has been fantastic.  It's extremely simple, very lightweight, and does exactly what I need.  I can submit code, retrieve code, view and compare prior versions, and have a centralized location where all of my code is backed up.  It has worked flawlessly.

Today was probably the first time that I REALLY had to use VisualSVN.  I had poked around some of the previous versions of certain projects or files, but I don't think I've ever had to actually revert to a prior version.  Until today.

A customer noticed an issue with a very complex HR + Payroll integration that I developed.  They believed that the issue started some time in March, but they only noticed the symptoms today.  I've been working on the project for the last week making numerous changes that haven't yet been fully tested, so I no longer had the old code, and couldn't make a quick fix to release to production.

But, I was able to open the VisualSVN repository browser and see when I had checked in changes and see when I had created different versions of my project.

I saw that the last release was in January, but didn't recall if it was ever released or when it might have been released to production.

Using VisualSVN I was able to look at each of the changes made in January and compare them to the prior code and my current code.  I then pulled down a separate copy of my code from the January 10th release, found the bug, fixed it, recompiled, and sent the client version 1.24.  It only took a few minutes once I figured out what I needed to do.  The one challenge is that I so rarely look at my old versions that I had to figure out how to retrieve the old code from VisualSVN.  But it was a simple process and let me quickly fix the problem.

If you aren't using source code control for all of your Dynamics GP customizations, I strongly recommend that you implement a solution.  Having worked with clients and partners that could not provide a copy of old source code, I'm going to venture to guess that most Dynamics GP partners do not use any type of source code control.  And I would even guess that if a developer quit or was unavailable, most partners would be unable to find and retrieve source code.  I've seen it several times, and it doesn't have to be that way.  It isn't difficult to implement source code control.

Update:  I don't currently do Dexterity development, but it appears that there is a Team Foundation Server (TFS) provider for Dexterity.  If you do both Dex and .NET development, TFS may be a better choice than VisualSVN, as it would be a single repository for both.

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

Thursday, June 5, 2014

Developing a RESTful JSON web service integration for Dynamics GP - Part 1

By Steve Endow

I just finished a very interesting Dynamics GP integration.  It is a RESTful JSON web service that allows a web site to integrate with Dynamics GP, a payment gateway, and the k-eCommerce / Azox Credit Card Extension.  Because it is a web service that will receive credit card data, it also includes HMAC authentication and AES encryption and runs over HTTPS.

As Fancy Nancy would say:  Très geek!

The customer uses an internal web site to record customer orders.  The web site collects the customer information, credit card information, and the products that the customer is purchasing.  In the past, the customer information was saved and then separately entered or imported into Dynamics GP.  Credit card transactions were processed through a payment gateway, and setup with recurring billing through the gateway.  All transactions were entered in GP after the fact.  The client wanted a real time integration between the web site and Dynamics GP so that the customer information is immediately saved to GP.

However, the customer did not want to store the credit card information in Dynamics GP or the Credit Card Extension module--even if it is encrypted.  They wanted to use "tokenization".  This involves sending the credit card information to the payment gateway and receiving a unique ID, or "token", that can later be used to process credit card transactions.  Once the token was received, it would be stored in the Credit Card Extension module so that future credit card transactions could be performed from within Dynamics GP.

So why RESTful?  Well, the customer's web site was developed using PHP on Linux.  That pretty much ruled out direct APIs such as COM and .NET libraries.  SOAP would have required the client's developers to do more work, so we went with REST, which they were familiar with.  REST is pretty simple and works well for this relatively straightforward integration.

Why JSON?  The customer's web developers were used to JSON-based web services, so that was much easier than trying to implement a SOAP based web service or anything that involved XML.  Neither the client's web developer nor I like XML, so I was open to trying JSON.

So with those architectural choices made, we had a few implementation details to work out.  How would we authenticate calls to the web service?  The client's developer recommended HMAC--something that I had, sadly, never heard of before.  HMAC is a very clever and simple authentication method where you hash the contents of a request combined with a secret shared key.  The resulting hash produces a unique "password" of sorts that is effectively impossible for someone else to forge.  When the web service receives the request and the HMAC value, it performs the same hash on the request data, with the same secret key.  If the resulting value matches the HMAC value submitted by the caller, the request is considered valid.  We chose to use SHA-256 for our hashing algorithm.
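The HMAC flow above can be sketched in a few lines of Python (illustrative only--the actual service was .NET and the caller was PHP, and the key and field names here are made up):

```python
import hashlib
import hmac
import json

def sign_request(body: bytes, secret: bytes) -> str:
    # Caller: hash the request body with the shared secret key (HMAC-SHA256)
    return hmac.new(secret, body, hashlib.sha256).hexdigest()

def verify_request(body: bytes, secret: bytes, received_mac: str) -> bool:
    # Service: recompute the HMAC over the received body and compare
    # in constant time to avoid timing attacks
    expected = sign_request(body, secret)
    return hmac.compare_digest(expected, received_mac)

secret = b"shared-secret-key"  # exchanged out of band, never transmitted
body = json.dumps({"customer": "ACME", "amount": "100.00"}).encode("utf-8")

mac = sign_request(body, secret)
print(verify_request(body, secret, mac))         # True: a valid request passes
print(verify_request(body + b"x", secret, mac))  # False: a tampered body fails
```

Because the secret never travels with the request, an attacker who intercepts the traffic cannot forge a valid HMAC for a modified body.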

Next was the credit card data.  Even though the web service would use SSL and only accept HTTPS requests, we still did not want to transmit the credit card data as plain text.  We decided to encrypt the credit card info using AES-128.  We thought that would be pretty straightforward, but it ended up being more difficult than we expected.  One challenge was figuring out how to choose an "initialization vector" (IV), and a second challenge was learning how AES is implemented in PHP vs. .NET--it took us a while to figure out that there are different default settings for PHP encryption, so we had to dig pretty deep and spend several hours researching every one of the encryption settings.  We finally figured out that PHP uses NULL padding by default, whereas my encryption library uses PKCS7 padding by default.  Got that?  Easy peasy lemon squeezee!  (okay, so it is not obvious at all)
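The padding mismatch is easy to see without any crypto library at all.  Here is a minimal sketch of the two schemes: PKCS7 always appends padding, even when the plaintext already fills a block, while NULL (zero) padding adds nothing in that case--which is exactly why ciphertext produced on one side wouldn't decrypt cleanly on the other.

```python
def pkcs7_pad(data: bytes, block_size: int = 16) -> bytes:
    # PKCS7: always append 1..block_size bytes, each equal to the pad length
    pad_len = block_size - (len(data) % block_size)
    return data + bytes([pad_len]) * pad_len

def null_pad(data: bytes, block_size: int = 16) -> bytes:
    # Zero padding: append zero bytes only when needed to fill the last block
    pad_len = (-len(data)) % block_size
    return data + b"\x00" * pad_len

card = b"4111111111111111"   # 16 bytes: exactly one AES block
print(len(pkcs7_pad(card)))  # 32 -- PKCS7 adds a full extra block
print(len(null_pad(card)))   # 16 -- zero padding adds nothing
```

The two functions produce different plaintext lengths for the same input, so the decrypting side must strip padding using the same scheme the encrypting side applied.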

In the next several posts on this topic, I'll explore each of the elements of the project in more detail.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

Illegal address for field TPEInitialize in script [Not Found]. Script terminated.

By Steve Endow

Today I worked with a customer who was receiving this error message when posting batches:

Illegal address for field 'TPEInitialize' in script '[Not Found]'. Script terminated.

We also saw:

Transaction not completed at initiating script exit, implicitly rolled back.

I've seen the first error a few times before, but didn't know the cause, other than that it is some type of internal Dynamics GP / Dexterity error.  This customer was seeing the error when trying to post batches using the Distributed Process Server.

Today, after working with the customer for over an hour, we found the cause of the problem in the customer's environment.  The customer has numerous complex Dynamics GP customizations.  When attempting to save and post a batch in Dynamics GP with a particular GP login, we saw that some permission errors occurred--the login didn't have sufficient access to several custom tables.  Those errors caused all sorts of failures and problems in the GP client.

Once we fixed the permission settings, we were able to save and post batches in GP, and then the batches posted successfully in DPS.

The error messages are pretty generic, so there may be other causes besides security, but if you run into them, try performing all of the steps of the process manually in GP with the same GP login.

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

How do you avoid accounting process mistakes? How about a checklist?

By Steve Endow

A customer purchased some software from me recently, and decided to pay with a credit card.  I processed the credit card transaction last week and all was well.

But today I received a check in the mail from the same customer.  The check was for the invoice that the customer had already paid by credit card.

It seems that someone forgot to record the credit card payment, causing the invoice to get picked up in the weekly AP check run.  Once I realized that it was a duplicate payment, I contacted my customer to let them know.

But, what if the check had gone to a very large vendor who automatically cashes all checks?  Presumably, hopefully, the extra payment would end up as a credit on the customer's account.  At some point, someone would probably learn of the credit balance.

But imagine if I weren't an honest vendor and just shrugged and deposited the check?  Would anyone ever notice the duplicate payment?  Would a reconciliation of the credit card transactions catch such a mistake?

People make mistakes and occasionally forget to do things, even in accounting departments. So how do you catch such mistakes?

A few years ago, while reconciling my own bank accounts, I noticed that I forgot to record some cash receipts in my accounting system.  I had received and deposited the check, but simply forgot to record the receipt and deposit transactions.  Not a huge mistake, but it makes things mighty confusing when you are trying to reconcile.  And I did it more than once, so clearly it was something I needed to fix.

In my case, I went low tech.  If you have ever read Atul Gawande's books or articles, you may be familiar with The Checklist Manifesto.  When you need to do things right, sometimes all you need is a simple physical checklist.

I went online and found a site where I could order a custom ink stamp.  I designed my stamp, ordered it, and a few days later I was stamping all of the checks that came in the mail.  It works fantastic.  And because the stamp is associated with receiving money, I love using it!

I note the date I received the check, the date I scanned it, entered it, and deposited it.  Normally I do everything on the same day, but sometimes I am busy and may take a few days to scan or enter or deposit the check.  With this stamp it's easy to see if I have forgotten a step or not completed some of the steps.  The check doesn't leave my desk until all of the steps are completed.  Super low tech, no batteries, and pretty much idiot proof.

Such stamps are pretty common--I got the idea for mine from seeing accounting departments at my customers.

Perhaps my customer needs a stamp for their AP invoices with steps to track when an invoice is paid via credit card, and a subsequent step to record such manual AP payments in Dynamics GP?  Custom stamps are cheap, while mistakes in the accounting department can be expensive.

Are there other common accounting process mistakes that you have seen?

Would a checklist help employees avoid such mistakes?

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

Tuesday, June 3, 2014

Recommended Reading: Dynamics GP and SQL Server Connection Issues and Disconnects

By Steve Endow

I'm working with a Dynamics GP customer that is experiencing some SQL Server connection issues and errors.  They started to notice the issue with a third party GP product that was logging SQL connection failures.  After looking into it further, they found that GP users on a Citrix application server were regularly being disconnected.

Network connectivity issues are often very difficult to diagnose and resolve, and they can be caused by several different things.  Network card drivers, driver settings, network cards, wiring, network switches, and routers could all potentially play a role.  I've had a customer report that upgrading the network drivers on their Dell server resolved the issue.  Another replaced a faulty network switch.  At another we found a bad network jack in a user's cubicle.  I've had a few cases where switching from shared dictionaries to local dictionaries resolved numerous GP errors--we speculate that dragging the large dictionary files across the network was causing the errors.

The customer I'm working with now did some searching and found this very interesting article by Kayla Schilling at The Resource Group, posted on the Dynamics University site.

It's a fascinating explanation of some Dynamics GP errors that may be caused by network connectivity issues with SQL Server.  In my experience, such network issues are relatively uncommon, so it is rare to have an article written with such authority by someone who has dealt with the issue multiple times.

I think it's a very impressive write up and recommend that all GP consultants review it, at least so that they are familiar with some of the possible symptoms of a network issue.

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter