Monday, December 28, 2015

OpenVPN Add Route requires Run As Administrator

By Steve Endow

I use OpenVPN to connect to my network when I'm remote.  There was quite a learning curve getting it set up and configured, but now that I've been through that gauntlet, it works pretty well.  I can use it on my Windows laptops, as well as my iPhone and iPad, which is a nice bonus.  So I have the option to pull out my iPhone, connect to my network, connect to my desktop or VM servers, and take care of a quick fix.

I recently bought a Surface Pro 4 and got it all configured, including OpenVPN.  It worked fine when I connected from my work network, so I thought I was all set.

I'm up at Lake Tahoe this week and I'm somewhat futilely attempting to get some work done when the kids are asleep.



When I used OpenVPN, it would connect to my VPN server, but for some reason I couldn't connect to any of the machines on my network.

Looking at the log, I noticed a bunch of error messages.


"ERROR: Windows route add command failed"

I also saw "Access is denied" listed in the log.  Although it did connect to my VPN server, I was unable to connect to the machines on my network.

Based on the "Access is denied" message, on a hunch, I set the OpenVPN GUI to run using Run As Administrator and then connected again.

It connected fine, there were no more error messages, and I was able to connect to my VM servers just fine.
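For reference, the routes the OpenVPN client tries to add come from route directives in the client's .ovpn file, or from routes pushed by the server; each one causes the Windows client to run a "route add" command, which requires elevation.  A hypothetical example (these addresses are made up, not from my actual config):

```text
# Client .ovpn file: add a route to the remote LAN through the tunnel
route 192.168.1.0 255.255.255.0

# Or the equivalent pushed from the server side config:
# push "route 192.168.1.0 255.255.255.0"
```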


I've seen UAC and the whole Run As Administrator feature result in so many frustrating errors, even when logged in as a local Administrator.  You just never know when it is causing a problem, so you always have to try it to rule it out as the cause of an issue.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

http://www.precipioservices.com

Wednesday, December 23, 2015

eConnect imports invalid Inventory Transaction causing divide by zero error when posting

By Steve Endow

I have the privilege of seeing lots of strange things in Dynamics GP.

I had a call with a customer who was having a problem posting a batch using Post Master Enterprise.  Other batches would post fine, but one batch kept failing at 97%, without any specific error message.

After resetting the batch status, we opened the batch in GP and reviewed the batch Edit List report.  No errors.  Everything looked fine.

We then attempted to post the batch within GP and at the same 97% mark, we received this error:


EXCEPTION_CLASS_SCRIPT_DIVIDE_BY_ZERO

Now we were making some progress!

The good thing about Divide by Zero errors is that you know what to look for: a zero.

We then queried the IV10001 table, and behold, there was the culprit.


The QTYBSUOM field had a value of zero, while every other row had a value of 1.

Once we changed the field value to 1, the batch posted successfully.

The client imports the IV transactions with eConnect, so obviously eConnect is not validating this field and is importing records with an invalid value.

We queried the IV10001 table for other records with this issue and found several more.  All were related to the same item number, so it seems to be isolated to a single item number.
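If you want to check for the same problem, something along these lines works (a sketch--the SELECT column names other than QTYBSUOM are my assumption of the standard IV10001 schema, and you should back up the database before running the UPDATE):

```sql
-- Find inventory transaction lines with a zero base U of M quantity
SELECT ITEMNMBR, QTYBSUOM FROM IV10001 WHERE QTYBSUOM = 0

-- After a backup, set the invalid rows to the expected value of 1
UPDATE IV10001 SET QTYBSUOM = 1 WHERE QTYBSUOM = 0
```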

The client is going to research that item setup to try and identify the root cause.

Fun times!








Monday, December 7, 2015

SQL Server Express idle feature causes .NET SQL error: TCP Provider The specified network name is no longer available

By Steve Endow

I just spoke with a partner who is testing Post Master Enterprise, which is a .NET application that automatically posts Dynamics GP batches.

Post Master Enterprise would run fine for a while, but would then log this error:

A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.) 


On the surface, this is indicating that Post Master is unable to find or connect with the SQL Server.  This is very strange, since Post Master was working fine, was communicating with the server, and the error seemed to occur randomly.

I asked the partner to check the SQL Server logs.  When he opened the logs from SQL Server Management Studio, we saw hundreds of these events.


"Server resumed execution after being idle 1792 seconds: user activity awakened the server"

The time stamp on the events corresponded with the network error that Post Master was receiving.

What in the world does that message mean?  Why would SQL Server be "idle", and since when does SQL need to be "awakened"?

When I searched for this message, I found this MSDN blog post:

http://blogs.msdn.com/b/sqlexpress/archive/2008/02/22/sql-express-behaviors-idle-time-resources-usage-auto-close-and-user-instances.aspx

It appears that SQL Server Express has a "feature" where it will become "idle" after some period of inactivity.  It looks like the feature was added under the guise of performance to somewhat cripple SQL Express.

I asked the GP partner if he was using SQL Server Express in his test environment, and he didn't think so, but when he checked @@VERSION, sure enough, it was Express.
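If you aren't sure which edition you're running, either of these standard queries will tell you:

```sql
SELECT @@VERSION
SELECT SERVERPROPERTY('Edition')
```

Look for "Express" in the output.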


So SQL Server Express was going to sleep, and when Post Master attempted to query the database, it took so long for Express to wake up that Post Master received a network error.

Lesson:  Don't use SQL Express, even for a test environment.

I can't remember the last time I used Express or even saw Express installed anywhere.





Wednesday, November 18, 2015

Want to become a Dynamics GP Visual Studio developer?

By Steve Endow

Today I received a question on Experts Exchange, asking how a developer who knows C# could learn how to develop with Dynamics GP.  Specifically, the person was looking for a class to learn how to develop Visual Studio solutions for Dynamics GP.


I thought it was a good question, but it's one of those questions where there isn't a simple answer.

Here's how I responded.

========================

Unfortunately, I don't believe there has ever been a formal Visual Studio developer training or curriculum for Dynamics GP.  The closest I've seen is on-site custom training conducted by a GP partner at a customer site (I have done such training for a customer).

I think there are two things you'll need to become a GP developer using Visual Studio:

1. Strong knowledge of Dynamics GP design, architecture, quirks, etc.
2. Knowledge of the Dynamics GP developer tools and APIs

#2 is pretty standard developer self-learning.  There is some decent documentation and a handful of samples to get started, but some areas will require some digging and time to understand and utilize.

Depending on how much you know about GP, #1 will be the more difficult part.  There is a lot of subtle background knowledge that helps inform how to design and develop Visual Studio customizations and integrations for Dynamics GP, but there is no simple training session that can capture it all.

You could start by researching the following:

1. Leslie Vail's book, Developing Microsoft Dynamics GP Business Applications, might be a good place to start to get oriented on the tools and options for GP development.
https://www.packtpub.com/application-development/developing-microsoft-dynamics-gp-business-applications

2. Read the MS documentation for Visual Studio Tools (VS Tools) for Dynamics GP
https://msdn.microsoft.com/en-us/library/cc543538.aspx

3. If you foresee a need to do any integrations, look into eConnect and GP Web Services.  I only use eConnect, and haven't found much value for Web Services, but it would be good to be aware of both.
https://msdn.microsoft.com/en-us/library/aa973830.aspx
https://msdn.microsoft.com/en-us/library/bb219081.aspx
https://msdn.microsoft.com/en-us/library/cc534132.aspx

4. Set up Dynamics GP on a development server, or create an Azure VM using one of the Azure Dynamics GP images, to get access to Dynamics GP and start developing some simple projects.

There are a fair number of blog posts and articles online about Visual Studio development for GP, but I believe most cover a single aspect or small topic, so it will take some searching to get a big-picture view of the options and processes.

In short, you can do it yourself, but it will require some time. If you have the budget to hire an experienced GP developer to work with you one-on-one, that will probably save you time and make the learning curve easier.

========================


If anyone has any additional tips on how to become a Dynamics GP Visual Studio developer, please let me know.




Tuesday, November 17, 2015

eConnect 2015 Error: The pipe endpoint EntityOperations could not be found

By Steve Endow

Back in the now distant year of 2010, I wrote a post about an eConnect error:  There was no endpoint listening at net.pipe.

That has been one of the most frequently accessed posts on my blog.

Well today I encountered a strange variant at a client.  I had upgraded an existing integration from GP 2010 to GP 2015, and the new GP 2015 integration was installed on the same workstation where the 2010 version had been installed.

When the client ran the integration, she received this error:

eConnect error: System.IO.PipeException: The pipe endpoint 'net.pipe://localhost/Microsoft/Dynamics/GP/eConnect/EntityOperations' could not be found on your local machine.


Since this wasn't the usual "no endpoint listening" message, I wasn't sure what the cause could be.

But since it involved the words net.pipe and endpoint, I checked the eConnect 2015 Windows Service to see if it was running.

Interestingly, the service was not running.  After starting the eConnect 2015 service, the user ran the integration again and it worked fine without error.

I have tested two of my GP 2015 development machines, but when I stop the eConnect service and attempt an integration, I get the standard "There was no endpoint listening at net.pipe" error message.

I'm not sure why the client saw something different, but since it's the first time I've seen it, and since it happened to occur with eConnect 2015, I thought I would document it in case someone else comes across this variant.

If you do see it, check the eConnect windows service and see if it is running.
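A quick way to check and start it from an elevated PowerShell prompt (using a wildcard, since the exact service display name varies by GP version):

```powershell
# Find any eConnect services and show their status
Get-Service -DisplayName "*eConnect*"

# Start them if stopped
Get-Service -DisplayName "*eConnect*" | Start-Service
```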




Monday, November 16, 2015

Supporting a Dynamics GP ISV solution involves far more than just product support

By Steve Endow

While reselling and supporting Post Master for Dynamics GP, I've worked with hundreds of customers who have reviewed, installed, tested, and implemented Post Master.

Most customers or partners do everything themselves without requiring any assistance, while a few need help with the configuration and testing.

Then there are some that have an issue and need actual support.  But by support, I don't just mean Post Master support.  That would be easy.

Over the course of the last 6 years supporting Post Master, I've seen some of the strangest issues, many of which had nothing to do with Post Master, yet I've had to help the customer resolve them so that they could use Post Master in their environment.

For example, this is one of my favorites:

A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The semaphore timeout period has expired.)

I've had a handful of customers with this issue--for some reason there is a TCP/IP network error when Post Master attempts to query SQL Server.  If you Google the error, you'll see lots of people have this issue under many different circumstances.  It is a low level network error being returned by Windows or SQL Server, and there is very little I can do to fix it directly.  It could be a network driver, NIC, switch issue, or maybe even a wiring issue.  But because Post Master happens to be the application that detects and logs the error, the customer expects me to be able to resolve it.

Another very common challenge is setting up SMTP so that Post Master can email posting reports or error notifications.  SMTP poses many challenges, but the most common is that most non-IT folks don't understand SMTP (and even some technical folks don't understand it).  With a name like "SMTP" this isn't surprising.  SMTP Server, Port, and SSL / TLS are somewhat technical values that a typical end user wouldn't know about.  And then you have SMTP relaying, which adds another layer of complexity.  And then there is relaying to internal vs. external email addresses.

A customer just contacted me, saying that emails to internal addresses are delivered fine, but emails to external addresses generate an error.  Looking at the Post Master logs, I found this explanation:

The server response was: 5.7.1 Unable to relay

The customer's SMTP server only allows relaying to internal addresses, so I asked the user to speak with their IT team to determine the best approach for relaying messages to an external address.  Unless you've administered an Exchange or SMTP server (which fortunately I have), these concepts aren't obvious.

Then there is SQL Server security.  Post Master has some custom SQL tables and stored procedures and a custom database role to manage access to those objects, independent of DYNGRP.  We have had a few customers with very strict SQL Server security auditing that automatically deletes all SQL objects that are not on an approved list.  No joke.  Or manual audits that resulted in security roles and permissions being deleted.  So we've had to figure out why Post Master's SQL security or objects disappeared, determine that a security audit caused the problem, and request reviews and exemptions from the audit team.

We had one government client that was having a FIPS compliance audit who needed us to modify the internal Post Master encryption to allow them to pass their audit.  Fortunately, I know what FIPS is, I understand encryption algorithms, and knew exactly how to modify Post Master to meet their requirements.  And I just happened to know the history of and the difference between Rijndael and AES, so I was able to figure out why the rarely used and negligently undocumented Microsoft Windows FIPS compliance feature was complaining about Post Master's encryption.  That's an entire world of technology that represents about 1% of Post Master functionality, but unless you fully understand it, you can't properly support it.

Then there is the very broad category of compatibility.  First, there is Windows.  I have found that User Account Control (UAC) produces such strange and inconsistent behaviors that I never know when it is causing the issue.  When something odd happens, I ask the customer to try using Run As Administrator to see if that helps.

And don't forget good old Binary Sort Order in SQL Server.  During one Post Master release, we accidentally had a SQL script that referenced a lower case table name, and a customer with Binary sort received an error.

We have come across other Dynamics GP ISV solutions that result in a mix of .NET 2 and .NET 4.x, requiring us to handle .NET compatibility issues in configuration files and at run time.  And then there are direct product incompatibilities, where an ISV solution will cause errors with Post Master, or prevent a GP batch from posting properly.

Then there are the overzealous consultants.  I maintain the Post Master user manual, which is now 41 pages in total.  It has grown over time as I have documented additional troubleshooting tools and techniques, compatibility issues, you name it.  I have even created an installation video and a configuration video, both of which provide step-by-step instructions.  But occasionally there is an eager user, often a consultant, who glosses over the manual and attempts to use their own brand of kung fu to resolve errors.  They will find a configuration file and edit it, despite the fact that nowhere in the user manual does it mention editing configuration files.  They will start modifying SQL Server security.  They will do things that I would have never anticipated, turning a 30-second resolution into a 15-minute support call.

So when it comes to supporting a software product, you need to know a whole lot more than just your product, and in many cases, you need to know them intricately--as in you need to be an expert.

This simple list doesn't begin to convey the depth of knowledge required, but I think these categories serve as a starting point for what I need to know to support Post Master:

Networking
Windows
SQL Server
SMTP
.NET
Dynamics GP
Other ISV solutions


Other products will likely require other ancillary areas of expertise, such as data sources, peripheral hardware like bar code scanners, external web services, you name it.

I suppose the same thing applies to Dynamics GP consulting.  Customers expect you to solve any problem even remotely related to Dynamics GP, and those issues can have virtually nothing to do with Dynamics GP, other than the fact that they are causing a GP error.  But one difference with supporting your own product is that you don't typically bill by the hour for that support--so you really need to have the knowledge to resolve issues quickly, even those that aren't directly related to your product.

Break out your Wireshark network packet capture tool and get to work!









Wednesday, November 11, 2015

A baffling bug in a modified SOP Transaction Entry window

By Steve Endow

A customer called me today asking for help with a GP customization.

They have a modified SOP Transaction Entry window and a modified SOP User Defined Fields Entry window.  The SOP Trx Entry window has two custom buttons and a little bit of VBA script, while the SOP UDF window has some fields hidden to simplify it, along with a custom button that opens a custom VBA window.

Pretty basic modifications--nothing special.


The client demonstrated that when they open the SOP UDF window, enter some values in the fields, click OK, and then save the parent SOP Invoice, the SOP UDF values are not saved.

If they open the same invoice and then open the SOP UDF window, the fields are all blank.


Hmm, that's interesting.

But wait! There's more!

They tested different scenarios, and figured out that if they do not use the modified SOP Trx Entry window, the modified SOP UDF window does save properly.

Yes. You read that right.  If they revert to the standard GP SOP Trx Entry window, their modified SOP UDF window saves values and retrieves them properly.

So, it seems that for some reason, the modified SOP Trx Entry window breaks the SOP UDF window.

I ran a SQL Profiler trace on the SOP UDF window while using the modified SOP Trx Entry window, and here is what I saw.

BEGIN INSERT INTO TWO.dbo.SOP10106 (SOPTYPE, SOPNUMBE, USRDAT01, USRDAT02,
USRTAB01, USRTAB09, USRTAB03, USERDEF1, USERDEF2, USRDEF03, USRDEF04, USRDEF05,
COMMENT_1, COMMENT_2, COMMENT_3, COMMENT_4, CMMTTEXT)
VALUES ( 0, '', '1900.01.01' ....

Notice that SOPTYPE = 0 and SOPNUMBE = ''

And that statement produces an invalid record in SOP10106.
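If you run into something similar, the orphaned rows are easy to find (a sketch using only the columns shown in the trace):

```sql
-- Locate SOP UDF records with no parent document
SELECT SOPTYPE, SOPNUMBE, USERDEF1, USERDEF2, COMMENT_1
FROM SOP10106
WHERE SOPTYPE = 0 OR SOPNUMBE = ''
```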


When I disable the modified SOP Trx Entry window, but keep the modified SOP UDF window, things return to normal.

BEGIN INSERT INTO TWO.dbo.SOP10106 (SOPTYPE, SOPNUMBE, USRDAT01, USRDAT02, USRTAB01, USRTAB09, USRTAB03,
USERDEF1, USERDEF2, USRDEF03, USRDEF04, USRDEF05, COMMENT_1, COMMENT_2, COMMENT_3, COMMENT_4, CMMTTEXT)

VALUES ( 3, 'STDINV2263' ....

Notice that the values for SOPTYPE and SOPNUMBE are populated.

Very, very strange.  Somehow, the modified SOP Trx Entry window is preventing the SOP UDF window from receiving the SOP Type and SOP Number values.

This set of modified SOP windows works fine on their old GP 2010 environment, but exhibits this problem on their current GP 2015 environment.  The modified forms did have some errors and problems when upgraded to GP 2015, so that may be a clue, and I don't know if they might have made a change that caused this issue.

But regardless, I found it a rather strange problem to have.

My recommendation is to recreate the modified SOP Trx Entry window from scratch and see if that solves the problem.  If not, also recreate the modified SOP UDF window from scratch.  The modifications shouldn't take long to reproduce, and I suspect that will be much easier and cleaner than trying to figure out the specific cause of the problem.

I'm always amazed at the strange things I continue to see.









Monday, November 9, 2015

GP Consulting Developers: How do you manage your dev VMs?

By Steve Endow

If you're lucky, there's GP 2010, 2013, and 2015.  Visual Studio 2010, 2013, and 2015.  If you aren't lucky, you've still got GP 10, or earlier.

Plus there's Analytical Accounting, Project Accounting, Manufacturing, and other lesser used modules, which are hassles unto themselves.

Then there are those clients with rather involved environments or third party modules or web services that are dependencies for your development.

How do you manage the many environments required to be a Dynamics GP developer for many customers?

When I first started using Hyper-V, I believe I had one general purpose GP 10 development VM.  I would then build a separate dedicated VM for some customers.  Building customer specific VMs gets old pretty quick.  It takes up quite a bit of disk space, and if the VMs aren't frequently used, or sit dormant for a year or more at a time, they become an administrative hassle.

I've now streamlined my VMs a bit and have what seems like a more manageable VM setup, but it isn't perfect.  I now have a total of 19 VMs set up on my server, but fortunately I typically only use 9 or 10 of them.



My primary VMs are:

GP2010 - Primarily VS 2010
GP2013 - Primarily VS 2013
GP2015 - Currently VS 2013, but hope to start using VS 2015 soon

GPPA - For Project Accounting, with GP 2010, 2013, and 2015 all installed in separate SQL instances

GPAA - For Analytical Accounting, with GP 2010, 2013, and 2015 all installed in separate SQL instances

I still have a few client-specific VMs.  One client used Project Accounting, but I just transferred that code to my GPPA server.  It's very difficult to test customizations and integrations without the client's database, but so far it has been manageable.

The other client specific VM has a pretty elaborate HR + Payroll setup in GP.  I could potentially migrate that one as well, but will probably wait until they upgrade to GP 2015.

While my "consolidated" VMs, one for each version of GP, generally work, there are some drawbacks.

1. It's often difficult to remember which server any given client project is on. Was it GP 2010? Or did they upgrade? Did they upgrade to 2013 or 2015?  I often have to dig up my documentation on each project to see what version the customer is on.

2. Third party products are a hassle on a shared VM.  For a GP 2013 customer, I have a third party product that has a dictionary, AddIns, and a web service.  Sometimes I have to disable the third party products to work on other client projects, as they change or disable standard GP functionality.

3. Multiple projects on one GP instance can be a hassle.  If I have 5 customizations using Modifier, but need to DAG just one, I have to make a backup of everything, strip out four of those Modifier customizations, and then DAG the single modified form.  Then I eventually have to go back and reimport the other modifications.  Or if I need to test something on an unmodified form, or make a new customization of the same form, I have to swap between them.  It can get messy and is difficult to keep such an environment clean and organized.

4. I've been trying to dig into SBA, and given its setup requirements, I've put it on a dedicated VM so that I can have Active Directory, IIS, and the whole pile of other prerequisites it requires, without having to mangle my other VMs or create any dependencies on other VMs.

5. Given the size of the consolidated VMs, the 16GB in my current server limits the number of VMs I can have running at once.  So I regularly have to save one VM before launching another one.  I'm planning on upgrading my server to 32GB, which should resolve this limitation.

There are other small annoyances, like managing different types of test data, but these are five that happened to come to mind at the moment.

I have never really looked into differencing disks (or whatever they are called now) with Hyper-V.  If I were a Hyper-V guru, I suppose there is some option to have a base VM and then work with multiple diffs, but I don't know how much easier multiple diffs would be to manage, or if performance would be an issue.

Anyway, I'm curious if anyone has tips or techniques that have made managing multiple Dynamics GP VMs for multiple customer projects easier.


UPDATE:  I was just reminded of another situation that my VMs don't currently handle:  Deployment testing.  A client tried to install a customization I developed, but the customization had a dependency on some SQL Server SDK DLLs.  The GP workstation doesn't have SQL Server or the SDK DLLs installed, so naturally my customization doesn't work.  So I now need to configure YET ANOTHER machine that has the full Dynamics GP 2015 client, but does NOT have SQL Server on it.  Seems like the possibilities never end.






High Volume GP Integration Tip: Increment your batch ID every X thousand records

By Steve Endow

I've developed several high volume Dynamics GP integrations for customers that needed to import tens of thousands of transactions at a time, and I also work with customers who use Post Master Enterprise to automatically post their batches--many of whom have very high volume integrations.

While the process of importing 20,000 or even 100,000 transactions into Dynamics GP is not terribly different than importing 100, there is one thing that can be challenging with such high volume GP integrations:  Batch Posting.

eConnect imports can shove a lot of data into Dynamics GP very quickly, but once the import is done, you are stuck with a massive pile of transactions to post in Dynamics GP.  And those batches can take a long time to post.

One thing I always recommend to customers who have to import thousands of transactions at a time is to have their import break up the data into multiple Dynamics GP batches.  So instead of having a single batch with 40,000 transactions, have eight batches with 5,000 transactions, or even 20 batches with 2,000 transactions.  There is no magic number for batch size, but I would personally keep them under 5,000 per batch.

I recommend this because if there is a batch posting failure or an error that causes the posting to stop, you won't be holding up 39,000 transactions.  If the batch size is 2,000 transactions, a posting failure will only delay posting for those 2,000 transactions, while 38,000 transactions in the other batches can continue to post.

So how do you split your import of 40,000 transactions from a single data file into multiple batches?

I'm glad you asked!  It's quite easy:  You just change your batch ID.

In your import, you will maintain a transaction counter that gets incremented every time a transaction is imported.  After that counter hits X thousand records, you change your batch ID and reset your counter.

I prefer to make the batch size a configuration option in my integrations.  So the customer can change it to any value they want at any time.  Then, when I assign the batch ID to the transaction, I check the counter.  If the counter hits 5,000 (or whatever value is configured), I increment the batch ID suffix, like INV12345-1, INV12345-2, INV12345-3.

If you have a .NET eConnect integration, this is just a few simple lines of code.  If you are using SmartConnect or Scribe, you would probably need a few global variables to store your batch size and transaction count, and a few lines of script in your Batch ID field.
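As a language-agnostic illustration (a Python sketch, not actual eConnect or SmartConnect code), the counter logic amounts to this:

```python
def assign_batch_ids(transactions, base_batch_id, batch_size=5000):
    """Yield (batch_id, transaction) pairs, starting a new numbered
    batch suffix every batch_size transactions, e.g. INV12345-1,
    INV12345-2, INV12345-3, and so on."""
    for i, trx in enumerate(transactions):
        # Integer division gives us which batch this transaction lands in
        suffix = i // batch_size + 1
        yield ("{0}-{1}".format(base_batch_id, suffix), trx)
```

With a batch size of 5,000, a 40,000-row import lands in eight batches, each of which can post (or fail) independently.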

To me this technique of managing batch size seems very obvious, but I have had dozens of phone calls with customers who are importing tens of thousands of transactions into a single batch, and then stressing over the batch posting process at 1am, because a failed batch posting would take them hours to re-post.

Skip the stress--split your batches.

Happy importing, and merry posting.





Sunday, November 8, 2015

SmartConnect Calculated Field VB Script to generate unique cash receipt check numbers

By Steve Endow

A Dynamics GP client has a situation where they may receive multiple payments from a customer with the same "check" number on different days.  So on Monday November 2, they might receive check "12669", and on Wednesday November 4, they might receive another payment marked with the same check number "12669".

I don't know the background of the source data, but it's coming in electronically (such as from a lock box, bank web site, or other cash management system), and it is being imported using SmartConnect.

If you attempt to import a customer check number that already exists in GP, SmartConnect will spew this eConnect error at you.

"Duplicate check number for this customer already exists in RM00401"


So, in the Dynamics GP import tool that doesn't require any programming, you'll need to do some programming if you want to work past this error.

First, open the Mapping window and click on Additional Columns.  Then add a new Calculated column.

I called mine the overly long but self explanatory "UNIQUECUSTOMERCHECKNUMBER".


In the calculated column, paste in this script.  I would have written it in C#, but the client's other SmartConnect integrations have some VB, so I went with VB.

This is my inaugural SmartConnect script, so excuse me if I'm doing anything wrong or sloppy.  And I'm sure there are at least 3 different ways to do this in SmartConnect, so if you have a better way, post a comment and let me know.

As a side note, one reason why I hate "scripting" in any tool is that it is typically a euphemism for "Here's an empty dumb window with no Intellisense or feedback about the code you are writing".  It's a glorified version of Notepad.  It makes coding horribly tedious compared to Visual Studio, as modern IDEs no longer require you to memorize every excruciating detail about the language syntax.  So if you also find such scripting windows painful, just write the script in Visual Studio first, and then paste it into SmartConnect.


        ' _SQLDEF_GP2013 is my SmartConnect default connection string variable
        Dim connString As String = _SQLDEF_GP2013
        Dim commandString As String
        Dim recordCount As Integer
        Dim dateString As String

        ' Count existing cash receipts (RMDTYPAL = 9) in RM00401 for this
        ' customer and check number.  Parameters are used rather than string
        ' concatenation to avoid SQL injection.
        commandString = "SELECT COUNT(*) AS RecordCount FROM RM00401 WHERE CUSTNMBR = @Customer AND RMDTYPAL = 9 AND CHEKNMBR = @CheckNumber"

        Using conn As New System.Data.SqlClient.SqlConnection(connString)
            conn.Open()
            Using cmd As New System.Data.SqlClient.SqlCommand(commandString, conn)
                cmd.CommandType = System.Data.CommandType.Text
                cmd.Parameters.AddWithValue("@Customer", _CUSTOMERID)
                cmd.Parameters.AddWithValue("@CheckNumber", _CHECK_CARDNO)
                recordCount = Convert.ToInt32(cmd.ExecuteScalar())
            End Using
        End Using

        ' MMDDYY suffix used to make a duplicate check number unique
        dateString = System.DateTime.Now.ToString("MMddyy")

        If (recordCount > 0) Then
            Return _CHECK_CARDNO & dateString
        Else
            Return _CHECK_CARDNO
        End If


This script queries the RM00401 table to count any cash receipt records for the customer and given check number.  If it finds any, it appends the date (MMDDYY) to the end of the check number value and returns it back to the Check Number field in the Mapping.
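Stripped of the database access, the decision the script makes boils down to a few lines. Here is the same logic sketched in Python, purely for illustration--the real duplicate check is the RM00401 query above, and the function name is my own:

```python
from datetime import date

def unique_check_number(check_number: str, duplicate_exists: bool, today: date) -> str:
    """Append an MMDDYY suffix when the customer/check number pair already exists."""
    if duplicate_exists:
        return check_number + today.strftime("%m%d%y")
    return check_number
```

So a second "12669" received on November 4, 2015 would be imported as "12669110415", while a first-time check number passes through unchanged.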


Please note that I've highlighted two variables above.  The first one, _SQLDEF_GP2013, is the SmartConnect default connection string for my server.  You will want to change that variable to point to your server.

In the Edit Calculation window, expand Default Connections -> Generic -> MSSQL Default to find your connection string variable.


The second variable I've highlighted, "_CHECK_CARDNO" is the name of the check number field in my particular source data file.  Yours will likely be different, so you'll need to change that to match your check number field.

I suspect that this situation of duplicate customer check numbers is fairly unusual, so you may never need the script for this specific scenario, but the same design is useful any time duplicate document numbers need to be imported--just change the SQL query to match your particular document type.

Happy coding.

Thursday, November 5, 2015

Run a SQL script across all Dynamics GP company databases

By Steve Endow

This may be posted elsewhere, but I just whipped it up quickly and thought I would post it for posterity.

If you need to run a SQL script across all Dynamics GP company databases, this is one way to do it.  There are other approaches, but a cursor over the company list in SY01500 is relatively simple.


USE DYNAMICS

DECLARE @INTERID varchar(10)

-- Cursor over every company database registered in the DYNAMICS database
DECLARE INTERID_Cursor CURSOR FOR
SELECT INTERID FROM DYNAMICS..SY01500

OPEN INTERID_Cursor

FETCH NEXT FROM INTERID_Cursor INTO @INTERID

WHILE @@FETCH_STATUS = 0
BEGIN
  -- Replace this EXEC with the statement you need to run in each company
  EXEC('UPDATE ' + @INTERID + '..SOP10100 SET WorkflowApprStatCreditLm = 9 WHERE BACHNUMB LIKE ''ABC%'' AND WorkflowApprStatCreditLm = 1')
  FETCH NEXT FROM INTERID_Cursor INTO @INTERID
END

CLOSE INTERID_Cursor
DEALLOCATE INTERID_Cursor
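The same per-company pattern can also be driven from application code. As an illustrative sketch (Python; the {db} placeholder convention and function name are my own), you could expand a SQL template into one statement per company database, using the INTERID list from SY01500:

```python
def build_company_statements(company_ids: list[str], template: str) -> list[str]:
    """Expand a SQL template containing {db} into one statement per company database.

    company_ids would come from: SELECT INTERID FROM DYNAMICS..SY01500
    """
    return [template.format(db=db) for db in company_ids]
```

You would then execute each statement against the server with your data access library of choice.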




Sunday, November 1, 2015

Dynamics GP Error: Script CheckListAccess of form [Not found] when using window Shortcut

By Steve Endow

On this beautiful sunny Sunday morning in Los Angeles, I'm working on a modified Dynamics GP Item Maintenance window and writing the VS Tools code to manage the data on some custom fields.



On my GP 2015 server, I have GP customizations for multiple clients.  Before using the DAG to create a new Modified Forms assembly that included my changes to the Item window, I had to remove the other customizations.  So I renamed the FORMS.DIC in the data directory, removed the VBA files in the GP application directory, and then re-imported the package for the modified Item Maintenance window.

All that went fine.  But after I completed that process, whenever I opened Modifier to make a change to the Item Maintenance window, I got this error.

Unhandled script exception: Script 'CheckListAccess of form [Not Found] not found.

My guess was that there were two possible causes: either an issue with my modified form and/or the FORMS.DIC, or a problem with the Modified Forms access settings caused by my process.

I tried renaming my FORMS.DIC and re-importing my modified form package.  But the problem still existed.

So I then opened the Alternate/Modified Forms and Reports window and removed all selections for modified windows.  In addition to my Item Maintenance window, I still had selections for the other modified windows that were no longer present, as I had a new FORMS.DIC.  After setting all windows back to Dynamics GP default windows, I clicked on Save.

I then went back into the Alternate/Modified Forms window and selected my modified Item Maintenance window.


I then clicked on Save.  With that cleaned up, I opened my modified Item Maintenance window by using Cards -> Inventory -> Item, and then selected Modify Current Window.

The error didn't appear.  Great, I solved the problem--or so I thought.

I then clicked on a shortcut I had created to open my modified Item window.


When I then clicked on Modify Current Window, I received the error message again.

So it seemed like the problem was with my shortcut.  I deleted the shortcut and recreated it, thinking that would fix it.  I thought it worked okay the first time, but when I tried it again, I got the same error when opening Modifier.

So when I opened the modified window through Cards -> Inventory -> Item, I didn't get the error.  But if I opened the modified window with a shortcut, I did get the error.

But later, I got the error again when opening the window using Cards -> Inventory -> Item, so it didn't seem to be the shortcut after all.

Coincidentally, the VS Tools AddIn I was developing had some errors with two of the 10 custom fields I had added to the Item Maintenance window in Modifier.  I could access 8 of the 10 fields fine, but the other two caused an error.

So at this point, my guess is that my modifications to the window somehow got corrupted, and even exporting the package and reimporting it from scratch didn't resolve the issue.

Because of the errors I got in VS Tools, I have just started from scratch in Modifier and added all 10 custom fields to a fresh Item Maintenance window to see if that fixes things.


Friday, October 30, 2015

Dynamics GP SOP Return transaction distributions: A Puzzling Observation

By Steve Endow

While updating an eConnect SOP integration, I observed something that I thought was strange.

My integration was originally designed to only import Invoice transactions, but the client now needed it to also handle Return transactions.  Okay, no problem.

So I made the small changes necessary to import SOP Returns as well.  The import ran just fine, importing both Invoices and Returns, and I thought all was well.

While testing the new integration, the customer noticed that the invoice distributions were not being calculated correctly--I had forgotten to reverse the distributions for the Returns.  Not a problem, simple fix.

But while testing the import and checking the corrected distributions, I noticed something that puzzled me.  I would open the Sales Distribution Entry window for the Return, and everything looked fine:  CR Receivables, DR Sales.


But when I clicked OK, I received this message.


Huh.  Okay.  So I'm not familiar enough with SOP distributions to understand the logic behind this, but for some reason, GP requires that SOP Returns have COGS and INV distribution lines prior to posting, while Invoices do not.

If I clicked on the Default button, GP populated the window with the distributions that it was expecting.

Okay, so I learned something, but that part made sense.  I could deal with that requirement and understand it.

But here's where things got strange.

When the client tested with this same transaction, they did not get the COGS distribution dialog.  They only needed to have two distribution lines:  RECV and SALES.  GP was not requiring them to have a COGS and INV distribution.


I checked with the client, and they didn't know why they didn't need the COGS and INV distributions.  So I checked with the client's GP partner.

The initial thought was that maybe I didn't have default accounts for my test inventory items in Fabrikam.  It turns out the COGS account was blank, so I set that default account.  But my eConnect import was still only set up to import Sales and AR, so that didn't affect the imported distributions, and GP still insisted that I have the COGS and INV distributions.

The next thought was that maybe the Item accounts weren't defaulting.  But those were defaulting fine and were populated.

While I was searching for an answer, the client confirmed that their Return transactions do not require, and do not have COGS and INV distributions--at all.

The GP partner then suggested that the distributions were simply not visible in the client's environment--GP was generating COGS and INV behind the scenes, but just didn't display them on the Inquiry windows or the Edit List reports.  This seemed unlikely to me, since my Edit List report was pretty explicit about those two distributions being missing.


The client even tracked down a GL JE from a posted Return and confirmed that AR, Sales Tax, and Revenue were the only three accounts affected.  So there were no hidden distributions.

I didn't buy any of the explanations so far.  They just didn't seem to make sense given the symptoms I was seeing.

So I manually entered a simple Return transaction in Fabrikam and took a look at the Distributions.  My Return with a single line using a test item had a COGS and INV distribution amount of $0.01.


So why were those distribution lines $0.01?  Because the item had a Standard Cost of $0.01.


So this told me that GP was requiring the Return to have COGS and INV equal to the Standard Cost of the item.  But the client didn't have COGS or INV distributions....

Okay, so what if the Standard Cost of the item was $0.00?  That's a great question--let's give that a try.

In GP 2010, you can just change the Standard Cost, but in GP 2013 R2, you have to use the Change Item Standard Cost utility.


And before you change the Standard Cost, this window will require that the item not be present on any purchase orders or unposted receipts/invoices.  Okay, done.

So I then changed the Standard Cost to $0.00.

And behold.  When I then entered a SOP Return, Dynamics GP no longer required me to have COGS and INV distributions.


So why would the client have a standard cost of $0.00 for inventory items?  That actually makes sense, and seems obvious in hindsight, as they sell a subscription service that does not have COGS or inventory value.  (Why they track those subscriptions as inventory items, I don't know...that's a different research project.)

I'm now assuming that all of the client's inventory items are zero cost, and will therefore never need COGS and INV distributions.
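To summarize what I observed, here is a rough model of the distributions GP seemed to require on my SOP Returns. This is Python for illustration only--it is my inference from the tests above, not GP's actual posting logic:

```python
def expected_return_distributions(sale_amount: float, qty: int, standard_cost: float):
    """Model of the SOP Return distributions GP appeared to require.

    The RECV/SALES pair is always present; the INV/COGS pair is only
    required when the item's standard cost (and therefore the inventory
    value of the return) is greater than zero.
    """
    dists = [("RECV", "CR", sale_amount), ("SALES", "DR", sale_amount)]
    inventory_value = round(qty * standard_cost, 2)
    if inventory_value > 0:
        dists.append(("INV", "DR", inventory_value))   # item coming back into stock
        dists.append(("COGS", "CR", inventory_value))  # reverses the cost of goods sold
    return dists
```

With a $0.01 standard cost you get the four-line version GP was demanding from me; with a $0.00 standard cost you get the two-line version the client was seeing.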

So that was a multi-hour journey of puzzlement over a $0.01 Standard Cost.


Missing buttons on the SOP Transaction window? Turn off the Action Pane OR Use the Go To button

By Steve Endow

UPDATE:  Thanks to Ian Grieve for letting me know that when the Action Pane is enabled, the buttons on the SOP Transaction Entry window will be listed under the Go To button on the Action Pane.  Mystery solved.




Thursday, October 29, 2015

If you add your SQL files to Visual Studio, you may want to remove the sql exclusion from gitignore

By Steve Endow

I'm not sure why I haven't run across this issue before--perhaps it has always happened and I didn't notice?

I recently migrated an old Visual Studio solution from SVN to Git in Visual Studio 2013.  After committing the project, I noticed that there were two files still listed under "Included Changes" of the Team Explorer tab.


Puzzled, I checked the files in my solution.  They had the small blue plus sign next to them, indicating they had been added.

I tried to commit the solution again, but Visual Studio told me:

An error occurred. Detailed message: No changes; nothing to commit.

I Googled this, and not surprisingly, I ended up on a Stack Overflow thread discussing this issue.

It pointed me to the gitignore settings file and sure enough, there was the culprit.


And that explains why my SQL files were not committed.  Here is what my project looks like:


I have a folder named SQL where I store the SQL scripts related to the project.  I started doing this so that I could have a centralized place where all code, including SQL, could be tracked, versioned, and backed up.

So while I found the problem, it makes me wonder--why is sql/ excluded by default in the gitignore file?  This thread mentions that the SQL exclusion is apparently a "best practice", but there is no such reference in the docs, and I find that puzzling, at least in my situation.  Am I doing something wrong?  I can't imagine that it's some taboo practice to have a SQL folder in my project to store my SQL files.

What do other people do to manage their SQL scripts related to a project?

So now I have to go back through my other projects that I've set up with Git and see if the SQL folder and all of the SQL files have been ignored.  And I need to remember to modify my global gitignore on every one of my dev servers in my HyperV stable.
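As a small time-saver, something like this sketch can strip the rule from every .gitignore under a projects folder. This is Python for illustration--the exact pattern line ("sql/") and folder layout are assumptions about your setup:

```python
from pathlib import Path

def remove_gitignore_rule(root: str, rule: str = "sql/") -> list[str]:
    """Remove an exact-match pattern line from every .gitignore under root.

    Returns the paths of the files that were modified.
    """
    changed = []
    for gitignore in Path(root).rglob(".gitignore"):
        lines = gitignore.read_text().splitlines()
        kept = [line for line in lines if line.strip() != rule]
        if len(kept) != len(lines):
            gitignore.write_text("\n".join(kept) + ("\n" if kept else ""))
            changed.append(str(gitignore))
    return changed
```

Any previously ignored SQL files still need to be staged and committed afterward--removing the rule only stops Git from ignoring them going forward.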

The learning never stops.  Maybe I need a Git sensei.


May you have many happy commits in your future.
