Friday, April 20, 2018

The Challenge of Posting Dates with Automated Dynamics GP Imports

By Steve Endow

If you are familiar with Dynamics GP, you are likely familiar with the confusion that can be caused by the "Posting Date" feature.  Many customers have never opened the date expansion window in GP that reveals a transaction's Posting Date.


A customer calls and asks, "Why did my April invoice post to the GL in March? The invoice is clearly dated April 5!"

In addition to the confusion between Document Date and Posting Date, there is also the potential confusion caused by Transaction Posting Date vs. Batch Posting Date.


As if that isn't enough fun related to dates, things can get particularly interesting and challenging with automated Dynamics GP integrations.

The issue typically comes up at month end.  If an April 15 invoice posts with an April 16 posting date, it usually isn't an issue and nobody notices.  But if a March 31 invoice posts to April 1, that can cause problems.

When a user is entering transactions manually in GP, they can review the invoice, know whether it should be posted to March or April, and set the posting date accordingly.  But when an automated integration is importing data, it usually doesn't know which fiscal period a transaction belongs to.  It has to rely upon a data field in the source data to tell it what the posting date should be.

That sounds easy enough, right?

Unfortunately, it isn't always easy.


Above is some sample data from an integration.  A single invoice date is provided in the DOCDATE column.  And a Batch ID of 2018-04-20 is provided, implying that the transactions are related to April 20.  From this information, you could reasonably assume that the transactions should post to the 2018-04 fiscal period.

But what about this sample data?


This morning a concerned and upset customer called me asking "Why did our April invoices post to March??"

The batch ID of "20180401" indicated that these were April invoices and not March invoices.  But as we know, Dynamics GP doesn't care about the batch ID when it comes to posting.  The only date that matters is the Posting Date.

"But we don't import a posting date with our invoices. Only the Invoice (document) date!", the customer responded.

Good point.  The source data only contained DOCDATE, and their SmartConnect map was only set up to import the invoice Document Date field.

So why did all of their invoices in the 20180401 batch get posted to March 31?

Well, as I mentioned above, you have to know whether GP is configured to post using the transaction posting date or the batch posting date.  And to keep things confusing, it is possible to configure some batches to post using the transaction posting date, and have other batches post using a batch posting date.

So using the sample data above, why did the 20180401 batch post to March 31?

When importing transactions using eConnect (or SmartConnect, etc.), if the Batch ID specified for the transaction does not exist, eConnect will create the batch automatically.  You don't need to specify additional options--it will just handle it for you.

And when your Dynamics GP batch type is set to use the Batch Posting Date, guess what eConnect uses as the default value for the Batch Posting Date?  The document date.

So in the above sample data, the first invoice that is imported has a Document Date of March 31.  So eConnect dutifully creates a new batch with a posting date of March 31.  It then imports the invoices into that batch.  And all of the invoices in that batch will post to March 31.  Even if the invoice date is April 1.
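If you want to confirm this behavior after an import, you can compare the batch posting date that eConnect assigned to the document dates of the transactions in that batch.  Here is a minimal query sketch for an RM invoice batch--it assumes SY00500 (the batch header table) and RM10301 (unposted RM transactions), so adjust the work table for your transaction type:

-- Compare the batch posting date to the document dates of the transactions in the batch
-- (SY00500 = batch headers, RM10301 = unposted RM transactions; use the appropriate
-- work table, such as PM10000, for other transaction types)
SELECT b.BACHNUMB,
       b.GLPOSTDT AS BatchPostingDate,
       t.DOCNUMBR,
       t.DOCDATE  AS DocumentDate
FROM SY00500 b
JOIN RM10301 t ON t.BACHNUMB = b.BACHNUMB
WHERE b.BACHNUMB = '20180401'
ORDER BY t.DOCDATE;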

Okay, so the customer just needs to fix the March 31 dates, right?

Perhaps it is that simple.  Maybe there was a bug in their source data.

But what about invoices that are generated on April 1, but related to March?  What about a vendor invoice dated April 2 that is received from an external AP system on April 3, but was for a service performed in March?  An integration won't know the invoice should be posted to March--the source data would have to provide an additional clue, such as a separate Posting Date or Fiscal Period field.
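If the source system can provide that extra field, the integration can derive the posting date from it instead of from the document date.  Here is a minimal sketch of the idea, using a hypothetical staging table with a 'YYYY-MM' fiscal period column and assuming the posting date should be the last day of that period (if your fiscal periods don't match calendar months, you would look up the period end date from the GP fiscal period setup instead of using EOMONTH):

-- Hypothetical staging table with a separate fiscal period column.
-- Derive a posting date from the period when it is supplied; otherwise
-- fall back to the document date.
SELECT DocNumber,
       DocDate,
       FiscalPeriod,
       PostingDate = COALESCE(EOMONTH(TRY_CONVERT(date, FiscalPeriod + '-01')), DocDate)
FROM dbo.StagingInvoices;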

I've only encountered a few customers who were able to supply that fiscal period field separate from the document date field.  In my experience, it is not common for a source system to know the fiscal period for a transaction--most only have a single transaction date.

So when designing a transaction import for Dynamics GP, make sure to consider what happens when transactions are dated the last day of the month or first day of the month, and whether transactions related to a prior fiscal period may show up in your source data.  It can be surprisingly tricky.


Steve Endow is a Microsoft MVP in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+







Monday, April 16, 2018

My First SQL Saturday event: It was amazing

By Steve Endow

The Microsoft SQL Server community is amazing.

Amazing.

That's not an exaggeration or platitude.

On Saturday, I attended my first "SQL Saturday" event in Orange County, California.  I left speechless.


Several hundred people attended the event at a local college.  On a Saturday.  I overheard one attendee say that she woke up at 4am and had a 2+ hour drive from San Diego to attend. Presenters flew in from all over the country to speak at the event, with several speakers facing a snow storm and flight cancellations trying to return home.  They did this, without compensation, on a Saturday.  And some were planning on attending up to 10 other SQL Saturday events across the country.

And I should mention that the event was free for attendees.  Completely free.

When I arrived at 8am, there was a line of 40 or 50 people waiting to check in.  There were lots of volunteers helping people check in, handing out tote bags, re-printing passes, setting up tables, and preparing the event.  Before the first session started, they had set up tables with gallons of free coffee, bagels, danishes, and donuts.

The event is organized by PASS, a non-profit organization that helps support people who use Microsoft data technologies.

Ten companies sponsored the SQL Saturday event, which has the following mission statement:

Our Mission
The PASS SQLSaturday program provides the tools and knowledge needed for groups and event leaders to organize and host a free day of training for SQL Server professionals. At the local event level, SQLSaturday events:
  • Encourage increased membership for the local user group
  • Provide local SQL Server professionals with excellent training and networking opportunities
  • Help develop, grow, and encourage new speakers
When I signed up, I didn't know what to expect.  I thought it might be a casual user-group style meeting with a few speakers.  But it was much more like a full-fledged, intense, single-day SQL Server conference.

Several of the speakers that I saw were simply amazing.

Here are the sessions that I attended:

  1. SQL Database and Query Design for Application Developers
  2. Azure Basics for the DBA
  3. PowerShell for the SQL DBA
  4. Spotlight on SQL Server by Quest Software (vendor presentation)
  5. Data Pages, Allocation Units, IAM Chains
  6. The Query Store and SQL Tuning
  7. Fundamentals That Will Improve Query Performance


The 6 educational sessions were incredible.  I felt I knew a fair amount about 3 of the topics, but still learned a ton in those sessions.  And the 3 sessions with topics that were new to me had so much valuable content that I was dizzy by the time the session ended.  For example, I learned how the data is structured inside of an 8K data page--down to the byte!  WHAT?!?!

I took pages of notes on my iPad during most of the sessions, as they were all offering real world knowledge, experience, anecdotes, and lessons about how to use different SQL Server features and tools.

It was 6 solid hours of high quality content presented by SQL Server experts.  It was intense, valuable learning, and I was tired at the end of the day.

It was amazing.

If you work with SQL Server and have an opportunity to attend a SQL Saturday event, I recommend it.







Monday, March 26, 2018

My Typical Password: Is a 25 character minimum passphrase policy possible?

By Steve Endow

If you haven't read my prior post about passwords, perhaps read that first:

https://dynamicsgpland.blogspot.com/2016/10/how-do-you-choose-your-passwords-and.html


My "Passphrase Generator" has been working great since then. It isn't perfect, but has been working well enough for me.

I thought I was doing all the "right" things by using my passphrase generator and using a password manager religiously.

Using words with a max length of 7 characters, 2 numbers, and 1 symbol, I have been generating passphrases like:

Briony%4Cobwebs4    (16 chars)
Hyped/5Umber1    (13 chars)
Reecho%6Touzled8    (16 chars)
Tisanes#4Tangles6    (17 chars)


I considered these pretty safe passwords.

But I recently started listening to Kevin Mitnick's book, The Art of Invisibility, on Audible. In that book, Mitnick recommends that people now use passphrases of at least 25 characters.

25 characters?!?!?  (interrobang)

That's crazy!

But is it?

Those of us who work with Dynamics GP regularly bemoan its 15 character password limit, as many of our customers have run into it.  A customer's IT security policy requires passwords of at least 15 characters, and users eventually discover that their 16+ character passwords don't work in Dynamics GP.


So obviously that rules out using 25+ character passwords for Dynamics GP.

But I'm pretty sure I've run into web sites that would not allow me to have anywhere near 25 characters.

There's only one way to find out, so I reset my password on a few web sites.  The numbers below aren't the sites' limits--just the lengths of the long passwords I randomly generated for each site and successfully saved:

Twitter:  35 characters
Stack Overflow:  35 characters
GPUG.com:  38 characters
Atlassian/Bitbucket:  36 characters

Wow, moving right along!  It looks like a 25 character minimum passphrase policy might be possible!

(play record scratch here)

Then I log in to my online banking web site.  Major bank.  Big bank.  Huge bank.  Not a relatively tiny web site like GPUG.

And what do I see?


20 character max!  What??

Strike one!

Hmmm, let's check another bank web site.  I log in to a smaller bank that I use and try to change the password... and...

...it doesn't allow me to paste in the password from my Passphrase Generator!

That is garbage!

Troy Hunt lays out this entire stupid fake "security" policy of not allowing password pasting in his excellent blog post here.

And I see that he shows examples of GE Capital and PayPal and others.

So that pretty much kills the idea of consistently using 25+ character passwords.

Could I use really long passwords on sites that allow them, and that allow pasting?  Sure.  And I may start doing that.

But clearly there are many sites, particularly the large ones, that have indefensible password length limitations and block the paste function.  So for those, you're limited to their arbitrarily short password lengths.

So I guess that answers my question.

With that said, will I use 30+ character passwords?  Not sure.

Occasionally I have to manually enter the password on a mobile device, and it is a nightmare to try and type that many characters in a password field.  I can barely compose a simple text message on my phone without making a typo, so my password typing success rate is not stellar.

But I may give it a try.  As I reset my passwords going forward, I'll try and use a 25+ character passphrase and see how it goes.

Hopefully some day my bank will allow more than 20 characters.






Tuesday, March 13, 2018

Beware of MIN, MAX, and TOP in Dynamics GP SQL queries!

By Steve Endow

A few weeks ago I started some research to compare the performance of MAX vs. TOP(1) in SQL Server queries.

After finding some unexpected results, I created a second video showing some odd behavior of MAX and TOP on one particular Dynamics GP table.  At that time, I couldn't figure out what was causing the performance issue with the MAX function.

Well, thanks to some very generous help and amazing insight from Kendra Little, I finally have a definitive explanation for the performance issue.


Wednesday, March 7, 2018

SQL MAX vs TOP 1: Part 2! The Revenge of IV30500!

By Steve Endow

I just can't let it go.  I need to know.  I need answers.  I need to solve the mystery.  The riddle.  The enigma.

Why does the MAX function sometimes perform very poorly compared to TOP 1?

I really thought that "MAX vs TOP 1" was a simple question. An easy question.  A spend-10-seconds-on-Stack-Overflow-and-get-the-answer type of question.

But I just couldn't leave it alone and had to go and test it for myself.  And open a veritable Pandora's Box of SQL riddles.

In Part 1 of this series, I delved into how MAX and TOP 1 behave in several random queries, and I ended on a query that showed MAX performing quite poorly compared to TOP 1.

After that video, I stumbled across an even simpler query that produced an even more dramatic performance difference, where MAX performed miserably.  But I couldn't figure out why.

In this video, I discuss what I learned about the query and the specific Index Scan that is causing the MAX query to perform so poorly.


Here's the recap:

When querying some fields, such as the IV30500 POSTEDDT, with no WHERE clause, both MAX and TOP 1 perform virtually the same, with both using a very efficient Index Scan.  50% vs 50% relative costs.
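For reference, the two query shapes being compared look roughly like this (the TRXSOURCE versions below are identical except for the column name):

-- MAX vs TOP 1 on IV30500.POSTEDDT, no WHERE clause
SELECT MAX(POSTEDDT) FROM IV30500;

SELECT TOP (1) POSTEDDT FROM IV30500 ORDER BY POSTEDDT DESC;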



But when I query the TRXSOURCE field, with no WHERE clause, MAX shows 100% relative cost, whereas TOP 1 shows 0% relative cost.



What???

The POSTEDDT query uses an Index Scan.  The TRXSOURCE query uses an Index Scan.  But for some reason with the TRXSOURCE query, MAX is much more costly.

I stared at this result for a few hours trying to figure out why.  I eventually found this tiny little detail.


Notice that the Actual Number of Rows for the MAX Index Scan is 147,316.  That's every row in the table.

By contrast, the TOP 1 Index Scan has Actual Number of Rows = 1.

What is going on?

For some reason, the MAX is having to scan the ENTIRE AK1IV30500 index.  It isn't getting much benefit from the index.

But why?

Unfortunately, I don't yet know.

"Maybe SQL is caching the query execution plan?"

Apparently not.  I tried DBCC FREEPROCCACHE, and saw no change.


"Maybe your statistics are stale and need to be updated?"

Nope.  I tried UPDATE STATISTICS WITH FULLSCAN.  No change.


"C'mon Steve, clearly you need to re-index!"

Did that.  I tried DBREINDEX on the specific AK1IV30500 index, as well as the entire IV30500 table.  No change in the execution plan.
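For reference, those attempts boil down to roughly these commands (a sketch--the exact options I used may have differed slightly):

-- Clear cached execution plans
DBCC FREEPROCCACHE;

-- Rebuild statistics on the table with a full scan
UPDATE STATISTICS dbo.IV30500 WITH FULLSCAN;

-- Rebuild the specific index, then all indexes on the table
DBCC DBREINDEX ('dbo.IV30500', 'AK1IV30500');
DBCC DBREINDEX ('dbo.IV30500', '');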


I didn't find any standard maintenance task that changed the behavior of the MAX Index Scan.

As a last resort, I used the Import/Export wizard to export all of the data from the IV30500 table into a new table that I called IV30500REBUILD.  I then ran scripts to create all of the same indexes on the REBUILD table, making it identical to the original IV30500 table.
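For anyone who wants to reproduce the rebuild without the wizard, the rough T-SQL equivalent is a plain copy into a new table, with the indexes scripted and re-created afterward (this is a sketch, not the exact steps I took):

-- Rough equivalent of the wizard export: copy every row into a new table,
-- then script and re-create the original table's indexes (including AK1IV30500)
-- on the new table so the two tables match.
SELECT *
INTO dbo.IV30500REBUILD
FROM dbo.IV30500;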

I then ran the MAX and TOP 1 queries on the new REBUILD table.

And like magic, the MAX Index Scan returned just one row.


Same table structure.

Same data.

Same indexes.

But the Index Scan on the new REBUILD table behaves properly.

So there is apparently something about my IV30500 table that is causing this problem, and rebuilding the entire table resolves it.  But rebuilding a table isn't exactly a typical SQL maintenance task, so it's not really a solution.

But this is way past my SQL skill level, so I don't yet know what conventional maintenance task might be able to achieve the same results.

I've asked for help from a true SQL expert, so I'm hoping that she will assist and help me figure out this mystery.









Saturday, March 3, 2018

"Can you add just one more little feature?": A Story About Software and Home Improvement

By Steve Endow


Wife: "Steve, can you install an exhaust fan in the small bathroom?"
Steve:  "Sure, hunny, no problem. I just ordered the fan and I'll call Sam to install it."

Customer:  "Can you add this simple little feature to our application?"
Developer:  "Sure, no problem.  I'll get right on that."

No big deal.


Sam: "Steve, I cut a hole in the bathroom ceiling for the fan, but something is strange. There's an extra layer of drywall on the ceiling, and it's not attached properly and it's sagging."
Steve: "Hmmm, that doesn't look right. Let's remove the extra drywall and see what the prior homeowner was covering up."



Customer: "So how's that new simple little feature coming along?"
Developer: "Well, I looked through the code, and the original developers didn't design the software to handle this feature, so it's going to require some redesign of the customization."

I think the project scope just changed.

Friday, March 2, 2018

Sample Dynamics GP eConnect XML for RM Apply (RMApplyType / taRMApply)

By Steve Endow

A customer asked for sample XML for the RMApplyType / taRMApply eConnect transaction type.  I couldn't find one handy during a search, so I had to cobble together some .NET code and generate the XML.

I'm wondering if there is an easier way to generate the sample eConnect XML.  In theory, eConnect Requester with the eConnect Outgoing Service can send certain XML documents to MSMQ, but that is a hassle to set up properly, and I don't know that all transaction types--such as RMApplyType--are supported by eConnect Requester.


So, here is a sample Dynamics GP eConnect XML document for RM Apply (RMApplyType / taRMApply)


<?xml version="1.0"?>
<eConnect xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <RMApplyType>
    <eConnectProcessInfo xsi:nil="true" />
    <taRequesterTrxDisabler_Items xsi:nil="true" />
    <taAnalyticsDistribution_Items xsi:nil="true" />
    <taRMApply>
      <APTODCNM>SALES100001</APTODCNM>    <!-- Apply To document number (the invoice) -->
      <APFRDCNM>PYMNT100001</APFRDCNM>    <!-- Apply From document number (the payment) -->
      <APPTOAMT>123.45</APPTOAMT>         <!-- Amount to apply -->
      <APFRDCTY>9</APFRDCTY>              <!-- Apply From document type (9 = payment) -->
      <APTODCTY>1</APTODCTY>              <!-- Apply To document type (1 = sales/invoice) -->
      <DISTKNAM>0</DISTKNAM>              <!-- Discount taken amount (optional) -->
      <APPLYDATE>2017-04-12</APPLYDATE>   <!-- Apply date (optional) -->
      <GLPOSTDT>2017-04-12</GLPOSTDT>     <!-- GL posting date (optional) -->
    </taRMApply>
  </RMApplyType>
</eConnect>


Discount Taken, Apply Date, and GL Post Date are optional, and do not have to be assigned or included in the XML if they are not needed.

Here is the eConnect documentation on taRMApply.