Thursday, May 11, 2017

Resolving the Veeam "Backup files are unavailable" message when restoring a VM

By Steve Endow

I'm a huge fan of the Veeam Backup and Replication product.  I've used it for several years now to back up my Hyper-V virtual machines to a Synology NAS, and it has been a huge improvement over the low-tech, script-based VM backups I was suffering with previously.

One quirk I have noticed with Veeam is that it seems to be very sensitive to any loss of connectivity with the backup infrastructure.  With a prior version, if Veeam was running but my file server was shut down, I would get error notifications indicating that it couldn't access the file server--even though backups were not scheduled to run.  I haven't noticed those messages lately, so I'm not sure if I just turned them off, or if I haven't been paying attention to them.

Since I don't need my servers running 24x7, I have them scheduled to shut down in the evening and automatically turn on in the morning.  But sometimes if I wrap up my day early, I may shut down all of my servers and my desktop at, say, 8pm.  If I shut down my Synology NAS first and Veeam detects that the file server is not accessible, it may log a warning or error.

Normally this isn't a big deal, but I found one situation where it causes a subsequent error.  I recently tried to restore a VM, and after I selected the VM to restore and chose a restore point, I received this error message.

Veeam Error:  Backup files are unavailable


When I first saw this message I was concerned there was a problem, but it didn't make sense: Veeam was obviously able to see the backup files, and it even let me choose which restore point I wanted.  So I knew that the backup files were available and accessible.

I could access the network share on my NAS file server and browse the files without issue.  And I was able to click OK on the error message, complete the restore wizard, and successfully restore my VM.  So clearly the backup files were accessible and there wasn't really an issue.

So why was this error occurring?

I submitted a support case to Veeam and spoke with a support engineer who showed me how to resolve this error.  It seems that whenever Veeam is unable to access the file share used in the Backup Infrastructure settings, it sets an error state indicating that the backup location is not available.  After this happens, you have to manually tell Veeam to re-scan the backup infrastructure in order to clear the error.  Fortunately, the fix is quick and easy.

In Veeam, click the Backup Infrastructure button in the bottom left, then click the Backup Repositories page.  Right-click the Backup Repository that is giving the error and select Rescan.


The Rescan will take several seconds to run, and when it is done, the "Backup files are unavailable" message will no longer appear when you perform a restore.  Or at least that worked for me.

Overall, I'm incredibly pleased with Veeam Backup and Replication and would highly recommend it if it's within your budget.










Wednesday, May 10, 2017

Extract and Save Dynamics GP Document Attach Files Using .NET

By Steve Endow

In a prior post, I showed how to use BCP to extract and save Dynamics GP Document Attach files.  I decided to explore this further and use .NET to extract the GP Doc Attach files.

It is very easy to export the file attachments from SQL using .NET, but because you need some data access code to read the data from the VarBinary field and then write it to a file using a stream, it takes quite a few more lines than the very simple single-line BCP statement.

This MSDN forum post has the basic code sample, which is 9 lines.  So in theory you could have something that simple.  But in reality any .NET application of value is going to have a fair amount of additional code for a data access layer and other plumbing, allowing for parameters, processing multiple records, error handling, etc.

Very impressive SQL script for running multi-company Dynamics GP queries

By Steve Endow

I've previously posted some queries for retrieving values from multiple company databases.  I tried a few query designs, but ended up settling on cursors to loop through the different company databases and assemble the results.  Here are two examples:

https://dynamicsgpland.blogspot.com/2016/03/query-functional-currency-for-all.html

https://dynamicsgpland.blogspot.com/2015/10/finding-batches-with-errors-across-all.html


Given how infrequently I used the queries, I didn't worry about the cursor or spend much additional time trying to find an alternative.

While these ad hoc, one-time queries are probably just fine using cursors, many SQL folks (myself included) are not fans of cursors, and it's often a fun challenge to figure out a way to replace an iterating cursor with a proper set-based query.

Well, Ian Grieve has come up with a very clever technique for assembling a query that can run against multiple Dynamics GP company databases.  Rather than executing a query multiple times and storing the results each time, he's figured out an intriguing method for dynamically assembling the queries for multiple companies, and then running that entire dynamic query string using sp_executesql.

Here is his blog post:


http://www.azurecurve.co.uk/2017/05/sql-script-to-return-functional-currencies-for-all-companies-without-a-cursor/


While I understand the general technique, I am not familiar with some of the commands and syntax--particularly the STUFF function and the FOR XML PATH clause--so it is going to take me a while to deconstruct and fully understand the entire query.
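For anyone else unfamiliar with the pattern, here is a minimal sketch of the general idea, not Ian's exact script: FOR XML PATH('') flattens one generated SELECT per company into a single string, STUFF trims the leading UNION ALL, and sp_executesql runs the result.  The company list comes from the SY01500 company master table in the DYNAMICS database, and MC40000 holds each company's functional currency:

DECLARE @sql nvarchar(max);

SELECT @sql = STUFF(
    (SELECT ' UNION ALL SELECT ''' + RTRIM(INTERID) + ''' AS CompanyDB, FUNLCURR'
        + ' FROM ' + RTRIM(INTERID) + '..MC40000'
     FROM DYNAMICS..SY01500
     FOR XML PATH('')),  --flatten the generated rows into one string
    1, 11, '');  --STUFF removes the leading ' UNION ALL ' (11 characters)

EXEC sp_executesql @sql;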

But hats off to Ian for the creativity!  I think he's come up with a new standard template for multi-company Dynamics GP queries!






Tuesday, May 9, 2017

Extract Dynamics GP Document Attach Files From SQL Server and Save To Disk

By Steve Endow

Today I was asked if there was a way to read an image file from the Dynamics GP Document Attach table so that the image can be added to a report.  For instance, suppose a customer wants to display item images on an invoice.

I have done very little work with the blob / varbinary field type in SQL Server, so I didn't know how difficult it would be to do this.  I quickly did some research and testing, and while I don't have a complete solution for inserting images onto a report, I did test one method for extracting files from the Dynamics GP Document Attach table and saving them to disk.

From what I could tell, there are at least three standard SQL Server tools/commands that you can use to extract an image or file from a varbinary field.

1. OLE Automation Query – This looks pretty sketchy and appears to be very poorly documented.  I tried some samples and couldn’t get it to work.

2. CLR (.NET) inside of SQL – Appears to be a viable option, but it requires enabling CLR on the SQL Server, which I personally would try to avoid on a GP SQL Server if possible, so I didn't try this one yet.

3. BCP – I was able to get this to work and it was surprisingly easy, but I don’t know how easy it will be to integrate into a report process.


Since I was able to quickly get the BCP option to work, here are the commands I used.  If you are able to shell out to run BCP as part of a report or other process, this should work.  


For my testing, I attached a text file and a JPG image to a customer record, and was able to use these BCP commands to successfully extract both files and save them to disk.  

When I ran the BCP command, it asked me four questions, and then prompted me to save those settings to a format file.  I accepted all of the default options except for the prefix length—that needs to be set to 0—and then saved the file as the default bcp.fmt.
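For reference, the format file that BCP generates should look roughly like this (the version number on the first line and the column spacing vary by SQL Server version, so treat this as illustrative--the file BCP saves for you is the authoritative one).  The third column is the prefix length, which is the value that must be 0:

12.0
1
1       SQLBINARY       0       0       ""      1       BinaryBlob      ""

With a prefix length of 0 and no terminator, BCP writes the raw binary contents of the column straight to the output file, which is exactly what a file attachment needs.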



BCP "SELECT BinaryBlob FROM TWO..coAttachmentItems WHERE Attachment_ID = '88d9e324-4d52-41fe-a3ff-6b3753aee6b4'" queryout "C:\Temp\DocAttach\TestAttach.txt" -T -f bcp.fmt

BCP "SELECT BinaryBlob FROM TWO..coAttachmentItems WHERE Attachment_ID = '43e1099f-4e2b-46c7-9f1c-a155ece489fa'" queryout "C:\Temp\DocAttach\TestImage.jpg" -T -f bcp.fmt


I found that BCP was able to extract the files from the Document Attach table and save them, and I was able to then immediately open and view the files.  Dynamics GP does not appear to be compressing or otherwise modifying the files before saving them to the varbinary field, so no additional decoding is required.

Given how simple BCP is and how well it seems to work, I would probably opt for this approach over using .NET, either inside SQL or outside of SQL.  But if you are already developing a .NET app, then it's probably better to use .NET code to perform this extraction.
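If you do want to launch BCP from inside SQL Server as part of a report or other process, one option is xp_cmdshell.  This is just a sketch: it assumes xp_cmdshell has been enabled (it is disabled by default, and many sites intentionally leave it that way for security reasons), and that the format file was saved to C:\Temp\DocAttach:

DECLARE @cmd varchar(1000);
SET @cmd = 'BCP "SELECT BinaryBlob FROM TWO..coAttachmentItems WHERE Attachment_ID = ''88d9e324-4d52-41fe-a3ff-6b3753aee6b4''" queryout "C:\Temp\DocAttach\TestAttach.txt" -T -f "C:\Temp\DocAttach\bcp.fmt"';
EXEC master..xp_cmdshell @cmd;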


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+



Friday, May 5, 2017

Updating .NET apps to support Authorize.net TLS 1.2 for Credit Card and ACH integrations

By Steve Endow

Authorize.net is disabling support for TLS 1.0 and 1.1 in production as of September 18, 2017.  As of that date, they will only support TLS 1.2.  You can read more here:

http://app.payment.authorize.net/e/es.aspx?s=986383348&e=1411090


I have a customer using a .NET integration with Authorize.net, so I reviewed my code and confirmed that I was not explicitly setting the TLS protocol version.  I then researched which versions of TLS are supported by each .NET version, and how the protocol version is chosen.

After reading a few posts on StackOverflow, I confirmed that .NET 4.5 does support TLS 1.2.  So that was good news.  After reading a few more posts, my understanding was that .NET auto-negotiates the protocol with the server, so if Authorize.net requires TLS 1.2, I thought that my .NET app should work fine.

So I tested against the Authorize.net developer sandbox, which has already been set to require TLS 1.2, and to my surprise, I received a connection error.  I researched the error and confirmed that it was due to the TLS 1.2 requirement.  But if .NET 4.5 supports TLS 1.2 and is able to negotiate the protocol version, why would my connection fail?

My only guess is that, in addition to requiring TLS 1.2, Authorize.net has configured its systems to detect protocol negotiation requests and deny those connections.  My assumption is that this is a measure to prevent surreptitious protocol downgrades, such as the POODLE vulnerability, which would make sense.

So I updated my application to include the following line of code:

System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

(You will also need a "using System.Net;" directive in your class.)

This explicitly tells System.Net to only use TLS 1.2.  After I added this line, my connections to the Authorize.net developer sandbox started working again.

Given this finding, I will need to prepare a new release and will have to work with the customer to deploy the new application before September 2017.

And one other small downside to this approach is that my application is now hard-coded to TLS 1.2.  But in practical terms, I am not concerned about this, as TLS 1.2 is the latest version available in .NET 4.5.  If a new version of TLS is released, Authorize.net will probably require that specific protocol, and I'll need to update the .NET version of my application anyway, so I'll have to prepare a new release regardless.

UPDATE: Coincidentally, one hour after posting this article, I saw a discussion about TLS 1.3, which is apparently in the works:

https://github.com/tlswg/tls13-spec/issues/1001


So I learned a few things from this process, and fortunately the fix turned out to be very easy.











Wednesday, April 26, 2017

Automatically generate and create batch IDs in Dynamics GP using VBA

By Steve Endow

This has probably been done a few thousand times by a few hundred people over the years, but I don't recall ever doing it before.

A customer wants to automatically generate batch IDs in Dynamics GP.  They don't want to have to type the batch ID or create batches when entering transactions.  They want to be able to open the Payables Transaction Entry window, have the Batch ID automatically generated, and just start entering a transaction.

Here is a link to a Word document with the instructions and the script.

           https://1drv.ms/w/s!Au567Fd0af9Tmh3tbpX2qrAXYfuY


Once the code is installed, if you open the Payables Transaction Entry window, a new batch is automatically created and the batch ID value is populated in the Batch ID field.


The customer didn't want to use the date, but wanted a sequential numeric batch ID that would be unique.  I used the username, plus the two digit year, plus the day of the year.  This ensures a unique batch for each day, and it's a way of using the date in the batch ID without using the full date.


Make sure to follow the instructions in the Word doc and add a VBA reference to the Microsoft ActiveX Data Objects 2.8 Library in the VBA Editor.


Option Explicit

Dim conn As ADODB.Connection
Dim cmd As New ADODB.Command

Private Sub Window_BeforeOpen(OpenVisible As Boolean)
  Set conn = UserInfoGet.CreateADOConnection
  conn.DefaultDatabase = UserInfoGet.IntercompanyID
  cmd.ActiveConnection = conn
  cmd.CommandType = 1  'adCmdText
End Sub

Private Sub Window_AfterOpen()

    Dim interID As String
    Dim userID As String
    Dim shortYear As String
    Dim dayNumber As String
    
    Dim currentDate As Date
    Dim sqlDate As String
    Dim batch As String
    
    Dim sql As String
    
    interID = UserInfoGet.IntercompanyID  'Get the database name (INTERID)
    userID = UCase(UserInfoGet.userID)  'Get the GP user ID
    
    currentDate = DateTime.Now
    sqlDate = Format(currentDate, "yyyy-MM-dd")

    shortYear = Right(DateTime.year(currentDate), 2) '2 digit year
    dayNumber = DatePart("y", currentDate) 'Get day number (out of 365 / 366)
    batch = userID & "-" & shortYear & dayNumber  'Ex:  JDOE-17134 (user ID, 2-digit year, day of year)
    
    'Create the batch if it does not already exist
    sql = "EXEC " & interID & "..taCreateUpdateBatchHeaderRcd '" & batch & "', '', 4, '" & sqlDate & "', 'PM_Trxent', 0, 1, 0, '', 0, 0, 1, 0, 0, 0, 0, '" & userID & "', 1, 0, 0, 0, '1900-01-01', '', '', '', '', '', '', '', 0, ''"
    cmd.CommandText = sql
    cmd.Execute
    
    'MsgBox (sql)
    
    'Assign the new batch ID to the batch field on the window
    batchID.Value = batch
    
End Sub











Enhancing the reliability of complex Dynamics GP integrations

By Steve Endow

I had a call today with a customer who is implementing several complex Dynamics GP integrations.  One of the integrations involves the following steps:

1. A SmartConnect map imports a large AP Invoice batch once per day
2. Once the batch is fully imported, SmartConnect inserts a database record telling Post Master Enterprise that the batch is ready to post
3. Post Master Enterprise posts the AP Invoice batch
4. Once the batch is finished posting, Post Master Enterprise automatically fires a stored procedure where the customer will have custom SQL script
5. The stored procedure will verify the batch posted successfully and then call a SQL command to trigger a second SmartConnect map
6. SmartConnect will then import AP Manual Payments and apply the payments to the posted AP Invoice batch
7. Once SmartConnect is done importing the Payments and Applications, it will then insert another database record telling Post Master to post the AP Manual Payment batch

In theory, this process will work just fine, as both SmartConnect and Post Master can be configured to work together to coordinate in this manner.

But...

In any system integration with many steps or processes, there is a chance that one of the steps may not work properly, may encounter an error, may fail, may not fire, etc.

And in any system integration that is synchronous and tightly coupled like this one, any such problem in the chain of events can prevent the rest of the process from completing.  What if in Step 2, the SmartConnect map has a problem telling Post Master that the batch is ready to post?  What if in Step 3, Post Master finds that the Fiscal Period in Dynamics GP is closed, and it is unable to post the batch?  What if the batch fails to post due to a data error?  What if 682 out of 700 Invoices post successfully, but 18 fail to post due to invalid distributions? What if the custom SQL script in Step 4 encounters an error or table lock that prevents it from completing successfully?  What if SmartConnect encounters an error applying the Manual Payments to the posted Invoices?

In isolation, each of these issues is relatively small, but collectively, there are probably at least a dozen such minor problems that could potentially prevent the entire sequence from completing successfully.  Between the seven-plus steps in this system integration and the dozen-plus potential errors that can cause a failure, I would anticipate that this integration will have some reliability issues over time.

But that's okay.  That's what happens with complex systems--you often have equally complex failures and reliability challenges.

So how do you deal with this?  How do you manage such a complex integration?  And how do you increase reliability?

While speaking with the customer, a few things came to mind.

1. Error notifications.  Both SmartConnect and Post Master Enterprise can be configured to send notifications in case of an error.  As soon as a problem occurs, an email should be sent to a power user or administrator who has the tools to resolve the problem.

2. Proactive monitoring.  Sometimes problems occur, but notifications don't get sent or received, or are simply missed in the torrent of email that we all receive daily.  To supplement the error notifications, a monitoring job can be created that independently checks the status of the process.  Did the Invoices get imported by 1pm?  Did the Invoices get posted by 2pm?  Did the Manual Payments get imported and applied by 3pm?  Did the Manual Payments get posted by 4pm?  Each of these checks is just a simple SQL query against the Dynamics GP company database (see the sketch after this list), and they serve as a second layer of notification in case there is a problem or delay.

3. Asynchronous design.  In my experience, synchronous, tightly coupled system integrations tend to be less reliable than asynchronous, loosely coupled integrations.  So an integration could potentially be modified or redesigned to decouple one or more tasks in the chain, adopting a queue-based design rather than a real-time one.  In this particular integration, that would likely be challenging and would require a redesign.

4. Integration "Supervisor".  Rather than SmartConnect and Post Master working independently and simply handing off messages to each other, a "Supervisor" process or solution could be utilized that manages, or supervises, the entire process.  The Supervisor asks SmartConnect to run a map, monitoring that task until the import is complete.  It then hands a request to Post Master, monitoring that task until the posting is complete.  Rather than having to monitor two different applications and get notifications from each, this central supervisor service could potentially reduce the administration and monitoring of the overall process.
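As an example of the checks described in item 2, here is a minimal sketch of the 2pm posting check (the TWO sample database stands in for the production company database, and the APIMPORT batch naming convention is an assumption--adjust both to match the actual integration).  An unposted batch still has a header row in SY00500, so finding the batch after the deadline means posting is late.  A check like this could run from a SQL Agent job that emails an operator:

IF DATEPART(hour, GETDATE()) >= 14
   AND EXISTS (SELECT 1 FROM TWO..SY00500
               WHERE BCHSOURC = 'PM_Trxent'  --AP Invoice batch source
               AND BACHNUMB LIKE 'APIMPORT%')  --assumed batch naming convention
BEGIN
    RAISERROR('AP Invoice batch has not posted by 2pm.', 16, 1);
END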

While these steps may help manage the process, they won't directly resolve the issues that cause failures or reliability issues in each step.  To increase reliability, I think you have to target the specific failures and eliminate the causes.  Increase data validation, add error handling, design each step to be able to proceed if some errors occur, and also utilize diagnostics to quickly track down the causes of errors.

There isn't a single answer for managing complex system integrations or improving reliability, but I think it is always a good idea to approach such complex integrations with an understanding of the potential errors and failures and an expectation that some additional measures may be required to increase reliability.

What have you done to make your complex system integrations easier to manage?  Have you used any techniques to increase overall reliability?

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+






Saturday, April 22, 2017

Importing Paid AP Invoices into Dynamics GP: Payment Numbers and Distributions

By Steve Endow

I recently developed a Dynamics GP import that created AP Invoices for employee expenses.  One thing that was different with this integration was that some of the employee expenses had already been reimbursed or paid, either with cash or by credit card.

It seemed pretty straightforward at first, but while designing the integration, I learned two interesting things.  When importing Paid AP Invoices into Dynamics GP using eConnect, you'll want to pay attention to the Payment Numbers and Distributions, as they will vary depending on whether the invoice was paid with Cash, Check, or Credit Card.

When importing an AP Invoice paid via Credit Card, on the Payables Credit Card Entry window, the Payment Number field should be populated with the next AP Voucher Number, not the next AP Payment Number.  And the Credit distribution for an invoice paid with a Credit Card should be Type of Cash, with the GL account being the AP account from the Credit Card vendor (my customer uses a different AP account for their Credit Card vendor).

Here's a video sharing what I learned about the proper values for Payment Numbers and Distributions when importing the paid invoices using eConnect:

https://www.youtube.com/watch?v=_DTK_jdzT1Y



Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+

http://www.precipioservices.com



Bug in GP ribbon buttons when AA is installed

By Steve Endow

Today I was doing some testing in GP 2016 R2, where I have Analytical Accounting installed, reviewing some imported AP Invoices.  I opened the invoice, clicked on Distributions, and then moved my mouse up to the ribbon to click on the Default button.

But...the Default button wasn't in the ribbon.


I only had View, Additional, File, and Tools buttons in the ribbon.  I was puzzled for a few seconds until I noticed the Delete and Default buttons were at the bottom of the window.

Monday, April 3, 2017

Dynamics 365 Financials Training - Day 4: Sales, Bank Rec, and GL

By Steve Endow

Today I attended day 4 of the Dynamics 365 Financials training class organized by Liberty Grove.

http://libertygrove.com/services/training/dynamics-365-for-financials-training-ndpm/


Here are links to my summaries of the prior days of the class:

Day 1 - Navigation, Company Setup, and Posting Groups

Day 2 - Dimensions, more Setup, and Financial Reporting

Day 3 - Payables and Inventory


Day 4 covered a lot of content and I found several items that I think are important to note, so this will be a long one.

And on a side note, I think I'm going to organize a contest to come up with some good abbreviations for the D365 products, since constantly typing out Dynamics 365 Financials gets old.  And I'm obviously not the only one with this issue:


I may use D365Fin or D3Fin or maybe even D3F eventually, but for now I've set up a keyboard shortcut in my FastKeys app that will automatically replace "d3" with "D365 Financials" as I type.

And with that, let's get started.


Friday, March 31, 2017

Back to Basics: Purchase Order Posting Flow

Sometimes it helps to just get back to the basics of how things flow in GP.  And a client question this week gave me an opportunity to do just that.  To understand how purchase order processing works in Dynamics GP, a basic posting flow is helpful to illustrate where accounts come from.


It lays out the source of transactions and the corresponding default account locations. Hope you all find this useful!

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Updating Email File Formats in SQL

The best laid plans...  Word templates are not my favorite thing.  There, I said it.  It is not because I don't think they are pretty and functional, or that they don't allow for more flexibility than report writer reports.  It's just that they can be time consuming in a way that clients don't expect.  Modifications that might be worth 1-2 hours of effort end up taking double or triple that due to the quirks of Word templates.


I have found that this disconnect happens most frequently with clients who already have modified report writer reports.  The process to recreate a word template that looks like the report writer report can be challenging and time consuming.  So in those cases, I try to use HTML for email delivery.  That way clients can keep using the report writer report, and the cost to implement email delivery can be quite low.


Sometimes, though, you get stuck.  HTML doesn't work well with complicated layouts, and it can be unpredictable in terms of how it displays for different recipients.  So then we have to use the Word templates and send in PDF.  This is fine, except if you have already set up customers for emailing.  So now you have to change the format in Cards > Sales > Customer > Email Settings.

For that purpose, you can use the script below.  Keep in mind that EmailDocumentID (in this case set to 3, Sales Invoice) and EMAILSeriesID (in this case set to 3, Sales) will vary for different modules and documents.  EmailDocumentFormat is the field you want to update, and its values match the dropdown on the window (1 = docx, 2 = html, 3 = pdf, 4 = xps).

--Update sales invoice email settings for all customers to use PDF
update SY04905 set EmailDocumentFormat=3 where EMAILSeriesID=3 and EmailDocumentID=3 and MODULE1=11 and EmailDocumentEnabled=1
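
Before running the update, it can be worth previewing exactly which rows will change, using the same filters:

--Preview the rows the update will change
select * from SY04905 where EMAILSeriesID=3 and EmailDocumentID=3 and MODULE1=11 and EmailDocumentEnabled=1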

As always, test first and always back up your data!

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Wednesday, March 29, 2017

Dynamics 365 Financials vs. Dynamics GP: Payment Terms

By Steve Endow

In my post yesterday summarizing my Day 2 training, I said that the D365 Financials Payment Terms window looked disappointingly simple, as I expected it to be more configurable.

Well, it turns out that it is quite a bit more configurable than I thought based on my first glance, but you wouldn't realize it unless you knew about the Secret Codes.


What are the Secret Codes, and why exactly are they secret?  Good questions.

First, they are secret because they are apparently not documented yet.  The D365 Financials help web pages do not currently have any reference to Payment Terms.  And they are Secret Codes because unless you know what they are and how to use them, you would never know that they exist, or that you could wield their power in the Payment Terms window.

So...

Dynamics 365 Financials Training - Day 3 - Payables and Inventory

By Steve Endow

Today I attended day 3 of the Dynamics 365 Financials training class organized by Liberty Grove.

http://libertygrove.com/services/training/dynamics-365-for-financials-training-ndpm/


While I'm starting to get a little more familiar with the UI and navigation, there are a lot of nooks and crannies in Dynamics 365 Financials, and very different names and concepts, so the training still seems very fast-paced as I try to digest all of the new material.

Here are some observations from Day 3.

1. UI updates.

Menus are dead. Other than the few 'area page' links/buttons you can access from your Role Center (home page), there aren't really any menus in the application.  You interact with actions on your Role Center, the Ribbon above a window, or use the Global Search feature to locate things.


Tuesday, March 28, 2017

Dynamics 365 Financials Training - Day 2: Dimensions, more Setup, and Financial Reporting

By Steve Endow

Today I attended day 2 of the Dynamics 365 Financials training class organized by Liberty Grove.

http://libertygrove.com/services/training/dynamics-365-for-financials-training-ndpm/


Training day 2 summary: Brain overload

Here are my observations from Day 2:

1.  UI updates.  The user interface is nice, but the web interface does have a few minor quirks.  Sometimes the web site is very responsive, sometimes it pauses or lags when you perform an action, and other times there are multi-second delays.  And the responsiveness varies by user.  My UI was generally very responsive, but the instructor has had several 10-15 second delays on steps where I had no delay.  The inconsistency is strange, but I'm guessing Microsoft is doing tuning on the back end.

A few additional UI observations.

When you have multiple 'windows' open, you can only see the one on top.  You can click the X in the upper right to close the top window, or you can press the Esc key to close it.  Today I found that you can click anywhere on the gray bar on the left to close the top window.  So if you are a keyboard person, you can use Esc. If you are a mouse person, it's quicker and easier to click the gray bar than click on the X.  I am finding the gray bar to be the most convenient, as you can easily click multiple times to close several windows.


Monday, March 27, 2017

Dynamics GP vs. Dynamics 365 Financials: Do users really need access to transaction distributions?

By Steve Endow

UPDATE: As James Crowter notes in the tweet shown below, D365 Financials has been updated to allow entry of GL accounts on transactions, similar to the full Dynamics NAV product.  As a result, some of the discussion in my post below isn't relevant anymore, but the discussion of philosophy around access to GL distributions on transactions is timeless!

https://twitter.com/JamesCrowter/status/864529798642663424





In my last post on Dynamics 365 Financials training, I mentioned that in the first day, one difference I noticed was that D365 Financials does not allow access to transaction distributions during transaction entry, and that users cannot change distribution accounts directly on a transaction.  Dynamics 365 Financials (NAV) uses a different design involving "Posting Groups" that control the GL account assignments for transactions.


As someone who has worked with Dynamics GP for a long time, this seemed like a big difference to me.  In the GP world, Distributions are a pretty big deal.  When certain transactions are entered in GP, you often have to verify the distributions.  And some customers have processes that require users to change the distributions.

For example, some customers may manually code revenue accounts on a sales invoice, while other customers may code expense accounts on a vendor invoice.  In the GP world, Distributions are a thing, and they're an important thing.  When transactions fail to post, we sometimes ask customers to check the posting journal or Edit List to see if the distributions might have an issue.  When importing transactions using eConnect, I have to make sure that the correct distributions are imported.  We're so used to dealing with Distributions that it's second nature for us in the GP world.

Dynamics 365 Financials Training - Day 1: Navigation, Company Setup, and Posting Groups

By Steve Endow

Today I attended the first day of the Dynamics 365 Financials training class organized by Liberty Grove.

http://libertygrove.com/services/training/dynamics-365-for-financials-training-ndpm/


Each day I'll try to share a few observations from the training, primarily from the perspective of Dynamics 365 Financials vs. Dynamics GP.

Overall, the training was excellent.  There was a lot of content covered in four hours, but the trainer, Kerry Rosvold, was able to get through all of the content while answering dozens of questions along the way.



Here are my observations from Day 1:

1. Beautiful user interface.  The Dynamics 365 user interface just looks fantastic.  The look, feel and visual cues are really impressive.  There are definitely elements to the UI design that aren't obvious or intuitive, and have some complexity that you have to figure out (like any complex application), but I think the design is gorgeous, and it's really amazing how much functionality they were able to put into a web client.

Friday, March 24, 2017

Musical Tribute: I Want My MVP

By Steve Endow

My colleague Andrew Dean and I have come up with a few wild ideas, mostly related to Dynamics GP development and customization.  The other day, he came up with a really, really wild idea.

I mentioned that MVP submissions were due at the end of March, so I had to finish up a few things in the next few weeks.

He said, "Why don't you make a video to the song Money for Nothing, but make it 'I Want My MVP'?"

That was one wild idea.  After I got over the surprise, I didn't know where to begin.  How does one make a music video, and how would I create the song?

With a little ingenuity, a friend who has a rock band, and some low-budget guerrilla filmmaking with my daughter, we pulled it together in 3 days.

Enjoy!




Lyrics

I want my MVP

I want my MVP

Now look at them users
That’s the way you do it
Getting help from those MVPs
That ain’t workin’ that’s the way you do it
Knowledge for nothin’ and your help for free

Now that ain’t workin’ that’s the way you do it
Lemme tell ya them users ain’t dumb
Maybe get an answer from the GP forums
Maybe get an answer from a GP blog

We gotta install all the latest patches
ERP software deliveries
We gotta move these big databases
We gotta write these SQL queries

See the busy worker with Excel and tons of numbers
Yeah buddy, no need to despair
That busy worker calls on the MVPs
That busy worker, he don’t need to care

We gotta install all the latest patches
ERP software deliveries
We gotta move these big databases
We gotta write these SQL queries

I shoulda learned to post a couple o’ questions
I shoulda learned to read some blogs
Look at that accountant, she’s got all the answers
Man accounting is so much fun
And he’s up there, what’s that? Reconciliations?
Bangin’ out reports like SQL wizardry
That ain’t workin that’s the way you do it
Get your knowledge for nothin’ get your help for free

We gotta install all the latest patches
ERP software deliveries
We gotta move these big databases
We gotta write these SQL queries

Now that ain’t workin’ that’s the way you do it
You get your help from an MVP
Now that ain’t workin’ that’s the way you do it
Knowledge for nothin’ and your help for free
Knowledge for nothin’ and help for free


Saturday, March 18, 2017

SQL Trivia: Leading spaces on string field

By Steve Endow

I think I've been writing SQL statements since 1996.  Maybe I encountered this scenario many years ago and completely forgot about it.  But I'm pretty sure that in the 13 years I've been working with GP, I haven't encountered it.

During the SmartConnect training class that I attended this week, the trainer, Mark Anderson, pointed out a situation where an extra space could accidentally make its way into a SQL statement in a SmartConnect map.  He explained that the leading space in the WHERE clause value would cause the query to return zero records.

Somehow, I don't think I've run into this problem--I just haven't made that particular typo.

Here's an example of what that might look like.  The first query has no leading space in the WHERE clause, but does have trailing spaces, and it retrieves one customer record.  But the second query adds a few spaces in front of the customer ID value in the WHERE clause, and that query returns zero records.
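Here is a sketch of the two queries against the TWO sample company (assuming the AARONFIT0001 sample customer):

--Trailing spaces are ignored in the comparison, so this returns one record
SELECT CUSTNMBR, CUSTNAME FROM TWO..RM00101 WHERE CUSTNMBR = 'AARONFIT0001   '

--A leading space changes the value being compared, so this returns zero records
SELECT CUSTNMBR, CUSTNAME FROM TWO..RM00101 WHERE CUSTNMBR = '   AARONFIT0001'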


Most Dynamics GP application fields do not seem to allow leading spaces--the only exception I could find is the Note field/window.  Given this, it is unlikely that you would have a situation where Dynamics GP data would have a leading space and cause this scenario.

However, if you have integrations or reporting with custom queries, or queries that are concatenated through script or code, it's possible for a space to creep in.  In SmartConnect, this can occur when you are inserting a pre-defined data field name into a SQL statement, like this:

SELECT * FROM RM00101 WHERE CUSTNMBR = ' _CUSTOMERID'

When you drag the _CUSTOMERID source data field name into your SQL script, it can result in an extra space before the field name, causing the SQL statement to return zero records.

Since I can't remember having ever encountered this scenario, it isn't something I would think to look for.  It makes sense, but it wouldn't have been obvious to me.  I've become almost indifferent to trailing spaces, because they have no consequence with a typical GP SQL install, but leading spaces are definitely a gotcha.

I thought it was interesting, and a neat reminder about the mechanics of SQL Server.






Tuesday, March 7, 2017

SQL Script to verify Dynamics GP GL batch control totals

By Steve Endow

I have a customer who is importing GL batches into Dynamics GP, and they explained that they have a bug in their integration that is causing the batch control totals to be incorrect.  So they may import 50 transactions into GP, but the batch control total will show 85 transactions.  I believe this discrepancy is causing some issues.

So I quickly threw together this SQL query to check the number of GL transactions and JE totals and compare those numbers to the batch control totals in SY00500.

There are probably several different ways to accomplish this, but I happened to pull this version together in a few minutes using a CTE.


WITH glTotals (BACHNUMB, BatchCount, BatchTotal)
AS (
SELECT glh.BACHNUMB, COUNT(glh.JRNENTRY) AS BatchCount, SUM(dtGLTotals.JEAmount) AS BatchTotal FROM GL10000 glh
JOIN (SELECT JRNENTRY, SUM(DEBITAMT + CRDTAMNT) AS JEAmount FROM GL10001 GROUP BY JRNENTRY) AS dtGLTotals ON dtGLTotals.JRNENTRY = glh.JRNENTRY
GROUP BY glh.BACHNUMB
)

SELECT b.BACHNUMB, b.NUMOFTRX, b.BCHTOTAL, glTotals.BatchCount, glTotals.BatchTotal, 
CASE WHEN b.NUMOFTRX <> glTotals.BatchCount THEN 'Count Diff' ELSE '' END AS CountDiff, 
CASE WHEN b.BCHTOTAL <> glTotals.BatchTotal THEN 'Amount Diff' ELSE '' END AS AmountDiff 
FROM TWO..SY00500 b 
JOIN glTotals ON glTotals.BACHNUMB = b.BACHNUMB
WHERE b.BCHSOURC = 'GL_Normal' AND b.BACHNUMB LIKE 'TEST%'


Here's what the results look like:


If there is a discrepancy in the transaction count or the total amount, it will display a note in the CountDiff or the AmountDiff columns.

Next, I will likely need to write a SQL script that will correct the control totals in SY00500.
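
That script would be a variation on the same CTE.  Here is a sketch of what it might look like--untested here, so run the SELECT version above first and back up SY00500 before updating:

WITH glTotals (BACHNUMB, BatchCount, BatchTotal)
AS (
SELECT glh.BACHNUMB, COUNT(glh.JRNENTRY) AS BatchCount, SUM(dtGLTotals.JEAmount) AS BatchTotal FROM GL10000 glh
JOIN (SELECT JRNENTRY, SUM(DEBITAMT + CRDTAMNT) AS JEAmount FROM GL10001 GROUP BY JRNENTRY) AS dtGLTotals ON dtGLTotals.JRNENTRY = glh.JRNENTRY
GROUP BY glh.BACHNUMB
)

UPDATE b SET b.NUMOFTRX = glTotals.BatchCount, b.BCHTOTAL = glTotals.BatchTotal
FROM TWO..SY00500 b
JOIN glTotals ON glTotals.BACHNUMB = b.BACHNUMB
WHERE b.BCHSOURC = 'GL_Normal' AND b.BACHNUMB LIKE 'TEST%'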








Monday, February 20, 2017

Converting a DataTable to objects in C#

By Steve Endow

Let's say you have a C# Dynamics GP integration that retrieves Customer records from a staging table and you need to convert each row to a Customer object so that you can more easily work with the data.

You could loop through the rows in your DataTable, read every field in each row, and assign each field value to the corresponding object property.  But that would be tedious, particularly if your object has dozens of properties.  And it's very low-value coding.

After having done this a few times, I got sick of all of the typing and figured there had to be a better way.  Of course there is.  Fortunately a very smart person posted some code that handles this task very well, using Reflection.

https://codereview.stackexchange.com/questions/30714/converting-datatable-to-list-of-class

It works incredibly well and requires all of 1 or 2 lines of code to use.

I made a few different variations, allowing me to use it with a DataTable, DataRow, and DataRow arrays.


using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Reflection;
using System.Data;

namespace TSM.Integration.Library
{
    static class ObjectMapper
    {
        /// <summary>
        /// Converts a DataTable to a list of generic objects
        /// </summary>
        /// <typeparam name="T">Generic object type</typeparam>
        /// <param name="table">DataTable to convert</param>
        /// <returns>List of generic objects</returns>
        public static List<T> DataTableToList<T>(this DataTable table) where T : class, new()
        {
            //From: https://codereview.stackexchange.com/questions/30714/converting-datatable-to-list-of-class

            try
            {
                List<T> list = new List<T>();

                foreach (var row in table.AsEnumerable())
                {
                    T obj = new T();

                    foreach (var prop in obj.GetType().GetProperties())
                    {
                        try
                        {
                            PropertyInfo propertyInfo = obj.GetType().GetProperty(prop.Name);
                            propertyInfo.SetValue(obj, Convert.ChangeType(row[prop.Name], propertyInfo.PropertyType), null);
                        }
                        catch
                        {
                            continue;
                        }
                    }

                    list.Add(obj);
                }

                return list;
            }
            catch
            {
                return null;
            }
        }


        public static T DataRowToList<T>(this DataRow row) where T : class, new()
        {
            //Variant of DataTableToList code from: https://codereview.stackexchange.com/questions/30714/converting-datatable-to-list-of-class

            try
            {

                T obj = new T();

                foreach (var prop in obj.GetType().GetProperties())
                {
                    try
                    {
                        PropertyInfo propertyInfo = obj.GetType().GetProperty(prop.Name);
                        propertyInfo.SetValue(obj, Convert.ChangeType(row[prop.Name], propertyInfo.PropertyType), null);
                    }
                    catch
                    {
                        continue;
                    }
                }

                return obj;
            }
            catch
            {
                return null;
            }
        }


        public static List<T> DataRowArrayToList<T>(this DataRow[] dataRowArray) where T : class, new()
        {
            //Variant of DataTableToList code from: https://codereview.stackexchange.com/questions/30714/converting-datatable-to-list-of-class

            try
            {
                List<T> list = new List<T>();

                foreach (var row in dataRowArray.AsEnumerable())
                {
                    T obj = new T();

                    foreach (var prop in obj.GetType().GetProperties())
                    {
                        try
                        {
                            PropertyInfo propertyInfo = obj.GetType().GetProperty(prop.Name);
                            propertyInfo.SetValue(obj, Convert.ChangeType(row[prop.Name], propertyInfo.PropertyType), null);
                        }
                        catch
                        {
                            continue;
                        }
                    }

                    list.Add(obj);
                }

                return list;
            }
            catch
            {
                return null;
            }
        }

    } 
}













Thursday, February 16, 2017

Don't Forget- Standard GP SQL Views

From time to time, we still get inquiries from folks building views in SQL Server that actually already exist.  So I thought I would post a quick reminder that every SmartList has a corresponding SQL view that can be used for your own purposes as well (e.g., SQL reports, Excel queries, SmartList Builder, etc.).  And remember, you can link views together, as well as to other tables, when creating reports.  Just don't modify the standard views (if you need to add to them, create a new one with the same design and then modify that).  Here are some of the most common ones available on any GP database--this is NOT all of them, and a query to list the full set follows the list...


  • AATransactions
  • Accounts
  • AccountSummary
  • AccountTransactions
  • BankTransactions
  • AttendanceDetail
  • BillofMaterials
  • CertificateList
  • Customers
  • EmployeeBenefit
  • Employees
  • EmployeeSummary
  • FixedAssets
  • FixedAssetsBooks
  • FixedAssetsPurchase
  • InventoryPurchaseReceipts
  • ItemQuantities
  • PayrollTransactions
  • PayablesTransactions
  • PurchaseLineItems
  • SalesLineItems
  • Vendors
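If you are not sure whether a view already exists, you can list every view in the company database straight from the system catalog, and then select from one directly:

--List all of the views available in the current GP company database
select name from sys.views order by name

--Then query a view directly, e.g. the Customers view
select * from Customers
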
When I teach beginner reporting classes, I advise students to always "look twice" for a standard view before embarking on creating new views or combining open/history/work tables in a SQL statement (as often the views already do this for you).  Good luck and happy reporting!


Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Customer Service and Failure

I hate car problems.  This is a fact of my life.  My dad was a car guy.  My brother is a car guy.  But I cringe every time I have to deal with car issues.  Fortunately, we have a mechanic who we trust and have taken both of our cars to for years.  (See the parallels already with a software partner/consultant?).  So, anyway, driving home last Friday my check engine light came on.  I did my normal scary internet searching for some basic things to try, and we cycled through those over the weekend (again, anyone picking up on the parallel to working with your software solution?).


Finally, on Tuesday, we caved and took it to our mechanic.  Who we like, but always secretly cringe because we don't know enough to know how much it will cost to fix.  Our mechanic fixed the issue (for those that are wondering- engine oil pressure sensor malfunction), although naturally it was a bit more than I wanted it to be (I wanted the under $100 fix of course!).  So I am sure by now, you are wondering why (despite the clever parallels) I am blogging about car problems on a blog devoted to Dynamics GP and software implementation?  Well, it is what came next that I think is a testament to how you think about customer service and approach failures with software and with partners.


On Wednesday morning, I woke up and got myself and the kids ready for the day.  I loaded the car with 20 cases or so of girl scout cookies (our office did a cookie pool to support all of the girl scouts in the office) then we loaded up to head to drop-off and work.  As soon as I started the car, I knew I had a problem- horribly rough idle and then the warning lights started flashing and next thing I know the car won't go faster than 10 mph.  Ugh. Ugh. Ugh.  Transfer cookies and kid and laptop to our other car, and call the mechanic.  When I talked to one of his employees, I was told they would either come out and get it or have it towed.  A couple hours passed, and I had not heard from them so I texted my husband to see if he had.  His text back was a simple "Yes, they came out and its fixed and they are test driving it around the neighborhood."


So there you go.  A mechanic who came to our house (his wife watched the store while he and his employee came out) and fixed something that was not tightened enough after the original repair.  Now, I know that some of you might think "dang right he came to your house to fix his own mistake" but I actually think of it totally differently.


Mistakes are inevitable in our work.  We are human.  Software (and automobiles) is complicated.  We multi-task constantly across different clients, projects, and even software.  Now, do we expect failures to be common?  No (this would be the first time in many years that we have had to call our mechanic back after a repair).  But I would argue that true customer service lies in how we respond to failures.  Do we...
  • Take on a proactive mindset?
  • Bring "solutions" to the table?
  • Skip the defensiveness and blame game?
  • Go the extra mile to resolve the issue?
I would argue that how we respond to failure as partners builds customer loyalty because failure is unavoidable at some point in a business relationship.  We deal with imperfect people, teams/organizations (clients and partners), and software.


In talking with the project managers where I work, we often discuss that projects will have bumps.  Trying to manage to avoid any bumps at all will leave you exhausted, ineffective, and reactionary.  But by understanding that projects will have bumps (miscommunications, missed expectations, etc.), you are not "lowering the bar" (we continue to strive for excellence).  Rather, by expecting the occasional issue, you are adopting a proactive, pragmatic, and risk-averse mindset--looking to manage the bumps, how we respond, and how we engage with the client for ultimate project success.


Look for the customer service in the failures.  That is where you will find it.  And that is where you will build the lasting partnerships (both internal and external) that will allow you and your organization to succeed.
Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.