NetSuite SuiteAnalytics Connect ADO.NET Issues

Update (2020/01/17): WeGotCode is now offering a turnkey solution for getting your data, all of your data, out of NetSuite and into almost any database or data warehouse location you desire.  It’s a SaaS offering and is very affordable. If you’d like to save a lot of time and frustration, please contact us today for more information.


WeGotCode.com recently entered into a project with a partner with the objective of replicating a client’s ERP data into a data warehouse for the purpose of reporting and analytics.  The ERP is one of the largest cloud-based systems, NetSuite (by Oracle).  The target data warehouse system chosen was Snowflake.  This implementation has proved to be very challenging and I’d like to take a moment to mention some of the issues we encountered in hopes of soliciting opinions and other experiences from the community.

The takeaway of this project has been that attempting a complete replication of data in NetSuite is incredibly difficult.  I believe this is deliberate and intentional.  Oracle provides some options for reporting on your NetSuite data, but you will find that if you want a high-performance option, your choices are extremely limited.

Issue #1: REST API data load performance

In the initial implementation, we used Stitch as a data connector to bridge from NetSuite over to Snowflake.  The dev and support teams at Stitch were responsive and very helpful, but in the end we found that the rate at which data was replicating was insufficient for our needs.  It would have taken weeks to pull all of the records over.

The apparent reason for this is that Stitch uses NetSuite’s web (REST) API to get its data.  It works, but it’s slow, and it’s the only mechanism Stitch offers.  I believe it would work fine for a brand-new NetSuite instance with little history, but it was not appropriate for our situation, where we had to catch up on two years of data already stored in the system.

Issue #2: Finding a faster connection to NetSuite

The performance issues led to some conversations with Stitch and other data pipeline providers.  What we found is that most providers use the REST API to obtain their data, so the performance problem is common and pervasive.  Some companies have created drivers that “fake out” the connection to NetSuite and support things like ADO.NET queries, but under the hood they are doing the same thing and suffer from the same problems.

After some research we discovered that the most performant option available is the “SuiteAnalytics Connect” package for NetSuite.  There is an additional subscription cost for this, and only a handful of pipeline providers support connections to it.  Those providers are pricey – typically close to $15k/year or more for hosting and support.

Issue #3: ADO.NET driver for NetSuite

Summary of this issue: don’t try to use the ADO.NET connector for SuiteAnalytics

The client was already paying for SuiteAnalytics for other reporting tasks, so the next step of our research took us to this document.  NetSuite SuiteAnalytics Connect offers drivers for ODBC, JDBC, and ADO.NET.  Since these are all officially documented and “officially supported”, and since WeGotCode.com leverages Microsoft .NET for the majority of its development, the option was clear.  Or so we thought.  We’d create a custom replication adapter using the ADO.NET driver.
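For anyone headed down the same path, the connection code itself is just the standard provider-agnostic ADO.NET pattern.  Everything in angle brackets below is a placeholder, and the provider invariant name is whatever the SuiteAnalytics Connect installer registers on your machine – treat this as a rough sketch of the approach, not the exact adapter we built.

    using System;
    using System.Data.Common;

    class ConnectSketch
    {
        static void Main()
        {
            // The invariant name is whatever the SuiteAnalytics Connect ADO.NET
            // installer registers in machine.config; this one is a placeholder.
            DbProviderFactory factory =
                DbProviderFactories.GetFactory("<SuiteAnalyticsConnect.InvariantName>");

            using (DbConnection conn = factory.CreateConnection())
            {
                // Placeholder connection string; the real keys (host, port,
                // account ID, role ID) come from the driver documentation and
                // your account's SuiteAnalytics Connect settings.
                conn.ConnectionString =
                    "Host=<service-host>;Port=<port>;AccountID=<account>;" +
                    "RoleID=<role>;User=<user>;Password=<password>";
                conn.Open();

                using (DbCommand cmd = conn.CreateCommand())
                {
                    cmd.CommandText = "SELECT TOP 10 * FROM transaction";
                    using (DbDataReader reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            Console.WriteLine(reader[0]);
                        }
                    }
                }
            }
        }
    }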

The initial version of this was a WebJob hosted on Azure that ran on a schedule, “chunked” the data in NetSuite, and replicated it to an Azure SQL Database.  From there Stitch could pick it up and *very quickly* pipe it over to Snowflake.  The problem is that it didn’t work as expected.  Connections to NetSuite were plagued with issues like the following (bubbling up from the Oracle-supported ADO.NET adapter, latest version, downloaded from their support page):

  • OpenAccessException [SuiteAnalyticsConnect 9880] Session protocol state error.
  • [SuiteAnalyticsConnect 9546] Command timed out.
  • System.Net.Sockets.SocketException: “An existing connection was forcibly closed by the remote host”
  • System.IO.IOException: Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.

Replication proved to be impossible because connections were dropping all the time.  Typically they would drop immediately, but sometimes we would wait 20 minutes on a slow data pull only to have a random “timeout” error kill it.
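To give a sense of what kept breaking, here is roughly the shape of the chunked pull, including the retry wrapper we ended up bolting around every chunk once the drops started.  The table name, the chunking key, and the openConnection/writeToStaging delegates are placeholders; this is a sketch of the pattern, not our production code.

    using System;
    using System.Data;
    using System.Data.Common;
    using System.Threading;

    static class ChunkedPull
    {
        // Pulls one key range ("chunk") at a time and retries when the session drops.
        // The integer chunking key is illustrative; the real job keyed off whatever
        // each table exposed.
        public static void Replicate(Func<DbConnection> openConnection,
                                     Action<DataTable> writeToStaging,
                                     string table, string keyColumn,
                                     long startKey, long endKey, long chunkSize)
        {
            for (long low = startKey; low <= endKey; low += chunkSize)
            {
                long high = Math.Min(low + chunkSize - 1, endKey);
                int attempts = 0;

                while (true)
                {
                    try
                    {
                        using (DbConnection conn = openConnection())
                        using (DbCommand cmd = conn.CreateCommand())
                        {
                            cmd.CommandTimeout = 600; // we ended up cranking this way up
                            cmd.CommandText =
                                $"SELECT * FROM {table} WHERE {keyColumn} BETWEEN {low} AND {high}";

                            var chunk = new DataTable();
                            using (DbDataReader reader = cmd.ExecuteReader())
                            {
                                chunk.Load(reader);
                            }
                            writeToStaging(chunk);
                        }
                        break; // this chunk succeeded, move on to the next one
                    }
                    catch (Exception ex) when (++attempts <= 5)
                    {
                        // Sessions dropped constantly, so every chunk gets a few
                        // retries with a short backoff before we surface the error.
                        Console.WriteLine($"Chunk {low}-{high} failed ({ex.Message}), retrying...");
                        Thread.Sleep(TimeSpan.FromSeconds(10 * attempts));
                    }
                }
            }
        }
    }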

We opened a ticket with NetSuite support.  This proved to be an infuriating experience, with no helpful recommendations made.  We extended timeouts to 4 hours (!), limited replication to one table at a time, and put in a five-second delay between queries (really!).  We ran the process during live screen-share sessions until it broke, usually after about 15 minutes.  The issue was always my code, never the adapter, and the recommendation was always to hack up the service to make it do something strange, after which Oracle promised that they would (eventually) open an investigation into the root cause.  As I write this, the ticket is still open with Oracle NetSuite’s support team.  Three weeks later.

Issue #4: You should be using ODBC or JDBC (really)

After several frustrating calls with support, it became readily apparent that the support team didn’t understand the difference between the ODBC driver and the ADO.NET driver.  They kept asking me to run ODBC queries using third-party tools, then assured me that they had no issues on their end because queries like “SELECT TOP 10 * FROM TRANSACTIONS” would return results (after 6 minutes).

It quickly became apparent that there is no real support for ADO.NET with NetSuite.  So the initial WebJob was rewritten as a Windows service using Topshelf and the ODBC driver, and hosted on an Azure VM (because the ODBC drivers had to be installed).  So much for the thin service.
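For the curious, the rewritten service is nothing exotic – a Topshelf host around a timer that does the same chunked pulls over System.Data.Odbc.  The service names, the DSN, the query, and the polling interval below are placeholders; it’s a stripped-down sketch of the structure, not the full replicator.

    using System;
    using System.Data.Odbc;
    using System.Threading;
    using Topshelf;

    public class ReplicationService
    {
        private Timer _timer;

        public void Start()
        {
            // Kick off a replication pass immediately, then every 30 minutes.
            _timer = new Timer(_ => RunOnce(), null, TimeSpan.Zero, TimeSpan.FromMinutes(30));
        }

        public void Stop()
        {
            _timer?.Dispose();
        }

        private void RunOnce()
        {
            // "NetSuite" is whatever system DSN the SuiteAnalytics Connect ODBC
            // installer created on the VM; credentials are placeholders.
            using (var conn = new OdbcConnection("DSN=NetSuite;UID=<user>;PWD=<password>"))
            {
                conn.Open();
                using (var cmd = new OdbcCommand("SELECT TOP 10 * FROM transaction", conn))
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // ...write the row to the Azure SQL staging database...
                    }
                }
            }
        }
    }

    public class Program
    {
        public static void Main()
        {
            HostFactory.Run(cfg =>
            {
                cfg.Service<ReplicationService>(s =>
                {
                    s.ConstructUsing(name => new ReplicationService());
                    s.WhenStarted(svc => svc.Start());
                    s.WhenStopped(svc => svc.Stop());
                });
                cfg.RunAsLocalSystem();
                cfg.SetServiceName("NetSuiteReplicator");
                cfg.SetDisplayName("NetSuite ODBC Replicator");
            });
        }
    }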

The ODBC approach proved to be far more reliable.  We still run into occasional issues with large tables, where sometimes you are lucky to get 5,000 rows of data back in a realistic period, say less than 10 minutes.  However, after some service tweaks we were able to replicate 95% of the client’s data within about 48 hours of the service running.  That’s for almost two years of records.  Meh – slow, but orders of magnitude faster than the other options.

As for the remaining data, well…we’re still talking to Oracle support.  The only suggestion thus far is to keep reducing the amount of data you SELECT until the process no longer times out, and if it does time out, well, run the query again.  Yes, really.  In my opinion that isn’t a satisfactory answer from a “first-tier” service provider.
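For what it’s worth, here is roughly what that advice looks like when you actually write it down: walk a date range in windows and, whenever a pull fails, shrink the window and try again.  The connection string, table, and lastmodifieddate column are placeholders, and the window sizes are arbitrary – a sketch of the suggestion, not an endorsement of it.

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.Odbc;

    static class ShrinkingWindowPull
    {
        // Walks [from, to) in windows, halving the window size whenever a pull fails.
        public static IEnumerable<DataTable> Pull(string connectionString, string table,
                                                  DateTime from, DateTime to)
        {
            TimeSpan window = TimeSpan.FromDays(30);
            DateTime cursor = from;

            while (cursor < to)
            {
                DateTime upper = cursor + window > to ? to : cursor + window;
                DataTable chunk;
                try
                {
                    chunk = RunQuery(connectionString, table, cursor, upper);
                }
                catch (OdbcException)
                {
                    // Timed out or the session dropped: shrink the window and retry.
                    if (window <= TimeSpan.FromHours(1))
                        throw; // give up rather than loop forever on a tiny window
                    window = TimeSpan.FromTicks(window.Ticks / 2);
                    continue;
                }
                yield return chunk;
                cursor = upper;
            }
        }

        private static DataTable RunQuery(string connectionString, string table,
                                          DateTime from, DateTime to)
        {
            using (var conn = new OdbcConnection(connectionString))
            using (var cmd = new OdbcCommand(
                $"SELECT * FROM {table} WHERE lastmodifieddate >= ? AND lastmodifieddate < ?",
                conn))
            {
                conn.Open();
                cmd.CommandTimeout = 600;
                cmd.Parameters.AddWithValue("@from", from);
                cmd.Parameters.AddWithValue("@to", to);
                var result = new DataTable();
                result.Load(cmd.ExecuteReader());
                return result;
            }
        }
    }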

Comments (5)

  1. Hi,

    we got exactly the same issues you listed:
    OpenAccessException [SuiteAnalyticsConnect 9880] Session protocol state error.
    [SuiteAnalyticsConnect 9546] Command timed out.
    System.Net.Sockets.SocketException: “An existing connection was forcibly closed by the remote host”
    System.IO.IOException: Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.

    however, when the job runs locally (instead of in the Azure cloud), we have not found any issues!!

    1. Post Author

      I have to be honest – I ended up moving the VM hosting my code from Azure to another platform because of this issue. I suspected there was something going on, and I’m not going to elaborate in a public forum. You might be able to guess what it is. Your reply confirmed it.

      When I moved the app to a different host, the issue went away.

  2. Hi, I’m not sure when this post was written, but were you able to get the ADO.NET driver working eventually?

    I’m running into the same issues you have described here.

    1. Post Author

      No, unfortunately. I spent many hours on the phone with NetSuite’s overseas technical support team. Several tickets were opened. In the end they said they were unable to reproduce the problem. In my professional opinion, I don’t believe the adapter is officially supported. I would use the ODBC driver, or if you need a full replication solution, we offer one as a service: N-SAS. It might save you some heartburn.

  3. I have built an ETL pulling from NetSuite to a SQL data warehouse using ADO.NET (as the general consensus seemed to be that it was superior to an ODBC connection).

    I stumbled upon this article because I have recently seen occasional [SuiteAnalyticsConnect 9546] Command timed out errors.

    Generally though, I have been able to build an ETL that pulls from all the major tables (transaction, transactionline, accountingtransactionline) plus associated dimension tables. I do this using an incremental load wherever the NetSuite table has a lastmodifieddate field, and the refresh will generally complete in around 5 minutes.
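A minimal sketch of the incremental-load pattern described in the comment above, for anyone who wants to copy it.  The connection string, the high-water-mark bookkeeping, and the warehouse write are all placeholders.

    using System;
    using System.Data;
    using System.Data.Odbc;

    static class IncrementalLoad
    {
        // Incremental refresh keyed off lastmodifieddate: remember the highest
        // value already loaded for each table, then pull only rows modified since.
        public static void Refresh(string connectionString, string table,
                                   Func<string, DateTime> loadHighWaterMark,
                                   Action<string, DateTime> saveHighWaterMark,
                                   Action<DataTable> writeToWarehouse)
        {
            DateTime since = loadHighWaterMark(table);

            using (var conn = new OdbcConnection(connectionString))
            using (var cmd = new OdbcCommand(
                $"SELECT * FROM {table} WHERE lastmodifieddate > ?", conn))
            {
                conn.Open();
                cmd.CommandTimeout = 600;
                cmd.Parameters.AddWithValue("@since", since);

                var rows = new DataTable();
                rows.Load(cmd.ExecuteReader());
                writeToWarehouse(rows);

                // Advance the mark only after the rows are safely written.
                if (rows.Rows.Count > 0)
                {
                    DateTime max = since;
                    foreach (DataRow row in rows.Rows)
                    {
                        var modified = (DateTime)row["lastmodifieddate"];
                        if (modified > max) max = modified;
                    }
                    saveHighWaterMark(table, max);
                }
            }
        }
    }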
