My experience with Xplenty

… what I liked, and why I’ve opted not to use their services.

When I first started at Braviant, within a couple of weeks I realized that I would need to build a new data mart, and the question was: how could I do it with four different external data sources? Not to mention having no IT department and no app developers.

At first my plan was to use foreign data wrappers – there are FDWs for all of the data source types I needed: SQL Server, MySQL, .csv files, and Postgres itself. So everything seemed easy, except… well, except that RDS does not support any of these FDWs, aside from the Postgres one.
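For context, here is roughly what that FDW-based plan would have looked like on a self-managed Postgres instance. This is a minimal sketch using postgres_fdw; the server name, credentials, and table are placeholders, not our actual setup:

```sql
-- postgres_fdw ships with Postgres; mysql_fdw and tds_fdw would cover
-- MySQL and SQL Server, but RDS does not allow installing those
CREATE EXTENSION IF NOT EXISTS postgres_fdw;

-- Register the remote server and map a local user to remote credentials
CREATE SERVER source_db
    FOREIGN DATA WRAPPER postgres_fdw
    OPTIONS (host 'source.example.com', port '5432', dbname 'crm');

CREATE USER MAPPING FOR CURRENT_USER
    SERVER source_db
    OPTIONS (user 'etl_reader', password 'secret');

-- Expose a remote table locally; it can then be queried like a native table
CREATE FOREIGN TABLE customers_remote (
    id         integer,
    name       text,
    created_at timestamptz
)
SERVER source_db
OPTIONS (schema_name 'public', table_name 'customers');

SELECT count(*) FROM customers_remote;
```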

I started to look for alternatives, and several people pointed me to Xplenty, so I decided to give it a try. I almost feel bad about choosing, in the end, not to go with their solution, because these folks spent an enormous amount of time discussing my needs and trying their best to accommodate my wants. And I believe that for many organizations it might indeed be a very sensible solution.

Who should consider using Xplenty, and when?

  • The organization has no IT department, or a very small one without much data-integration expertise
  • The number of tables to be integrated is small (or reasonably small)
  • The speed and/or frequency of data refreshes/pulls is not a big concern
  • There is little or no special data processing

One of the definitely positive things about Xplenty is their customer service: they actually get back to you, they talk to you, and they are really focused on resolving your problems. They will give you a sandbox instance to try everything for a week, and you can perform as many data pulls during this trial as you want. They will help you debug your scheduled jobs. Another great thing is that you do not really need to know anything about the external systems except the connection details: all the meta-information will be extracted and processed, and the data will be presented to you.

So why did we end up not using their services? Well, because, as it often happens, things that are good for some customers are not good for others. We needed to map over 300 tables in total, and this was a completely manual process. Besides, it turned out that some column names in our external data sources were reserved keywords in Postgres, so they required special coding – yes, Xplenty supports this option, but again, it is a manual process (see the sketch below).
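To illustrate the keyword problem with a hypothetical example (our actual column names differ): a source column named "order" or "user" collides with Postgres reserved words, so the target column has to be created and referenced with quoted identifiers everywhere:

```sql
-- "order" and "user" are reserved words in Postgres; without the
-- double quotes, both statements below would fail with a syntax error
CREATE TABLE loan_events (
    id      integer,
    "order" integer,   -- quoted: position of the event in a sequence
    "user"  text       -- quoted: who triggered the event
);

-- Every later query must repeat the quoting as well
SELECT "order", "user"
FROM loan_events
WHERE "order" > 1;
```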

There were several other things which also had to be decoded manually; for example, integer 0/1 columns were force-converted to boolean. But the biggest problem was the speed of refresh. Again, if you have just a handful of tables that need to be refreshed a couple of times a day, there is no problem at all. But if you have 300 tables that have to be refreshed every hour or so, and each table takes at least a minute to refresh, a full cycle takes at least 300 minutes – five hours – so… you get the idea.
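As an example of that decoding work (a hypothetical column name; the real schemas differ): where the source kept an integer 0/1 flag, the loaded boolean column had to be cast back for anything downstream that expected the original integer semantics:

```sql
-- The source's integer 0/1 flag arrives as boolean after the load;
-- Postgres supports an explicit cast back (true -> 1, false -> 0)
SELECT id,
       is_active::int AS is_active
FROM customers_loaded;
```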

To summarize: there are lots of cases where Xplenty will be the fastest and easiest solution in terms of delivery. It did not work for us, but on the other hand I do not think any out-of-the-box solution would have worked for us – we ended up with custom development.
