6 Replies Latest reply on Jan 15, 2018 11:12 AM by magrawal
    magrawal Apprentice

    How to sync thingworx internal database with external database?

    Hi All,


    Is there a way to sync my ThingWorx data tables with a MySQL database?


    My actual use case is:


    I have thousands of data tables, each with at least 100,000 records. Until now I have been using the ThingWorx internal database (Neo4j), and to access the data I wrote all my services using the built-in data table snippets provided by the ThingWorx platform.

    Now, because of the huge amount of data, I need to move to a MySQL database. The biggest problem is that I would have to change all the services I've written with the built-in snippets: with MySQL I would need to write my own queries, which is a lot of extra work for me.

    Is there a way to use the same snippets (the data table snippets) with a MySQL database?


    Looking forward to your help.


    Thanks and Regards,


      • Re: How to sync thingworx internal database with external database?
        carlesc Heavyweight Champ

        What about moving to PostgreSQL and Cassandra? Both are available as persistence providers (PostgreSQL for model and stream data, Cassandra for stream data).

        • Re: How to sync thingworx internal database with external database?
          wposner-2 Creator

          Unfortunately, if you have that much data being stored, migrating to a ThingWorx version using PostgreSQL as the default persistence provider won't solve your problems.  There are known and proven performance issues once datatables exceed a certain size.  The only way to overcome these performance issues is to migrate your data to a relational database.  And yes, you will need to write queries to access all your data.  All my designs include a relational database whenever I know I'll be dealing with large volumes of data.  The only data I store in the TWX database are my TWX modeled objects and basic lookup-table-type data to drive list boxes.


          The best way to approach this is to stand up whatever DB instance you'd like (PostgreSQL, MySQL, Oracle, SQL Server) and use the JDBC DB extension to create a new DB Thing which you can configure to talk to your DB instance.  Build your relational DB schema, create stored procedures for complex queries and insert, update, and delete operations, and then create SQL Command services in your DB Thing to call your stored procedures.
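          As a rough sketch of that approach (all table, column, and procedure names here are hypothetical, not part of any existing schema), a MySQL table plus an insert procedure might look like:

          ```sql
          -- Hypothetical table replacing one ThingWorx data table
          CREATE TABLE sensor_readings (
              id          BIGINT AUTO_INCREMENT PRIMARY KEY,
              asset_id    VARCHAR(64) NOT NULL,
              reading     DOUBLE      NOT NULL,
              recorded_at TIMESTAMP   NOT NULL,
              INDEX idx_asset_time (asset_id, recorded_at)
          );

          -- Stored procedure that a SQL Command service on the DB Thing would call
          DELIMITER //
          CREATE PROCEDURE insert_sensor_reading(
              IN p_asset_id    VARCHAR(64),
              IN p_reading     DOUBLE,
              IN p_recorded_at TIMESTAMP
          )
          BEGIN
              INSERT INTO sensor_readings (asset_id, reading, recorded_at)
              VALUES (p_asset_id, p_reading, p_recorded_at);
          END //
          DELIMITER ;
          ```

          The SQL Command service on the DB Thing would then just issue `CALL insert_sensor_reading(...)` with the service's input parameters substituted in (check the JDBC extension's documentation for its exact parameter-substitution syntax).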


          To migrate your data, you'll need to write services that loop through your datatables and call your insert services.
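          A minimal sketch of such a migration service in ThingWorx server-side JavaScript (not runnable outside the platform; "MyDataTable", "MyDatabaseThing", "InsertSensorReading", and the field names are all placeholders, and a real migration should page through entries in chunks rather than pulling everything in one call):

          ```javascript
          // Hypothetical: copy rows from one data table into the relational DB.
          var entries = Things["MyDataTable"].GetDataTableEntries({ maxItems: 10000 });

          for (var i = 0; i < entries.rows.length; i++) {
              var row = entries.rows[i];
              // InsertSensorReading would be a SQL Command service wrapping the insert
              Things["MyDatabaseThing"].InsertSensorReading({
                  assetId: row.assetId,
                  reading: row.reading,
                  recordedAt: row.timestamp
              });
          }
          ```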


          It's a lot of work, but once it's done you'll wonder why you didn't just use a relational DB to begin with.  The flexibility of how you can query for and/or update data is vastly superior to basic datatable functionality. 


          BTW, streams are not immune to these performance issues either.  We've had issues where our streams got so large that the purge operations couldn't keep up with the volume of data, which caused a huge performance hit.  We ended up also using our external relational DB to store stream data and then set up revolving purge jobs that run every week to clean out data older than X.
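          A revolving purge like that can live entirely on the database side. In MySQL, assuming the event scheduler is enabled (the table name and 90-day window below are just placeholders), it might look like:

          ```sql
          -- Hypothetical weekly purge of rows older than 90 days
          -- (requires the event scheduler: SET GLOBAL event_scheduler = ON;)
          CREATE EVENT purge_old_readings
          ON SCHEDULE EVERY 1 WEEK
          DO
              DELETE FROM sensor_readings
              WHERE recorded_at < NOW() - INTERVAL 90 DAY;
          ```

          Alternatively, a ThingWorx Scheduler Thing could invoke a delete service on the DB Thing on the same cadence, which keeps the purge logic visible in the platform.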