1 Reply. Latest reply on Aug 1, 2017 5:08 PM by adamr
    jacekgra Apprentice

    ThingWorx Agent - frequent property update with storage in Stream



    I am wondering what the best approach is when I have a TWX Edge that pushes property updates (~60-70 properties) every 25 ms and I want to store every single property data set (e.g. in a Stream). I have a remote event which the TWX Edge triggers once all properties are updated, and I subscribe to it, so that whenever all properties are updated I build a JSON with all property values and persist it in a stream:


    var properties = me.GetPropertyValues();

    var params = {
         table: properties /* INFOTABLE */
    };

    // result: JSON
    var json = Resources["InfoTableFunctions"].ToJSON(params);

    var values = Things["RawStream"].CreateValues();
    values.thing = me.name; //THINGNAME
    values.properties = json; //JSON

    var entryParams = {
         sourceType: undefined /* STRING */,
         values: values /* INFOTABLE */,
         location: undefined /* LOCATION */,
         source: me.name /* STRING */,
         timestamp: undefined /* DATETIME */,
         tags: undefined /* TAGS */
    };

    // no return
    Things["RawStream"].AddStreamEntry(entryParams);


    However, I am wondering: what if TWX has not yet managed to store the Stream entry (i.e. finished executing the subscription, in particular me.GetPropertyValues()), and in the meantime the Edge updates the properties with new values (which triggers yet another subscription execution)? There is a risk of data loss, since the 'first' subscription would then retrieve the already-updated property values. I want to be 100% sure that I do not lose any data at all.
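    To illustrate the race I am worried about, here is a plain Node.js sketch (not ThingWorx code; all names are made up for the illustration). An "edge" keeps overwriting a shared property table while a slow "subscription" handler only reads it later. Reading at handler time loses the first value; copying the values at event time does not:

    // Plain JavaScript illustration of the lost-update race.
    const properties = { temp: 0 };   // shared, mutable property table
    const queuedHandlers = [];        // simulates subscription executions waiting to run
    const storedLate = [];            // entries stored by the "read-at-handler-time" approach
    const storedSnapshot = [];        // entries stored by the "copy-at-event-time" approach

    function onEdgeUpdate(newTemp) {
      properties.temp = newTemp;            // edge overwrites the property
      const snapshot = { ...properties };   // copy taken the moment the event fires
      // The handler runs later; by then `properties` may hold the NEXT update.
      queuedHandlers.push(() => {
        storedLate.push({ ...properties }); // races with later updates
        storedSnapshot.push(snapshot);      // safe: captured at event time
      });
    }

    onEdgeUpdate(1);                        // first update
    onEdgeUpdate(2);                        // second update arrives before the first handler ran
    queuedHandlers.forEach(h => h());       // handlers finally execute

    console.log(storedLate.map(e => e.temp));      // [ 2, 2 ]  -> first value lost
    console.log(storedSnapshot.map(e => e.temp));  // [ 1, 2 ]  -> nothing lost

    So unless the event itself carries the property values (or they are logged at write time), the subscription can observe newer data than the update that triggered it.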


    Do you have any other ideas (not necessarily using Streams) for persisting a high volume of data coming from the Edge at high frequency, from many (>1000) devices simultaneously?


    Thanks in advance for any ideas.




      • Re: ThingWorx Agent - frequent property update with storage in Stream
        adamr Creator



        This type of setup is what value streams were designed for. Having a subscription fire at this volume will cause issues when scaling to 1000+ devices. Instead, either have a remote subscription for each property and write directly to the property values, or have the JSON property logged to a value stream directly. Which option is best depends primarily on how you need to query the data after it is stored.


        Additionally, at this ingestion rate (40 writes per second per device × 1000 devices = 40,000 writes per second) you would need to use a DSE Cassandra back end and an HA/Federated structure for the platform to ensure no data is lost.
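        As a quick sanity check on that figure (plain JavaScript arithmetic, nothing ThingWorx-specific): one stream entry every 25 ms per device works out as follows.

        // One entry per 25 ms period, per device.
        const periodMs = 25;
        const devices = 1000;

        const writesPerSecondPerDevice = 1000 / periodMs;                 // 40
        const totalWritesPerSecond = writesPerSecondPerDevice * devices;  // 40000

        console.log(writesPerSecondPerDevice, totalWritesPerSecond); // 40 40000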


        We have tested our value stream writes against a DSE server setup in a fully HA environment at up to 1,000,000 writes per second. This does require a lot of setup, administration, and server nodes. You can find information on this setup here - http://support.ptc.com/WCMS/files/173281/en/ThingWorx_8_High_Availability_Administrators_Guide.pdf