1 Reply Latest reply on Jan 18, 2018 3:20 PM by skoessler
    skoessler Newbie

    Description of complete workflow combining TW Edge, TW Analytics, TW Platform

    Hi everyone,


    I'm currently working on a demo case where I want to show how ThingWorx Edge, ThingWorx Analytics, and ThingWorx Platform interact with each other. Currently, on the Edge there is a virtual sensor that generates synthetic data and sends it to the platform. I'm also able to use the ThingWorx Analytics server to generate prediction models for the synthetic data and then retrieve the results.

    Let us now come to my actual question:

    What is the recommended way to stream new data from Edge to the Analytics server and then send the results to the Platform to display e.g. the classification of the current data in a mashup?



    Sebastian K.

      • Re: Description of complete workflow combining TW Edge, TW Analytics, TW Platform
        wposner-2 Creator

        Hi Sebastian...


        The prediction model is one part of the process and is based on training (historical) data.  Your predictive model should be created from a large historical dataset; this is what TWX Analytics uses to create predictive scoring results, provided you have accurately identified your trigger(s), i.e. the attributes in your historical data for which you would like TWX to provide predictive results when you submit new data.  You can also use the predictive model to generate signals, which indicate how predictive your data is, and to generate profiles, which reveal groups of records that over- or under-perform.


        Once you have a predictive model built on solid historical data, you can then submit new/current data to TWX Analytics for scoring.
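        To make the scoring step concrete, here is a minimal sketch (plain JavaScript, not the ThingWorx API; the property and column names are hypothetical) of turning a thing's current property values into a single scoring row. The key point from above is that the row must carry exactly the same columns the model was trained on:

        ```javascript
        // Sketch: build one scoring row from current property values.
        // `columns` are the feature names used in the historical training
        // dataset; names here ("temp", "vibration") are made-up examples.
        function toScoringRow(properties, columns) {
          const row = {};
          for (const col of columns) {
            if (!(col in properties)) {
              // A missing feature would make the scoring request invalid.
              throw new Error("missing feature: " + col);
            }
            row[col] = properties[col];
          }
          return row;
        }

        // Example: only the trained-on columns are kept.
        const row = toScoringRow(
          { temp: 71, vibration: 0.3, rpm: 1500 },
          ["temp", "vibration"]
        );
        ```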


        To get to your actual question: the best way I have found to achieve what you're looking to do is a remote thing whose properties are logged to a value stream.  The value stream will allow you to do time-series analysis, if necessary.  With the logged properties, you simply create a subscription to the DataChange event and then use a service, which you create, to pass your current property values to the Analytics scoring service.  A word of advice: don't submit every property data change for scoring.  Either use the built-in property threshold configuration so that the DataChange event only fires once a certain threshold is exceeded, or figure out if there are specific attributes that exceed a certain standard deviation or some other calculable measurement, and only THEN submit your property values for scoring.
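        The standard-deviation gating idea above can be sketched like this (plain JavaScript, not ThingWorx service code; window size and multiplier are illustrative assumptions): keep a rolling window of recent readings and only flag a new reading for scoring when it deviates more than `k` standard deviations from the window's mean.

        ```javascript
        // Sketch: decide whether a new reading is worth submitting for scoring.
        // windowSize and k are tuning knobs you would pick for your own data.
        function makeScoringGate(windowSize, k) {
          const history = [];
          return function shouldScore(value) {
            let outlier = false;
            if (history.length === windowSize) {
              const mean = history.reduce((a, b) => a + b, 0) / history.length;
              const variance =
                history.reduce((a, b) => a + (b - mean) ** 2, 0) / history.length;
              const std = Math.sqrt(variance);
              // Flag only readings that stand out from recent history.
              outlier = std > 0 && Math.abs(value - mean) > k * std;
            }
            history.push(value);
            if (history.length > windowSize) history.shift();
            return outlier;
          };
        }

        // Warm up with normal readings, then a spike triggers scoring.
        const gate = makeScoringGate(5, 2);
        [10, 12, 11, 9, 10].forEach((v) => gate(v)); // window filling: all false
        ```

        Inside a DataChange subscription you would call the equivalent of `shouldScore(eventData.newValue.value)` and only invoke your scoring service when it returns true.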


        Scoring does NOT happen instantly, so you'll want to make sure you configure things for "Async" execution when you start passing data to the analytics engine for evaluation.
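        The async point can be illustrated with a small sketch (plain JavaScript; `scoreRecord` is a hypothetical stand-in for the slow Analytics scoring round-trip, and the threshold is made up): the data-change handler returns immediately and the prediction is published to the platform whenever scoring completes, rather than blocking while it runs.

        ```javascript
        // Sketch: simulate a slow scoring service with a Promise.
        // The 50 ms delay and the temp > 80 rule are illustrative only.
        function scoreRecord(record) {
          return new Promise((resolve) =>
            setTimeout(
              () => resolve({ prediction: record.temp > 80 ? "fail" : "ok" }),
              50
            )
          );
        }

        // Fire-and-forget handler: returns right away; the result is pushed
        // via the `publish` callback (standing in for updating a mashup-bound
        // property) once scoring finishes.
        function onDataChange(record, publish) {
          scoreRecord(record).then((result) => publish(result.prediction));
        }
        ```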