3 Replies Latest reply on May 26, 2017 11:33 AM by cmorfin
    ajackson Newbie

    Running ThingWatcher



    I'm struggling to get ThingWatcher running. I've deployed both services with Docker as described in the documentation, and then found some sample code here on the forums that uses the SDK.


    Training seems to work fine: the model is saved and I can see the file appear in my volume. I then get exceptions when the SDK tries to load the model by connecting directly to the model service using the URI it has received from the training service. This doesn't work because the URI uses the container name (which my host cannot resolve) and port 8080, not the 8090 that I have mapped it to.


    My docker commands to run the containers are:


    docker run -d \
      -p 8090:8080 \
      -v /home/andrew/tw/models:/data/models \
      -v /home/andrew/tw/db:/tmp/ \
      --name model-service \
      twxml/model-service:1.0 \
      -Dfile.storage.path=data/models \
      -jar maven/model-1.0.jar \
      server maven/standalone-evaluator.yml
    docker run -d \
      -p 8091:8080 \
      --link model-service \
      --name training-service \
      twxml/training-service:1.0.0 \
      -Dmodel.destination.uri=model://model-service:8080/models \
      -jar maven/training-standalone-1.0.0-bin.jar \
      server maven/training-standalone-single.yml


    Once the training is complete, each call to the monitor method throws an exception:


    com.thingworx.analytics.thingwatcher.exceptions.ThingWatcherOperationException: Unable to load the PMML model located at http://model-service:8080/models/1/pmml.xml

      at com.thingworx.analytics.thingwatcher.ModelScorer.initializeTimeSeriesExtensions(ModelScorer.java:116)

      at com.thingworx.analytics.thingwatcher.ModelScorer.<init>(ModelScorer.java:55)

      at com.thingworx.analytics.thingwatcher.ThingWatcher.spinUpScorerAndAnomalyDetector(ThingWatcher.java:283)

      at com.thingworx.analytics.thingwatcher.ThingWatcher.monitor(ThingWatcher.java:166)


    Followed by a load of inner exceptions.


    I can see that http://localhost:8091/training/1 returns JSON containing the modelUri it is trying to load. However, the host can only reach the model at http://localhost:8090/models/1/pmml.xml, so this doesn't work.


    I also tried to set it up without the model service by setting model.destination.uri to a file URI, as described in the documentation, but then the training service returns a file URI pointing to a path inside the container that the host cannot access, so it is really the same problem.


    I'm not very familiar with Docker, so have I missed a step somewhere?



      • Re: Running ThingWatcher
        cmorfin Communicator

        Hi Andrew


        Thank you for the details here, that makes the issue easy to understand :-)

        There are a few things to address:

        1) Because you start your model microservice with docker run -d -p 8090:8080 ..., the model microservice needs to be accessed through port 8090. This means the training microservice should be started with -Dmodel.destination.uri=model://model-service:8090/models rather than -Dmodel.destination.uri=model://model-service:8080/models.
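        Applied to the docker run command from your post, this suggested change would look like the following (only the -Dmodel.destination.uri port differs; everything else is unchanged):

```shell
# Same training-service command as in the question, with the model URI
# pointing at port 8090 instead of 8080
docker run -d \
  -p 8091:8080 \
  --link model-service \
  --name training-service \
  twxml/training-service:1.0.0 \
  -Dmodel.destination.uri=model://model-service:8090/models \
  -jar maven/training-standalone-1.0.0-bin.jar \
  server maven/training-standalone-single.yml
```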


        2) Your analysis is correct: ThingWatcher uses the URI passed back by the training microservice (which comes from -Dmodel.destination.uri) to retrieve the prediction model. However, if you use a named microservice, that name is only known within the Docker network, so your local machine indeed cannot resolve it.

        One way to address this is to add model-service to the hosts file of your client machine (/etc/hosts on Linux, C:\Windows\System32\drivers\etc\hosts on Windows).
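        For example, on Linux the entry could be appended like this (the IP address shown is only a placeholder; substitute your container's actual address):

```shell
# Append a hosts-file entry mapping the container name to its IP.
# 172.17.0.2 is a placeholder; use your container's real address.
echo "172.17.0.2  model-service" | sudo tee -a /etc/hosts
```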

        Another way would be to use the IP address of the Docker container in the -Dmodel.destination.uri parameter. This is less convenient because if the IP address changes (and it might after a restart), a new microservice will need to be created and the current one deleted. Updating the hosts file is easier.


        To find the IP address of the Docker container:

        - on Windows with Docker Toolbox: docker-machine ip

        - on a native platform: docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' container_name_or_id
        Replace container_name_or_id with the name of your container, here model-service.


        Hope this helps

        Kind regards


          • Re: Running ThingWatcher
            ajackson Newbie

            Thanks for getting back to me.


            The first part of your response did not work. While that URI was correct for the host to connect to the model service, it was not correct for the training service to connect to the model service, so it still failed. I was able to work around this by changing the port binding to 8080:8080 for the model service.

            As for the second part, I added a record to the hosts file to resolve model-service to the container's IP address.


            With this I was able to get ThingWatcher running. However, both of these feel more like hacks than fixes, and I wouldn't be comfortable doing this in a production environment.


            I guess I could deploy my code using ThingWatcher into a third Docker container and have all three share a Docker network. Then it would not be necessary to expose any ports, and name resolution would work without changing the hosts file.
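            That approach can be sketched as follows, assuming a user-defined bridge network named tw-net (the name is arbitrary). Containers on the same user-defined network can resolve each other by container name, so no ports need to be published and the model URI can stay on the container-internal port 8080:

```shell
# Create a user-defined bridge network; containers attached to it
# can resolve each other by container name.
docker network create tw-net

# Model service: no -p needed, reachable as model-service:8080 inside tw-net
docker run -d \
  --network tw-net \
  -v /home/andrew/tw/models:/data/models \
  -v /home/andrew/tw/db:/tmp/ \
  --name model-service \
  twxml/model-service:1.0 \
  -Dfile.storage.path=data/models \
  -jar maven/model-1.0.jar \
  server maven/standalone-evaluator.yml

# Training service: the model URI keeps the internal port 8080
docker run -d \
  --network tw-net \
  --name training-service \
  twxml/training-service:1.0.0 \
  -Dmodel.destination.uri=model://model-service:8080/models \
  -jar maven/training-standalone-1.0.0-bin.jar \
  server maven/training-standalone-single.yml

# The ThingWatcher client would run in a third container on the same
# network (my-watcher-image is a hypothetical image name):
# docker run -d --network tw-net --name watcher-app my-watcher-image
```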


            Now that my code runs I have another question: is it possible to associate a ThingWatcher with an existing model on startup? It seems to create a new model each time and redo the learning.