
Subscription - 20,000 property writes per 30 seconds

pselvaraj-1
9-Granite

Hi,

I have a scenario: we have 1,000 devices across multiple regions connected via the Edge SDK. Every 30 seconds, data received from the devices updates 20 Thing properties. I enabled a DataChange subscription to write these entries to a stream. The system becomes unresponsive, entries are not written as expected, and the server hangs. Could someone please advise how to approach this scenario? Thanks in advance.
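The subscription script is roughly like this (the stream name "RawDataStream" and its DataShape fields are placeholders, not our real names):

// DataChange subscription on the device ThingTemplate. It fires once per
// changed property per device, so 1,000 devices x 20 properties every 30 s
// is roughly 20,000 executions (and stream writes) per cycle.
var values = Things["RawDataStream"].CreateValues();  // empty infotable matching the stream's DataShape
values.AddRow({
    deviceName: me.name,                     // assumed DataShape field
    propertyValue: eventData.newValue.value  // VTQ value delivered by the event
});

// No explicit timestamp is passed, so each entry is stamped on arrival at the server.
Things["RawDataStream"].AddStreamEntry({
    values: values,
    source: me.name
});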

7 REPLIES

Why are you using a DataChange event to write the properties to a stream? Why not just enable logging on the properties and assign a value stream to your Thing Template? That will save you from the huge subscription performance hit.
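Once logging is enabled and a value stream is assigned, the platform persists every change for you, and reading the history back is a single built-in service call. A minimal sketch (the maxItems value and one-hour window are just examples):

// On the device Thing: properties marked as Logged are written to the
// assigned value stream automatically -- no subscription code is needed.
// Reading the last hour of logged values back:
var history = me.QueryPropertyHistory({
    maxItems: 1000,
    startDate: dateAddHours(new Date(), -1),
    oldestFirst: true
});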

PaiChung
22-Sapphire I
(To:pselvaraj-1)

There are some settings you can increase for the Event/Stream queue, but if you are overrunning the server, you probably need a more powerful server.

The value stream suggestion is a good one as well.

Thanks for your reply! But the value stream entries are not consistent; sometimes we find gaps in the entries. Besides, we had to do some manipulation at regular intervals, so we moved those calculations to another stream to improve performance instead of querying the raw stream after entries are written to it. For example, we show a trend chart hour by hour. Any other suggestions would be helpful!

PaiChung
22-Sapphire I
(To:pselvaraj-1)

Sounds like the issue you are experiencing is pretty complex.

I'd like to recommend you engage with our Customer Success group to set up a workshop to fully understand your use case and give you expert recommendations.

But my initial guess is that you are not passing a specific timestamp with your property values (VTQ), so values are written to the stream as they arrive; since the writes are asynchronous, they can end up out of order.
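For example, a write that carries the event's own VTQ time rather than the arrival time would look something like this (stream and field names are placeholders):

// Revised DataChange subscription: pass the VTQ timestamp explicitly so
// asynchronous, out-of-order writes still carry the device-side time.
var values = Things["RawDataStream"].CreateValues();
values.AddRow({
    deviceName: me.name,
    propertyValue: eventData.newValue.value
});

Things["RawDataStream"].AddStreamEntry({
    values: values,
    source: me.name,
    timestamp: eventData.newValue.time  // VTQ time from the event, not server arrival time
});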

The missing values are most likely because you are writing entries within about 3 ms of each other, which means that entries are overwritten.

You will need to have that verified, but also have your architecture checked to make sure you aren't overrunning your JVM/processing power as the writes come in and events are fired. Based on that, you'll either have to increase your processing power or, more likely, adjust your design.

Thank you!! Sounds like we need to check the design again.

One other scenario: we write entries to a stream every 15 minutes, which comes to almost 96,000 records per month. One use case is to display trend charts with hour/day/week/month/year filters. So we created a timer that fires every 15 or 30 minutes to run the business logic and move the aggregated data to new streams, such as an hour stream and a daily stream. Data is now fetched from these new streams (hourly data from the hour stream, the other filters from the daily stream) instead of from the raw stream, which has 100,000-plus records. What would be a better approach for this scenario? Please assist.
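Our timer subscription is roughly the following (stream names, field names, and the averaging logic are simplified placeholders):

// Timer subscription firing every 15 min: roll the last window of raw
// entries up into an hour-level stream so charts never hit the raw stream.
var raw = Things["RawDataStream"].QueryStreamEntriesWithData({
    maxItems: 10000,
    startDate: dateAddMinutes(new Date(), -15),
    endDate: new Date(),
    oldestFirst: true
});

// Simple average over the window (the real logic aggregates per device/property).
var sum = 0;
for (var i = 0; i < raw.rows.length; i++) {
    sum += raw.rows[i].propertyValue;
}

if (raw.rows.length > 0) {
    var values = Things["HourStream"].CreateValues();
    values.AddRow({ avgValue: sum / raw.rows.length });
    Things["HourStream"].AddStreamEntry({
        values: values,
        source: me.name,
        timestamp: new Date()
    });
}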

You should definitely offload that data: instead of using streams, use an external DB schema. Then you can query the database directly using stored procedures. This will be much more efficient and significantly more scalable than using streams to store this amount of data.
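The query side then shrinks to a single call against a Database Thing. A sketch, assuming a Database Thing named "ReportingDB" with a custom SQL service "GetHourlyAverages" that wraps your stored procedure (both names are hypothetical):

// Calling a custom SQL/stored-procedure service defined on a Database Thing.
// The result comes back as an infotable, ready to bind to a chart widget.
var result = Things["ReportingDB"].GetHourlyAverages({
    startDate: dateAddDays(new Date(), -7),
    endDate: new Date()
});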

PaiChung
22-Sapphire I
(To:pselvaraj-1)

With that many records, I would recommend bringing a relational database into the design.

Do the high-ingest-rate work through ThingWorx and handle the real-time and near-real-time data there.

Put the historical and archived historical data into the relational database.
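As a sketch, an archive job along these lines keeps the in-platform stream small (the "ReportingDB" Thing, its "ArchiveReadings" service, and the 30-day cutoff are assumptions; QueryStreamEntriesWithData and PurgeStreamEntries are the standard stream services):

// Scheduled (e.g. daily) job: copy entries older than 30 days into the
// relational database, then purge them from the stream.
var cutoff = dateAddDays(new Date(), -30);

var old = Things["RawDataStream"].QueryStreamEntriesWithData({
    maxItems: 100000,
    endDate: cutoff,
    oldestFirst: true
});

if (old.rows.length > 0) {
    Things["ReportingDB"].ArchiveReadings({ readings: old });  // assumed custom INSERT service

    Things["RawDataStream"].PurgeStreamEntries({
        immediate: false,
        startDate: dateAddYears(new Date(), -100),  // arbitrary early lower bound
        endDate: cutoff
    });
}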
