
How to copy content of an existing valuestream (cloning)

tcoufal
12-Amethyst


Hi guys,

Finally Jive is up and running, so I can post my questions.

Does anyone know how to clone Value Streams? I am re-designing one of my older projects and need to rename the Value Streams used. Renaming directly is obviously not possible, but I don't want to lose any of the data I've collected. The Value Stream will be assigned to the same Thing Templates, and the properties are the same.

I know how to clone Streams, but I am not sure how to do it for Value Streams (because of timestamps and so on).

Any ideas?

Thanks a lot

9 REPLIES

You can export everything to a CSV file, and then re-import it into the new Value Stream, for instance.

tcoufal
12-Amethyst
(To:CarlesColl)

Hi Carles,

thanks for that,

Could you elaborate on this? I will use the Data Exporter to get the CSV out, which will contain the logged properties and the timestamps of their logging.

Which service should I use to fill the new Value Stream with that CSV data?

Thanks a lot

Hi Tomas, don't use the Data Exporter extension; use the CSV Parser one instead: http://marketplace.thingworx.com/Items/csv-parser You can write/read CSV files with it.

tcoufal
12-Amethyst
(To:CarlesColl)

I will check this extension and let you know if I get stuck.

Tomas

tcoufal
12-Amethyst
(To:CarlesColl)

Hi Carles,

I am playing around with that CSV Parser, and it works quite OK.

But I am facing a minor issue. I am reading the content of a Value Stream (only one property + timestamp) and putting everything into a CSV file.

When I read that file back and try to view the result, CSVReader needs a Data Shape to handle the file correctly. If I use the Data Shape of QueryPropertyHistory (NUMBER + DATETIME), it returns an error saying it cannot convert a string to a datetime. I thought that I could change the result InfoTable's timestamp to a number in a loop like this:

params.....

var result = me.QueryPropertyHistory(params);
var tableLength = result.rows.length;

for (var x = 0; x < tableLength; x++) {
    var row = result.rows[x];
    var timestamp = (new Date(row.timestamp).getTime()) * 1000;
    var timestampString = timestamp.toString();
    row.timestamp = timestampString;
}

But it has no effect on the InfoTable, as I expected it would. I guess the Data Shape is hidden inside the InfoTable object and gets passed to the CSV service, which then takes it and converts values to the appropriate formats. I think that's OK. I wanted to create a sort of copy of the result with my own defined Data Shape; some problems with indexes (although I am pretty sure I got that right), but I can deal with that later.
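As a side note on the conversion itself: `getTime()` already returns epoch *milliseconds*, so multiplying by 1000 produces a value 1000× too large, and reading it back with `new Date(...)` lands far in the future. A plain-JavaScript sketch of a lossless round-trip (runnable outside ThingWorx; the example date is arbitrary):

```javascript
// Round-trip a DATETIME-style value through a numeric string and back.
// getTime() is already epoch milliseconds; no extra scaling is needed.
var original = new Date("2016-03-01T12:00:00Z");

var asMillis = original.getTime();          // epoch milliseconds
var asString = asMillis.toString();         // safe to write into a CSV cell

var restored = new Date(Number(asString));  // identical instant
```

Storing the millisecond value as a string keeps the CSV layer from trying to parse it as a date, which is exactly the point of swapping the DATETIME column for a numeric one.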

Do you know how to easily change the current InfoTable property's Data Shape? I.e.:

var result = me.QueryPropertyHistory(params);
result.changeDataShape = myDataShape;


But my major problem is that my Value Stream is used by many Things (100 of them) that are instances of the same Thing Template. Each has about 8 logged properties (NUMBERs) + timestamp, so if I want to insert 8 properties into a new Value Stream I should use AddNumberValueStreamEntry 8 times, correct? (That service has an EventTime parameter; that would be my timestamp, I suppose, so the time series will not be affected.)

So far so good, I hope. But the problem is that the Value Stream operations are Thing-based. So it would mean that I need to place this service into the Thing Template and then write another service which calls it on every implementing Thing of that template in a loop, correct?
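That template-plus-utility split is the usual ThingWorx pattern: a per-Thing service defined on the Thing Template, and one utility service that loops over `ThingTemplates[...].GetImplementingThings()` and invokes it on each Thing. A plain-JavaScript sketch of just the calling pattern, with a mocked `Things` registry (all names here are hypothetical, not real entities):

```javascript
// Mocked registry: each "Thing" exposes the per-Thing service defined
// on its template (here a stub that just reports which file it wrote).
var Things = {
    "Pump-001": { ExportMyHistory: function () { return "Pump-001.csv"; } },
    "Pump-002": { ExportMyHistory: function () { return "Pump-002.csv"; } }
};

// Stand-in for ThingTemplates["MyTemplate"].GetImplementingThings():
// in the platform this returns an InfoTable of thing names.
function getImplementingThings() {
    return Object.keys(Things);
}

// Utility service: call the per-Thing export on every implementing Thing.
function exportAll() {
    var files = [];
    var names = getImplementingThings();
    for (var i = 0; i < names.length; i++) {
        files.push(Things[names[i]].ExportMyHistory());
    }
    return files;
}
```

Inside ThingWorx the loop body would be `Things[name].ExportMyHistory()` against the real registry; everything else stays structurally the same.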


I am failing to see the benefit of having the CSV Parser.

Have you tried this? Do you have some code you could share?


Thanks a lot


Tomas

Hi Tomas,

What I would do:

  1. List all logged properties by baseType - on the Thing: me.GetLoggedProperties({ type: 'NUMBER' })...
  2. Query the contents of the logged properties with "Query"+baseType+"PropertyHistory"({ maxItems: BIG_VALUE, propertyName: "the ones from the previous step" }) -> the Data Shape is baseType+"ValueStream" ( id, timestamp, value )
    1. Add a time field to the previous result with DeriveFields:
      1. result = Resources["InfoTableFunctions"].DeriveFields({types:"LONG", t: result, columns: "time", expressions: "timestamp.getTime()" });
    2. Remove the timestamp field:
      1. result.RemoveField("timestamp");
  3. Write the result from the previous query as-is with WriteCSVFile, to a file called ( "ThingName"+"."+"PropertyName"+".csv" )
  4. Now you can change the Value Stream on the Things
  5. List all logged properties by baseType again...
  6. (Beforehand) Construct a Data Shape for each baseType, e.g. MyBaseTypeValueStream, with fields "id", "time", "value"
  7. Now you can read the CSV file with ReadCSVFile, using the previous Data Shape
  8. You have the loaded CSV in memory; let's reshape it:
    1. Recover the timestamp:
      1. result = Resources["InfoTableFunctions"].DeriveFields({types:"DATETIME", t: result, columns: "timestamp", expressions: "new Date(time)" });
    2. Remove the time field:
      1. result.RemoveField("time");
  9. Now you can freely call "Add"+baseType+"ValueStreamEntry"
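The column swap in steps 2 and 8 (DATETIME out to a LONG on export, back to a DATETIME on import) can be illustrated in plain JavaScript, modelling InfoTable rows as an array of plain objects. The sample data is made up; inside ThingWorx the same effect comes from the real DeriveFields and RemoveField services rather than the map calls below:

```javascript
// Export side (step 2): derive a numeric "time" column, drop "timestamp".
var queried = [
    { id: "Pump-001", timestamp: new Date(1456833600000), value: 42.5 },
    { id: "Pump-001", timestamp: new Date(1456833660000), value: 43.1 }
];
var csvRows = queried.map(function (r) {
    // Same row, but the DATETIME becomes a LONG the CSV layer won't parse.
    return { id: r.id, time: r.timestamp.getTime(), value: r.value };
});

// Import side (step 8): recover "timestamp" from "time", drop "time".
var restored = csvRows.map(function (r) {
    return { id: r.id, timestamp: new Date(r.time), value: r.value };
});
```

Because epoch milliseconds survive the string round-trip through the CSV file exactly, `restored` carries the same instants as the original query result, ready for the "Add"+baseType+"ValueStreamEntry" calls with EventTime set from each row.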
kr1
3-Visitor
(To:tcoufal)

Hi Tomas,

Sorry to deviate from your question. I have a query; perhaps you could help me with a method to clone Streams. I have huge data in my Stream and want to delete the older entries in it after cloning. Please suggest a method that does not use for loops.

tcoufal
12-Amethyst
(To:kr1)

Hi,

Are you talking about a Value Stream or a Stream?

I don't know if I can recommend it, but I am now doing some heavy-duty stuff directly via SQL.

I was logging incoming SQL statements in the DB (which is super easy in PostgreSQL; I once did the same on MySQL and it was terrible) to find out what is happening "under the hood", and it all seemed pretty straightforward. So you can use SQL to copy records in the stream table and delete what you want. Obviously the DB will not check whether parameters (stream name, source, tags, etc.) refer to existing entities in ThingWorx or not, but if you are careful it will help you speed things up.

Carles, what do you think, if you are reading this?

If it's really huge data, maybe Tomas Coufal's solution (custom SQL) would be the most viable, of course only as a one-time task (and of course only if you are on PostgreSQL or another relational database; if you are on Neo4j or H2 you are out of luck).
