Custom data types for a blob of 200,000 char

Hi,

We are currently investigating how to transport internal tick-data messages containing large string objects. One idea is to publish the data through ATS to TREP, and distribute it internally via an internal feed. Do you have a supported custom data type for string sizes of 200,000 characters?


Kind regards,

Johan

Best Answer

  • Hi @johan.lundquist

    You can write a Provider application (either Interactive or Non-Interactive, depending on your requirement/preference) to publish your data.

    Would you intend to publish the 200k characters only in a RefreshMsg, or in an UpdateMsg as well? If only in the RefreshMsg, would the UpdateMsg payload be less than 64k?

    If only in a RefreshMsg (with sub-64K UpdateMsgs), and if you are able to split the 200k, then you could use the OMM support for multipart RefreshMsgs to split the payload across the RefreshMsgs and set the Complete flag on the final RefreshMsg. The sub-64k UpdateMsgs could then be published as normal.
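    The multipart split described above can be sketched in pseudo-form (this is plain Python, not Refinitiv API code; the 63K chunk size is an assumption chosen to stay safely under the 64K message limit):

    ```python
    # Split a 200k-character payload into sub-64K chunks,
    # one chunk per multipart RefreshMsg.
    MAX_PART = 63 * 1024  # assumed safe chunk size, below the 64K limit

    def split_payload(blob: str, max_part: int = MAX_PART) -> list[str]:
        return [blob[i:i + max_part] for i in range(0, len(blob), max_part)]

    blob = "x" * 200_000
    parts = split_payload(blob)

    for part_num, part in enumerate(parts):
        complete = (part_num == len(parts) - 1)
        # A provider would publish each chunk as a RefreshMsg carrying this
        # part number, setting the Complete flag only on the final part.
        print(f"partNum={part_num} size={len(part)} complete={complete}")
    ```

    The consumer reassembles the parts in part-number order until it sees the part flagged as complete.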

    You can also use payloads larger than 64k, but we do not recommend this as the TREP infrastructure is optimised for 64k message sizes.

    You don't necessarily have to use a custom domain model - you could use MarketPrice and include a Buffer type field, which allows up to 65,535 bytes.
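    One detail worth noting when sizing data for such a field: the 65,535 limit is in bytes, not characters, so multi-byte encoded characters consume extra space. A minimal check (plain Python sketch, not Refinitiv API code):

    ```python
    # Check whether a string fits in a single Buffer field once UTF-8 encoded.
    MAX_BUFFER_BYTES = 65_535  # Buffer field limit quoted above

    def fits_in_buffer_field(value: str) -> bool:
        return len(value.encode("utf-8")) <= MAX_BUFFER_BYTES

    print(fits_in_buffer_field("x" * 60_000))   # True
    print(fits_in_buffer_field("x" * 200_000))  # False
    ```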

    If you cannot split the 200k across fields or messages, then you could zip the 200k before populating the field; the consumer then has to unzip the field at their end. For this scenario it would be better to define a custom domain model, to differentiate it from the existing domains. Please see one of our Machine Readable News tutorials for an example of using zip to transport larger payloads.
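    The zip-then-publish idea looks roughly like this (a Python sketch using zlib; the sample curve string is purely illustrative):

    ```python
    import zlib

    # Producer side: compress the serialized blob before populating the field.
    curve_blob = "tenor=1Y;rate=1.2345|" * 10_000   # ~210k characters, illustrative
    raw = curve_blob.encode("utf-8")
    compressed = zlib.compress(raw, level=9)
    print(f"raw={len(raw)} bytes, compressed={len(compressed)} bytes")

    # Consumer side: unzip the field back into the original string.
    restored = zlib.decompress(compressed).decode("utf-8")
    assert restored == curve_blob
    ```

    How well this works depends entirely on how compressible the serialized curve data actually is; highly repetitive text shrinks dramatically, while already-compact binary data may not fit under 64k even after compression.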

    You may also find this post useful in terms of payloads > 64K and this post on large generic messages.

Answers

  • Thank you @Umer for the quick and precise answer,

    The 200k message should be a RefreshMsg followed by UpdateMsgs. I don't yet know the expected size of those updates; the payload is a serialized object of internal curve models.

    I'll begin by investigating splitting the RefreshMsg using partNum, and hopefully I'll be able to publish the updates normally (size < 64k).

    Kind regards,

    Johan

  • Hi @johan.lundquist,

    You may find some information regarding custom domains in this article. The article demonstrates how to create a new custom domain to publish large binary data.

  • Thank you, I have another case in the backlog where this might become useful. Cheers.

  • By the way, I just had internal discussions on this, and one application is currently using ATS to publish on an internal feed. Would that be doable here as well, using the primitive type Buffer for the large string objects up to 64k?

    Thanks,

    Johan

  • Hi,

    When I last looked into ATS maximum message sizes, I was told that whilst ATS does not have a record size limit as such, the ADS would limit any Post messages you send to ATS to the 64K limit.

    Are you planning to Post messages to ATS or use ATS models / records to actually generate the payload itself?

  • The plan would be to post messages to ATS with the payload - do you see any immediate issues with that approach?

    Otherwise, what do you suggest: use ATS to generate the payload itself, or actually write an OMM Publisher?

  • Hi @johan.lundquist

    I have not used ATS for a considerable time now, so I will try to reach out and see if I can get an answer to your question.

    Whilst Posting to ATS is relatively simple - compared to writing a provider - I expect the Provider approach would be more efficient. You may want to discuss with your ATS admin team whether they have any concerns about your large payloads - unless you only plan to post a few large messages infrequently.

    In terms of ATS generating the payload, it does have an expression builder to generate fairly complex records and can also import libraries containing your own C/C++ functions. I recommend you reach out to your Refinitiv Account team to set up a discussion with an ATS expert, and/or refer to the ATS documentation to see if it would meet your requirements.

    If you do prefer Posting over using a Provider, and your ATS team are not keen on the ATS approach, then you could also discuss with your Market Data team whether they can set up an ADH Cache service for you - where you post the data into the Cache service and your consumers consume from that service.

  • Thanks again for the very good and detailed guidance - we'll go ahead and write prototypes for a step-wise approach.

    Kind regards,

    Johan

  • Hi @johan.lundquist

    I came across a set of ATS with Elektron SDK articles, including one on publishing custom data.

    They don't specifically deal with large data blobs - but you may find them helpful...

  • Hi

    We tried using the MarketFeed model to store curves. I added the following field to RDMFieldDictionary:

    CALC_CURVE "CURVE STRING" -4001 NULL ALPHANUMERIC 40000 RMTES_STRING 40000

    Is there any limitation in SSL on the size of such a field? I used rmdstestclient with SSL to insert data into the CALC_CURVE field, but I got an error for more than approx. 206 characters.


    Kind Regards

    Krzysztof

  • Hi @krzysztof.kujawski.1

    I'm not a MarketFeed expert, but as far as I am aware MarketFeed has a 255 character limit - e.g. news story text fields were always limited to 255 characters - and allowing for tags etc. in the MarketFeed encoding could explain your 206 limit.