New RD library for Python is leaking memory when opening/closing streams

Hey

We are trying to use the new Refinitiv Data Library for Python, but we are experiencing memory leaks when opening/closing pricing streams.


[attached screenshot: 1682936446719.png]


Can you please investigate and fix?


Over 20 iterations of the above stream opening and closing, the memory increases as below:

Loop #1

Memory before open stream = 103.12109375

Memory after open stream = 103.1953125

Memory after get_snapshot = 103.1953125

Memory after close stream = 103.1953125

End loop #1

[...]

Loop #19

Memory before open stream = 103.5234375

Memory after open stream = 103.5234375

Memory after get_snapshot = 103.5234375

Memory after close stream = 103.5234375

End loop #19


I am attaching a small Python file where this can be tested and debugged: testRDP.txt.
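
In outline, the test does something like this (a minimal sketch, not the exact contents of testRDP.txt; it assumes the pricing content layer of refinitiv-data and psutil for the memory readings):

import os

import psutil
import refinitiv.data as rd


def memory_mb() -> float:
    # Resident set size of this process in MB, matching the numbers in the log
    return psutil.Process(os.getpid()).memory_info().rss / (1024 * 1024)


rd.open_session()  # default session; assumes the library configuration/app key is already set up

for i in range(1, 21):
    print(f"Loop #{i}")
    print(f"Memory before open stream = {memory_mb()}")

    stream = rd.content.pricing.Definition(
        universe=["EUR="], fields=["BID", "ASK"]
    ).get_stream()

    stream.open()
    print(f"Memory after open stream = {memory_mb()}")

    stream.get_snapshot()
    print(f"Memory after get_snapshot = {memory_mb()}")

    stream.close()
    print(f"Memory after close stream = {memory_mb()}")
    print(f"End loop #{i}")

rd.close_session()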


Best regards

Sabina

Best Answer

  • aramyan.h
    Answer ✓

    Hi @Sabina Tanase ,


    I have just learned from the Dev team that the issue is now fixed in version 1.4.0 (a quick way to confirm your installed version is sketched at the end of this answer). Please see below the before and after memory graphs:

    [attached screenshot: screenshot-2023-10-13-at-093553.png]

    [attached screenshot: screenshot-2023-10-13-at-093516.png]

    The current fix optimises how stream listeners are handled; the optimisation of library imports will be delivered in 1.5.0, and the internal callback mechanisms will be optimised further in future releases.


    Best regards,

    Haykaz
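
    For reference, a quick way to confirm which version of the library is installed (this sketch uses only the standard library, so it makes no assumptions about the RD API):

    from importlib.metadata import version

    print(version("refinitiv-data"))  # expect 1.4.0 or newer for the stream-listener fix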

Answers

  • Hi @Sabina Tanase ,


    Thank you for your question. I will redirect your question to the Dev team and come back to you with an answer as soon as I get one.


    Hope this works,


    Best regards,

    Haykaz

  • Hi @Sabina Tanase ,


    My colleague @nick.zincone raised this with the Dev team. The issue is under investigation; I will update you accordingly.


    Best regards,

    Haykaz


  • Hey Haykaz, Nick


    Thank you - looking fwd to your answer.


    Best regards

    Sabina.

  • Hey

    Any updates?

    Best regards

    Sabina.

  • Hey Haykaz,

    Any updates on this?

    Best regards

    Sabina.

  • Hi Sabina, no updates unfortunately; I have followed up with the dev team.
  • Hi @Sabina Tanase ,


    The dev team came back to me with the following observation:

    -----

    In this particular case, yes, we do see some memory consumption growth over 20 iterations.


    If we extend the use case to 500 or 1000 iterations, I see that memory consumption starts at 138.42 MB, peaks at 139.04 MB, falls to 123.03 MB, peaks again at 123.85 MB, and falls to 108 MB. By the end of the 1000 iterations it has dropped to 64.95 MB. We can assume the garbage collector runs once certain conditions are met and the process releases memory. Tested with Python 3.10 on macOS Ventura 13.4, refinitiv-data 1.3.0.


    Our library is a really thin layer; in the case from the example, we just store the latest update callback on the stream, that's it.


    If the customer has a practical, real-world streaming use case that results in growing memory consumption, please share it.

    -----

    Please let me know what you think, and whether you can provide a practical use case that the team can investigate further.


    Best regards,

    Haykaz

  • Hey Haykaz ( @h.aramyan01 )

    Thank you for the reply.

    I have increased the loop to 5000 iterations and tested with the garbage collector forced to run, both outside and inside the loop (roughly as in the sketch at the end of this reply). The result is unfortunately the same for us: memory keeps increasing.

    I have attached the program I have been using (testrdp.txt), which was run on Windows 10 using the Python lib "refinitiv-data==1.3.0".

    This is how the memory consumption looks for 5000 iterations using testrdp.py: [attached screenshot: mem.png]

    I would have assumed that stream.close() would release the memory for me, as well as release the handles used by the respective stream.

    Please let me know whether the developers are testing with my testrdp.py, or could you please share their code?

    Best regards

    Sabina.
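
    For reference, forcing the garbage collector outside and inside the loop looks roughly like this (a minimal sketch, not the exact contents of testrdp.txt; it assumes the same pricing-stream loop as before and psutil for the memory readings):

    import gc
    import os

    import psutil
    import refinitiv.data as rd


    def memory_mb() -> float:
        # Resident set size of this process in MB
        return psutil.Process(os.getpid()).memory_info().rss / (1024 * 1024)


    rd.open_session()
    gc.collect()  # force a collection outside the loop as a baseline
    print(f"Baseline memory = {memory_mb()}")

    for i in range(5000):
        stream = rd.content.pricing.Definition(universe=["EUR="]).get_stream()
        stream.open()
        stream.get_snapshot()
        stream.close()

        del stream    # drop the local reference to the closed stream
        gc.collect()  # force a collection inside the loop as well
        if i % 100 == 0:
            print(f"Iteration {i}: memory = {memory_mb()} MB")

    rd.close_session()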

  • Hello @Sabina Tanase ,


    Thank you very much for your patience. I have just got a follow-up from the Dev team after I sent them your notes above. The team told me they are in the process of making some adjustments to streams. Currently, the change is under review and testing. They already have some intermediate improvements, but they will release them only once the task is completely done.

    I will let you know separately when you can upgrade the library to pick up the change.


    Best regards,

    Haykaz

  • Hey Haykaz

    Super nice!

    We will start testing in the next 2 weeks, and we will let you know how it looks on our side.

    Thank you for the efforts,

    Sabina.

  • Sounds good, looking forward to the update from your side.
  • Hello @Sabina Tanase

    Do you still encounter the issue?

  • Hello @Sabina Tanase

    Does the issue still exist?