New RD library for Python is leaking memory when opening/closing streams
Hey
We are trying to use the new Refinitiv Data Library for Python, but we are experiencing memory leaks when opening/closing pricing streams.
Can you please investigate and fix?
Over 20 iterations of opening and closing the stream, memory increases as shown below:
Loop #1
Memory before open stream = 103.12109375
Memory after open stream = 103.1953125
Memory after get_snapshot = 103.1953125
Memory after close stream = 103.1953125
End loop #1
[...]
Loop #19
Memory before open stream = 103.5234375
Memory after open stream = 103.5234375
Memory after get_snapshot = 103.5234375
Memory after close stream = 103.5234375
End loop #19
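For reference, here is a minimal, self-contained sketch of the kind of open/get_snapshot/close loop described above. Since the real test needs the refinitiv-data library and a session, a hypothetical FakeStream stands in for the pricing stream, and memory is tracked with the stdlib tracemalloc module rather than process memory:

```python
import gc
import tracemalloc

class FakeStream:
    """Hypothetical stand-in for a refinitiv-data pricing stream,
    so the open/close measurement loop is runnable without credentials."""
    def __init__(self):
        self._cache = None

    def open(self):
        self._cache = [0.0] * 50_000   # simulate per-stream allocations

    def get_snapshot(self):
        return len(self._cache)

    def close(self):
        self._cache = None             # a leak would mean references survive this

tracemalloc.start()
baseline, _ = tracemalloc.get_traced_memory()
for i in range(20):
    stream = FakeStream()
    stream.open()
    stream.get_snapshot()
    stream.close()
gc.collect()
current, _ = tracemalloc.get_traced_memory()
print(f"net growth after 20 open/close cycles: {current - baseline} bytes")
```

With a correct close(), the net growth stays near zero; a leaking stream would show growth proportional to the iteration count.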
I am attaching a small Python file (testRDP.txt) where this can be tested and debugged.
Best regards
Sabina
Best Answer
-
Hi @Sabina Tanase ,
I have just learned from the Dev team that the issue is now fixed in version 1.4.0. Please see the before and after memory graphs below:
The current fix optimises how stream listeners are handled; optimised library imports will be delivered in 1.5.0, and the internal callback mechanisms will be optimised further in future releases.
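To illustrate the kind of listener problem such a fix typically targets (this is a hypothetical sketch, not the library's actual internals): if a module-level registry keeps strong references to per-stream callbacks, a closed stream can never be garbage collected. A weakref makes the difference observable:

```python
import gc
import weakref

# Hypothetical listener registry holding strong references to callbacks.
_listeners = []

class Stream:
    def open(self):
        self._on_update = lambda msg: (self, msg)  # closure captures the stream
        _listeners.append(self._on_update)

    def close(self, deregister):
        if deregister:                             # the "fix": drop the listener
            _listeners.remove(self._on_update)
        self._on_update = None

def survives_close(deregister):
    s = Stream()
    s.open()
    ref = weakref.ref(s)
    s.close(deregister)
    del s
    gc.collect()
    return ref() is not None                       # True means the stream leaked

print(survives_close(deregister=False))  # True: registry still holds the callback
print(survives_close(deregister=True))   # False: stream is collectable
```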
Best regards,
Haykaz
Answers
-
Hi @Sabina Tanase ,
Thank you for your question. I will redirect it to the Dev team and come back to you with an answer as soon as I get one.
Hope this works,
Best regards,
Haykaz
-
Hi @Sabina Tanase ,
My colleague @nick.zincone raised this with the Dev team. The issue is under investigation; I will update you accordingly.
Best regards,
Haykaz
-
Hey Haykaz, Nick
Thank you, looking forward to your answer.
Best regards
Sabina.
-
Hey
Any updates?
Best regards
Sabina.
-
Hey Haykaz,
Any updates on this?
Best regards
Sabina.
-
Hi Sabina, no updates unfortunately; I have followed up with the dev team.
-
Hi @Sabina Tanase ,
The dev team came back to me with the following observation:
-----
In this particular case, yes, we do see some memory consumption growth over 20 iterations.
If we extend the test to 500 or 1000 iterations, memory consumption starts at 138.42 MB, peaks at 139.04 MB, falls to 123.03 MB, peaks again at 123.85 MB, and falls to 108 MB; by the end of 1000 iterations it had fallen to 64.95 MB. We can assume the garbage collector runs once certain conditions are met and the process then releases memory. Tested with Python 3.10 on macOS Ventura 13.4, refinitiv-data 1.3.0.
Our library is a really thin layer; in the case in the example, we just store the last update callback on the stream, that's it.
If the customer has a practical, live streaming use case that results in growing memory consumption, please share it.
-----
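The deferred release the dev team describes is consistent with CPython's generational garbage collector, which only runs when allocation thresholds are crossed: objects caught in reference cycles sit in memory until a collection actually happens. A small stdlib sketch of that effect:

```python
import gc

class Node:
    """Objects in a reference cycle can only be freed by the cyclic GC,
    not by reference counting alone."""
    def __init__(self):
        self.ref = None

gc.disable()                     # simulate the GC not having run yet
for _ in range(1000):
    a, b = Node(), Node()
    a.ref, b.ref = b, a          # each iteration leaves an unreachable cycle
del a, b
freed = gc.collect()             # an explicit collection reclaims them all
gc.enable()
print(f"gc.collect() reclaimed {freed} unreachable objects")
```

This is why memory can appear to grow for many iterations and then drop sharply, as in the dev team's 1000-iteration run.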
Please let me know what you think, and whether you can provide a practical use case that the team can investigate further.
Best regards,
Haykaz
-
Hey Haykaz ( @h.aramyan01 )
Thank you for the reply.
I have increased the loop to 5000 iterations and tested with forcing the garbage collector to run (both outside and inside the loop). Unfortunately the result is the same for us: memory keeps increasing.
I have attached the program I have been using (testrdp.txt), which was run on Windows 10 with the Python library "refinitiv-data==1.3.0".
This is how the memory consumption looks for 5000 iterations using testrdp.py:
I would have assumed that stream.close() would release the memory for me, as well as the handles used by the respective stream.
Please let me know whether the developers are testing with my testrdp.py, or share their code if not.
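One way to move a discussion like this forward is to pinpoint exactly which allocations grow across iterations. A stdlib sketch (with a deliberately leaky stand-in function, since the real loop needs refinitiv-data) that diffs tracemalloc snapshots to trace growth back to a file and line:

```python
import tracemalloc

leaked = []

def one_iteration():
    # hypothetical stand-in for open/get_snapshot/close; it leaks on purpose
    leaked.append(bytearray(10_000))

tracemalloc.start()
before = tracemalloc.take_snapshot()
for _ in range(100):
    one_iteration()
after = tracemalloc.take_snapshot()

# Statistics are sorted by absolute size difference, biggest first, and each
# entry carries the source location of the allocation site.
top = after.compare_to(before, "lineno")[0]
print(top)
```

Run against the real 5000-iteration loop, the top entries would show whether the growth comes from the application code or from inside the library.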
Best regards
Sabina.
-
Hello @Sabina Tanase ,
Thank you very much for your patience. I have just got a follow-up from the Dev team after sending them your notes above. The team told me they are in the process of making some adjustments to streams; the change is currently under review and testing. They already have some intermediate improvements, but will release them once the task is complete.
I will inform you separately to upgrade the lib for the change.
Best regards,
Haykaz
-
Hey Haykaz
Super nice!
We will start testing in the next 2 weeks, and we will let you know how it looks on our side.
Thank you for the efforts,
Sabina.
-
Sounds good, looking forward to the update from your side.
-
Hello @Sabina Tanase
Do you still encounter the issue?
-
Hello @Sabina Tanase
Does the issue still exist?