How to fetch large amounts of data efficiently

Hi,

I've noticed that it takes a very long time to fetch large amounts of data. For example, I'm currently trying to fetch the mid price and yield for 50,000 securities for all of 2020. It takes around a day of loading to get one month of data, and I was wondering how to do this more efficiently?

I'm not sure whether it's our usage of the Eikon Data API, or perhaps some existing inefficiencies in the code that interacts with our database, that is causing this to take so long.

Best Answer

  • zoya faberov
    Answer ✓

    Hello @kevin.guo,

In terms of requesting large data sets with the Eikon Data API, you may find this previous discussion relevant for gauging what to expect in terms of performance.
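One common pattern for large universes is to split the instrument list into smaller batches and issue one `ek.get_data` call per batch. A minimal sketch follows; the field names (`TR.MIDPRICE`, `TR.YIELDTOMATURITY`) and the batch size of 500 are assumptions you should adjust to your entitlements and observed throughput:

```python
def chunks(items, size):
    """Yield successive slices of `items`, each with at most `size` elements."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


def fetch_in_batches(rics, fields, batch_size=500):
    """Fetch `fields` for `rics` in batches and return one concatenated DataFrame.

    Assumes an app key has already been set via ek.set_app_key(...).
    """
    import eikon as ek  # imported here so the helper above stays dependency-free
    import pandas as pd

    frames = []
    for batch in chunks(rics, batch_size):
        df, err = ek.get_data(batch, fields)  # one request per batch of RICs
        if err:
            print("partial errors for batch:", err)
        frames.append(df)
    return pd.concat(frames, ignore_index=True)


# Hypothetical usage:
# df = fetch_in_batches(my_rics, ["TR.MIDPRICE", "TR.YIELDTOMATURITY"])
```

Smaller batches also make it easier to retry a single failed request instead of restarting the whole pull.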

I am assuming that, for verification purposes, you can temporarily disable the database insert path and test retrieval via the Eikon Data API separately.
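To isolate the two paths as suggested above, a simple wall-clock wrapper is enough to see whether the time is spent in API retrieval or in the database insert. `fetch_batch` below is a hypothetical stand-in for your retrieval call:

```python
import time


def timed(fn, *args, **kwargs):
    """Run fn, print its elapsed wall-clock time, and return its result."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    print(f"{fn.__name__} took {elapsed:.2f}s")
    return result


# Hypothetical usage: df = timed(fetch_batch, rics, fields)
```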

For requesting really large historical data sets quickly and asynchronously over HTTP REST in Python, you may wish to take a look at a different product; see the Tick History API usage and product specs.