Downloading data in bulk: some Calendar Years (CY) return no data for some variables
I am downloading data in bulk for all public firms in 47 countries. I first used the Excel add-in, but when I realized that this method was inefficient, I moved to Python. I used exactly the same fields and parameters in the add-in and in Python, yet for some variables and some specific years the data downloaded through Python is completely missing. I know this because I compared the data downloaded through Python with the data I had previously downloaded using the add-in.
Most notably, for firms from China ('CN'), the data for most of the variables I tried to download was missing from 1999 to 2016, yet for some reason the same variables were downloaded completely for 2017 to 2020. All with the same code in a single run! The same happened for the Latin American countries ('PE', 'CL', 'BR', 'CO', 'MX', 'AR', 'EC'), but ONLY for 2013; the data was downloaded completely for all variables in every other year.
I ran it again to check whether this was a random occurrence, and it was not: it happens every time I run the code. I also tried downloading using only the year (e.g. '2013') instead of 'CY2013'; in that case the data is downloaded completely, but I am not sure whether the default period is calendar year or fiscal year. Moreover, the market capitalization data changes completely when I use only the year (e.g. '2013') instead of 'CY2013'.
I would really appreciate any guidance on this, considering that I am currently downloading annual data and am planning to download quarterly data soon, where there will be even more scope for problems in the downloading process.
Thank you in advance for the attention given to this question.
I am using this code in Python:
#=========
#Imports (region lists, firms_ric_list and d_bs_yr_dir are defined elsewhere in my script)
import os
import eikon as ek
import pandas as pd
#=========
#Regions to be downloaded
#You can take out the regions that you do not need anymore here
region_list = [g7, adv_other, asia, lac, asia_CN, asia_IN] #each region has the ISO2 codes of the countries from which I am getting the data
#=========
#Variables to be downloaded
fields_series = [
    'TR.HeadquartersCountry', 'TR.TotalAssetsReported', 'TR.TotalDebtOutstanding',
    'TR.TotalRevenue', 'TR.TotalLiabilities',
    'TR.CashandEquivalents', 'TR.CashandSTInvestments', 'TR.PptyPlantEqpmtTtlGross',
    'TR.PropertyPlantEquipmentTotalNet', 'TR.CapitalExpenditures',
    'TR.InterestExpense', 'TR.CostofRevenueTotal', 'TR.SgaExpenseTotal', 'TR.TotalEquity',
    'TR.CapitalExpendituresCFStmt', 'TR.TotalCurrentAssets', 'TR.TotalCurrentLiabilities',
    'TR.TotalLongTermDebt', 'TR.HistPE',
    'TR.EBITDA', 'TR.EBIT', 'TR.ROATotalAssetsPercent', 'TR.TotalInventory',
    'TR.NetIncomeCFStmt', 'TR.LTDebt', 'TR.TangibleBookValueRptd', 'TR.HQCountryCode',
]
#=========
#Time period of the data (yearly)
periods_yr = ['CY' + str(yr) for yr in range(1999, 2021)]
#=========
#Some values to store
currency = 'USD'
scale = 6
#=========
#Downloading the data
for region in region_list:
    #saving the name of the region
    region_name = [k for k, v in locals().items() if v is region][0]
    print("Refinitiv is downloading data to " + region_name)
    for j in periods_yr:
        #market capitalization, requested with SDate
        df1, err = ek.get_data(firms_ric_list[region_name],
                               'TR.CompanyMarketCap',
                               {'Scale': scale,
                                'SDate': j,
                                'Curn': currency,
                                'NULL': 'BLANK',
                                'FXRate': 'PeriodEnd'})
        #all other variables, requested with Period
        dfj, err = ek.get_data(firms_ric_list[region_name],
                               fields_series,
                               {'Scale': scale,
                                'Curn': currency,
                                'ConsolBasis': 'Consolidated',
                                'ReportType': 'Final',
                                'reportingstate': 'Rsdt',
                                'NULL': 'BLANK',
                                'FXRate': 'PeriodEnd',
                                'Period': j})
        df1 = pd.merge(df1, dfj, on='Instrument')
        #including all other vars in the excel sheet, one sheet per year
        with pd.ExcelWriter(os.path.join(d_bs_yr_dir, 'bs_' + region_name + '.xlsx'),
                            engine='openpyxl', mode='a') as writer:
            df1.to_excel(writer, sheet_name=j, index=True)
        #printing the batch name
        print(j + " was downloaded")
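As a side note on how I diagnose the gaps: after merging df1 and dfj for one period, plain pandas can count the blanks per variable and list the instruments that came back completely empty, which is how a fully missing year shows up immediately. A minimal sketch with made-up values (the column names here are illustrative, not the actual TR field headers):

```python
import pandas as pd

# Hypothetical merged result for one period: one firm has partial data,
# another came back completely blank
df1 = pd.DataFrame({
    'Instrument': ['000001.SZ', '000002.SZ', '600781.SS'],
    'Total Assets': [1200.0, None, None],
    'Total Revenue': [300.0, 450.0, None],
})

# Count missing values per field for this period
missing = df1.drop(columns='Instrument').isna().sum()
print(missing)

# List the instruments where every field is blank
all_blank = df1[df1.drop(columns='Instrument').isna().all(axis=1)]
print(all_blank['Instrument'].tolist())  # ['600781.SS']
```

If the count for a field equals the number of instruments in a region for a given CY, that whole variable-year is missing.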
Best Answer
I have tested the following code with 'CY2000'.
currency = 'USD'
scale = 6
j= 'CY2000'
rics = ['600781.SS',
'000001.SZ',
'000002.SZ',
'000004.SZ',
'000005.SZ',
'000008.SZ',
'000009.SZ',
'000010.SZ',
'000011.SZ',
'000012.SZ',
'000014.SZ',
'000016.SZ',
'000017.SZ',
'000018.SZ',
'000020.SZ',
'000021.SZ',
'000023.SZ',
'000025.SZ']
df1, err = ek.get_data(rics,
'TR.CompanyMarketCap',
{
'Scale': scale,
'SDate': j,
'Curn' : currency,
'NULL':'BLANK',
'FXRate':'PeriodEnd'
}
)
dfj, err = ek.get_data(rics,
fields_series,
{
'Scale': scale,
'Curn' : currency,
'ConsolBasis':'Consolidated',
'ReportType':'Final',
'reportingstate':'Rsdt',
'NULL':'BLANK',
'FXRate':'PeriodEnd',
'Period': j
                        })
I can get the data properly.
To verify the content, you need to use Eikon Excel to retrieve the same data.
If both Eikon Excel and Eikon Data API are unable to retrieve the data, it could be a content issue. Therefore, you need to contact the content support team via MyRefinitiv to verify the data.
If the problem only occurs with Eikon Data API, please share RICs that you are using so we can verify the problem.
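To narrow down which RICs to report, the add-in export and the API download can also be compared programmatically once both are loaded into dataframes: a `pandas.merge` with `indicator=True` flags rows present in one source but absent or blank in the other. A sketch with illustrative data (the column name 'Total Assets' is a placeholder, not the actual field header):

```python
import pandas as pd

# Illustrative data: values retrieved via the Excel add-in vs. via the API
addin = pd.DataFrame({'Instrument': ['000001.SZ', '000002.SZ', '000004.SZ'],
                      'Total Assets': [1200.0, 980.0, 410.0]})
api = pd.DataFrame({'Instrument': ['000001.SZ', '000002.SZ'],
                    'Total Assets': [1200.0, None]})

merged = addin.merge(api, on='Instrument', how='outer',
                     suffixes=('_addin', '_api'), indicator=True)

# RICs missing entirely from the API result
missing_rics = merged.loc[merged['_merge'] == 'left_only', 'Instrument']
print(missing_rics.tolist())  # ['000004.SZ']

# RICs where the add-in has a value but the API came back blank
blank_api = merged.loc[merged['Total Assets_addin'].notna()
                       & merged['Total Assets_api'].isna()
                       & (merged['_merge'] == 'both'), 'Instrument']
print(blank_api.tolist())  # ['000002.SZ']
```

The resulting RIC lists are exactly what would be useful to attach when reporting the problem.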