Continuous 500 Internal Server Error and Gateway Time-out errors
Hi all, I have a few Python models that have run into updating issues over the past three weeks or so. On some days the models update just fine, but on other days (like today) I keep getting either a 500 Internal Server Error or a Gateway Time-out error.
----------------------------------------------------------------------------------------------
Here is a quick example of the code I am using to pull in the data:
import datetime as dt
import pandas as pd
import eikon as ek

lookback_days = 240  # You can change this number
today = dt.datetime.today()
start_date = (today - dt.timedelta(days=lookback_days)).strftime('%Y-%m-%d')
end_date = (today - dt.timedelta(days=1)).strftime('%Y-%m-%d')  # yesterday's date
pairs_rics = ['AUDCAD=R', 'AUDCHF=R', 'AUDNZD=R', 'AUDJPY=R', 'AUD=',
              'CADCHF=R', 'CADJPY=R',
              'NZDCAD=R', 'NZDCHF=R', 'NZDJPY=R', 'NZD=',
              'EURAUD=R', 'EURCAD=R', 'EURCHF=R', 'EURJPY=R', 'EURNZD=R', 'EURGBP=R', 'EUR=',
              'GBPAUD=R', 'GBPCAD=R', 'GBPCHF=R', 'GBPJPY=R', 'GBPNZD=R', 'GBP=',
              'CHFJPY=R',
              'CHF=', 'JPY=', 'CAD=',
              'US2YT=RR', 'DE2YT=RR', 'GB2YT=RR', 'JP2YT=RR', 'AU2YT=RR', 'NZ2YT=RR', 'CA2YT=RR', 'CH2YT=RR',
              'US10YT=RR', 'DE10YT=RR', 'GB10YT=RR', 'JP10YT=RR', 'AU10YT=RR', 'NZ10YT=RR', 'CA10YT=RR', 'CH10YT=RR']
final_data = pd.DataFrame()
for pair in pairs_rics:
    data = ek.get_timeseries(pair,
                             fields=["CLOSE"],
                             start_date=start_date,
                             end_date=end_date,  # use end_date, not today
                             interval='daily')
    # collect each RIC's closes side by side
    final_data = pd.concat([final_data, data.rename(columns={'CLOSE': pair})], axis=1)
----------------------------------------------------------------------------------------------
I was told that I needed to reduce the number of RICs and the lookback period, but even after limiting the request to a single RIC and just 5 lookback days I still got the same error.
Is there any advice on how I can solve this?
Someone suggested I move away from ek.get_timeseries and use rd.get_history instead, but I am struggling to reproduce the same output as before.
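For context, here is roughly the shape mismatch I am running into, mocked up with dummy data (the RIC/field MultiIndex column layout is my assumption about what rd.get_history returns for multiple RICs and fields):

```python
import pandas as pd

# Dummy frame shaped like a multi-RIC, multi-field rd.get_history result:
# columns are a (RIC, field) MultiIndex rather than one column per RIC.
idx = pd.date_range("2024-01-01", periods=3, freq="D")
cols = pd.MultiIndex.from_product([["EUR=", "GBP="], ["BID", "ASK"]])
wide = pd.DataFrame(
    [[1.09, 1.10, 1.27, 1.28],
     [1.08, 1.09, 1.26, 1.27],
     [1.10, 1.11, 1.28, 1.29]],
    index=idx, columns=cols)

# Selecting one field and dropping that column level gives the
# one-column-per-RIC layout that ek.get_timeseries returned.
close_like = wide.xs("BID", axis=1, level=1)
print(close_like.columns.tolist())  # ['EUR=', 'GBP=']
```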
Any advice would be appreciated.
Regards.
Best Answer
Hi @Arries,
Having had a look at this article, I thought it best to try the code below. Please let me know if it helps. Indeed, it is best done with rd, but only via the Content Layer:
import datetime as dt
import pandas as pd
import refinitiv.data as rd

# Open a session either via a configuration file ...
# rd.open_session(config_name="C:/Example.DataLibrary.Python-main/Configuration/refinitiv-data.config.json")  # https://github.com/LSEG-API-Samples/Example.DataLibrary.Python/blob/main/Configuration/refinitiv-data.config.json
# ... or directly against the desktop session:
rd.open_session("desktop.workspace")

lookback_days = 10
today = dt.datetime.today()
start_date = (today - dt.timedelta(days=lookback_days)).strftime('%Y-%m-%d')
end_date = (today - dt.timedelta(days=1)).strftime('%Y-%m-%d')  # yesterday's date

pairs_rics = [
    'AUDCAD=R', 'AUDCHF=R', 'AUDNZD=R', 'AUDJPY=R', 'AUD=',
    'CADCHF=R', 'CADJPY=R', 'NZDCAD=R', 'NZDCHF=R', 'NZDJPY=R', 'NZD=',
    'EURAUD=R', 'EURCAD=R', 'EURCHF=R', 'EURJPY=R',
    'EURNZD=R', 'EURGBP=R', 'EUR=',
    'GBPAUD=R', 'GBPCAD=R', 'GBPCHF=R', 'GBPJPY=R', 'GBPNZD=R', 'GBP=',
    'CHFJPY=R', 'CHF=', 'JPY=', 'CAD=',
    'US2YT=RR', 'DE2YT=RR', 'GB2YT=RR', 'JP2YT=RR', 'AU2YT=RR',
    'NZ2YT=RR', 'CA2YT=RR', 'CH2YT=RR',
    'US10YT=RR', 'DE10YT=RR', 'GB10YT=RR', 'JP10YT=RR',
    'AU10YT=RR', 'NZ10YT=RR', 'CA10YT=RR', 'CH10YT=RR']

def chunks(lst, n):
    """Return successive n-sized chunks from lst."""
    _lst = []
    for i in range(0, len(lst), n):
        _lst.append(lst[i:i + n])
    return _lst

batched_rics = chunks(lst=pairs_rics, n=10)

frames = []
for batch in batched_rics:
    df = rd.content.historical_pricing.summaries.Definition(
        universe=batch,
        start=start_date,
        end=end_date,
        interval='P1D',  # or 'PT1M', 'PT10M', rd.content.historical_pricing.Intervals.DAILY
        fields=['TRDPRC_1', 'SETTLE', 'BID', 'ASK']).get_data().data.df  # other field options: 'LST_TRD_PR', 'CF_LAST', 'CF_CLOSE'
    frames.append(df)

# DataFrame.append was removed in pandas 2.0; stitch the batches
# together column-wise with pd.concat instead.
final_data = pd.concat(frames, axis=1)
display(final_data)
rd.close_session()
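The batch-and-stitch pattern above can be sanity-checked without a live session; this sketch reuses the same chunks helper with stand-in per-batch frames and joins them with pd.concat (all names and values are illustrative):

```python
import pandas as pd

def chunks(lst, n):
    """Return successive n-sized chunks from lst."""
    return [lst[i:i + n] for i in range(0, len(lst), n)]

rics = [f"RIC{i}=" for i in range(25)]
batches = chunks(rics, 10)  # 3 batches: 10, 10, 5

idx = pd.date_range("2024-01-01", periods=4, freq="D")
frames = []
for batch in batches:
    # Stand-in for one historical_pricing request: one column per RIC.
    frames.append(pd.DataFrame(1.0, index=idx, columns=batch))

# Stitch the batches back together column-wise on the shared date index.
final_data = pd.concat(frames, axis=1)
print(final_data.shape)  # (4, 25)
```

If the real requests return frames whose date indexes differ per batch, pd.concat aligns them on the union of the indexes and fills gaps with NaN.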