Get historical expiry dates for futures
Using the Eikon API with Python, I'd like to retrieve the expiry dates of expired futures contracts. Let me explain: ek.get_data(instruments='Wc1', fields='EXPIR_DATE') successfully gives me the current expiry date of Wc1. This is already great, but I also want to know the expiry date of Wc1 on any given date in the past. Example: the expiry date of Wc1 on 1-Aug-2020 was 14-Sep-2020. How/where can I input a date or date range in order to get expiry dates?
NB: the related posts on this topic did not help me.
Best Answer
They told me it's not possible. In the end I looked up the rules for each separate future (which months it trades and on which day of the month it rolls over), then wrote a Python script to calculate all the expiry dates from 1980 until 2030 for each given future.
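For reference, that rule-based approach can be sketched in a few lines. The contract months and the "business day before the 15th" rule below are assumptions modelled on CBOT wheat; check each exchange's contract specifications (and add a holiday calendar) before relying on this:

```python
from datetime import date, timedelta

# Assumed rule (verify against the exchange specs): wheat trades March, May,
# July, September and December, and the last trading day is the business day
# before the 15th calendar day of the contract month.
CONTRACT_MONTHS = [3, 5, 7, 9, 12]

def last_business_day_before(d: date) -> date:
    """Step back from d until we hit a weekday (holidays ignored in this sketch)."""
    d -= timedelta(days=1)
    while d.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        d -= timedelta(days=1)
    return d

def expiry_dates(start_year: int, end_year: int) -> list:
    """All expiry dates implied by the assumed rule, for the given year range."""
    return [last_business_day_before(date(y, m, 15))
            for y in range(start_year, end_year + 1)
            for m in CONTRACT_MONTHS]

def front_month_expiry(on_date: date, expiries: list) -> date:
    """Expiry of the c1 (front) contract as seen on a given historical date."""
    return min(e for e in expiries if e >= on_date)
```

With these rules, front_month_expiry(date(2020, 8, 1), expiry_dates(2019, 2021)) reproduces the 14-Sep-2020 expiry from the question.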
Answers
You may need to contact the content support team via MyRefinitiv to get a chain RIC (prefixed with 0#) which can provide future RICs for Wc1. Then, you can use that chain RIC to get expiration dates. You can refer to this thread for the code.
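Assuming the chain RIC approach gives you a table of expired contract RICs with their EXPIR_DATE values, reconstructing which contract was the front month on a past date is then a simple filter. A sketch with purely illustrative sample data (the RICs and dates below are not real retrievals):

```python
import pandas as pd

# Hypothetical sample of what a get_data call on a chain RIC might return for
# expired wheat contracts (instrument names and dates are illustrative only).
df = pd.DataFrame({
    'Instrument': ['WN0', 'WU0', 'WZ0'],
    'EXPIR_DATE': ['2020-07-14', '2020-09-14', '2020-12-14'],
})
df['EXPIR_DATE'] = pd.to_datetime(df['EXPIR_DATE'])

def front_month_on(df: pd.DataFrame, as_of: str) -> pd.Series:
    """Row of the contract that was c1 (nearest unexpired) on the given date."""
    alive = df[df['EXPIR_DATE'] >= pd.Timestamp(as_of)]
    return alive.loc[alive['EXPIR_DATE'].idxmin()]

row = front_month_on(df, '2020-08-01')
```

Here row would identify WU0 (expiring 2020-09-14) as the front month on 1-Aug-2020, matching the example in the question.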
@Andreas Sure, but I don't have time to explain it, so you'll have to dig through it yourself.
Please note that the mappings.future_map table contains the future_codes (Eikon RICs) and contracts_available (an int which tells Python how many deliveries a future has ahead; if you don't know it, just set it to 12, which gives you 12 periods ahead).
Python script:

# purpose: retrieve daily close prices for all future contracts
import logging as logger
from datetime import datetime, timedelta
import time
import eikon as ek
import pandas as pd
from sqlalchemy import create_engine
import os
import sys
sys.path.insert(1, os.path.join(sys.path[0], '..'))
from sales_report_library import *
import warnings
warnings.simplefilter(action='ignore', category=FutureWarning)  # https://stackoverflow.com/questions/40659212/futurewarning-elementwise-comparison-failed-returning-scalar-but-in-the-futur

##### INPUT
query = "select distinct future_code,contracts_available from mappings.future_map where retrieve = 'dev';"
given_start_date_str = '1988-01-01'  # ran from 1988-01-01
given_end_date_str = '2020-09-20'    # at most yesterday
steps = 40  # steps*futures*contracts (40*5*13) must stay below 3000, because the API returns at most 3000 records
##### INPUT

##### INIT
scrape_date = datetime.today()
initialize_logger(logging_directory='log', console_level='INFO')
logger.info(scrape_date)
data_source = 'futures'
params = config(filename='../specific.ini', section='postgresql')  # DB settings
engine = create_engine(f"postgresql+psycopg2://{params['user']}:{params['password']}@{params['host']}:{params['port']}/{params['database']}")  # engine to write to the DB
given_start_date = datetime.strptime(given_start_date_str, '%Y-%m-%d')
given_end_date = datetime.strptime(given_end_date_str, '%Y-%m-%d')
all_contracts = list()

# connect to Eikon with the app key
params = config(filename='../generic.ini', section='eikon_api')
try:
    ek.set_app_key(params['apikey'])
except Exception:
    logger.error('Could not connect to Eikon. Probably logged out.')
    sys.exit(1)

# get metadata
logger.info('Get metadata from DB')
df_meta = sql_to_df(query)
future_codes = list(set(df_meta['future_code']))

# step 1) build a list with ALL the contracts we want, i.e. all contracts of all futures
logger.info('Loop over future codes')
for future_code in future_codes:
    logger.info(f'Iteration future_code={future_code}')
    df_meta_temp = df_meta.loc[df_meta['future_code'] == future_code]
    df_meta_temp.reset_index(drop=True, inplace=True)  # reset row index
    contracts_available = df_meta_temp.loc[0, 'contracts_available']
    contracts = [f'{future_code}c{i}' for i in range(1, contracts_available + 1)]  # Wc1, Wc2, etc.
    all_contracts = all_contracts + contracts
# end for loop

logger.info(f'Now going to retrieve all {len(all_contracts)} contracts for the {len(future_codes)} futures.')

# step 2) walk through the date range in chunks of `steps` days
start_date = given_start_date
while start_date <= given_end_date:
    end_date = start_date + timedelta(days=steps)
    if end_date > given_end_date:  # if we overshot the range
        end_date = given_end_date
    start_date_str = start_date.strftime('%Y-%m-%d')
    end_date_str = end_date.strftime('%Y-%m-%d')
    # make the API call for these dates. get_timeseries returns the dates in the rows
    # (from start to end date) and all the contracts in the columns; the values are the CLOSE prices.
    logger.info(f'############################ Retrieving rates from {start_date_str} to {end_date_str}...')
    try:
        df = ek.get_timeseries(all_contracts, start_date=start_date_str, end_date=end_date_str, interval="daily", fields="CLOSE")
    except Exception:
        logger.warning('Call crashed. Sleeping 60 seconds, then retrying this iteration.')
        time.sleep(60)
        ek.set_app_key(params['apikey'])
        continue  # back to the while loop; start_date is unchanged, so exactly the same iteration runs again

    # clean the data
    logger.info('Made the API call. Now clean it.')
    df.insert(loc=0, column='future_date', value=df.index)  # move the dates from the index into a new column
    df_cleaned = pd.melt(df, id_vars='future_date', var_name='contract')  # unpivot the data
    df_cleaned = df_cleaned.loc[(df_cleaned['value'] != '') & (pd.notnull(df_cleaned['value']))]  # drop rows where 'value' is empty

    # insert into the database
    logger.info(f'Start inserting into postgres. {len(df_cleaned.index)} rows...')
    df_cleaned.to_sql(schema='stage', name='futures', con=engine, if_exists='append', index=False, chunksize=10000)  # append the dataframe to the target table
    logger.info('Finished inserting into postgres')

    start_date = end_date + timedelta(days=1)  # next iteration starts the day after the one we just finished
    logger.info('Sleeping...')
    time.sleep(30)  # so as not to overload Eikon
# end while loop