AWS redirect for MarketDepth data retrieval does not work (No 'Location' provided on response)
Hi
After making some minor changes (log traces, error handling, a URL replacement, and a return value) to the awsRetrieveDataAndSaveToFile method found in the DSSOnDemandIntradayBarsRTH.java sample file, my method looks as shown below:
public boolean extractAndSaveFileFromAWSServer(String filename, String extractedFileId) throws DataAccessException {
    logger.info("Retrieve compressed data directly from the AWS server into local file '" + filename + "'");
    //final String url = getConfig().getUrl() + "/Extractions/RawExtractionResults('" + extractedFileId + "')/$value";
    final String url = getConfig().getUrl() + "/Extractions/ExtractedFiles('" + extractedFileId + "')/$value";
    logger.info("URL for extraction file retrieval: " + url);
    HttpGet requestGet = new HttpGet(url);
    requestGet.addHeader("Authorization", "Token " + getSessionToken());
    requestGet.addHeader("Prefer", "respond-async");
    //The next header asks for the download to occur through AWS:
    requestGet.addHeader("X-Direct-Download", "true");
    try (CloseableHttpClient httpclient = HttpClientBuilder.create().disableContentCompression().build()) {
        HttpClientContext context = HttpClientContext.create();
        HttpResponse responseGet = httpclient.execute(requestGet, context);
        //Handle the redirection:
        //Initialise awsURI to the initial URI (i.e. the same value as that of url):
        URI awsURI = requestGet.getURI();
        //Retrieve the last Location, which is the AWS URL:
        List<URI> locations = context.getRedirectLocations();
        if (locations != null) {
            awsURI = locations.get(locations.size() - 1);
            logger.info("AWS URI for file '" + filename + "' ('" + extractedFileId + "') is '" + awsURI + "'");
        } else {
            throw new IOException("ERROR: could not retrieve the AWS URI");
        }
        URL myURL = awsURI.toURL();
        //To request the file from AWS we do not use the session token (which is for the RTH server).
        //Authentication is done through the AWS URL, which is a pre-signed URL.
        HttpURLConnection myURLConnection = (HttpURLConnection) myURL.openConnection();
        //This works with both compressed and non-compressed files:
        try (DataInputStream readerIS = new DataInputStream(myURLConnection.getInputStream())) {
            Files.copy(readerIS, Paths.get(filename));
        }
        logger.info("File '" + filename + "' was saved to disk (AWS).");
    } catch (IOException e) {
        //ClientProtocolException is a subclass of IOException, so a single catch suffices.
        throw new DataAccessException("Error when reading/saving the extraction file " + extractedFileId + "!", e);
    }
    return true;
}
Unfortunately, we are getting an IOException ("ERROR: could not retrieve the AWS URI"):
2022-03-01 08:50:27,274 ERROR [main] - com.apama.iaf.transport.reuters.trth2.test.TestClientDepth: AdapterException occured!
com.apama.iaf.transport.reuters.trth2.exception.DataAccessException: Error when reading/saving the extraction file VjF8MHgwN2ViNWRhNjM1MWRhMjIyfA!
at com.apama.iaf.transport.reuters.trth2.api.ApiHelper.extractAndSaveFileFromAWS(ApiHelper.java:909)
at com.apama.iaf.transport.reuters.trth2.api.ApiHelper.saveAndReadReportExtractionsFiles(ApiHelper.java:778)
at com.apama.iaf.transport.reuters.trth2.handler.AbstractRequestHandlerImpl.process(AbstractRequestHandlerImpl.java:167)
at com.apama.iaf.transport.reuters.trth2.handler.AbstractRequestHandlerImpl.processRequest(AbstractRequestHandlerImpl.java:115)
at com.apama.iaf.transport.reuters.trth2.test.TestClientDepth.main(TestClientDepth.java:129)
Caused by: java.io.IOException: ERROR: could not retrieve the AWS URI
at com.apama.iaf.transport.reuters.trth2.api.ApiHelper.extractAndSaveFileFromAWS(ApiHelper.java:888)
... 4 more
which is raised where the availability of the redirect locations is checked, as shown in the following code excerpt:
if (locations != null) {
    awsURI = locations.get(locations.size() - 1);
    logger.info("AWS URI for file '" + filename + "' ('" + extractedFileId + "') is '" + awsURI + "'");
} else {
    throw new IOException("ERROR: could not retrieve the AWS URI");
}
Any idea, @Christiaan Meihsl, about what's causing the issue?
Thanks in advance for your help.
Best Answer
You are using the /Extractions/ExtractedFiles('{extractedFileId}')/$value endpoint.
As far as I know, some files, such as Notes files, don't support AWS download.
You may use Postman to verify the problem by using the HTTP GET method with this URL: https://selectapi.datascope.refinitiv.com/RestApi/v1/Extractions/ExtractedFiles('VjF8MHgwN2ViNWRhNjM1MWRhMjIyfA')/$value.
The HTTP headers are:
- Authorization: Token <session token>
- Prefer: respond-async
- X-Direct-Download: true
The request will be redirected to AWS.
Otherwise, you can add code to log the response status code and headers, to verify whether a redirect actually occurred.
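As a rough sketch of such a check (the RedirectProbe class name and probe helper are illustrative, not part of the DSS samples), one can issue a single GET with automatic redirect-following disabled and log the status code and Location header. If the status is not a 3xx code, DSS is serving the body directly rather than redirecting to AWS:

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class RedirectProbe {

    // True for HTTP status codes that indicate a redirect.
    static boolean isRedirect(int status) {
        return status == 301 || status == 302 || status == 303
                || status == 307 || status == 308;
    }

    // Issues a single GET without following redirects and prints the
    // status code plus the Location header (if any). The URL and token
    // are placeholders: substitute your own DSS endpoint and session token.
    static void probe(String url, String token) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setInstanceFollowRedirects(false); // surface the 302 instead of following it
        conn.setRequestProperty("Authorization", "Token " + token);
        conn.setRequestProperty("Prefer", "respond-async");
        conn.setRequestProperty("X-Direct-Download", "true");
        int status = conn.getResponseCode();
        System.out.println("Status: " + status);
        System.out.println("Location: " + conn.getHeaderField("Location"));
        if (!isRedirect(status)) {
            System.out.println("No redirect: the body is served directly by DSS.");
        }
        conn.disconnect();
    }

    public static void main(String[] args) throws Exception {
        if (args.length == 2) {
            probe(args[0], args[1]); // e.g. the ExtractedFiles(...)/$value URL and a session token
        }
    }
}
```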
Answers
Thank you, @Jirapongse, for your reply.
We updated our code to request the Notes files from, let's say, the "Refinitiv side" and the extraction file from the "AWS side". When trying to download the latter, we get an unexpected SSLPeerUnverifiedException with the message shown below:
javax.net.ssl.SSLPeerUnverifiedException: Host name 'a205143-use1-prod-results-custom.s3.amazonaws.com' does not match the certificate subject provided by the peer (CN=*.s3.amazonaws.com, O="Amazon.com, Inc.", L=Seattle, ST=Washington, C=US)
The full stack trace is also provided:
2022-03-01 16:14:17,933 ERROR [main] - com.apama.iaf.transport.reuters.trth2.test.TestClientDepth: AdapterException occured!
com.apama.iaf.transport.reuters.trth2.exception.DataAccessException: Error when reading/saving the extraction file VjF8MHgwN2ViOTQ5MzQ3Y2RhMzI0fA!
at com.apama.iaf.transport.reuters.trth2.api.ApiHelper.extractAndSaveFileFromAWSServer(ApiHelper.java:912)
at com.apama.iaf.transport.reuters.trth2.api.ApiHelper.saveAndReadReportExtractionsFiles(ApiHelper.java:778)
at com.apama.iaf.transport.reuters.trth2.handler.AbstractRequestHandlerImpl.process(AbstractRequestHandlerImpl.java:167)
at com.apama.iaf.transport.reuters.trth2.handler.AbstractRequestHandlerImpl.processRequest(AbstractRequestHandlerImpl.java:115)
at com.apama.iaf.transport.reuters.trth2.test.TestClientDepth.main(TestClientDepth.java:129)
Caused by: javax.net.ssl.SSLPeerUnverifiedException: Host name 'a205143-use1-prod-results-custom.s3.amazonaws.com' does not match the certificate subject provided by the peer (CN=*.s3.amazonaws.com, O="Amazon.com, Inc.", L=Seattle, ST=Washington, C=US)
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.verifyHostname(SSLConnectionSocketFactory.java:466)
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:396)
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:354)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at com.apama.iaf.transport.reuters.trth2.api.ApiHelper.extractAndSaveFileFromAWSServer(ApiHelper.java:877)
... 4 more
Any idea about why we are getting this error?
Thanks
Ricardo
I found the "Problem with SSL subject matching in Apache's HttpClient" article on Medium.
It mentions that this issue was solved in Apache’s HttpClient version 4.5.
I tested it. If I use HttpClient 4.4.1, I get the following error.
javax.net.ssl.SSLPeerUnverifiedException: Host name 'a205143-use1-prod-results-custom.s3.amazonaws.com' does not match the certificate subject provided by the peer (CN=*.s3.amazonaws.com, O="Amazon.com, Inc.", L=Seattle, ST=Washington, C=US)
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.verifyHostname(SSLConnectionSocketFactory.java:465)
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:395)
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:353)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)
However, if I use HttpClient 4.5, the example runs fine.
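For a Maven-based project, upgrading means bumping the httpclient dependency to a 4.5.x release; a sketch of the dependency entry (4.5.13 is one such release, shown as an example):

```xml
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.5.13</version>
</dependency>
```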
One more time, thank you, @Jirapongse, for your reply.
Once the HttpClient jar was upgraded to the suggested version, everything ran fine.
Thanks
Regards,
Ricardo