Automatic Decompression in HTTP request for Tick History Time and Sales report
Hi there!
I'm trying to download the Tick History Time and Sales report via R library httr.
Apparently, the GET method decompresses it automatically, which could be a real problem for large reports.
Is there a way to work around this?
I read on this page that if the request doesn't include an "Accept-Encoding" header, the response shouldn't carry a "Content-Encoding" header and the client shouldn't decompress the body. But that is not what I observe: the "Content-Encoding" header is still present.
So I don't know whether the problem is with R or with the API.
Best Answer
The TRTH servers deliver compressed (or uncompressed) data depending on several factors, as described under the heading "Compression" on this page. For an On Demand Time and Sales extraction I believe it always delivers compressed data, whatever you set in the GET header.
What does the response header contain? I would expect it to contain this:
Content-Encoding: gzip
Content-Type: text/plain
If that is the case, the content is compressed.
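If you want to verify this from R, you can inspect the headers on the response object. A minimal sketch, assuming any endpoint that serves gzip-encoded data (httpbin.org/gzip is just an illustrative stand-in for your own TRTH request):

```r
library(httr)

# Illustrative endpoint only; substitute your own TRTH request URL.
r <- GET("https://httpbin.org/gzip")

# httr stores header names in lower case
headers(r)[["content-encoding"]]  # "gzip" if the payload was compressed
headers(r)[["content-type"]]
```

Note that even when these headers show gzip, httr will by default have decompressed the body already; the headers reflect what the server sent, not the state of the content in memory.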
But some HTTP clients automatically decompress compressed data when they receive it (Postman does that); I guess you have run into that as well. You must disable httr's content decoding, using "config(http_content_decoding=0)" in the GET call.
Here is a code snippet that does it:
library(httr)

TRTHRawExtractionResults <- function(token, jobid, Path, Overwrite = TRUE) {
  url <- paste0("https://hosted.datascopeapi.reuters.com/RestApi/v1/Extractions/RawExtractionResults('", jobid, "')/$value")
  # http_content_decoding = 0 keeps the gzip payload as delivered;
  # write_disk() streams it to file instead of holding it in memory
  r <- GET(url,
           add_headers(prefer = "respond-async", Authorization = token),
           config(http_content_decoding = 0),
           write_disk(Path, Overwrite),
           progress())
  stop_for_status(r)
  return(r)
}

This is an extract from this article, which walks through a complete time and sales extraction in R.
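As a hypothetical usage example (the token, job id, and output path below are placeholders, not values from the article), the file written to disk stays gzip-compressed, so you can decompress it lazily through a gzip connection when you read it:

```r
library(httr)

# Placeholders: obtain a real token and job id from the authentication
# and extraction steps described in the article.
token <- "Token <your_token_here>"
jobid <- "<your_job_id>"

res <- TRTHRawExtractionResults(token, jobid, "TimeAndSales.csv.gz")

# The downloaded file is still gzip-compressed; read it via a gzip connection:
df <- read.csv(gzfile("TimeAndSales.csv.gz"))
head(df)
```

Keeping the payload compressed on disk is the point of disabling content decoding: a large report stays small until you actually read it.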
If that does not help, please post your code so we can have a look at it.
Answers
Thank you very much! It is working perfectly now.