
1. get_timeseries cuts off historic data 2. get_data not working for all tickers

The two problems below only occur when using the API from R.

Everything works fine in Eikon Workstation and in the Eikon Excel add-in.


#PROBLEM 1
#get_timeseries() cuts off timeseries - irregularly
#Example tickers: IBM (17 Mar 1980), MSFT (13 Mar 1986), .IBLEM0002 (31 Oct 2010)
#Full historic data is available in Eikon and in Eikon Excel for all tickers

#Prepare
library(devtools)
library(eikonapir)
library(tidyr)
eikonapir::set_proxy_port(9000L)
eikonapir::set_app_id('Put your API Key here')

#Create tickers (run one of the following combinations at a time)
tickers = c("MSFT.O","IBM",".IBLEM0002")
tickers = c("MSFT.O",".IBLEM0002")
tickers = c("IBM",".IBLEM0002")
tickers = c(".IBLEM0002")

#Create tickers input list
ticker.list = vector("list", length(tickers))
ticker.list[1:length(tickers)] = tickers

#Timeframe
startDate = "2003-01-01T00:00:00"
endDate = paste("2020-03-27","T00:00:00",sep="")

#Get prices
p.out = get_timeseries(
  rics = ticker.list,
  fields = list("TIMESTAMP","CLOSE"),
  start_date = startDate,
  end_date = endDate,
  interval = "daily")
names(p.out) = c("Date","Close","Tickers")

#Convert from stacked to wide format
p.out = spread(data = p.out, key = Tickers, value = Close)
#Note: spread() orders the key columns alphabetically, so
#c("Date", sort(tickers)) would be a safer rename here
names(p.out) = c("Date",tickers)
head(p.out,3)
tail(p.out,3)


#RESULTS - different timeseries start dates depending on ticker combination
#If only .IBLEM0002 is included, the full series is returned
#If only MSFT.O or IBM is included, the timeseries is cut off at 2009-04-29
#If tickers = c("MSFT.O","IBM",".IBLEM0002"), ".IBLEM0002" starts 2016-03-11 and MSFT.O, IBM later
#If tickers = c("IBM",".IBLEM0002"), ".IBLEM0002" starts 2016-03-11 and IBM later
#If tickers = c("MSFT.O",".IBLEM0002"), ".IBLEM0002" starts 2014-03-05 and MSFT.O later


#PROBLEM 2
#get_data() returns the full timeseries for IBM and MSFT.O, BUT nothing for .IBLEM0002
#Full historic data is available in Eikon and in Eikon Excel also for .IBLEM0002 (start in 2010)

startDate = "2003-01-01"
endDate = "2020-03-27"
p.out = get_data(
  instruments = ticker.list,
  fields = list("TR.PriceClose.Date","TR.PriceClose"),
  parameters = list("Frq"="M","SDate"=startDate,"EDate"=endDate))
names(p.out) = c("Tickers","Date","Close")
p.out = spread(data = p.out, key = Tickers, value = Close)
head(p.out,3)
tail(p.out,3)


I need to download the full historic timeseries for a range of different ticker types. Please let me know if there is an error in my code or if there is a different, more consistent approach.


Thank you very much in advance

Tags: eikon, eikon-data-api, workspace, workspace-data-api, refinitiv-dataplatform-eikon, python, historical, r

You can cut down the number of requests by retrieving multiple fields in a single request. Using the following code I retrieve 13 years of daily price history for the constituents of S&P 500 in about 8 minutes. In this code spx is the list of RICs for the constituents of S&P 500.

import eikon as ek
import pandas as pd

# spx is the list of RICs for the S&P 500 constituents
frames = []
chunk_size = 50  # RICs per request
for i in range(0, len(spx), chunk_size):
    rics = spx[i : i + chunk_size]
    df, err = ek.get_data(rics,
                          ['TR.CLOSEPRICE.date', 'TR.CLOSEPRICE',
                           'TR.HIGHPRICE', 'TR.LOWPRICE',
                           'TR.OPENPRICE', 'TR.ACCUMULATEDVOLUME'],
                          {'SDate': '-13Y', 'EDate': '0D'})
    frames.append(df)
ts = pd.concat(frames, sort=False)  # DataFrame.append is deprecated; concat once at the end
ts

The resulting dataframe has 1.65 million rows.

I must admit I have no first-hand experience with Bloomberg. I can easily believe that with the Bloomberg API one can retrieve all these timeseries in a single request. Then again, adding the above code to break the list of instruments into chunks and retrieve the chunks in a loop is not a huge task, and you only need to do it once.

As for the data retrieval time, I find it very hard to believe that all this data can be retrieved in 5 seconds from any system. If that is not an exaggeration, it is mightily impressive. 8 minutes is not the best time you can possibly get out of Eikon, although further improvement would not be as straightforward as in the code sample above. And even with maximum optimization, I don't think we can get the retrieval time significantly under a minute.


For the first problem, this is likely due to the limits on Eikon Data APIs described in the EIKON DATA API USAGE AND LIMITS GUIDELINE:

get_timeseries: The current limit value (10-Oct-2019) is 3,000 data points (rows) for interday intervals and 50,000 data points for intraday intervals. This limit applies to the whole request, whatever the number of requested instrument. 
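Given that limit, one common workaround is to split the requested date range into windows that each stay under 3,000 rows and concatenate the results. A minimal sketch in Python (the helper below is an illustration, not part of the eikon library; calendar days are used as a conservative upper bound on trading days, and the commented usage assumes a running Eikon proxy and app key):

```python
from datetime import date, timedelta

ROW_LIMIT = 3000  # documented interday limit per get_timeseries request

def split_date_range(start, end, max_days=ROW_LIMIT):
    """Yield contiguous (chunk_start, chunk_end) pairs covering [start, end].

    Each chunk spans at most max_days calendar days, so a daily request for
    one instrument can never exceed max_days rows. Since the limit applies
    to the whole request, divide max_days by the number of instruments when
    requesting several RICs at once.
    """
    chunk_start = start
    while chunk_start <= end:
        chunk_end = min(chunk_start + timedelta(days=max_days - 1), end)
        yield chunk_start, chunk_end
        chunk_start = chunk_end + timedelta(days=1)

# Hypothetical usage (not run here; requires an Eikon proxy and app key):
# import eikon as ek
# import pandas as pd
# frames = [ek.get_timeseries(["IBM"], fields=["CLOSE"],
#                             start_date=str(s), end_date=str(e),
#                             interval="daily")
#           for s, e in split_date_range(date(2003, 1, 1), date(2020, 3, 27))]
# prices = pd.concat(frames)
```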

For the second problem, the TR.PriceClose field is not available for .IBLEM0002. You can use the Data Item Browser (DIB) tool to verify it.


[Attachment: 1585534759088.png (41.1 KiB)]

Thank you @irapongse.phuriphanvichai - a bit disappointing. This is far less than what Bloomberg allows, making the API less useful for even a simple analysis such as 10 years of S&P 500 data: 500 x 10 x 250 = 1,250,000 data points, or about 417 requests at 3,000 rows each - essentially looping over each constituent. Even analyzing a single stock over the past 20 years needs 2 requests and a merge.

Question: How is the community dealing with this problem? Thank you very much in advance.
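For reference, the arithmetic behind the request count above, checked in Python:

```python
import math

tickers, years, trading_days = 500, 10, 250
rows = tickers * years * trading_days  # data points for 10 years of S&P 500 daily closes
requests = math.ceil(rows / 3000)      # requests needed under the 3,000-row interday limit
print(rows, requests)                  # 1250000 417
```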


@HeikoRR
To retrieve timeseries of daily price history for exchange-traded instruments, including stocks and indices, you can use the get_data method with the TR.CLOSEPRICE field.


Hi Alex, thank you for your reply.


get_timeseries():

I answered you with respect to get_timeseries() here:

https://community.developers.refinitiv.com/questions/14078/data-limis-for-eikon-api-proxy.html?childToView=57588#comment-57588

There, the results for OHLC + Volume, daily, 13 years, for the S&P 500 constituents were:

EIKON: 5,000 requests and 2,500 merges of time series, 27.8 minutes

BLOOMBERG: 1 request, no merging, 5 seconds


get_data():

I tried to do the same using get_data(). The situation is a bit better, but still far worse than using the Bloomberg API. Assume again I want to do a quick analysis of the S&P 500 with

Open, Low, High, Close and Volume (5 timeseries), daily, starting in 2007 (13 years ago).

For the time period (2007-01-03 to 2020-03-25, 3,332 days) I can only request 63 to 64 tickers in one request (I tried many combinations). That amounts to 500 * 5 / 63 = 40 requests. If I request more, get_data() returns NULL without an error message.

EIKON: 40 requests, no merging of time series

BLOOMBERG: 1 request, no merging, 5 seconds

Note: Some tickers, such as .IBLEM0002, do not have the TR.PriceClose field and have to be retrieved using get_timeseries().
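That workaround can be sketched as a tiny dispatcher: prefer the TR.PriceClose route and fall back to a timeseries request when the field comes back empty. The fetch callables below are placeholders standing in for the real API calls (e.g. ek.get_data / ek.get_timeseries); they and the sample values are hypothetical, not part of the eikon or eikonapir libraries:

```python
def close_prices(ric, fetch_tr, fetch_ts):
    """Return (date, close) rows for ric, preferring the TR.PriceClose route.

    fetch_tr(ric) should return an empty list when TR.PriceClose is not
    populated for the instrument (as with .IBLEM0002); fetch_ts(ric) is
    then used as the get_timeseries-style fallback.
    """
    rows = fetch_tr(ric)
    return rows if rows else fetch_ts(ric)

# Stub fetchers with hypothetical values, standing in for the real API:
tr_data = {"IBM": [("2020-03-27", 106.1)]}
ts_data = {".IBLEM0002": [("2020-03-27", 250.0)]}
fetch_tr = lambda ric: tr_data.get(ric, [])
fetch_ts = lambda ric: ts_data.get(ric, [])
print(close_prices("IBM", fetch_tr, fetch_ts))         # served via TR.PriceClose
print(close_prices(".IBLEM0002", fetch_tr, fetch_ts))  # falls back to timeseries
```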


Please let me know if I am missing anything.


Hi Alex,

Yes, thank you again. Sure, it is easy to loop over chunks of RICs.

But why have these restrictions in the first place? It would be helpful if you could check internally whether these restrictions could be adapted so that a reasonable amount of data can be retrieved in a single request - the limits on get_timeseries in particular are way too low.

Thank you in advance, Heiko


Thanks again for the feedback. The reason for limiting how much data can be retrieved in a single request is to protect the platform from abuse and to protect users from accidentally exhausting their daily data retrieval limits, which is what we hear some users complain about with competitor products. That said, I agree that the 3,000-row limit for a single get_timeseries call with interday intervals is too restrictive. It is an unfortunate limitation of the legacy backend system that currently sits behind the get_timeseries method. As we develop and roll out the new Refinitiv Data Platform, it will replace the legacy timeseries system along with its limitations.
