For a deeper look into our Eikon Data API, please refer to:

Overview |  Quickstart |  Documentation |  Downloads |  Tutorials |  Articles


Error when using get_timeseries()

Hi there

I am currently trying to implement a Python program to download time series data through the Eikon Scripting API. For example, I want to download the minute-by-minute high price for certain US stocks (e.g. IBM, AAPL.O). The code is pretty straightforward, such as

tmp = ek.get_timeseries(chunk, fields=field,
                        start_date=today, end_date=dt.datetime.utcnow(),
                        interval='minute', corax='adjusted',
                        raw_output=True)

where `chunk` is `['IBM', 'AAPL.O']`. In practice, I want to download as many stocks as possible, hence the length of `chunk` can be >1000.
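
For context, here is roughly how I build and loop over the chunks; the RIC universe, chunk size, app key and the `HIGH` field below are illustrative placeholders rather than my actual values:

import datetime as dt
import eikon as ek

ek.set_app_key('YOUR_APP_KEY')  # placeholder; use your own Eikon app key

# Illustrative universe; in practice this list holds well over 1000 RICs.
universe = ['IBM', 'AAPL.O', 'CSCO.O', 'MSFT.O']
chunk_size = 2                  # illustrative only
field = 'HIGH'                  # minute-by-minute high price
today = dt.datetime.utcnow().replace(hour=0, minute=0, second=0, microsecond=0)

results = []
for i in range(0, len(universe), chunk_size):
    chunk = universe[i:i + chunk_size]
    results.append(ek.get_timeseries(chunk, fields=field,
                                     start_date=today,
                                     end_date=dt.datetime.utcnow(),
                                     interval='minute', corax='adjusted',
                                     raw_output=True))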

However, I am running into the following error:

... File "C:\ProgramData\Anaconda2\lib\site-packages\eikon\time_series.py", line 155, in get_timeseries
    result = eikon.json_requests.send_json_request(TimeSeries_UDF_endpoint, payload, debug=debug)
  File "C:\ProgramData\Anaconda2\lib\site-packages\eikon\json_requests.py", line 82, in send_json_request
    check_server_error(result)
  File "C:\ProgramData\Anaconda2\lib\site-packages\eikon\json_requests.py", line 130, in check_server_error
    raise requests.HTTPError(error_message, response=server_response)
HTTPError: Failed to deserialize backend response: invalid character 'E' looking for beginning of value

The weirdest part is that, with the same code, I only run into this error sporadically, so I am guessing I am hitting some kind of limit?

Thanks!

Tags: eikon | eikon-data-api | workspace | workspace-data-api | refinitiv-dataplatform-eikon | python | pricing


The error comes from the Web service delivering timeseries data, and it's not related to any limits. I can easily reproduce the error requesting a single row for a single RIC:
ek.get_timeseries('AAPL.O', count=1, interval='minute')
The issue seems to be limited to certain RICs and certain time periods, e.g. I have no problem retrieving
ek.get_timeseries('IBM', count=1, interval='minute')
or
ek.get_timeseries('AAPL.O',count=1, end_date='2017-09-29T10:44:25.500903-04:00' , interval='minute').
However
ek.get_timeseries('AAPL.O',count=1, end_date='2017-09-29T14:44:25.500903-04:00' , interval='minute')
and
ek.get_timeseries('CSCO.O',count=1, end_date='2017-09-28T14:44:25.500903-04:00' , interval='minute')
return the same error.
The issue also seems to be intermittent. A few minutes after writing the above, I could no longer reproduce the error with the exact calls that had produced it.
I'm escalating the issue.


Thanks Alex! So in the meantime, shall I just bet on my luck and use a 'try' clause to attempt the call multiple times?


Just a followup:

I still rely on 'luck' to get the minute-by-minute data for the >2000 stocks that I am tracking. In order to increase my odds, I simply use a 'try... except...' clause to make multiple attempts, roughly as sketched below. I also set the timeout to 10 seconds in order to avoid the 'request timeout' error (which I've seen a couple of times).
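
For anyone interested, a minimal sketch of the retry wrapper I mean; the helper name, attempt count and wait time here are arbitrary choices of mine:

import time
import requests
import eikon as ek

def get_timeseries_with_retries(rics, max_attempts=5, wait_seconds=10, **kwargs):
    # Retry ek.get_timeseries() a few times before giving up;
    # max_attempts and wait_seconds are arbitrary illustrative values.
    for attempt in range(1, max_attempts + 1):
        try:
            return ek.get_timeseries(rics, **kwargs)
        except requests.HTTPError as err:
            print('Attempt %d failed: %s' % (attempt, err))
            time.sleep(wait_seconds)
    raise RuntimeError('All attempts failed for %s' % rics)

# Example usage with the same keyword arguments as before:
# tmp = get_timeseries_with_retries(chunk, fields=field, interval='minute',
#                                   corax='adjusted', raw_output=True)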

So far, I am able to download the minute-by-minute data for 1000 stocks in one get_timeseries() call (sometimes after multiple attempts).

However, I do see a lot of NaNs in the downloaded data; I will open a new thread to describe the details.

