
TimeSeries response time – is DSS really the right choice?

Hi all,

We have been in contact with the Refinitiv sales team, which suggested using DSS for our purpose. I'm starting to question whether this API is the right choice for our needs. We provide APIs that offer risk calculation functionality using daily pricing time series data for the assets in our customers' portfolios. Our setup is as follows:

1) We receive instrument identifiers from the customer (here we appreciate flexibility of DSS IdentifierType)
2) a. Search our local database for instrument time series existence

b. If not, collect via DSS (REST)

c. Store time series in local database and retrieve close price on a daily basis (backfill using time series endpoint in case of data gaps due to collection failure)

d. Additionally, we request a relatively small amount of meta information via DSS (company name, sector, industry, ...) which is stored in our local database as well

3) Calculate risk measures using multivariate time series as described in 2
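A minimal sketch of the lookup-then-fetch-then-cache flow in steps 2a-2c (an in-memory dict stands in for the local database, and `fetch_from_dss` is a hypothetical placeholder for the actual DSS REST extraction call):

```python
def get_time_series(identifier, cache, fetch_from_dss):
    """Return the price series for `identifier`, fetching and caching on a miss."""
    series = cache.get(identifier)           # step 2a: local database lookup
    if series is None:
        series = fetch_from_dss(identifier)  # step 2b: collect via DSS (REST)
        cache[identifier] = series           # step 2c: store locally for next time
    return series
```

On the second request for the same identifier, the DSS call is skipped entirely, which is what keeps the common case fast.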


The main issue is that DSS time series requests (step 2) take ~1-2 minutes for 10-20 instruments (likely to be more in the near future), which I see has previously been discussed in this forum. This in turn prevents us from providing an immediate response (step 3) to our customers.

I see 2 options to solve this:

1) Same setup but faster response time from Refinitiv

2) Preferred: replace the local database with a data provider completely, provided that response times are consistently <1 second (asset time series requests can be threaded). Meta information could take longer.

I'm wondering if somebody here has similar requirements or if the technical support staff has recommendations as to which API would fit our needs best?


Kind regards,
Marc

eikon-data-api | dss-rest-api | datascope-select | dss | time-series

Hi @marcjuchli, please see my responses to your questions below:

> Refinitiv Data Platform also has a Symbology Conversion endpoint.

I looked into this and see that it would probably work; however, it would introduce N conversion calls, where N = the number of assets in a portfolio.
My question here is: which symbol type do the time series endpoints expect (RIC?), and could we use outputIdentifierTypeId = RIC when calling symbology conversion?

Answer: I don't think it needs to introduce N conversion calls regularly, as you would just need to store RICs (and probably ISINs, CUSIPs, SEDOLs, etc. as well) once in your database. Only when new customers give you new instruments would you need to hit the symbology endpoint.
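The "store RICs once" idea could be sketched like this (a plain dict stands in for the persisted mapping, and `convert_symbols` is a hypothetical placeholder for the symbology conversion call): only identifiers never seen before trigger a conversion, so the call count shrinks from N to the number of genuinely new instruments.

```python
def resolve_rics(identifiers, ric_map, convert_symbols):
    """Resolve customer identifiers to RICs, converting only unseen ones."""
    unknown = [i for i in identifiers if i not in ric_map]
    if unknown:
        # one batched symbology call covering only the new instruments
        ric_map.update(convert_symbols(unknown))
    return {i: ric_map[i] for i in identifiers}
```

Repeat requests for a portfolio whose identifiers are all known resolve entirely from the local mapping, with no symbology call at all.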

> Are you doing a bulk scheduled extraction on a daily basis to get the closing prices into your database? Then you just make a new extraction every night to get the latest daily closing values and append it to your database -> therefore everything is always up-to-date.

I probably wasn't precise enough here, sorry. Yes we do bulk extraction, but would like to do so via REST API (FTP downloads wouldn't be convenient for us) and on our own schedule (e.g. on demand). However, we can only extract the assets that we already know are in use, which means we have to be adaptive in the sense that if a customer requests an ISIN that hasn't been downloaded yet, we'd get the time series on the fly. Our database is just a proxy to reduce latency when time series or meta data is requested. It's basically the way you assumed it in your response before.

> From your above description, I take it you have a RIC list (which is growing as new customers provide new instruments). I assume you are scheduling bulk extractions from DSS for the RICs you already have in your list. Then I suppose you are doing an on-demand extraction from DSS for new RICs, inserting that into your main database and then doing your daily risk calculations (I am assuming that you only calculate these once a day).

That's exactly right, except that we do risk calculations upon request (depending on the requested constellation of assets).

> I am unsure why you would need to backfill the data due to a collection failure? Could you explain this a little more. In my mind either you get the data or you do not.

This is because we collect, for the assets that we already "track" in our database, only the deltas (closing prices) from the last collection date to the current date. This collection process might fail, which means we'd have to go back and attempt to collect the missing time series data points. I understand this is a consequence of using REST instead of the FTP bulk, but it's more convenient for us this way at the moment.
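The backfill step described above could be sketched as a gap check: compare the dates already stored against the expected business days in the collection window, and request only the missing ones. A simplified sketch that ignores exchange holidays:

```python
from datetime import date, timedelta

def missing_business_days(stored_dates, start, end):
    """Business days in [start, end] that are absent from `stored_dates`."""
    have = set(stored_dates)
    day, gaps = start, []
    while day <= end:
        if day.weekday() < 5 and day not in have:  # Mon-Fri only
            gaps.append(day)
        day += timedelta(days=1)
    return gaps

# Example: collection failed on the Thursday of that week
stored = [date(2021, 3, 1), date(2021, 3, 2), date(2021, 3, 3), date(2021, 3, 5)]
gaps = missing_business_days(stored, date(2021, 3, 1), date(2021, 3, 5))
# gaps == [date(2021, 3, 4)]
```

The returned gap dates would then drive a targeted time series request instead of re-collecting the whole range.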

> We would need to know exactly which metadata fields you are requesting, but Refinitiv Data Platform will likely be able to deliver the kinds of metadata you would like.

That depends on the asset type.

In general:

  • Exchange Id
  • Exchange symbol
  • RIC
  • Description
  • Industry
  • Group
  • Sector
  • (Start/End date for expiring assets)

In addition, we require asset specific information in order to calculate the absolute (reinvested) prices. That is, mainly:

  • Corporate actions which are not incorporated in the adjusted price (e.g. dividends)

As well as for bonds:

  • Interest rate
  • Payment frequency
  • Coupon date
  • Bond type
  • Nominal Value
  • Rate calculation type
  • Day count method
  • Reference rate

for funds:

  • Fees

for structured products:

  • It's not implemented yet on our end, but it will likely require additional information
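The "absolute (reinvested) price" calculation for dividends not already in the adjusted price could be sketched as a simple total-return fold, where each cash payout buys additional fractional shares at the ex-date close. This is a simplified model that ignores taxes and payment-date conventions:

```python
def reinvested_series(closes, dividends):
    """closes: list of (date, close) in order; dividends: {date: cash amount}.
    Returns the series with every payout reinvested at the ex-date close."""
    shares = 1.0
    out = []
    for day, close in closes:
        # reinvest the payout: cash / close extra shares per share held
        shares *= 1.0 + dividends.get(day, 0.0) / close
        out.append((day, shares * close))
    return out

# A 2.0 dividend on day 2 lifts the reinvested value to 102 + 2:
series = reinvested_series([("d1", 100.0), ("d2", 102.0)], {"d2": 2.0})
# series[1][1] is approximately 104.0
```

Any corporate action expressed as a cash-per-share amount on an ex-date can be fed through the same fold.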

Question: are these fields available via RDP, or would we have to call DSS?

Answer: Unfortunately, I believe many of these fields are not currently available in RDP. Until they are available there, you would need to call DSS, where they will all be present.

> As Umer suggested, you can talk to your account manager about accessing the Refinitiv Data Platform (RDP). This has a very robust and performant time series service. For daily closing prices this is no issue at all to retrieve on demand. The only issue is the length of the time series request - I believe each request is limited to 10k points, which for a daily close is quite long - for other intervals you might require an iterative approach. The APIs are web APIs using JSON payloads. We also have Refinitiv Data Platform libraries in .NET and Python, with TypeScript wrappers coming shortly, to provide an additional ease-of-use layer. What language are you mainly coding in?

We are using Java, therefore an SDK would be ideal, but REST works as well. The limitation of 10k points per request is fine, since we can execute requests concurrently and aggregate on our end.
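Working within an assumed 10k-points-per-request cap could be sketched like this: split the total range into chunks, fetch them concurrently, and stitch the results back in order (`fetch_range` is a hypothetical placeholder for the actual time series call):

```python
from concurrent.futures import ThreadPoolExecutor

MAX_POINTS = 10_000  # assumed per-request limit

def fetch_chunked(total_points, fetch_range):
    """fetch_range(offset, count) -> list of points; chunks run in parallel."""
    chunks = [(off, min(MAX_POINTS, total_points - off))
              for off in range(0, total_points, MAX_POINTS)]
    with ThreadPoolExecutor(max_workers=8) as pool:
        results = pool.map(lambda args: fetch_range(*args), chunks)
    # map() preserves submission order, so concatenation keeps the series sorted
    return [point for chunk in results for point in chunk]
```

Because `map()` yields results in submission order, no re-sorting is needed after the concurrent fetches complete.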

Question: as far as I can see, the RDP time series API does not support retrieving time series for multiple symbols within one request - is that correct?

Answer: RDP Timeseries endpoint does support multiple RICs/symbols. RDP is built to handle concurrent requests.

-----

As far as I can see, RDP could be suitable, provided that responses are fast enough. That means we would avoid the daily DSS time series bulk request and instead submit time series requests for our universe to the RDP time series endpoint on a daily basis. Likewise, for missing assets, we would call the same endpoint when requested by our customers.

Then again, it would be important to ensure that the time series consists of adjusted prices, and that we find a way to retrieve, with an immediate response, the meta information that is not incorporated (see the list above).

Comment: I can't really advise on which would be better for you, but until RDP has all the data you require, it seems you would be looking at a hybrid of DSS bulk and possibly RDP time series (for new instrument data returned in a quick timeframe). As and when the data becomes available on RDP, you could then perhaps drop the DSS bulk part and shift entirely to RDP. The great things about RDP are its speed, robustness, and ease of use: standardised web APIs with multi-language library support. Further, it would allow you to target all our content (real-time & non-real-time) from one set of APIs. However, as it is a new platform, we are doing our best to onboard content as quickly as we can.

I hope this can help.



Hi @marcjuchli

Not a content expert and not really familiar with the DSS offering, but have you discussed the Refinitiv Data Platform Historical Pricing data offering with your account team? It may offer a more timely response than DSS.

I did have a brief discussion with another customer recently who was considering RDP Historical Pricing as an alternative to DSS - but it did not quite meet the customer's particular requirements.


Hi @marcjuchli

Please ignore - Refinitiv Tick History on Google Cloud BigQuery is a T+1 service.


@marcjuchli see my comments inline below

1) We receive instrument identifiers from the customer (here we appreciate flexibility of DSS IdentifierType). Refinitiv Data Platform also has a Symbology Conversion endpoint.


2) a. Search our local database for instrument time series existence

b. If not, collect via DSS (REST)

c. Store time series in local database and retrieve close price on a daily basis (backfill using time series endpoint in case of data gaps due to collection failure)

d. Additionally, we request a relatively small amount of meta information via DSS (company name, sector, industry, ...) which is stored in our local database as well

Are you doing a bulk scheduled extraction on a daily basis to get the closing prices into your database? Then you just make a new extraction every night to get the latest daily closing values and append it to your database -> therefore everything is always up-to-date.

We would need to know exactly which metadata fields you are requesting, but Refinitiv Data Platform will likely be able to deliver the kinds of metadata you would like.

I am unsure why you would need to backfill the data due to a collection failure? Could you explain this a little more. In my mind either you get the data or you do not.

3) Calculate risk measures using multivariate time series as described in 2

Do you require different time intervals for these risk measures or is it just daily closing prices for a number of instruments?

From your above description, I take it you have a RIC list (which is growing as new customers provide new instruments). I assume you are scheduling bulk extractions from DSS for the RICs you already have in your list. Then I suppose you are doing an on-demand extraction from DSS for new RICs, inserting that into your main database and then doing your daily risk calculations (I am assuming that you only calculate these once a day).

As Umer suggested, you can talk to your account manager about accessing the Refinitiv Data Platform (RDP). This has a very robust and performant time series service. For daily closing prices this is no issue at all to retrieve on demand. The only issue is the length of the time series request - I believe each request is limited to 10k points, which for a daily close is quite long - for other intervals you might require an iterative approach. The APIs are web APIs using JSON payloads. We also have Refinitiv Data Platform libraries in .NET and Python, with TypeScript wrappers coming shortly, to provide an additional ease-of-use layer. What language are you mainly coding in?

I hope this can help - I look forward to hearing your response.


Thank you for the quick reply! See my response below.


> Refinitiv Data Platform also has a Symbology Conversion endpoint.

I looked into this and see that it would probably work; however, it would introduce N conversion calls, where N = the number of assets in a portfolio.
My question here is: which symbol type do the time series endpoints expect (RIC?), and could we use outputIdentifierTypeId = RIC when calling symbology conversion?

> Are you doing a bulk scheduled extraction on a daily basis to get the closing prices into your database? Then you just make a new extraction every night to get the latest daily closing values and append it to your database -> therefore everything is always up-to-date.

I probably wasn't precise enough here, sorry. Yes we do bulk extraction, but would like to do so via REST API (FTP downloads wouldn't be convenient for us) and on our own schedule (e.g. on demand). However, we can only extract the assets that we already know are in use, which means we have to be adaptive in the sense that if a customer requests an ISIN that hasn't been downloaded yet, we'd get the time series on the fly. Our database is just a proxy to reduce latency when time series or meta data is requested. It's basically the way you assumed it in your response before.

> From your above description, I take it you have a RIC list (which is growing as new customers provide new instruments). I assume you are scheduling bulk extractions from DSS for the RICs you already have in your list. Then I suppose you are doing an on-demand extraction from DSS for new RICs, inserting that into your main database and then doing your daily risk calculations (I am assuming that you only calculate these once a day).

That's exactly right, except that we do risk calculations upon request (depending on the requested constellation of assets).

> I am unsure why you would need to backfill the data due to a collection failure? Could you explain this a little more. In my mind either you get the data or you do not.

This is because we collect, for the assets that we already "track" in our database, only the deltas (closing prices) from the last collection date to the current date. This collection process might fail, which means we'd have to go back and attempt to collect the missing time series data points. I understand this is a consequence of using REST instead of the FTP bulk, but it's more convenient for us this way at the moment.

> We would need to know exactly which metadata fields you are requesting, but Refinitiv Data Platform will likely be able to deliver the kinds of metadata you would like.

That depends on the asset type.

In general:

  • Exchange Id
  • Exchange symbol
  • RIC
  • Description
  • Industry
  • Group
  • Sector
  • (Start/End date for expiring assets)

In addition, we require asset specific information in order to calculate the absolute (reinvested) prices. That is, mainly:

  • Corporate actions which are not incorporated in the adjusted price (e.g. dividends)

As well as for bonds:

  • Interest rate
  • Payment frequency
  • Coupon date
  • Bond type
  • Nominal Value
  • Rate calculation type
  • Day count method
  • Reference rate

for funds:

  • Fees

for structured products:

  • It's not implemented yet on our end, but it will likely require additional information

Question: are these fields available via RDP, or would we have to call DSS?

> As Umer suggested, you can talk to your account manager about accessing the Refinitiv Data Platform (RDP). This has a very robust and performant time series service. For daily closing prices this is no issue at all to retrieve on demand. The only issue is the length of the time series request - I believe each request is limited to 10k points, which for a daily close is quite long - for other intervals you might require an iterative approach. The APIs are web APIs using JSON payloads. We also have Refinitiv Data Platform libraries in .NET and Python, with TypeScript wrappers coming shortly, to provide an additional ease-of-use layer. What language are you mainly coding in?

We are using Java, therefore an SDK would be ideal, but REST works as well. The limitation of 10k points per request is fine, since we can execute requests concurrently and aggregate on our end.

Question: as far as I can see, the RDP time series API does not support retrieving time series for multiple symbols within one request - is that correct?


-----

As far as I can see, RDP could be suitable, provided that responses are fast enough. That means we would avoid the daily DSS time series bulk request and instead submit time series requests for our universe to the RDP time series endpoint on a daily basis. Likewise, for missing assets, we would call the same endpoint when requested by our customers.

Then again, it would be important to ensure that the time series consists of adjusted prices, and that we find a way to retrieve, with an immediate response, the meta information that is not incorporated (see the list above).


Thank you in advance and kind regards,
Marc
