
Trial Licence IntraDay Summary Limit

Hi there, we are currently using a trial licence (until we receive our full licence). I have used the C# template code to extract Intraday Summary data (all fields) at second level, creating one file per year of data. We have found an issue where each file only contains data up to approximately 12th Jan (approx. 145 MB in size). Even when I extract data for two, five or ten years, the file remains virtually the same size, indicating that no additional data is included. For comparison, one year of data downloaded from the portal is 1.2 GB in size. Please help, as this is an urgent issue that requires a quick resolution.

Thanks in advance.

Vaughan

tick-history-rest-api, api, historical

Accepted

@vaughan.rees, as Veerapath states, this looks like an issue with the retrieval of concatenated gzip files, which the TRTH servers generate for large data sets.

Depending on how you retrieve the data, how you code that retrieval, and which library you use, you can run into a situation where not all of the data is downloaded, without any error being raised. The larger the data set, the higher the probability you will hit this issue. Decompressing data on the fly makes things worse, which is why we strongly recommend that you first download and save the compressed data, and only open it afterwards (from the local hard disk) for whatever treatment you want to apply to it.
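
To make that concrete, here is a minimal sketch of the "save first, decompress later" approach, using the raw REST endpoint rather than the SDK wrappers. Treat the endpoint path, job ID, token and output path as placeholders you would take from your own extraction request, not as a definitive implementation:

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class RawResultDownloader
{
    // Sketch only: stream the compressed extraction result straight to disk,
    // without decompressing on the fly. jobId, sessionToken and outputPath are
    // placeholders; the endpoint path is the one used in the REST API tutorials.
    public static async Task SaveCompressedAsync(string jobId, string sessionToken, string outputPath)
    {
        var handler = new HttpClientHandler
        {
            // Important: keep the payload compressed; do NOT let the client gunzip it.
            AutomaticDecompression = DecompressionMethods.None
        };

        using (var client = new HttpClient(handler))
        {
            client.DefaultRequestHeaders.Add("Authorization", "Token " + sessionToken);

            var url = "https://hosted.datascopeapi.reuters.com/RestApi/v1/Extractions/" +
                      $"RawExtractionResults('{jobId}')/$value";

            using (var response = await client.GetAsync(url, HttpCompletionOption.ResponseHeadersRead))
            {
                response.EnsureSuccessStatusCode();
                using (var httpStream = await response.Content.ReadAsStreamAsync())
                using (var fileStream = File.Create(outputPath))
                {
                    // Copy the raw gzip bytes to disk; decompress later from the local file.
                    await httpStream.CopyToAsync(fileStream);
                }
            }
        }
    }
}
```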

Pages 4 and 5 of the advisory Veerapath mentioned give some information on how to avoid this issue when coding in C#. For further details, please refer to .Net SDK Tutorial 5. Its downloadable C# code sample illustrates various ways of downloading and saving your extraction data from TRTH. It also shows how to download the data from AWS (the Amazon Web Services cloud), which is faster than the standard download. On the topic of AWS downloads, see this advisory.
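
As a rough sketch of the AWS variant (again using the raw REST endpoint rather than the SDK wrappers, with jobId, sessionToken and outputPath as placeholders): an X-Direct-Download header asks the server to redirect to a pre-signed AWS URL, and the redirect is followed manually so that no Authorization header is forwarded to Amazon. The tutorial's downloadable sample is the reference; this is only an outline of the idea:

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class AwsResultDownloader
{
    // Sketch only: request the extraction result with the X-Direct-Download header,
    // then download the compressed file from the pre-signed AWS URL it redirects to.
    public static async Task SaveCompressedFromAwsAsync(string jobId, string sessionToken, string outputPath)
    {
        var handler = new HttpClientHandler
        {
            AllowAutoRedirect = false,                          // handle the redirect ourselves
            AutomaticDecompression = DecompressionMethods.None  // keep the payload compressed
        };

        using (var client = new HttpClient(handler))
        {
            client.DefaultRequestHeaders.Add("Authorization", "Token " + sessionToken);
            client.DefaultRequestHeaders.Add("X-Direct-Download", "true"); // ask for the AWS redirect

            var url = "https://hosted.datascopeapi.reuters.com/RestApi/v1/Extractions/" +
                      $"RawExtractionResults('{jobId}')/$value";

            using (var response = await client.GetAsync(url, HttpCompletionOption.ResponseHeadersRead))
            {
                // The server answers with a redirect to a pre-signed AWS URL.
                if (response.StatusCode != HttpStatusCode.Redirect)
                    throw new InvalidOperationException("Expected a redirect to AWS, got " + response.StatusCode);

                var awsUrl = response.Headers.Location;

                // Download from AWS with a plain client: do not forward the Authorization header.
                using (var awsClient = new HttpClient(new HttpClientHandler
                           { AutomaticDecompression = DecompressionMethods.None }))
                using (var awsResponse = await awsClient.GetAsync(awsUrl, HttpCompletionOption.ResponseHeadersRead))
                {
                    awsResponse.EnsureSuccessStatusCode();
                    using (var awsStream = await awsResponse.Content.ReadAsStreamAsync())
                    using (var fileStream = File.Create(outputPath))
                    {
                        await awsStream.CopyToAsync(fileStream);
                    }
                }
            }
        }
    }
}
```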

In a nutshell, the recommended method is to download and save the compressed data from AWS, without decompressing on the fly.

If this answer does not suffice, please post the code you use to retrieve and save the data; that would help us assist you further.


Thanks for the detail, Christiaan, very much appreciated.


Hi @vaughan.rees

It looks like the incomplete output issue, where some HTTP clients do not fully support concatenated gzip files. Please see ADVISORY: AVOID INCOMPLETE OUTPUT - DOWNLOAD THEN DECOMPRESS for more information.
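
To illustrate the "download then decompress" order the advisory recommends, here is a minimal sketch that decompresses a previously saved .gz file from the local disk. Whether a given gzip reader returns every member of a concatenated gzip file depends on the library and runtime version, so it is worth comparing the decompressed output against the size the portal download produces:

```csharp
using System.IO;
using System.IO.Compression;

class LocalDecompressor
{
    // Decompress a previously downloaded .gz file from the local disk.
    public static void Decompress(string gzipPath, string csvPath)
    {
        using (var input = File.OpenRead(gzipPath))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress))
        using (var output = File.Create(csvPath))
        {
            // Some gzip implementations stop after the first member of a
            // concatenated file - the same truncation described above - so
            // verify the size of csvPath against the portal download.
            gzip.CopyTo(output);
        }
    }
}
```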


Many thanks Veerapath, this has helped me!
