I am using PROC HTTP (SAS 9.4 M5 on Linux) to call an in-house Windows API.
I POST a JSON file in, and it returns a JSON file.
To speed the process up I want to GZIP the JSON in both directions.
The GZIP'd JSON input works fine for all file sizes.
The JSON-format output works for all file sizes.
The GZIP'd output works if the resulting .gz file is < 65 KB, but fails for anything larger.
I can run cURL from DOS, which passes the .gz file from the Linux box location through the Windows API and writes a resulting .gz file back to Linux, whether the input is large or small.
What can I add to the FILENAME statement for the .gz file, or to PROC HTTP itself, to resolve the size issue?
filename inZIPd  "&theFilePath./&theDSname._toACS.json.gz";
filename outZIPd "&theFilePath./&theDSname._resp.json.gz";

proc http
  url    = &URLString.
  method = "POST"
  in     = inZIPd
  out    = outZIPd;
  headers
    "Content-Type"     = "application/json"
    "Content-Encoding" = "gzip"
    "Accept"           = "application/json"
    "Accept-Encoding"  = "gzip"
  ;
run;
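As a point of reference, the gzip framing itself is easy to sanity-check outside SAS. This Python sketch (hypothetical payload, standing in for the real JSON files) compresses a JSON payload and verifies it round-trips, which is a quick way to tell a truncated or corrupt .gz from a valid one:

```python
import gzip
import json
import zlib

# Hypothetical payload standing in for the real JSON input file
payload = json.dumps({"rows": list(range(1000))}).encode("utf-8")

# Compress as a Content-Encoding: gzip body would be
compressed = gzip.compress(payload)

# A valid .gz round-trips exactly
assert gzip.decompress(compressed) == payload

# A truncated stream (as suspected past the 64 KB point here) raises
try:
    gzip.decompress(compressed[: len(compressed) // 2])
except (EOFError, zlib.error, gzip.BadGzipFile):
    print("truncated gzip detected")
```

Running the same check against the .gz file PROC HTTP writes would confirm whether the output is a valid stream that merely stops early, or contains foreign bytes.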
I would recommend you open a track with SAS Technical Support here: https://support.sas.com/ctx/supportform/createForm
What is the server response? You can capture the response headers with the option
HEADEROUT=<fileref>
Thanks Richard, this is the result of adding HEADEROUT=<fileref>.
Case: GZIP in, JSON out. The header.txt file shows:
HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Content-Type: application/json; charset=utf-8
Expires: -1
Server: Microsoft-IIS/10.0
Service-Version: 1.2.0.9825
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Thu, 07 May 2020 05:23:09 GMT
Content-Length: 15503684
Case: GZIP in and GZIP out (expected GZIP file > 64 KB):
HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Transfer-Encoding: chunked
Content-Type: application/json; charset=utf-8
Content-Encoding: gzip
Expires: -1
Vary: Accept-Encoding
Server: Microsoft-IIS/10.0
Service-Version: 1.2.0.9825
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Thu, 07 May 2020 05:28:48 GMT
Case: GZIP in and out with a small file, which works. The difference is that 'Transfer-Encoding: chunked' appears only on the large .gz response:
HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Content-Type: application/json; charset=utf-8
Content-Encoding: gzip
Expires: -1
Vary: Accept-Encoding
Server: Microsoft-IIS/10.0
Service-Version: 1.2.0.9825
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Thu, 07 May 2020 05:32:43 GMT
Content-Length: 5506
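That 'Transfer-Encoding: chunked' difference looks like the key. In a chunked response the body is interleaved with hex chunk-size lines that the client must strip before writing the payload; if any of that framing ends up in the saved .gz file, the gzip stream is corrupt. A minimal Python sketch of what de-chunking involves (illustrative framing bytes, not the API's actual data):

```python
# A chunked body is "<hex size>\r\n<data>\r\n" repeated, ended by "0\r\n\r\n"
chunked = b"4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n"

def dechunk(raw: bytes) -> bytes:
    """Strip chunked transfer-encoding framing, returning the payload."""
    out, pos = b"", 0
    while True:
        eol = raw.index(b"\r\n", pos)
        size = int(raw[pos:eol], 16)       # chunk size is a hex line
        if size == 0:                      # zero-size chunk terminates the body
            return out
        out += raw[eol + 2 : eol + 2 + size]
        pos = eol + 2 + size + 2           # skip the data and its trailing CRLF

print(dechunk(chunked))                    # b'Wikipedia'
```

The HTTP client is responsible for this de-framing; the application should only ever see the reassembled payload.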
I can't really help without seeing the transferred file, but you could try using a hex viewer to look at the received file near the 64 KB point to see if any hints are there.
You can also try using the DS2 HTTP package; it might offer finer control and diagnostics than PROC HTTP. See:
DS2 HTTP Package Methods, Operators, and Statements
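For the hex-viewer check, something like this Python sketch (hypothetical file name) dumps the bytes around the 64 KB boundary, where stray framing such as a hex chunk-size line would show up if it leaked into the file:

```python
import binascii

def hexdump_around(path: str, offset: int = 64 * 1024, span: int = 64) -> None:
    """Print a hex dump of `span` bytes centred on `offset` in `path`."""
    with open(path, "rb") as f:
        f.seek(max(0, offset - span // 2))
        data = f.read(span)
    for i in range(0, len(data), 16):
        row = data[i : i + 16]
        print(f"{offset - span // 2 + i:08x}  {binascii.hexlify(row, ' ').decode()}")

# hexdump_around("resp.json.gz")   # hypothetical received file
```

Anything that reads as ASCII hex digits followed by 0D 0A in the middle of what should be compressed data is a strong hint that chunk framing was written into the output.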
@RichardDeVen, your suggestion to review the file with a hex editor is a good one and precisely what I did when I reviewed the track opened by @ShirleyD.
It appears the HTTP procedure is not correctly handling the embedded content length in the stream of data when the server sends a response with a chunked transfer encoding. Once the investigation into this is completed, I'll update this thread again.
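That diagnosis matches the symptom: if framing bytes (or a miscounted length) are written into the output fileref, the result is a .gz that is invalid past the first chunk boundary. A hedged Python illustration of the failure mode (simulated framing bytes, not SAS's actual internals):

```python
import gzip
import os

raw = os.urandom(200_000)            # incompressible payload, so the
payload = gzip.compress(raw)         # gzip stream is well over 64 KB

# Simulate framing bytes left in the body at a 64 KB boundary
broken = payload[:65536] + b"\r\nfff0\r\n" + payload[65536:]

assert gzip.decompress(payload) == raw       # intact stream is fine
try:
    gzip.decompress(broken)
    ok = True
except Exception:
    ok = False
print("decompressed OK" if ok else "corrupt gzip stream")
```

Small responses avoid the problem because the server only switches to chunked transfer encoding above a certain size, which is consistent with the < 65 KB threshold observed here.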