Ullsokk
Pyrite | Level 9

I am trying to pass JSON to the SAS Viya API using PROC HTTP. My JSON body needs to contain some macro variables for the table URI and the template href and URI.

 

In python, the data looks like this:

data = {
        "name": name,
        "dataTableUri": dataTableUri,
        "type": "predictive",
        "pipelineBuildMethod": "template",
        "analyticsProjectAttributes": {
            "targetVariable" : target,
            "partitionEnabled" : True,
            "targetEventLevel" : "1"
        },
        "settings": {
            "applyGlobalMetadata" : "false"
        },
        "links":
        [
                {
                    "method": "GET",
                    "rel": "initialPipelineTemplate",
                    "href": pipelineHref,
                    "uri": pipelineUri,
                    "type": "application/vnd.sas.analytics.pipeline.template"
                }
        ]
    }

In SAS, I am trying the following to create the JSON data:

 

filename json_in temp;
data _null_;
file json_in;
input text $500.;
textResolved=dequote(resolve(quote(text)));
put _infile_;
datalines;
 {"name": name,
        "dataTableUri": &dataTableUri.,
        "type": "predictive",
        "pipelineBuildMethod": "template",
        "analyticsProjectAttributes": {
            "targetVariable" : &target,
            "partitionEnabled" : True,
            "targetEventLevel" : "1"
        },
        "settings": {
            "applyGlobalMetadata" : "false"
        },
        "links":
        [
                {
                    "method": "GET",
                    "rel": "initialPipelineTemplate",
                    "href": &pipelineHref.,
                    "uri": &pipelineUri.,
                    "type": "application/vnd.sas.analytics.pipeline.template"
                }
        ]
    }

But I just get problems with unbalanced quotes. Creating the JSON without using dequote(resolve(quote(...))) to resolve the macro variables this way works, but doesn't solve my problem, since I need to fill in those values:

filename json_in temp;
data _null_;
file json_in;
input ;
put _infile_;
datalines;
 {"name": name,
        "dataTableUri": &dataTableUri.,
        "type": "predictive",
        "pipelineBuildMethod": "template",
        "analyticsProjectAttributes": {
            "targetVariable" : &target,
            "partitionEnabled" : True,
            "targetEventLevel" : "1"
        },
        "settings": {
            "applyGlobalMetadata" : "false"
        },
        "links":
        [
                {
                    "method": "GET",
                    "rel": "initialPipelineTemplate",
                    "href": &pipelineHref.,
                    "uri": &pipelineUri.,
                    "type": "application/vnd.sas.analytics.pipeline.template"
                }
        ]
    }
run;


6 REPLIES
RichardDeVen
Barite | Level 11

You will need to RESOLVE each _INFILE_ line in order to perform macro symbol resolution.

_infile_ = RESOLVE(_infile_);

If the macro symbol value needs to be double quoted in the output, make sure your datalines already contain those double quotes:

datalines;
...
"dataTableUri": &dataTableUri. ,   <- incorrect for JSON, string value has no double quotes
"dataTableUri": "&dataTableUri.",  <- correct for JSON, string value has double quotes
...

Example:

%let foobar = CHAOS;

data _null_;
input;
_infile_ = resolve(_infile_);
putlog 'NOTE: ' _infile_;
datalines;
There was real &foobar out there!
;

Log

NOTE: There was real CHAOS out there!                         <----------------------
NOTE: DATA statement used (Total process time):
      real time           0.00 seconds
      cpu time            0.01 seconds

 

Ullsokk
Pyrite | Level 9
Couldn't get the syntax right with more than one line. I ended up using PROC STREAM instead. But I would be interested in an example that works on my multi-line JSON and gives an output file like my example.
Tom
Super User

The RESOLVE() function solution should work.

Let's set up an example by creating some files: first the original file with no macro variable references, then a version that replaces the original text with macro variable references. Here is code to create a file named WANT with your original text, lightly edited to look more like valid JSON (notice the quotes around the text values) and to make the indentation easier for humans to scan.

filename want temp;
options parmcards=want;
parmcards4;
data =
{"name": "name"
,"dataTableUri": "dataTableUri"
,"type": "predictive"
,"pipelineBuildMethod": "template"
,"analyticsProjectAttributes":
  {"targetVariable" : "target"
  ,"partitionEnabled" : "True"
  ,"targetEventLevel" : "1"
  }
,"settings":
  {"applyGlobalMetadata" : "false"
  }
,"links":
  [
    {"method": "GET"
    ,"rel": "initialPipelineTemplate"
    ,"href": "pipelineHref"
    ,"uri": "pipelineUri"
    ,"type": "application/vnd.sas.analytics.pipeline.template"
    }
  ]
}
;;;;

Now let's replace some of the values with macro variables and make a new file named HAVE which will be the input to our program.

filename have temp;
options parmcards=have;
parmcards4;
data =
{"name": "name"
,"dataTableUri": "&dataTableUri."
,"type": "predictive"
,"pipelineBuildMethod": "template"
,"analyticsProjectAttributes":
  {"targetVariable" : "&target."
  ,"partitionEnabled" : "True"
  ,"targetEventLevel" : "1"
  }
,"settings":
  {"applyGlobalMetadata" : "false"
  }
,"links":
  [
    {"method": "GET"
    ,"rel": "initialPipelineTemplate"
    ,"href": "&pipelineHref."
    ,"uri": "&pipelineUri."
    ,"type": "application/vnd.sas.analytics.pipeline.template"
    }
  ]
}
;;;;

So now let's make a program to convert the HAVE file. Let's call the output TRY. First we need to assign values to the macro variables.

%let dataTableUri=dataTableUri;
%let target=target;
%let pipelineHref=pipelineHref;
%let pipelineUri=pipelineUri;
filename try temp;
data _null_;
  infile have ;
  file try;
  input ;
  _infile_=resolve(_infile_);
  put _infile_;
run;

Now let's compare the two files and see if they match.

156  data _null_;
157    length cmd $600 ;
158    cmd=catx(' ','diff -b',quote(trim(pathname('want'))),quote(trim(pathname('try'))));
159    infile cmd pipe filevar=cmd;
160    input;
161    put _infile_;
162  run;

NOTE: The infile CMD is:

      Pipe command="diff -b ".../#LN00028" ".../#LN00030""

NOTE: 0 records were read from the infile CMD.

Let's try it with different values for the macro variables:

%let dataTableUri=intable;
%let target=outtable;
%let pipelineHref=url;
%let pipelineUri=uri;

results:

3c3
< ,"dataTableUri": "dataTableUri"
---
> ,"dataTableUri": "intable"
7c7
<   {"targetVariable" : "target"
---
>   {"targetVariable" : "outtable"
18,19c18,19
<     ,"href": "pipelineHref"
<     ,"uri": "pipelineUri"
---
>     ,"href": "url"
>     ,"uri": "uri"
NOTE: 14 records were read from the infile CMD.
      The minimum record length was 3.
      The maximum record length was 34.
Tom
Super User

Check out PROC STREAM.

What Does the STREAM Procedure Do?

The STREAM procedure enables you to process an input stream that consists of arbitrary text that can contain SAS macro specifications. The macros are executed and expanded while the other text in the input stream is preserved. The text stream is not validated as SAS syntax. The output stream is sent to an external file that is referenced by a fileref and that can be defined to use any traditional SAS output destination.
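
Applied to the JSON in the question, a minimal PROC STREAM sketch might look like the following. The macro variable values are hypothetical placeholders, and the `begin &streamdelim;` form follows the pattern that works when the streamed text starts with a brace; note that PROC STREAM does not preserve line breaks by default, which is harmless for a JSON body:

```sas
/* Hypothetical placeholder values; the real URIs come from earlier Viya API calls */
%let name=myProject;
%let dataTableUri=/dataTables/dataSources/casTable;
%let target=myTarget;
%let pipelineHref=/analyticsPipelines/pipelineTemplates/xyz;
%let pipelineUri=/analyticsPipelines/pipelineTemplates/xyz;

filename json_in temp;
proc stream outfile=json_in;
begin &streamdelim;
{"name": "&name."
,"dataTableUri": "&dataTableUri."
,"type": "predictive"
,"pipelineBuildMethod": "template"
,"analyticsProjectAttributes":
  {"targetVariable": "&target."
  ,"partitionEnabled": true
  ,"targetEventLevel": "1"
  }
,"settings": {"applyGlobalMetadata": "false"}
,"links":
  [{"method": "GET"
   ,"rel": "initialPipelineTemplate"
   ,"href": "&pipelineHref."
   ,"uri": "&pipelineUri."
   ,"type": "application/vnd.sas.analytics.pipeline.template"
   }]
}
;;;;
```

The fileref JSON_IN can then be passed directly to PROC HTTP's IN= argument.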

Ullsokk
Pyrite | Level 9
Thanks! Looks promising. The file seems to look right when using &streamdelim (since the JSON starts with brackets):
proc stream outfile=json; begin &streamdelim;
BillM_SAS
SAS Employee

Another possibility that should easily resolve your macro variables and remove the worry of JSON syntax issues is to use the JSON procedure (available in SAS 9.4 maintenance 4 or later). Here is code that will produce the same form of JSON output. I had to guess at the values you were inserting, but resolving the macros is easy with PROC JSON.

 

%let name=myName;
%let dataTableUri=https://www.google.com;
%let target=myTarget;
%let pipelineHref=myHref;
%let pipelineUri=https://pipe.line.uri;
proc json out='./sasuser/jsonTest.json' nosastags pretty;
  write open object; /* object 1 */
    write values name "&name";
    write values dataTableUri "&dataTableUri";
    write values type "predictive";
    write values pipelineBuildMethod "template";
    write values analyticsProjectAttributes;
    write open object; /* object 2 */
      write values targetVariable "&target";
      write values partitionEnabled True;
      write values targetEventLevel "1";
    write close; /* object 2 */
    write values settings;
    write open object; /* object 3 */
      write values applyGlobalMetadata "false";
    write close; /* object 3 */
    write values links;
    write open array; /* array 1 */
      write open object; /* object 4 */
        write values method "GET";
        write values rel "initialPipelineTemplate";
        write values href "&pipelineHref";
        write values uri "&pipelineUri";
        write values type "application/vnd.sas.analytics.pipeline.template";
      write close; /* object 4 */
    write close; /* array 1 */
  write close; /* object 1 */
run;

OUTPUT FILE:

{
  "name": "myName",
  "dataTableUri": "https://www.google.com",
  "type": "predictive",
  "pipelineBuildMethod": "template",
  "analyticsProjectAttributes": {
    "targetVariable": "myTarget",
    "partitionEnabled": true,
    "targetEventLevel": "1"
  },
  "settings": {
    "applyGlobalMetadata": "false"
  },
  "links": [
    {
      "method": "GET",
      "rel": "initialPipelineTemplate",
      "href": "myHref",
      "uri": "https://pipe.line.uri",
      "type": "application/vnd.sas.analytics.pipeline.template"
    }
  ]
}

