lfabbri
Obsidian | Level 7

 

Hello everyone, I need to create a job in Data Integration Studio 4.9 that should get data from a REST web service and insert this data into a target table.

 

Following the documentation, I've added a REST transformation and filled in the various parameters (this should be a POST request with a JSON body):

 

sas_post_1.png sas_post_2.png

 

 

When I execute the job, it seems it completes successfully (although I get a lot of warnings like "WARNING: Apparent symbolic reference QUOT not resolved", and I'm not sure what that means):

 

 

sas_post_3.png

 

My questions:

 

  1. Did the job run successfully?
  2. If yes, how can I map the output of the POST request so that the response content is stored in a table for further processing? What type of node/transformation can I use, and are there any examples around? The documentation does not say very much about how to use data from REST transformations.
  3. Why do I get all those warnings? How can I get rid of them?

Thank you for any information.

 

 

13 REPLIES
VincentRejany
SAS Employee

Hi

 

Have you already tested the method with a REST client like Postman or cURL?

 

Regards

 

Vincent

lfabbri
Obsidian | Level 7

Hi, yes the endpoint works fine.

 

I suppose it is working in SAS as well, but I don't know how to extract data from the SAS REST node and use it in other transformation nodes or store it in a target table.

Any example would be much appreciated.

 

 

clwatson
SAS Employee

Hi, I'm working to put an example together for you which demos how to get JSON data back out of this transform.  While I'm preparing this, do you happen to be working with SAS 9.4M4 or later? My suggestion will use the new JSON libname engine. See this post for information on that: https://blogs.sas.com/content/sasdummy/2016/12/02/json-libname-engine-sas/.
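
In a nutshell, the engine lets you assign a library directly over a JSON file and read its contents as tables. A minimal sketch, with a placeholder path:

  /* Assign a library over an existing JSON file (SAS 9.4M4 or later) */
  libname myjson JSON "C:\temp\sample.json";   /* placeholder path */

  /* The engine surfaces the JSON content as one or more tables */
  proc datasets lib=myjson;
  quit;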

lfabbri
Obsidian | Level 7

Hi @clwatson, thank you for the reply.

 

This is the version of SAS I am using:

 

sas_version.png

 

Is it OK to follow your example?

 

 

clwatson
SAS Employee

If you are on at least SAS 9.4M4, and it looks like you are on 9.4M5, you could try to get the data out with the JSON libname engine: https://blogs.sas.com/content/sasdummy/2016/12/02/json-libname-engine-sas/. I was going to type all this up to demonstrate how you could set it up, but I decided to just build you a library, a couple of tables, and a job (which hits a REST data provider) to give you a better picture of how this might work for you. See the attached SPK file, which you can import into a folder; it should work when you import the items in the SPK without needing any changes. This assumes you have at least 9.4M4 and that the response data can come back as JSON.

  

The key to this example is the values set in the Input and Output tab of the REST transform and the path used by the "Response body library" SAS library.
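
I won't reproduce the exact code the transform generates, but conceptually the pieces boil down to something like this in open code (the endpoint, request body, and file path below are placeholders, not your real service):

  /* Placeholder JSON request body -- substitute your own */
  filename reqbody temp;
  data _null_;
    file reqbody;
    put '{"key": "value"}';
  run;

  /* Example path for the file the REST transform writes the response body to */
  filename resp "C:\SAS\TEMP\response.json";

  proc http url="https://example.com/api/endpoint"   /* placeholder URL */
            method="POST"
            in=reqbody
            out=resp;
    headers "Content-Type"="application/json";
  run;

  /* The "Response body library": the JSON engine reads the response as tables */
  libname respjson JSON fileref=resp;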

 

Download the attached ZIP file and rename it with an SPK extension (e.g. read_rest_data.spk) -- then you'll be able to import it.

lfabbri
Obsidian | Level 7

 

Hi, thank you very much, this is really helpful!

I've imported the SPK file into Data Integration Studio. I double-clicked on "Read data from REST call" and I can see a REST node and two tables.

 

sas_import_1.png

 

I am not sure how this works; all the nodes are disconnected. In the REST node Input/Output tab I can see the following:

 

sas_import_2.png

 

 

So the output is set to a file (not to a table)?

If I run the job, it completes successfully and creates a TEMP file with the response content of the HTTP call as JSON text.

I don't understand how I am supposed to populate the tables from this file.

Also, if I try to open one of the tables I get an error:

 

sas_import_3.png

 

 

Can you please provide some more details on how I am supposed to make this work correctly?

 

Thank you for the support

clwatson
SAS Employee

The REST node is set to write to a file because that's what the JSON libname engine needs in order to access the JSON data.  The two tables on the job diagram are just there to give you an easy way to open them for demo purposes.  Unfortunately, the REST transform does not let you link them up as outputs, but the transform order can be set so that things execute in the right sequence, and you can link the tables up as sources to other transforms.  So when the REST node runs successfully, those tables should be populated with content; you don't need to do anything.
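
As a quick check outside of View Data, once the job has run and the response library is assigned, you can look at what the engine exposes. ALLDATA is a table the JSON libname engine always creates; the others (like DATA_AMENITIES) depend on the shape of the response. A sketch:

  /* List the tables the JSON engine derived from the response file */
  proc datasets lib=respjson;
  quit;

  /* Print a few rows of the automatically created ALLDATA table */
  proc print data=respjson.alldata (obs=10);
  run;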

 

In order to determine why View Data is not working with them, could you place a List Data transform after each table and connect them up?  If this causes an error when running the job, then please post the log.

 

json_job.JPG

lfabbri
Obsidian | Level 7

Hi, I've tried as you suggested. These are the errors:

 

sas_import_4.png    

 

I think I am missing several key points here (apologies, I am totally new to SAS):

 

  1. I see a "Response json library" object in the folder hierarchy with a libref of "respjson". What is this, and what does "Libref RESPJSON is not assigned" mean?
  2. I can't find where it is specified that the tables ALLDATA and DATA_AMENITIES should read data from the file generated by the REST node (which I configured to be at the path "C:\SAS\TEMP\response.json").

 

Thank you for the support

TomH
SAS Employee

Hello @lfabbri

 

My name is Tom and I'm a member of the SAS Data Integration Studio Technical Support team.  I'd love to help you resolve this, and I think the quickest way would be for you to open a Technical Support track so I can contact you.  If you aren't familiar with SAS Technical Support, you can find everything you need right here:

 

Contact SAS Technical Support

 

You can use any one of the options to initiate a track.  Just include a reference to this forum entry and say it's related to SAS Data Integration Studio and reading data via REST.

 

Regards,

 

Tom

lfabbri
Obsidian | Level 7

Hi Tom, I've opened a support ticket as you suggested. Let me know if I need to send you the tracking number.

lfabbri
Obsidian | Level 7

Hi @clwatson, I've figured out that I have to change the libname statement in the "Response json library", which was previously pointing to a directory instead of the JSON file written by the REST node.

 

Now I can correctly view the data in the tables. What I am missing is where the mapping is defined; in other words, where can I specify which fields from the JSON response should be mapped to the output table?
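
For anyone else who hits this, the change was along these lines (with the paths from my setup):

  /* Before: the library pointed at a directory, which the JSON engine cannot parse */
  /* libname respjson JSON "C:\SAS\TEMP"; */

  /* After: point the library at the response file written by the REST node */
  libname respjson JSON "C:\SAS\TEMP\response.json";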

 

 

clwatson
SAS Employee

@lfabbri, I spoke to @TomH (SAS Technical Support) on Friday and discussed the sample in detail, so he has background on what I was trying to convey with it.  I think at this point it would be good to take his suggestion and open a track so the two of you can communicate by phone or even via desktop sharing, and he can guide you through how to integrate the response data into the rest of the job.  I think you are at the phase where you need to investigate how table and column metadata is used to model physical data.

 

This is done using two features that read the physical structure of a table and create a metadata representation of it; those features are called "Register tables" and "Update metadata". But I really think you might have more questions after this, and that's where having a chat with Tom in Technical Support will get you up and running more quickly.

AngusLooney
SAS Employee

I've been dabbling with this, and hit the fundamental issue that JSON is inherently schema-less, unlike XML. Essentially, you are never sure what you're going to get, in terms of field structure, from any one REST call (or any other expression of data in JSON, like an "export to file").

 

What I tried was creating "template" datasets/tables that I could use to conform any instance of JSON-derived tables (from the same source or endpoint) to the virtual schema that the data (more or less) complies with.

 

So in the code I was working on, once I've got the various tables back from the JSON library (often only one, really), I run them through an append to an empty "target instance" of the template dataset, freshly created for the purpose of being the output from my user transform. That means I get an output table from my call to the REST endpoint that's crudely conformed to a static structure/schema.

 

I ended up with a User Transform that took the "template" dataset as an input, and the REST endpoint details and options/parameters as options, and coughed out a dataset with a fixed format. So in DIS, I could use registered tables as the output of the UT that called the REST endpoint, with a stable data structure.

 

PROC APPEND is a bit low-tech/brute-force for doing this; it would probably be better done with a more in-depth macro. There are options to suppress the squawking about the JSON-sourced table not being just like the template-derived BASE, but I couldn't get it to stop issuing warnings, and "probably benign warnings" are absolutely not what you want in a DIS UT.
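
For what it's worth, the core of the pattern looked roughly like this (library and dataset names are placeholders):

  /* Create an empty "target instance" with the template's structure */
  data work.conformed;
    set templates.my_template (obs=0);   /* placeholder template dataset */
  run;

  /* Conform the JSON-derived table to the template schema. FORCE allows the
     variable mismatch; NOWARN quiets some of the messages, though as noted
     I never managed to silence all the warnings. */
  proc append base=work.conformed
              data=jsonlib.alldata       /* placeholder JSON-engine table */
              force nowarn;
  run;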

 

XML is much preferable, as the XML Map mechanism, and XML itself, tackle the issue of how to handle parts of a schema not being present in a particular set of records. In JSON, the absence of a schema, and of any sort of placeholder for "not present" data items/fields, makes it an undesirable format for "ETL-esque" data transport, but it's not going away.
