Eitan123
Obsidian | Level 7

Hello,


I'm using SAS 9.4_M7.

I intend to add a step to my automatic daily flow (I'm using IBM Flow Manager to run flows) that will read data from a file (probably a CSV) and insert it as rows into a table in my MS SQL DB.

The flow is composed of DIS jobs.
How do I create such a job?

Notes:
1) If the file is a CSV and the text itself contains commas, will CSV still be a viable option?
2) The file will hold two columns:

- A large text field, 2,500 to 8,000 (!) characters long. I've already checked, and MS SQL varchar can hold up to 8,000 characters, but there might be a SAS limitation.
- A 100-character text field.

I could also create an Execute transformation that calls a SQL DB stored procedure to perform the actions, but only if SAS does not have a built-in way to read files with such long fields.
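For reference, the pass-through version I have in mind would look roughly like this (a sketch only; the ODBC data source sqlsrv and the procedure dbo.load_my_table are placeholder names):

proc sql;
   /* Sketch: explicit pass-through calling a SQL Server stored procedure. */
   /* "sqlsrv", the credentials, and "dbo.load_my_table" are placeholders. */
   connect to odbc as mssql (datasrc=sqlsrv user=myuser password=mypass);
   execute (exec dbo.load_my_table) by mssql;
   disconnect from mssql;
quit;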

3 REPLIES
ballardw
Super User

If you read the data with a data step into a SAS data set, you shouldn't have any issues with the content, assuming 1) the columns that might contain commas have the entire value quoted, which is typical for most applications that make proper CSV files, 2) you specify the length of the long variable properly (SAS currently has a limit of 32K characters for a single character variable), and 3) you specify an LRECL greater than 8101 characters on the INFILE statement.

 

I don't use DIS so am not sure exactly how to incorporate a data step into the process.

 

An example of the data set code:

 

data <your library and dataset name goes here>;
   /* LRECL must exceed the longest record; DSD honors quoted commas */
   infile "<path and file name goes here>" dlm=',' lrecl=9000 dsd;
   length short $ 100 long $ 8000;
   input short
         long
   ;
run;

The order of the variables on the INPUT statement has to match the order of the columns in the file.

You can of course use names that make more sense than "short" and "long".
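Once you have the SAS data set, one possible way to push it to the SQL Server table is a SAS/ACCESS libname plus PROC APPEND. A minimal sketch, assuming an ODBC data source, credentials, and table name that you would replace with your own:

/* Sketch only: the libname options and table name are placeholder values */
libname mssql odbc datasrc=sqlsrv user=myuser password=mypass;

proc append base=mssql.mytable        /* target table in the SQL Server DB */
            data=work.csvdata;        /* the data set read from the CSV    */
run;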

Kurt_Bremser
Super User

8000 characters is no problem; the limit for SAS character variables is 32767. The intermediate SAS datasets should be stored with the COMPRESS=YES option to avoid wasting disk space and I/O.
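A quick sketch of how that might look, either session-wide or per data set (the data set name below is a placeholder):

/* Session-wide: all data sets created from here on are compressed */
options compress=yes;

/* Or as a data set option on an individual data set */
data work.csvdata (compress=yes);
   set work.csvdata;
run;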

LinusH
Tourmaline | Level 20

"How do I create such a job?"

This may sound harsh, but if you don't know how to start, I don't think asking the community will help (in the long run).

Composing jobs in DI Studio is a profession of its own; it adds an extra level of abstraction over plain programming and has its own best practices, etc.

So I suggest training and mentoring.

 

"1) In case of the file being CSV, and the text within having commas, will CSV still be a viable option?"

Usually the source files are given to you. If the sender produces such a file, they don't know what they are doing. Require a file where all text fields (and potentially numeric ones as well, depending on the decimal separator in use) are enclosed in quotes.
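For example, a record like the one below reads cleanly with the DSD option, because the embedded commas sit inside the quotes (hypothetical data):

/* Hypothetical example: quoted values keep their embedded commas with DSD */
data example;
   infile datalines dsd dlm=',';
   length short $ 100 long $ 200;
   input short long;
datalines;
"first value","a long text, with embedded commas, stays in one field"
;
run;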


"2) The file will hold two columns:

- A large text field, 2,500 to 8,000 (!) characters long. I've already checked, and MS SQL varchar can hold up to 8,000 characters, but there might be a SAS limitation.
- A 100-character text field."

SAS can handle up to 32K long fields.

"I can also create an Execute transmutation to order an SQL DB procedure to perform the actions, but that would only be if SAS does not have a built-in way to read files with such long fields."

As per the above answer, this should not be needed.

Data never sleeps

