<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How to import many complex .csv files (1) in SAS Programming</title>
    <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650533#M195072</link>
<description>&lt;P&gt;Hi Richard,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for the reply and clarifications; I understand the code much better now.&lt;/P&gt;&lt;P&gt;With help from you and Tom, the data have been imported and connected, and I was able to generate our first summary report today, which is great to have gotten done.&lt;/P&gt;&lt;P&gt;Many thanks for the detailed recommendations and collective patience,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Robert&lt;/P&gt;</description>
    <pubDate>Mon, 25 May 2020 21:17:43 GMT</pubDate>
    <dc:creator>rmacarthur</dc:creator>
    <dc:date>2020-05-25T21:17:43Z</dc:date>
    <item>
      <title>How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650121#M194960</link>
<description>&lt;P&gt;Hi SAS Friends,&amp;nbsp;&lt;/P&gt;&lt;P&gt;I need to import and analyze hundreds of .csv files that contain environmental data; a sample file has been attached here.&lt;/P&gt;&lt;P&gt;Because there are so many, individually saving each one as an .xlsx file for import (which is how many SAS programmers handle this) seems impractical.&lt;/P&gt;&lt;P&gt;The first 13 lines of each file are not needed. Line 14 has the column header information, each line thereafter has data in rows, and the data structure is straightforward.&lt;/P&gt;&lt;P&gt;The question is: how can these be imported as CSV files, deleting the first 13 lines and keeping the headers and data structure that follow?&lt;/P&gt;&lt;P&gt;Here's a sample of what a file looks like:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Key Name:Suffix Trend Definitions Used&lt;BR /&gt;Point_1: HOSP_1FL_PHARM_FCU01:ROOM TEMP 15 minutes&lt;BR /&gt;Point_2: HOS.1FL.ANTE.RM.DIFF.PRESS 15 minutes&lt;BR /&gt;Point_3: HOS.1FL.ANTE.RM.HUMID 15 minutes&lt;BR /&gt;Point_4: HOS.1FL.ANTE.RM.TEMP 15 minutes&lt;BR /&gt;Point_5: HOS.1FL.CLEAN.RM.DIFF.PRESS 15 minutes&lt;BR /&gt;Point_6: HOS.1FL.STORAGE.DIFF.PRESS 15 minutes&lt;BR /&gt;Point_7: pCOWeb7700_A003 Error: Point not found in database.&lt;BR /&gt;Point_8: pCOWeb7700_A004 Error: Point not found in database.&lt;BR /&gt;Time Interval: 15 Minutes&lt;BR /&gt;Date Range: 12/31/2019 06:55:11 - 1/1/2020 06:55:11&lt;BR /&gt;Report Timings: All Hours&lt;BR /&gt;&lt;BR /&gt;&amp;lt;&amp;gt;Date Time Point_1 Point_2 Point_3 Point_4 Point_5 Point_6 Point_7 Point_8&lt;BR /&gt;12/31/2019 6:55:11 72 0.04 26.81 73.53 -0.09 -0.06 No Data No Data&lt;BR /&gt;12/31/2019 7:10:11 72.25 0.04 26.81 73.57 -0.09 -0.06 No Data No Data&lt;BR /&gt;12/31/2019 7:25:11 72.25 0.05 26.81 73.62 -0.09 -0.06 No Data No Data&lt;/P&gt;&lt;P&gt;////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////&lt;/P&gt;&lt;P&gt;Any advice is greatly appreciated,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;</description>
      <pubDate>Sat, 23 May 2020 23:30:34 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650121#M194960</guid>
      <dc:creator>rmacarthur</dc:creator>
      <dc:date>2020-05-23T23:30:34Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650123#M194962</link>
<description>&lt;P&gt;Are the columns always DATE, TIME, POINT_1 - POINT_8?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;If so, the data from all the hundreds of data files can be placed in one data set.&lt;/P&gt;
&lt;P&gt;If not, what is your strategy for naming the data set for a given data file?&lt;/P&gt;
&lt;P&gt;Are you sure you want to discard the auxiliary information about each POINT_&amp;lt;n&amp;gt; column?&lt;/P&gt;</description>
      <pubDate>Sun, 24 May 2020 01:34:33 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650123#M194962</guid>
      <dc:creator>RichardDeVen</dc:creator>
      <dc:date>2020-05-24T01:34:33Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650124#M194963</link>
      <description>&lt;P&gt;Are the files complex because of the header rows?&lt;/P&gt;
&lt;P&gt;Do all files have the same variables and layout beyond that?&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Is it always 13 lines in the header?&lt;/P&gt;</description>
      <pubDate>Sun, 24 May 2020 02:00:35 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650124#M194963</guid>
      <dc:creator>Reeza</dc:creator>
      <dc:date>2020-05-24T02:00:35Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650130#M194967</link>
<description>&lt;P&gt;That file does not look very complicated.&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;%let path=C:\downloads;
%let fname=Temp_ex.csv ;

/* informat mapping the file's text flags to tagged missing values */
proc format ;
  invalue  nodata
    'TIME (-3)'=.t
    'No Data'=.d 
     other=_same_
  ; 
run;

data want;
  /* firstobs=15 skips the 13 report-header lines and the column-name line */
  infile "&amp;amp;path/&amp;amp;fname" dsd truncover firstobs=15;
  input @;                         /* hold the record so it can be inspected */
  if index(_infile_,'End of Report') then delete;
  input date :mmddyy. time :time. (point1-point8) (:nodata.);
  format date yymmdd10. time time8. ;
run;

proc print;
run;
&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;Results:&lt;/P&gt;
&lt;PRE&gt;Obs         date       time   point1   point2   point3   point4   point5   point6   point7   point8

  1   2019-12-31    6:55:11    72.00    0.04     26.81    73.53    -0.09    -0.06      D        D
  2   2019-12-31    7:10:11    72.25    0.04     26.81    73.57    -0.09    -0.06      D        D
  3   2019-12-31    7:25:11    72.25    0.05     26.81    73.62    -0.09    -0.06      D        D
  4   2019-12-31    7:40:11    72.25    0.04     26.35    73.67    -0.09    -0.06      D        D
  5   2019-12-31    7:55:11    72.25    0.05     26.20    73.75    -0.09    -0.06      D        D
  6   2019-12-31    8:10:11    72.25    0.04     26.10    73.80    -0.09    -0.06      D        D
  7   2019-12-31    8:25:11    72.25    0.04     26.15    73.84    -0.09    -0.06      D        D
  8   2019-12-31    8:40:11    72.25    0.04     26.15    73.93    -0.09    -0.06      D        D
  9   2019-12-31    8:55:11    72.25    0.05     26.05    73.98    -0.09    -0.06      D        D
 10   2019-12-31    9:10:11    72.25    0.04     25.90    74.02    -0.09    -0.06      D        D
 11   2019-12-31    9:25:11    72.25    0.04     26.30    74.02    -0.09    -0.06      D        D
 12   2019-12-31    9:40:11    72.25    0.05     26.15    74.07    -0.09    -0.06      D        D
 13   2019-12-31    9:55:11    72.25    0.06     26.35    74.03    -0.09    -0.06      D        D
 14   2019-12-31   10:10:11    72.25    0.06     26.30    74.07    -0.09    -0.06      D        D
 15   2019-12-31   10:25:11    72.50    0.06     26.25    74.07    -0.09    -0.06      D        D
 16   2019-12-31   10:40:11    72.50    0.05     26.35    74.07    -0.09    -0.06      D        D
 17   2019-12-31   10:55:11    72.50    0.05     26.15    74.12    -0.09    -0.06      D        D
 18   2019-12-31   11:10:11    72.50    0.05     26.10    74.21    -0.10    -0.06      D        D
 19   2019-12-31   11:25:11    72.50    0.06     26.00    74.25    -0.10    -0.06      D        D
 20   2019-12-31   11:40:11    72.50    0.06     26.00    74.30    -0.10    -0.06      D        D
 21   2019-12-31   11:55:11    72.50    0.04     26.05    74.43    -0.09    -0.06      D        D
 22   2019-12-31   12:10:11    72.50    0.06     25.84    74.47    -0.10    -0.06      D        D
 23   2019-12-31   12:25:11    72.50    0.05     25.95    74.57    -0.10    -0.06      D        D
 24   2019-12-31   12:40:11    72.50    0.06     26.00    74.59    -0.10    -0.06      D        D
 25   2019-12-31   12:55:11    72.75    0.06     25.90    74.61    -0.10    -0.06      D        D
 26   2019-12-31   13:10:11    72.50    0.05     26.20    74.66    -0.09    -0.06      D        D
 27   2019-12-31   13:25:11    72.50    0.06     26.20    74.66    -0.10    -0.06      D        D
 28   2019-12-31   13:40:11    72.50    0.06     26.30    74.66    -0.10    -0.06      D        D
 29   2019-12-31   13:55:11    72.50    0.04     26.40    74.66    -0.09    -0.06      D        D
 30   2019-12-31   14:10:11    72.50    0.05     26.41    74.70    -0.09    -0.06      D        D
 31   2019-12-31   14:25:11    72.50    0.06     26.05    74.66    -0.10    -0.06      D        D
 32   2019-12-31   14:40:11    72.50    0.05     26.00    74.66    -0.09    -0.06      D        D
 33   2019-12-31   14:55:11    72.50    0.06     25.85    74.66    -0.10    -0.06      D        D
 34   2019-12-31   15:10:11    72.50    0.06     25.19    74.70    -0.10    -0.06      D        D
 35   2019-12-31   15:25:11    72.50    0.05     26.45    74.66    -0.09    -0.06      D        D
 36   2019-12-31   15:40:11    72.50    0.06     27.37    74.66    -0.10    -0.06      D        D
 37   2019-12-31   15:55:11    72.50    0.06     27.21    74.70    -0.09    -0.06      D        D
 38   2019-12-31   16:10:11    72.50    0.04     26.20    74.70    -0.09    -0.06      D        D
 39   2019-12-31   16:25:11    72.50    0.04     25.76    74.71    -0.09    -0.06      D        D
 40   2019-12-31   16:40:11    72.50    0.05     25.44    74.70    -0.09    -0.06      D        D
 41   2019-12-31   16:55:11    72.50    0.06     25.40    74.71    -0.10    -0.06      D        D
 42   2019-12-31   17:10:11    72.50    0.06     25.20    74.66    -0.10    -0.06      D        D
 43   2019-12-31   17:25:11    72.50    0.04     25.04    74.66    -0.09    -0.06      D        D
 44   2019-12-31   17:40:11    72.50    0.05     25.30    74.66    -0.09    -0.06      D        D
 45   2019-12-31   17:55:11    72.50    0.06     25.29    74.70    -0.10    -0.06      D        D
 46   2019-12-31   18:10:11    72.50    0.05     25.39    74.75    -0.10    -0.06      D        D
 47   2019-12-31   18:25:11    72.50    0.06     25.59    74.70    -0.10    -0.06      D        D
 48   2019-12-31   18:40:11    72.50    0.06     25.59    74.75    -0.10    -0.06      D        D
 49   2019-12-31   18:55:11    72.50    0.05     26.00    74.75    -0.09    -0.06      D        D
 50   2019-12-31   19:10:11    72.25    0.06     25.65    74.61    -0.10    -0.06      D        D
 51   2019-12-31   19:25:11    72.00    0.05     27.06    74.39    -0.10    -0.06      D        D
 52   2019-12-31   19:40:11    71.75    0.06     27.97    74.21    -0.10    -0.06      D        D
 53   2019-12-31   19:55:11    71.50    0.05     26.05    74.07    -0.10    -0.06      D        D
 54   2019-12-31   20:10:11    71.50    0.05     26.21    73.93    -0.09    -0.06      D        D
 55   2019-12-31   20:25:11    71.25    0.06     26.41    73.76    -0.10    -0.06      D        D
 56   2019-12-31   20:40:11    71.25    0.05     26.41    73.71    -0.09    -0.06      D        D
 57   2019-12-31   20:55:11    71.75    0.06     26.00    73.71    -0.10    -0.06      D        D
 58   2019-12-31   21:10:11    72.00    0.06     27.06    73.80    -0.09    -0.06      D        D
 59   2019-12-31   21:25:11    72.25    0.05     27.62    72.99    -0.09    -0.06      D        D
 60   2019-12-31   21:40:11    72.50    0.06     27.72    72.12    -0.10    -0.06      D        D
 61   2019-12-31   21:55:11    72.50    0.06     25.90    72.35    -0.10    -0.06      D        D
 62   2019-12-31   22:10:11    72.50    0.05     25.24    72.84    -0.09    -0.06      D        D
 63   2019-12-31   22:25:11    72.75    0.05     24.79    73.25    -0.09    -0.06      D        D
 64   2019-12-31   22:40:11    72.50    0.04     24.43    73.06    -0.08    -0.06      D        D
 65   2019-12-31   22:55:11    72.00    0.05     23.43    72.52    -0.09    -0.06      D        D
 66   2019-12-31   23:10:11    71.50    0.05     22.82    72.75    -0.09    -0.06      D        D
 67   2019-12-31   23:25:11    71.50    0.04     22.32    73.02    -0.09    -0.06      D        D
 68   2019-12-31   23:40:11    71.25    0.04     21.66    73.20    -0.09    -0.06      D        D
 69   2019-12-31   23:55:11    71.50    0.05     21.30    73.38    -0.09    -0.06      D        D
 70   2020-01-01    0:10:11    71.75    0.04     20.75    73.57    -0.09    -0.06      D        D
 71   2020-01-01    0:25:11    72.25    0.04     20.15    73.75    -0.09    -0.06      D        D
 72   2020-01-01    0:40:11    72.25    0.04     19.60    73.89    -0.09    -0.06      D        D
 73   2020-01-01    0:55:11    72.50    0.04     19.44    73.98    -0.09    -0.06      D        D
 74   2020-01-01    1:10:11    72.50    0.04     19.19    74.07    -0.09    -0.06      D        D
 75   2020-01-01    1:25:11    72.50    0.05     19.35    74.12    -0.09    -0.06      D        D
 76   2020-01-01    1:40:11    72.50    0.04     19.79    74.16    -0.09    -0.06      D        D
 77   2020-01-01    1:55:11    72.50    0.04     19.90    74.16    -0.08    -0.06      D        D
 78   2020-01-01    2:10:11    72.50    0.05     20.45    73.29    -0.09    -0.06      D        D
 79   2020-01-01    2:25:11    72.50    0.04     20.36    73.35    -0.09    -0.06      D        D
 80   2020-01-01    2:40:11    72.50    0.04     19.90    73.57    -0.09    -0.06      D        D
 81   2020-01-01    2:55:11    72.75    0.04     19.49    73.80    -0.09    -0.06      D        D
 82   2020-01-01    3:10:11      T       T         T        T        T        T        D        D
 83   2020-01-01    3:25:11    72.50    0.04     19.34    74.16    -0.09    -0.06      D        D
 84   2020-01-01    3:40:11    72.50    0.05     18.74    74.29    -0.09    -0.06      D        D
 85   2020-01-01    3:55:11    72.50    0.04     18.44    74.31    -0.09    -0.06      D        D
 86   2020-01-01    4:10:11    72.50    0.04     18.39    74.35    -0.09    -0.06      D        D
 87   2020-01-01    4:25:11    72.50    0.05     18.43    74.34    -0.09    -0.06      D        D
 88   2020-01-01    4:40:11    72.50    0.05     18.34    74.39    -0.09    -0.06      D        D
 89   2020-01-01    4:55:11    72.50    0.04     18.33    74.34    -0.09    -0.06      D        D
 90   2020-01-01    5:10:11    72.50    0.04     18.13    74.35    -0.09    -0.06      D        D
 91   2020-01-01    5:25:11    72.50    0.05     17.83    74.39    -0.09    -0.06      D        D
 92   2020-01-01    5:40:11    72.50    0.04     17.53    74.30    -0.09    -0.06      D        D
 93   2020-01-01    5:55:11    72.50    0.05     17.43    74.30    -0.09    -0.06      D        D
 94   2020-01-01    6:10:11    72.50    0.04     17.63    74.25    -0.09    -0.06      D        D
 95   2020-01-01    6:25:11    72.25    0.04     18.38    74.25    -0.09    -0.06      D        D&lt;/PRE&gt;</description>
      <pubDate>Sun, 24 May 2020 06:56:20 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650130#M194967</guid>
      <dc:creator>Tom</dc:creator>
      <dc:date>2020-05-24T06:56:20Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650171#M194977</link>
<description>Hi Tom,&lt;BR /&gt;Thank you very much; I've learned a lot from the code you've provided. The file was complex because the data did not start until line 15, and this is a part of SAS coding I'm not very familiar with. Much appreciated!&lt;BR /&gt;R.</description>
      <pubDate>Sun, 24 May 2020 12:52:48 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650171#M194977</guid>
      <dc:creator>rmacarthur</dc:creator>
      <dc:date>2020-05-24T12:52:48Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650177#M194978</link>
<description>&lt;P&gt;Hi Richard,&lt;/P&gt;&lt;P&gt;Thanks for the comments!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;Are the columns always DATE, TIME, POINT_1 - POINT_8?&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;Yes.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;If not, what is your strategy for naming the data set for a given data file?&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;I was planning to keep the original filename for each, and just remove the troublesome characters (see below).&amp;nbsp; However, that may not be the best approach.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;Are you sure you want to discard the auxiliary information about each POINT_&amp;lt;n&amp;gt; column?&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;Yes; those lines are all the same from file to file and just define each variable.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;If so, the data from all the hundreds of data files can be placed in one data set.&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;Yes, that's the goal.&amp;nbsp; A macro or do loop is needed for that, right? One that can read the consecutive CSV file names from the directory and import them.&amp;nbsp; I don't know how to do that either.&lt;/P&gt;&lt;P&gt;Because the file names have "._-" in them, I've modified the code from Tom to get rid of those, which I've pasted below.&amp;nbsp; So how can I take that further to import each uniquely named data set, one at a time, to create one large file?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;/*************************************/

%let fname=Temp_01-01-20_06-55.Hum.Press-DA.csv ;  /* This is what the actual file names look like */

data e ;
  fn2 = symget("fname") ;
  fn1 = compress(fn2, "-._ ") ;  /* remove characters not valid in a SAS data set name */
  call symputx('fn', fn1) ;      /* SYMPUTX avoids the trailing blanks SYMPUT would store */
run ;

proc format ;
  invalue nodata
    'TIME (-3)'=.t
    'No Data'=.d
    other=_same_
  ;
run;

data &amp;amp;fn;
  infile "&amp;amp;RenoD/&amp;amp;fname" dsd truncover firstobs=15;
  input @;
  if index(_infile_,'End of Report') then delete;
  input date :mmddyy. time :time. (point1-point8) (:nodata.);
  format date yymmdd10. time time8. ;
run;

proc print;
run;
/*************************************/
/*************************************/

&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sun, 24 May 2020 14:00:56 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650177#M194978</guid>
      <dc:creator>rmacarthur</dc:creator>
      <dc:date>2020-05-24T14:00:56Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650180#M194980</link>
      <description>&lt;BLOCKQUOTE&gt;&lt;HR /&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/28361"&gt;@rmacarthur&lt;/a&gt;&amp;nbsp;wrote:&lt;BR /&gt;
&lt;P&gt;Hi SAS Friends,&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Because there are so many, individually saving them as a .xlsx file for import seems impractical, which is how many SAS programmers handle this.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;HR /&gt;&lt;/BLOCKQUOTE&gt;
&lt;P&gt;First, PROC IMPORT does not deal well with data that has more than a single header row to get variable names from, so this would quickly become a bad idea. Second, each time you invoke PROC IMPORT the procedure makes a potentially quite different set of guesses as to variable type and length, so it is often difficult to combine data imported from .xlsx files: some variables end up numeric in one file and character in another for the same column, which SAS won't allow, and you can spend more time "fixing" things than you might have saved with the import.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Moral of the story: If the files have the same structure read them with a data step.&lt;/P&gt;</description>
      <pubDate>Sun, 24 May 2020 14:07:35 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650180#M194980</guid>
      <dc:creator>ballardw</dc:creator>
      <dc:date>2020-05-24T14:07:35Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650183#M194982</link>
<description>Thanks, very helpful; we've got that sorted out, and sample code is above. Now the trick is how to import each individually named file using data steps. I imagine that would involve a macro or do loop.&lt;BR /&gt;</description>
      <pubDate>Sun, 24 May 2020 14:26:34 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650183#M194982</guid>
      <dc:creator>rmacarthur</dc:creator>
      <dc:date>2020-05-24T14:26:34Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650188#M194984</link>
<description>&lt;P&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/159"&gt;@Tom&lt;/a&gt; showed you how to read one file.&amp;nbsp; The concept can be extended, because a &lt;STRONG&gt;single&lt;/STRONG&gt; data step can read multiple files that match a wildcarded operating system filename.&amp;nbsp; Because all the data is read as if from one sequential stream, you cannot rely on the FIRSTOBS= infile option.&amp;nbsp; Instead you will want to hold the input of each raw line and examine it for&amp;nbsp;&lt;EM&gt;landmarks&amp;nbsp;&lt;/EM&gt;pertinent to subsequent statements in the data step.&amp;nbsp; Also, use FILEVAR= to capture the source filename into an automatic variable that can be assigned to an output data set variable.&amp;nbsp;&amp;nbsp;&lt;EM&gt;NOTE: Automatic variables are part of the program data vector (PDV) but are always automatically dropped, and thus cannot be directly part of an output data set.&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Example (untested):&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;data want(label="Input from many csv");
  length source_filename $200;
  infile "&amp;amp;path.\*.csv" filevar=in_filename;
  retain input_now_flag 0 source_filename;

  if lag(in_filename) ne in_filename then do;   /* start of reading from next file */
    source_filename = in_filename;
    input_now_flag = 0;
  end;

  input @;                        /* held input for checking on landmarks */

  if _infile_ =: '&amp;lt;&amp;gt;' then do;    /* landmark for subsequent rows are data */
    input_now_flag = 1;
    delete;
  end;

  if _infile_ =: '//' then do;  /* landmark for end of data, ignore anything that might be after it */
    input_now_flag = 0;
    delete;
  end;

  if not input_now_flag then delete;   /* have not reached the data landmark yet */

  /* presume all *.csv files have the same column structure and order */
  input date :mmddyy. time :time. (point1-point8) (:nodata.);

  format date yymmdd10. time time8. ;
  keep source_filename date time point1-point8;
run;&lt;/PRE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sun, 24 May 2020 15:18:39 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650188#M194984</guid>
      <dc:creator>RichardDeVen</dc:creator>
      <dc:date>2020-05-24T15:18:39Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650200#M194988</link>
<description>&lt;P&gt;If they are all in one directory, you should be able to just use a wildcard in the INFILE statement.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;%let renod=your path here;

data want;
  length filename $256 fname $50 ;
  infile "&amp;amp;RenoD/*.csv" dsd truncover filevar=filename;
  input @;
  fname = scan(filename,-1,'/\');
  if lag(filename) ne filename then row=0;
  row+1;
  if row &amp;lt;15 or index(_infile_,'End of Report') then delete;
  input date :mmddyy. time :time. (point1-point8) (:nodata.);
  format date yymmdd10. time time8. ;
run;
&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;If they are scattered or mixed, then make a data set with the list of files instead, and use that list to drive reading the files. So if you have a data set named FILELIST with a variable named FILENAME, you could use this data step to read each file starting at line 15 and skipping the last row.&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data want;
  set filelist ;
  filename2 = filename;
  infile csv dsd truncover filevar=filename2 end=eof firstobs=15; /* FILEVAR= reads from the file named in FILENAME2 */
  do while (not eof);
    input @;
    if not eof then do;
      input date :mmddyy. time :time. (point1-point8) (:nodata.);
      output;
    end;
  end;
  format date yymmdd10. time time8. ;
run;&lt;/CODE&gt;&lt;/PRE&gt;
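&lt;P&gt;One possible way to build that FILELIST data set (an untested sketch; the PIPE device requires the XCMD option and a Windows session, and the path shown is a placeholder, not from this thread) is to read a directory listing:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;/* Untested sketch: build FILELIST from a Windows directory listing.
   Requires XCMD; replace C:\your\path with the real directory. */
filename dirlist pipe 'dir /b "C:\your\path\*.csv"';

data filelist;
  infile dirlist truncover;
  input fname $256. ;
  filename = cats("C:\your\path\", fname);  /* full path for INFILE to open */
  keep filename;
run;&lt;/CODE&gt;&lt;/PRE&gt;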
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sun, 24 May 2020 18:02:49 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650200#M194988</guid>
      <dc:creator>Tom</dc:creator>
      <dc:date>2020-05-24T18:02:49Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650210#M194993</link>
      <description>&lt;BLOCKQUOTE&gt;&lt;HR /&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/28361"&gt;@rmacarthur&lt;/a&gt;&amp;nbsp;wrote:&lt;BR /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Because there are so many, individually saving them as a .xlsx file for import seems impractical, which is how many SAS programmers handle this.&lt;/P&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;I'm not sure where you got this idea, but I'd say that goes against best and recommended practices for importing data. Any decent SAS programmer would say that importing data from a CSV is infinitely easier than XLSX, especially if you have multiple files and need to ensure your variables are all the same type. I almost guarantee that if you converted these all to Excel and then tried to read it, you'd get a mismatch of types between some files.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The only time &lt;STRONG&gt;I&lt;/STRONG&gt; recommend using Excel as an intermediary is when you have comments, or long text fields that could have all sorts of punctuation in them to mess up the data - for example, free-text answers on surveys. And I definitely know quite a few programmers who disagree with me on that one &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sun, 24 May 2020 18:57:38 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650210#M194993</guid>
      <dc:creator>Reeza</dc:creator>
      <dc:date>2020-05-24T18:57:38Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650450#M195031</link>
<description>&lt;P&gt;Well, the approach comes from working with very uniform spreadsheets and complete, consistent datasets with no missing variables, where an Excel import procedure works well.&amp;nbsp; We live in a rarefied work setting where that is the case.&amp;nbsp; I totally agree, however, that when importing messier data a data step is far superior.&lt;/P&gt;</description>
      <pubDate>Mon, 25 May 2020 13:41:49 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650450#M195031</guid>
      <dc:creator>rmacarthur</dc:creator>
      <dc:date>2020-05-25T13:41:49Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650453#M195033</link>
<description>Yes, the files are all consistent; they're machine-generated. Thanks!</description>
      <pubDate>Mon, 25 May 2020 14:13:44 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650453#M195033</guid>
      <dc:creator>rmacarthur</dc:creator>
      <dc:date>2020-05-25T14:13:44Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650457#M195035</link>
<description>&lt;P&gt;Hi Tom,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you very much.&amp;nbsp; The files are all in one place, so I ran the first code and got the following:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;"A Physical file reference (i.e. "PHYSICAL FILE REFERENCE") or an aggregate file storage&amp;nbsp;reference (i.e. AGGREGATE(MEMBER)) reference cannot be used with the FILEVAR= option."&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;However, changing "FILEVAR" to "FILENAME" allowed the code to work like a charm.&amp;nbsp; That modification was recommended in some other SAS community postings.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;The output data look great, much appreciated.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Robert&lt;/P&gt;</description>
      <pubDate>Mon, 25 May 2020 14:36:07 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650457#M195035</guid>
      <dc:creator>rmacarthur</dc:creator>
      <dc:date>2020-05-25T14:36:07Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650463#M195036</link>
      <description>&lt;P&gt;Hi Richard,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you very much for the reply.&amp;nbsp; Have learned allot from your sample code.&amp;nbsp;&lt;/P&gt;&lt;P&gt;In running this code, I received an error message,&lt;/P&gt;&lt;P&gt;&lt;FONT color="#FF0000"&gt;ERROR: A Physical file reference (i.e. "PHYSICAL FILE REFERENCE" ) or an aggregate file storage&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#FF0000"&gt;reference (i.e. AGGREGATE(MEMBER) ) reference cannot be used with the FILEVAR= option.&lt;/FONT&gt;"&amp;nbsp;&amp;nbsp;&lt;/P&gt;&lt;P&gt;But changing "FILEVAR" to "FILENAME", based on some SAS community comments, allowed the code to run.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Once running, the code recognizes each file in the directory, there is a message like this, once for each file:&lt;/P&gt;&lt;P&gt;&lt;FONT color="#0000FF"&gt;NOTE: The infile "C:\Users\SAS\Box\DATA_RENO\*.csv" is:&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#0000FF"&gt;Filename=C:\Users\SAS\Box\DATA_RENO\Temp_12-31-19_06-55.Hum.Press-DA.csv,&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#0000FF"&gt;File List=C:\Users\SAS\Box\DATA_RENO\*.csv,&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#0000FF"&gt;RECFM=V,LRECL=32767&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT color="#000000"&gt;Then, for each file, there is another line like this:&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT color="#000000"&gt;&lt;FONT color="#0000FF"&gt;NOTE: 110 records were read from the infile "C:\Users\SAS\Box\DATA_RENO\*.csv".&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#0000FF"&gt;The minimum record length was 2.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#0000FF"&gt;The maximum record length was 115.&lt;/FONT&gt;&lt;BR /&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT color="#000000"&gt;&lt;FONT color="#0000FF"&gt;NOTE: The data set WORK.WANT has 0 observations and 11 variables.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#0000FF"&gt;NOTE: DATA statement used 
(Total process time):&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#0000FF"&gt;real time 1.12 seconds&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#0000FF"&gt;cpu time 0.68 seconds&lt;/FONT&gt;&lt;BR /&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;But it doesn't seem to aggregate the data into one file.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Should I be adding a landmark at this point in the code?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;CODE class=" language-sas"&gt;if _infile_ =: '&amp;lt;&amp;gt;' then do; /* landmark for subsequent rows are data */&lt;/CODE&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have pasted the code below for ease of reference:&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data want_R (label="Input from many csv");
  length source_filename in_filename $200;
  infile "C:\Users\SAS\Box\DATA_RENO\*.csv" filename=in_filename;
  retain input_now_flag 0 source_filename;

  if lag (in_filename) ne in_filename then do;    /* start of reading from next file */
    source_filename = in_filename;
    input_now_flag = 0;
  end;

  input @;                        /* held input for checking on landmarks */

  if _infile_ =: '&amp;lt;&amp;gt;' then do;    /* landmark for subsequent rows are data */
    input_now_flag = 1;
    delete;
  end;

  if _infile_ =: '//' then do;  /* landmark for end of data, ignore anything that might be after it */
    input_now_flag = 0;
    delete;
  end;

  if not input_now_flag then delete;   /* have not reached the data landmark yet */

  /* presume all *.csv files have the same column structure and order */
  input date :mmddyy. time :time. (point1-point8) (:nodata.);

  format date yymmdd10. time time8. ;
  keep source_filename date time point1-point8;
run;&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;Thanks again for all of this; I am learning a great deal about importing via a DATA step!&lt;/P&gt;&lt;P&gt;Robert&lt;/P&gt;</description>
      <pubDate>Mon, 25 May 2020 15:04:05 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650463#M195036</guid>
      <dc:creator>rmacarthur</dc:creator>
      <dc:date>2020-05-25T15:04:05Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650499#M195050</link>
      <description>&lt;P&gt;Thanks for catching the FILENAME problem in untested code. The end of this post has a tested example with generated data.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;You probably want to check the length of the variable assigned by FILENAME=; it likely defaults to $8 and should be lengthened:&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;  length source_filename in_filename $200 ;&lt;/LI-CODE&gt;
&lt;P&gt;Why does the FILENAME= variable default to a length of $8? Probably because standard file references (i.e. filename &amp;lt;fileref&amp;gt; "path") are identifiers (reference names) with an 8-character limit.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Yes, lots of files makes for lots of NOTEs. You can get a single NOTE if you use the operating system to your advantage; however, you would lose the information about which file each record came from (if that loss is OK, hey, no problem).&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Use the OS to execute a command that concatenates all the files together, and read from that command's stdout:&lt;/P&gt;
&lt;PRE&gt;filename ALLCSV pipe "powershell -c ""cat &amp;amp;PATH.\*.csv""";&lt;BR /&gt;data want;&lt;BR /&gt;  infile ALLCSV dsd missover;&lt;BR /&gt;  ...&lt;/PRE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Back to business....&lt;/P&gt;
&lt;P&gt;The first 'landmark' identifies the header line of the data. I chose the test to be "a line starting with&amp;nbsp;&lt;CODE&gt;&amp;lt;&amp;gt;&lt;/CODE&gt;&amp;nbsp;is the header line" based on the sample data in the question.&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;&amp;lt;&amp;gt;Date Time Point_1 Point_2 Point_3 Point_4 Point_5 Point_6 Point_7 Point_8&lt;/LI-CODE&gt;
&lt;P&gt;The next line after the header line should be data.&lt;/P&gt;
&lt;P&gt;If your test for the header-line landmark is incorrect, the program will never reach the second INPUT statement, and you will get 0 observations.&lt;/P&gt;
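&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;A quick way to verify what the landmark text actually looks like is to dump the first raw lines of one file to the log and eyeball them. This is a minimal sketch; the file name is only a placeholder for one of your files:&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;/* print the first 20 raw lines so the exact landmark text is visible */
data _null_;
  infile "C:\Users\SAS\Box\DATA_RENO\one_file.csv" obs=20;
  input;                 /* load the line into the _INFILE_ buffer */
  put _n_= _infile_;     /* write the line number and raw content to the log */
run;&lt;/LI-CODE&gt;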
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I chose&lt;CODE&gt;//&lt;/CODE&gt;&amp;nbsp;to be the landmark for end of data, again based on the sample data. Regardless, the input_now_flag is reset whenever a new filename is encountered.&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;12/31/2019 7:25:11 72.25 0.05 26.81 73.62 -0.09 -0.06 No Data No Data
////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Tested example (900 data rows written and 900 read in) :&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;data _null_;
  do i = 1 to 9;
    out_filename = cats("C:\temp\",i,".csv");
    file multiple filevar=out_filename;
    do j = 1 to 15;
      put 'header detail' j;
    end;
    put '&amp;lt;&amp;gt;a,b,c';
    do z = 1 to 100;
      a = i * 100 + z-1;
      b = a**2;
      c = sqrt(a);
      put a ',' b ',' c;
    end;
    put '/////////';
    put '999,999,999, i dare you to ignore me';
  end;
run;

proc format ;
  invalue  nodata
    'TIME (-3)'=.t
    'No Data'=.d 
     other=_same_
  ; 
run;

filename ALLCSV pipe "powershell -c ""cat c:\temp\?.csv""";

data want;
  length source_filename $200 in_filename $200 ;

  infile 'c:\temp\?.csv' filename=in_filename dsd missover ;

/*  infile ALLCSV dsd missover;*/

  retain input_now_flag 0 source_filename;

  if lag (in_filename) ne in_filename then do;    /* start of reading from next file */
    source_filename = in_filename;
    input_now_flag = 0;
  end;

  input @;

  if _infile_ =: '&amp;lt;&amp;gt;' then do;    /* landmark for subsequent rows are data */
    input_now_flag = 1;
    delete;
  end;

  if _infile_ =: '//' then do;  /* landmark for end of data, ignore anything that might be after it */
    input_now_flag = 0;
    delete;
  end;

  if not input_now_flag then delete;   /* have not reached the data landmark yet */

  /* presume all *.csv files have the same column structure and order */
  input @1 a b c; *date :mmddyy. time :time. (point1-point8) (:nodata.);

  drop input_now_flag;
run;&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 25 May 2020 17:09:00 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650499#M195050</guid>
      <dc:creator>RichardDeVen</dc:creator>
      <dc:date>2020-05-25T17:09:00Z</dc:date>
    </item>
    <item>
      <title>Re: How to import many complex .csv files (1)</title>
      <link>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650533#M195072</link>
      <description>&lt;P&gt;Hi Richard,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for the reply and clarifications, I understand the code much better now. &amp;nbsp;&lt;/P&gt;&lt;P&gt;With help from you and Tom, the data have been imported and connected and I was able to generate our first summary report today. &amp;nbsp;&lt;/P&gt;&lt;P&gt;Which is great to have gotten done.&lt;/P&gt;&lt;P&gt;Many thanks for the detailed recommendations and collective patience,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Robert&lt;/P&gt;</description>
      <pubDate>Mon, 25 May 2020 21:17:43 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/How-to-import-many-complex-csv-files-1/m-p/650533#M195072</guid>
      <dc:creator>rmacarthur</dc:creator>
      <dc:date>2020-05-25T21:17:43Z</dc:date>
    </item>
  </channel>
</rss>

