<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Reading-in New Data Everyday in Developers</title>
    <link>https://communities.sas.com/t5/Developers/Reading-in-New-Data-Everyday/m-p/218444#M4752</link>
    <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;You could process each text file into its own SAS dataset and then rename or move that txt file to another location (keeping a copy as a fallback).&lt;BR /&gt;There is no need for an append approach in SAS; appending is only necessary in an RDBMS environment, where it is the only option.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;SAS is different: you can process a whole list of datasets with a DATA step or a view; see &lt;A href="http://support.sas.com/documentation/cdl/en/lestmtsref/67407/HTML/default/viewer.htm#p00hxg3x8lwivcn1f0e9axziw57y.htm" title="http://support.sas.com/documentation/cdl/en/lestmtsref/67407/HTML/default/viewer.htm#p00hxg3x8lwivcn1f0e9axziw57y.htm"&gt;SAS(R) 9.4 Statements: Reference, Third Edition&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;Because the SET statement accepts a dataset list (one member per daily txt file), you can select the period you need simply by defining that list.&lt;BR /&gt;Dropping outdated data is just a matter of deleting those isolated datasets.&lt;BR /&gt;No processing is needed for concatenation (appending) or dropping, and each run accesses only the data for the needed period, nothing else.&lt;/P&gt;&lt;P&gt;It can't get easier or simpler than that, free of all the RDBMS OLTP restrictions.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
    <pubDate>Sat, 23 May 2015 11:59:44 GMT</pubDate>
    <dc:creator>jakarman</dc:creator>
    <dc:date>2015-05-23T11:59:44Z</dc:date>
    <item>
      <title>Reading-in New Data Everyday</title>
      <link>https://communities.sas.com/t5/Developers/Reading-in-New-Data-Everyday/m-p/218441#M4749</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;I am a brand-new user of SAS and I have a couple of questions about reading in my data. Every day a new .txt or .csv file is created and placed in a folder for me to read in. The name is usually the date the file was created, in mm-dd-yy form (e.g. 5-22-15.txt). There is a lot of data in each file. I would really like to write a SAS program that pulls in each day's data and adds it to the prior days' data. I could then delete the individual text file each day, because its contents would already be in my main SAS file. So, on to my questions:&lt;/P&gt;&lt;P&gt;1) Is there any way to place the .txt in the temporary library (WORK) using the LIBNAME statement, so that SAS deletes it automatically once I am done with the session, or would deleting the actual .txt always be my responsibility?&lt;/P&gt;&lt;P&gt;2) Is there a way to set up an indexing counter that would always read in the text file with today's date, or a function that pulls in the newest text file?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks for the help!!&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 22 May 2015 16:18:12 GMT</pubDate>
      <guid>https://communities.sas.com/t5/Developers/Reading-in-New-Data-Everyday/m-p/218441#M4749</guid>
      <dc:creator>pswork</dc:creator>
      <dc:date>2015-05-22T16:18:12Z</dc:date>
    </item>
    <item>
      <title>Re: Reading-in New Data Everyday</title>
      <link>https://communities.sas.com/t5/Developers/Reading-in-New-Data-Everyday/m-p/218442#M4750</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;1) I'm very cautious about any program that deletes files unconditionally. You never know when someone changes the layout or contents, and you can spend a lot of time trying to get a conditional delete to work properly. I would be very tempted to have a separate job, run manually, to clean things up. There is a function, FDELETE, for deleting external (non-SAS) files.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;2) One way is to get the current date, formatted appropriately, into a macro variable and add that to the file name:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;%let dstr = %sysfunc(putn(%sysfunc(today()),mmddyyd8.));&lt;/P&gt;&lt;P&gt;and use "c:\folder\otherfolder\Filename&amp;amp;dstr..txt" as needed. NOTE: the quotes must be double quotes (") for the macro variable to resolve, and there are two periods because the first one terminates the macro variable reference.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Experiment with %put "c:\folder\otherfolder\Filename&amp;amp;dstr..txt", checking the LOG for the resolved value to make sure it looks right. And remember that on some operating systems the filename and path are case sensitive.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 22 May 2015 18:29:41 GMT</pubDate>
      <guid>https://communities.sas.com/t5/Developers/Reading-in-New-Data-Everyday/m-p/218442#M4750</guid>
      <dc:creator>ballardw</dc:creator>
      <dc:date>2015-05-22T18:29:41Z</dc:date>
    </item>
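The macro-variable approach in the reply above can be sketched end to end. This is a minimal, untested outline: the path c:\folder\otherfolder, the libref perm, and the columns id and amount are hypothetical placeholders to be replaced with the real layout. Note also that mmddyyd8. produces leading zeros (05-22-15), so if the real files are named like 5-22-15.txt the macro value would need adjusting.

```sas
/* Build today's date as mm-dd-yy, matching names like 05-22-15.txt */
%let dstr = %sysfunc(putn(%sysfunc(today()), mmddyyd8.));
%put Will read: c:\folder\otherfolder\&dstr..txt;

/* Read today's file into WORK; WORK datasets are deleted at session end */
data work.today;
    infile "c:\folder\otherfolder\&dstr..txt" dlm=',' dsd;
    input id amount;                /* hypothetical layout */
run;

/* Add today's rows to a permanent cumulative dataset */
proc append base=perm.alldata data=work.today;
run;
```

Putting the raw .txt itself "into WORK" is not possible (a libref holds SAS datasets, not external files), but reading it into a WORK dataset as above gives the same effect of automatic cleanup at session end.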
    <item>
      <title>Re: Reading-in New Data Everyday</title>
      <link>https://communities.sas.com/t5/Developers/Reading-in-New-Data-Everyday/m-p/218443#M4751</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;I agree that deleting things is not such a good idea. Copy the file to a backup location before deleting it, or make sure there's an IT person who keeps a backup... though I don't usually trust that.&amp;nbsp; I like having my own backup.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Another tip that seems obvious but comes up frequently: if you run a program with an append statement in it more than once, you double the data for that day/week/month/interval.&lt;/P&gt;&lt;P&gt;It seems so obvious, but I deal with end users who do it often.&amp;nbsp; One user does it so often that I added a timestamp, down to the minute, to make it easier to go in and delete duplicate iterations.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 22 May 2015 19:12:05 GMT</pubDate>
      <guid>https://communities.sas.com/t5/Developers/Reading-in-New-Data-Everyday/m-p/218443#M4751</guid>
      <dc:creator>Steelers_In_DC</dc:creator>
      <dc:date>2015-05-22T19:12:05Z</dc:date>
    </item>
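One way to guard against the double-append problem described above is to check whether today's data is already present before appending. A hedged sketch, assuming the cumulative dataset perm.alldata carries a load_date variable (all names here are hypothetical):

```sas
/* Count rows already loaded for today */
proc sql noprint;
    select count(*) into :already trimmed
    from perm.alldata
    where load_date = today();
quit;

/* Append only when nothing for today has been loaded yet */
%macro safe_append;
    %if &already = 0 %then %do;
        proc append base=perm.alldata data=work.today;
        run;
    %end;
    %else %put NOTE: Data for today already loaded - append skipped.;
%mend safe_append;
%safe_append
```

The %IF logic must live inside a macro, hence the wrapper; running the program twice in one day then becomes harmless.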
    <item>
      <title>Re: Reading-in New Data Everyday</title>
      <link>https://communities.sas.com/t5/Developers/Reading-in-New-Data-Everyday/m-p/218444#M4752</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;You could process each text file into its own SAS dataset and then rename or move that txt file to another location (keeping a copy as a fallback).&lt;BR /&gt;There is no need for an append approach in SAS; appending is only necessary in an RDBMS environment, where it is the only option.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;SAS is different: you can process a whole list of datasets with a DATA step or a view; see &lt;A href="http://support.sas.com/documentation/cdl/en/lestmtsref/67407/HTML/default/viewer.htm#p00hxg3x8lwivcn1f0e9axziw57y.htm" title="http://support.sas.com/documentation/cdl/en/lestmtsref/67407/HTML/default/viewer.htm#p00hxg3x8lwivcn1f0e9axziw57y.htm"&gt;SAS(R) 9.4 Statements: Reference, Third Edition&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;Because the SET statement accepts a dataset list (one member per daily txt file), you can select the period you need simply by defining that list.&lt;BR /&gt;Dropping outdated data is just a matter of deleting those isolated datasets.&lt;BR /&gt;No processing is needed for concatenation (appending) or dropping, and each run accesses only the data for the needed period, nothing else.&lt;/P&gt;&lt;P&gt;It can't get easier or simpler than that, free of all the RDBMS OLTP restrictions.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Sat, 23 May 2015 11:59:44 GMT</pubDate>
      <guid>https://communities.sas.com/t5/Developers/Reading-in-New-Data-Everyday/m-p/218444#M4752</guid>
      <dc:creator>jakarman</dc:creator>
      <dc:date>2015-05-23T11:59:44Z</dc:date>
    </item>
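The one-dataset-per-day scheme above can be sketched as follows. The libref daily, the yyyymmdd naming convention, and the input layout are assumptions for illustration (the thread's actual files use mm-dd-yy names):

```sas
/* Store each daily file as its own dataset, e.g. daily.day_20150522 */
%let dkey = %sysfunc(putn(%sysfunc(today()), yymmddn8.));

data daily.day_&dkey;
    infile "c:\folder\otherfolder\&dkey..txt" dlm=',' dsd;
    input id amount;          /* hypothetical layout */
run;

/* A view concatenating every day_* member via the colon name-prefix list;
   dropping outdated data is just deleting old day_* datasets */
data daily.all_days / view=daily.all_days;
    set daily.day_: ;
run;
```

Reading daily.all_days should then pick up whichever day_* members currently exist, so the covered period changes simply by adding or deleting daily datasets, with no append step at all.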
  </channel>
</rss>

