I am struggling to pull data from a website into a table. Here is the code I have so far, but with this method I have to parse the data (and I am not exactly sure how). I have also seen approaches using the XML Mapper, but my mapper references keep failing (libname sec xmlv2 xmlfileref= test xmlmap=tempmap automap=replace) with the error: "The creation of the XML Mapper file failed."
Any thoughts are appreciated. Thanks.
/* temporary file to hold the raw HTML response */
filename src temp;

/* download the page */
proc http
  method="GET"
  url="http://www.barrons.com/public/page/majormarket-nasdaqnational-A.html"
  out=src;
run;

/* read the response line by line, dropping blank lines */
data rep;
  infile src length=len lrecl=32767;
  input line $varying32767. len;
  line = strip(line);
  if len > 0;
run;
The page is in HTML, so I fear you'll have to parse the result.
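As a starting point for the parsing, something along these lines might work: keep only the lines that contain table cells and strip the tags with PRXCHANGE. This is an untested sketch; the `<td` filter and the dataset/variable names are assumptions, so adjust them to the page's actual markup.

```
/* sketch: extract table-cell text from the downloaded HTML (dataset REP from the question) */
data parsed;
  set rep;
  /* assumption: the quotes live in <td> cells */
  if index(line, '<td') then do;
    /* replace every HTML tag with a blank, leaving the cell text */
    text = prxchange('s/<[^>]+>/ /', -1, line);
    text = compbl(strip(text));
    if text ne '' then output;
  end;
  keep text;
run;
```

From there you would still need to split `text` into the individual columns (e.g. with SCAN), depending on how the cells line up.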
On the positive side: the links to all the different pages (A, B, C, ...) are easily accessible in the source, so once you've got the parsing of the first "A" page working, you should be able to just iterate over all the links using the same HTTP request and parsing code.
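The iteration could look something like the macro below: wrap the request from the question in a macro that takes the letter, then loop A through Z. The URL pattern is taken from the question; the macro names and the assumption that all 26 pages follow that pattern are mine.

```
%macro get_page(letter);
  filename src temp;
  proc http
    method="GET"
    url="http://www.barrons.com/public/page/majormarket-nasdaqnational-&letter..html"
    out=src;
  run;
  /* ... same data step parsing as for page A goes here ... */
%mend get_page;

%macro get_all;
  %local i;
  %do i = 1 %to 26;
    /* byte(65)='A', byte(66)='B', ... */
    %get_page(%sysfunc(byte(%eval(64 + &i))))
  %end;
%mend get_all;

%get_all
```

Note the double dot in `&letter..html`: the first dot terminates the macro variable reference, the second is the literal dot before the extension.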
And just as a thought: if this is a one-off, then it's probably much easier to just copy/paste the tables into a text editor and then use a simple data step with INFILE/INPUT statements to read the regular, tab-delimited data.
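For that copy/paste route, the data step is short. The file path, column names, and informats below are invented for illustration; replace them with whatever your pasted file actually contains.

```
/* sketch: read a tab-delimited file pasted from the browser */
data quotes;
  infile 'c:\temp\barrons_a.txt' dlm='09'x dsd truncover firstobs=2;
  input ticker :$10. name :$60. last :comma12. change :comma12.;
run;
```

`'09'x` is the tab character, TRUNCOVER keeps short records from wrapping, and FIRSTOBS=2 skips a pasted header row.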