Other than his comment about DDE (i.e., not using something simply because of others' opinions, especially when we don't all agree), I agree with Peter's comments. However, if these are data that you scraped from the web, why are you adding the extra layer of complexity by first putting the data into Excel files?
Using the same techniques that Peter suggested, you could download the files directly and not risk introducing the whole range of import and conversion problems that can result from trying to interface with Excel.
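For instance, a minimal sketch of reading a web CSV directly in SAS with the URL access method, so Excel never enters the picture. The address, file layout, and informats here are made-up placeholders, not the original poster's actual data:

```sas
/* Sketch only: pull a CSV straight from the web, bypassing Excel.
   The URL and column layout are hypothetical. */
filename src url "http://example.com/rates.csv";

data rates;
  infile src dsd firstobs=2 truncover;  /* comma-delimited, skip header */
  input code :$8. date :yymmdd10. value;
  format date yymmdd10.;
run;
```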
Art
---------
> smilingmelbourne
>
> Sounds like you should read and "transpose" your input
> in the first data step that touches your data. That
> isn't supplied "out-of-the-box".
> For your data source, a libname statement pointing at
> the workbook could simplify access to the data, but
> you will need to ensure that the first 8 rows of each
> sheet make the data type of each column clear.
> Although you could read Excel directly through DDE,
> most experts deprecate that technology.
> If your data are CSV-type, then a simple data step
> allows you to achieve directly what you need.
> Because you would "transpose" as you read data, you
> won't need to define much more than "code", "date"
> and "value" as output columns.
> If you want a special flag for those "$$ERRORS"
> cells, that is not much more work either.
> Let the INFILE and INPUT statements do the hard work.
> Even if your data stream is created with 100,000
> columns (highly unlikely), the program becomes no
> larger.
>
> But of course, you say you don't want (/need?) to
> learn INFILE/INPUT (that is, how to make your task
> simpler with a little learning).
> Still, I would suggest your solution will become more
> manageable once you understand these things:
> * trailing @
> * output
> * do loop
> More examples and discussion can be found in
>
> http://support.sas.com/documentation/cdl/en/basess/58133/HTML/default/a001302699.htm
>
> peterC
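To make Peter's suggestions concrete, here is a hedged sketch of the transpose-as-you-read data step he describes, combining trailing @, a do loop, and OUTPUT, plus a flag for the "$$ERRORS" cells. The file name, series codes, and layout (one date followed by one value per series on each line) are assumptions for illustration only:

```sas
/* Sketch only: read a wide CSV and write it out long (code/date/value),
   one row per cell.  File name, codes, and layout are hypothetical. */
data long;
  infile 'rates.csv' dsd firstobs=2 truncover;
  length cell $20 code $8;
  array codes{3} $ 8 _temporary_ ('AUD' 'EUR' 'JPY');

  input date :yymmdd10. @;            /* trailing @ holds the line */
  do i = 1 to dim(codes);
    input cell :$20. @;               /* next field, read as text */
    code    = codes{i};
    errflag = (cell = '$$ERRORS');    /* flag the bad cells */
    value   = input(cell, ?? best32.);/* ?? suppresses conversion notes */
    output;                           /* one long row per code/date */
  end;

  format date yymmdd10.;
  drop i cell;
run;
```

Because the line is held with the trailing @ only until the end of the data step iteration, each pass through the loop reads the next field from the same record, and the next iteration of the step moves on to a fresh line.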