In some cases we are forced to use Excel to maintain business rules, specifically lookup rules for data coming from different systems.
When importing an Excel spreadsheet we have not found any way of setting the lengths of the variables in the resulting data set. In DI Studio the lengths from metadata are not automatically applied to the output tables in all transformations.
A rudimentary fix is to set lengths you think are long enough, import the data into a temporary table, and then create the &_output1 table with the lengths duplicated from the mapping. You can also run PROC CONTENTS on the temporary table to see the actual lengths and compare them against the lengths in the mapping.
This is cumbersome. Does anybody have a more elegant solution?
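A minimal sketch of the workaround described above, assuming a hypothetical file path and hypothetical variable names (rule_code, rule_desc); the lengths in the LENGTH statement stand in for whatever your DI Studio mapping specifies:

```sas
/* Import into a temporary table first; path and names are placeholders */
proc import datafile="/data/rules.xlsx"
    out=work.temp dbms=xlsx replace;
    getnames=yes;
run;

/* Inspect the lengths PROC IMPORT guessed, to compare against the mapping */
proc contents data=work.temp;
run;

/* Re-create the output table with the lengths taken from the mapping */
data &_output1;
    length rule_code $20 rule_desc $200;  /* lengths copied from metadata */
    set work.temp;
run;
```

Because the LENGTH statement precedes the SET statement, the explicit lengths win over whatever PROC IMPORT guessed.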
Indeed, there are always constraints, but these should not stop you from pushing for an ideal scenario and then compromising. For instance, if a person uses Excel and will not change, that does not stop me from taking the file, saving it to a format I want, and then using that copy rather than the original. It adds another less-than-ideal step into the equation, but is perfectly fine.
Another point: push-back should always be tried. Don't just say you don't want X format; present the other cost factors. The ones I tend to use are financial and timeline impacts, e.g.: we can use the given format, but as it provides an unstable base for our process, we would need to increase the budget assigned to handle it by £xyz, and we would have to quote reduced uptime for the process because we cannot rely on the format.
Both of the above are perfectly true: a file that changes, or is not as expected, will generate more work for you and delay the process. This should be made known to the customer, and if they accept it, fine. It is then known and can be recorded in the project documentation.
Eradicate Excel. This is the simple answer; otherwise, not really. Excel is a really poor data medium: any operation on it is going to be less stable and less predictable than using a good data medium. I also expect you are using PROC IMPORT to get the data, which adds another layer of instability and unpredictability, because it guesses the best format for the data.
There are some ways you could force the Excel file to be more compliant (data validation, cell formatting, locking cells against changes, etc.). However, the reason so many people love Excel is its extremely flexible front end, and the reason programmers and data users dislike it so much is that same flexibility, where programming needs hard, clean lines.
If you really have no other option, your best bet is what you mention: create a template dataset with long lengths, then SET it together with your imported data. Assuming the types of the variables do not change, the longer lengths will be used and programming can go on from there. I suspect, though, that at some point someone will decide to add or delete columns, change a type, etc. This is the flexibility I mentioned, and it will break your programming no matter how much you try to cover everything.
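One way to sidestep PROC IMPORT's guessing entirely is to read the workbook through the XLSX LIBNAME engine and fix the lengths yourself in a DATA step. A sketch, with an assumed path, sheet name, and variable names:

```sas
/* Read the workbook directly; the path and Sheet1 are assumptions */
libname xl xlsx "/data/rules.xlsx";

data work.rules;
    /* Explicit lengths, declared before SET so they take precedence */
    length rule_code $20 rule_desc $200;
    set xl.Sheet1;
run;

libname xl clear;
```

This still depends on the sheet keeping the same columns and types, but it removes one layer of guessing from the pipeline.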
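The template approach can be sketched like this; the variable names and lengths are illustrative placeholders, and work.imported stands for whatever table your import step produced:

```sas
/* Build an empty template whose only job is to fix the variable lengths */
data work.template;
    length rule_code $50 rule_desc $500;  /* deliberately generous lengths */
    stop;  /* create the structure only, no rows */
run;

/* List the template first in the SET statement: the PDV takes each
   variable's attributes from the first dataset that contributes it,
   so the template's lengths win over those of the imported data */
data work.final;
    set work.template work.imported;
run;
```

If a variable's type changes in the source file, this step fails outright, which at least makes the breakage visible early.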
Thanks, yes, I would love not to use Excel for this, but sometimes you are forced to.
Thanks for your comments.
Thanks RW. We are in complete agreement. We already have a set-up using better formats; in the present case there are too many loose ends to get the full modelling done properly.
However, your comments and @LinusH's comment in this thread have been really helpful, and helped me remember some of our old tricks for fixing this. Thanks a lot.
Thanks, yes, I completely agree. The challenge is that we are at a phase in the work where we need flexibility, as we are not yet certain where we need the rules and what kinds of lookup tables are needed. The various lookup tables actually end up in "real" database tables in the end, and the rest of the DI jobs run based on these tables.
Business Rules Manager cannot solve these kinds of challenges. We also have some stability problems with the Office add-in; we have previously used it and FM as a means of editing, but have had some challenges since Windows 10.
Thanks for your comments.
Thanks again, LinusH. The add-in route actually works very well.
I've found Excel to be a useful tool for storing SAS application metadata and reference data. However, our secret is to keep the Excel workbooks in a version control tool that is accessible only to the application administrators. We very rarely have any problems when Excel is managed this way. We don't use DI Studio, though, so maybe "normal" SAS is a bit more forgiving.
For Excel files outside of tight version control, our experience mirrors what others have said in this thread. I personally would prefer not to use them unless there is no alternative, and unfortunately sometimes there isn't.