05-31-2013 07:36 PM
I have been working with tech support on an issue where the new set operator transform keeps losing its reference to the input datasets. They have been really great in trying to find the cause, but it is not an easy problem to reproduce. I suspect it might be peculiar to my setup. If anyone else has had any issues, can you respond?
Other issues we have encountered, which again seem hard to track down, include:
- every now and then, the save option becomes disabled, meaning you can't save your work.
- transforms appear (popping up in the top left-hand corner) that you never added.
06-01-2013 08:52 AM
I've had the same issues and it's very annoying.
Below is a way to replicate it.
1. Create a simple job using a set operator.
2. Disconnect table "Class_2"
- In real life we might have to modify something, e.g. remove or replace a table. But what has now happened to "Class"? Why was it removed as well?
3. Reconnect table "Class_2".
- Class is still "gone" and the SQL Set as initially set up remains "corrupted".
And even worse: it doesn't help to delete the connection of "Class" and reconnect the table; it still doesn't appear. I've had to rebuild a SQL set more than once from scratch.
The sad thing: I've built this demo with the newest DIS 4.6 version and the issue is still there.
06-02-2013 02:41 AM
I have given up using this set operator and rely on the append instead. It's a shame, because this is a really good operator (in theory).
Hopefully, this information will help SAS troubleshoot - tech support, over to you.
06-02-2013 06:52 AM
Make sure you send the use case to replicate the issue to tech support.
I believe that if it's simply about appending data, then Proc Append is generally more efficient. Where, for me, a SQL Set adds its real value is in combining and de-duplicating rows in one go (a "UNION" without "ALL").
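To illustrate the difference, here is a minimal hand-coded sketch. The table names work.class and work.class_2 follow the demo above; work.all_classes and work.distinct_classes are made up for the example.

```sas
/* Plain append: fast, keeps duplicates (behaves like UNION ALL). */
proc append base=work.all_classes data=work.class_2;
run;

/* SQL set: combine and de-duplicate in one go (UNION without ALL). */
proc sql;
    create table work.distinct_classes as
        select * from work.class
        union                    /* removes duplicate rows */
        select * from work.class_2;
quit;
```

The UNION without ALL implies a de-duplication pass over the combined rows, which is exactly why PROC APPEND is the cheaper choice when duplicates don't matter.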
06-02-2013 08:13 AM
It is, however, better to combine other operators with the append rather than continuing to use the set operator. I have lost days of work having to re-create the logic lost to this set operator. Archiving doesn't help: after re-creating a job and archiving it, I found a month later that the job had broken again, and importing the archive made no difference. I have now recreated most of these lost jobs without the set operator and I'm not going back. Hopefully with the above info tech support can fix it, but they have had no luck thus far.
It seems that, for our environment at least, this is just one of many bugs with DI Studio and the metadata environment. I recommend that others contribute to this post with their issues and contact tech support rather than just putting up with it, so that SAS can resolve them. The information you have provided, for instance, shows that this is a more widespread issue, and you have provided a way to replicate it.
06-02-2013 09:00 AM
You should always state the exact SAS / DIS version where you encounter an issue.
What other issues are you aware of? I'd be really interested to learn about them.
I suggest you start a new thread for the issues you know about so we can discuss whether each one is a usage problem or a bug, and raise it with SAS Tech Support if it is a bug. Knowing about bugs would also help all of us implement in a way that works around them.
I, for example, am less than impressed with the new SCD Type 1 loader. At first I thought it was great, because it finally gives me an option to insert/update a table and generate a key at the same time, but the code generated is very inefficient and only acceptable for low-volume data. And then I would ask why we need a specific SCD Type 1 loader in the first place. It appears it's only needed because the Type 2 loader (which is actually a hybrid loader for Type 1 and Type 2) isn't able to load all columns as Type 1 only. Why not amend the Type 2 loader instead of throwing a separate Type 1 loader at us? Or give us an option to generate a key using the normal table loader.
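For low to medium volumes, a Type 1 upsert with key generation is not hard to hand-code either. A rough sketch of the idea (all table and column names here - dim_customer, stage_customer, customer_sk, cust_id - are hypothetical, not what any loader generates):

```sas
/* Type 1: overwrite changed attributes in place. */
proc sql noprint;
    update dim_customer as d
        set name = (select s.name from stage_customer as s
                    where s.cust_id = d.cust_id)
        where d.cust_id in (select cust_id from stage_customer);

    /* Rows not yet in the dimension. */
    create table new_rows as
        select * from stage_customer
        where cust_id not in (select cust_id from dim_customer);

    /* Current maximum surrogate key. */
    select coalesce(max(customer_sk), 0) into :max_sk
        from dim_customer;
quit;

/* Generate keys continuing from &max_sk, then append. */
data new_rows;
    set new_rows;
    customer_sk = &max_sk + _n_;
run;

proc append base=dim_customer data=new_rows force;
run;
```

Correlated updates like the one above can themselves be slow on large tables, so this is only a starting point, but at least the code is visible and tunable.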
06-07-2013 07:21 PM
I am using DI Studio 4.6.
The issues are outlined in the original post. My main issue, for which no solution has been found, is that every now and then DI Studio loses the ability to save work; I would be interested to know if anyone else has had that issue. There are also random metadata corruptions, which are hard to reproduce and harder to resolve.
I find that the SCD Type transforms are great in theory, but when working with external DBMSs (we work with SQL Server) they are only useful for small tables, because the processing is not pushed down into the database. In general, given the emphasis on Data Integration, it would be helpful if transforms such as these were less half-baked and better optimised for in-database processing.
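One workaround when generated code pulls everything into SAS is explicit SQL pass-through, so the DBMS does the work itself. A sketch assuming a SAS/ACCESS connection to SQL Server (the engine name may be sqlsvr or odbc depending on which SAS/ACCESS product is installed, and the DSN and table names are placeholders for your own setup):

```sas
proc sql;
    /* Connection options are placeholders. */
    connect to sqlsvr as db (datasrc="MyDSN");

    /* Runs entirely inside SQL Server; no rows are pulled into SAS. */
    execute (
        merge dbo.dim_customer as d
        using dbo.stage_customer as s
            on d.cust_id = s.cust_id
        when matched then
            update set d.name = s.name
        when not matched then
            insert (cust_id, name) values (s.cust_id, s.name);
    ) by db;

    disconnect from db;
quit;
```

The trade-off is that you lose the transform's metadata lineage, since the logic lives in pass-through T-SQL rather than in mapped columns.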