nikhil_khanolkar
Calcite | Level 5
Member since 11-24-2014
- 31 Posts
- 4 Likes Given
- 0 Solutions
- 0 Likes Received
Latest posts by nikhil_khanolkar
Subject | Views | Posted
Changing Table metadata of LASR server table | 1198 | 01-27-2016 04:00 AM
Re: Copying data from LASR server to SAS server | 2482 | 01-20-2016 11:17 PM
Re: Copying data from LASR server to SAS server | 2496 | 01-20-2016 12:30 AM
Re: Copying data from LASR server to SAS server | 2509 | 01-19-2016 03:53 AM
Copying data from LASR server to SAS server | 2528 | 01-19-2016 01:38 AM
Re: Reload on Start in LASR | 2539 | 12-22-2015 04:46 AM
Reload on Start in LASR | 2549 | 12-22-2015 12:28 AM
Re: How to insert file in the DB table as a BLOB | 4588 | 10-13-2015 12:01 AM
Re: How to insert file in the DB table as a BLOB | 4594 | 10-12-2015 09:38 AM
How to insert file in the DB table as a BLOB | 4617 | 10-08-2015 12:54 AM
Activity Feed for nikhil_khanolkar
- Posted Changing Table metadata of LASR server table on SAS Data Management. 01-27-2016 04:00 AM
- Posted Re: Copying data from LASR server to SAS server on SAS Data Management. 01-20-2016 11:17 PM
- Posted Re: Copying data from LASR server to SAS server on SAS Data Management. 01-20-2016 12:30 AM
- Posted Re: Copying data from LASR server to SAS server on SAS Data Management. 01-19-2016 03:53 AM
- Posted Copying data from LASR server to SAS server on SAS Data Management. 01-19-2016 01:38 AM
- Posted Re: Reload on Start in LASR on SAS Visual Analytics. 12-22-2015 04:46 AM
- Posted Reload on Start in LASR on SAS Visual Analytics. 12-22-2015 12:28 AM
- Posted Re: How to insert file in the DB table as a BLOB on SAS Programming. 10-13-2015 12:01 AM
- Posted Re: How to insert file in the DB table as a BLOB on SAS Programming. 10-12-2015 09:38 AM
- Liked Re: How to insert file in the DB table as a BLOB for Patrick. 10-12-2015 09:29 AM
- Posted How to insert file in the DB table as a BLOB on SAS Programming. 10-08-2015 12:54 AM
- Posted Re: Best practices to control data Size in LASR on SAS Visual Analytics. 06-30-2015 03:23 AM
- Liked Re: Best practices to control data Size in LASR for Kurt_Bremser. 06-30-2015 01:49 AM
- Posted Re: Best practices to control data Size in LASR on SAS Visual Analytics. 06-29-2015 11:39 PM
- Posted Best practices to control data Size in LASR on SAS Visual Analytics. 06-29-2015 03:15 AM
- Posted Backing up compressed data from LASR to HDFS on SAS Visual Analytics. 06-26-2015 02:18 AM
- Posted Re: renaming LASR server dataset on SAS Visual Analytics. 06-10-2015 06:01 AM
- Posted renaming LASR server dataset on SAS Visual Analytics. 06-10-2015 03:24 AM
- Posted Re: Copying data into HDFS from LASR server on SAS Visual Analytics. 06-10-2015 02:22 AM
- Liked Re: Copying data into HDFS from LASR server for gergely_batho. 06-10-2015 02:18 AM
01-27-2016
04:00 AM
Hi,
Can we alter the table metadata of a LASR in-memory table?
e.g. say, changing the length or type of a column.
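If it is not possible to alter it in place, the fallback I am considering is to pull the table back through the SASIOLA engine, recreate it with the new column attributes, and load it again. A rough, untested sketch (host, port, tag, library paths and the table/column names are placeholders, and it assumes PROC DATASETS DELETE is supported against a SASIOLA libref):

libname lasr sasiola host="lasrhost" port=10010 tag="hps";  /* placeholder LASR connection */
libname stage "/sas/stage";                                 /* co-located SAS library */

/* copy the in-memory table back to disk with the new column length */
data stage.inventory;
   length product_code $ 40;   /* new attribute for the column */
   set lasr.inventory;
run;

/* drop the old in-memory table and load the corrected copy */
proc datasets lib=lasr nolist;
   delete inventory;
quit;

data lasr.inventory;
   set stage.inventory;
run;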
Nikhil
01-20-2016
11:17 PM
Hi Linus,
That was a joke, right? :) If your brain is slow (even sometimes) in the SAS DWH/BI area, then god knows whether someone like me even has a brain. ;)
Your understanding is correct on the 3 bullet points you mentioned. Just one correction: we are not using Data Builder; we are using SAS DI/ETL for this processing.
In a small POC we noticed that the star schema approach hurt end-user performance, so we did not use it.
We are using the built-in record-level authorization within VA. That is why the data size in LASR has increased: we ended up storing one record per user, based on the access rights.
One question: I am assuming that by co-located storage you mean a Base SAS location? If yes, then we are planning to store/back up the LASR data to this location, and we are looking for suggestions on the most efficient way to do this.
Or am I missing your point here? 🙂
Nikhil
01-20-2016
12:30 AM
Hi,
Our master data store is SAS. On this data, account-level user authorization information is processed and the data is hosted in LASR. Row-level security is applied in LASR, to be used by SAS VA for the dashboards. The LASR updates we mentioned are about changes in the access/authorization information.
After the user authorization information is applied, the size of the data increases sharply, since we end up storing one record per user. As mentioned earlier, because of the infrastructure challenges we opted for this routine, and unfortunately we cannot use the out-of-the-box synchronization unless this data is copied back to the physical location in SAS.
Hence we are looking for suggestions on the most efficient way to do this. We have not tried a DATA step yet, so we are not sure about its performance.
Nikhil
01-19-2016
03:53 AM
Hi Linus,
We are trying to leverage the reload-at-start functionality, and hence want to copy/back up data from LASR to the SAS server (i.e. co-located storage), so it can be loaded back into LASR memory when the server restarts.
A few facts for your reference:
a) The LASR table is updated multiple times a day with the security/authorization rules.
b) Because of the infrastructure challenges, we decided at the start of the project to take the Data Server --> SAS ETL --> Upload to LASR flow.
c) Now we are close to completion, and re-designing the solution would involve a lot of effort, so we plan to back up the data from LASR to the SAS server (i.e. co-located storage) to leverage reload at start.
Using a DATA step might be slower.
And FTP would not work, since we are copying data from memory to a physical server, I suppose?
Thanks,
Nikhil
01-19-2016
01:38 AM
Hi,
As part of the requirement, we need to copy tables/data from the LASR server (in-memory data) to the SAS server and store them as physical tables. We need to do this multiple times a day, and the data size is around 900 GB.
What is the most efficient way to perform this operation?
Any suggestions..?
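For reference, the baseline approach I am aware of is a plain DATA step through the SASIOLA engine, roughly as below (host, port, tag and the library path are placeholders); I suspect this may be too slow at 900 GB, hence the question:

libname lasr sasiola host="lasrhost" port=10010 tag="hps";  /* placeholder LASR connection */
libname backup "/sas/lasr_backup";                          /* co-located physical SAS library */

/* write the in-memory table out as a physical SAS data set */
data backup.inventory_mon_ds;
   set lasr.inventory_mon_ds;
run;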
Thanks,
Nikhil
12-22-2015
04:46 AM
Hi Snehasis,
Thanks for the reply.
I am trying to use the reload-at-start functionality that comes out of the box. Basically, we want to reload the data in memory from a backup copy of the data in Hadoop, dynamically, every time the LASR server restarts.
Nikhil
12-22-2015
12:28 AM
Hi,
In our project we are creating and updating a table in LASR by processing/joining datasets on the SAS server. After creating the data in LASR, we copy it into HDFS. This scheduled process executes multiple times during the day.
Would the "reload on start" functionality work in this case? We need to reload all datasets into LASR memory every time the LASR server starts.
Thanks,
Nikhil
10-13-2015
12:01 AM
Hi Patrick,
That approach is being considered, where SQL reads the file from a location and then inserts it into the table. But that would involve creating an area that is accessible to both SQL Server and SAS. We will take that path if no other approach works. 🙂
Can you please advise on why the idea discussed wouldn't work? Just for my understanding.
Is it because the concept of storing a file as a binary object in memory is not native to SAS, among other reasons?
Regards,
Nikhil
10-12-2015
09:38 AM
Hi Patrick,
Thanks a lot for your thoughts. You are right, the database server does not have access to the file location, and there are challenges in providing access because of security concerns. So it would take a while before we have the required access in place.
I was thinking about a workaround where we
- store the .CSV file in SAS server memory as a binary object variable,
- and then stream this object from SAS memory into the SQL table column as a BLOB.
Is this feasible to do in SAS?
Please share your thoughts/suggestions.
Thanks in advance,
Nikhil
10-08-2015
12:54 AM
Hello,
We are trying to insert a .CSV file as a BLOB object into a table in MS SQL Server.
A) SAS is installed on a UNIX server, and we are using the SQL pass-through facility to insert the file into the database table.
B) We are facing an error while doing this task.
ERROR: CLI prepare error: [SAS/ACCESS][ODBC 20101 driver][Microsoft SQL Server]Cannot bulk load because the file "/opt/sasinside/SASDATA/EXPORT_LOGS/TESTEXPORT.csv" could not be opened. Operating system error code 3(failed to retrieve text for this error. Reason: 15105).
It seems the system is not able to find the specified path (error code 3?).
To insert a file as a BLOB into the database, is it necessary to store the file on the same server/location as the database? Is there any workaround for this task?
Code Used:
proc sql;
   connect to sqlsvr as myconn (datasrc="MSSQLDSN" user=WinProfileSSRReader password='xxxxxxxxxxxxxxxxxx');
   select * from connection to myconn (
      DECLARE @file AS VARBINARY(MAX);
      SELECT @file = CAST(bulkcolumn AS VARBINARY(MAX))
        FROM OPENROWSET( BULK '/opt/sasinside/SASDATA/EXPORT_LOGS/TESTEXPORT.csv', SINGLE_BLOB ) AS x;
      /*insert into WinProfileRefUser.SSR_REPORTS (id,tnuid,FILE)*/
      /* values (NEWID(),'CCCCCC',@file)*/
   );
   disconnect from myconn;
quit;
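One variation I am wondering about is sending the whole batch through EXECUTE ... BY instead of SELECT * FROM CONNECTION TO, and pointing OPENROWSET at a path that the SQL Server machine itself can reach. A rough, untested sketch (the UNC share below is only a placeholder; the rest mirrors the code above):

proc sql;
   connect to sqlsvr as myconn (datasrc="MSSQLDSN" user=WinProfileSSRReader password='xxxxxxxxxxxxxxxxxx');
   execute (
      DECLARE @file AS VARBINARY(MAX);
      SELECT @file = CAST(bulkcolumn AS VARBINARY(MAX))
        FROM OPENROWSET( BULK '\\sas-server\EXPORT_LOGS\TESTEXPORT.csv', SINGLE_BLOB ) AS x;  /* path as seen from SQL Server */
      INSERT INTO WinProfileRefUser.SSR_REPORTS (id, tnuid, FILE)
        VALUES (NEWID(), 'CCCCCC', @file);
   ) by myconn;
   disconnect from myconn;
quit;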
Thanks in Advance, Nikhil
06-30-2015
03:23 AM
Hi Linus,
The LASR server reference guide mentions that "Compression exchanges less memory use for more CPU use. It slows down any request that processes the data." Having said that, you are right that decompression during execution slows down performance, since every record is decompressed while the request runs; and with the compression ratio at hand, it is worth looking closely at the data modelling to reduce the size and hence improve performance. Thanks a lot for your input.
SAS(R) LASR(TM) Analytic Server 2.4: Reference Guide
06-29-2015
11:39 PM
Hi Linus,
Please find below the compression info log of the LASR table for your reference.

The IMSTAT Procedure
Compression Information
Data Source        HPS.INVENTORY_MON_DS
Table              HPS.INVENTORY_MON_DS
Size               6.8e+02GB
Compressed Size    11GB
Compression Ratio  65

Thanks,
Nikhil
06-29-2015
03:15 AM
Hi,
We have a compressed table in the LASR server with a size of 11 GB. The table has 160 million records and 33 variables, around 25 of which are character variables. Since the dashboard consuming this table was giving slow performance, we uncompressed the table, and its size rose from 11 GB to 680 GB. The compression ratio reported for this table in LASR is 65, and indeed the uncompressed data is about 61 times the size of the compressed data. To improve the dashboard performance, the data needs to be stored in uncompressed form. Are there any techniques/best practices that can be used to control the data size? One idea we are considering is trimming the character variables to their actual maximum lengths before loading; a sketch follows below.
Thanks,
Nikhil
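A rough, untested sketch of that idea, run against the source table before loading it into LASR (the variable names and lengths are placeholders; the point is to shrink the roughly 25 character variables to their real maximum widths):

/* find the longest actual value of each character variable */
proc sql;
   select max(length(cust_name)) as max_cust_name,
          max(length(prod_desc)) as max_prod_desc
   from work.inventory_mon_ds;
quit;

/* recreate the table with trimmed character lengths before loading it into LASR */
data work.inventory_trimmed;
   length cust_name $ 30 prod_desc $ 60;   /* set to the observed maximums */
   set work.inventory_mon_ds;
run;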
06-26-2015
02:18 AM
Hi,
I am copying compressed data from the LASR server to Hadoop. The data is successfully getting created in HDFS, but it is being written uncompressed. I am using the code below to perform this task. The data in LASR is already compressed. Is there any option that can be used below to keep the data compressed while copying from LASR to HDFS?

proc imstat data=valibla.&secdat.;
   save path="&hdfspath" replace;
run;
quit;

Thanks,
Nikhil
06-10-2015
06:01 AM
Hi Abhishek,
Thanks for your response. We need to rename the dataset dynamically using SAS code; a rough sketch of the workaround we are considering is below.
Nikhil
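The rough workaround we are considering is to copy the table under the new name and drop the old one, driven by macro variables. An untested sketch (host, port, tag and table names are placeholders, and it assumes PROC DATASETS DELETE is supported against a SASIOLA libref):

libname lasr sasiola host="lasrhost" port=10010 tag="hps";  /* placeholder LASR connection */

%let oldname = inventory_mon_ds;
%let newname = inventory_mon_ds_v2;

/* copy the in-memory table under the new name */
data lasr.&newname.;
   set lasr.&oldname.;
run;

/* remove the old in-memory table */
proc datasets lib=lasr nolist;
   delete &oldname.;
quit;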