Architecting, installing and maintaining your SAS environment

Has anyone automated a Metadata sync process between a PROD linux environment and a "cold" bcp linux environment on linux?

Occasional Contributor
Posts: 5

Has anyone automated a Metadata sync process between a PROD linux environment and a "cold" bcp linux environment on linux?

  We have created a number of ExportPackage jobs that run on our PROD server.  The packages are replicated to our BCP server, where we have created a matching set of ImportPackage jobs.  Both servers run Linux.  We would like to use cron to schedule these processes on a weekly basis, but we are not having any luck running ExportPackage/ImportPackage in true background mode.

Script example:

#!/bin/ksh

# Weekly export of access control templates changed month-to-date.
# Credentials are masked; the resulting .spk package is replicated
# to the BCP server for import there.
sasdir=/sas/share/SASPlatformObjectFramework/9.3
packdir=/users/apps/stg/superbid/metadata_migration
saslogdir=/users/apps/stg/superbid/metadata_migration

$sasdir/ExportPackage -host xxxxxxx -port xxxx -user xxxxxxx -password xxxxxx \
    -package $packdir/SASGrid1_ACT_export_package.spk \
    -objects "/System/Security/Access Control Templates(Folder)" \
    -subprop -log $saslogdir/SAS_BATCH_EXPORT_ACT.log -since "Month to date"

Frequent Contributor
Posts: 134

Re: Has anyone automated a Metadata sync process between a PROD linux environment and a "cold" bcp linux environment on linux?

Posted in reply to mmajorza52

The UNIX versions of the SAS metadata command-line utilities, at least up to 9.3, are written in Java using the Spring framework (or so I have been told). As a result, on Linux/UNIX they need a virtual framebuffer such as Xvfb to run in a headless session (e.g. under batch, cron, or nohup).

On a Red Hat Linux machine (RHEL 5) we use Xvfb and this works fine. Install the Xvfb package first.

On a RHEL 6 server with a 9.3 metadata server, I ran a few tests using the xvfb-run command, which also seemed to work (as far as I can remember; it has been quite a while).

#!/bin/sh

# Test metadata batch import/export under xvfb-run

# Syntax : <script> <user> <pwd>

xvfb-run -a /tools/list/sas93/SASPlatformObjectFramework/9.3/ExportPackage \
    -host xxxxxxx -port 8561 -user "$1" -password "$2" -domain ldap \
    -package "/home/xxxxx/package_test.SPK" -objects "/p5/Libraries"

xvfb-run -a /tools/list/sas93/SASPlatformObjectFramework/9.3/ImportPackage \
    -host xxxxxxxxx -port 8561 -user "$1" -password "$2" -domain ldap \
    -package "/home/xxxxxx/package_test.SPK" -target "/" -noexecute
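To schedule this weekly from cron, as the original poster wants, the xvfb-run wrapper script can be called straight from a crontab entry. A minimal sketch, assuming the script is saved as /home/sasadm/meta_sync.sh (the path, schedule, and password handling are all illustrative; a plain-text password should at least live in a file with restricted permissions, rather than in the crontab itself):

```shell
# Illustrative crontab entry (edit with `crontab -e`):
# run the metadata sync every Sunday at 02:00 and append output to a log.
0 2 * * 0 /home/sasadm/meta_sync.sh sasadm "$(cat /home/sasadm/.metapw)" >> /home/sasadm/meta_sync.log 2>&1
```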

Super User
Posts: 3,260

Re: Has anyone automated a Metadata sync process between a PROD linux environment and a "cold" bcp linux environment on linux?

Posted in reply to mmajorza52

If you are doing a complete metadata sync between identical SAS server environments, you can do a file copy of the metadata folders and avoid packaging entirely. We do this for our Windows servers, so I see no reason why it would not work on Linux.
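As a hedged sketch of what such a file copy could look like on Linux (the paths, host name, and directory names below are assumptions, not taken from this thread): wrap rsync in a small sync function, and run it against a consistent metadata backup directory rather than the live repository of a running metadata server, since the metadata server works in memory and the on-disk files may not be consistent.

```shell
#!/bin/sh
# Hedged sketch: replicate a metadata directory from PROD to BCP with rsync.
# All paths and the host name are illustrative assumptions.
# Copy a *consistent* metadata backup, never the live MetadataRepositories
# directory of a running server.

sync_meta() {
    # $1 = source directory, $2 = destination (local dir or rsync remote)
    rsync -az --delete "$1"/ "$2"/
}

# Example call (assumed paths; requires ssh/rsync access to the BCP host):
# sync_meta /sas/config/Lev1/SASMeta/MetadataServer/SASMetaBackup \
#           bcphost:/sas/config/Lev1/SASMeta/MetadataServer/SASMetaBackup
```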

Frequent Contributor
Posts: 134

Re: Has anyone automated a Metadata sync process between a PROD linux environment and a "cold" bcp linux environment on linux?

I agree. This can work, but it also requires that the server hostnames referenced in metadata resolve correctly within each environment, for example via DNS aliases.
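One way to arrange this (an illustrative sketch, not something described in the thread) is a local /etc/hosts entry on the BCP server, so that the PROD hostname recorded in metadata resolves to the BCP machine inside its isolated network. The hostname and address below are placeholders:

```shell
# /etc/hosts on the BCP server (placeholder values):
# the hostname stored in metadata resolves to the local BCP machine,
# so server definitions in metadata work unchanged after failover.
10.1.2.3    sasprod.example.com    sasprod
```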

Trusted Advisor
Posts: 3,215

Re: Has anyone automated a Metadata sync process between a PROD linux environment and a "cold" bcp linux environment on linux?

Posted in reply to mmajorza52

This is a duplicate of https://communities.sas.com/thread/60719

The question touches on high-level IT processes, often driven by mandatory guidelines, and is not purely technical.

BCP and "cold": that is a Disaster Recovery (DR) question. A DR implementation is a common requirement and can be solved in several ways.

DR using VMs:
A virtualized machine could handle this; see http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=100093...
When your data is on a SAN this is less trivial, as the data must also be available at the DR site. A SAN should offer mirroring to another SAN.

DR using mirrors/clones:
In this case you keep an exact (cold) copy of the machine that can be activated when needed.
Because the machines are cold, you cannot use anything that depends on running services; the contradiction is that importing metadata requires a running metadata server.
A full mirror of all data (business, installation, and configuration, e.g. using SAN features) is the way to go.
There are several points of attention:
- Connections to other machines must be included. External connections must be faked, with care taken not to actually connect to them.
- Open datasets can become corrupted; there must be a plan to deal with that.
  The SAS metadata server is an in-memory process that does not necessarily have all updates written to disk.
  You also need to resolve the often confusing naming of the SAS metadata backup: it is not a backup in the common sense, but an offloaded, consistent version of the repositories.
  The web content server may also need attention here: http://support.sas.com/documentation/cdl/en/bisag/67481/HTML/default/viewer.htm#n1n8fnuni6kbjgn1805i...
For testing you will need an isolated network segment: bring up the machine, verify that it works as it should, then close and archive it as cold again.

DR clustering (different locations):
Building a cold, unused datacenter used to be common practice, but with everything moving into the cloud it is possibly becoming outdated. In a clustered approach, DR is included.
http://support.sas.com/documentation/cdl/en/bisag/67481/HTML/default/viewer.htm#n1w2q4quib18udn1h8oj...  (clustered metadata servers). Besides availability, clustering is also done for performance reasons.


Backup & Recovery
This is not the same as DR, although the technical solutions can overlap; the two are often confused.
The issue:
- A well-taken complete backup can be part of a DR plan:
  restore the backup to a new location and get it running.
- A DR implementation usually does not fulfill the backup/restore requirements,
  i.e. getting one or several objects back to a previous version in the operational environment.

For backup & recovery you need metadata exports that are capable of restoring dedicated parts to previous versions.
When components go through a development life cycle, such packages should already exist; archiving and documenting them should be sufficient.
The metadata can be used for a lot more, though.
Allowing ad-hoc promotion of Enterprise Guide projects and PDF/Word documents treats the SAS metadata the same way as a Windows share.
If backup & restore requirements are defined as business governance, there must be something stated about this.

---->-- ja karman --<-----
Trusted Advisor
Posts: 3,215

Re: Has anyone automated a Metadata sync process between a PROD linux environment and a "cold" bcp linux environment on linux?

Posted in reply to mmajorza52

This worked because there was an isolated DR environment dedicated to testing the functionality of a DR machine.

I consider the whole concept of cold DR locations very old and possibly outdated. It was common practice when you only had a mainframe and terminals in one building; the designated DR location had to be capable of taking over all of that, in case a plane came down, a big fire broke out, or similar. Today there are mostly several datacenters, at least about 5 km apart, serving several office-building locations.

The network for this is an isolated one. Access from workstations or servers outside that area is impossible; you need to be physically connected to that network.

Because of that, you can use the same hostnames and IP addresses as a full copy; there is no outside connection. It is something like all home routers serving a 192.168.x.x address.

Testing is/was done yearly, sometimes repeated after a finding such as a volume that had been missed in the copy. Once upon a time, a production mistake was even found in DR before the error had hit PROD.

---->-- ja karman --<-----
Discussion stats
  • 5 replies
  • 1460 views
  • 2 likes
  • 4 in conversation