
Standardization through DataFlux Studio

Contributor
Posts: 48

Standardization through DataFlux Studio

I have a field named ADDRESS that contains varchar data of 120 bytes (a sample is shown below). I need to standardize a portion of the string, for example:

Address

#39/1, Electronic City, St. Green Glen, Bangalore

I need to convert St. to Street and keep the rest of the string the same, so the result would be:

#39/1, Electronic City, Street Green Glen, Bangalore

Kindly help me on this.

SAS Employee
Posts: 85

Re: Standardization through DataFlux Studio

You can parse the data using the Parsing node, then use the Standardization node with the English (India) QKB and the ENIND Address Extension Standards definition.
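
As a quick sanity check outside the node-based job, here is a minimal Base SAS sketch of the same idea. The TRANWRD call is just a plain string replacement; the commented DQSTANDARDIZE line shows roughly what a QKB-based call would look like, assuming SAS Data Quality Server is available and the ENIND locale is loaded (the definition name there is an assumption, not something confirmed in this thread).

data work.addresses_std;
   infile datalines truncover;
   length address address_std $120;
   input address $char120.;
   /* Plain string replacement: turn "St." into "Street" */
   address_std = tranwrd(address, 'St.', 'Street');
   /* With SAS Data Quality Server and the ENIND locale loaded, a QKB-based
      call would look roughly like this (definition name is an assumption):
      address_std = dqStandardize(address, 'Address', 'ENIND');  */
   datalines;
#39/1, Electronic City, St. Green Glen, Bangalore
;
run;

Note that TRANWRD replaces every occurrence of "St." in the string, so it is only a rough check; the QKB definition is the right place for context-aware address standardization.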

-shawn

Contributor
Posts: 48

Re: Standardization through DataFlux Studio

skillman,

I tried it the following way: I created a definition and selected it in the substring. It works fine in SAS Data Management Studio, but when I run the same job on SAS Data Management Server it shows the following error:

|error|DF001|0;|2;Blue fusion error -801:invalid definition name :address>this definition may not exist in your qkb

SAS Employee
Posts: 75

Re: Standardization through DataFlux Studio

Does your server have access to the appropriate QKB? As explained in the Prerequisites section of this topic:

Deploying Jobs to a Data Management Server

SAS Employee
Posts: 85

Re: Standardization through DataFlux Studio

You may wish to have a centralized location for the QKB that both the client and server can access, rather than having two distinct installations of the QKB (which can lead to versioning issues). Check that the QKB on the server is the same QKB version that the client uses.
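
One way to do that is to point both Studio and the Server at a single shared QKB directory. In installations I have seen, the QKB location is set with a qkb/path entry in app.cfg on each machine, but the exact option name and whether a network path is supported should be verified against the configuration reference for your release. A hypothetical example:

qkb/path = \\qkbhost\share\QKB\CI\2014A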

Contributor
Posts: 48

Re: Standardization through DataFlux Studio

skillman,

I am getting an error while inserting match codes into the database output table. The error is:

11: Data Access Plugin - Max. ODBC error count (500) exceeded. Last error:.

My source data has 92 lakh (9.2 million) records, but when I try to insert the match codes only 111 records are inserted. Kindly help me on this.

Many thanks in advance.

SAS Employee
Posts: 85

Re: Standardization through DataFlux Studio

What type of database are you writing to? Are you inserting records into the same table that you are reading from in the data input step? That could cause your issue. Try inserting the data into a separate table to see if that resolves table-locking errors. Also, can you post the whole log file?

Contributor
Posts: 48

Re: Standardization through DataFlux Studio


skillman,

I am inserting into an Oracle database, and into a separate table. Is there anywhere I can set options such as SET DEFINE OFF and SET SCAN OFF?

Contributor
Posts: 48

Re: Standardization through DataFlux Studio


Hi skillman,

Can you kindly suggest how to increase the cluster memory to more than 4 GB? My Data Management Server is running in 64-bit mode. I am unable to enter more than 2047 MB; when I enter 4000 MB it prompts me to enter an integer from 2 to 2047. I also checked the dmserver.cfg file, where I found cluster/bytes.
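
If that setting takes a raw byte value (an assumption on my part, I have not found it documented), then 4 GB would presumably be written as:

cluster/bytes = 4294967296

but I am not sure whether the server honors a value larger than the 2047 MB the dialog allows.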

Your inputs are highly appreciated.

Many thanks in advance.
