ajain59
Calcite | Level 5

Hi,

 

I am getting the following error while submitting Pig statements from a text file.

pigcommand.txt contains:

 

A = load '/user/cloudera/newdirectory1/wordcount.txt' AS (FNAME:chararray,LName:chararray);
B= store A into '/user/cloudera/newdirectory2' USING PigStorage(',');

 

/* Run pig statements from a text file */

filename test1 'C:\Users\ajain59\Desktop\abc\pigcommand.txt';

proc hadoop username='cloudera' password='cloudera' verbose;

pig code=test1;

run;

 

SAS Error log:

NOTE: 03-07 13:51:46,016 [RunPigThread][ERROR RunPigThread] -

org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1002: Unable to store alias B

ERROR: The pig script has encountered errors.

ERROR: Unable to store alias B

 

I have correctly set SAS_HADOOP_JAR_PATH and SAS_HADOOP_CONFIG_PATH.

 

When I run the Pig statements directly at the Pig command prompt in CDH, they work fine.

 

Can you please help?

 

Regards,

Ashish Jain

15 Replies
JBailey
Barite | Level 11

Hi @ajain59

 

Try this:

 

A = load '/user/cloudera/newdirectory1/wordcount.txt' AS (FNAME:chararray,LName:chararray);
store A into '/user/cloudera/newdirectory2' USING PigStorage(',');

 

I don't think the STORE command needs a variable. STORE is a statement, not an expression, so it cannot be assigned to an alias like B.

 

Here is an example I have used in training. 

d_dividends = LOAD '/user/myuser/duped/NYSE_dividends' as (d_exchange, d_symbol, d_date, d_dividend);
d_grouped = GROUP d_dividends BY d_symbol;
d_avg = FOREACH d_grouped GENERATE group, AVG(d_dividends.d_dividend);
STORE d_avg INTO '/user/myuser/d_average_dividend';
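If it helps, that script can be submitted through PROC HADOOP the same way as your original code. This is only a sketch: the local file path, username, and password below are placeholders you would replace with your own.

```sas
/* Sketch: save the dividends script to a local file, then submit it
   via PROC HADOOP. Path and credentials are placeholders. */
filename divpig 'C:\Users\myuser\Desktop\dividends.pig';

proc hadoop username='myuser' password='mypassword' verbose;
   pig code=divpig;
run;
```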

 

Best wishes,

Jeff

 

 

ajain59
Calcite | Level 5

I am getting the same error again!

JBailey
Barite | Level 11

Hi @ajain59

 

If you remove the reference to B then the error should be different because B is no longer there.

 

Running this:

A = load '/user/cloudera/newdirectory1/wordcount.txt' AS (FNAME:chararray,LName:chararray);
store A into '/user/cloudera/newdirectory2' USING PigStorage(',');

 

Results in the following error?

ERROR: The pig script has encountered errors.

ERROR: Unable to store alias B

 

I will see if I can find a working example. I think I have a wordcount example.

ajain59
Calcite | Level 5
ERROR: The pig script has encountered errors.
ERROR: Unable to store alias A
Victor
Calcite | Level 5

Hello

 

 

I have the same issue. Has anyone found the solution?

 

I'm using Hortonworks sandbox 2.6.4

 

 

Thanks

Victor
Calcite | Level 5
I forgot to mention that the following statements work very well:
filename hadoop
libname hadoop
proc hadoop hdfs

Only PIG and MAPREDUCE don't work.
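To make the comparison concrete, here is roughly the form of the statements that do work for me. The host name, paths, and user are placeholders for this example, not my actual values.

```sas
/* Sketch of the statement forms that succeed; server, paths, and
   user are placeholders. */
filename hdfsf hadoop '/user/maria_dev/test.txt' user='maria_dev';
libname hdp hadoop server='sandbox-host' user='maria_dev';

proc hadoop username='maria_dev' verbose;
   hdfs mkdir='/user/maria_dev/newdir';
run;
```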
JBailey
Barite | Level 11

Hi @Victor,

 

This is likely due to the JARs and Config files being used. Can you update us on the versions of SAS and Hadoop that you are using?

 

You may find this helpful (then again you may not;): 

https://github.com/Jeff-Bailey/SGF2016_SAS3880_Insiders_Guide_Hadoop_HOW

 

Best wishes,

Jeff

Victor
Calcite | Level 5

Hello, and thanks for your answer.

 

 

SAS version:         SAS 9.4 TS Level 1M5

Sandbox version:   2.5.0.0-1245

 

 

Sandbox information:
Created on: 25_10_2016_08_11_26 for
Hadoop stack version: Hadoop 2.7.3.2.5.0.0-1245
Ambari Version: 2.4.0.0-1225
Ambari Hash: 59175b7aa1ddb74b85551c632e3ce42fed8f0c85
Ambari build: Release : 1225
Java version: 1.8.0_111
OS Version: CentOS release 6.8 (Final)

 

 

Config files are downloaded directly from the Hortonworks Sandbox: Service Actions / Download Client Configs

JAR files are downloaded directly from the Hortonworks Sandbox, from directory /usr/current/hdp

 

The Pig code is very simple:

 

A = LOAD '/user/maria_dev/data/class.csv' using PigStorage(',');
dump A;

 

 

I activated SAS logging with JREOPTIONS and got the following Java error:

 

java.net.ConnectException: Connection timed out: no further information
 at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
 at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:744)
 at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
 at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
 at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1601)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1342)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)

 

 

Regards

 

JBailey
Barite | Level 11

Hi @Victor

 

Are you merging the Core and HDFS XML files into a single file? I believe that was required in SAS 9.4M1.

 

Best wishes,

Jeff

JBailey
Barite | Level 11

I had @ajain59's and @Victor's issues confused. For SAS 9.4M5, concatenation of the XML files is not required.

JBailey
Barite | Level 11

Hi,

 

This may be due to HDP 2.6.4 only supporting Java 8.

 

Check out this SAS Note: http://support.sas.com/kb/61/703.html

 

Best wishes,
Jeff

Victor
Calcite | Level 5

Hello

 

I merged the XML files, but I still get the same error.

 

I'm using Java 8. I get no errors from LIBNAME, FILENAME, or PROC HADOOP HDFS statements.

 

I think the problem is that I'm using a sandbox VM on VirtualBox. Hadoop is contained in a Docker image.
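For what it's worth, a common workaround when the client sits outside a NATed sandbox is to make the HDFS client connect to DataNodes by hostname rather than by the internal IP address the NameNode reports. This is only a guess at your setup; the property would go in the hdfs-site.xml used on the SAS side.

```xml
<!-- Hedged workaround for sandbox/Docker setups: have the HDFS client
     use DataNode hostnames instead of internal IP addresses. -->
<property>
  <name>dfs.client.use.datanode.hostname</name>
  <value>true</value>
</property>
```

The sandbox hostname would also need to be resolvable from the SAS machine, typically via a hosts-file entry pointing at the VM.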

 

The best way forward is to find someone who has run, or wants to run, exactly the same tests on the same type of machine.

 

 

Regards

 

JBailey
Barite | Level 11

Hi @Victor

 

I think this may help you with this issue. There are limitations to SAS support of Java 8.

 

Usage Note 61703: Configure SAS/ACCESS® Interface to Hadoop and SAS® Data Loader for Hadoop to suppo...

 

Best wishes,
Jeff

 

ajain59
Calcite | Level 5

Hi,

 

Add this property to mapred-site.xml to run cross-platform MapReduce and Pig scripts.

 

 

 

<property>
  <name>mapreduce.app-submission.cross-platform</name>
  <value>true</value>
</property>

 

Let me know if this resolves your issue.

 

