akpattnaik
Obsidian | Level 7

Hi,

 

I'm in Chapter 2: Accessing HDFS and Invoking Hadoop Applications, and I'm trying the demo hp02d02.sas.

When I execute the code, I get the error: Input path does not exist. (See attachment for more info.)

I went into Hue and found that /user/shared/DIACHD/data/Custord is not available. Hence, I tried to create a folder under shared, but I got a permission error. I then tried to create the folder using a PROC HADOOP MKDIR statement and got the same permission error. Attached is a screenshot trail of everything I tried. Can anyone advise if I'm missing anything, please?
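For reference, the MKDIR step I tried looked roughly like this (a sketch only; the config file, username, and password below are placeholders for my actual connection settings):

```
/* Sketch only -- CFG=, USERNAME=, and PASSWORD= values are placeholders; */
/* substitute the connection settings for your own image.                 */
proc hadoop cfg='C:\hadoop\config.xml' username='myuser' password='XXXXXXXX' verbose;
   hdfs mkdir='/user/shared/DIACHD/data/custord';  /* create the missing directory */
run;
```

This step fails with the same permission error as Hue, which suggests the connecting user has no write access under /user/shared.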

 

 

 

1 ACCEPTED SOLUTION

Accepted Solutions
TheresaStemler
SAS Moderator

@akpattnaik,

Please try the following steps:

Step 1: Did you successfully run cre8data.bat when you started the image? If YES, go to Step 2. If NO, run cre8data.bat as directed in section 1.2 of the Big Data Programming and Loading module.

Step 2: Did you run cre8data.sas? If YES, go to Step 3. If NO, run cre8data.sas as directed in section 1.2 of the Big Data Programming and Loading module. This program creates the data used in this module and a required macro variable.

Step 3: If you are returning to a saved image, submit only the %LET statement in cre8data.sas to define the macro variable. You do not need to run the entire program.
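For example, resubmitting just the macro variable definition on a saved image would look something like this (the macro variable name and value here are placeholders; copy the actual %LET statement from cre8data.sas):

```
/* Placeholder sketch -- use the actual %LET statement from cre8data.sas. */
%let datapath=/user/shared/DIACHD/data;   /* hypothetical name and value  */
%put NOTE: data path macro resolves to &datapath;   /* confirm it is set  */
```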

 

If this does not resolve your issue, your image may be corrupted; I would discard the image and request a new one. Please update me on your progress.

-theresa

 


4 REPLIES
TheresaStemler
SAS Moderator

@akpattnaik

Can you please send the entire SAS log, starting with the log for the PROC HADOOP step?

-theresa

 

akpattnaik
Obsidian | Level 7
ERROR: 11-29 10:47:10,452 [RunPigThread][ERROR PigStatsUtil] - 1 map reduce job(s) failed!
ERROR: The pig script has encountered errors.
ERROR: The pig job, job_1511967281334_0001, has encountered errors.
ERROR: org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Input path does not exist: 
       hdfs://server2.demo.sas.com:8020/user/shared/DIACHD/data/custord
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
    at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
    at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
    at java.lang.Thread.run(Thread.java:745)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: 
       hdfs://server2.demo.sas.com:8020/user/shared/DIACHD/data/custord
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:321)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:264)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigTextInputFormat.listStatus(PigTextInputFormat.java:36)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:385)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
    ... 18 more

Hi, the above is the error message from SAS Enterprise Guide.

ballardw
Super User

Paste the log into a code box using the forum {I} menu icon.

 

Many users here do not or cannot open Docx files from unknown sources due to security issues and organization policies.



Discussion stats
  • 4 replies
  • 2144 views
  • 1 like
  • 3 in conversation