Venus-labs
Calcite | Level 5

When I execute the following script to load data from census_2010.csv and DUMP a LIMIT of five records:

 

[Screenshot: Pig script that loads census_2010.csv and dumps five records]
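The script screenshot did not survive, but based on the description it was presumably along these lines. This is only a sketch: the path, alias names, and field schema are assumptions, not the original code.

```pig
-- Hypothetical reconstruction of the script in the screenshot:
-- load the CSV (path and schema are assumed), then dump five records.
census = LOAD '/workshop/dihdm/data/census_2010.csv'
         USING PigStorage(',')
         AS (state:chararray, population:long);
first5 = LIMIT census 5;
DUMP first5;
```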

 

Pig then generates the following "failed to read data" error because the input path cannot be found:

 

[Screenshot: Grunt shell output with the "failed to read data" error]

[Screenshot: the corresponding error stack trace]

 

The census_2010.csv file does exist at the following path when viewed in the file explorer:

[Screenshot: file explorer showing census_2010.csv]

I tried specifying '/user/student/workshop/dihdm/data/census_2010.csv' but this did not work. What path should I specify in order to get the data read successfully in Grunt?

 

Thank you.
1 ACCEPTED SOLUTION
DavidGhan
SAS Employee

The Pig program is looking in HDFS for that data. It seems you need to complete a step from an earlier practice that copies the data into HDFS. That step copies the contents of several folders from the Linux client machine to HDFS. To complete it, do the following:

  1. Click (mRemoteNG) on the taskbar to open the application, if required.
  2. Double-click student@HadoopClient to open the connection to the Hadoop client.
  3. Submit the command:

         hdfs dfs -put /workshop/dihdm/data /user/student/dihdm

 

That copies data into HDFS, including the '/user/student/dihdm/data/census_2010.csv' file.
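Once the `-put` completes, a quick way to confirm the file landed where Pig will look for it is to list the HDFS directory. This is a sketch using standard `hdfs dfs` shell options against the paths above:

```shell
# Copy the workshop data from the local Linux filesystem into HDFS.
hdfs dfs -put /workshop/dihdm/data /user/student/dihdm

# Verify the copy: list the target directory and test that the file exists.
hdfs dfs -ls /user/student/dihdm/data
hdfs dfs -test -e /user/student/dihdm/data/census_2010.csv && echo "file present"
```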


2 REPLIES 2

Venus-labs
Calcite | Level 5
Thank you so much, David! I need to remember to do this every time I start a fresh client session.

 

