brulard
Pyrite | Level 9

Hi,

I'm fairly new to querying in Hadoop.

When running a query using SQL pass-through, I'm getting an out-of-memory error. Does anyone know of an option I could use to get around this, or have a suggestion other than breaking my query into separate parts?

There was no error when I limited the query's time frame to 4 years (from date, to date). However, when broadening it to 7 years, I get the out-of-memory error.

Below is the part of the code that scans hundreds of millions of records; I believe this resource-intensive section is what causes the error.

 

LEFT JOIN
    (SELECT CRNT_BAL_AMT, EFF_FROM_DT, EFF_TO_DT, ACCT_ID, PRD_CD
     FROM TRANS_HIST
     WHERE EFF_TO_DT >= '2010-12-31' AND EFF_TO_DT < '9999-12-31'
    ) g
    ON A.ACCT_ID = g.ACCT_ID AND date_sub(A.TXN_DT, 1) = g.EFF_TO_DT

LEFT JOIN
    (SELECT CRNT_BAL_AMT, EFF_FROM_DT, EFF_TO_DT, ACCT_ID, PRD_CD
     FROM TRANS_HIST
     WHERE EFF_TO_DT >= '2011-01-01' AND EFF_TO_DT < '9999-12-31'
    ) h
    ON A.ACCT_ID = h.ACCT_ID AND date_sub(g.EFF_FROM_DT, 1) = h.EFF_TO_DT

ORDER BY A.ACCT_ID)
;

DISCONNECT FROM hadcon;
quit;

Error message:

 

ERROR: Prepare error: Error while processing statement: FAILED: Execution Error, return code 2 from 
       org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Reducer 2, vertexId=vertex_1517179256891_507352_1_09, 
       diagnostics=[Task failed, taskId=task_1517179256891_507352_1_09_000003, diagnostics=[TaskAttempt 0 failed, info=[Error: 
       Failure while running task:java.lang.RuntimeException: java.lang.OutOfMemoryError: Java heap space
	at 
       org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:173)

Thanks in advance


4 REPLIES
kiranv_
Rhodochrosite | Level 12

A Hadoop/Hive forum could probably answer this better. One thing that stands out to me is:

ORDER BY A.ACCT_ID

If possible, try removing the ORDER BY clause; it is a very resource-intensive operation.

You could also break the query into 2 or 3 tables (if you have access to create tables) and build indexes on the joining columns. Instead of calling date_sub inside the join condition, create a new column in that table:

date_sub(A.txn_dt, 1) as col

then build an index on columns like ACCT_ID and col.
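A sketch of the first part of this approach in HiveQL, reusing the table and column names from the query above (the intermediate table name is made up for illustration; note also that classic CREATE INDEX support was removed in Hive 3, so on recent clusters you would lean on partitioning or ORC statistics instead of indexes):

```sql
-- Materialize the filtered slice of TRANS_HIST once, with the
-- date range already applied, so both LEFT JOINs read a much
-- smaller table instead of scanning TRANS_HIST twice.
CREATE TABLE trans_hist_slice STORED AS ORC AS
SELECT CRNT_BAL_AMT, EFF_FROM_DT, EFF_TO_DT, ACCT_ID, PRD_CD
FROM TRANS_HIST
WHERE EFF_TO_DT >= '2010-12-31' AND EFF_TO_DT < '9999-12-31';
```

Both subqueries in the original join could then select from trans_hist_slice, with the `g` subquery's tighter date filter applied on top.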

 

brulard
Pyrite | Level 9

Thanks, I appreciate your tips. I will remove the ORDER BY. I'm currently trying a colleague's suggestion to use GROUP BY instead of DISTINCT. As for creating a table, which would help, I think I first need to obtain what is referred to as a Kerberos ticket in my environment.
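For reference, the DISTINCT-to-GROUP BY rewrite mentioned here looks like the following (column names are taken from the query above purely for illustration; the two forms return the same rows, and in some older Hive versions the GROUP BY form planned with better reducer parallelism):

```sql
-- Original form:
SELECT DISTINCT ACCT_ID, PRD_CD
FROM TRANS_HIST;

-- Equivalent GROUP BY form:
SELECT ACCT_ID, PRD_CD
FROM TRANS_HIST
GROUP BY ACCT_ID, PRD_CD;
```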

SimonDawson
SAS Employee

This is something your Hadoop administrators should take a look at with you. The container provisioned for your task on the Hadoop compute cluster didn't have enough memory allocated to it to complete the task. You'll need to tweak the configuration inside the Hadoop cluster to fit the workload that query will generate.
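If the administrators permit session-level overrides, the Tez container memory can sometimes be raised from the SAS side with EXECUTE statements issued through the same pass-through connection before the query runs. This is only a sketch: hive.tez.container.size and tez.runtime.io.sort.mb are standard Hive/Tez properties (values in MB), but whether overrides are allowed, and what values are appropriate, depends entirely on the cluster configuration:

```sql
/* Hedged sketch: request larger Tez containers for this session.
   hadcon is the connection alias from the original pass-through code. */
execute (set hive.tez.container.size=4096) by hadcon;
execute (set tez.runtime.io.sort.mb=1024) by hadcon;
```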

brulard
Pyrite | Level 9
Thanks. I appreciate your feedback and will open a ticket to our Hadoop administrators. (I did manage to produce my desired output but had to split my query into two timeframes.)
