<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Using SAS PROC S3 to access a bucket using CEPH in SAS Procedures</title>
    <link>https://communities.sas.com/t5/SAS-Procedures/Using-SAS-PROC-S3-to-access-a-bucket-using-CEPH/m-p/926454#M83432</link>
    <description>&lt;P&gt;I managed to resolve the issue by incorporating a custom region using a different approach. Since there was no response, I'll share my solution for anyone who might benefit from it.&lt;/P&gt;
&lt;P&gt;Here's what I did to make it work:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Picked a unique name for the custom region. Previously, I had given the custom region the same name as an existing AWS region (us-east-1), so I chose a distinct name instead.&lt;/LI&gt;
&lt;LI&gt;Used PROC S3's REGION ADD statement to define the custom region instead of relying on the TKS3_CUSTOM_REGION environment variable.&lt;/LI&gt;
&lt;/OL&gt;
&lt;DIV&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;/*Add custom region*/

proc s3;
REGION ADD HOST="storage.middle.earth" NAME="shire";
run;
/* Listing all regions (including the custom region) */
proc s3;
region list;
run;&lt;/CODE&gt;&lt;/PRE&gt;
This resulted in the following log output (note the added custom region):&lt;/DIV&gt;
&lt;BLOCKQUOTE&gt;
&lt;DIV&gt;1 %studio_hide_wrapper;&lt;BR /&gt;83 proc s3;&lt;BR /&gt;84 REGION ADD HOST="storage.middle.earth" NAME="shire";&lt;BR /&gt;85 run;&lt;BR /&gt;NOTE: PROCEDURE S3 used (Total process time):&lt;BR /&gt;real time 0.01 seconds&lt;BR /&gt;cpu time 0.00 seconds&lt;BR /&gt;&lt;BR /&gt;86 &lt;BR /&gt;87 proc s3;&lt;BR /&gt;88 region list;&lt;BR /&gt;89 run;&lt;BR /&gt;Amazon Regions&lt;BR /&gt;us-east-1 s3.amazonaws.com Default 0&lt;BR /&gt;us-east-2 s3-us-east-2.amazonaws.com Default 0&lt;BR /&gt;us-west-2 s3-us-west-2.amazonaws.com Default 0&lt;BR /&gt;us-west-1 s3-us-west-1.amazonaws.com Default 0&lt;BR /&gt;eu-west-1 s3-eu-west-1.amazonaws.com Default 0&lt;BR /&gt;eu-central-1 s3-eu-central-1.amazonaws.com Default 0&lt;BR /&gt;ap-southeast-1 s3-ap-southeast-1.amazonaws.com Default 0&lt;BR /&gt;ap-southeast-2 s3-ap-southeast-2.amazonaws.com Default 0&lt;BR /&gt;ap-northeast-1 s3-ap-northeast-1.amazonaws.com Default 0&lt;BR /&gt;sa-east-1 s3-sa-east-1.amazonaws.com Default 0&lt;BR /&gt;us-gov-west-1 s3-us-gov-west-1.amazonaws.com Default 0&lt;BR /&gt;us-gov-west-1 s3-fips-us-gov-west-1.amazonaws.com Default 0&lt;BR /&gt;ca-central-1 s3-ca-central-1.amazonaws.com Default 0&lt;BR /&gt;ap-south-1 s3-ap-south-1.amazonaws.com Default 0&lt;BR /&gt;ap-northeast-2 s3-ap-northeast-2.amazonaws.com Default 0&lt;BR /&gt;cn-north-1 s3-cn-north-1.amazonaws.com Default 0&lt;BR /&gt;cn-northwest-1 s3-cn-northwest-1.amazonaws.com Default 0&lt;BR /&gt;eu-west-2 s3-eu-west-2.amazonaws.com Default 0&lt;BR /&gt;eu-west-3 s3-eu-west-3.amazonaws.com Default 0&lt;BR /&gt;ap-east-1 s3.ap-east-1.amazonaws.com Default 0&lt;BR /&gt;eu-south-1 s3.eu-south-1.amazonaws.com Default 0&lt;BR /&gt;eu-north-1 s3.eu-north-1.amazonaws.com Default 0&lt;BR /&gt;me-south-1 s3.me-south-1.amazonaws.com Default 0&lt;BR /&gt;af-south-1 s3.af-south-1.amazonaws.com Default 0&lt;BR /&gt;ap-northeast-3 s3.ap-northeast-3.amazonaws.com Default 0&lt;BR /&gt;Custom Regions&lt;BR /&gt;shire storage.middle.earth 
Default 0&lt;BR /&gt;NOTE: PROCEDURE S3 used (Total process time):&lt;BR /&gt;real time 0.00 seconds&lt;BR /&gt;cpu time 0.00 seconds&lt;BR /&gt;&lt;BR /&gt;90 &lt;BR /&gt;91 %studio_hide_wrapper;&lt;BR /&gt;102 &lt;BR /&gt;103&lt;/DIV&gt;
&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;/BLOCKQUOTE&gt;
&lt;DIV&gt;
&lt;P&gt;Now, with the appropriate credentials, it's possible to test by listing a bucket's contents:&lt;/P&gt;
&lt;/DIV&gt;
&lt;DIV&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;proc s3 KEYID="isItSecret" SECRET="isItSafe" region="shire";
list "/bucket" ;
run;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;/DIV&gt;
&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV&gt;That's it.&amp;nbsp;&lt;/DIV&gt;
&lt;DIV&gt;See ya!&lt;/DIV&gt;
&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;</description>
    <pubDate>Tue, 30 Apr 2024 12:44:35 GMT</pubDate>
    <dc:creator>alisio_meneses</dc:creator>
    <dc:date>2024-04-30T12:44:35Z</dc:date>
    <item>
      <title>Using SAS PROC S3 to access a bucket using CEPH</title>
      <link>https://communities.sas.com/t5/SAS-Procedures/Using-SAS-PROC-S3-to-access-a-bucket-using-CEPH/m-p/925914#M83424</link>
      <description>&lt;BLOCKQUOTE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Hello There!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;I'm having trouble connecting to a CEPH object storage service using PROC S3 in SAS Viya 3.5 on Linux. CEPH is&amp;nbsp;an open-source distributed storage system capable of providing S3 compatible buckets.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;I've tried setting a custom region using the &lt;A href="https://go.documentation.sas.com/doc/en/pgmsascdc/9.4_3.4/proc/n0qozoux9a0633n1du4xy40vksf5.htm#n0i68s2pkemi41n120l9c9qmxse1" target="_self"&gt;TKS3_CUSTOM_REGION&lt;/A&gt; environment variable and a config file with access credentials, but I keep getting an error saying "The AWS access key Id you provided does not exist in our records." Here's my code:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;options set=TKS3_CUSTOM_REGION="us-east-1,storage.yada.yada,0,0,TRUE,TRUE";
%let myvar_value = %sysget(TKS3_CUSTOM_REGION);
%put &amp;amp;myvar_value;

PROC S3 
	config="/home/user/tks3.cfg";
	LIST "/";
RUN;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;BR /&gt;&lt;SPAN&gt;The S3 config file &lt;CODE class=" language-sas"&gt;/home/user/tks3.cfg&lt;/CODE&gt; contains the following:&lt;/SPAN&gt;&lt;BR /&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;region=us-east-1
keyID = thisIsAnAwesomeKeyID
secret = whoaThisSecretIsSafe&lt;/CODE&gt;&lt;/PRE&gt;
&lt;BR /&gt;Output Log:&lt;BR /&gt;
&lt;PRE&gt;1    %studio_hide_wrapper;
83   options set=TKS3_CUSTOM_REGION="us-east-1,storage.yada.yada,0,0,TRUE,TRUE";
84   %let myvar_value = %sysget(TKS3_CUSTOM_REGION);
85   %put &amp;amp;myvar_value;
us-east-1,storage.yada.yada,0,0,TRUE,TRUE
86   
87   PROC S3
88   config="/home/user/tks3.cfg";
91   LIST "/";
92   RUN;
ERROR: Could not get user buckets.
ERROR: The AWS access key Id you provided does not exist in our records.
ERROR: The AWS access key Id you provided does not exist in our records.
NOTE: The SAS System stopped processing this step because of errors.
NOTE: PROCEDURE S3 used (Total process time):
      real time           0.58 seconds
      cpu time            0.02 seconds
      
93   
94   %studio_hide_wrapper;
105  
106  &lt;/PRE&gt;
&lt;BR /&gt;
&lt;P&gt;Currently I can interact with the CEPH bucket using s3cmd, the&amp;nbsp;&lt;SPAN&gt;command-line tool for managing Amazon S3 storage&lt;/SPAN&gt;, the same way I use it with AWS S3 buckets. I believe this rules out problems caused by the CEPH service itself.&lt;/P&gt;
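&lt;P&gt;For reference, the s3cmd check looks roughly like this (the host and bucket names are placeholders; &lt;CODE class=" language-bash"&gt;--host&lt;/CODE&gt; and &lt;CODE class=" language-bash"&gt;--host-bucket&lt;/CODE&gt; point s3cmd at the CEPH endpoint instead of AWS):&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-bash"&gt;# List a bucket through the CEPH S3-compatible endpoint
# (host and bucket names are placeholders)
s3cmd --host=storage.yada.yada \
      --host-bucket="%(bucket)s.storage.yada.yada" \
      ls s3://bucket&lt;/CODE&gt;&lt;/PRE&gt;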
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Can someone point out what might be wrong?&lt;/P&gt;
&lt;P&gt;Thank you.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Environment info:&lt;/P&gt;
&lt;P&gt;SAS Viya 3.5 running on Linux&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 25 Apr 2024 23:44:13 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Procedures/Using-SAS-PROC-S3-to-access-a-bucket-using-CEPH/m-p/925914#M83424</guid>
      <dc:creator>alisio_meneses</dc:creator>
      <dc:date>2024-04-25T23:44:13Z</dc:date>
    </item>
    <item>
      <title>Re: Using SAS PROC S3 to access a bucket using CEPH</title>
      <link>https://communities.sas.com/t5/SAS-Procedures/Using-SAS-PROC-S3-to-access-a-bucket-using-CEPH/m-p/926454#M83432</link>
      <description>&lt;P&gt;I managed to resolve the issue by incorporating a custom region using a different approach. Since there was no response, I'll share my solution for anyone who might benefit from it.&lt;/P&gt;
&lt;P&gt;Here's what I did to make it work:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Picked a unique name for the custom region. Previously, I had given the custom region the same name as an existing AWS region (us-east-1), so I chose a distinct name instead.&lt;/LI&gt;
&lt;LI&gt;Used PROC S3's REGION ADD statement to define the custom region instead of relying on the TKS3_CUSTOM_REGION environment variable.&lt;/LI&gt;
&lt;/OL&gt;
&lt;DIV&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;/*Add custom region*/

proc s3;
REGION ADD HOST="storage.middle.earth" NAME="shire";
run;
/* Listing all regions (including the custom region) */
proc s3;
region list;
run;&lt;/CODE&gt;&lt;/PRE&gt;
This resulted in the following log output (note the added custom region):&lt;/DIV&gt;
&lt;BLOCKQUOTE&gt;
&lt;DIV&gt;1 %studio_hide_wrapper;&lt;BR /&gt;83 proc s3;&lt;BR /&gt;84 REGION ADD HOST="storage.middle.earth" NAME="shire";&lt;BR /&gt;85 run;&lt;BR /&gt;NOTE: PROCEDURE S3 used (Total process time):&lt;BR /&gt;real time 0.01 seconds&lt;BR /&gt;cpu time 0.00 seconds&lt;BR /&gt;&lt;BR /&gt;86 &lt;BR /&gt;87 proc s3;&lt;BR /&gt;88 region list;&lt;BR /&gt;89 run;&lt;BR /&gt;Amazon Regions&lt;BR /&gt;us-east-1 s3.amazonaws.com Default 0&lt;BR /&gt;us-east-2 s3-us-east-2.amazonaws.com Default 0&lt;BR /&gt;us-west-2 s3-us-west-2.amazonaws.com Default 0&lt;BR /&gt;us-west-1 s3-us-west-1.amazonaws.com Default 0&lt;BR /&gt;eu-west-1 s3-eu-west-1.amazonaws.com Default 0&lt;BR /&gt;eu-central-1 s3-eu-central-1.amazonaws.com Default 0&lt;BR /&gt;ap-southeast-1 s3-ap-southeast-1.amazonaws.com Default 0&lt;BR /&gt;ap-southeast-2 s3-ap-southeast-2.amazonaws.com Default 0&lt;BR /&gt;ap-northeast-1 s3-ap-northeast-1.amazonaws.com Default 0&lt;BR /&gt;sa-east-1 s3-sa-east-1.amazonaws.com Default 0&lt;BR /&gt;us-gov-west-1 s3-us-gov-west-1.amazonaws.com Default 0&lt;BR /&gt;us-gov-west-1 s3-fips-us-gov-west-1.amazonaws.com Default 0&lt;BR /&gt;ca-central-1 s3-ca-central-1.amazonaws.com Default 0&lt;BR /&gt;ap-south-1 s3-ap-south-1.amazonaws.com Default 0&lt;BR /&gt;ap-northeast-2 s3-ap-northeast-2.amazonaws.com Default 0&lt;BR /&gt;cn-north-1 s3-cn-north-1.amazonaws.com Default 0&lt;BR /&gt;cn-northwest-1 s3-cn-northwest-1.amazonaws.com Default 0&lt;BR /&gt;eu-west-2 s3-eu-west-2.amazonaws.com Default 0&lt;BR /&gt;eu-west-3 s3-eu-west-3.amazonaws.com Default 0&lt;BR /&gt;ap-east-1 s3.ap-east-1.amazonaws.com Default 0&lt;BR /&gt;eu-south-1 s3.eu-south-1.amazonaws.com Default 0&lt;BR /&gt;eu-north-1 s3.eu-north-1.amazonaws.com Default 0&lt;BR /&gt;me-south-1 s3.me-south-1.amazonaws.com Default 0&lt;BR /&gt;af-south-1 s3.af-south-1.amazonaws.com Default 0&lt;BR /&gt;ap-northeast-3 s3.ap-northeast-3.amazonaws.com Default 0&lt;BR /&gt;Custom Regions&lt;BR /&gt;shire storage.middle.earth 
Default 0&lt;BR /&gt;NOTE: PROCEDURE S3 used (Total process time):&lt;BR /&gt;real time 0.00 seconds&lt;BR /&gt;cpu time 0.00 seconds&lt;BR /&gt;&lt;BR /&gt;90 &lt;BR /&gt;91 %studio_hide_wrapper;&lt;BR /&gt;102 &lt;BR /&gt;103&lt;/DIV&gt;
&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;/BLOCKQUOTE&gt;
&lt;DIV&gt;
&lt;P&gt;Now, with the appropriate credentials, it's possible to test by listing a bucket's contents:&lt;/P&gt;
&lt;/DIV&gt;
&lt;DIV&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;proc s3 KEYID="isItSecret" SECRET="isItSafe" region="shire";
list "/bucket" ;
run;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;/DIV&gt;
&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV&gt;That's it.&amp;nbsp;&lt;/DIV&gt;
&lt;DIV&gt;See ya!&lt;/DIV&gt;
&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;</description>
      <pubDate>Tue, 30 Apr 2024 12:44:35 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Procedures/Using-SAS-PROC-S3-to-access-a-bucket-using-CEPH/m-p/926454#M83432</guid>
      <dc:creator>alisio_meneses</dc:creator>
      <dc:date>2024-04-30T12:44:35Z</dc:date>
    </item>
  </channel>
</rss>

