Hello there,
I'm trying to use BERT with DLPy for sentiment analysis, following the tutorial in the paper "SAS4429-2020 NLP with BERT: Sentiment Analysis Using SAS® Deep Learning and DLPy". I'm stuck at step 4 (ATTACH MODEL PARAMETERS), which links the trained model parameters from the HDF5 file to the BERT model. The Deep Learning actions must be able to read the HDF5 file, but since the client machine is typically separate from the server, the following Python snippet from the paper defines a "server_dir" to which the generated parameters must be copied so the Viya server can access them:
import os
from shutil import copyfile

# cache_dir is defined earlier in the tutorial; it holds the converted weights
server_dir = 'path/to/your/server-directory'
copyfile(os.path.join(cache_dir, 'bert-base-uncased.kerasmodel.h5'),
         os.path.join(server_dir, 'bert-base-uncased.kerasmodel.h5'))
.......
However, there's a catch: I'm on a SAS Viya 3.5 environment with a distributed server architecture, and the Python runtime has no direct connection to the server's file system, so the `copyfile` approach is a no-go.
While I can manually copy the file from the client to the server and load it from there, that raises the question: where should the HDF5 file be placed within the distributed server architecture (CAS controller, CAS workers, SPRE)?
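For context, the paper's copy step only works when `server_dir` is a path that is both mounted on the client running Python and visible to the server side; that shared-mount setup is an assumption, not something the paper guarantees. Below is a hedged sketch of the same step wrapped in a hypothetical helper (`stage_weights` is my own name, not from DLPy) that at least fails loudly when the path isn't reachable, rather than silently writing somewhere useless:

```python
import os
from shutil import copyfile

def stage_weights(src_path, server_dir):
    """Copy an HDF5 weights file into a directory the CAS server can read.

    Assumption: server_dir is a filesystem path mounted on the machine
    running this Python session AND shared with the server side (e.g. an
    NFS mount). If no such shared path exists, the file has to be moved
    out-of-band instead (scp/sftp to the server).
    """
    if not os.path.isfile(src_path):
        raise FileNotFoundError("weights file not found: " + src_path)
    os.makedirs(server_dir, exist_ok=True)  # create the target dir if missing
    dst_path = os.path.join(server_dir, os.path.basename(src_path))
    copyfile(src_path, dst_path)
    return dst_path
```

If the shared mount doesn't exist (as in my environment), the helper raises or writes to a client-local path the server never sees, which is exactly the problem I'm asking about.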
For more details, you can refer to the paper "SAS4429-2020 NLP with BERT: Sentiment Analysis Using SAS® Deep Learning and DLPy" here.
Thank you.