paterd2
Quartz | Level 8

Dear community experts.

I have an issue with the inode count in our backups, especially for PostgreSQL.
wc -l gives me 642258:
642258 .........../20250507-010211F/__default__/postgres/SharedServices_pg_dump.dmp/blobs.toc

It looks like there are orphaned large objects.

I suspect the export and import of reports.

We export and import our reports daily.

We delete the old ones, but it looks like this does not correctly delete the large objects.

We use 'sas-viya transfer export' and 'sas-viya transfer delete'.
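As a dry-run illustration of that daily cycle, the two commands could be wrapped roughly as below. The resource URI and package ID are placeholders, and the --resource-uri flag is an assumption from memory; check `sas-viya transfer export --help` for your CLI version.

```shell
# Dry-run sketch of the daily export/delete cycle.
# URI and package ID below are placeholders, not real values.

run() {
  # Echo the command instead of executing it, so nothing is changed.
  echo "would run: $*"
}

EXPORT_CMD=$(run sas-viya transfer export --resource-uri /reports/reports/example-id)
DELETE_CMD=$(run sas-viya transfer delete --id 00000000-0000-0000-0000-000000000000)

echo "$EXPORT_CMD"
echo "$DELETE_CMD"
```

Dropping the `run` wrapper (or making it execute `"$@"`) would turn the sketch into the real cycle.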

sh-4.4$ vacuumlo -n SharedServices
Connected to database "SharedServices"
Test run: no large objects will be removed!
Checking content_data_oid_no in files.file_content
Checking content_bin in transfer.large_object
Checking content_bin in transfer.map_object
Checking object_content_txt in transfer.package_object
Checking object_summary_txt in transfer.package_object
Checking error_txt in transfer.transfer_export_job
Checking request_txt in transfer.transfer_export_job
Checking error_txt in transfer.transfer_export_task
Checking error_txt in transfer.transfer_import_job
Checking request_txt in transfer.transfer_import_job
Checking error_txt in transfer.transfer_import_task
Checking model_xml_doc_oid in workflow.wf_definition_model
Checking message_json_doc_oid in workflow.wf_history_event
Would remove 447798 large objects from database "SharedServices".

 

  1. Do we have to run vacuumlo SharedServices on a regular schedule?
  2. Is this normal behavior of 'sas-viya transfer'?
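For question 1, if a regular schedule does turn out to be needed, a cron entry along these lines could run the cleanup weekly. The schedule, log path, and connection details are assumptions; add -h/-p/-U options as your environment requires.

```shell
# Hypothetical crontab entry: run vacuumlo against SharedServices every
# Sunday at 03:00 and append its verbose output to a log file.
# m  h  dom mon dow  command
0  3  *   *   0    vacuumlo -v SharedServices >> /var/log/vacuumlo.log 2>&1
```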

Regards

 

Dik.

 

1 ACCEPTED SOLUTION

Accepted Solutions
gwootton
SAS Super FREQ
The transfer service calls lo_unlink in the PostgreSQL database when you delete a package, to remove the associated large object. If that call returns an error from Postgres, I would expect you to see an error in the transfer service log. The vacuumlo utility is the appropriate tool for removing large objects that have been orphaned, so I agree this is something that should be done periodically, along with your other routine Postgres maintenance tasks such as vacuuming and reindexing.
--
Greg Wootton | Principal Systems Technical Support Engineer


6 REPLIES
JackMcGuire
Obsidian | Level 7

Hi Dik, 

 

The transfer packages are stored in the SharedServices PostgreSQL database and can only be deleted using the SAS CLI, rather than through any of the GUIs. There is a great post from AllenCunningham which provides much more detail here: Managing SAS Viya Transfer Packages.

 

TL;DR:

'Transfer List' will show you all packages:

./sas-viya transfer list

 

'Transfer Delete' will then delete a specific package:

./sas-viya transfer delete --id <package_id>

 

To delete multiple packages you can use the Python Tools for SAS Viya: listtransferpackages.py allows you to create a CSV file with filters, which can then be used as input to deletetransferpackages.py to delete multiple packages.
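A rough shell equivalent of that filter-then-delete flow could look like the sketch below. The two-column CSV layout (id,name) and the package names are assumptions for illustration, not the actual output format of listtransferpackages.py, and the delete commands are echoed rather than executed.

```shell
# Build a sample CSV of packages (hypothetical layout: id,name).
CSV=$(mktemp)
cat > "$CSV" <<'EOF'
id,name
aaa-111,Daily Report A
bbb-222,Ad hoc package
ccc-333,Daily Report B
EOF

# Delete (echo-only here) every package whose name matches a pattern.
# Note: this simple comma split breaks if names contain commas.
DELETED=""
while IFS=, read -r id name; do
  case "$name" in
    "Daily Report"*)
      echo "would run: sas-viya transfer delete --id $id"
      DELETED="$DELETED $id"
      ;;
  esac
done < "$CSV"
rm -f "$CSV"

echo "Matched:$DELETED"
```

Swapping the echo for the real `sas-viya transfer delete --id "$id"` call would perform the bulk delete.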

 

Thanks,

 

Jack 

paterd2
Quartz | Level 8
Jack, thank you for this advice. We do use these transfer cli commands.
I will take a closer look at the Python Tools for SAS Viya.
Regards,
Dik
paterd2
Quartz | Level 8
Greg, we will schedule this. I will also take a look at the log.
paterd2
Quartz | Level 8
Greg, there are many ERRORs in the transfer log.
Here is an example.

{"version":1,"timeStamp":"2025-04-23T19:28:56.65+02:00","level":"error","source":"sas-transfer","message":"[TRANSFER_PROMOTION_EXCEPTION] Exception occurred while importing a package.\norg.springframework.orm.jpa.JpaSystemException: Unable to access lob stream; nested exception is org.hibernate.HibernateException: Unable to access lob stream\nCaused by: org.hibernate.HibernateException: Unable to access lob stream\nCaused by: org.postgresql.util.PSQLException: ERROR: large object 30416 does not exist","properties":{"logger":"com.sas.transfer.domain.PromotionManager","thread":"transfer-service-request-pool-10-thread-1"},"messageKey":"com.sas.transfer.logging.LogMessages.TRANSFER_PROMOTION_EXCEPTION"}
gwootton
SAS Super FREQ
This error is an import failure. When importing, you first upload a package to the transfer service, which stores the package as a large object; you then import using that package ID. This error message says the package ID we are trying to import references a large object that isn't in the database. So this is, in a sense, the opposite of your initial concern: here the transfer schema table holds a reference to a large object, but the large object itself is not present.
--
Greg Wootton | Principal Systems Technical Support Engineer

