Like most IT answers, "It depends".
The slowest things in a system are:
* moving a disk drive's read/write head (seeking)
* moving to a specific spot in a file on tape
* moving data
Processing data means moving it from storage to memory, into a processor's cache, into a processor's registers, through transformation or decision-making circuitry, and ultimately back to storage (especially with SAS, which writes each step's output back to disk). The less of this movement there is, the faster things happen.
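One concrete way to move less data: let the remote system do the heavy lifting and ship back only the result. Here is a minimal SAS sketch using explicit SQL pass-through; the Oracle credentials, schema, and table names are hypothetical.

```sas
/* The database performs the aggregation; only the small    */
/* summary result set travels back into the SAS session.    */
proc sql;
   connect to oracle (user=scott password=XXXXX path=dwhprod);
   create table work.sales_summary as
   select * from connection to oracle
      ( select region, sum(sale_amt) as total_sales
          from sales.fact_sales
         group by region );
   disconnect from oracle;
quit;
```

Pulling the whole fact table into SAS and summarizing it locally would push every row through that storage-to-memory-to-processor pipeline; the pass-through version moves only a handful of summary rows.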
System design is a practice of balancing system resource usage, time, money, and politics (the interactions of people, groups, and departments).
How that balance is struck depends on your company's culture.
Some stuff is best kept in an RDBMS -- Oracle, DB2, Sybase, SQL Server, MySQL, Access, FoxBase, InterBase, Paradox, Informix, etc.
Some stuff is best kept in a SAS permanent dataset.
Some stuff has to come from a VSAM file on a mainframe.
Some stuff has to come from SMF or RMF flatfile records on a mainframe.
Most stuff can be transferred from one of these stores to another (a sketch follows this list).
Some things are best kept in a central repository -- a Data Warehouse.
Some things are best kept locally.
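Here is that transfer in practice: a minimal sketch assuming SAS/ACCESS to Oracle is licensed, with made-up paths, credentials, and table names.

```sas
/* One libref points at the RDBMS, another at a directory   */
/* of SAS permanent datasets; an ordinary DATA step copies. */
libname ora  oracle user=scott password=XXXXX path=dwhprod schema=sales;
libname perm '/data/sas/local';        /* SAS permanent datasets live here */

data perm.orders_2023;                 /* becomes orders_2023.sas7bdat     */
   set ora.orders;                     /* reads the Oracle table ORDERS    */
   where order_date >= '01JAN2023'd;   /* SAS may push this filter to Oracle */
run;
```

The same pattern works in reverse (SET the SAS dataset, write to the RDBMS libref), which is why "most stuff can be transferred" is a safe default assumption.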
A good answer to this lies within the realm of "Business Intelligence" and the idea of "Information Maturity" -- a topic that is also within SAS's world.
Best practice is generally to have a central repository -- a Data Warehouse -- with a well-managed metadata layer/server to help normalize and control data access. But this is a seriously non-trivial exercise. Done well, the results can be huge for the business; done poorly, it is a huge waste of money and time.
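In SAS terms, that metadata layer is typically a SAS Metadata Server, and client programs reach warehouse tables through metadata-registered libraries instead of hard-coded engine credentials. A minimal sketch using the META libname engine; "Sales Warehouse" is a hypothetical registered library, and a configured metadata connection is assumed.

```sas
/* Resolve the library through the metadata server, so access  */
/* rules and connection details stay centrally managed.        */
libname dw meta liburi="SASLibrary?@Name='Sales Warehouse'" metaout=data;

proc print data=dw.fact_sales(obs=10);   /* fact_sales is hypothetical */
run;
```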