At one place I worked in the past, dirty reads were used for querying a DB2 data warehouse. I don't remember the details, but the performance improvement was significant and it also reduced the workload on the database.

I would have thought that multiple parallel reads shouldn't cause table-locking issues. So if you are seeing such issues, does that mean write operations are happening at the same time? If so, you could actually end up reading "dirty data", and you would need a way to control the job flow so that reads only happen while no writes to those tables are in progress.

Back to the DB2 warehouse example: there were several cases where a SAS EG user locked a table overnight. As this was a corporate warehouse and a failed overnight batch had quite an impact, one additional measure taken was to lock out all SAS users during the overnight batch window for loading the database (by revoking the grants and re-establishing them after the load). That was years ago, and I'm sure a current DB2 version offers better means to prevent "normal" users from locking tables.
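For what it's worth, if I remember correctly the dirty reads were requested on the SAS side via the READ_ISOLATION_LEVEL= option of SAS/ACCESS to DB2, where UR means "uncommitted read". A minimal sketch (datasrc, user, password and the table name are placeholders, not the actual setup):

```sas
/* Sketch only: request dirty reads (uncommitted read) for the whole libref. */
/* Connection details below are placeholders.                                */
libname dwh db2 datasrc=WHSE user=myuser password=XXXX
        read_isolation_level=UR;

proc sql;
   /* With UR, the query should not take or wait on read locks,   */
   /* but it may see rows from transactions that later roll back. */
   select count(*) from dwh.accounts;
quit;
```

The trade-off is exactly the one described above: you avoid lock contention with concurrent writers, but you accept that the data you read may never have been committed.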