Oracle tables with > 10B rows


What's your experience with querying Oracle tables with more than 10 billion rows? The tables are well partitioned. Can this be done with "normal" machines (clustered), or does it require a gigantic machine?

Thanks!


There are 2 answers below.


Assuming that the table is partitioned and the queries specify the partition key, the size of the table is irrelevant. What matters is the size of the partition(s) that must be scanned in order to execute the query. That would normally be just one or two partitions, which you should be able to handle with fairly modest hardware.
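As a minimal sketch of what partition pruning looks like in practice (the table and column names here are purely illustrative, not taken from the question):

    -- Hypothetical interval-partitioned fact table; names are illustrative only.
    CREATE TABLE sales_fact (
        sale_date    DATE         NOT NULL,
        customer_id  NUMBER       NOT NULL,
        amount       NUMBER(12,2)
    )
    PARTITION BY RANGE (sale_date)
    INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))
    (
        PARTITION p_initial VALUES LESS THAN (TO_DATE('2020-01-01', 'YYYY-MM-DD'))
    );

    -- Because the predicate is on the partition key, the optimizer can prune
    -- to the single monthly partition holding June 2023; the other partitions
    -- are never touched, regardless of the table's total size.
    SELECT customer_id, SUM(amount)
    FROM   sales_fact
    WHERE  sale_date >= DATE '2023-06-01'
    AND    sale_date <  DATE '2023-07-01'
    GROUP  BY customer_id;

You can confirm the pruning by running the statement through EXPLAIN PLAN and checking the Pstart/Pstop columns in the DBMS_XPLAN.DISPLAY output.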

Of course, if you have 10 billion rows of data, that generally implies a data warehouse for a relatively large company, and that is the bigger issue. Large companies have large numbers of employees who want to report against the data warehouse. They have large numbers of canned reports that need to hit it, a large number of data feeds, a large number of data marts that need to be populated, and relatively short load windows. These requirements drive the need for more hardware and, on occasion, the need for specialized hardware like an Exadata box.


The number of rows is not in itself an issue. Whether the system meets its performance requirements is determined just as much by the data access patterns, storage performance (usually woefully undersized), the number of concurrent sessions, the presence of precalculated summaries (e.g., materialized views), and so on.
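For the "precalculated summaries" point, a materialized view is the usual Oracle mechanism. A minimal sketch, reusing the hypothetical sales_fact table from the first answer (the names and the refresh policy are assumptions, not part of the original question):

    -- Hypothetical monthly summary; with query rewrite enabled, Oracle can
    -- answer matching aggregate queries from this much smaller object
    -- instead of scanning the multi-billion-row base table.
    CREATE MATERIALIZED VIEW sales_monthly_mv
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    ENABLE QUERY REWRITE
    AS
    SELECT TRUNC(sale_date, 'MM') AS sale_month,
           customer_id,
           SUM(amount)            AS total_amount,
           COUNT(*)               AS row_count
    FROM   sales_fact
    GROUP  BY TRUNC(sale_date, 'MM'), customer_id;

Whether a complete or a fast (incremental) refresh is appropriate depends on the kind of load windows mentioned in the other answer.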

So, I would not say that special hardware is necessarily required.