We are evaluating PostgreSQL as an alternative to the Oracle database in our application. We are using PostgreSQL 9.5, installed on a Linux machine with 128 GB of RAM, 32 CPU cores, and SSD storage. Connection pools and distributed transactions are managed by the JBoss 7 application server; SQL queries are generated and executed by Hibernate 4. Most tables have tens of millions of rows, and one of them contains hundreds of millions. In total, about 3,000 database connections (pooled by the application server) are active and in use simultaneously.

We rewrote some queries, created indexes for the slow ones, tuned database and OS parameters based on the documentation, and so on. Nevertheless, throughput is several times lower than on Oracle, and under load the database response time increases by a factor of 10-20.
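For reference, these are the kinds of postgresql.conf settings we adjusted. The values below are illustrative assumptions for a machine of this size, not our exact configuration:

```
# postgresql.conf -- illustrative values for a 128 GB / 32-core box (assumptions, not our exact config)
max_connections = 3000            # sized to match the application server pool
shared_buffers = 32GB             # commonly set to ~25% of RAM
effective_cache_size = 96GB       # planner hint: OS cache plus shared_buffers
work_mem = 16MB                   # per sort/hash node, per connection -- kept small because of the 3,000 connections
maintenance_work_mem = 2GB
max_prepared_transactions = 3000  # must be > 0 for two-phase commit (XA / distributed transactions)
random_page_cost = 1.1            # reflects SSD storage
```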
I searched around, but could not find information about anyone else (ab)using PostgreSQL in the same way:
- using thousands of active database connections
- using this large a number of distributed transactions (prepared transactions)
- storing billions of rows in a single table
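For context on the second point: our distributed transactions go through PostgreSQL's two-phase commit. A minimal sketch of what the XA driver issues under the hood (the table and transaction identifier here are made up for illustration):

```sql
-- Two-phase commit as issued by an XA transaction manager (names are hypothetical)
BEGIN;
UPDATE accounts SET balance = balance - 100 WHERE id = 1;
PREPARE TRANSACTION 'xa-txn-42';   -- phase 1: persist the transaction state to disk
-- later, once every participant has prepared:
COMMIT PREPARED 'xa-txn-42';       -- phase 2: make it visible
-- requires max_prepared_transactions > 0 in postgresql.conf
```

Each prepared-but-uncommitted transaction holds its locks and an entry in pg_prepared_xacts until phase 2 completes, which is why a high volume of them is unusual.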
Oracle handles even higher loads without any problem. I would be grateful for any shared experiences, suggestions, links, etc.
Thanks!