I'm planning to host a PostgreSQL table of about 100GB on a VPS using Dockploy, and I have sufficient disk space available. The table won't be connected to my main database; I'll mostly read from it and copy data out as needed, doing the processing in memory. For scale, it holds around 100 million records. Could this setup slow down the performance of my other database? Should I keep the table in the same database or move it to a separate one, and what's the best way to handle this?
4 Answers
The impact on performance really depends on your VPS specs. PostgreSQL can handle large tables; 100GB is nowhere near its documented limits (a single table can grow to 32 TB with the default 8 kB page size), so the practical constraints will be your RAM, disk speed, and query patterns rather than PostgreSQL itself.
In general, yes, a 100GB table should be fine. I've dealt with much larger tables. However, you didn't mention how you'll be querying it. If quick response times are critical, consider partitioning your data by month or year to improve performance.
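If you go the partitioning route, here is a minimal sketch using PostgreSQL's declarative partitioning; the table and column names (readings, recorded_at) are made up for illustration, so adapt them to your schema:

    CREATE TABLE readings (
        id          bigint      NOT NULL,
        recorded_at timestamptz NOT NULL,
        payload     jsonb
    ) PARTITION BY RANGE (recorded_at);

    -- One partition per year; add new ones as the data grows.
    CREATE TABLE readings_2023 PARTITION OF readings
        FOR VALUES FROM ('2023-01-01') TO ('2024-01-01');
    CREATE TABLE readings_2024 PARTITION OF readings
        FOR VALUES FROM ('2024-01-01') TO ('2025-01-01');

    -- Indexing the parent cascades to every partition (PostgreSQL 11+).
    CREATE INDEX ON readings (recorded_at);

Queries that filter on recorded_at then only touch the relevant partitions (partition pruning), which keeps response times stable as the table grows.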
Yes, it could slow things down a bit, but how much depends on indexing, available RAM, and disk speed. If possible, host the table on a separate server to isolate the load.
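If you do split it out to a separate instance, you can still query the big table from your main database via postgres_fdw; a rough sketch, where the host, database, credentials, and big_table name are all placeholders:

    CREATE EXTENSION IF NOT EXISTS postgres_fdw;

    CREATE SERVER big_data_server
        FOREIGN DATA WRAPPER postgres_fdw
        OPTIONS (host 'other-vps.example.com', dbname 'bigdata');

    CREATE USER MAPPING FOR CURRENT_USER
        SERVER big_data_server
        OPTIONS (user 'readonly', password 'secret');

    -- Expose only the one large table locally.
    IMPORT FOREIGN SCHEMA public LIMIT TO (big_table)
        FROM SERVER big_data_server INTO public;

    SELECT count(*) FROM big_table;  -- executes against the remote server

This keeps the heavy I/O on the other machine, while your main database only pays for the rows it actually pulls across.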
From what I know, having a large table won't directly slow down other queries, but all databases in a single PostgreSQL cluster share the same shared_buffers, WAL, and disk I/O, so heavy scans over the big table can evict your main database's hot pages from the cache and compete for bandwidth.
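One way to check whether the big table is pushing your main database's working set out of the shared buffer cache is to watch the per-database cache hit ratio; this uses only the standard pg_stat_database view, nothing specific to this setup:

    SELECT datname,
           blks_hit,
           blks_read,
           round(100.0 * blks_hit / nullif(blks_hit + blks_read, 0), 2) AS cache_hit_pct
    FROM pg_stat_database
    WHERE datname NOT LIKE 'template%';

If cache_hit_pct on your main database drops noticeably once the big table is in use, the two workloads are competing for memory.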
Why not try setting it up locally first? Most modern databases, PostgreSQL included, manage storage and indexing well. Putting the table in a separate database on the same cluster mainly aids organization; it doesn't isolate resources, since every database in the cluster shares the same buffer cache and disk, so it's not a performance fix by itself.
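For a local dry run you don't even need the real data; you can generate a table of roughly the right shape with generate_series and time representative queries. Everything below (big_table_test, the columns, the row count) is a hypothetical stand-in, scaled down so it builds quickly:

    -- Stand-in table; adjust the columns to match your real schema.
    CREATE TABLE big_table_test AS
    SELECT g AS id,
           now() - (g % 365) * interval '1 day' AS recorded_at,
           md5(g::text) AS payload
    FROM generate_series(1, 10000000) AS g;  -- 10M rows; raise toward 100M as disk allows

    CREATE INDEX ON big_table_test (recorded_at);
    ANALYZE big_table_test;

    -- Confirm a typical query uses the index instead of scanning everything.
    EXPLAIN (ANALYZE, BUFFERS)
    SELECT count(*) FROM big_table_test
    WHERE recorded_at >= now() - interval '30 days';

If the plan and timings look acceptable locally, running the same schema on the VPS mostly comes down to having enough RAM and reasonably fast disks.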

What does that mean in terms of the maximum number of rows the database can handle?