How to update a large number of rows in PostgreSQL?


I need to update thousands of rows in a table. For example, I have 1000 rows with ids 1, 2, ..., 1000:

| id   | value1 | value2 |
|------|--------|--------|
| 1    | NULL   | NULL   |
| 2    | NULL   | NULL   |
| ...  | ...    | ...    |
| 1000 | NULL   | NULL   |

Now I need to change the first 10 rows. I can do it like this:

UPDATE mytable SET value1=42, value2=111 WHERE id=1;
...
UPDATE mytable SET value1=42, value2=111 WHERE id=10;

This requires too many requests and is not very fast, so I decided on this optimization:

UPDATE mytable SET value1=42  WHERE id IN (1, 2, 3, ..., 10);
UPDATE mytable SET value2=111 WHERE id IN (1, 2, 3, ..., 10);

Note: In this case I could actually write SET value1=42, value2=111, but in real-world applications the sets of ids are not the same: for some rows I need to set value1, for others value2, and for some subset of rows I need to set both. Because of that I need two queries.
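For example, something like this (the id sets here are made up, just to show that they differ):

```sql
UPDATE mytable SET value1=42  WHERE id IN (1, 2, 5, 9);
UPDATE mytable SET value2=111 WHERE id IN (2, 3, 9, 10);
```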

The problem is that I have a very large number of ids. These queries end up being about 1 MB!

Q1: Is this the right way to optimize these updates?

Q2: Is it OK to send queries that are this large? Can I get a faster update by dividing the query into several smaller parts?

I can't use a WHERE condition to select these rows; I just have lots of row ids in my program.

Create a TEMPORARY TABLE and populate it with your target ids and new values. Then use UPDATE with a FROM clause to join against that table, so the whole thing happens in a single command.
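A minimal sketch of that, reusing the mytable/value1/value2 names from the question; the updates table name, the integer types, and the convention that a NULL in the staging table means "leave that column alone" are just assumptions for illustration:

```sql
-- Staging table holding the target ids and the new values.
-- NULL here is taken to mean "don't touch that column" for the row.
CREATE TEMPORARY TABLE updates (
    id     integer PRIMARY KEY,
    value1 integer,
    value2 integer
);

-- Populate it; a multi-row INSERT works, COPY is faster for huge sets.
INSERT INTO updates (id, value1, value2) VALUES
    (1, 42, 111),   -- set both columns
    (2, 42, NULL),  -- set only value1
    (3, NULL, 111); -- set only value2

-- One UPDATE joined to the staging table.
UPDATE mytable AS m
SET    value1 = COALESCE(u.value1, m.value1),
       value2 = COALESCE(u.value2, m.value2)
FROM   updates AS u
WHERE  m.id = u.id;
```

The COALESCE calls are only there to cover the "some rows get value1, some get value2, some get both" case from the note above; if every row gets both values you can assign u.value1 and u.value2 directly.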

In general, whenever you have large numbers of ids/values like this, life gets easier if you move them into the database first.
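If you'd rather not create a table at all, a related option (not what the answer above describes, just a sketch with the same assumed names and example values) is to join the UPDATE against a VALUES list:

```sql
-- Ship the ids/values into the query as an inline relation.
UPDATE mytable AS m
SET    value1 = v.value1,
       value2 = v.value2
FROM  (VALUES
         (1, 42, 111),
         (2, 42, 111)
      ) AS v(id, value1, value2)
WHERE  m.id = v.id;
```

For very large id lists the temporary table tends to be the more comfortable route, since you can COPY the data in and index it before the update.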