Update large table in one transaction

Hi, we have experienced some problems updating an entire table of ~1 million rows in one transaction. We are currently running CockroachDB on three nodes in a Kubernetes cluster, where other transactions may occur at the same time.
I wonder if this is even possible given Serializable isolation?
If not, what is the recommended way to approach this?

Hi @CFagrell! You are right, one big UPDATE like that is not the right approach, since it will contend with every other transaction that touches the table.

Have you seen these docs about bulk-updating data? You should do the updates in batches, using a WHERE clause to limit the affected rows.
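A minimal sketch of the batching pattern, assuming a hypothetical table `events` with primary key `id` and a boolean flag column `processed` (swap in your own table, key, and filter):

```sql
-- Run this statement in a loop from your application, each run in its
-- own implicit transaction, until it reports 0 rows affected.
UPDATE events
SET processed = true
WHERE id IN (
    -- Pick a bounded batch of rows that still need the update.
    SELECT id
    FROM events
    WHERE processed = false
    LIMIT 1000
);
```

Because each batch is its own short transaction, any serializable conflict only forces a retry of that one small batch rather than the whole million-row update, and concurrent transactions are blocked for far less time. Tune the batch size (1000 here is just an illustration) based on your row width and cluster load.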