An even more productive way to update a large table

Using PG's PREPARE

Posted on 2017-01-16 10:39:04

In continuation of the previous post, I’ve found an even faster way to update a large table.

The original idea was found in this gist.

Basically, the differences from the previous post’s solution are that we:

  • open each transaction manually
  • use a Postgres prepared statement
  • send a batch of queries inside a single transaction
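Concretely, each batch travels to Postgres as a single string: one BEGIN, a run of EXECUTE calls against the prepared statement, and one COMMIT. A minimal sketch of what that payload looks like (the `quote` helper is a naive stand-in for real SQL escaping, not part of the original code):

```ruby
# Sketch: build the one-string payload sent to Postgres per batch.
# quote is a naive stand-in for proper SQL escaping (assumption).
def quote(value)
  "'#{value.gsub("'", "''")}'"
end

batch     = ["a@example.com", "b@example.com"]
new_email = "c@example.com"

body    = batch.map { |email| "EXECUTE fooplan(#{quote(email)}, #{quote(new_email)});" }.join
payload = "BEGIN;#{body}COMMIT;"

puts payload
# => BEGIN;EXECUTE fooplan('a@example.com', 'c@example.com');EXECUTE fooplan('b@example.com', 'c@example.com');COMMIT;
```

Because the whole batch is one round trip and one transaction, Postgres parses and plans the UPDATE once (at PREPARE time) and only binds new parameters on each EXECUTE.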

So the final solution looks like:

    ActiveRecord::Base.connection.execute 'PREPARE fooplan(text, text) AS
          UPDATE   users
          SET      email = $2
          WHERE    email = $1;'

    # emails and new_email come from the previous post's setup.
    # Pass false so the last batch is not padded with nils.
    emails.in_groups_of(1000, false).each do |batch|
      sql = batch.map { |email| ActiveRecord::Base.__send__(:sanitize_sql_array, ['EXECUTE fooplan(?, ?);', email, new_email]) }.join
      ActiveRecord::Base.connection.execute "BEGIN;#{sql}COMMIT;"
    end

    ActiveRecord::Base.connection.execute 'DEALLOCATE fooplan;'
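One caveat: a prepared statement lives for the whole database session, and PREPARE fails if the name already exists, so it’s worth guaranteeing the DEALLOCATE even when a batch raises. A sketch of that, where `conn` stands in for `ActiveRecord::Base.connection` and the inline quoting is a simplification (both are assumptions, not the exact code above):

```ruby
# Sketch: wrap the whole update so DEALLOCATE always runs, even on error.
# conn must respond to #execute (e.g. ActiveRecord::Base.connection).
def mass_update_emails(conn, emails, new_email)
  conn.execute "PREPARE fooplan(text, text) AS UPDATE users SET email = $2 WHERE email = $1;"
  emails.each_slice(1000) do |batch|
    # Naive single-quote doubling stands in for real sanitization here.
    body = batch.map { |e| "EXECUTE fooplan('#{e.gsub("'", "''")}', '#{new_email}');" }.join
    conn.execute "BEGIN;#{body}COMMIT;"
  end
ensure
  conn.execute "DEALLOCATE fooplan;"
end
```

Using stdlib `each_slice` instead of ActiveSupport’s `in_groups_of` also sidesteps the nil-padding of the last batch entirely.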
