How to pass multiple values in an SQL query in Postgres?

I need to implement the solution given below (from Stack Overflow) to update multiple rows in a single query. The solution is:

update test as t set
    column_a = c.column_a
from (values
    ('123', 1),
    ('345', 2)  
) as c(column_b, column_a) 
where c.column_b = t.column_b;

How do I pass the multiple values to the query? I created a values string:

values_str = "('123', 1), ('345', 2)"

and used the following query:

res = DBCONN.query_one?("
    UPDATE test as t
    SET column_a = c.column_a
    FROM (
        VALUES $1
    ) as c(column_b, column_a)
    WHERE c.column_b = t.column_b
    RETURNING id",
    values_str,
    as: String)

This query fails with the error: syntax error at or near "$1"

How do I pass the multiple values in this query?

Thank you for your help.

You can use a separate query parameter for each value.

VALUES ($1, $2, $3)

Or, you could switch to string interpolation if you’re dynamically building the SQL. Make sure you’re not vulnerable to SQL injection, though.

sql = " ... VALUES (#{normalized_values(values)})"

normalized_values is where you loop through your values and apply quotes etc. for SQL.
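
For what it’s worth, here is a rough sketch of such a helper, assuming each row is a (string, int) pair as in the question; the helper emits the whole list of parenthesised row tuples, so the parentheses from the snippet above move into the helper. Hand-rolled quoting like this is fragile, so prefer real bind parameters or a driver-provided escaping function wherever possible:

def normalized_values(rows : Array(Tuple(String, Int32))) : String
  rows.map do |column_b, column_a|
    # Double any single quotes inside the string value before wrapping it in quotes.
    "('#{column_b.gsub("'", "''")}', #{column_a})"
  end.join(", ")
end

rows = [{"123", 1}, {"345", 2}]
sql = <<-SQL
  UPDATE test AS t SET column_a = c.column_a
  FROM (VALUES #{normalized_values(rows)}) AS c(column_b, column_a)
  WHERE c.column_b = t.column_b
  SQL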

@ejstembler Thanks. I just thought of the string interpolation about 10 minutes back, and could not believe that it did not occur to me earlier. Since I read the data sent by the client into a struct, and the field for one of the variables is of type int, that one is safe from SQL injection (I think). The other one is a string. Any pointers on how to protect it from an SQL injection attack?

An UPDATE FROM query requires a table expression. It’s not possible to use query parameters for that.

You can write the table expression explicitly and use query parameters for individual values. That only works if the number of records to update is fixed.
Writing the table expression in raw SQL is possible, but probably not a great solution.
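
For a fixed number of rows it could look roughly like this (a sketch reusing the table and column names from the question, with DBCONN as the connection from there; the explicit casts are a precaution because Postgres cannot always infer parameter types inside a VALUES list):

sql = <<-SQL
  UPDATE test AS t SET column_a = c.column_a
  FROM (VALUES ($1::text, $2::int), ($3::text, $4::int)) AS c(column_b, column_a)
  WHERE c.column_b = t.column_b
  SQL
DBCONN.exec(sql, "123", 1, "345", 2)

If you still need the RETURNING id values, query_all with as: String could be used in place of exec.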

But there’s a more general aspect to consider: Do you really need to execute multiple updates in a single statement? Is there a strict requirement for that?
Otherwise you could just use a simple update query as a prepared statement and invoke it for each row you want to update.
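
A sketch of that approach, reusing the names from the question (DBCONN assumed to be the crystal-db connection used there):

sql = "UPDATE test SET column_a = $2 WHERE column_b = $1"
rows = [{"123", 1}, {"345", 2}]
rows.each do |column_b, column_a|
  DBCONN.exec(sql, column_b, column_a)
end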

Won’t this require multiple round trips between the server and the database? I am trying to avoid that, and hence wanted to issue the query only once. Thanks.

Every changeset would be applied as a separate query, if that’s what you mean. Why would you want to avoid that though? If you want atomicity, you can just wrap it in a transaction. Otherwise I don’t see any reason why you wouldn’t want to do that.
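
For example (a sketch, assuming DBCONN is a DB::Database handle from crystal-db):

rows = [{"123", 1}, {"345", 2}] # the (column_b, column_a) pairs to apply
DBCONN.transaction do |tx|
  rows.each do |column_b, column_a|
    tx.connection.exec("UPDATE test SET column_a = $2 WHERE column_b = $1", column_b, column_a)
  end
end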

I realise it has been a long time since you posted, but maybe this will still be of some use. I had a similar issue (mine was for an insert rather than an update, but you can use a similar mechanism) and ended up with a query like this:

      #
      # Pass all the inserts to the server in a single json structure to save round-trips
      #
      ins_1 = <<-SQL
      insert
      into    tmp$statement(id, sql)
      select  *
      from    jsonb_to_recordset($1) as (id uuid, sql text)
      SQL
      :
      if ins.statement.size > 0
        @log.info { "Inserting statement (#{ins.statement.size})" }
        db.exec(ins_1, ins.statement.to_json.to_s)
      end
      :
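
Applied to the UPDATE in the original question, roughly the same mechanism might look like this (a sketch: the column types in the AS clause are guessed from the example data, and DBCONN stands for the connection handle used earlier in the thread):

require "json" # NamedTuple#to_json comes from the standard library's JSON module

rows = [{column_b: "123", column_a: 1}, {column_b: "345", column_a: 2}]
sql = <<-SQL
  UPDATE test AS t
  SET column_a = c.column_a
  FROM jsonb_to_recordset($1) AS c(column_b text, column_a int)
  WHERE c.column_b = t.column_b
  SQL
DBCONN.exec(sql, rows.to_json)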

Of course this is very postgres-centric, so if you ever wanted to use a different DB you’d be stuck.

Steve