Rails does not work correctly with a Postgres SERIAL NOT NULL column

I am developing (currently) a Rails 2.3.x application with a PostgreSQL 8.4 database backend. In my Rails application I have a model corresponding to a database table that has two columns of the SERIAL data type, both set to NOT NULL. One of these columns is defined as the primary key, both in Rails and as a PostgreSQL constraint.

Table definition:

    CREATE TABLE problem_table (
        col1 serial NOT NULL,
        col2 serial NOT NULL,
        other_col1 character varying,
        other_col2 character varying,
        ...,
        CONSTRAINT problem_table_pkey PRIMARY KEY (col1)
    );

Model class definition:

    class ModelClass1 < ActiveRecord::Base
      self.table_name = 'problem_table'
      self.primary_key = 'col1'
    end

My problem is with the SERIAL NOT NULL column that is not the primary key. When I create a record with ActiveRecord::Base#create, Rails rightfully does not set a value for the SERIAL NOT NULL primary key column, but it does set the other SERIAL NOT NULL column to NULL, which causes PostgreSQL to complain that a NOT NULL column is being set to NULL.

What I tell Rails:

    ModelClass1.create(
      other_col1: 'normal',
      other_col2: 'data',
      ...
    )

What Rails tells PostgreSQL:

    INSERT INTO problem_table ( col2, other_col1, other_col2, ... )
    VALUES ( NULL, 'normal', 'data', ... );

My question is: how can I get Rails to stop passing NULL for this column and simply skip it, letting DEFAULT nextval(my_seq) take over? Or, if that is not possible, how can I tell PostgreSQL to ignore the NULL being passed and/or treat it the same as DEFAULT?
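To illustrate, either of the following statements (sketched against the table above, leaving out the elided columns) would do what I want, since both let PostgreSQL fall back to the column default:

    -- Leave col2 out entirely, so DEFAULT nextval(...) fires:
    INSERT INTO problem_table ( other_col1, other_col2 )
    VALUES ( 'normal', 'data' );

    -- Or pass the DEFAULT keyword explicitly instead of NULL:
    INSERT INTO problem_table ( col2, other_col1, other_col2 )
    VALUES ( DEFAULT, 'normal', 'data' );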

I would just try monkey patching ActiveRecord in Rails 2.3.x, but I know that if I did, it would come back to bite me when the transition to Rails 3 comes.

I have also looked at fixing this with a PL/pgSQL BEFORE INSERT trigger, but I can't figure out how to tell PostgreSQL from PL/pgSQL to "undefine" NEW.col2, or to say NEW.col2 := DEFAULT (which doesn't work).

Answers and/or suggestions are welcome!

+4
3 answers

An easier way to do this is probably to define your own sequence and call Postgres's own nextval() yourself inside an ActiveRecord callback. nextval() handles both advancing the sequence by one step and returning the next value in a single atomic operation.

In a migration:

    def self.up
      execute "CREATE SEQUENCE myseq"
    end

    def self.down
      execute "DROP SEQUENCE myseq"
    end

And in the model:

    # assign the next sequence value only when the record is first created
    before_create :set_column_from_sequence

    def set_column_from_sequence
      self.mycolumn = self.class.connection.select_value("SELECT nextval('myseq')")
    end
+4

Not sure about the exact syntax for PL/pgSQL (my PostgreSQL installation is at home, so I can't play with it), but in Oracle PL/SQL I would do something like:

    CREATE OR REPLACE TRIGGER MYSCHEMA.PROBLEM_TABLE_BI
      BEFORE INSERT ON MYSCHEMA.PROBLEM_TABLE
      REFERENCING NEW AS NEW
      FOR EACH ROW
    BEGIN
      IF :NEW.COL2 IS NULL THEN
        :NEW.COL2 := MY_SEQ.NEXTVAL;
      END IF;
    END PROBLEM_TABLE_BI;

PL/pgSQL should be similar.
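In PL/pgSQL it would look roughly like the untested sketch below; PostgreSQL wants the body in a separate trigger function, and the function, trigger, and sequence names (my_seq) are just placeholders:

    CREATE OR REPLACE FUNCTION problem_table_bi() RETURNS trigger AS $$
    BEGIN
      -- only fill in col2 when the INSERT did not supply a value
      IF NEW.col2 IS NULL THEN
        NEW.col2 := nextval('my_seq');
      END IF;
      RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER problem_table_bi
      BEFORE INSERT ON problem_table
      FOR EACH ROW EXECUTE PROCEDURE problem_table_bi();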

Hope this helps.

0

Found something that works. I may end up needing a second SERIAL column in multiple tables, each incremented by its own sequence. The following is a solution in which only one trigger function is required for any number of tables containing a col1 incremented by a unique sequence.

    CREATE OR REPLACE FUNCTION fn_set_col1_as_nextval_sequence_if_null()
      RETURNS trigger AS
    $BODY$
    BEGIN
      IF NEW.col1 IS NULL THEN
        SELECT nextval(TG_ARGV[0]) INTO NEW.col1;
      END IF;
      RETURN NEW;
    END;
    $BODY$
    LANGUAGE 'plpgsql';

    CREATE TRIGGER trg_set_col1_as_nextval_sequence_on_problem_table_create
      BEFORE INSERT ON problem_table
      FOR EACH ROW
      EXECUTE PROCEDURE fn_set_col1_as_nextval_sequence_if_null('problem_table_col1_seq');
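For example, reusing it on a second table only takes another CREATE TRIGGER pointing at the same function, with that table's sequence name as the argument (other_table and other_table_col1_seq below are made-up names for illustration):

    CREATE TRIGGER trg_set_col1_as_nextval_sequence_on_other_table_create
      BEFORE INSERT ON other_table
      FOR EACH ROW
      EXECUTE PROCEDURE fn_set_col1_as_nextval_sequence_if_null('other_table_col1_seq');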

I would still like to figure out how to fix the Rails behavior for 2.3.x and 3.0, if possible, but this will work in the meantime.

0
