Rails: rake db:migrate *very* slow on Oracle

I use Rails with the oracle_enhanced adapter to build a new interface for a legacy application.

Database migrations run successfully, but the rake task takes an incredibly long time to finish. The changes to the database themselves happen quite quickly (1 or 2 seconds), but dumping db/schema.rb takes more than an hour. (See the migration example below.)

This is a relatively large schema (about 150 tables), but I am sure it should not take this long to dump out a description of each table.

Is there a way to speed this up by simply taking the last schema.rb and applying only the change specified in the migration to it? Or can I skip the schema dump altogether?

I understand that schema.rb is used to build the test database from scratch each time, but in our case a large chunk of the database logic lives in table triggers, which are not included in schema.rb, so rake test does not work for us anyway. (That is a whole different problem which I need to sort out at some other point.)

 dgs@dgs-laptop:~/rails/voyager$ time rake db:migrate
 (in /home/dgs/rails/voyager)
 == 20090227012452 AddModuleActionAndControllerNames: migrating ================
 -- add_column(:modules, :action_name, :text)
    -> 0.9619s
    -> 0 rows
 -- add_column(:modules, :controller_name, :text)
    -> 0.1680s
    -> 0 rows
 == 20090227012452 AddModuleActionAndControllerNames: migrated (1.1304s) =======


 real    87m12.961s
 user    0m12.949s
 sys     0m2.128s
2 answers

After all the migrations have been applied to the database, rake db:migrate invokes the db:schema:dump task to regenerate the schema.rb file from the current database schema.

db:schema:dump calls the tables method to get the list of all tables, and then the indexes method and the columns method are called for each table. You can find the SQL SELECT statements used by these methods in the oracle_enhanced_adapter.rb file of the activerecord-oracle_enhanced-adapter gem. Basically, it selects from the ALL_% or USER_% data dictionary views to find all this information.
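To make the cost concrete, here is a rough sketch of why the dump scales with the table count: one dictionary query for the table list, then one columns query and one indexes query per table. This is my own illustration with a fake connection class, not the actual Rails or oracle_enhanced source.

```ruby
# FakeConnection stands in for the adapter; each method call models one
# round-trip to the Oracle data dictionary (all_tables, all_tab_columns,
# all_indexes / user_ind_columns in the real adapter).
class FakeConnection
  attr_reader :query_count

  def initialize(table_count)
    @tables = (1..table_count).map { |i| "table_#{i}" }
    @query_count = 0
  end

  def tables
    @query_count += 1
    @tables
  end

  def columns(_table)
    @query_count += 1  # one dictionary SELECT per table for columns
    []
  end

  def indexes(_table)
    @query_count += 1  # and another per table for indexes
    []
  end
end

# Schema-dump-style loop over a 150-table schema:
conn = FakeConnection.new(150)
conn.tables.each do |t|
  conn.columns(t)
  conn.indexes(t)
end
puts conn.query_count  # 1 + 2*150 = 301 dictionary queries
```

So even a modest per-query slowdown (a few seconds against a busy data dictionary) multiplies into the hour-plus dump times described in the question.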

Initially, I had performance problems with the original Oracle adapter when using it against databases with many different schemas (performance can depend on the total number of tables in the whole database, not just in your schema), which is why I made some optimizations in the oracle_enhanced adapter. It would be good to find out which methods are slow in your case (I suspect it is either the indexes method or the columns method, each of which runs once per table).

One way to debug this would be to put some timing messages into oracle_enhanced_adapter.rb so that you can identify which method calls take so long.
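For example, a generic timing wrapper along these lines could be pointed at the adapter's indexes and columns methods. The MethodTimer module and the SlowAdapter stand-in class below are my own inventions for illustration; in practice you would wrap ActiveRecord::ConnectionAdapters::OracleEnhancedAdapter the same way.

```ruby
require 'benchmark'

module MethodTimer
  # Replace an instance method with a version that prints its wall-clock time.
  def self.time_method(klass, method_name)
    original = klass.instance_method(method_name)
    klass.send(:define_method, method_name) do |*args|
      result = nil
      elapsed = Benchmark.realtime { result = original.bind(self).call(*args) }
      puts "#{method_name}(#{args.first.inspect}) took #{'%.4f' % elapsed}s"
      result
    end
  end
end

# Stand-in for the real adapter, so the sketch runs without Oracle:
class SlowAdapter
  def indexes(_table_name)
    sleep 0.01  # pretend this is a slow data dictionary query
    []
  end
end

MethodTimer.time_method(SlowAdapter, :indexes)
SlowAdapter.new.indexes("employees")
# prints something like: indexes("employees") took 0.0102s
```

Run the migration once with the wrappers in place and the per-table timings will show directly whether tables, columns, or indexes is the bottleneck.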


The problem is mostly solved after some digging in oracle_enhanced_adapter.rb.

The problem boiled down to: too many tables in the local schema (a lot of EBA_%, EVT_%, EMP_% and SMP_% tables had been created there at some point), archive tables being included in the dump, and one data dictionary SELECT that took 14 seconds to execute.

To fix the speed, I did three things:

  • Dropped all the unneeded tables (about 250 of the 500)
  • Excluded the archive tables from the schema dump
  • Cached the result of the long-running data dictionary query

This improved the migration/schema dump time for the remaining 350 tables from about 90 minutes to 15 seconds. Fast enough.

My code follows (for inspiration, not copy-and-paste — this code is quite specific to my database, but you should be able to adapt the idea). You need to create the temporary table manually. It takes about 2 or 3 minutes to build — still too long to regenerate on every migration, and it is pretty static anyway =)

 module ActiveRecord
   module ConnectionAdapters
     class OracleEnhancedAdapter
       def tables(name = nil)
         select_all("select lower(table_name) from all_tables where owner = sys_context('userenv','session_user') and table_name not like 'A!_%' escape '!'").inject([]) do |tabs, t|
           tabs << t.to_a.first.last
         end
       end

       # TODO: think of some way to automatically create the rails_temp_index_table
       #
       # Table created by:
       #   create table rails_temp_index_table as
       #     SELECT lower(i.index_name) as index_name, i.uniqueness,
       #            lower(c.column_name) as column_name, i.table_name
       #       FROM all_indexes i, user_ind_columns c
       #      WHERE c.index_name = i.index_name
       #        AND i.owner = sys_context('userenv','session_user')
       #        AND NOT exists (SELECT uc.index_name FROM user_constraints uc
       #                         WHERE uc.constraint_type = 'P'
       #                           AND uc.index_name = i.index_name);
       def indexes(table_name, name = nil) #:nodoc:
         result = select_all(<<-SQL, name)
           SELECT index_name, uniqueness, column_name
             FROM rails_temp_index_table
            WHERE table_name = '#{table_name.to_s.upcase}'
            ORDER BY index_name
         SQL

         current_index = nil
         indexes = []
         result.each do |row|
           if current_index != row['index_name']
             indexes << IndexDefinition.new(table_name, row['index_name'],
                                            row['uniqueness'] == "UNIQUE", [])
             current_index = row['index_name']
           end
           indexes.last.columns << row['column_name']
         end
         indexes
       end
     end
   end
 end
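If rebuilding rails_temp_index_table by hand gets tedious, one option is a small rake task you run before a batch of migrations. The task name and wiring below are my own suggestion, not part of the original patch; in a real app each statement would go through ActiveRecord::Base.connection.execute instead of puts.

```ruby
require 'rake'
extend Rake::DSL  # make namespace/desc/task available outside a Rakefile

namespace :db do
  desc "Rebuild the cached index-metadata table used by the patched adapter"
  task :rebuild_temp_index_table do
    # Drop the old copy first (omitted here), then recreate it with the
    # same query the answer shows in the TODO comment above.
    sql = <<-SQL
      CREATE TABLE rails_temp_index_table AS
        SELECT lower(i.index_name) AS index_name, i.uniqueness,
               lower(c.column_name) AS column_name, i.table_name
          FROM all_indexes i, user_ind_columns c
         WHERE c.index_name = i.index_name
           AND i.owner = sys_context('userenv','session_user')
           AND NOT EXISTS (SELECT uc.index_name FROM user_constraints uc
                            WHERE uc.constraint_type = 'P'
                              AND uc.index_name = i.index_name)
    SQL
    # In a real app: ActiveRecord::Base.connection.execute(sql)
    puts sql
  end
end
```

That keeps the cache refresh as an explicit, cheap-to-skip step, which fits the observation that the index metadata is pretty static anyway.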
