
Very long requests "Column definitions" before main sql request. #2004

Open
vitalyliber opened this issue Apr 16, 2020 · 12 comments
Open

Very long requests "Column definitions" before main sql request. #2004

vitalyliber opened this issue Apr 16, 2020 · 12 comments

Comments

@vitalyliber
Copy link

Steps to reproduce

When I start the Rails server and try to make an Active Record request to the DB, I get very slow executions named "Column definitions". They can take between 10 and 20 seconds. Sometimes "Column definitions" happens twice.

Expected behavior

This should work faster.

Actual behavior

I see very long queries before the main request is executed:

Column definitions (10458.9ms) SELECT cols.column_name AS name, cols.data_type AS sql_type, cols.data_default, cols.nullable, cols.virtual_column, cols.hidden_column, cols.data_type_owner AS sql_type_owner, DECODE(cols.data_type, 'NUMBER', data_precision, 'FLOAT', data_precision, 'VARCHAR2', DECODE(char_used, 'C', char_length, data_length), 'RAW', DECODE(char_used, 'C', char_length, data_length), 'CHAR', DECODE(char_used, 'C', char_length, data_length), NULL) AS limit, DECODE(data_type, 'NUMBER', data_scale, NULL) AS scale, comments.comments as column_comment FROM all_tab_cols cols, all_col_comments comments WHERE cols.owner = 'PORTAL_DB' AND cols.table_name = 'ORDER_PURCHASE_V' AND cols.hidden_column = 'NO' AND cols.owner = comments.owner AND cols.table_name = comments.table_name AND cols.column_name = comments.column_name ORDER BY cols.column_id

System configuration

Rails version: 6.0.2.1

Oracle enhanced adapter version:

Ruby version: 2.7.0

Oracle Database version: 18.4.0.0.0

I think it can be related to Oracle views, which I mapped to models like this:

class Supplier < ApplicationRecord
  self.table_name = 'supplier_v'
  self.primary_key = :supplier_id

  attribute :supplier_id, :integer
  attribute :axapta_id, :string
  attribute :supplier_name, :string
  attribute :supplier_inn, :string
  attribute :file_id, :integer
end
@alexishida

The same problem here!

@top4ek

top4ek commented Jun 10, 2020

Same problem

@taylorwilliams

It looks like we're having the same issue ourselves

@stale

stale bot commented Aug 22, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the wontfix label Aug 22, 2020
@grizzly94

Same here

@stale stale bot removed the wontfix label Aug 25, 2020
@IlyaLisin

Same problem

@stale

stale bot commented Dec 25, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the wontfix label Dec 25, 2020
@yahonda yahonda removed the wontfix label Dec 29, 2020
@ioev

ioev commented Jan 11, 2021

We have a large Rails 4.2 project with many views and schema loading was giving us a lot of grief in how long it took. In some cases connection pools appeared to be resetting when certain oracle operations were running which would cause the schemas to reload on every request, greatly exacerbating the problem.

Making use of the schema cache had solved this problem for us, maybe it could help here as well?
https://kirshatrov.com/2016/12/13/schema-cache/
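
For anyone trying this, a rough sketch of the standard Rails mechanism (nothing here is specific to oracle-enhanced): generate the dump with bin/rails db:schema:cache:dump, and make sure the flag below stays enabled so the dump is loaded at boot.

# config/environments/production.rb (or development.rb while investigating)
Rails.application.configure do
  # With this flag on (it defaults to true), Rails loads column metadata at boot
  # from the file written by `bin/rails db:schema:cache:dump`
  # (db/schema_cache.yml on Rails 5.1+) instead of running a "Column definitions"
  # query against all_tab_cols for every model.
  config.active_record.use_schema_cache_dump = true
end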

Edit: I'm revisiting this because dumping the schema of one of our databases that has a lot of tables takes a lot longer on 6.0.6 than it used to in version 1.7.0. I assume this is due to the OPTIMIZER_FEATURES_ENABLE('11.2.0.2') hint, which might work better in some versions of Oracle than others?

@viniciusgati

viniciusgati commented Feb 11, 2021

Same here.
My Oracle client is on 21.1.

@viniciusgati

> We have a large Rails 4.2 project with many views and schema loading was giving us a lot of grief in how long it took. In some cases connection pools appeared to be resetting when certain oracle operations were running which would cause the schemas to reload on every request, greatly exacerbating the problem.
>
> Making use of the schema cache had solved this problem for us, maybe it could help here as well?
> https://kirshatrov.com/2016/12/13/schema-cache/
>
> Edit: I'm revisiting this because dumping the schema of one of our databases that has a lot of tables takes a lot longer on 6.0.6 than it used to in version 1.7.0. I assume this is due to the OPTIMIZER_FEATURES_ENABLE('11.2.0.2') hint, which might work better in some versions of Oracle than others?

I don't think so; in my case I don't have the entire schema in the Rails project, and some models come from a legacy database.

@ioev

ioev commented Feb 25, 2021

I've been digging around in this today as we have a fairly large schema that takes a really long time to run db:schema:cache:dump on v6.0.6. The queries themselves didn't seem to be the problem, but there was something else happening causing each table to take around 5-10 seconds to dump. I was able to turn on some logging through AD and OCI8, and it turns out there is a pretty significant query running many times for each table, due to .describe being called (and not being cached?) for each part of a schema dump.

I see 6 of this exact query for each table being dumped:

OCI8: SELECT owner, table_name, 'TABLE' name_type FROM all_tables WHERE owner = 'OMNIDATA_UAT' AND table_name = 'ADDRESS' UNION ALL SELECT owner, view_name table_name, 'VIEW' name_type FROM all_views WHERE owner = 'OMNIDATA_UAT' AND view_name = 'ADDRESS' UNION ALL SELECT table_owner, table_name, 'SYNONYM' name_type FROM all_synonyms WHERE owner = 'OMNIDATA_UAT' AND synonym_name = 'ADDRESS' UNION ALL SELECT table_owner, table_name, 'SYNONYM' name_type FROM all_synonyms WHERE owner = 'PUBLIC' AND synonym_name = 'TABLE_NAME'
OCI8 timing: 0.92 seconds

Seems it's also called twice when the schema is loaded for a particular query when initializing a model.

Also, this query is much faster without the unions on all_synonyms.

The behavior of this in 1.6.9 is different and much faster. Testing in Rails 4.2, .describe ends up here and returns instantly:
https://github.com/rsim/oracle-enhanced/blob/v1.6.9/lib/active_record/connection_adapters/oracle_enhanced/oci_connection.rb#L362

In any case, I was able to drastically improve the speed of this by monkey-patching in some caching for the .describe method, but that query is still taking upwards of 3 seconds at times.
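
A rough sketch of the kind of patch I mean (illustrative only): it assumes the raw connection class still exposes describe(name) as in the 1.6.9 file linked above, and the constant below is the 6.x OCI8 connection class, which may need adjusting for other adapter versions or the JDBC driver.

# config/initializers/oracle_describe_cache.rb
module OracleDescribeCache
  # Memoize describe results per connection so repeated schema lookups for the
  # same table don't re-run the all_tables/all_views/all_synonyms UNION query.
  # Trade-off: a recreated view or synonym won't be seen until the process
  # restarts (or the cache is cleared).
  def describe(name)
    @describe_cache ||= {}
    @describe_cache[name] ||= super
  end
end

ActiveRecord::ConnectionAdapters::OracleEnhanced::OCIConnection.prepend(OracleDescribeCache)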

@rshell

rshell commented Mar 11, 2022

I recently hit the same problem on a move to Rails 6.1/Ruby 2.7. The approach I'm currently looking at is to replace many slow queries with one slow query: I changed the driver to build a columns cache in a single SQL call. This has really improved startup time and made Oracle usable on a development box again.

The heart of the fix is a change in SchemaStatements, moving to a statement optimized "for my needs": I removed the column comments since I don't use them, and swapped to user_tab_cols since it's a little faster and all my tables are in one schema.

        # Load column metadata for every table in the current schema with a
        # single query and group the rows by table name.
        def build_columns_cache
          sql = <<~SQL.squish
            SELECT LOWER(cols.table_name) AS table_name,
                   cols.column_name AS name,
                   cols.data_type AS sql_type,
                   cols.data_default,
                   cols.nullable,
                   cols.virtual_column,
                   cols.hidden_column,
                   cols.data_type_owner AS sql_type_owner,
                   DECODE(cols.data_type, 'NUMBER', data_precision,
                                          'FLOAT', data_precision,
                                          'VARCHAR2', DECODE(char_used, 'C', char_length, data_length),
                                          'RAW', DECODE(char_used, 'C', char_length, data_length),
                                          'CHAR', DECODE(char_used, 'C', char_length, data_length),
                                          NULL) AS limit,
                   DECODE(data_type, 'NUMBER', data_scale, NULL) AS scale,
                   NULL AS column_comment
              FROM user_tab_cols cols
             WHERE cols.hidden_column = 'NO'
             ORDER BY cols.table_name, cols.column_id
          SQL
          cache = {}
          select_all(sql).each do |row|
            key = row['table_name']
            (cache[key] ||= []) << new_column_from_field(key, row)
          end
          cache
        end

        # Answer column definitions from the cache when the table is in it,
        # otherwise fall back to the adapter's per-table lookup.
        def columns(table_name)
          table_name = table_name.to_s
          @columns_cache ||= build_columns_cache
          @columns_cache[table_name] ||= super(table_name)
        end

This is on https://github.com/rshell/oracle-enhanced/tree/biorails_rails61 (still working on this branch). Once it's tested I will submit it to core. So far I have:

  • moved optimizer_features_enable = '11.2.0.2' to database.yml and added all possible session parameters
  • added the columns cache build
  • exposed bind_param_array so bulk insert/update with array binds is possible
  • used BINARY_DOUBLE for float, since we do scientific work and it matches the C double type
