
Setting the searchexpression takes very long

edited March 2007 in General
Hi,
if I set the SearchExpression (MyAutoSearchField.SearchExpression := Value;)
of an autosearch field, it takes a very long time the first time. After that,
setting the same expression again is very fast. A while ago I looked deep
inside the RB source. It looks like RB scans the whole database structure
(tables and fields), which takes very long. We are using an Oracle database
with many tables. Even with a fresh new database containing just a few example
records I get this behavior. We are using Delphi 5 with RB 7.04 and Delphi
2006 with RB 10.04.
Same result with both versions. Any ideas?

Regards
Sven

Comments

  • edited March 2007
    Hi Sven,

    I will take a look at this; however, it sounds like this may be a design
    limitation of changing the autosearch condition on the fly. This is
    something we can possibly look at enhancing in a later version of
    ReportBuilder. How many tables are being accessed in your query?

    --
    Regards,

    Nico Cizik
    Digital Metaphors
    http://www.digital-metaphors.com

  • edited March 2007
    > ReportBuilder. How many tables are being accessed in your query?

    It makes no difference whether the query uses one table or five. The
    result is the same.

  • edited March 2007
    Hi Sven,

    How are you connecting to your Oracle database? Are you using the end-user
    solution?

    --
    Regards,

    Nico Cizik
    Digital Metaphors
    http://www.digital-metaphors.com

  • edited March 2007
    I'm using the end-user solution, too, with either ADO or the BDE.


  • edited March 2007
    Hi Sven,

    Are you able to recreate the issue using one of our example demos? We have
    made some enhancements to the way the end-user solution works for RB 10 so
    perhaps this will speed things up for you. Also, have you tried using DOA
    to connect to Oracle?

    --
    Regards,

    Nico Cizik
    Digital Metaphors
    http://www.digital-metaphors.com

  • edited March 2007

    Sven said:
    *This* is the problem, combined with the fact that the DM code is not
    scalable.

    The example database is small, so you will never see the problem there. You
    must test with a large database.

    We also had the problem where it was just too slow (5-10 minutes wait) each
    time you...
    - open the query builder
    - close the query builder
    - change from query design tab to report design tab

    The reason it takes so long is that, each time you wait, the DM code...
    1) Reloads *all* database information, whether it needs it or not.
    At this stage CPU is at 5-15% and network traffic is slight.
    2) Does a brute-force search through all possible joins (with O(n^2)
    looping through arrays of records).
    At this stage CPU is at 100% and network traffic is zero.
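    For illustration, that brute-force step amounts to something like the
    following sketch (plain Delphi; `Tables`, `JoinExists`, and `RegisterJoin`
    are hypothetical names, not the actual DM identifiers):

    ```pascal
    // Every pair of loaded tables is tested for a possible join,
    // so the work grows with the square of the table count.
    var
      i, j: Integer;
    begin
      for i := 0 to TableCount - 1 do
        for j := 0 to TableCount - 1 do
          if JoinExists(Tables[i], Tables[j]) then
            RegisterJoin(Tables[i], Tables[j]);
    end;
    ```

    With several hundred tables this means tens of thousands of pairwise
    checks, which is why the CPU sits at 100% while the network stays idle.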

    However, all is not lost. I managed to solve the problem. Now the GUI is
    almost instant. (Takes 1-2 secs to select a table in my query designer.)

    In short, the way I solved it was to add a filter so that the DM code only
    sees tables that have been referred to in the query designer. So whenever
    the user selects a new table, that table is added to the filter and I tell
    the DM code to do a refresh. (This is why it takes 1-2 secs each time a
    user selects a table). This means that all the other times the DM code does
    the same kind of refresh, the filter makes sure it only sees a very small
    subset of the database.

    The steps are...

    1) Update the table filter and add it to the queries in the
    TDataSet.BeforeQuery event for the three datasets that load...
    - table information
    - field information
    - join information

    2) Tell the DM code to do a refresh (with an updated filter) with something
    like this...

    procedure ResetTableCache;
    begin
      gMetaDataManager.Clear;
      frmReport.qryTable.Close;
      frmReport.qryField.Close;
      frmReport.qryJoin.Close;
    end;


    Good luck!
  • edited March 2007
    Hi Peter,

    Thanks for the info. We are always looking to improve the way ReportBuilder
    accesses and handles data. We would be interested in seeing what changes
    you would recommend in order to support larger databases. If possible,
    please send these to support@digital-metaphors.com so we can research them
    and possibly add the enhancements to a later release of ReportBuilder.

    One other note: have you tried using the DataDictionary to accomplish the
    same thing you describe? I'm unsure how it performs on a database as large
    as yours; however, I believe the DataDictionary should perform the same
    filtering.

    --
    Regards,

    Nico Cizik
    Digital Metaphors
    http://www.digital-metaphors.com

  • edited March 2007

    The changes I made replaced the whole arrays-of-records approach with a
    proper OO design. With my libraries I gain access to all the information I
    need without ever having to reload anything, and information is loaded only
    when it's needed. (Although, as stated earlier, the DM code must still
    reload its information, that's fairly quick provided the set of tables is
    small.)

    I can't see how the DataDictionary would help. It seems to connect the
    pieces together, but it doesn't do any filtering.

    Looking at it I have:

    ppDataDictionary
    - FieldPipeline.DataSource.DataSet: TxxxDataSet
    - JoinPipeline.DataSource.DataSet: TxxxDataSet
    - TablePipeline.DataSource.DataSet: TxxxDataSet

    It is those TxxxDataSet instances (in our case TOracleDataSet) that must
    apply the table filter in their BeforeQuery events.

    E.g. (qryTable: TOracleDataSet; the TRmt... classes are part of my
    library):

    procedure TfrmReportBuilder.qryTableBeforeQuery(Sender: TOracleDataSet);
    var
      Filter: string;
    begin
      Filter := UsedTablesFilter('TABLE_NAME');
      qryTable.SQL.Text :=
        'SELECT table_name, table_alias ' +
        'FROM s_rpm_table ' +
        'WHERE ' + Filter +
        ' ORDER BY 1';
    end;

    function TfrmReportBuilder.UsedTablesFilter(const FieldName: string): string;
    var
      i: Integer;
      TableList: TRmtTableList;
    begin
      Result := '';
      TableList := TRmtTableList.Create;
      try
        TableFactory.ListPopulatedTables(TableList);
        for i := 0 to TableList.Count - 1 do
        begin
          if i > 0 then
            Result := Result + ' OR ';
          Result := Result + FieldName + '=''' + TableList[i].TableName + '''';
        end;
        // If the filter is empty, then don't load any records
        if Result = '' then
          Result := FieldName + ' = ''''';
      finally
        TableList.Free;
      end;
    end;

    I suggest you make a simple program that generates tables for an Oracle
    database (we use Oracle too). Generate several hundred tables with joins
    and some random data. Then test with that.
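    Such a generator can be sketched in a few lines of Delphi (the table name
    and column layout here are made up purely for testing; adjust as needed,
    and add join metadata to match however your s_rpm_join data is maintained):

    ```pascal
    var
      i: Integer;
      Script: TStringList;
    begin
      Script := TStringList.Create;
      try
        // Emit DDL for a few hundred small tables; each has a primary
        // key plus a parent_id column that can be used in test joins.
        for i := 1 to 300 do
          Script.Add(Format(
            'CREATE TABLE test_%d (id NUMBER PRIMARY KEY, ' +
            'parent_id NUMBER, name VARCHAR2(50));', [i]));
        Script.SaveToFile('create_test_tables.sql');
      finally
        Script.Free;
      end;
    end;
    ```

    Running the resulting script against a scratch schema gives a database
    large enough to make the metadata-scan delays visible.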
  • edited March 2007

    Sounds great. Spoke to my manager and team leader and they both liked that
    too. We need to check with our executive first, and we'll get back to you.

    Also, we've added several enhancements that won't be interesting/meaningful
    to you, so we'll have to make a demo that's a cut down version of what we
    have.

    Do you have a guesstimate on when the 2007 upgrade is coming out?

    My e-mail address is peter.t@rmt.com.au