
Report Server Performance

edited March 2003 in Server
I'm using RB 7.01 / Delphi 6 SP2 / Oracle 7.3 with reports stored in the
database.

It's taking about 20 seconds to display the first page of a very simple
report.


- Is this a typical load time?

- What should I look for to speed this up?


Thanks! Jon Gray

Comments

  • edited March 2003

    The performance of our Live Demo Server speaks for itself - it is
    instantaneous. The Live Server Demo
    (http://www.digital-metaphors.com/rbServerLive) contains both archived
    and live reports.

    The ReportBuilder Demo Server is running on a dual-processor Dell
    PowerEdge machine with MS Windows 2000 Server and IIS 5.0. The RB Demo
    Server uses Elevate Software's DBISAM Server for the database.

    Try some tests to isolate the issue:

    1. Test a report that contains a single TppLabel, no dataset, and is
    stored to an .rtm file.

    2. Test the same report as above, but store the report in a database.

    3. Test a simple query based report that is stored to an .rtm file.

    4. Test the same as above with a report stored to a database.

    Hopefully the above tests will help you to isolate the issue. You could
    run these reports in a standard application first and then in a server
    application.
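
    Tests 1 and 2 above can be sketched in Delphi roughly like this (a
    sketch only; the component names ppReport1 and plReports are
    placeholders, not from the demo projects):

    // 1. Load a single-label report from an .rtm file and print it
    ppReport1.Template.FileName := 'C:\Reports\SimpleLabel.rtm';
    ppReport1.Template.LoadFromFile;
    ppReport1.Print;

    // 2. Load the same report from the database template table
    ppReport1.Template.DatabaseSettings.DataPipeline := plReports;
    ppReport1.Template.DatabaseSettings.Name := 'SimpleLabel';
    ppReport1.Template.LoadFromDatabase;
    ppReport1.Print;

    Timing each step separately should show whether the delay is in the
    template load or in the report generation itself.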




    Best regards,

    Nard Moseley
    Digital Metaphors
    www.digital-metaphors.com
  • edited March 2003
    Your insight and testing steps were quite helpful.

    I'm not sure if this is correct, but I have been using the stock Oracle
    Explorer database server demo, modified for my database connection
    (RBServer\Demos\Servers\03. Explorer Databases\Oracle 8). I'm using DOA
    connectivity.

    The slowdown, it seems, is the DOA qryItems query firing and returning all
    rows from the database. With several hundred reports it takes 10+ seconds
    to return them all. I modified qryItems and qryFolder to NOT return all
    rows. While this improved performance, I can't help but think this query
    should be parameterized for the specific report being requested.
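
    For reference, a parameterized version of the item query with DOA might
    look roughly like this (a sketch; the query name and the bind-variable
    name are assumptions, not from the demo):

    // select only the requested report, using a DOA bind variable
    qryItems.SQL.Text := 'select * from rbItem where Name = :REPORT_NAME';
    qryItems.DeclareVariable('REPORT_NAME', otString);
    qryItems.SetVariable('REPORT_NAME', 'SomeReport');
    qryItems.Execute;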

    I feel like I'm missing something big, like I should not be using the demo
    or it needs to be modified in some way to serve reports one at a time.

    Am I on the right track?




  • edited March 2003

    Building the directory tree that is shown by the report explorer will cause
    the dataset to be opened and all records to be traversed. In its current
    implementation this can be slow for very large numbers of reports - as you
    have observed. You may want to redesign the application to use the
    ReportTemplateVolume (i.e. .rtm file based repository), rather than the
    report explorer tables.


    --
    Nard Moseley
    Digital Metaphors
    http://www.digital-metaphors.com

  • edited March 2003
    I can live with the slow display for the report explorer - makes sense.

    I'm having the problem when executing a single TrsClientReport through the
    same server, not via an explorer.

    Should I be using the same report explorer server to run single reports from
    an application?


  • edited March 2003
    Your suggestion to use the ReportTemplateVolume and .RTM files was a good
    one. The reports now display very quickly, just a few seconds for simple
    reports.

    Of course I want it all, though :-) I don't want to lose the Explorer / Web
    Tier.

    Can a collection of RTM files be used within a Report Explorer somehow?

    Thanks - Jon Gray


  • edited March 2003

    The .rtm files on the server can be arranged into subfolders. You just need
    to point the ReportTemplateVolume.FileDirectory at the root directory. The
    ClientReportExplorer and Webtier can both display a report-explorer style
    interface that shows the familiar tree structure of folders and reports.

    (The TppReportExplorer is limited to working with the rbFolder and rbItem
    tables. It does not support the .rtm file based configuration.)
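
    A minimal configuration sketch (the component name
    rsReportTemplateVolume1 is a placeholder):

    // point the volume at the root of the .rtm directory tree;
    // subfolders appear as folders in the explorer-style interface
    rsReportTemplateVolume1.FileDirectory := 'C:\Reports';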


    --
    Nard Moseley
    Digital Metaphors
    http://www.digital-metaphors.com

  • edited April 2003
    Ouch - I just ran into the performance slowness myself. Do you have any
    plans to address this? I understand that the DBPipeline used for the items
    and folders tables is accessed generically through a datasource, so you
    cannot count on parameterizing a query, but in the long run this makes
    storing reports in a database unusable in a client/server environment (for
    anything more than a handful of reports, anyway).

    Centralizing reports in the database is so convenient!


  • edited April 2003

    For the rbItems data, try using a TQuery rather than a TTable. The default
    SQL should select all records, because that is needed to build the volume
    directory. Set the ReportVolume.DirectoryOptions.RefreshInterval to a very
    high number so that the directory can be cached for long periods of time.

    In the ReportVolume.LoadReportStart event, try adding code to select only
    the report that needs to be loaded. Below is an example. Note that it
    assumes the report name is unique.

    procedure TDatamodule1.rsReportExplorerVolume1LoadReportStart(
      Sender: TObject; Stream: TStream);
    var
      lsReportName: String;
    begin
      lsReportName := TppReportTemplate(Sender).DatabaseSettings.Name;

      {quote the name so the generated SQL is a valid string literal}
      qryItem.SQL.Text :=
        'select * from rbItem where Name = ' + QuotedStr(lsReportName);
    end;


    --
    Nard Moseley
    Digital Metaphors
    http://www.digital-metaphors.com

  • edited April 2003
    The Items pipeline was already being fed by a TQuery, so I can't improve
    on this very much. Isn't the report template already loaded when the code
    in the LoadReportStart event fires?

    I bit the bullet yesterday and created a little separate utility to 'grab'
    the reports from the database and save them to a directory structure on
    the file system. Quite honestly, even one report loads so much faster
    from the file system than from the database that this may be a better
    solution for me in the long run.

    Thanks again for your help. I am getting closer and closer to deploying
    this on a few internal machines, so hopefully in the next week or so I
    will be buying a few CPUs' worth of deployment licenses.



This discussion has been closed.