Speed issue, report gets slower and slower
Hi All,
We're using Delphi 7 and RB 9.03 connected to DBISAM Client Server v4.21
build 11 as a data source. This is a big complex application and from one
set of tables we print a Client Statement. When this prints, to screen or
printer, it starts very fast and gets slower and slower and never finishes
printing.
For this report we have three tables linked as Master Detail as follows:
Table1 <=Master linked by Index1 Detail=> Table2 <=Master linked by Index2
Detail=> Table3
Table 1 is the Client Details, Table 2 is the Invoice Header and Table 3 is
the Invoice Lines. Table 2 uses Table 1 as its MasterSource and Table 3 uses Table 2 as its MasterSource.
Table 1 has about 1000 records, Table 2 has about 100,000 records and Table
3 has about 400,000 records.
To print a Statement for a client we apply a filter to Tables 1 & 2.
Printing a Statement for 1 Client is instant. Printing a range of, say, 50 Clients takes about 1 minute.
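For concreteness, here is a rough sketch of the kind of setup described above. The component, table, database and field names are assumptions rather than the actual application code; only the shape of the master-detail linking and the filtering follows the description:

  // Assumed units: SysUtils, Db and the DBISAM 4 VCL unit dbisamtb
  procedure ConfigureStatementTables(const AFromClient, AToClient: string;
    tblClients, tblInvoices, tblLines: TDBISAMTable;
    dsClients, dsInvoices: TDataSource);
  begin
    // Table 1: client details (about 1,000 records)
    tblClients.DatabaseName := 'StatementDB';
    tblClients.TableName    := 'Clients';

    // Table 2: invoice headers (about 100,000 records), detail of Table 1
    tblInvoices.DatabaseName := 'StatementDB';
    tblInvoices.TableName    := 'InvoiceHeader';
    tblInvoices.IndexName    := 'Index1';          // index on the client key
    dsClients.DataSet        := tblClients;
    tblInvoices.MasterSource := dsClients;
    tblInvoices.MasterFields := 'ClientID';        // assumed link field

    // Table 3: invoice lines (about 400,000 records), detail of Table 2
    tblLines.DatabaseName := 'StatementDB';
    tblLines.TableName    := 'InvoiceLines';
    tblLines.IndexName    := 'Index2';             // index on the invoice key
    dsInvoices.DataSet    := tblInvoices;
    tblLines.MasterSource := dsInvoices;
    tblLines.MasterFields := 'InvoiceNo';          // assumed link field

    // Statements for a range of clients: filter Tables 1 & 2 on the client key
    tblClients.Filter   := Format('ClientID >= %s AND ClientID <= %s',
      [QuotedStr(AFromClient), QuotedStr(AToClient)]);
    tblClients.Filtered := True;
    tblInvoices.Filter   := tblClients.Filter;
    tblInvoices.Filtered := True;

    tblClients.Open;
    tblInvoices.Open;
    tblLines.Open;
  end;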
When we ask for a full print run, it begins by processing about 2 records per second and keeps that rate up for about 150 records. Then it abruptly slows to about 1 record every 3 seconds and gets slower and slower, never finishing the print.
Basically it takes a couple of minutes to process 20% of the records, then
an hour to process the next 20% and then 10 hours for the next 20% etc. No
one has lived long enough to see it finish.
We are not caching pages, and I've looked through the posts to no avail. This smells like a buffer or setting issue. We get the same result on our development system and on the client's system, on different hardware (server and workstations).
Any ideas?
Thanks and regards,
Bruce.
Comments
1) Retrieve only the needed records:
In the Table1 and Table2 queries, retrieve only the records that match the master key, e.g. in the Table1 query:

  SELECT * FROM table1
  WHERE <link_field1> IN (SELECT <link_field1> FROM table2
                          WHERE <link_field2> IN (SELECT <link_field2> FROM table3))

2) Join the pipelines, not the datasets (see the sketch below).
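To make those two suggestions concrete, here is a rough run-time sketch. The queries, component and field names are my assumptions, and the ReportBuilder pipeline properties used (MasterDataPipeline, MasterFieldLinks) are written from memory of the RB 9 API, so treat this as a sketch to check against the actual ppDB/ppDBPipe units rather than finished code:

  // Assumed components: qryClients, qryInvoices, qryLines: TDBISAMQuery;
  //                     plClients, plInvoices, plLines: TppDBPipeline

  // 1) Retrieve only the records that will actually print, ordered by the link fields
  qryClients.SQL.Text :=
    'SELECT * FROM Clients WHERE ClientID IN ' +
    '(SELECT ClientID FROM InvoiceHeader) ORDER BY ClientID';
  qryInvoices.SQL.Text :=
    'SELECT * FROM InvoiceHeader ORDER BY ClientID, InvoiceNo';
  qryLines.SQL.Text :=
    'SELECT * FROM InvoiceLines ORDER BY InvoiceNo';

  // 2) Link the pipelines, not the datasets: no MasterSource/MasterFields on the
  //    queries, so the details are not re-selected every time the master scrolls
  plInvoices.MasterDataPipeline := plClients;
  with plInvoices.MasterFieldLinks.Add do
  begin
    MasterFieldName := 'ClientID';
    DetailFieldName := 'ClientID';
  end;

  plLines.MasterDataPipeline := plInvoices;
  with plLines.MasterFieldLinks.Add do
  begin
    MasterFieldName := 'InvoiceNo';
    DetailFieldName := 'InvoiceNo';
  end;

Because each detail set is ordered by its linking field, the idea is that ReportBuilder can walk forward through the data once instead of re-selecting the details for every master record.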
Dennis's recommendations are correct. Take a look at Demo 64 (dm0064.pas)
located in the \RBuilder\Demos\1. Reports\... directory for an example of
what can happen when standard Delphi linking is used. You will notice that this demo runs very slowly compared to the other master-detail reports in that section, because the detail records are re-selected each time the master dataset's record position changes. Since your detail tables each have more and more records, this would definitely slow the report generation down.
--
Regards,
Nico Cizik
Digital Metaphors
http://www.digital-metaphors.com
I have tried joining only the pipelines, not the datasets, and this made no difference.
I have not tried selecting specific records with a query as Dennis suggests; I've always applied filters to the whole table. What's the difference between filtering and a query?
Something else is happening here, though. I ran a data backfill process on Table 3, the one with 400,000 records, to fill in an empty date field. A field that is NOT an index, NOT used as a link to another table and NOT even printed in the report. This was just a sequential scan through the records, conditionally filling in the field in question.
After I did this, the statements for 930 Clients generated in about 3 minutes.
I'll look at Demo 64. I'd like to try to understand what's going on here, because this app will have a great deal more data in it in the future.
Thanks and regards,
Bruce.
When you filter, all the records end up in memory. Not good! With an SQL database, use a Query, not a Table!
How did you do that backfill? With First/Next/Eof, or with a Query? Use SQL UPDATE.
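To spell out the difference, here is roughly what the two approaches look like. The table, field and component names (tblLines: TDBISAMTable, qryUpdate: TDBISAMQuery, a PostedDate field) are guesses for illustration only; Date comes from SysUtils:

  // Record-by-record backfill: every one of the ~400,000 rows makes a round
  // trip between the workstation and the DBISAM server
  tblLines.Open;
  tblLines.First;
  while not tblLines.Eof do
  begin
    if tblLines.FieldByName('PostedDate').IsNull then
    begin
      tblLines.Edit;
      tblLines.FieldByName('PostedDate').AsDateTime := Date;
      tblLines.Post;
    end;
    tblLines.Next;
  end;

  // Set-based alternative: one UPDATE statement, executed by the server
  qryUpdate.SQL.Text :=
    'UPDATE InvoiceLines SET PostedDate = :NewDate WHERE PostedDate IS NULL';
  qryUpdate.ParamByName('NewDate').AsDateTime := Date;
  qryUpdate.ExecSQL;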
This is what I would have suggested. One other thing you may try is disabling the Report Outline. Occasionally, for very large reports, this feature can slow report generation down.
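For anyone who wants to try that from code rather than the designer, my understanding (property name from memory, please verify against the RB 9 source) is that it is controlled through the report's OutlineSettings, along the lines of:

  ppReport1.OutlineSettings.Enabled := False;  // skip building the outline for big runs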
--
Regards,
Nico Cizik
Digital Metaphors
http://www.digital-metaphors.com
Yes, I always use a filter and First/Next/Eof, a hangover from non-SQL DBs. This is the first SQL-based app I've done, so I'm short on best practice.
Thanks for the help,
Regards,
Bruce.