Disconnected pipelines
We are converting our web-tier report server to support plug-ins: Delphi
packages which each contain a report. The goal is to be able to add
reports without stopping the report server, so we don't have to add
reports in the middle of the night.
We designed the reports and saved them off to RTM files with the pipelines
available on the same data module that contains the
rsReportTemplateVolume. We would like to move the pipelines to data
modules contained in the plug-ins and then assign the pipelines to the
report at run-time in code.
We've noticed that when we do this we lose grouping and master-detail
relationships, although everything else seems to connect. Can you offer
any suggestions on how we could do this without either building entire
reports in code (which we know does work) or assigning a pipeline to
each component on the report?
1. The ReportVolume components are capable of dynamically discovering new
report templates and archives. Thus if you save your reports to .rtm files
or to the report explorer database tables, then new reports can be added by
saving them to the appropriate repository.
To take advantage of this you need to construct the reports so that they are
entirely portable. Use the ReportBuilder Data Workspace (DADE) to define the
data and use the Calc Workspace (RAP) to code event-handlers and
calculations. Report definitions created in this manner are completely
portable, containing the data access logic, the code, and the layout.
2. Reports executed on the server must have a thread-safe environment in
which to execute. To use datamodules, you would need to create a separate
instance for each report - perhaps you are already doing this. Prior to the
report being loaded you could try to create the datamodule instance and then
change the Owner of the data access components from the datamodule to the
report module. To do this you would need to iterate through the components in
the datamodule and call DataModule.RemoveComponent() and then call
ReportModule.InsertComponent(). For the DBPipeline components you simply
call DBPipeline.ChangeOwner(ReportModule). Once the data access components
are all owned by the report module, the report will likely connect
and run properly.
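The reparenting step described above could be sketched roughly as follows. This is only an illustration: the procedure name and parameter names are made up, and it assumes the datamodule holds only data access components that are safe to move (the source confirms DBPipeline.ChangeOwner exists; everything else goes through the RemoveComponent/InsertComponent pair).

```pascal
uses
  Classes, ppDB;

{ Sketch: move data access components from a per-report datamodule
  instance to the report module before the report is loaded.
  MoveDataAccessToReportModule is a hypothetical helper name. }
procedure MoveDataAccessToReportModule(ADataModule: TDataModule;
  AReportModule: TComponent);
var
  I: Integer;
  C: TComponent;
begin
  { Walk backwards because removing a component shrinks the list. }
  for I := ADataModule.ComponentCount - 1 downto 0 do
  begin
    C := ADataModule.Components[I];
    if C is TppDBPipeline then
      { Pipelines have a dedicated ownership-transfer method. }
      TppDBPipeline(C).ChangeOwner(AReportModule)
    else
    begin
      ADataModule.RemoveComponent(C);
      AReportModule.InsertComponent(C);
    end;
  end;
end;
```

Each report would get its own datamodule instance, so this runs once per report load rather than against a shared module.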
--
Nard Moseley
Digital Metaphors Corporation
www.digital-metaphors.com
Thanks for your response. I work with Mark Lincoln, who has discussed
this with you in the past. He says we can't take this approach because
we are getting our data from a data server over the web rather than
having components on a data module which can establish connections to
the database server.
Regardless, I solved my problem; let me share the solution in case it helps
anyone else. I got our reports to work, and there were two things I had to do:
(1) save the reports to the RTM file without data pipelines being
connected; (2) assign the data pipeline as the last step in the plug-in
(I had been doing it first).
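The two steps above could look something like this in the plug-in. This is a sketch under assumptions: the procedure and parameter names are invented, and it uses the standard TppReport.Template properties for loading an .rtm file.

```pascal
uses
  ppReport, ppDB;

{ Sketch: load an .rtm that was saved with no pipeline connected,
  then assign the pipeline as the very last step.
  LoadPluginReport is a hypothetical helper name. }
procedure LoadPluginReport(AReport: TppReport; APipeline: TppDBPipeline;
  const AFileName: string);
begin
  AReport.Template.FileName := AFileName;
  AReport.Template.LoadFromFile;  { layout loads with no pipeline attached }

  { Assign the pipeline last, so grouping and master-detail links
    resolve against it once the layout is already in place. }
  AReport.DataPipeline := APipeline;
end;
```

The ordering is the point: assigning the pipeline before the template load was what broke the groups and master-detail relationships.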