Hi
I observed that ReportServer occupies high resident memory [RES] after concurrent report schedules. I had to kill the Tomcat process after 24 hours because I still saw the same RES value. I used "top" to monitor.
Hardware: RedHat box, physical memory: 256g [~200g usable]
Software: Java 8 / Tomcat 8 / RS (3.0.2) / heap [new and initial]: 100g / unlimited metaspace
Scenario
- 4 concurrent schedules: retrieve the same data set and export it to XLSX
- # of columns: 70 [string/numeric]
- Report size: ~300 mb
- Resultset size: 1.0 million records
Observed Results
All of the schedules executed successfully and ended after 10 minutes.
RES
before : 1.4g
after : 45g
Is it because the internal file system holds the dataset/files?
Is there a way to access the file and SFTP it to a preferred server in a schedule? We cannot use the email option since report sizes may vary.
As I observed, the server does not create physical files; are files only generated on demand, when using the "Save as" option in a TeamSpace on a particular report resource?
Is there a way to tune memory usage after concurrent executions in future releases?
thanks
Hi,
You can create custom schedule targets via ReportServer's SendTo functionality. For more information, have a look at https://reportserver.net/en/guides/scri … s/Send-To/. As for SFTP, have a look at this thread: https://forum.reportserver.net/viewtopi … 2386#p2386.
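In case it helps, here is a minimal sketch of the SFTP part only, using the JSch library. This is not ReportServer's SendTo API, and the host, credentials and paths are made-up placeholders you would replace inside your own scripted target:

    // Sketch: upload an exported report to an SFTP server with JSch.
    // Host, credentials and paths below are placeholders.
    import com.jcraft.jsch.ChannelSftp;
    import com.jcraft.jsch.JSch;
    import com.jcraft.jsch.Session;

    public class SftpUploadSketch {
        public static void main(String[] args) throws Exception {
            JSch jsch = new JSch();
            Session session = jsch.getSession("reportuser", "sftp.example.com", 22);
            session.setPassword("secret");
            // For a quick test only; verify the host key properly in production.
            session.setConfig("StrictHostKeyChecking", "no");
            session.connect();

            ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
            sftp.connect();
            // Push the exported XLSX to the preferred server.
            sftp.put("/tmp/report.xlsx", "/incoming/report.xlsx");
            sftp.exit();
            session.disconnect();
        }
    }

The scripting guide linked above shows how such logic is hooked into a schedule as a custom target.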
Regarding the memory issue: XLSX is simply a zip archive of a bunch of XML files. If the result is 300mb on disk, then the uncompressed size that is kept in memory during the evaluation is orders of magnitude bigger.

As for the memory not changing after 24 hours, this is expected behavior. Memory that is once claimed by Java is not released back to the operating system. This does not mean that Java does not reuse that memory for its own operations, but it is not given back to the OS. Thus, from the system's perspective, the observed memory usage of a Java application only ever grows.
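If you want to see how much the 300mb figure understates the in-memory footprint, you can compare the compressed and uncompressed entry sizes of the XLSX yourself. A small sketch (the file path is just a placeholder, and the actual in-memory model during generation is typically larger still than the raw XML):

    // Sketch: compare compressed vs. uncompressed size of an XLSX (a zip archive).
    import java.util.Enumeration;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;

    public class XlsxSizeSketch {
        public static void main(String[] args) throws Exception {
            long compressed = 0, uncompressed = 0;
            // Placeholder path: point this at one of your exported reports.
            try (ZipFile zip = new ZipFile("/tmp/report.xlsx")) {
                Enumeration<? extends ZipEntry> entries = zip.entries();
                while (entries.hasMoreElements()) {
                    ZipEntry e = entries.nextElement();
                    compressed += e.getCompressedSize();  // bytes on disk
                    uncompressed += e.getSize();          // bytes of raw XML
                }
            }
            System.out.printf("compressed: %d MB, uncompressed: %d MB%n",
                    compressed / (1024 * 1024), uncompressed / (1024 * 1024));
        }
    }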
Best Regards,
Thomas
Hi
Thanks for the reply.
Let's set aside the option of narrowing down a high-volume data set with column filters for the following scenario.
I observed that RS tries to restart the schedule in the following scenario, throwing exceptions:
The result set size exceeds the maximum number of rows an XLSX file can hold. [LIMIT set to 1.6 million records]
What is the standard convention that a reporting tool should follow when it encounters such a scenario?
1. Stop the schedule with a warning and ask the user to narrow down the data.
2. Generate multiple reports.
... etc.
Is there a way to configure the result set size for a particular report without using LIMIT/ROWNUM, etc.?
Can we use rcondition in such a scenario?
Also, if no records satisfy the report criteria, is there a way to set a message in the schedule output without executing the schedule?
Actually, I did manage to set the rcondition.