Troubleshooting performance issues


1. Task Manager – CPU check

Windows Server 2016

Obrázek.png

Windows Server 2008

Obrázek.png

  • On the “Details” tab in the “Task Manager”, sort the running processes in descending order according to the “CPU” column and find out which process is causing unusual CPU usage.

Obrázek.png

1.1. The CPU is permanently busy at 100 %

  • Permanent CPU usage can indicate a loop in a web application, console application, or service, caused by an error in the application source code.
  • Permanent CPU usage by the “w3wp.exe” process may indicate a loop in the application pool or an error in one of the web applications.
  • Permanent CPU usage by the “firebird.exe” process can indicate either poorly designed SQL queries that disproportionately load the Firebird database server, or a long-term increasing database server load that must be addressed by a hardware increase in the number of CPU cores.
  • Permanent CPU usage by the “sqlservr.exe” process can indicate either poorly designed SQL queries that disproportionately load the MSSQL database server, or a long-term increasing database server load that must be addressed by a hardware increase in the number of CPU cores.

1.2. CPU usage oscillates around 70 % or above

  • Higher CPU usage by the “w3wp.exe” process may indicate suboptimal algorithms in some of the web applications.
  • Higher CPU usage by the “firebird.exe” process can indicate either poorly designed SQL queries that disproportionately load the Firebird database server, or a long-term increasing database server load that must be addressed by a hardware increase in the number of CPU cores.
  • Higher CPU usage by the “sqlservr.exe” process can indicate either poorly designed SQL queries that disproportionately load the MSSQL database server, or a long-term increasing database server load that must be addressed by a hardware increase in the number of CPU cores.

1.2.1. Examples of suboptimal algorithms

  • For loops that call an SQL query in each iteration – the solution is to load all data in bulk before the loop itself, and then index this data into a “Dictionary” object (see the sketch after this list). This procedure is described in detail in the separate External functions manual.
  • Generating large files in PDF or XLSX format using “Aspose” libraries – the solution is a hardware increase in the number of cores.
  • High-resolution image processing – the solution is a hardware increase in the number of cores.
  • Suboptimal algorithms are often located in one of the external functions, or they can be caused by printing to print templates. In both cases, it is important to check the “Scripts (total_worker_time)” and “External functions (total_worker_time)” reports, in which scripts or external functions with long durations are displayed at the top.
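The following C# sketch illustrates the bulk-load pattern mentioned above. It is only schematic – the table and column names (“customer”, “customer_id”, “name”) and the use of “SqlConnection”/“SqlCommand” are illustrative assumptions, not the NET Genium API; the exact procedure is given in the External functions manual.

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

class BulkLoadExample
{
    // Suboptimal: one SQL query per iteration of the loop.
    static void Slow(SqlConnection connection, DataTable orders)
    {
        foreach (DataRow order in orders.Rows)
        {
            using (var command = new SqlCommand("SELECT name FROM customer WHERE id = @id", connection))
            {
                command.Parameters.AddWithValue("@id", order["customer_id"]);
                string customerName = (string)command.ExecuteScalar(); // one database round trip per row
            }
        }
    }

    // Better: load all customers in one bulk query before the loop,
    // index them into a Dictionary, and look them up in memory.
    static void Fast(SqlConnection connection, DataTable orders)
    {
        var customers = new Dictionary<int, string>();
        using (var command = new SqlCommand("SELECT id, name FROM customer", connection))
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                customers[reader.GetInt32(0)] = reader.GetString(1);
            }
        }

        foreach (DataRow order in orders.Rows)
        {
            string customerName = customers[(int)order["customer_id"]]; // in-memory lookup, no SQL query
            // ... process the order ...
        }
    }
}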

1.3. CPU usage oscillates below 50 %

This condition is normal.

2. Task Manager – memory check

Windows Server 2016

Obrázek.png

Windows Server 2008

Obrázek.png

  • On the “Details” tab in the “Task Manager”, sort the running processes in descending order according to the “Memory (private working set)” column and find out which process is causing unusual memory usage.

Obrázek.png

2.1. Memory usage is permanently at 100 %

  • It is necessary to determine the memory usage of the “w3wp.exe” process, which may be limited by the operating system after a one-time execution of memory-intensive tasks, so the currently allocated memory may no longer be sufficient for the smooth running of the web applications within this process. In crisis situations, the NET Genium cache can be temporarily disabled by creating a “DenyCache.txt” file in the “Config” directory, which reduces memory consumption and speeds up the processing of web requests when memory is low.
  • Permanent memory usage by the “w3wp.exe” process may indicate suboptimal algorithms in some of the web applications.
  • Permanent memory usage by the “firebird.exe” process can indicate either poorly designed SQL queries that disproportionately load the Firebird database server, or a long-term increasing database server load that must be addressed by a hardware increase in memory.
  • Permanent memory usage by the “sqlservr.exe” process can indicate either poorly designed SQL queries that disproportionately load the MSSQL database server, or a long-term increasing database server load that must be addressed by a hardware increase in memory.
  • Permanent memory usage by the “sqlservr.exe” process also often means that one of the instances (usually the default one) does not have a memory usage limit set, which prevents the operating system and other applications from running properly. Each MSSQL instance should always have its memory usage limit set to a safe value, such as 80 % of the total memory, so that the operating system and other applications have enough room for their operation (see the sketch after this list).
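The memory limit is normally set in SQL Server Management Studio (Server Properties / Memory / Maximum server memory); the following C# sketch only shows the equivalent T-SQL executed from code. The connection string and the value 13107 MB (roughly 80 % of 16 GB RAM) are illustrative assumptions.

using System.Data.SqlClient;

class SetMaxServerMemory
{
    static void Main()
    {
        // Illustrative connection string – adjust to the target MSSQL instance.
        const string connectionString = "Server=localhost;Database=master;Integrated Security=true";

        // sp_configure + RECONFIGURE cap the instance's memory usage.
        const string sql = @"
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 13107;
RECONFIGURE;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}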

2.2. Memory usage is between 60 % and 99 %

  • It is necessary to determine the memory usage of the “w3wp.exe” process, which may be limited by the operating system after a one-time execution of memory-intensive tasks, so the currently allocated memory may no longer be sufficient for the smooth running of the web applications within this process. In crisis situations, the NET Genium cache can be temporarily disabled by creating a “DenyCache.txt” file in the “Config” directory, which reduces memory consumption and speeds up the processing of web requests when memory is low.
  • It is a good idea to consider a hardware increase in memory, primarily to accommodate one-time runs of memory-intensive tasks.
  • Check the settings of each MSSQL server instance to see whether a memory usage limit is set.

2.3. Memory usage is below 50 %

This condition is normal.

3. Pending Windows operating system updates

  • On both the application and database servers, check the queue of pending updates that are waiting to finish installation. These updates can wait in the background and take up memory on the entire server, even though Task Manager shows that the server has enough free memory. The server then lacks the resources to start and run processes normally, and both ordinary applications on the application server and the database server itself fail. On the database server, this condition typically shows up as the MSSQL instance allocating only 200 MB, for example, even though it has a much higher limit on allocated memory, or no limit at all.

4. Debug Diagnostic Tool

  • In critical situations, it is necessary to end individual processes in the “Task Manager”. Before this step, it is useful to create a “memory dump” of the process, which can then be analyzed using the “Debug Diagnostic Tool” to detect both looping algorithms and the cause of excessive memory usage.
  • A simplified procedure for working with “memory dumps” is given in the “Config\Tools\MemoryDumps.txt” file of every NET Genium:
1) Download and install Debug Diagnostic Tool v2 Update 2 (https://www.netgenium.com/download/DebugDiagx64.msi)
2) Locate memory dumps (C:\Users\abc\AppData\CrashDumps)
3) Run DebugDiag
4) Default Analysis / CrashHangAnalysis
5) Add Data Files
6) Start Analysis

5. My Computer – disk check

  • Running out of disk space often means irreversible data loss due to a loss of database consistency. The fastest way to free up disk space is to delete log files or old database and file backups.

Obrázek.png

  • The “TDP x-Ray” program is an ideal tool for detailed analysis of disk occupancy by individual directories and files.

Obrázek.png

  • Log files include both the IIS log files, located in the default directory “C:\inetpub\logs\LogFiles”, and the log files of each specific NET Genium, located in the “Logs” directory of that NET Genium.
  • Database backups are often located in the “Backup” directory of each NET Genium, or in “E:\BackupServer”.
  • By default, IIS has web application traffic logging turned on, which must be turned off as soon as possible. At the same time, it is important to delete the existing logs, which are stored in “C:\inetpub\logs\LogFiles” by default. Use the following procedure to determine the exact location of these logs from the specific IIS settings.
  • Go to the IIS settings and select the topmost node of your server/computer (for example “OFFICE” / “TOMAS-PC”), so that the settings are inherited by the child nodes, and select “Logging” in the “IIS” section.

Obrázek.png

  • Then click “Disable” in the “Actions” panel on the right side of the window. This disables traffic logging for your server/computer node.

Obrázek.png

6. Resource Monitor – disk operations check

  • On the “Performance” tab in the “Task Manager”, click on the “Open Resource Monitor” link.

Obrázek.png

  • On the “Disk” tab in the “Resource Monitor”, sort the disk activities in descending order according to the “Write” column and find out which files – Firebird or MSSQL databases – show unusual disk writes.

Obrázek.png

  • Unusual disk writes can also involve the file attachments in the “Files” directory, which can cause NET Genium to slow down when rendering portlets, view pages, or edit forms. Increased disk write activity can slow down the reading of the control templates stored in “ascx” files if NET Genium does not have the file attachments from the “Files” directory placed on another disk using the “Files/Files” configuration file. Redirecting file attachments to another disk location is described in detail in the separate Administrators Guide.
  • To quickly check disk operations, open the NET Genium settings, and turn on “Logging to console” or “Logging to disk” if you want to go through the rendering process of portlets, view pages or edit forms in detail:

Obrázek.png

  • Close and reopen the web browser with NET Genium, and check “Disk Time” in the console at the bottom of the screen, which is the sum of the read times of all control templates from disk:

Obrázek.png

7. RunningQueries.exe

  • The “RunningQueries.exe” application is located in the “bin” directory of each NET Genium, and is used to evaluate currently processed database queries. Running this application creates a “RunningQueries.htm” log file in the “Logs” directory, and opens this log file at the same time. The queries mentioned in this log file may reduce the performance of the database server. At the same time, they can be used to identify the database in which performance issues occur.

Obrázek.png

8. SQL Server – Activity Monitor

  • Start “SQL Server Management Studio” and click on the “Activity Monitor” icon.

Obrázek.png

  • On the “Recent Expensive Queries” tab, identify the list of recently run demanding database queries. These queries can also be used to identify the database in which performance issues occur.

Obrázek.png

9. SQL Server – Top Queries by Total CPU Time

  • Start “SQL Server Management Studio”, right-click on the top node in “Object Explorer”, and select “Reports / Standard Reports / Performance – Top Queries by Total CPU Time”.

Obrázek.png

  • In this report, identify the list of database queries that are most burdensome on the database server. These queries can also be used to identify the database in which performance issues occur.

Obrázek.png

  • Once the database in which the performance issues occur is identified, it is more convenient to run a specific NET Genium and display the “dm_exec_query_stats (total_worker_time)” report. This report displays an identical list of database queries that are most burdensome on the database server, and also offers the ability to identify the control that runs the database query (“…” link).
  • SQL Server also offers a “Top Queries by Average CPU Time” report, which has the equivalent in NET Genium under the “dm_exec_query_stats (average_worker_time)” report.
  • In general, it is recommended to browse the reports in the following order:
    • dm_exec_query_stats (total_worker_time) – In this report, database queries that have a reasonable processing time but run too often come first, as do database queries that run at reasonable intervals but have a long processing time.
    • dm_exec_query_stats (execution_count) – In this report, database queries that run frequently come first, so it is important to consider “average_worker_time” for these queries.
    • dm_exec_query_stats (average_worker_time) – In this report, database queries whose individual executions have a long processing time come first.
  • A typical error is an incorrectly designed query that runs very often and does not have an index set on the “ng_url” column.

Obrázek.png

10. NET Genium logging and log evaluation

  • The steps in the previous chapters should be used to identify the database in which the performance issues occur.
  • Enable logging in the NET Genium settings by selecting “To database and disk”. A detailed description of NET Genium settings is given in the separate NET Genium settings manual.

Obrázek.png

  • Evaluate the logs using reports, and identify the view pages, edit forms, database queries, scripts, or external functions that take the longest. A detailed description of the reports is given in the separate Reports manual.
    • View pages (average_worker_time)
    • View pages (total_worker_time)
    • Edit forms (average_worker_time)
    • Edit forms (total_worker_time)
    • Database queries (average_worker_time)
    • Database queries (total_worker_time)
    • Scripts (average_worker_time)
    • Scripts (total_worker_time)
    • External functions (average_worker_time)
    • External functions (total_worker_time)

11. Editing SQL queries and program code

  • The most common causes of performance issues are improperly designed database queries or inefficient retrieval of data from the database. In most cases, it is enough to set the indexes on the database columns correctly, or to change the way the database tables are joined. It is far more laborious, but just as important, to change the way data is retrieved from the database so that as few queries as possible are sent to the database, and so that all queries of a given task run over a single connection to the database (see the sketch below).
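The following C# sketch shows the single-connection pattern only schematically – the table and column names and the use of “SqlConnection”/“SqlCommand” are illustrative assumptions, not the NET Genium API.

using System.Data.SqlClient;

class SingleConnectionExample
{
    static void ProcessTask(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open(); // opened once for the whole task

            using (var command = new SqlCommand("SELECT id, name FROM customer", connection))
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read()) { /* ... */ }
            }

            using (var command = new SqlCommand("SELECT id, customer_id, price FROM invoice", connection))
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read()) { /* ... */ }
            }
        } // the connection is closed once, after all queries of the task have run
    }
}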

11.1. Index settings

  • A detailed description of indexes is given in the separate Reports manual, chapter “Indexes”.

11.2. Joining database tables

  • Joins often cause problems in one of three ways: they are completely useless, because the result does not use any columns retrieved from the joined database table; they use a slower “LEFT JOIN” instead of a faster “INNER JOIN” in cases where “LEFT” is unnecessary; or they use “LEFT JOIN” in combination with a condition in the database query – see the Reports manual, chapter “Database Queries (LEFT JOIN + Condition)”.
  • As soon as large database tables are joined together, or too many tables are joined, you need to completely change the data retrieval logic and split a costly database query with many joins into several simpler database queries.

11.3. SQL Server Database Engine Tuning Advisor – SQL query tuning

  • SQL Server includes a useful tool for tuning database queries – SQL Server Database Engine Tuning Advisor. A simplified procedure for tuning database queries is provided in the “Config\Tools\TunningQueries.txt” file of every NET Genium:
1) Run SQL Server Management Studio
2) Right click on the instance name and select 'Reports / Standard Reports / Performance - Top Queries by Total CPU Time'
3) Identify top queries with constants in a condition that can be improved with indexes
4) Right click on the background of the report and select 'Print / Excel'
5) Open printed Excel file
6) Create 'netgenium.sql' file and insert top queries using the following syntax:

use netgenium
go
SELECT ...
SELECT ...

7) Run SQL Server Database Engine Tuning Advisor
8) Click 'Start New Session'
9) Select 'File' as a 'Workload' and browse for 'netgenium.sql'
10) Mark 'netgenium' database
11) Click 'Start Analysis'
12) Analyze 'Recommendations' tab
13) Create new indexes as recommended

11.4. Minimizing SQL queries when retrieving data from the database

  • The most common error when retrieving data from a database is “SELECT *”. The query should never contain an asterisk as a wildcard for all columns of the source database table, or for all columns of all joined tables; it should always contain a comma-separated list of the columns to read (see the sketch at the end of this chapter).
  • Loading strings of unlimited length (TextArea, RichTextBox, MultiListBox) causes a significant slowdown in database query processing, especially on the Firebird database server. These columns should only be loaded when absolutely necessary.
  • A detailed description of how to retrieve data from the database using C#, including examples, is given in the separate External functions manual, chapter “Reading data from the database”.
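The following C# sketch shows only the principle of an explicit column list – the table and column names are illustrative assumptions, and the use of “SqlConnection”/“SqlCommand” stands in for the data access described in the External functions manual.

using System.Data.SqlClient;

class ColumnListExample
{
    static void ReadInvoices(string connectionString)
    {
        // Explicit, comma-separated list of columns instead of "SELECT *";
        // unlimited-length columns (TextArea, RichTextBox, MultiListBox) are left out.
        const string sql = "SELECT id, ng_number, ng_date, ng_price FROM ng_invoice";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    int id = reader.GetInt32(0);
                    // ... work only with the listed columns ...
                }
            }
        }
    }
}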