Systems have a habit of slowing down over time and with increased user loads. When customers report slow performance on their system, a few typical questions can help provide clues as to why:
- Has the user load on a system increased significantly?
- Has the data throughput on a system increased significantly?
- How old is your hardware and software?
- What does your database look like?
User Load
As your company grows, the number of users on your system may grow as well. More users on a system will often lead to an increased load on hardware resources such as CPU, RAM and hard disk space. These resources are impacted differently in every situation. If your company serves SAP-related applications through a terminal applications server, for example, these hardware resources will undoubtedly be impacted. This is one of the first areas to check if systems begin to slow down.
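If you want a quick read on those resources, a short script can snapshot CPU, memory and disk usage while users are on the system. This is a minimal sketch, assuming a Python environment with the psutil package installed; the drive path, thresholds and sample interval are illustrative only.

```python
import psutil

# Sample overall CPU usage as a percentage over a 5-second window.
cpu_percent = psutil.cpu_percent(interval=5)

# Current RAM usage as a percentage of total installed memory.
ram_percent = psutil.virtual_memory().percent

# Free space on the system drive (adjust the path for your server).
disk = psutil.disk_usage("C:\\")
disk_free_gb = disk.free / (1024 ** 3)

print(f"CPU usage: {cpu_percent:.0f}%")
print(f"RAM usage: {ram_percent:.0f}%")
print(f"Disk free: {disk_free_gb:.1f} GB")

# Illustrative thresholds only; tune these for your own environment.
if cpu_percent > 85 or ram_percent > 90 or disk_free_gb < 20:
    print("One or more resources look constrained - worth a closer look.")
```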
Data Throughput
As your business grows, sometimes the throughput of data will increase as well. For example, when your company first implements your system you might be adding one transaction (of 50 lines) every minute. Two years down the road, you might be adding one transaction (of 50 lines) every second, sixty times the volume. As with the number of users, if the amount of data activity increases greatly it can tax the system. This is often the case where transactions are highly automated, such as an e-commerce site feeding orders back into your ERP system.
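To put that growth in concrete terms, here is a small back-of-the-envelope calculation. It only uses the illustrative figures from the example above, plus an assumed eight-hour business day:

```python
LINES_PER_TRANSACTION = 50
HOURS_PER_BUSINESS_DAY = 8  # assumption, for illustration only

# Year one: one transaction per minute.
lines_per_day_start = LINES_PER_TRANSACTION * 60 * HOURS_PER_BUSINESS_DAY

# Two years later: one transaction per second.
lines_per_day_later = LINES_PER_TRANSACTION * 60 * 60 * HOURS_PER_BUSINESS_DAY

print(f"Start:  {lines_per_day_start:,} lines per day")    # 24,000
print(f"Later:  {lines_per_day_later:,} lines per day")    # 1,440,000
print(f"Growth: {lines_per_day_later // lines_per_day_start}x")  # 60x
```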
Hardware/Software Age
As time goes on, operating systems and applications increase in size and complexity. Hardware and software can eventually fall out of step with the increasing requirements of newer versions. For example, the amount of RAM required to run any given program never seems to decrease with newer versions. Again, benchmarking your current IT infrastructure against the changing demands on it can be an important part of assessing why a system is performing slowly.
Database
Over time, the data in your database becomes fragmented. This is a lot like how the data on a hard drive becomes fragmented: related bits of data end up stored in entirely different parts of the drive, so the drive has to work harder and longer to reach them all. SQL databases (for those of you running SQL) are no different; they become fragmented over time. Running a defragmentation script can help reverse the slowdown associated with this fragmentation.
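As a rough illustration of what such a script can look like, here is a minimal sketch assuming Microsoft SQL Server and a Python environment with the pyodbc package. The connection string and fragmentation thresholds are placeholders you would adjust for your own system, and your database administrator may well prefer a different maintenance routine.

```python
import pyodbc

# Placeholder connection string; point this at your own SQL Server instance.
CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
            "DATABASE=MyDatabase;Trusted_Connection=yes;")

# Find indexes whose fragmentation exceeds 10%, using SQL Server's
# built-in sys.dm_db_index_physical_stats view.
FRAGMENTATION_QUERY = """
SELECT s.name, t.name, i.name, ps.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ps
JOIN sys.indexes i ON ps.object_id = i.object_id AND ps.index_id = i.index_id
JOIN sys.tables t  ON t.object_id = ps.object_id
JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE ps.avg_fragmentation_in_percent > 10 AND i.name IS NOT NULL
"""

with pyodbc.connect(CONN_STR) as conn:
    cursor = conn.cursor()
    fragmented = cursor.execute(FRAGMENTATION_QUERY).fetchall()
    for schema, table, index, frag in fragmented:
        # Common rule of thumb: reorganize lightly fragmented indexes,
        # rebuild heavily fragmented ones (thresholds are illustrative).
        action = "REBUILD" if frag > 30 else "REORGANIZE"
        print(f"{schema}.{table} / {index}: {frag:.1f}% -> {action}")
        cursor.execute(f"ALTER INDEX [{index}] ON [{schema}].[{table}] {action}")
    conn.commit()
```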
As databases become very large, accessing data can slow down simply due to the sheer size of the database. Another way to improve performance is to archive transactional data that isn't accessed regularly. Typically, this procedure summarizes important historical data for quick access and archives/compresses the main data off to the side. This can help increase performance on systems with very large databases.
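As a rough sketch of that idea, the example below summarizes old sales detail by month and then moves the detail rows to an archive table. The table names, column names and cutoff date are hypothetical, not part of any specific ERP product, and a real archiving routine should be designed and tested with your consultant.

```python
import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
            "DATABASE=MyDatabase;Trusted_Connection=yes;")
CUTOFF = "2020-01-01"  # archive everything older than this date (illustrative)

with pyodbc.connect(CONN_STR) as conn:
    cursor = conn.cursor()

    # 1. Keep a compact monthly summary of the old detail for quick reporting.
    cursor.execute("""
        INSERT INTO SalesMonthlySummary (SalesMonth, TotalAmount, LineCount)
        SELECT DATEFROMPARTS(YEAR(DocDate), MONTH(DocDate), 1),
               SUM(LineTotal), COUNT(*)
        FROM SalesLines
        WHERE DocDate < ?
        GROUP BY DATEFROMPARTS(YEAR(DocDate), MONTH(DocDate), 1)
    """, CUTOFF)

    # 2. Move the detailed rows into an archive table, then remove them from
    #    the live table so day-to-day queries have less data to scan.
    cursor.execute(
        "INSERT INTO SalesLinesArchive SELECT * FROM SalesLines WHERE DocDate < ?",
        CUTOFF)
    cursor.execute("DELETE FROM SalesLines WHERE DocDate < ?", CUTOFF)

    conn.commit()
```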
Want help improving your system’s performance? Contact our Support Team; we’d be happy to help!