If you are a Salesforce developer or admin, you may have run into this error message: “System.LimitException: Apex heap size too large: 6005934”. We recently had a client hit this issue, and it seemed worth a blog post. This post covers what may cause the error and ways to both fix it and prevent it from happening again.
Why does this error happen?
In traditional programming languages such as Java or .NET, it is common practice to query a table with more than 50,000 records and store the results in a collection such as a map. We do this to avoid hitting the database every time we need the results; instead, we read the values from the collection in memory. This approach does not work in Apex. Salesforce is a multi-tenant platform with a shared database, so it enforces a strict per-transaction heap limit (6 MB for synchronous Apex, 12 MB for asynchronous Apex) on how much data you can hold in memory, and that is the limit this error reports.
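To make this concrete, here is a minimal Apex sketch of the anti-pattern. The Tax_Rate__c object and its fields are hypothetical names used for illustration; the point is that caching an entire large table in a map counts every record against the transaction's heap limit.

```apex
// Anti-pattern: caching an entire large table in a map.
// Tax_Rate__c, Rate__c and Country__c are hypothetical names.
Map<Id, Tax_Rate__c> allRates = new Map<Id, Tax_Rate__c>(
    [SELECT Id, Name, Rate__c, Country__c FROM Tax_Rate__c] // no WHERE clause
);
// With tens of thousands of rows, this map alone can exhaust the 6 MB heap and
// throw "System.LimitException: Apex heap size too large".
```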
If the above paragraph does not make sense to you, think about an iPhone with limited storage. Load too many apps or photos and the phone complains that you are running out of space. You can pay for more storage on your phone, but on the Salesforce platform memory is a shared, governed resource, and you will not win that argument with Salesforce.
So… Where do you start to debug this issue?
Check SOQL queries
If you look at the error message, Salesforce will point to the trigger or Apex class causing the problem. Once you have narrowed it down, check every SOQL statement used by that class or trigger. Any SOQL statement without a WHERE clause, or one that returns more than 30,000 records, is the first suspect: as soon as the query runs, all of those records are held in heap memory, which produces the error. The screenshot below shows an example of a SOQL query that is the source of the problem; you will need to enable your debug log to see it.
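One way to confirm which query is inflating the heap is to log heap usage around the suspect statement. The sketch below uses the standard Limits.getHeapSize() and Limits.getLimitHeapSize() methods; the Invoice__c object and its fields are hypothetical.

```apex
// Log heap consumption before and after a suspect query so the debug log
// shows how much memory the result set occupies.
System.debug('Heap before query: ' + Limits.getHeapSize()
    + ' of ' + Limits.getLimitHeapSize() + ' bytes');

// Unbounded query -- no WHERE clause, so every row is loaded into the heap.
List<Invoice__c> invoices = [SELECT Id, Name, Amount__c FROM Invoice__c];

System.debug('Heap after query: ' + Limits.getHeapSize()
    + ' of ' + Limits.getLimitHeapSize() + ' bytes'
    + ' (rows: ' + invoices.size() + ')');
```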
Check collection objects
The second source of the problem can be the collection objects you use, such as maps, lists, or sets. If a SOQL statement returns a large number of records, those results end up stored in a collection. Identify the code that loops through the collection or looks up a value by key: that collection is now holding 30,000 or more records in heap memory while the code iterates over them to find the value it needs. This is a big red flag that needs to be removed.
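If the collection really does need to cover many records, one mitigation, sketched below with the same hypothetical Tax_Rate__c object, is to store only the primitive values the logic needs instead of whole sObjects.

```apex
// Hold only the values the logic needs rather than full records.
// Country__c, Rate__c and Active__c are hypothetical fields.
Map<String, Decimal> rateByCountry = new Map<String, Decimal>();
for (Tax_Rate__c tr : [SELECT Country__c, Rate__c
                       FROM Tax_Rate__c
                       WHERE Active__c = true]) {
    rateByCountry.put(tr.Country__c, tr.Rate__c);
}
// Each entry is now a small string/decimal pair instead of a full sObject,
// which keeps the heap footprint far below a Map<Id, Tax_Rate__c> of whole records.
```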
Other areas to troubleshoot
Beyond large SOQL queries and collections, here are a few other items to check.
- If you use multiple currencies and run a SOQL statement to fetch the latest conversion rate for a currency, that query is a likely cause (see the sketch after this list).
- If you have custom objects with more than 30,000 records, such as countries or lists of tax rates, and the code queries the whole object every time it needs a single value, that is another potential cause.
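For the multi-currency item above, one possible fix is to query only the conversion rate you need instead of the whole rate table. The sketch below uses CurrencyType, the standard object that stores conversion rates in multi-currency orgs; the targetIso variable is illustrative.

```apex
// Fetch only the conversion rate for the currency in question.
String targetIso = 'EUR'; // illustrative value
List<CurrencyType> rates = [SELECT IsoCode, ConversionRate
                            FROM CurrencyType
                            WHERE IsoCode = :targetIso AND IsActive = true
                            LIMIT 1];
Decimal conversionRate = rates.isEmpty() ? 1 : rates[0].ConversionRate;
```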
How to fix the problem
- Once you have identified the problem as a large SOQL statement returning 30,000 or more records, add a WHERE condition to the SOQL so it filters on a date or key and returns only a handful of records (one to five) instead of the entire record set (a before-and-after sketch follows this list).
- If the cause is a collection object into which you have loaded everything, remove the collection or restrict it to a few records by tuning the SOQL so it returns only the records you actually need. Be prepared to rewrite the code.
- Look at the debug log with the profiling filter set to FINEST. Look for any method or SOQL statement executed multiple times because of duplicate calls; each repeat can add more records to a collection and push it over the limit. Remove duplicate code and minimize those calls.
- One tempting shortcut is to cheat the system by adding a static flag so the trigger logic runs only once, without fixing the underlying SOQL. This just hides the real problem and can surface later as "too many query rows" errors instead. Make sure you remove any such workaround completely.
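Here is the before-and-after sketch referenced in the first fix above, again using the hypothetical Invoice__c object and field names.

```apex
// Before: pulls every row into the heap.
// List<Invoice__c> invoices = [SELECT Id, Name, Amount__c FROM Invoice__c];

// After: filter on a date or status so only the handful of rows you need is returned.
List<Invoice__c> invoices = [SELECT Id, Name, Amount__c
                             FROM Invoice__c
                             WHERE CreatedDate = TODAY
                               AND Status__c = 'Open'
                             LIMIT 5];
```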
Key Takeaways
- If you hit the heap size error, it is almost always caused by a SOQL statement returning 30,000 or more records, or by collection objects holding too many records in memory.
- To fix it, add a WHERE condition to the offending SOQL statement so it returns only a few records (one to five), and fine-tune any collection objects.
- To prevent it from happening again, watch for custom objects or currency records that are continuously reloaded (for example, weekly data loads to keep them current). Any Apex code or trigger that queries those objects needs to be revisited.
I hope this post provided a solution to the dreaded heap size limit issue. Please feel free to email questions or comments to buyan@eigenx.com.
Buyan Thyagarajan
Sr. Salesforce Consultant
Chief Subject Matter Expert
Salesforce Architect