Introduction
Even well-designed integrations can slow down if every transaction hits the API repeatedly.
By optimizing how data is fetched, cached, and processed, you can cut run time by half or more and dramatically reduce governance usage.
This post shows how to:
- Batch API calls efficiently.
- Cache configuration and metadata.
- Use Map/Reduce properly for large jobs.
- Measure and compare results before vs. after optimization.
1️⃣ Before vs. After Example
| Metric | Before Optimization | After Optimization |
|---|---|---|
| Average Run Time | 18 min | 6 min |
| Record Loads | 2,200 | 260 |
| API Calls | 1,500 | 430 |
| Governance Units | 8,900 | 2,400 |
| Failures | 4 / day | 0 / day |
2️⃣ Batch Record Operations
Instead of a full record.load() and record.save() for every update, collect the IDs and submit only the changed fields:

```javascript
const ids = [201, 202, 203, 204];

ids.forEach((id) => {
    // submitFields() writes body fields without loading the full record,
    // which costs far fewer governance units than load() + save().
    record.submitFields({
        type: record.Type.SALES_ORDER,
        id: id,
        values: { custbody_synced: true }
    });
});
```
✅ Use submitFields() to update each record's body fields without the cost of a full record.load() and save(); for thousands of records, feed the ID list to a Map/Reduce script instead (see 6️⃣ below).
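If one record in the batch is bad, you usually don't want the whole run to fail (note the failure counts in the table above). A minimal sketch, reusing the ids and custbody_synced field from the example (`log` is a SuiteScript global):

```javascript
const failed = [];
ids.forEach((id) => {
    try {
        record.submitFields({
            type: record.Type.SALES_ORDER,
            id: id,
            values: { custbody_synced: true }
        });
    } catch (e) {
        // Record the failure and keep going; report the list afterwards.
        failed.push({ id: id, message: e.message });
        log.error('Sync failed for order ' + id, e.message);
    }
});
```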
3️⃣ Cache Reference Data
Configuration data (item mappings, tax codes, currencies) rarely changes, so cache it.
```javascript
define(['N/cache'], (cache) => {
    // One shared cache for integration configuration values.
    const configCache = cache.getCache({ name: 'integration_cfg' });

    // Returns the cached value for `key`; on a miss, runs `loader`
    // and stores its (string) result for subsequent calls.
    const getConfig = (key, loader) => configCache.get({ key, loader });

    return { getConfig };
});
```
✅ Reduces repeated search.lookupFields() calls for static values.
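As an illustration, here's how a script might use getConfig to cache a currency map built from a single search; the cache key and columns are just examples, and it assumes N/search is imported alongside:

```javascript
// Loader runs only on a cache miss; its result is stored for later hits.
// N/cache stores strings, so JSON-encode structured data.
const currencyMap = JSON.parse(getConfig('currency_map', () => {
    const map = {};
    search.create({
        type: 'currency',
        columns: ['symbol', 'exchangerate'] // illustrative columns
    }).run().each((row) => {
        map[row.id] = {
            symbol: row.getValue('symbol'),
            rate: row.getValue('exchangerate')
        };
        return true; // keep iterating (each() handles up to 4,000 rows)
    });
    return JSON.stringify(map); // loader must return a string
}));
```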
4️⃣ Consolidate Searches
Don't run a new search inside each loop. Build one paged search and reuse the results.
```javascript
const paged = search.load({ id: 'customsearch_pending_orders' })
    .runPaged({ pageSize: 1000 }); // one search, fetched page by page

paged.pageRanges.forEach((range) => {
    // Each fetch() returns up to 1,000 results; process is your per-record handler.
    paged.fetch({ index: range.index }).data.forEach(process);
});
```
✅ One search call → up to 1,000 records processed per page, with no per-record searches.
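The same paged pattern also replaces searches inside loops: index the results once, then look records up in memory. A sketch, keying results by internal ID:

```javascript
const pending = search.load({ id: 'customsearch_pending_orders' })
    .runPaged({ pageSize: 1000 });

const ordersById = {};
pending.pageRanges.forEach((range) => {
    pending.fetch({ index: range.index }).data.forEach((result) => {
        ordersById[result.id] = result; // O(1) lookup key
    });
});

// Later, anywhere in a loop body:
// const order = ordersById[salesOrderId]; // no extra search call
```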
5️⃣ Bundle External API Requests
If integrating with Shopify / Salesforce, send bulk payloads:

Before: 100 POST requests × 1 record
After: 1 POST request × 100 records
Example payload structure:

```json
{
  "orders": [
    { "id": 301, "total": 250.50 },
    { "id": 302, "total": 180.75 }
  ]
}
```
✅ Cuts network overhead and authentication costs dramatically.
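Here's what that might look like with N/https; the endpoint URL and bearer token are placeholders for your target API:

```javascript
define(['N/https'], (https) => {
    // Sketch: push many orders in one request instead of one request per order.
    const pushOrders = (orders) => {
        const response = https.post({
            url: 'https://example.com/api/orders/bulk', // hypothetical endpoint
            body: JSON.stringify({ orders: orders }),
            headers: {
                'Content-Type': 'application/json',
                'Authorization': 'Bearer <your-token>' // placeholder credential
            }
        });
        if (response.code !== 200) {
            throw new Error('Bulk push failed (' + response.code + '): ' + response.body);
        }
        return JSON.parse(response.body);
    };
    return { pushOrders };
});
```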
6️⃣ Leverage Map/Reduce Parallelization
Split large jobs into smaller chunks automatically.
```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/search', 'N/record'], (search, record) => ({
    // Each search result becomes one unit of work for the map stage.
    getInputData: () => search.load({ id: 'customsearch_orders' }),

    map: (ctx) => {
        const order = JSON.parse(ctx.value); // each result arrives as a JSON string
        record.submitFields({
            type: record.Type.SALES_ORDER,
            id: order.id,
            values: { custbody_synced: true }
        });
    }
}));
```
✅ Map invocations run in parallel → high throughput with automatic yielding.
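For visibility into failed records, a summarize stage can report map-stage errors. A sketch extending the script above (same saved search and field):

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/search', 'N/record'], (search, record) => ({
    getInputData: () => search.load({ id: 'customsearch_orders' }),

    map: (ctx) => {
        const order = JSON.parse(ctx.value);
        record.submitFields({
            type: record.Type.SALES_ORDER,
            id: order.id,
            values: { custbody_synced: true }
        });
    },

    // Surface map errors instead of letting them pass silently.
    summarize: (summary) => {
        summary.mapSummary.errors.iterator().each((key, error) => {
            log.error('Order sync failed for key ' + key, error);
            return true; // keep iterating over remaining errors
        });
    }
}));
```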
7️⃣ Optimize Field Access
Use search.lookupFields() to read just the data you need:
```javascript
const cust = search.lookupFields({
    type: 'customer',
    id: 123,
    columns: ['email', 'salesrep'] // only these fields are fetched
});
```
✅ One call instead of record.load() (20–40 units saved).
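One gotcha worth noting: lookupFields() returns plain strings for text fields, but select fields come back as an array of { value, text } pairs. Continuing from the snippet above:

```javascript
const email = cust.email; // plain field: a string
const rep = cust.salesrep[0] || {}; // select field: first selected option
log.debug('Sales rep', rep.value + ' / ' + rep.text); // internal ID / display name
```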
8️⃣ Use Conditional Processing
Skip unnecessary work:
```javascript
if (!order.syncRequired) return; // bail out before any API work
```
✅ A simple check = major time savings for high-volume runs.
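You can extend the guard with a timestamp check so unchanged records are skipped entirely. A sketch with hypothetical field names (custbody_last_sync is illustrative):

```javascript
// Skip orders whose last modification predates the last successful sync.
const lastSync = new Date(order.custbody_last_sync || 0); // hypothetical field
const modified = new Date(order.lastmodifieddate);

if (!order.syncRequired || modified <= lastSync) {
    return; // unchanged since the last run; skip all API work
}
```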
9️⃣ Track and Measure Governance
Add usage tracking at key points:
```javascript
// Assumes N/runtime is imported; `log` is a SuiteScript global.
const usage = runtime.getCurrentScript().getRemainingUsage();
if (usage < 200) log.audit('Approaching governance limit', 'Remaining units: ' + usage);
```
✅ Helps identify heavy sections for future optimization.
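Map/Reduce scripts yield automatically, but a scheduled script has to reschedule itself when usage runs low. A minimal sketch using N/task (the threshold and module structure are illustrative):

```javascript
define(['N/runtime', 'N/task'], (runtime, task) => {
    // When remaining usage drops below the threshold, queue a fresh
    // execution of the same deployment and tell the caller to stop.
    const rescheduleIfLow = (threshold) => {
        const script = runtime.getCurrentScript();
        if (script.getRemainingUsage() < threshold) {
            task.create({
                taskType: task.TaskType.SCHEDULED_SCRIPT,
                scriptId: script.id,
                deploymentId: script.deploymentId
            }).submit();
            return true; // caller should exit cleanly now
        }
        return false;
    };
    return { rescheduleIfLow };
});
```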
10️⃣ Best Practices Summary
| Practice | Impact |
|---|---|
| Batch updates | Fewer record loads |
| Cache static data | Instant lookups |
| Paginate searches | Stable governance |
| Async API calls | Faster throughput |
| Map/Reduce for bulk | Automatic yield |
| Measure governance | Predictable performance |
Conclusion
Integration optimization is about doing more with less.
By batching, caching, and parallelizing your SuiteScript operations, you'll cut processing time, stay under governance limits, and deliver integrations that scale effortlessly as data grows.
Efficient code = reliable automation.