NetSuite has taken a major leap into the AI era by introducing SuiteScript Generative AI APIs, powered by the new N/llm module. For the first time, NetSuite developers can directly send prompts to large language models (LLMs), receive AI-generated responses, and integrate generative intelligence into SuiteScript automation.
This blog breaks down how the N/llm API works, what models are supported, usage modes, limitations, and real examples of how developers can use generative AI in their NetSuite scripts.
✔️ What Is the N/llm Module?
The N/llm module is NetSuite’s new SuiteScript interface that lets your script:
- Send prompts to an LLM
- Receive generated text responses
- Evaluate structured prompts (via Prompt Studio)
- Stream responses in real time
- Pass parameters & variables to AI models
- Work with Oracle-hosted generative AI safely
NetSuite integrates with OCI Generative AI Service, meaning your data stays within Oracle infrastructure—never sent to third-party providers for training.
This module is available only in SuiteScript 2.1 and only in supported regions.
🧠 How SuiteScript Generative AI APIs Work
The workflow is simple:
- Your SuiteScript calls N/llm.generateText()
- NetSuite forwards the request to OCI Generative AI
- The LLM processes the prompt
- OCI returns the AI-generated answer back to NetSuite
- Your script continues with that result
All communication stays inside Oracle’s secure environment.
💡 If you don’t specify a model, NetSuite automatically uses Cohere Command R.
🚨 Important Considerations Before Using N/llm
- AI responses can be creative, not deterministic
- Always validate output before updating records or posting transactions
- Not available in all regions
- Governance limits apply (just like any SuiteScript call)
Generative AI should help automate tasks — not blindly control your data.
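The validation point above can be sketched as a small wrapper — a hypothetical helper (not part of N/llm) that rejects empty or oversized output before it ever touches a record:

```javascript
// Hypothetical helper, not part of N/llm. `llm` is the module instance your
// script receives from define(['N/llm'], ...).
function generateValidatedText(llm, prompt, maxLength) {
  const response = llm.generateText({ prompt: prompt });
  const text = (response.text || '').trim();
  // Refuse to return output that is empty or suspiciously long; the caller
  // decides what to do next (retry, fall back, or skip the record update).
  if (text.length === 0 || text.length > maxLength) {
    return null;
  }
  return text;
}
```

A User Event or Map/Reduce script would call this and only write the result to a field when it is non-null.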
🔧 Supported LLM Models
You can set the model using the modelFamily parameter in:
- generateText()
- generateTextStreamed()
For Prompt Studio prompts, the model is chosen inside the prompt configuration.
Common model families include:
- cohere.command-r (default)
- cohere.command-r-plus
- meta.llama2
- custom configured OCI models
Each model supports different parameters (temperature, maxTokens, etc.).
🧪 Example: Using N/llm to Generate Text
/**
 * @NApiVersion 2.1
 */
define(['N/llm'], (llm) => {
  const example = () => {
    const response = llm.generateText({
      prompt: "Write a 2-line description for a new product: Wireless Smart Lamp.",
      modelFamily: llm.ModelFamily.COHERE_COMMAND_R,
      modelParameters: {
        temperature: 0.3,
        maxTokens: 100
      }
    });
    // log is a SuiteScript global; no import needed.
    log.debug("AI Response", response.text);
  };
  return { example };
});
🧪 Example: Using Prompt Studio + N/llm.evaluatePrompt()
If you already created a custom prompt in Prompt Studio:
const result = llm.evaluatePrompt({
  id: 'custprompt_generate_sales_summary',
  variables: {
    customerName: "JADS Toys",
    startDate: "2024-01-01",
    endDate: "2024-12-31"
  }
});
log.debug("AI Result", result.text);
This is ideal for reusable business logic across multiple scripts.
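One way to get that reuse is a shared-library wrapper, so every script calls one function instead of repeating the prompt reference and variable names. A sketch, assuming the prompt is addressed via the `id` option and the hypothetical prompt ID from the snippet above:

```javascript
// Sketch of a shared-library function wrapping a Prompt Studio prompt.
// 'custprompt_generate_sales_summary' is the hypothetical prompt used above;
// `llm` is the N/llm module passed in by the calling script.
function summarizeCustomerSales(llm, customerName, startDate, endDate) {
  const result = llm.evaluatePrompt({
    id: 'custprompt_generate_sales_summary',
    variables: {
      customerName: customerName,
      startDate: startDate,
      endDate: endDate
    }
  });
  return result.text;
}
```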
🎛 Usage Modes (Free, On-Demand, Dedicated Cluster)
NetSuite offers three ways to use generative AI:
1. Free Mode (Limited Monthly Usage)
✔ No setup needed
✔ Ideal for testing or light usage
❌ Limited quota resets monthly
❌ Will throw an error when capacity is used up
Use this when experimenting or running low-volume scripts.
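Since a free-tier script will eventually exhaust its quota, it pays to catch the failure and degrade gracefully. A minimal sketch — the exact error code varies, so this deliberately catches broadly:

```javascript
// Sketch: fall back to a static value when the LLM call fails, e.g. because
// the free-tier monthly quota is used up. Catching broadly is intentional here;
// narrow the catch once you know which error codes your account produces.
function generateWithFallback(llm, prompt, fallbackText) {
  try {
    return llm.generateText({ prompt: prompt }).text;
  } catch (e) {
    return fallbackText;
  }
}
```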
2. On-Demand Mode (Paid, Unlimited via OCI)
Requires an OCI account + configuration.
Best for:
- Medium to high usage
- SuiteApps
- Use cases where you cannot afford quota limits
Usage is charged per LLM inference request.
3. Dedicated AI Cluster (High Volume Production)
For enterprises running:
- Very high AI workloads
- Large SuiteApps
- High concurrency systems
This provides dedicated compute resources for maximum performance.
⚠️ SuiteScript Governance for N/llm
Each method has its own governance cost.
Examples:
- generateText() → 10 units
- generateTextStreamed() → 15 units
- evaluatePrompt() → varies
Always check the governance limits table when designing automations to prevent errors like SSS_USAGE_LIMIT_EXCEEDED.
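One defensive pattern is to check the script's remaining governance via N/runtime before each call. A sketch, using the 10-unit figure from the table above plus whatever margin the rest of your script needs:

```javascript
// Sketch: `runtime` is the N/runtime module; getCurrentScript().getRemainingUsage()
// returns the governance units the current script still has. Skip the LLM call
// when the budget (LLM cost plus a margin for the rest of the script) isn't there.
function generateIfBudgetAllows(runtime, llm, prompt, requiredUnits) {
  const remaining = runtime.getCurrentScript().getRemainingUsage();
  if (remaining < requiredUnits) {
    return null; // avoid hitting SSS_USAGE_LIMIT_EXCEEDED mid-flight
  }
  return llm.generateText({ prompt: prompt }).text;
}
```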
💡 Real Use Cases for N/llm in SuiteScript
Here are practical ways NetSuite developers can use generative AI today:
1. Automate item description generation
Create clear, SEO-friendly item descriptions from basic attributes.
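A hedged sketch of how such a prompt might be assembled from item fields (the field names are illustrative, not actual NetSuite record fields):

```javascript
// Builds the prompt text from basic item attributes. Pure string assembly —
// no N/llm call here, so it is easy to unit-test outside NetSuite.
function buildItemDescriptionPrompt(item) {
  return 'Write a short, SEO-friendly product description for:\n' +
    'Name: ' + item.name + '\n' +
    'Category: ' + item.category + '\n' +
    'Key features: ' + item.features.join(', ');
}
```

The returned string would then be passed as the `prompt` option to generateText().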
2. Summarize long text fields
Convert long notes or case messages into 1-2 sentence summaries.
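Long notes can exceed the model's context window, so clipping the input before prompting is a sensible precaution. A sketch — the 4000-character budget is an illustrative guess, not an official limit:

```javascript
// Clip the source text to a rough character budget before prompting.
// The default budget here is arbitrary; tune it for your chosen model.
function buildSummaryPrompt(longText, maxChars) {
  const budget = maxChars || 4000;
  const clipped = longText.length > budget
    ? longText.slice(0, budget) + ' ...'
    : longText;
  return 'Summarize the following note in 1-2 sentences:\n\n' + clipped;
}
```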
3. Sales order insight extraction
Generate insights from transactions:
- Key delays
- Customer patterns
- Reorder recommendations
4. AI-powered Suitelets
Build chat-like assistance tools inside Suitelets.
5. Auto-generate emails, comments, or support responses
Instead of manually typing long updates.
6. AI-driven workflow decisions
Example: Interpret customer requests and route cases automatically.
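For routing, constraining the model to a fixed label set and validating its reply keeps an unexpected answer from misrouting a case. A sketch with made-up route names:

```javascript
// Hypothetical route labels; validate the model's reply against them so a
// free-form answer can never send the case somewhere undefined.
const ROUTES = ['billing', 'technical', 'returns', 'other'];

function classifyRequest(llm, message) {
  const response = llm.generateText({
    prompt: 'Classify this customer message as exactly one of: ' +
      ROUTES.join(', ') + '. Reply with the single word only.\n\n' + message
  });
  const label = (response.text || '').trim().toLowerCase();
  // Anything outside the allowed set falls back to the catch-all route.
  return ROUTES.indexOf(label) !== -1 ? label : 'other';
}
```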
🏁 Final Thoughts
The N/llm module is one of the most powerful additions to SuiteScript since the introduction of Map/Reduce. It brings true generative AI capability directly into NetSuite development — without relying on external servers or unsafe third-party APIs.
With this API, developers can now build:
- Smarter Suitelets
- Intelligent workflows
- Auto-generated descriptions
- AI-powered data processors
- Dynamic customer communications
- Custom enterprise-grade AI tools
AI in NetSuite is no longer a future concept — it’s ready to build with today.