Category : NetSuite ERP
N/llm is a newly introduced NetSuite SuiteScript 2.1 module that brings native access to a Large Language Model (LLM) directly inside NetSuite. With it, you can generate text from a prompt, ground responses in your own documents, retrieve citations, and monitor your remaining free usage, all from server-side scripts.
This represents a major leap in how developers can integrate intelligent, natural language capabilities into custom NetSuite solutions.
RAG is an AI architecture that combines “retrieval” and “generation.” Instead of relying only on a model’s memory, you first retrieve relevant documents, then generate an answer based on those documents. This keeps outputs factually grounded, domain-specific, and reduces hallucinations.
Typically, RAG consists of two stages: a retrieval step that finds the documents most relevant to the user’s question, and a generation step that produces an answer conditioned on those documents.
In NetSuite, N/llm enables a form of RAG by allowing you to build documents from your own record data with llm.createDocument() and pass them, together with the user’s prompt, to llm.generateText().
Thus, you ensure that generated answers are based on your own NetSuite data, not random internet knowledge!
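Before diving into SuiteScript, the retrieve-then-generate loop can be sketched in plain JavaScript. The `retrieve` and `generate` helpers below are hypothetical stand-ins (a naive keyword matcher and a stub), not N/llm APIs:

```javascript
// Minimal retrieve-then-generate sketch. Both helpers are hypothetical
// stand-ins for a real retriever and a real LLM call.
const corpus = [
  { id: "doc1", text: "Widget A sold 123 units in 2016-2017." },
  { id: "doc2", text: "Widget B sold 45 units in 2016-2017." },
];

// "Retrieval": score documents by naive keyword overlap with the question.
function retrieve(question, docs, topN) {
  const terms = question.toLowerCase().split(/\W+/);
  return docs
    .map((d) => ({
      doc: d,
      score: terms.filter((t) => t && d.text.toLowerCase().includes(t)).length,
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topN)
    .map((s) => s.doc);
}

// "Generation": a stub that just echoes which documents grounded the answer.
function generate(question, docs) {
  return `Answer to "${question}" based on: ` + docs.map((d) => d.id).join(", ");
}

const hits = retrieve("How many units of Widget A?", corpus, 1);
console.log(generate("How many units of Widget A?", hits)); // grounded in doc1
```

In the Suitelet below, N/llm collapses these two stages: you hand generateText() the documents and it handles grounding and citation for you.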
Let’s create a Suitelet that shows how all of this can be accomplished. In this Suitelet we will have a field for the user’s question, a field for the LLM response, and an advanced-details section showing the generated documents, citations, and remaining usage.
define(["N/ui/serverWidget", "N/llm", "N/runtime", "N/query"], function (
serverWidget,
llm,
runtime,
query
) {
function onRequest(context) {
const request = context.request;
const response = context.response;
const form = serverWidget.createForm({
title: "Sales Insights with LLM (2016–2017)",
});
const promptField = form.addField({
id: "custfield_prompt",
type: serverWidget.FieldType.TEXTAREA,
label: "Enter your question",
isMandatory: true,
});
promptField.setHelpText({
help:
"Examples:\n- What is the most sold item in 2016-2017?\n" +
"- Compare total sales volume by item across locations\n" +
"- What item generated the most revenue?\n" +
"- Which items were mostly sold in Boston?",
});
promptField.updateDisplaySize({ height: 40, width: 50 });
const responseField = form.addField({
id: "custfield_response",
type: serverWidget.FieldType.LONGTEXT,
label: "LLM Response",
});
responseField.updateDisplaySize({ height: 40, width: 50 });
form.addFieldGroup({
id: "advanced_details",
label: "Advanced Details (Documents, Citations, Usage)",
});
const documentPreviewField = form.addField({
id: "custfield_documents",
type: serverWidget.FieldType.LONGTEXT,
label: "Generated Document Content",
container: "advanced_details",
});
documentPreviewField.updateDisplaySize({ height: 40, width: 50 });
const citationField = form.addField({
id: "custfield_citations",
type: serverWidget.FieldType.LONGTEXT,
label: "Citations",
container: "advanced_details",
});
citationField.updateDisplaySize({ height: 40, width: 50 });
const remainingUsage = form.addField({
id: "custfield_remaining_usage",
type: serverWidget.FieldType.TEXT,
label: "Remaining LLM Usage",
container: "advanced_details",
defaultValue: llm.getRemainingFreeUsage(),
});
form.addSubmitButton({ label: "Ask" });
form.addButton({
id: "custpage_clear_btn",
label: "Clear",
functionName: "clearFormFields",
});
form.addButton({
id: "custpage_toggle_details_btn",
label: "Show/Hide Advanced Details",
functionName: "toggleAdvancedDetails",
});
form.clientScriptModulePath = "./LLM_Form_Clear_Client.js";
response.writePage(form);
We do not include the client script’s code here; it is only used to show/hide the different text areas for a better user experience.
Upon submission, the Suitelet uses N/query to perform a SuiteQL query summarizing Sales Order lines from 2016–2017. The data includes each item’s ID and description, total quantity sold, total revenue, and location.
The use of N/query here is key for performance and flexibility; we encourage its use over the older N/search module.
The Suitelet formats the data into documents like:
Item: Widget A, Total Qty Sold: 123, Total Revenue: $12,300.00, Locations: Boston: 80 units, New York: 43 units
Each document is created using llm.createDocument() and added to the documents array. A key factor in improving the LLM’s response to natural language queries lies in how we structure and store the data within this array. In this solution, we are simply fetching results and immediately adding them to the documents array. However, a more robust approach could involve storing the data in a Custom Record and using a scheduled Map/Reduce script to regularly update the results. While this example represents a basic use case, it can be easily adapted to suit more complex scenarios.
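The formatting step can be isolated as a standalone sketch. This is plain JavaScript with hypothetical sample data standing in for the live query results:

```javascript
// Turn an aggregated item summary into one flat, LLM-friendly line per item.
const itemSummaryMap = {
  "Widget A": {
    totalQty: 123,
    totalRevenue: 12300,
    locations: { Boston: 80, "New York": 43 },
  },
};

const docLines = Object.keys(itemSummaryMap).map((itemName) => {
  const s = itemSummaryMap[itemName];
  const locText = Object.entries(s.locations)
    .map(([loc, qty]) => `${loc}: ${qty} units`)
    .join(", ");
  return `Item: ${itemName}, Total Qty Sold: ${s.totalQty}, ` +
    `Total Revenue: $${s.totalRevenue.toFixed(2)}, Locations: ${locText}`;
});

console.log(docLines[0]);
// Item: Widget A, Total Qty Sold: 123, Total Revenue: $12300.00,
// Locations: Boston: 80 units, New York: 43 units
```

In the Suitelet, each of these lines then becomes llm.createDocument({ id, data }).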
With documents in hand, llm.generateText() is called with the user’s prompt, the documents array, and tuned model parameters (a low temperature keeps the answers factual).
The response includes both the main generated text and citations linking back to which documents supported the answer. Here is how we perform the search and format the results.
if (request.method === "POST") {
const promptContent = request.parameters.custfield_prompt;
// --- NEW: QUERY INSTEAD OF SEARCH ---
const itemSummaryMap = {};
const salesQuery = query.runSuiteQL({
query: `
SELECT
i.itemid AS itemid,
i.displayname AS description,
SUM(tl.quantity * -1) AS quantitysold,
tl.location AS locationid,
loc.name AS locationname,
SUM(tl.quantity * -1 * tl.rate) AS totalrevenue
FROM
transaction AS t
INNER JOIN
transactionline AS tl
ON
t.id = tl.transaction
AND tl.mainline = 'F'
INNER JOIN
item AS i
ON
tl.item = i.id
LEFT JOIN
location AS loc
ON
tl.location = loc.id
WHERE
t.type = 'SalesOrd'
AND t.void = 'F'
AND t.voided = 'F'
AND i.itemtype <> 'Discount'
AND t.trandate BETWEEN TO_DATE('2016-01-01', 'YYYY-MM-DD') AND TO_DATE('2017-12-31', 'YYYY-MM-DD')
GROUP BY
i.itemid, i.displayname, tl.location, loc.name
ORDER BY
quantitysold DESC, i.itemid
`,
});
const results = salesQuery.asMappedResults();
results.forEach((row) => {
const itemName = row.itemid || "Unknown Item";
const quantitySold = parseFloat(row.quantitysold) || 0;
const revenue = parseFloat(row.totalrevenue) || 0;
const locationName = row.locationname || "Unknown";
if (quantitySold <= 0) return;
if (!itemSummaryMap[itemName]) {
itemSummaryMap[itemName] = {
totalQty: 0,
totalRevenue: 0,
locations: {},
};
}
itemSummaryMap[itemName].totalQty += quantitySold;
itemSummaryMap[itemName].totalRevenue += revenue;
itemSummaryMap[itemName].locations[locationName] =
(itemSummaryMap[itemName].locations[locationName] || 0) +
quantitySold;
});
const documents = [];
let docTextLog = "";
let index = 0;
Object.keys(itemSummaryMap).forEach((itemName) => {
const summary = itemSummaryMap[itemName];
const locText = Object.entries(summary.locations)
.map(([loc, qty]) => `${loc}: ${qty} units`)
.join(", ");
const docText = `Item: ${itemName}, Total Qty Sold: ${
summary.totalQty
}, Total Revenue: $${summary.totalRevenue.toFixed(
2
)}, Locations: ${locText}`;
const docId = "item_" + index;
const doc = llm.createDocument({ id: docId, data: docText });
documents.push(doc);
docTextLog += `${docId}: ${docText}\n\n`;
index++;
});
const llmResponse = llm.generateText({
prompt: promptContent,
documents: documents,
modelParameters: {
maxTokens: 1000,
temperature: 0.2,
topK: 3,
topP: 0.7,
frequencyPenalty: 0.4,
presencePenalty: 0,
},
});
responseField.defaultValue = llmResponse.text;
let citationsSummary = "";
if (llmResponse.citations?.length > 0) {
llmResponse.citations.forEach((citation, i) => {
citationsSummary += `#${
i + 1
} [Docs: ${citation.documentIds.join(", ")}]\n${
citation.text
}\n\n`;
});
} else {
citationsSummary = "No citations found.";
}
citationField.defaultValue = citationsSummary;
documentPreviewField.defaultValue = docTextLog;
remainingUsage.defaultValue = llm.getRemainingFreeUsage();
}
Finally, the Suitelet updates the form to show the LLM’s answer, the generated document content, the citations, and the remaining free usage. All without leaving NetSuite!
The quality of the documents you pass to generateText() heavily impacts the final LLM output. Some tips to improve results: keep each document short and focused on a single entity, use consistent "Label: value" formatting, pre-aggregate numbers instead of passing raw transaction lines, and spell out units and currencies.
The more “friendly” your documents are, the better and more accurate your answers become.
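One concrete way to make documents "friendly" is a small normalization helper. This function is hypothetical (not part of N/llm); it just enforces consistent labels, rounding, and trimming before a record becomes a document:

```javascript
// Normalize a raw record into a short, consistently labeled document string.
// Hypothetical helper; field names are illustrative.
function toFriendlyDoc(record) {
  const name = String(record.name || "Unknown Item").trim();
  const qty = Math.round(Number(record.qty) || 0);
  const revenue = (Number(record.revenue) || 0).toFixed(2);
  // Consistent "Label: value" pairs help the model cite the right fields.
  return `Item: ${name}, Total Qty Sold: ${qty}, Total Revenue: $${revenue}`;
}

console.log(toFriendlyDoc({ name: "  Widget A ", qty: "123.4", revenue: 12300 }));
// Item: Widget A, Total Qty Sold: 123, Total Revenue: $12300.00
```

Running every record through one formatter like this keeps all documents structurally identical, which makes both retrieval and citation more reliable.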
Everything demonstrated in this post works seamlessly for development and testing scenarios using the free allocation of the SuiteScript Generative AI APIs. However, if you are building a SuiteApp intended for deployment across customer accounts, it’s important to understand that the free tier is not sufficient for production usage, and you’ll need to use one of the paid usage modes available via Oracle Cloud Infrastructure (OCI).
NetSuite supports three usage modes for the N/llm module:

- Free usage: the default mode, with a limited allocation, suitable for development and testing.
- On Demand: pay-per-use through your own OCI account, specified either in your script (via the ociConfig object) or via the AI Preferences > Settings page in NetSuite.
- Dedicated Cluster: a dedicated OCI AI cluster, referenced by its endpointId in the ociConfig.

For both On Demand and Dedicated Cluster modes, credentials can be managed securely: store sensitive values (such as the private key and fingerprint) as API secrets, then use the ociConfig object in your code to reference these secrets, or configure them globally under AI Preferences.

For detailed instructions on how to obtain and set up an OCI account, please consult the Help section "Using Your Own OCI Configuration for SuiteScript Generative AI APIs".
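As a hedged sketch, supplying your own OCI credentials might look like the fragment below. The property names follow the N/llm documentation at the time of writing, and the OCID and secret values are placeholders; verify both against the Help section for your account before use:

```javascript
// Config fragment (not runnable on its own): llm.generateText() with ociConfig.
// All values are placeholders; fingerprint and privateKey reference API secrets.
const llmResponse = llm.generateText({
  prompt: promptContent,
  documents: documents,
  ociConfig: {
    tenancyId: "ocid1.tenancy.oc1...",            // your OCI tenancy OCID
    compartmentId: "ocid1.compartment.oc1...",    // your OCI compartment OCID
    userId: "ocid1.user.oc1...",                  // your OCI user OCID
    fingerprint: "custsecret_oci_fingerprint",    // API secret script ID
    privateKey: "custsecret_oci_private_key",     // API secret script ID
    endpointId: "ocid1.generativeaiendpoint...",  // Dedicated Cluster mode only
  },
});
```

When ociConfig is configured globally under AI Preferences, it can be omitted from the call entirely.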
The combination of N/llm and N/query unlocks a revolutionary new capability: conversational access to your NetSuite data. With a smart document retrieval setup, you can implement a practical version of RAG inside your Suitelet and offer users powerful, AI-assisted insights, all grounded firmly in your organization’s actual data.