FortiAI tokens
When FortiManager is licensed for FortiAI, the license includes a monthly token entitlement that is shared by all FortiAI users.
How token usage is calculated
Large language models (LLMs) process text as tokens, and tokens are also the unit used to quantify usage. Token usage is calculated using the following guidelines:
- When you use the FortiAI assistant, the text in both the prompt (input) and the response (output) is processed as tokens.
- While there is not a one-to-one relationship between words or characters and tokens, in general, more text in the query and response means more tokens are used.
- Because the FortiAI assistant uses session history to inform its responses, queries that are part of a long session use more tokens than queries in a new conversation (see the illustrative sketch after this list).
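The following sketch is illustrative only: FortiAI's tokenizer is not documented here, so it approximates tokens at roughly four characters each (a common LLM rule of thumb), and the helper names estimate_tokens and estimate_session_usage are hypothetical. It shows why both input and output count toward usage, and why prompts later in a long session cost more than the same prompts in a new conversation.

```python
# Illustrative sketch only: this is NOT FortiAI's tokenizer. It uses a rough
# approximation (~4 characters per token) to show how input, output, and
# accumulated session history all contribute to token usage.

def estimate_tokens(text: str) -> int:
    """Rough token estimate; real tokenizers vary by model and language."""
    return max(1, len(text) // 4)

def estimate_session_usage(turns: list[tuple[str, str]]) -> int:
    """Estimate cumulative token usage for one chat session.

    Each turn sends the accumulated session history along with the new
    prompt, so later prompts in a long session use more tokens than the
    same prompt would in a fresh conversation.
    """
    total = 0
    history = ""
    for prompt, response in turns:
        total += estimate_tokens(history + prompt)   # input tokens
        total += estimate_tokens(response)           # output tokens
        history += prompt + response                 # context grows each turn
    return total

turns = [
    ("Create a firewall address for 10.0.0.1",
     "Created an address object for 10.0.0.1."),
    ("Create another one for awesome-domain.com",
     "Created an FQDN address for awesome-domain.com."),
]
print(estimate_session_usage(turns))
```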
Best practices
To make the most of your monthly token allocation, consider the following best practices for FortiAI users:
- Make your prompts concise and specific. In terms of token usage, the prompt "Can you please help me create a firewall address for 10.0.0.1 and another one for the domain awesome-domain.com?" is less effective than "Create firewall addresses for 10.0.0.1 and awesome-domain.com" (see the comparison after this list).
- Use filters in your prompts to receive concise and specific responses. For example, if you want to create a site-to-site VPN based on an uploaded topology image, limit the prompt to the relevant devices in the image so the response stays focused.
- Use words that relate to functions that exist in FortiManager. For example, "quarantine device" concisely tells the FortiAI assistant what action is required.
- Reference details in the existing thread when possible. This reduces redundancy and lets you stay concise and specific as you build on previous prompts. However, note that the FortiAI assistant does not retain context from previous threads.
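As a rough comparison of the two example prompts above, the hypothetical snippet below reuses the same four-characters-per-token approximation from the earlier sketch; actual token counts in FortiAI may differ, but the concise prompt consumes roughly half the input tokens.

```python
# Hypothetical comparison using the same ~4 characters per token heuristic
# as the earlier sketch (not FortiAI's actual tokenizer).

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

verbose = ("Can you please help me create a firewall address for 10.0.0.1 "
           "and another one for the domain awesome-domain.com?")
concise = "Create firewall addresses for 10.0.0.1 and awesome-domain.com"

print(f"Verbose prompt: ~{estimate_tokens(verbose)} tokens")
print(f"Concise prompt: ~{estimate_tokens(concise)} tokens")
```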
Viewing token usage
The monthly token usage is displayed at the bottom of the FortiAI pane in FortiManager. Mouse over the Monthly token usage % to view the following in a tooltip:
- Current Chat Session Token Usage
- Current Monthly Token Usage
- Total Monthly Entitled Tokens