Generate Response from Model

By Oscar Frith-Macdonald, 6 August 2025

FileMaker 22 (also referred to as FileMaker 2025) introduces a range of new tools for integrating AI into your solutions. In this post, we’ll be focusing on the Generate Response from Model script step, which allows scripts to send prompts to an AI model and get a response. At its simplest, it drops a ChatGPT-style chat straight into your solution: type a prompt, get an answer, much as you would with ChatGPT in a browser. Switch on Agentic Mode, however, and that same step can plan tasks, invoke tools such as SQL queries or your own custom functions, and iterate until it gets the result you need.

Just like previous AI features, you’ll need to run the Configure AI Account script step first. This sets up the AI account by name, specifying the model provider (or endpoint) along with an API key. In this example, we’re using OpenAI, but the script steps also support Anthropic, Cohere, and even custom endpoints.

Once your AI account is configured, you can use Generate Response from Model to interact with the model of your choice.

At its simplest, this script step works just like ChatGPT in a browser: you send a prompt, and the model replies. It doesn’t know anything about your FileMaker file or its data; it’s just a general-purpose LLM chatbot. The real power of the Generate Response from Model script step comes when you enable Agentic mode. So what is agentic mode?


Agentic Mode

Agentic mode is where the AI behaves more like an agent, meaning it can:

  • Plan its next steps
  • Use tools or external functions to carry out tasks
  • Make decisions based on the outcomes of those actions
  • Work iteratively, evaluating results before continuing

This is quite different from the usual prompt-and-response setup, where the AI just replies to what it’s given. In agentic mode, the AI can reason through a problem step-by-step and call tools as needed.

LLMs are inherently probabilistic, so their responses can vary from one run to the next. In agentic mode, however, the model can hand work off to tools, and because those tools execute deterministically, the overall behaviour becomes more consistent and reliable.

Generate Response from Model - Tools

When Agentic mode is enabled on the Generate Response from Model script step, the AI is given access to two built-in tools provided by Claris: `execute_sql` and `retrieve_image`. You’ll find more detail on both in the Claris Help documentation.

In addition to those, you can expose any custom functions in your file as tools for the AI to use. All tools must be defined in a JSON structure that’s passed to the AI agent. The exact structure can vary depending on your model provider, so it’s important to check their documentation; OpenAI’s function-calling guide is a good reference.

In general, each tool definition will need to include the following (a rough example follows the list):

  • the name of the tool
  • a description of what it does
  • a list of parameters it accepts
  • a description for each parameter, including whether it’s required or optional
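As a rough sketch, a single tool definition in OpenAI’s function-calling format might look like the example below. The tool name, description, and parameter are invented for illustration; check the documentation for the provider you configured, since other providers wrap the same information in a slightly different structure.

```json
{
  "type": "function",
  "function": {
    "name": "get_invoice_total",
    "description": "Returns the total value of an invoice, looked up by invoice number.",
    "parameters": {
      "type": "object",
      "properties": {
        "invoice_number": {
          "type": "string",
          "description": "The invoice number to look up. Required."
        }
      },
      "required": ["invoice_number"]
    }
  }
}
```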


Generate Response from Model - Agentic Setup

To get started with Agentic mode, there are only a few key fields you need to set:

Agentic Mode

  • Enables agentic behaviour, which allows the model to plan and take actions using tools.

Tool Calls From Model 

  • This is a response field. When the model uses tools, details of those calls are returned here in the order they were made.
  • This is mostly used for debugging, as it shows which tools the model called and in what order.
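The exact JSON you get back depends on the model provider, but as a purely hypothetical illustration (reusing the made-up get_invoice_total tool from the earlier sketch), an entry in this field records which tool the model called and with what arguments, along these lines:

```json
[
  {
    "name": "get_invoice_total",
    "arguments": { "invoice_number": "INV-1042" }
  }
]
```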

Instructions

  • This is where you describe the agent’s role in plain language: what it’s meant to do, which tools it should use, and what kind of output you expect. For example: “You are a calculator. Always use the provided tools to perform arithmetic rather than working the answer out yourself.”

Tool Definitions

  • This is where you enter the JSON that defines which tools the AI can access. This might include the built-in Claris tools or any custom functions in your file.


Generate Response from Model - Demo

We’ve included a simple demo file that shows how to use this script step to build an AI-powered calculator.

Download The Demo!

The file defines four custom functions:

  • CF_Add
  • CF_Subtract
  • CF_Multiply
  • CF_Divide

Alongside these, we’ve added three tables: Agents, CustomFunctions, and CustomFunctionParameters.

These tables use calculation fields to generate the JSON tool definitions that get passed to the model.

The CustomFunctionParameters table generates the JSON for each tool’s parameters (the properties block).
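We can’t reproduce the demo’s screenshot here, but as an illustrative sketch (the description text is made up rather than copied from the demo), the snippet generated for the num1 parameter would look something like this:

```json
"num1": {
  "type": "number",
  "description": "The first number to add. Required."
}
```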

This parameter JSON is combined in the CustomFunctions table to create the full JSON definition for each tool.
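Again as an illustrative sketch rather than the demo’s exact output, the full definition for CF_Add in OpenAI’s function-calling format might look like this:

```json
{
  "type": "function",
  "function": {
    "name": "CF_Add",
    "description": "Adds two numbers and returns the sum.",
    "parameters": {
      "type": "object",
      "properties": {
        "num1": { "type": "number", "description": "The first number to add." },
        "num2": { "type": "number", "description": "The second number to add." }
      },
      "required": ["num1", "num2"]
    }
  }
}
```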

The Agents table pulls all the individual tool JSON objects together into a single JSON array, which is passed to the model as the Tool Definitions.
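The value passed in the Tool Definitions field is then simply an array of those objects. Abridged to two of the four tools, with the descriptions shortened, it would look roughly like this:

```json
[
  {
    "type": "function",
    "function": {
      "name": "CF_Add",
      "description": "Adds two numbers.",
      "parameters": {
        "type": "object",
        "properties": { "num1": { "type": "number" }, "num2": { "type": "number" } },
        "required": ["num1", "num2"]
      }
    }
  },
  {
    "type": "function",
    "function": {
      "name": "CF_Subtract",
      "description": "Subtracts the second number from the first.",
      "parameters": {
        "type": "object",
        "properties": { "num1": { "type": "number" }, "num2": { "type": "number" } },
        "required": ["num1", "num2"]
      }
    }
  }
]
```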

A simple way to confirm that the AI agent is working and actually using the tools is to modify one of the custom functions to return an obviously wrong result. For example, change CF_Add to always return 99, then ask the agent, “What is 2 plus 2?” The response should come back as 99, showing that the model used the tool rather than calculating the answer itself.
