# Variables
Turn a filter value into a named runtime prompt — write a query once, run it against different tenants, DNs, or thresholds without editing the graph.
Sometimes you want the same query with a different input. "Bridge domains under tn-prod" today, "bridge domains under tn-staging" tomorrow. Editing the filter value each time is noise; the query's shape isn't changing, only one scalar is.
Variables fix that. You mark a filter value as a variable, and Fabrik prompts for it at run time instead of hard-coding it in the saved query.
## The wrench icon
Every filter value input — Property filter value, wildcard pattern, Pattern Builder leaf — has a small wrench icon next to it. Click it and the Configure Variable dialog opens.
After you save, the input shows the variable syntax (`${tenant_name}`) instead of a literal value, and the wrench pulses to remind you it's live.
## The Configure Variable dialog
Six fields — only Label is always required.
| Field | What it does |
|---|---|
| Label (required) | Human-readable name shown at run time ("Tenant Name", "IP Address"). |
| Type | `text`, `number`, or `select`. Controls the input widget at run time. |
| Options | (`select` only) Comma-separated list of allowed values. |
| Default value | Pre-fills the run-time prompt. Usually the value that was in the field when you opened the dialog. |
| Placeholder | Hint text inside the empty input at run time. |
| Required | When checked, the Run button stays disabled until the user types something. |
The variable's ID is auto-generated from the label (Tenant Name → `tenant_name`). You reference it in other fields as `${tenant_name}`.
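The Label → ID derivation amounts to a simple slugify. This is an illustrative Python sketch, not Fabrik's actual code — the exact rules (punctuation handling, leading digits) are assumptions:

```python
import re

def variable_id(label: str) -> str:
    # Assumed rule: lowercase, collapse runs of non-alphanumeric
    # characters into single underscores, trim stray underscores.
    return re.sub(r"[^a-z0-9]+", "_", label.lower()).strip("_")

variable_id("Tenant Name")  # -> "tenant_name"
variable_id("IP Address")   # -> "ip_address"
```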
**Reusing one variable across fields.** The same `${tenant_name}` can appear in multiple filter values across multiple nodes. At run time, the user enters it once and every reference is substituted. This is the payoff: a single prompt, many filters.
## The `${...}` syntax
You can also type `${var_id}` directly into any filter value — no dialog needed. Fabrik detects the pattern when the graph runs, creates a default variable definition (text, required, label derived from the ID), and prompts for it.
The dialog is just a nicer front end for the same thing; use whichever fits your flow.
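The detection step boils down to scanning filter values for the placeholder pattern and synthesizing one definition per unique ID. A hedged Python sketch — the data shapes here are hypothetical, not Fabrik's internals:

```python
import re

PLACEHOLDER = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")

def default_definitions(filter_values):
    """Build one default definition per unique ${var_id} found:
    text type, required, label derived from the ID."""
    defs = {}
    for value in filter_values:
        for var_id in PLACEHOLDER.findall(value):
            defs.setdefault(var_id, {
                "label": var_id.replace("_", " ").title(),
                "type": "text",
                "required": True,
            })
    return defs

default_definitions(["uni/tn-${tenant_name}/.*", "${tenant_name}"])
# one entry: tenant_name, labelled "Tenant Name"
```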
## Running a query with variables
A query that contains variables changes the Run button's behaviour:
- The button reads Configure & Run with a wrench icon instead of a lightning bolt.
- Clicking it opens the Configure Query Variables dialog — one field per unique variable, with the widget type (text / number / select) you configured.
- Required variables are starred; the Run Query button inside the dialog stays disabled until every required field has a value.
- Cancel dismisses without running.
On submit, Fabrik substitutes values into the graph in-memory, then sends the resolved query to the backend. The saved query on disk still holds the ${...} placeholders — you only replaced them for this one execution.
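Conceptually the substitution is: deep-copy the graph, fill placeholders in filter values, send the copy. A Python sketch under assumed data shapes (the node structure is hypothetical; Python's `string.Template` conveniently uses the same `${...}` syntax):

```python
from copy import deepcopy
from string import Template

def resolve(graph: dict, values: dict) -> dict:
    """Return a resolved copy for this one execution; the saved
    graph keeps its ${...} placeholders untouched."""
    resolved = deepcopy(graph)
    for node in resolved["nodes"]:
        if node["type"] == "filter":
            node["value"] = Template(node["value"]).safe_substitute(values)
    return resolved

saved = {"nodes": [{"type": "filter", "value": "uni/tn-${tenant_name}/.*"}]}
run = resolve(saved, {"tenant_name": "prod"})
# run's filter value is resolved; saved still holds the placeholder
```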
## Where variables work
Variables resolve inside Filter nodes at run time. That includes:
- Property filter values (including the value side of a `wcard` comparison).
- Pattern Builder leaves (each pattern's value field has its own wrench).
- The value input on inline filters attached to a Class node.
The wrench also appears on some Post-Processor config fields (regex patterns, replacement strings). The values still carry the ${...} marker, but runtime substitution is currently scoped to Filter nodes. For now, the reliable path is: declare the variable on a Filter value, and let it drive the query. If you need a post-processor to vary per run, consider splitting the work into two saved queries.
## Defaults and scheduled runs
Scheduled queries can't prompt anyone — they run unattended. The schedule carries a values blob alongside the query reference, filled in when you set the schedule up. Variable defaults from the original query serve as the starting point in that dialog, but the scheduled copy is independent: editing the defaults on the source query doesn't retroactively change what the schedule sends.
See Scheduled queries for the scheduling UI.
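The snapshot behaviour can be made concrete with a small sketch — the query and schedule shapes here are invented for illustration:

```python
def create_schedule(query: dict, entered: dict) -> dict:
    """Snapshot values at schedule-creation time: defaults seed the
    dialog, user entries override, and the result is stored on the
    schedule, independent of the source query from then on."""
    values = {v["id"]: v.get("default", "") for v in query["variables"]}
    values.update(entered)
    return {"query_ref": query["name"], "values": values}

q = {"name": "bds-by-tenant",
     "variables": [{"id": "tenant_name", "default": "tn-prod"}]}
sched = create_schedule(q, {})
q["variables"][0]["default"] = "tn-staging"  # later edit to the source query
sched["values"]["tenant_name"]               # still "tn-prod"
```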
## Worked example — tenant-scoped search
A tiny recipe you'll reach for often.
- Build `Start → fvTenant → Filter (wcard on dn) → fvBD → Output`.
- On the Filter, pick `dn` as the property, operator `wcard`, value `uni/tn-placeholder/.*`.
- Click the wrench next to the value. Label it Tenant Name, type `text`, default `tn-prod`, required.
- Change the value to `uni/tn-${tenant_name}/.*`.
- Hit Configure & Run, enter a tenant, click Run Query.
Same graph now works for every tenant. Save it once; reuse forever.
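To sanity-check the resolved pattern, you can approximate the `wcard` match with Python's `re` — APIC's exact wildcard semantics may differ, so treat this only as an illustration of the DN filter's shape:

```python
import re

# Resolve the placeholder by hand, then test the pattern against DNs.
resolved = "uni/tn-${tenant_name}/.*".replace("${tenant_name}", "prod")
bool(re.match(resolved, "uni/tn-prod/BD-web"))     # True
bool(re.match(resolved, "uni/tn-staging/BD-web"))  # False
```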
## Troubleshooting
Variable gotchas that come up often:
- "The Run button didn't change to Configure & Run." The filter value isn't actually marked as a variable yet. Check that the input shows `${var_id}` (not a literal). Click the wrench and save.
- "I see `${tenant_name}` in the results." Substitution didn't happen because the variable wasn't declared on a Filter node — it's probably on a Post-Processor field. Move the variable to a Filter, or hard-code the value.
- "The same variable asks me twice at run time." Two variables with the same ID but different metadata. Open each dialog and confirm they share the same Label and ID — duplicates slip in when you type the syntax manually in two places.
- "Required variable is blank but Run Query is still enabled." Whitespace-only values count as blank; Fabrik `.trim()`s before validating. If the button is enabled, your value has non-whitespace content.
- "I want to remove a variable." Clear the field in the filter value and retype the literal you want. The variable metadata sticks to the field, not the whole graph — once no field references it, it's gone from the Configure dialog.
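The trim-before-validate rule above can be sketched like this — the data shape is assumed, and Fabrik's UI does the equivalent in JavaScript with `.trim()`:

```python
def run_enabled(variables: list, entered: dict) -> bool:
    """Enable Run Query only when every required variable has
    non-whitespace content after trimming."""
    return all(
        entered.get(v["id"], "").strip() != ""
        for v in variables
        if v.get("required")
    )

required = [{"id": "tenant_name", "required": True}]
run_enabled(required, {"tenant_name": "   "})   # False: whitespace-only
run_enabled(required, {"tenant_name": "prod"})  # True
```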
Variables let one query serve many inputs. The next page — Pipelines — is about chaining queries so the output of one becomes the filter input of the next.