
Storage Node

Most workflow nodes are stateless -- they run, produce output, and forget everything. The Storage node changes that. It gives your workflows memory by persisting JSON data between executions, so you can build strategies that track positions over time, remember previous signals, or deduplicate alerts across scheduled runs.

Without storage, every execution starts from scratch. With it, your workflow can answer questions like "what was the price last time I checked?" or "have I already sent this alert?"

Example node: Save Snapshot (store action)

Configuration

| Field | Description |
|---|---|
| Credential | NickAI Native (default -- zero config) or your own AWS credentials for a custom S3 bucket. |
| Backend | Amazon S3. Shown only when using custom credentials. |
| Action | store -- save incoming data. retrieve -- load previously saved data. store_if_empty -- save only if the key has no data yet. |
| Storage Key | The path for your data object. Supports {inputs.field} interpolation for dynamic keys. |
| Region | AWS region for S3. Default: us-east-1. Shown only when using custom credentials. |

Actions

Store

Saves all incoming JSON data (everything arriving at the trigger input) to the specified key. If the key already exists, its contents are overwritten.

Use this when you want to snapshot the current state of your workflow -- prices, portfolio positions, LLM analysis results -- so you can compare them on the next run.

Retrieve

Loads the JSON data previously saved at the specified key and passes it downstream through the data output. If the key does not exist, the node fails with an error.

Use this to load state from a previous execution -- last known prices, historical signals, or any data you stored earlier.
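A typical downstream use of retrieved state is comparing a stored value to a live one. The sketch below assumes a retrieve output shaped like the documented {storage.data} field, with a hypothetical lastPrice field inside it:

```python
def price_changed(retrieved, live_price, threshold=0.01):
    """Compare the live price to the last stored one.

    `retrieved` mimics the retrieve action's output: a dict with a
    "data" key holding the previously stored JSON. The lastPrice
    field and the 1% threshold are illustrative, not part of the node.
    """
    last = retrieved["data"]["lastPrice"]
    move = abs(live_price - last) / last
    return move >= threshold
```

A Conditional node downstream could branch on this kind of comparison to decide whether an alert is worth sending.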

Store If Empty

Writes data only when the key does not already exist. If the key has data, the node succeeds without overwriting and sets alreadyExists: true in the output.

This is useful for one-time initialization -- seeding default configuration, uploading reference data, or setting initial portfolio state. The workflow can run the same "store if empty" step on every execution without worrying about clobbering existing data.
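The three actions can be summarized with a small in-memory model. This is purely illustrative -- the real node persists to S3 -- but it captures the documented semantics: store overwrites, store_if_empty preserves existing data and reports alreadyExists, and retrieve fails on a missing key:

```python
class MemoryStorage:
    """In-memory model of the Storage node's three actions (illustrative
    only; the real backend is S3 and keys are user/workflow-prefixed)."""

    def __init__(self):
        self._objects = {}

    def store(self, key, data):
        # Overwrites any existing contents at the key.
        self._objects[key] = data
        return {"success": True, "alreadyExists": False}

    def store_if_empty(self, key, data):
        # Succeeds without writing if the key already has data.
        if key in self._objects:
            return {"success": True, "alreadyExists": True}
        self._objects[key] = data
        return {"success": True, "alreadyExists": False}

    def retrieve(self, key):
        # Documented behaviour: a missing key is an error, not an empty result.
        if key not in self._objects:
            raise KeyError(f"no data at key {key!r}")
        return {"success": True, "data": self._objects[key]}
```

Running store_if_empty twice with the same key leaves the first write intact, which is what makes it safe to use for initialization on every execution.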


Storage Key Interpolation

Storage keys support {edgeLabel.field} interpolation, just like prompts in the LLM node. This lets you create dynamic keys based on upstream data.

| Key pattern | Resolves to |
|---|---|
| portfolio/latest | portfolio/latest (static) |
| portfolio/{price_data.symbol} | portfolio/BTCUSDT |
| snapshots/{price_data.symbol}/{price_data.date} | snapshots/BTCUSDT/2026-03-20 |
| alerts/{function.alertId} | alerts/abc123 |

The resolved key is automatically prefixed with your user ID and workflow ID for security -- you cannot accidentally read or write another user's data.
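Interpolation plus prefixing can be modeled in a few lines. The sketch below resolves {edgeLabel.field} placeholders from upstream outputs and prepends an ID prefix; the actual prefix format the platform uses is not documented here, so the "user/workflow" layout is an assumption:

```python
import re

def resolve_key(template, upstream, user_id, workflow_id):
    """Resolve {edgeLabel.field} placeholders and apply an ID prefix.

    `upstream` maps edge labels to their output dicts. The prefix
    layout is a simplified stand-in for the platform's real scheme.
    """
    def repl(match):
        edge, field = match.group(1), match.group(2)
        return str(upstream[edge][field])

    key = re.sub(r"\{(\w+)\.(\w+)\}", repl, template)
    return f"{user_id}/{workflow_id}/{key}"

upstream = {"price_data": {"symbol": "BTCUSDT", "date": "2026-03-20"}}
resolve_key("snapshots/{price_data.symbol}/{price_data.date}",
            upstream, "u1", "wf9")
# → "u1/wf9/snapshots/BTCUSDT/2026-03-20"
```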


Backend Options

NickAI Native

The default. Select "NickAI Native" in the credential dropdown and you are done -- no AWS account, no bucket configuration, no IAM policies. Your data is stored in NickAI's platform-managed S3 bucket, scoped to your account.

  • Storage limit: 100 MB per upload
  • Data format: JSON
  • Isolation: Data is prefixed with your user ID and workflow ID, so workflows cannot access each other's storage

This is the recommended option for most users.

Custom S3

If you need to store data in your own AWS bucket -- for compliance, larger files, or integration with other systems -- select your AWS credentials from the dropdown instead.

  1. Go to Credentials and add an AWS credential with your accessKeyId, secretAccessKey, and bucketName.
  2. In the Storage node, select that credential.
  3. Set the Region to match your bucket.

Workflow Examples

EMA Crossover with Memory

A classic strategy that needs to remember the previous EMA values to detect crossovers. On each scheduled run, the workflow retrieves the last snapshot, computes new EMAs, checks for a crossover, and stores the updated state for next time.

BTC Price (BINANCE:BTCUSDT) → Load State (retrieve) → Compute EMAs (EMA 12/26) → Crossover? (1 rule) → Place Order (Buy BTC) → Save State (store)

The Load State node retrieves the previous EMA values. The Compute EMAs Function node receives both the live price and the stored state, calculates the new short and long EMAs, and detects a crossover. The Save State node persists the updated EMAs so the next run can pick up where this one left off.
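The Compute EMAs step can be sketched as a pure function over the live price and the stored state. The state shape (emaShort/emaLong fields) and the first-run seeding are assumptions for illustration:

```python
def update_emas(price, state, short_period=12, long_period=26):
    """One step of an EMA-crossover check with persisted state.

    `state` is the JSON loaded by the retrieve action (None on the
    first run, when there is nothing stored yet). Returns the updated
    state to pass to the store action, plus a crossover flag.
    """
    if state is None:
        # First run: seed both EMAs with the current price.
        return {"emaShort": price, "emaLong": price, "crossover": False}

    # Standard EMA smoothing: k = 2 / (period + 1).
    k_s = 2 / (short_period + 1)
    k_l = 2 / (long_period + 1)
    ema_short = price * k_s + state["emaShort"] * (1 - k_s)
    ema_long = price * k_l + state["emaLong"] * (1 - k_l)

    # Bullish crossover: short EMA was at or below the long EMA on the
    # previous run and is above it now.
    crossed_up = state["emaShort"] <= state["emaLong"] and ema_short > ema_long
    return {"emaShort": ema_short, "emaLong": ema_long, "crossover": crossed_up}
```

Persisting the returned dict with the store action is what lets the next scheduled run compare against these values instead of starting cold.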

Deduplication Pattern

Prevent duplicate alerts by storing a hash of each alert you send. Before sending, check if the hash already exists.

Analyze Market (Claude Sonnet 4.6) → Hash Signal (SHA-256) → Dedup Check (store_if_empty) → New Signal? (alreadyExists = false) → Send Alert

The Hash Signal Function node creates a unique key from the signal content. The Dedup Check Storage node uses store_if_empty with that hash as the key. If {storage.alreadyExists} is false, the signal is new and the alert goes out. If it is true, the signal was already sent on a previous run -- no duplicate email.
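The Hash Signal step might look like the sketch below: hash the signal content deterministically and use the digest as the storage key. The alerts/ key prefix and the field names are illustrative:

```python
import hashlib
import json

def signal_key(signal: dict) -> str:
    """Build a deterministic dedup key from signal content.

    Serializing with sort_keys=True makes identical signals hash
    identically regardless of field order, so the same signal always
    maps to the same storage key.
    """
    canonical = json.dumps(signal, sort_keys=True)
    digest = hashlib.sha256(canonical.encode()).hexdigest()
    return f"alerts/{digest}"
```

With this key, store_if_empty does the dedup work: the first run writes the key and reports alreadyExists = false (send the alert); any later run with the same signal finds the key occupied and reports alreadyExists = true (skip it).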


Pricing

The Storage node is FREE -- it costs 0 credits per execution. You can store and retrieve data as often as your workflow needs without any credit cost.


Output

The Storage node outputs different fields depending on the action.

Store / Store If Empty

| Path | Description |
|---|---|
| {storage.success} | true if the operation succeeded |
| {storage.action} | "store" or "store_if_empty" |
| {storage.key} | Full storage key used (including user/workflow prefix) |
| {storage.size} | Size of stored data in bytes |
| {storage.alreadyExists} | true if the key already had data (store_if_empty only) |
| {storage.metadata.backend} | "nickai" or "s3" |
| {storage.metadata.bucket} | Bucket name |
| {storage.metadata.timestamp} | ISO 8601 timestamp of the operation |

Retrieve

| Path | Description |
|---|---|
| {storage.success} | true if the operation succeeded |
| {storage.action} | "retrieve" |
| {storage.key} | Full storage key used |
| {storage.data} | The retrieved JSON object -- access nested fields like {storage.data.lastPrice} |
| {storage.size} | Size of retrieved data in bytes |
| {storage.metadata.backend} | "nickai" or "s3" |
| {storage.metadata.bucket} | Bucket name |
| {storage.metadata.timestamp} | ISO 8601 timestamp of the operation |
| {storage.metadata.lastModified} | When the data was last written |

Next Steps

  • Function Node -- Transform or compute data before storing, or process retrieved data.
  • Conditional Node -- Branch logic based on stored state (e.g., check {storage.alreadyExists}).
  • Price Data Node -- Fetch live prices to store as snapshots.
  • Credentials -- Set up AWS credentials for custom S3 storage.