Power tier · Local CLIP model, runs on your GPU

Search your video archive by what's actually in it.

Type a phrase. Get ranked thumbnails in well under a second. The model lives on your machine, the index lives in your SQLite database, and your archive never leaves the building.

Try "red dress at the beach" · "skateboarding at night"

[Demo: the StreamStash AI Search bar with the query "wearing a red dress at the beach" returning four ranked thumbnails (94%, 91%, 88%, 82%) in 0.04s]
What you get

Sub-second queries against everything you've saved.

Three numbers do most of the explaining. They come from a real install on a mid-range desktop, not a marketing benchmark.

0.04s

Typical query runtime

Vector lookup against an indexed library of a few thousand clips. Stays sub-second well past 100,000.

~1 min / 1,000

Index time per 1,000 clips

First-time cost on an RTX 3060. New saves are embedded in the background as they arrive.

0 calls

External API requests

No keys, no usage tier, no cloud. The model file sits on disk and the embeddings sit next to your library.
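To make "the embeddings sit next to your library" concrete, here is a minimal sketch of one way per-frame vectors could live in SQLite as packed BLOBs. The table and column names are illustrative assumptions, not StreamStash's actual schema, and the 4-float vector is a toy stand-in for CLIP's 512-dimensional output.

```python
import sqlite3
import struct

def pack(vec):
    """Serialize a float vector into a compact float32 BLOB."""
    return struct.pack(f"{len(vec)}f", *vec)

def unpack(blob):
    n = len(blob) // 4  # 4 bytes per float32
    return list(struct.unpack(f"{n}f", blob))

db = sqlite3.connect(":memory:")  # in practice: a file next to the library
db.execute(
    "CREATE TABLE IF NOT EXISTS embeddings ("
    "  clip_path TEXT PRIMARY KEY,"
    "  vector    BLOB NOT NULL)"
)

# Store a toy 4-dimensional embedding; CLIP ViT-B/32 emits 512 floats.
db.execute(
    "INSERT INTO embeddings VALUES (?, ?)",
    ("VID_20240715_142318.mp4", pack([0.1, 0.9, -0.3, 0.4])),
)

blob, = db.execute(
    "SELECT vector FROM embeddings WHERE clip_path = ?",
    ("VID_20240715_142318.mp4",),
).fetchone()
print(unpack(blob))
```

Packing as float32 keeps the index small (a 512-dim vector is 2 KB per clip), which is one plausible reason a 100,000-clip index stays fast to scan.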

How it works

Three local steps. No accounts, no uploads.

1. StreamStash embeds each saved frame

When a clip lands in your library, it's passed through a local CLIP model. The model returns a vector — a numerical fingerprint of what's visible in the frame.

CLIP ViT-B/32 · GPU when available, CPU otherwise
2. Your query becomes a vector too

Type a phrase. CLIP encodes the text into the same vector space as your frames, so "person in red at the beach" lands near the frames that show that scene.

Phrasing only has to be roughly right
3. Results come back ranked by distance

The closest matches surface as ranked thumbnails with a confidence score. Click one to play the underlying file from your library.

Stays sub-second past 100,000 clips
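The three steps above reduce to a nearest-neighbour search: compare the query vector against every stored frame vector and sort by similarity. A minimal sketch, with toy 3-dimensional vectors standing in for CLIP output and a hypothetical in-memory library (the real comparison would run over the SQLite index):

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def rank(query_vec, library):
    """library: {path: vector}. Returns (path, score%) pairs, best first."""
    scored = [(path, cosine(query_vec, vec)) for path, vec in library.items()]
    scored.sort(key=lambda t: t[1], reverse=True)
    return [(path, round(100 * s)) for path, s in scored]

library = {
    "beach_red_dress.mp4": [0.9, 0.4, 0.1],
    "skate_night.mp4":     [0.1, 0.2, 0.9],
    "cooking_pasta.mp4":   [0.5, 0.5, 0.5],
}
query = [0.8, 0.5, 0.2]  # pretend output of the CLIP text encoder

for path, score in rank(query, library):
    print(f"{path}  {score}%")
```

A brute-force scan like this is linear in library size, which is consistent with staying sub-second into the low hundreds of thousands of clips before an approximate index would be needed.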
The honest comparison

Filename search misses everything you actually wanted to find.

Most archives are searchable in name only. AI search reads the pixels — which is the part you remember.

Filename search

You remember a moment, not a string.

You know there's a clip somewhere with someone in a red dress on a beach. The filename has none of that.

VID_20240715_142318.mp4
tiktok_@creator_7350421987651.mp4
IG_reel_C8wPx-7tNUk.mp4

No match. You scroll for fifteen minutes, give up, and re-save the post from the original platform.

AI search

You type what you remember seeing.

The query goes through CLIP locally and lands next to the frames that actually look like the scene.

red dress at the beach

VID_20240715_142318.mp4 · 94%
tiktok_@creator_7350... · 91%
IG_reel_C8wPx-7tNUk.mp4 · 88%
In the app

The actual search screen.

Same layout, same scoring, same local-first execution. This is the page in StreamStash, not a stylised mock.

[Screenshot: StreamStash AI search page with a natural-language query and ranked result thumbnails]
Included with Power

AI search ships with the Power tier.

Power adds AI search, all eight platforms, unlimited monitored feeds, unlimited live monitors, and cross-platform deduplication. One-time purchase, all updates included, runs entirely on your machine.

From £40 · one-time, lifetime updates
FAQ

Questions about local AI search.

How does AI search work?

StreamStash uses a local CLIP model to turn each saved frame into a vector — a numerical fingerprint of what the frame contains. Your typed query becomes a vector in the same space, and the closest matches come back as ranked thumbnails. The model and the index live on your machine, so nothing is uploaded and there is no third-party service in the chain.
Does anything leave my machine?

No. The CLIP weights sit on disk after a one-time download. Indexing reads your local files, runs them through CLIP locally, and writes the results into a SQLite database next to your library. Querying is the same: input is text, output is a list of file paths and scores. There is no API key, no telemetry on the search itself, and nothing is sent to a third party.
What hardware do I need?

Any reasonably modern NVIDIA GPU works. A mid-range card like an RTX 3060 indexes about 1,000 clips per minute and returns queries in well under a second. CPU-only fallback is supported but indexing is slower — fine if you index overnight, less fun for very large libraries.
How long does indexing take?

Roughly one minute per 1,000 clips on a mid-range GPU. You only pay that cost once. New saves are embedded in the background as they arrive, so the search bar stays current without you having to run anything by hand.
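The background embedding described above is, in outline, a producer/consumer queue: saves go on as they arrive, and a worker thread embeds them so the index stays current. A hedged sketch under that assumption — `embed()` is a placeholder standing in for the real CLIP forward pass, and the dict stands in for the SQLite table:

```python
import queue
import threading

def embed(path):
    """Placeholder: real code would run the frame through CLIP here."""
    return [0.0] * 512

index = {}               # path -> vector (stand-in for the SQLite table)
pending = queue.Queue()

def worker():
    while True:
        path = pending.get()
        if path is None:     # sentinel: shut the worker down
            break
        index[path] = embed(path)
        pending.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()

# Simulate three clips landing in the library.
for clip in ["a.mp4", "b.mp4", "c.mp4"]:
    pending.put(clip)

pending.put(None)        # no more saves; let the worker exit
t.join()
print(sorted(index))     # ['a.mp4', 'b.mp4', 'c.mp4']
```

Because the worker runs off the main thread, searches keep answering from the already-built index while new clips are still being embedded.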
Does it have limitations?

Yes, and it's worth being honest about them. CLIP is strongest on visual scenes and weaker on highly specific text inside frames, on subtle differences between similar-looking creators, and on prompts that describe abstract or stylistic concepts. Phrases that describe a scene work; phrases that describe a feeling are hit or miss.
Is AI search in the free tier?

No. AI search is a Power tier feature. Free includes Quick Download, the local library, and limited monitoring. Power adds AI search, all platforms, unlimited feeds and live monitors, and cross-platform deduplication.

Stop scrolling. Start searching.

Power unlocks AI search and the full archiving stack. One-time purchase, runs on your machine, every update included.

See Power pricing