
Swift Semantic Search 🍏


This Swift demo app (S3, for "Swift Semantic Search") shows how to build real-time, native, AI-powered apps for Apple devices using Unum's Swift libraries. Under the hood, it uses UForm to understand and "embed" multimodal data, like images, multilingual texts, and 🔜 videos. Once the embeddings are computed, it uses USearch to provide real-time search over the semantic space. The same engine also enables geo-spatial search over the coordinates of the images, and has been shown to scale easily to 100M+ entries on an iPhone 🍏
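The two libraries compose naturally: UForm turns a query into a vector, USearch finds its nearest neighbors. Below is a minimal sketch of that flow, assuming the TextEncoder API from UForm's Swift README and the USearch Swift binding; the model name, connectivity, and key values are illustrative:

import UForm
import USearch

// Load the text tower of the same model whose image embeddings
// ship with the dataset (model name is illustrative).
let textEncoder = try await TextEncoder(
    modelName: "unum-cloud/uform3-image-text-english-small"
)

// Embed a textual query into the shared multimodal vector space.
let queryVector: [Float32] = try textEncoder
    .encode("a dog playing in the grass")
    .asFloats()

// Index image embeddings under the cosine metric and search in real time.
let index = USearchIndex.make(
    metric: .cos,
    dimensions: UInt32(queryVector.count),
    connectivity: 16,
    quantization: .F32
)
index.reserve(1_000)
index.add(key: 42, vector: queryVector) // in practice: one entry per image
let (keys, distances) = index.search(vector: queryVector, count: 10)

// Geo-spatial search reuses the same engine: index (latitude, longitude)
// pairs under the haversine metric instead of cosine.
let geoIndex = USearchIndex.make(
    metric: .haversine,
    dimensions: 2,
    connectivity: 16,
    quantization: .F32
)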

[Demo videos: SwiftSemanticSearch searching for "Dog" and for "Flowers"]

The demo app supports both text-to-image and image-to-image search, and uses vmanot/Media to fetch the camera feed, embedding and searching frames on the fly (see the sketch after the setup steps). To test the demo:

# Clone the repo
git clone https://github.com/ashvardanian/SwiftSemanticSearch.git

# Change directory & decompress dataset.zip, which contains:
#   - `images.names.txt` with newline-separated image names
#   - `images.uform3-image-text-english-small.fbin` - precomputed embeddings
#   - `images.uform3-image-text-english-small.usearch` - precomputed index
#   - `images` - directory with images
cd SwiftSemanticSearch
unzip dataset.zip
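
The dataset files are easy to inspect outside the app, too. A hedged sketch, assuming the common .fbin layout (two little-endian UInt32 values for vector count and dimensionality, followed by row-major Float32 data) and that the USearch Swift binding exposes load(path:) like the other language bindings:

import Foundation
import USearch

// Image names, one per line, aligned with the embedding rows.
let names = try String(contentsOfFile: "images.names.txt", encoding: .utf8)
    .split(separator: "\n")
    .map(String.init)

// Read the .fbin header: vector count, then dimensionality.
let blob = try Data(contentsOf: URL(fileURLWithPath: "images.uform3-image-text-english-small.fbin"))
let count = blob.withUnsafeBytes { $0.load(fromByteOffset: 0, as: UInt32.self) }
let dimensions = blob.withUnsafeBytes { $0.load(fromByteOffset: 4, as: UInt32.self) }

// Restore the prebuilt index instead of re-inserting every vector.
let index = USearchIndex.make(metric: .cos, dimensions: dimensions, connectivity: 16, quantization: .F32)
index.load(path: "images.uform3-image-text-english-small.usearch")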

After that, fire up the Xcode project and run the app on your fruity device!
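
For the image-to-image path mentioned above, each camera frame goes through UForm's image tower before hitting the same index. A sketch, assuming the ImageEncoder API from UForm's Swift README; the frame would come from the vmanot/Media camera feed:

import CoreGraphics
import UForm
import USearch

// Load the image tower of the same model (name is illustrative).
let imageEncoder = try await ImageEncoder(
    modelName: "unum-cloud/uform3-image-text-english-small"
)

// Embed a single camera frame and look up its nearest neighbors.
func nearestImages(to frame: CGImage, in index: USearchIndex) throws -> [USearchKey] {
    let vector: [Float32] = try imageEncoder.encode(frame).asFloats()
    let (keys, _) = index.search(vector: vector, count: 10)
    return keys
}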


Links: