Build with LLMs

If you use AI assistants or coding agents in your workflows, this guide will help you accelerate how you work with Statsig in your application.

The llms.txt file

We host an /llms.txt file at https://docs.statsig.com/llms.txt that provides a structured index of all Statsig documentation.

This file follows an emerging convention for making documentation accessible to AI tools.

When you provide this URL to an AI coding assistant like Codex, Cursor, Claude Code, or GitHub Copilot, it can understand Statsig's tools and capabilities and give more accurate, contextual help when you're working with Statsig.

Example usage

There are two ways to use the /llms.txt file with AI tools:

1. Reference it when asking questions

Provide the URL when asking AI chatbots questions about Statsig:

Example Question 1

Read Statsig documentation: https://docs.statsig.com/llms.txt
How do I implement feature flags in a Next.js 14 app using the App Router?

Example Question 2

Reference https://docs.statsig.com/llms.txt
I'm getting a "Network timeout" error when initializing the Python SDK. What could be causing this and how do I fix it?

Example Question 3

Read https://docs.statsig.com/llms.txt
Should I use client-side or server-side evaluation for a mobile app? What are the tradeoffs?
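The pattern in all three examples is the same: prepend the llms.txt URL to your question so the assistant reads the index before answering. If you build prompts programmatically (for example, when scripting a coding agent), this can be sketched as a small helper (the function name here is hypothetical, not part of any Statsig SDK):

```python
LLMS_TXT_URL = "https://docs.statsig.com/llms.txt"

def with_statsig_context(question: str) -> str:
    """Prefix a question with a reference to Statsig's llms.txt index,
    mirroring the prompt pattern shown in the examples above."""
    return f"Read Statsig documentation: {LLMS_TXT_URL}\n{question}"

prompt = with_statsig_context(
    "How do I implement feature flags in a Next.js 14 app using the App Router?"
)
```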

2. Add it to your IDE context

Add https://docs.statsig.com/llms.txt to your project context in AI coding assistants like Cursor, Windsurf, or GitHub Copilot. The AI will then automatically reference the Statsig documentation whenever it helps with your Statsig code.
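For instance, in Cursor one lightweight way to do this is a project rules file that tells the assistant to consult the index (the exact mechanism varies by tool and version; newer Cursor releases use rules under `.cursor/rules/` instead of a single `.cursorrules` file, so check your tool's documentation):

```
# .cursorrules
When answering questions about Statsig (feature gates, experiments,
dynamic configs, SDKs), first consult the documentation index at
https://docs.statsig.com/llms.txt before proposing code.
```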