Cody is an AI coding assistant that helps you understand, write, and fix code faster. It uses advanced search to pull context from both local and remote codebases, giving you answers informed by the APIs, symbols, and usage patterns across your entire codebase at any scale, all from within VS Code. Cody users can also choose from the latest large language models, such as GPT-4o and Claude 3.5 Sonnet, to customize Cody to their needs.
Install Cody to get started with free AI-powered autocomplete, chat, prompts, and more.
Cody autocompletes single lines or whole functions in any programming language, configuration file, or documentation. It's powered by the latest instant LLMs for accuracy and performance.
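For example, typing a comment and a function signature is often enough for Cody to suggest the rest of the function. The snippet below is a hypothetical illustration of a whole-function completion in TypeScript; the function name and body are placeholders, not actual Cody output.

```typescript
// You type a comment and a signature...
// Return the median of a non-empty list of numbers.
export function median(values: number[]): number {
  // ...and Cody can suggest a complete body like this:
  if (values.length === 0) {
    throw new Error("median requires a non-empty list");
  }
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 === 0
    ? (sorted[mid - 1] + sorted[mid]) / 2
    : sorted[mid];
}
```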
Cody chat answers questions about your entire codebase, specific files and symbols, or general programming topics.
For example, you can ask Cody:
- "How is our app's secret storage implemented on Linux?"
- "Where is the CI config for the web integration tests?"
- "Write a new GraphQL resolver for the AuditLog"
- "Why is the UserConnectionResolver giving an "unknown user" error, and how do I fix it?"
- "Add helpful debug log statements"
- "Make this work" (seriously, it often works—try it!)
Streamline your development process by using prompts to understand, improve, fix, document, and generate unit tests for your code.
You can also create your own prompts and save them in the Prompt Library to tailor Cody to your workflow.
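As a sketch of what a test-generation prompt can produce, here is a hypothetical Vitest suite for the `median` helper shown earlier; the test framework, module path, and test cases are assumptions for illustration, not fixed Cody output.

```typescript
import { describe, it, expect } from "vitest";
import { median } from "./median"; // hypothetical module path

describe("median", () => {
  it("returns the middle value for an odd-length list", () => {
    expect(median([3, 1, 2])).toBe(2);
  });

  it("averages the two middle values for an even-length list", () => {
    expect(median([4, 1, 2, 3])).toBe(2.5);
  });

  it("throws on an empty list", () => {
    expect(() => median([])).toThrow();
  });
});
```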
Cody users can select the LLM they use for chat and experiment to find the best model for the job. Choose from multiple options, including Claude 3.5 Sonnet, Gemini 1.5 Pro, and Mixtral 8x7B. Cody Pro users can also select Claude 3 Opus and GPT-4o. See the full list of model options here.
Administrators of Sourcegraph Enterprise instances can configure which models team members can choose from.
This extension works for all Cody plans, including Cody Free, Cody Pro, and Cody Enterprise.
You can find detailed information about Cody's available plans on our website.
Cody works with any programming language because it uses LLMs trained on broad data. It works especially well with Python, Go, JavaScript, and TypeScript.
Cody is powered by Sourcegraph’s code search, which it uses to retrieve context from your codebase and extend its capabilities. By using context from entire projects, Cody can give more accurate answers and generate idiomatic code.
For example:
- Ask Cody to generate an API call. Cody can gather context on your API schema to inform the code it writes (see the sketch after this list).
- Ask Cody to find where in your codebase a specific component is defined. Cody can retrieve and describe the exact files where that component is written.
- Ask Cody questions that require an understanding of multiple files. For example, ask Cody how frontend data is populated in a React app; Cody can find the React component definitions to understand what data is being passed and where it originates.
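To make the first example above concrete, here is a hypothetical sketch of the kind of schema-aware code Cody might write after reading a project's GraphQL schema. The `AuditLog` fields, endpoint URL, and auth handling are assumptions for illustration, not a real schema.

```typescript
// Hypothetical: fetch recent AuditLog entries from a project's GraphQL API.
interface AuditLogEntry {
  id: string;
  actor: string;
  action: string;
  createdAt: string;
}

async function fetchAuditLog(limit = 20): Promise<AuditLogEntry[]> {
  const query = `
    query AuditLog($limit: Int!) {
      auditLog(first: $limit) {
        nodes { id actor action createdAt }
      }
    }
  `;

  const response = await fetch("https://example.com/api/graphql", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `token ${process.env.API_TOKEN}`, // placeholder auth scheme
    },
    body: JSON.stringify({ query, variables: { limit } }),
  });

  if (!response.ok) {
    throw new Error(`GraphQL request failed: ${response.status}`);
  }

  const { data } = await response.json();
  return data.auditLog.nodes as AuditLogEntry[];
}
```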
Cody Enterprise can search context from your entire remote codebase using Sourcegraph's code search. This allows Cody to answer questions about all of your code, even the repositories that don't live on your local machine.
Contact us to set up a trial of Cody Enterprise. If you’re an existing Sourcegraph Enterprise customer, contact your technical advisor.
See https://cody.dev/ for demos, information, and more.