The Core Principle: Specificity
Instead of “Fix the user service,” try:

“In the `my-web-app` repo (PR #42), refactor the `UserService` class in `src/services/user.ts` to use the `UserRepository` pattern shown in `ProductService`/`ProductRepository`. Ensure all tests in `tests/services/user.test.ts` pass and add new tests for the repository with 90%+ coverage. Update the diagram in `docs/architecture/user-service.md`.”

If there are specific implementation details you want included, make sure to specify them explicitly, as in the example above.
Elements of a Strong Prompt
- Scope: What repository, branch, or files are involved? (e.g., `my-web-app` repo, PR #42, `src/services/user.ts`)
- Goal: What is the high-level objective? (e.g., refactor `UserService`, improve testability)
- Tasks: What specific actions should the agent take? Use a numbered or bulleted list for clarity. (e.g., extract logic to `UserRepository`, use dependency injection, update tests, update diagram)
- Context/Patterns: Are there existing patterns, examples, or documentation to reference? (e.g., `ProductService`, `ProductRepository`)
- Success Criteria: How will you know the task is done correctly? (e.g., tests pass, 90%+ coverage, diagram updated)
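The five elements above can be combined into a reusable template. This is only a sketch; the angle-bracket placeholders are illustrative, not required syntax:

```text
In the <repo> repo (<branch or PR>), <high-level goal>.
1. <first specific task>
2. <second specific task>
Follow the pattern used in <existing example in the codebase>.
Done when: <success criteria, e.g. tests pass, docs updated>.
```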
Platform-Specific Guidance
Slack Integration
When prompting Codegen in Slack:

- Use threads for complex, multi-step requests to keep context organized
- Tag specific repos early in the conversation: “In the `frontend` repo…”
- Reference PR numbers directly: “Review PR #123 and suggest improvements”
- Ask for status updates on long-running tasks: “What’s the progress on the refactoring?”
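Putting these points together, a Slack request might look like the following (the repo name and PR number come from the examples above; the file path is hypothetical):

```text
In the frontend repo, review PR #123. Focus on the new caching logic in
src/cache/ — check for race conditions and suggest improvements.
Reply in this thread so the discussion stays in one place.
```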
Linear Integration
When working with Linear issues:

- Reference issue numbers for context: “This relates to CG-1234”
- Break down large tasks into sub-issues when appropriate
- Specify acceptance criteria clearly in the issue description
- Use @codegen to assign implementation tasks directly
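For example, an assignment comment on a Linear issue might read like this (the issue number comes from the example above; the feature and acceptance criteria are hypothetical):

```text
@codegen This relates to CG-1234. Implement the export endpoint described
in the issue. Acceptance criteria: CSV and JSON formats supported, unit
tests added, no changes to the existing import flow.
```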
GitHub Integration
When requesting PR reviews or code changes:

- Link to specific files and line numbers when possible
- Reference related issues or previous PRs for context
- Specify review criteria: security, performance, maintainability
- Request specific types of feedback: “Focus on error handling”
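A review request applying these points might look like this (the PR numbers and file path are hypothetical):

```text
Review PR #87, focusing on error handling in src/api/client.ts. Check
that failed requests are retried with backoff and that errors surface
to the caller. Related context: the retry pattern introduced in PR #64.
```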
Common Task Types
Code Review
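A code review prompt should name the target and the review criteria. A hypothetical example:

```text
Review the changes in PR #42 for security and maintainability. Flag any
unvalidated user input and any duplication with existing helpers.
```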
Bug Fixing
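A bug-fixing prompt should describe the symptom, how to reproduce it, and what must keep working. A hypothetical example (the test path is illustrative):

```text
Users get a 500 error when saving a profile with an empty display name.
Reproduce it in tests/api/profile.test.ts, fix the root cause, and add a
regression test. All existing tests must continue to pass.
```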
Feature Development
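A feature request should state the goal, a pattern to follow, and the success criteria. A hypothetical example (the file paths are illustrative):

```text
Add CSV export to the reports page. Follow the existing PDF export in
src/reports/export-pdf.ts, add unit tests, and update docs/reports.md.
Done when exports match the on-screen data and all tests pass.
```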
Refactoring
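The `UserService` example from the top of this page works well as a standalone refactoring prompt:

```text
Refactor UserService in src/services/user.ts to use the UserRepository
pattern shown in ProductService/ProductRepository. All tests in
tests/services/user.test.ts must pass; add repository tests with
90%+ coverage.
```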
Advanced Techniques
Providing Context Effectively
When working with large codebases, point the agent at the most relevant directories, files, and existing patterns up front rather than leaving it to search on its own.

Iterative Prompting
Build on previous work: reference earlier results or PRs and ask for incremental changes instead of restating the whole task each time.

Handling Complex Dependencies
For interconnected changes, spell out the order of operations and which components depend on each other so the agent can sequence its work safely.

Troubleshooting Common Issues
“I don’t have enough context”
Problem: Codegen asks for more information
Solution: Provide file paths, existing patterns, and specific requirements upfront

“The changes broke existing functionality”
Problem: Modifications caused test failures
Solution: Always specify which tests should continue passing and request regression testing

“The implementation doesn’t match our patterns”
Problem: Code doesn’t follow team conventions
Solution: Reference specific examples of preferred patterns in your codebase

“The task is too large”
Problem: Request is overwhelming or unclear
Solution: Break down into smaller, specific tasks with clear dependencies

Best Practices
Be Specific
Include file paths, function names, and exact requirements rather than general descriptions.
Provide Examples
Reference existing code patterns, similar implementations, or desired outcomes.
Set Success Criteria
Define what “done” looks like: passing tests, performance metrics, or specific functionality.
Consider Dependencies
Mention related systems, APIs, or components that might be affected by changes.
Clear, detailed prompts empower Codegen agents to deliver accurate results
faster, significantly streamlining your workflow. Remember that Codegen has access
to your entire codebase and can understand complex relationships between files.
Getting Help
If you’re unsure how to structure a prompt for your specific use case:

- Start with a simple request and iterate based on the results
- Ask Codegen to suggest how to break down complex tasks
- Reference the capabilities documentation to understand available tools
- Check out real examples in our use cases section