A gRPC-based code analysis service for C# codebases using tree-sitter and stack-graphs. Part of the Konveyor analyzer-lsp ecosystem.
This tool provides semantic code analysis for C# projects, enabling queries to find:
- Type references (classes, interfaces, structs)
- Method calls and definitions
- Field usages and declarations
- Namespace imports and usages
It builds a stack graph from C# source code and optionally decompiled dependencies, then provides a gRPC service for querying that graph.
- Rust 1.70+ with cargo
- Protocol Buffers compiler (protoc)
- .NET SDK 9.x or higher
- Optional: ilspycmd and paket for dependency analysis
# Clone the repository
git clone <repository-url>
cd c-sharp-analyzer-provider-cli
# Build
cargo build
# Install .NET tools (optional, for full dependency analysis)
dotnet tool install --global ilspycmd
dotnet tool install --global paket
# Start the server
cargo run -- --port 9000 --name c-sharp
# In another terminal, initialize a project
grpcurl -plaintext -d '{
"analysisMode": "source-only",
"location": "/path/to/csharp/project"
}' localhost:9000 provider.ProviderService.Init
# Query for references
grpcurl -plaintext -d '{
"cap": "referenced",
"conditionInfo": "{\"referenced\": {\"pattern\": \"System\\\\.Web\\\\.Mvc.*\"}}"
}' localhost:9000 provider.ProviderService.Evaluate
- Semantic Analysis: Uses tree-sitter for parsing and stack-graphs for semantic understanding
- Dependency Analysis: Optionally resolves and analyzes .NET dependencies
- Pattern Matching: Regex-based queries for flexible symbol search
- Location Filtering: Query by location type (method, field, class, or all)
- gRPC Service: Standard gRPC interface for integration
- Multiple Transports: HTTP/2, Unix domain sockets, or Windows named pipes
- Persistent Caching: SQLite-based stack graph storage for fast startup
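The regex patterns in the grpcurl examples above are escaped twice because conditionInfo is itself a JSON document carried as a string inside the outer request. A quick standard-library check of what the service sees after one round of decoding (illustrative only; it demonstrates the escaping, not the service itself):

```python
import json
import re

# conditionInfo is a JSON string *inside* the outer JSON request, so each
# regex backslash is doubled on the grpcurl command line.
condition_info = json.dumps({"referenced": {"pattern": r"System\.Web\.Mvc.*"}})
print(condition_info)  # {"referenced": {"pattern": "System\\.Web\\.Mvc.*"}}

# After one round of JSON decoding, the pattern is a plain regex:
pattern = json.loads(condition_info)["referenced"]["pattern"]
assert re.fullmatch(pattern, "System.Web.Mvc.Controller")
assert re.fullmatch(pattern, "System_Web_Mvc") is None  # dots are literal

# A suffix query like ".*Controller" needs no escaping at all:
assert re.fullmatch(".*Controller", "HomeController")
```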
- Quick Start Guide - Get up and running quickly
- CLAUDE.md - Guidance for AI assistants working with this codebase
- Architecture Overview - System design, components, and data flow
- Development Guide - Setup, workflows, and adding features
- Testing Guide - Running tests, debugging, and writing new tests
Analyzes only your project's source code. Fast and lightweight.
cargo run -- --port 9000 --name c-sharp
# Then init with: "analysisMode": "source-only"
Analyzes source code plus all resolved dependencies. Requires ilspycmd and paket.
# Install tools first
dotnet tool install --global ilspycmd
dotnet tool install --global paket
# Run server
cargo run -- --port 9000 --name c-sharp
# Init with: "analysisMode": "full"
grpcurl -plaintext -d '{
"cap": "referenced",
"conditionInfo": "{\"referenced\": {\"pattern\": \"System\\\\.Collections.*\"}}"
}' localhost:9000 provider.ProviderService.Evaluate
grpcurl -plaintext -d '{
"cap": "referenced",
"conditionInfo": "{\"referenced\": {\"pattern\": \"MyApp\\\\.Services\\\\..*\", \"location\": \"method\"}}"
}' localhost:9000 provider.ProviderService.Evaluate
grpcurl -plaintext -d '{
"cap": "referenced",
"conditionInfo": "{\"referenced\": {\"pattern\": \".*Controller\", \"location\": \"class\"}}"
}' localhost:9000 provider.ProviderService.Evaluate
# Build
cargo build
# Run linter
cargo clippy
# Run tests
make run-demo
# Run specific test
cargo test -- --nocapture
src/
├── main.rs # Server entry point
├── analyzer_service/ # gRPC service definitions
├── provider/ # Provider implementation
├── c_sharp_graph/ # Stack graph query engine
└── pipe_stream/ # Named pipe support (Windows)
tests/
├── integration_test.rs # Integration tests
└── demos/ # Test cases
docs/ # Developer documentation
See Development Guide for detailed information.
Contributions are welcome! Please:
- Read the Development Guide
- Check existing issues or create a new one
- Fork the repository and create a feature branch
- Make your changes with tests
- Run cargo clippy and cargo fmt
- Submit a pull request
The project uses integration tests that run against a live server instance:
# Full test suite with server management
make run-demo
# Manual testing
cargo run -- --port 9000 --name c-sharp # Terminal 1
cargo test -- --nocapture # Terminal 2
See Testing Guide for comprehensive testing documentation.
The system consists of several layers:
- gRPC Service Layer: Handles client requests and responses
- Provider Layer: Manages project state and coordinates analysis
- Stack Graph Engine: Builds and queries semantic graphs
- Dependency Resolution: Handles .NET dependencies via Paket and ILSpy
See Architecture Overview for detailed design documentation.
- Rust standard library
- SQLite (for graph caching)
- ilspycmd: Decompiles .NET assemblies to C# source
  dotnet tool install --global ilspycmd
- paket: Resolves .NET dependencies
  dotnet tool install --global paket
Options:
--port <PORT> TCP port for gRPC over HTTP/2
--socket <SOCKET> Unix socket or named pipe path
--name <NAME> Service name
--db-path <DB_PATH> SQLite database path (default: temp dir)
--log-file <LOG_FILE> Log file path
-v, --verbosity Log verbosity level
RUST_LOG: Set log level (debug, info, warn, error)
RUST_LOG=debug cargo run -- --port 9000
- Caching: Stack graphs are persisted to SQLite for fast restarts
- Streaming: Results are streamed to avoid buffering large result sets
- Concurrency: Multi-threaded async runtime handles concurrent requests
- Incremental: Reuses cached graphs when project hasn't changed
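The caching and incremental bullets above amount to keying a persisted graph by a content hash of the sources, so an unchanged project skips re-indexing. A minimal sketch of that idea (illustrative only; the function names and schema are hypothetical, not the analyzer's actual SQLite layout):

```python
import hashlib
import sqlite3

# Hypothetical sketch: store serialized graphs keyed by a digest of the
# source files, so a cache hit avoids rebuilding the stack graph.
def source_digest(paths):
    h = hashlib.sha256()
    for p in sorted(paths):                # stable order => stable digest
        h.update(p.encode())
        with open(p, "rb") as f:
            h.update(f.read())
    return h.hexdigest()

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE graphs (digest TEXT PRIMARY KEY, graph BLOB)")

def load_or_build(paths, build):
    digest = source_digest(paths)
    row = db.execute(
        "SELECT graph FROM graphs WHERE digest = ?", (digest,)
    ).fetchone()
    if row:
        return row[0]                      # cache hit: reuse stored graph
    graph = build(paths)                   # cache miss: rebuild and persist
    db.execute("INSERT INTO graphs VALUES (?, ?)", (digest, graph))
    return graph
```

Any edit to a source file changes the digest, which is what makes restarts on an unchanged project fast while still invalidating stale graphs.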
- No authentication or authorization (intended for local/trusted use)
- C# only (no other .NET languages yet)
- Regex patterns only (no AST-based queries)
- Limited incremental update support
[Add your license here]
- analyzer-lsp - Language Server Protocol implementation
- tree-sitter - Parser generator and incremental parsing library
- stack-graphs - Code navigation using stack graphs
- tree-sitter-c-sharp - C# grammar for tree-sitter
- Issues: Report bugs or request features via GitHub issues
- Documentation: See docs/ directory
- CI/CD: GitHub Actions workflow in .github/workflows/
This project uses:
- Tonic for gRPC
- Tokio for async runtime
- Tree-sitter for parsing
- Stack Graphs for semantic analysis