Today, I'm going to show you how to navigate a code workspace with lsproxy, an open-source dev container built for code exploration workflows, such as those driven by AI coding agents.
Installation and setup
Install the Python SDK.
pip install lsproxy-sdk
Pull and run the Docker container. WORKSPACE_PATH is the absolute path to your code workspace (likely the repository root).
docker pull agenticlabs/lsproxy
docker run \
  -p 4444:4444 \
  -v $WORKSPACE_PATH:/mnt/workspace \
  agenticlabs/lsproxy
Set up the client in Python.
# First create the API client (it talks to the lsproxy container started above)
from lsproxy import (
    Lsproxy,
    FileRange,
    GetReferencesRequest,
    GetDefinitionRequest,
)

client = Lsproxy()
Find definitions in a file
symbols = client.definitions_in_file("src/sample_file.rs")
This returns information about every symbol defined in the file, including the identifier position and source range we'll use in the next step.
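As a quick sanity check, you can loop over the results and print each symbol's identifier position and range, the two fields used below (a minimal sketch; any other fields depend on your SDK version).

# Minimal sketch: inspect the fields used later in this post
for symbol in symbols:
    print(symbol.identifier_position, symbol.range)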
Get source code and references
# Pull source code for a symbol chosen from `symbols` above
file_range = FileRange(
    path="src/sample_file.rs",
    start=symbol.range.start,
    end=symbol.range.end,
)
source_code = client.read_source_code(file_range).source_code
# Get references with surrounding code context
reference_request = GetReferencesRequest(
    identifier_position=symbol.identifier_position,
    include_code_context_lines=2,
)
references = client.find_references(reference_request)
This returns the definition's source code and its references across the workspace. For example, if we chose the UploadIngestionMessage symbol defined in the file:
Code
pub struct UploadIngestionMessage {
    pub ingest_specific_chunk_metadata: IngestSpecificChunkMetadata,
    pub chunk: ChunkReqPayload,
    pub dataset_id: uuid::Uuid,
    pub upsert_by_tracking_id: bool,
}
References
server/src/bin/ingestion-worker.rs
15: use trieve_server::handlers::chunk_handler::{
16: BulkUploadIngestionMessage, FullTextBoost, SemanticBoost, UpdateIngestionMessage,
17: UploadIngestionMessage,
: ____^
18: };
19:
@@@@@: -----
793: #[tracing::instrument(skip(payload, web_pool))]
794: async fn upload_chunk(
795: mut payload: UploadIngestionMessage,
: _________________^
796: dataset_config: DatasetConfiguration,
797:
@@@@@: -----
server/src/handlers/chunk_handler.rs
210: pub attempt_number: usize,
211: pub dataset_id: uuid::Uuid,
212: pub ingestion_messages: Vec<UploadIngestionMessage>,
: ________________________________^
213: }
214:
server/src/operators/chunk_operator.rs
4: };
5: use crate::handlers::chunk_handler::{BulkUploadIngestionMessage, ChunkReqPayload};
6: use crate::handlers::chunk_handler::{ChunkFilter, UploadIngestionMessage};
: __________________________________________________^
7: use crate::operators::group_operator::{
8:
@@@@@: -----
2514: chunk_only_group_ids.group_tracking_ids = None;
2515:
2516: let upload_message = UploadIngestionMessage {
: _____________________________^
2517: ingest_specific_chunk_metadata: IngestSpecificChunkMetadata {
2518:
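Putting the pieces together, you can walk every symbol defined in a file and collect its source and references in one pass. This sketch reuses only the calls shown above; how you consume the reference results will depend on the response fields in your SDK version.

# Sketch: gather source and references for every symbol in the file,
# using only the calls demonstrated above
for symbol in client.definitions_in_file("src/sample_file.rs"):
    source = client.read_source_code(
        FileRange(
            path="src/sample_file.rs",
            start=symbol.range.start,
            end=symbol.range.end,
        )
    ).source_code
    refs = client.find_references(
        GetReferencesRequest(
            identifier_position=symbol.identifier_position,
            include_code_context_lines=2,
        )
    )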
Go to definition
If you have a reference to a symbol and want to find its definition, you can do that, too.
# `reference` is one of the positions returned by find_references above
definition_request = GetDefinitionRequest(position=reference)

# Some languages allow multiple definitions, so we get back a list
definitions = client.find_definition(definition_request).definitions
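Since the result is a list, a minimal usage sketch is just to iterate over it:

# Print each candidate definition (some languages allow more than one)
for definition in definitions:
    print(definition)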
Check out the interactive tutorial for these and more in-depth examples, and the API spec for the full list of supported endpoints:
- Tutorial: https://demo.lsproxy.dev
- API Spec: https://docs.lsproxy.dev/api-reference