
Use get_pooled in process_single_choice#11079

Merged
slin1237 merged 1 commit into main from chang/reasoning on Sep 29, 2025
Conversation

@CatherineSue
Collaborator

@CatherineSue CatherineSue commented Sep 29, 2025

Motivation

The current process_single_choice calls .create(), which builds a fresh parser instance on every request and bypasses the pool. This is inefficient.

ParserFactory already maintains a registry with pooling, so parser instances can be reused.
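To make the change easier to picture, here is a minimal, self-contained Rust sketch of the pooling idea. The type and method shapes (ReasoningParser, ParserFactory, the model-name key, and the "deepseek-r1" example) are illustrative assumptions rather than the actual sglang-router API; only the contrast between create() and get_pooled() reflects what this PR relies on.

use std::collections::HashMap;
use std::sync::{Arc, Mutex};

// Illustrative stand-in for the router's reasoning parser type.
struct ReasoningParser {
    model: String,
}

impl ReasoningParser {
    fn new(model: &str) -> Self {
        ReasoningParser { model: model.to_string() }
    }
}

// Illustrative factory: `create` builds a fresh parser per call (the old path),
// while `get_pooled` returns a shared, mutex-guarded instance from the registry.
struct ParserFactory {
    registry: Mutex<HashMap<String, Arc<Mutex<ReasoningParser>>>>,
}

impl ParserFactory {
    fn new() -> Self {
        ParserFactory { registry: Mutex::new(HashMap::new()) }
    }

    // Old path: a new allocation on every request.
    fn create(&self, model: &str) -> ReasoningParser {
        ReasoningParser::new(model)
    }

    // New path: reuse the pooled instance, creating it only on first use.
    fn get_pooled(&self, model: &str) -> Arc<Mutex<ReasoningParser>> {
        let mut registry = self.registry.lock().unwrap();
        registry
            .entry(model.to_string())
            .or_insert_with(|| Arc::new(Mutex::new(ReasoningParser::new(model))))
            .clone()
    }
}

fn main() {
    let factory = ParserFactory::new();
    let _fresh = factory.create("deepseek-r1"); // old path: new instance every call
    let a = factory.get_pooled("deepseek-r1");  // new path: shared pooled instance
    let b = factory.get_pooled("deepseek-r1");
    assert!(Arc::ptr_eq(&a, &b));               // the same parser is reused
}

With the pooled path, repeated requests for the same model share one Arc<Mutex<...>> instead of allocating a new parser each time.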

Modifications

  • Use the pooled parser (guarded by a mutex) in process_single_choice in the gRPC router

Accuracy Tests

Benchmarking and Profiling

Checklist

@gemini-code-assist
Contributor

Summary of Changes

Hello @CatherineSue, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request refactors the process_single_choice function within the gRPC router to enhance performance and resource management. By switching from creating new parser instances to retrieving them from a pre-configured pool, the change aims to reduce overhead and improve the overall efficiency of the parsing process. Additionally, a mutex lock is implemented to ensure safe concurrent access to these pooled resources.

Highlights

  • Performance Optimization: process_single_choice in the gRPC router now obtains a pooled parser from reasoning_parser_factory instead of creating a new instance each time, improving efficiency by reusing existing resources.
  • Concurrency Safety: A mutex lock is taken when using the pooled parser to ensure thread-safe access and prevent data races in concurrent environments; a minimal sketch of the resulting pattern follows below.
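As a rough illustration of how a handler might use the pooled parser, here is a hedged, self-contained sketch. The function name process_single_choice is kept only for orientation; its signature, the ReasoningParser stand-in, and the toy <think>-tag handling are simplified assumptions, not the router's actual implementation (the real diff appears in the review comment further down).

use std::sync::{Arc, Mutex};

// Illustrative stand-ins; the real parser and result types live in the router crate.
struct ParseResult {
    reasoning_text: String,
    normal_text: String,
}

struct ReasoningParser;

impl ReasoningParser {
    // Toy behavior: treat a leading "<think>...</think>" block as reasoning text.
    fn detect_and_parse_reasoning(&mut self, text: &str) -> Result<ParseResult, String> {
        if let Some(rest) = text.strip_prefix("<think>") {
            if let Some((reasoning, normal)) = rest.split_once("</think>") {
                return Ok(ParseResult {
                    reasoning_text: reasoning.trim().to_string(),
                    normal_text: normal.trim().to_string(),
                });
            }
        }
        Ok(ParseResult {
            reasoning_text: String::new(),
            normal_text: text.to_string(),
        })
    }
}

// Mirrors the shape of the change: lock the shared pooled parser, parse once,
// and split the output into optional reasoning text and the remaining text.
fn process_single_choice(
    pooled_parser: &Arc<Mutex<ReasoningParser>>,
    processed_text: &str,
) -> Result<(Option<String>, String), String> {
    let mut parser = pooled_parser
        .lock()
        .map_err(|e| format!("Failed to acquire reasoning parser lock: {}", e))?;
    let result = parser
        .detect_and_parse_reasoning(processed_text)
        .map_err(|e| format!("Reasoning parsing error: {}", e))?;
    let reasoning = if result.reasoning_text.is_empty() {
        None
    } else {
        Some(result.reasoning_text)
    };
    Ok((reasoning, result.normal_text))
}

fn main() {
    let pooled = Arc::new(Mutex::new(ReasoningParser));
    let out = process_single_choice(&pooled, "<think>plan the steps</think> final answer");
    println!("{:?}", out); // Ok((Some("plan the steps"), "final answer"))
}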

Contributor

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request correctly switches from creating a new reasoning parser on each request to using a pooled parser, which should improve efficiency by reducing allocations. The use of a mutex to protect the shared parser is appropriate. I have one suggestion to minimize the duration for which the lock is held, which is a good practice in concurrent programming.

Comment on lines +776 to 789
let mut parser = pooled_parser
    .lock()
    .map_err(|e| format!("Failed to acquire reasoning parser lock: {}", e))?;
match parser.detect_and_parse_reasoning(&processed_text) {
    Ok(result) => {
        if !result.reasoning_text.is_empty() {
            reasoning_text = Some(result.reasoning_text);
        }
        processed_text = result.normal_text;
    }
    Err(e) => {
        return Err(format!("Reasoning parsing error: {}", e));
    }
}

Severity: medium

To minimize the duration the mutex is held, it's a good practice to release the lock as soon as the shared resource is no longer needed. You can achieve this by scoping the lock acquisition and the call to detect_and_parse_reasoning within a block, and then processing the result outside of that block. This ensures the lock is not held during the match and subsequent assignments.

            let result = {
                let mut parser = pooled_parser
                    .lock()
                    .map_err(|e| format!("Failed to acquire reasoning parser lock: {}", e))?;
                parser.detect_and_parse_reasoning(&processed_text)
            };

            match result {
                Ok(result) => {
                    if !result.reasoning_text.is_empty() {
                        reasoning_text = Some(result.reasoning_text);
                    }
                    processed_text = result.normal_text;
                }
                Err(e) => {
                    return Err(format!("Reasoning parsing error: {}", e));
                }
            }
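
This works because the MutexGuard returned by lock() is dropped when the inner block ends, so the pooled parser is released before the match arm and the assignments run; only the call to detect_and_parse_reasoning happens under the lock.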

@slin1237 slin1237 merged commit f065e5b into main Sep 29, 2025
37 of 39 checks passed
@slin1237 slin1237 deleted the chang/reasoning branch September 29, 2025 22:48
ch-tiger1 pushed a commit to ch-tiger1/sglang that referenced this pull request Oct 9, 2025
