Conversation
Greptile Summary

Rewrote the streamer C++ mock to address all previously identified issues. The implementation now uses per-instance state via the streamer pointer.

Key improvements:

Minor issue:
Confidence Score: 4/5
Important Files Changed
Sequence Diagram

```mermaid
sequenceDiagram
    participant Client
    participant Mock as streamer-mock
    participant State as MockStreamerState
    participant FileState as FileReadState
    participant FS as File System
    Client->>Mock: runai_start(&streamer)
    Mock->>State: new MockStreamerState()
    State-->>Mock: state instance
    Mock-->>Client: Success
    Client->>Mock: runai_request(paths, offsets, sizes, ...)
    Mock->>State: Check has_active_request
    State-->>Mock: false
    Mock->>State: Set has_active_request = true
    loop For each file
        Mock->>FileState: initialize_file_read()
        FileState->>FS: open(path)
        FS-->>FileState: file descriptor
        FileState->>FS: seek(offset)
        FileState->>FileState: Store chunk_sizes, destination
        FileState-->>Mock: Success
        Mock->>State: file_states.push_back(file_state)
    end
    Mock-->>Client: Success
    loop Until all chunks read
        Client->>Mock: runai_response(&file_index, &index)
        Mock->>State: Get current file_state
        alt File not complete
            Mock->>FileState: read_chunk(&chunk_index)
            FileState->>FS: read(chunk_size)
            FS-->>FileState: data
            FileState->>FileState: Update destination_offset, current_chunk_index
            FileState-->>Mock: Success + chunk_index
            Mock-->>Client: Success, file_index, chunk_index
        else File complete
            Mock->>State: current_file_index++
            Mock->>Mock: Try next file (recurse)
        else All files complete
            Mock->>State: Set has_active_request = false
            Mock-->>Client: FinishedError
        end
    end
    Client->>Mock: runai_end(streamer)
    Mock->>State: delete state
    Mock-->>Client: void
```
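The client-side call sequence the diagram describes can be sketched against a toy stand-in for the mock. All names below (`toy::start`, `toy::request`, `toy::response`, `toy::end`) are illustrative, not the real `runai_*` signatures: start a streamer, submit one request, then drain responses until `FinishedError` is returned.

```cpp
#include <cassert>

namespace toy
{
    constexpr int Success = 0;
    constexpr int FinishedError = 1;

    // Per-instance state handed back to the client as an opaque pointer
    struct Streamer
    {
        int chunks_left = 0;
        bool active = false;
    };

    inline int start(void ** streamer)
    {
        *streamer = new Streamer();
        return Success;
    }

    inline int request(void * streamer, int total_chunks)
    {
        auto * s = static_cast<Streamer *>(streamer);
        s->active = true;
        s->chunks_left = total_chunks;
        return Success;
    }

    // Delivers one chunk index per call; FinishedError once all are consumed
    inline int response(void * streamer, int * chunk_index)
    {
        auto * s = static_cast<Streamer *>(streamer);
        if (s->chunks_left == 0)
        {
            s->active = false;
            return FinishedError;
        }
        *chunk_index = --s->chunks_left;
        return Success;
    }

    inline void end(void * streamer)
    {
        delete static_cast<Streamer *>(streamer);
    }
} // namespace toy
```

The draining loop is then simply `while (toy::response(s, &chunk) == toy::Success) { ... }`, matching the "Until all chunks read" loop in the diagram, with `FinishedError` doubling as the normal end-of-stream signal.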
cpp/mock/streamer-mock.cc
Outdated
```cpp
enum class ResponseCode : int
{
    Success = 0,
    FinishedError,
    FileAccessError,
    EofError,
    S3NotSupported,
    GlibcPrerequisite,
    InsufficientFdLimit,
    InvalidParameterError,
    EmptyRequestError,
    BusyError,
    CaFileNotFound,
    UnknownError,
    ObjPluginLoadError,
    GCSNotSupported,
    __Max,
```
logic: ResponseCode enum duplicated in wrong namespace - should use common::ResponseCode from cpp/common/response_code/response_code.h
The real implementation has ResponseCode in runai::llm::streamer::common namespace, not runai::llm::streamer. This duplication creates maintenance issues.
```diff
-enum class ResponseCode : int
-{
-    Success = 0,
-    FinishedError,
-    FileAccessError,
-    EofError,
-    S3NotSupported,
-    GlibcPrerequisite,
-    InsufficientFdLimit,
-    InvalidParameterError,
-    EmptyRequestError,
-    BusyError,
-    CaFileNotFound,
-    UnknownError,
-    ObjPluginLoadError,
-    GCSNotSupported,
-    __Max,
+// Use the common ResponseCode enum from the real implementation
+#include "common/response_code/response_code.h"
+using ResponseCode = common::ResponseCode;
```
cpp/mock/streamer-mock.cc
Outdated
```cpp
constexpr const char* RESPONSE_MESSAGES[] = {
    "Request sent successfuly",
    "Finished all responses",
    "File access error",
    "End of file reached",
    "S3 not supported",
    "GLIBC version should be at least 2.29",
    "Increase process fd limit or decrease the concurrency level. Recommended value for the streamer alone is the concurrency multiplied by 64, in addition to your application fd usage",
    "Invalid request parameters",
    "Empty request parameters",
    "Streamer is handling previous request",
    "CA bundle file not found",
    "Unknown Error",
    "Error loading object storage plugin",
    "GCS not supported",
};
```
logic: RESPONSE_MESSAGES array duplicated - should use common::description() function
The real implementation provides common::description(int) function. Duplicating the messages creates maintenance burden.
```diff
-constexpr const char* RESPONSE_MESSAGES[] = {
-    "Request sent successfuly",
-    "Finished all responses",
-    "File access error",
-    "End of file reached",
-    "S3 not supported",
-    "GLIBC version should be at least 2.29",
-    "Increase process fd limit or decrease the concurrency level. Recommended value for the streamer alone is the concurrency multiplied by 64, in addition to your application fd usage",
-    "Invalid request parameters",
-    "Empty request parameters",
-    "Streamer is handling previous request",
-    "CA bundle file not found",
-    "Unknown Error",
-    "Error loading object storage plugin",
-    "GCS not supported",
-};
+// Remove this array and use common::description() instead in get_response_message()
```
cpp/mock/streamer-mock.cc
Outdated
```cpp
const char* get_response_message(int response_code)
{
    if (response_code < 0 || response_code >= static_cast<int>(ResponseCode::__Max))
    {
        return "Invalid response code";
    }
    return RESPONSE_MESSAGES[response_code];
}
```
logic: get_response_message() duplicates functionality that already exists
Use common::description(int) instead.
```diff
-const char* get_response_message(int response_code)
-{
-    if (response_code < 0 || response_code >= static_cast<int>(ResponseCode::__Max))
-    {
-        return "Invalid response code";
-    }
-    return RESPONSE_MESSAGES[response_code];
-}
+// Remove this function and use common::description() directly in runai_response_str()
```
Note: If this suggestion doesn't match your team's coding style, reply to this and let me know. I'll remember it for next time!
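If the mock must keep a local message table at all, one defensive option is a `static_assert` tying the table length to `__Max`, so that adding an enum value without a message becomes a compile error instead of silent drift. The sketch below uses a shortened enum for brevity; the cleaner fix remains delegating to `common::description()` as the review suggests.

```cpp
#include <cassert>
#include <cstring>

// Shortened illustrative enum, not the full list from the mock
enum class ResponseCode : int
{
    Success = 0,
    FinishedError,
    UnknownError,
    __Max,
};

constexpr const char * RESPONSE_MESSAGES[] = {
    "Request sent successfully",
    "Finished all responses",
    "Unknown Error",
};

// Compile-time guarantee that every enum value has a message
static_assert(sizeof(RESPONSE_MESSAGES) / sizeof(RESPONSE_MESSAGES[0]) ==
                  static_cast<std::size_t>(ResponseCode::__Max),
              "message table out of sync with ResponseCode");

inline const char * description(int code)
{
    if (code < 0 || code >= static_cast<int>(ResponseCode::__Max))
    {
        return "Invalid response code";
    }
    return RESPONSE_MESSAGES[code];
}
```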
```cpp
const char* runai_response_str(int response_code)
{
    return 0;
    using namespace runai::llm::streamer;
    return get_response_message(response_code);
}
```
logic: Use common::description() instead of get_response_message()
```diff
-const char* runai_response_str(int response_code)
-{
-    return 0;
-    using namespace runai::llm::streamer;
-    return get_response_message(response_code);
-}
+const char* runai_response_str(int response_code)
+{
+    return common::description(response_code);
+}
```
Force-pushed from a2cd5d7 to 3ec957d.
```diff
-        state->current_item++;
-        return 0;
+        // Open file
+        state.file = utils::Fd(::open(path, O_RDONLY));
```
style: Direct use of ::open without checking errno on failure means error messages may be misleading
```diff
-    state.file = utils::Fd(::open(path, O_RDONLY));
+    int fd = ::open(path, O_RDONLY);
+    if (fd == -1)
+    {
+        LOG(ERROR) << "Failed to open file: " << path << " (errno: " << errno << ")";
+        return static_cast<int>(common::ResponseCode::FileAccessError);
+    }
+    state.file = utils::Fd(fd);
```
cpp/mock/streamer-mock.cc
Outdated
```cpp
};

// Global state instance
static MockStreamerState g_state;
```
logic: Static global state (g_state) means multiple streamer instances share state, violating isolation
The real implementation uses per-instance state via the streamer pointer, but this mock uses a global singleton. If multiple tests or callers use separate streamer instances concurrently, they'll interfere with each other. Is concurrent usage with multiple streamer instances expected in tests?
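The per-instance pattern the reviewer is asking for can be sketched as follows. All names here are illustrative stand-ins, not the real API: each `mock_start` call allocates its own state object and returns it as an opaque handle, so two streamer instances never share flags, counters, or file lists.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Per-instance state, owned by one streamer handle only
struct MockStreamerState
{
    bool has_active_request = false;
    std::vector<int> file_states;       // stand-in for per-file read state
    std::size_t current_file_index = 0;
};

inline int mock_start(void ** streamer)
{
    *streamer = new MockStreamerState();  // fresh state per instance
    return 0;                             // Success
}

inline void mock_end(void * streamer)
{
    delete static_cast<MockStreamerState *>(streamer);
}
```

Every other entry point then casts the opaque handle back to `MockStreamerState *` instead of touching a `g_state` singleton, which is what makes concurrent tests with separate instances safe.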
cpp/mock/streamer-mock.cc
Outdated
```cpp
// Clean up on error
g_state.file_states.clear();
g_state.has_active_request = false;
```
logic: Error cleanup doesn't reset has_active_request consistently
If initialize_file_read fails after the first file succeeds, has_active_request is set to false, but if it fails on the first file, the flag remains true from line 276.
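One way to make the error path uniform, sketched here with illustrative names rather than the mock's real code, is a small RAII guard that rolls back the request flag and any partially built file state unless the whole setup commits. A failure on the first file and a failure on a later file then leave the mock in the same idle state.

```cpp
#include <cassert>
#include <vector>

struct MockState
{
    bool has_active_request = false;
    std::vector<int> file_states;
};

// Rolls back flag and partial state unless `committed` is set before destruction
struct RequestGuard
{
    MockState & state;
    bool committed = false;

    explicit RequestGuard(MockState & s) : state(s)
    {
        state.has_active_request = true;
    }

    ~RequestGuard()
    {
        if (!committed)
        {
            state.file_states.clear();
            state.has_active_request = false;
        }
    }
};

// fail_at: index of the file whose initialization "fails" (-1 = all succeed)
inline bool handle_request(MockState & state, int num_files, int fail_at)
{
    RequestGuard guard(state);
    for (int i = 0; i < num_files; ++i)
    {
        if (i == fail_at)
        {
            return false;  // guard destructor performs the cleanup
        }
        state.file_states.push_back(i);
    }
    guard.committed = true;  // success: keep the request active
    return true;
}
```

With this shape there is a single cleanup site, so the first-file and later-file failure cases cannot diverge.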
```python
for i in range(expected_chunks):
    file_relative_index, chunk_relative_index = runai_response(self.streamer)
    if chunk_relative_index == None:
```
style: Explained the behavior when None is returned: it happens when FINISHED_ERROR_CODE is returned by the C++ layer (see libstreamer.py:92-93), indicating that all chunks are complete or that an error occurred
Force-pushed from 573df8e to 4bba84d.
```cpp
}
} // namespace runai::llm::streamer

} // extern "C"
```
(no newline at end of file)
style: Missing newline at end of file
```diff
-} // extern "C"
+} // extern "C"
```
(the suggestion adds the missing trailing newline)
Rewrite the streamer cpp mock to improve its functionality