[OPIK-4383] [BE] Add ExperimentAggregationPublisher, ExperimentDenormalizationJob and tests #5511
Merged: thiagohora merged 76 commits into main from thiagohora/OPIK-4383_add_experiment_aggregation_publisher_job_tests on Mar 12, 2026.
Commits (76, all authored by thiagohora):

- 599e7cf [OPIK-4380] [BE] Add experiment aggregates for denormalized metrics
- 653e52c [OPIK-4380] [BE] Add MySQL deadlock retry mechanism for concurrent da…
- 672b5ce Fix visibility
- 19ad5e4 [OPIK-4380] [BE] Address PR review comments for experiment aggregates
- 3978a0e [OPIK-4382] [BE] Refactor experiment aggregates with import cleanup a…
- 2a62ebb [OPIK-4380] [BE] Fix table definition
- 8dfbd70 [OPIK-4380] [BE] Address PR comments and consolidate DatasetItemServi…
- 520bafb Merge branch 'main' into thiaghora/OPIK-4380-experiment-denormalization
- 14a49b1 [OPIK-4380] [BE] Fix missing log_comment and centralize search criter…
- 032ad46 [OPIK-4380] [BE] Fix createVersionFromDelta consolidation after rebase
- c71235e [OPIK-4380] [BE] Address PR review comments - fix type mismatch, extr…
- b14d687 [OPIK-4380] [BE] Extract shared helper for experiment data pagination
- 48d79c0 Merge branch 'thiaghora/OPIK-4380-experiment-denormalization' into th…
- 82fb51e Merge branch 'main' into thiaghora/OPIK-4380-experiment-denormalization
- 618986c [OPIK-4383] [BE] Add Redis stream subscriber for debounced experiment…
- 2f1796e Revision 2: Address PR comments - add config defaults, remove toggle,…
- 4c9f6fc Revision 3: Add @Max(500) to consumerBatchSize and @NotNull to jobLoc…
- f679239 Merge branch 'main' into thiaghora/OPIK-4380-experiment-denormalization
- 05c2f95 [OPIK-4380] [BE] Address PR review comments - fix TYPE_REFERENCE visi…
- 350f49e Revision 4: Address remaining JetoPistola review comments (#7, #8, #10)
- 810fbd3 Revision 3: Add switchIfEmpty fallback for deleted traces in populate…
- 5228408 Fix tests
- a11fdb0 Revision 6: Move countTotal log from DAO to service layer
- 80369a1 Revision 7: Apply Spotless formatting
- f009442 Revision 8: Make populateAggregations(UUID, int) private
- 441da52 Merge branch 'main' into thiaghora/OPIK-4380-experiment-denormalization
- 12915c5 [OPIK-4380] [BE] Add evaluation_method support to experiment_aggregat…
- cf7c3ba [OPIK-4380] [BE] Extract shared helper for experiment aggregation que…
- 60a2ec3 [OPIK-4380] [BE] Enforce non-null contract on countTotal criteria par…
- db5f331 [OPIK-4380] [BE] Fix countTotal ignoring target project IDs in normal…
- 2d27350 [OPIK-4380] [BE] Apply Spotless formatting
- b907350 Merge branch 'thiaghora/OPIK-4380-experiment-denormalization' into th…
- 1315d56 Merge branch 'main' into thiagohora/OPIK-4382_metrics_computation_ser…
- 82a69d1 Merge branch 'main' into thiagohora/OPIK-4382_metrics_computation_ser…
- 230c45d [OPIK-4382] [BE] Address PR review comments on experiment aggregates
- fad6915 Revision 3: Address PR comments E, F, G, H
- dadcabc Revision 4: Fix ExperimentServiceTest to include ExperimentGroupEnric…
- 791f62b [OPIK-4382] [BE] Extract shared Row→ExperimentGroup mappers into Expe…
- 11286ba [OPIK-4382] [BE] Deduplicate bindGroupCriteria into ExperimentGroupMa…
- b7ba063 [OPIK-4382] [BE] Extract streamGroupQuery helper and fix null percent…
- 1335553 [OPIK-4382] [BE] Consolidate cost/duration helpers into ExperimentGro…
- 739477b [OPIK-4382] [BE] Fix pagination count and add criteria filter tests
- d39c715 [OPIK-4380] [BE] Extract shared filter helpers into FilterQueryBuilder
- 019e300 [OPIK-4382] [BE] Consolidate filter helpers in getExperimentItemsStat…
- b538376 Revision 9: Extract shared helpers to eliminate duplication across DAOs
- 2d80a7d [OPIK-4383] [BE] Add experiment aggregate event listener and no-op pu…
- 6634138 Revision 2: Fix missing import for ExperimentAggregationPublisher
- 613e8d4 [OPIK-4383] [BE] Add ExperimentAggregationPublisher, ExperimentDenorm…
- 1d9064f Merge branch 'main' into thiagohora/OPIK-4382_metrics_computation_ser…
- 6b1fa21 Merge branch 'thiagohora/OPIK-4382_metrics_computation_service' into …
- bff0682 Merge branch 'thiagohora/OPIK-4383-redis-stream-subscriber-experiment…
- 3c06cf8 Merge branch 'thiagohora/OPIK-4383_add_experiment_aggregate_event_lis…
- bfdc17d Fix tests setup
- 98460ea Merge branch 'thiagohora/OPIK-4383_add_experiment_aggregate_event_lis…
- ad3d567 Merge branch 'thiagohora/OPIK-4383_add_experiment_aggregate_event_lis…
- 4e165e8 [OPIK-4383] [BE] Address PR review: move DAO logs to service layer
- b04d5c0 [OPIK-4383] [BE] Address PR review: extract shared DAO helper and fix…
- 7324724 [OPIK-4383] [BE] Short-circuit deleteByTraceIds when no spans found
- 388c1f1 [OPIK-4383] [BE] Fix cascade deletion failures after trace delete
- f2acb60 Merge branch 'thiagohora/OPIK-4383_add_experiment_aggregate_event_lis…
- 1fecbac [OPIK-4383] [BE] Address PR review comments on ExperimentDenormalizat…
- 7d35979 Fix @Every job interval config key casing and add jobs section to tes…
- 88099b9 Replace @Every annotation with programmatic Quartz scheduling
- aae95b4 Add experiment context to error log and extract publishIfNotEmpty helper
- 7b97e6e Fix NPE in ExperimentAggregateEventListenerTest mock setup
- 64d14c3 Merge branch 'main' into thiagohora/OPIK-4383_add_experiment_aggregat…
- 08a96c8 [OPIK-4383] [BE] Remove DAO-level log.info from ExperimentAggregatesD…
- 3904111 Merge branch 'thiagohora/OPIK-4383_add_experiment_aggregate_event_lis…
- 98f697c Remove accidentally committed doc files
- 7cfb60f [OPIK-4383] [BE] refactor: extract triggerAggregation helper to centr…
- c43e7e7 [OPIK-4383] [BE] fix: restore TagOperations.tagUpdateFragment in Span…
- d59daca Merge branch 'thiagohora/OPIK-4383_add_experiment_aggregate_event_lis…
- 8931a39 Merge branch 'main' into thiagohora/OPIK-4383_add_experiment_aggregat…
- 3a74920 Adding InterruptableJob
- 80c35d3 [OPIK-4383] [BE] Address PR review: expand safety valve, env var prefix
- c4b493d Merge branch 'main' into thiagohora/OPIK-4383_add_experiment_aggregat…
…kend/src/main/java/com/comet/opik/api/resources/v1/jobs/ExperimentDenormalizationJob.java (188 additions, 0 deletions)
```java
package com.comet.opik.api.resources.v1.jobs;

import com.comet.opik.api.events.ExperimentAggregationMessage;
import com.comet.opik.infrastructure.ExperimentDenormalizationConfig;
import com.comet.opik.infrastructure.lock.LockService;
import io.dropwizard.jobs.Job;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
import lombok.NonNull;
import lombok.extern.slf4j.Slf4j;
import org.quartz.DisallowConcurrentExecution;
import org.quartz.InterruptableJob;
import org.quartz.JobExecutionContext;
import org.quartz.UnableToInterruptJobException;
import org.redisson.api.RedissonReactiveClient;
import org.redisson.api.stream.StreamAddArgs;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import ru.vyarus.dropwizard.guice.module.yaml.bind.Config;

import java.time.Instant;
import java.util.UUID;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.function.Function;

import static com.comet.opik.infrastructure.lock.LockService.Lock;

/**
 * Scheduled job responsible for flushing debounced experiment aggregation events to the Redis stream.
 *
 * <p>Periodically (default 5s, configurable via {@code experimentDenormalization.jobInterval}) this job:
 * <ol>
 * <li>Queries the Redis ZSET index for members whose debounce window has elapsed.</li>
 * <li>Each ZSET member encodes both workspaceId and experimentId as {@code "workspaceId:experimentId"},
 * ensuring cross-workspace isolation for experiments that share the same UUID.</li>
 * <li>Reads the userName from the associated Redis hash bucket.</li>
 * <li>Publishes an {@link ExperimentAggregationMessage} to the Redis stream for each ready experiment.</li>
 * <li>Removes the processed entry from both the ZSET index and the hash bucket.</li>
 * <li>Handles stale ZSET entries (expired hash) by removing them without publishing.</li>
 * </ol>
 *
 * <p>Uses a ZSET scored by expiry timestamp for O(log(N)+M) index lookups, avoiding full keyspace scans.
 */
@Slf4j
@Singleton
@DisallowConcurrentExecution
public class ExperimentDenormalizationJob extends Job implements InterruptableJob {

    private static final Lock SCAN_LOCK_KEY = new Lock("experiment_denormalization_job:scan_lock");

    private final AtomicBoolean interrupted = new AtomicBoolean(false);
    private final ExperimentDenormalizationConfig config;
    private final RedissonReactiveClient redisClient;
    private final LockService lockService;

    @Inject
    public ExperimentDenormalizationJob(
            @NonNull @Config("experimentDenormalization") ExperimentDenormalizationConfig config,
            @NonNull RedissonReactiveClient redisClient,
            @NonNull LockService lockService) {
        this.config = config;
        this.redisClient = redisClient;
        this.lockService = lockService;
    }

    @Override
    public void doJob(JobExecutionContext context) {
        if (!config.isEnabled()) {
            log.debug("Experiment denormalization job is disabled, skipping");
            return;
        }

        // Check for interruption before starting
        if (interrupted.get()) {
            log.info("Experiment denormalization job interrupted before execution, skipping");
            return;
        }

        log.info("Starting experiment denormalization job - checking for pending experiments");

        lockService.bestEffortLock(
                SCAN_LOCK_KEY,
                Mono.defer(() -> getExperimentsReadyToProcess()
                        .flatMap(this::processExperiment)
                        .onErrorContinue((throwable, experimentId) -> log.error(
                                "Failed to process pending experiment '{}'",
                                experimentId, throwable))
                        .doOnComplete(
                                () -> log.info(
                                        "Experiment denormalization job finished processing all ready experiments"))
                        .then()),
                Mono.defer(() -> {
                    log.info(
                            "Could not acquire lock for scanning pending experiments, another job instance is running");
                    return Mono.empty();
                }),
                config.getJobLockTime().toJavaDuration(),
                config.getJobLockWaitTime().toJavaDuration())
                .subscribe(
                        __ -> log.info("Experiment denormalization job execution completed"),
                        error -> log.error("Experiment denormalization job interrupted while acquiring lock", error));
    }

    /**
     * Queries the ZSET index in pages for experiment IDs whose debounce window has elapsed (score <= now).
     * Uses offset/count pagination to avoid materializing the entire ZSET into memory.
     * Since each processed experiment is removed from the ZSET, we always query from offset 0.
     * A safety counter caps the number of expand iterations (using batchSize as the limit) to
     * prevent infinite loops if entries fail to be removed (e.g., due to errors swallowed by onErrorContinue).
     */
    private Flux<String> getExperimentsReadyToProcess() {
        long nowMillis = Instant.now().toEpochMilli();
        int batchSize = config.getJobBatchSize();
        var index = redisClient.<String>getScoredSortedSet(ExperimentDenormalizationConfig.PENDING_SET_KEY);
        var iterations = new int[]{0};

        log.debug("Checking for experiments ready to process (up to timestamp: '{}', batchSize: '{}')",
                nowMillis, batchSize);

        if (interrupted.get()) {
            log.info("Experiment denormalization job interrupted before execution, skipping");
            return Flux.empty();
        }

        return index.valueRange(Double.NEGATIVE_INFINITY, true, nowMillis, true, 0, batchSize)
                .expand(collection -> {
                    if (collection.size() < batchSize) {
                        return Mono.empty();
                    }
                    iterations[0]++;
                    if (iterations[0] >= batchSize) {
                        log.warn("Reached maximum expand iterations '{}', stopping pagination to prevent infinite loop",
                                batchSize);
                        return Mono.empty();
                    }
                    return index.valueRange(Double.NEGATIVE_INFINITY, true, nowMillis, true, 0, batchSize);
                })
                .flatMapIterable(Function.identity())
                .map(Object::toString);
    }

    /**
     * Processes a single pending experiment: publishes a stream message and cleans up the Redis state.
     * The {@code member} is a compound key of the form {@code "workspaceId:experimentId"}, which
     * ensures experiments with the same UUID in different workspaces are handled independently.
     * If the hash bucket has already expired (stale ZSET entry), only the ZSET entry is removed.
     */
    private Mono<Void> processExperiment(String member) {
        int separatorIndex = member.indexOf(ExperimentDenormalizationConfig.MEMBER_SEPARATOR);
        String workspaceId = member.substring(0, separatorIndex);
        String experimentIdStr = member.substring(separatorIndex + 1);

        log.info("Processing pending experiment: '{}' for workspace: '{}'", experimentIdStr, workspaceId);

        var bucket = redisClient.<String, String>getMap(ExperimentDenormalizationConfig.EXPERIMENT_KEY_PREFIX + member);
        var index = redisClient.getScoredSortedSet(ExperimentDenormalizationConfig.PENDING_SET_KEY);
        var stream = redisClient.getStream(config.getStreamName(), config.getCodec());

        return bucket.get(ExperimentDenormalizationConfig.USER_NAME_FIELD)
                .flatMap(userName -> {
                    var message = ExperimentAggregationMessage.builder()
                            .experimentId(UUID.fromString(experimentIdStr))
                            .workspaceId(workspaceId)
                            .userName(userName)
                            .build();

                    return stream.add(StreamAddArgs.entry(ExperimentDenormalizationConfig.PAYLOAD_FIELD, message))
                            .doOnNext(id -> log.info(
                                    "Enqueued aggregation message for experiment '{}' with stream id '{}'",
                                    experimentIdStr, id))
                            .then(bucket.delete())
                            .then(index.remove(member));
                })
                .switchIfEmpty(Mono.defer(() -> {
                    log.warn("Stale index entry found with no bucket data, removing member: '{}'", member);
                    return index.remove(member);
                }))
                .then()
                .doOnSuccess(__ -> log.info("Successfully processed and removed pending experiment: '{}'",
                        experimentIdStr));
    }

    @Override
    public void interrupt() throws UnableToInterruptJobException {
        interrupted.set(true);
        log.info("ExperimentDenormalizationJob reaper job interrupted");
    }
}
```
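As an aside, the compound ZSET member parsing in `processExperiment` (split on the first `:` separator) can be illustrated with a small standalone sketch. This is not code from the PR: `MemberKey` and its methods are hypothetical names, and a literal `':'` stands in for the PR's `ExperimentDenormalizationConfig.MEMBER_SEPARATOR` constant.

```java
import java.util.UUID;

// Hypothetical sketch (not part of the PR): encoding and parsing the compound
// ZSET member "workspaceId:experimentId". Splitting on the FIRST separator
// means the same experiment UUID in two workspaces yields distinct members,
// giving the cross-workspace isolation described in the Javadoc.
class MemberKey {

    static final char SEPARATOR = ':'; // stands in for MEMBER_SEPARATOR

    static String encode(String workspaceId, UUID experimentId) {
        return workspaceId + SEPARATOR + experimentId;
    }

    static String workspaceId(String member) {
        return member.substring(0, member.indexOf(SEPARATOR));
    }

    static UUID experimentId(String member) {
        return UUID.fromString(member.substring(member.indexOf(SEPARATOR) + 1));
    }
}
```

Because a UUID never contains `:`, the round trip is unambiguous as long as workspace ids are separator-free.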
…n/java/com/comet/opik/domain/experiments/aggregations/ExperimentAggregationPublisher.java (42 additions, 7 deletions)
```diff
@@ -1,34 +1,69 @@
 package com.comet.opik.domain.experiments.aggregations;

+import com.comet.opik.infrastructure.ExperimentDenormalizationConfig;
 import com.google.inject.ImplementedBy;
 import jakarta.inject.Inject;
 import jakarta.inject.Singleton;
 import lombok.NonNull;
 import lombok.extern.slf4j.Slf4j;
+import org.redisson.api.RMapReactive;
+import org.redisson.api.RedissonReactiveClient;
+import reactor.core.publisher.Flux;
+import reactor.core.publisher.Mono;
+import ru.vyarus.dropwizard.guice.module.yaml.bind.Config;

+import java.time.Duration;
+import java.time.Instant;
 import java.util.Set;
 import java.util.UUID;

 @ImplementedBy(ExperimentAggregationPublisher.ExperimentAggregationPublisherImpl.class)
 public interface ExperimentAggregationPublisher {

-    void publish(@NonNull Set<UUID> experimentIds, @NonNull String workspaceId, @NonNull String userName);
+    Mono<Void> publish(@NonNull Set<UUID> experimentIds, @NonNull String workspaceId, @NonNull String userName);

     @Singleton
     @Slf4j
     class ExperimentAggregationPublisherImpl implements ExperimentAggregationPublisher {

+        private final ExperimentDenormalizationConfig config;
+        private final RedissonReactiveClient redisClient;

         @Inject
-        ExperimentAggregationPublisherImpl() {
+        ExperimentAggregationPublisherImpl(
+                @NonNull @Config("experimentDenormalization") ExperimentDenormalizationConfig config,
+                @NonNull RedissonReactiveClient redisClient) {
+            this.config = config;
+            this.redisClient = redisClient;
         }

         @Override
-        public void publish(@NonNull Set<UUID> experimentIds, @NonNull String workspaceId,
+        public Mono<Void> publish(@NonNull Set<UUID> experimentIds, @NonNull String workspaceId,
                 @NonNull String userName) {
-            // TODO: implement debounce mechanism before enqueuing to Redis stream
-            log.debug(
-                    "Experiment aggregation publish skipped for experiments '{}': debounce mechanism not yet implemented",
-                    experimentIds);
+            if (!config.isEnabled() || experimentIds.isEmpty()) {
+                log.info("Skipping publish: enabled='{}', experimentIds.size='{}'",
+                        config.isEnabled(), experimentIds.size());
+                return Mono.empty();
+            }
+
+            Instant expiryTimestamp = Instant.now().plusMillis(config.getDebounceDelay().toMilliseconds());
+            var index = redisClient.getScoredSortedSet(ExperimentDenormalizationConfig.PENDING_SET_KEY);
+
+            return Flux.fromIterable(experimentIds)
+                    .flatMap(experimentId -> {
+                        String member = workspaceId + ":" + experimentId;
+                        RMapReactive<String, String> bucket = redisClient
+                                .getMap(ExperimentDenormalizationConfig.EXPERIMENT_KEY_PREFIX + member);
+
+                        return index.add(expiryTimestamp.toEpochMilli(), member)
+                                .then(bucket.put(ExperimentDenormalizationConfig.USER_NAME_FIELD, userName))
+                                .then(bucket.expire(Duration.ofMillis(config.getDebounceDelay().toMilliseconds() * 2)))
+                                .doOnSuccess(__ -> log.info(
+                                        "Enqueued experiment '{}' for workspace '{}' in pending bucket with expiryTimestamp='{}'",
+                                        experimentId, workspaceId, expiryTimestamp));
+                    })
+                    .doOnError(error -> log.error("Error enqueueing experiments in pending bucket", error))
+                    .then();
         }
     }
 }
```
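The debounce contract between the publisher and the job (publish schedules a member at now plus the debounce delay, re-publishing refreshes that score, and the job drains only members whose window has elapsed) can be sketched without Redis. This is a minimal, hypothetical in-memory model, not the PR's implementation: a plain map stands in for the ZSET, and the userName hash bucket and TTL are omitted.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch (not part of the PR): the debounce semantics modeled with
// an in-memory map instead of a Redis ZSET. publish() (re)schedules a member at
// now + delay; drain(now) returns and removes only members whose score <= now.
class DebounceIndex {

    private final long debounceDelayMillis;
    // member -> expiry timestamp in millis; stands in for the ZSET score
    private final Map<String, Long> pending = new HashMap<>();

    DebounceIndex(long debounceDelayMillis) {
        this.debounceDelayMillis = debounceDelayMillis;
    }

    void publish(String member, long nowMillis) {
        // Re-publishing overwrites the score, extending the debounce window
        pending.put(member, nowMillis + debounceDelayMillis);
    }

    List<String> drain(long nowMillis) {
        List<String> ready = new ArrayList<>();
        pending.entrySet().removeIf(entry -> {
            if (entry.getValue() <= nowMillis) {
                ready.add(entry.getKey());
                return true;
            }
            return false;
        });
        return ready;
    }
}
```

The key property this models: a burst of publishes for the same experiment collapses into a single drained member once the window finally elapses, which is what keeps the downstream stream consumer from re-aggregating on every write.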