Conversation

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info — ⚙️ Run configuration — Configuration used: Path: .coderabbit.yaml · Review profile: ASSERTIVE · Plan: Pro · 📒 Files selected for processing (2)
Summary by CodeRabbit

Walkthrough: Adds liquidation support: new liquidation result types and enum, exposes fee/config data used by liquidations, introduces helper liquidation factories, wires liquidation factories into the registry, and adds multiple protocol-specific liquidation adapters plus a GUIDELINES doc. All changes are declarations and new fetch implementations for liquidation metrics.

Sequence diagram (mermaid): Runner->>Registry: request adapters of type "liquidations"

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ 3 passed | ❌ 3 failed (2 warnings, 1 inconclusive)

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
factory/aaveLiquidations.ts (outdated)

    type Config = Record<string, { pools: string[]; start: string }>;

    const configs: Record<string, Config> = {

Hmm, why not export aaveProtocolConfigs from helpers/aave and use it here instead of duplicating the same config? That way, when a new pool/protocol gets added there, you automatically get support.
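The suggested refactor above could be sketched roughly as follows. This is a hypothetical illustration, not the repository's actual code: the name `toLiquidationConfig` and the assumed shape of the shared config (chain → `{ pools: [{ lendingPoolProxy }], start }`) are inventions for the example.

```typescript
// Assumed shape of a shared aaveProtocolConfigs-style export (hypothetical).
type SharedChainConfig = { pools: { lendingPoolProxy: string }[]; start: string }
type LiquidationChainConfig = { pools: string[]; start: string }

// Derive the liquidations config from the shared fees config instead of
// duplicating it, so new chains/pools added to the shared config are
// picked up automatically.
function toLiquidationConfig(
  shared: Record<string, SharedChainConfig>,
): Record<string, LiquidationChainConfig> {
  const out: Record<string, LiquidationChainConfig> = {}
  for (const [chain, { pools, start }] of Object.entries(shared)) {
    out[chain] = { pools: pools.map((p) => p.lendingPoolProxy), start }
  }
  return out
}

// Example: one shared entry flows through to the liquidations shape.
const shared: Record<string, SharedChainConfig> = {
  ethereum: { pools: [{ lendingPoolProxy: '0xPool1' }], start: '2023-01-27' },
}
const liquidationConfig = toLiquidationConfig(shared)
console.log(liquidationConfig.ethereum.pools)
```

The point of the design is that only one config object needs editing when a chain is added.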
    type LiquidationConfig = Record<string, { comptroller: string; start: string }>;

    const liquidationConfigs: Record<string, LiquidationConfig> = {

Same comment as for Aave: keep only the missing protocols here.
liquidations/compound-v3/index.ts (outdated)

    const AbsorbCollateralEvent = 'event AbsorbCollateral(address indexed absorber, address indexed borrower, address indexed asset, uint256 collateralAbsorbed, uint256 usdValue)'
    const AbsorbDebtEvent = 'event AbsorbDebt(address indexed absorber, address indexed borrower, uint256 basePaidOut, uint256 usdValue)'

    const config: { [chain: string]: { comets: string[]; start: string } } = {

This can be pulled from the Compound v3 fees adapter; that way, any change there is reflected here.
g1nt0ki left a comment:

Can you also track liquidations in perp DEXes? I think GMX forks emit a liquidation event, and we might need to look for events or rely on their APIs in some cases.
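As a rough sketch of what the suggestion above could look like for a GMX-v1-style fork: the field names (`size`, `collateral`) and the 1e30 USD scaling follow GMX v1's `LiquidatePosition` conventions, but forks may differ, and the helper name below is hypothetical — some protocols will need their API instead of on-chain events.

```typescript
// Decoded GMX-v1-style LiquidatePosition log (assumed field subset; USD
// values scaled by 1e30 per GMX v1 convention).
type LiquidatePositionLog = { size: bigint; collateral: bigint }

// Sums liquidated position size (notional volume, not collateral lost)
// and converts the 1e30-scaled total to a plain USD number, keeping
// 6 decimals of precision through the BigInt division.
function sumLiquidationVolumeUsd(logs: LiquidatePositionLog[]): number {
  const total = logs.reduce((acc, log) => acc + log.size, 0n)
  return Number(total / 10n ** 24n) / 1e6
}

const logs: LiquidatePositionLog[] = [
  { size: 1_500n * 10n ** 30n, collateral: 100n * 10n ** 30n },
  { size: 250n * 10n ** 30n, collateral: 20n * 10n ** 30n },
]
console.log(sumLiquidationVolumeUsd(logs)) // 1750
```

Using `size` rather than `collateral` matters here: collateral is what the trader lost, while size is the liquidated notional that liquidation-volume dashboards usually report.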
    const collateralUnderlying = underlyingMap[collateralCToken]
    const exchangeRate = exchangeRateMap[collateralCToken]
    if (collateralUnderlying && exchangeRate) {
      const underlyingAmount = (BigInt(event.seizeTokens) * exchangeRate) / BigInt(1e18)

Need to update the conversion here. This is from the Compound docs:

    oneCTokenInUnderlying = exchangeRateCurrent / (1 * 10 ^ (18 + underlyingDecimals - cTokenDecimals))
I think the current formula is also correct: since we add to balances in raw units, we keep the extra decimals (like adding 1e18 to an ETH balance instead of 1 wei). Using this tx hash as an example: with the current formula we get 63.4 LINK seized (cLINK underlying); if we add underlyingDecimals - cTokenDecimals, it would show 0.000000006 LINK seized; and if we follow the docs and calculate underlyingAmount in two steps, we get 63.4 LINK seized (same output):

    const txHash = '0x14f738e13b596d61bb66adfe86f6a1250948c0983111a460976546ba2bb281b7'
    const seizeTokensRaw = 315071765737n
    const exchangeRateStored = 201264676862259987086714351n
    const cTokenDecimals = 8n
    const underlyingDecimals = 18n
    // current method
    const oneStepUnderlyingRaw = (seizeTokensRaw * exchangeRateStored) / (10n ** 18n)
    // compound docs: https://docs.compound.finance/v2/#protocol-math
    const oneCTokenInUnderlyingScaled = (exchangeRateStored * (10n ** 18n)) / (10n ** (18n + underlyingDecimals - cTokenDecimals))
    const twoStepUnderlyingRaw = (seizeTokensRaw * oneCTokenInUnderlyingScaled * (10n ** underlyingDecimals)) / ((10n ** cTokenDecimals) * (10n ** 18n))
    console.log('oneStepUnderlyingRaw:', oneStepUnderlyingRaw.toString()) // 63412817119478982867
    console.log('twoStepUnderlyingRaw:', twoStepUnderlyingRaw.toString()) // 63412817119478980634
    console.log('underlyingHuman:', Number(oneStepUnderlyingRaw) / 1e18) // 63.41281711947898
Can you check cases of liquidated USDC and USDT? They have 6 decimals; make sure the calculation is correct for them.
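To sanity-check the 6-decimal case asked about above: because exchangeRateStored is scaled by 1e(18 + underlyingDecimals - cTokenDecimals), the one-step division by 1e18 yields raw underlying units regardless of decimals. The numbers below are illustrative, not from a real transaction.

```typescript
// cUSDC has 8 decimals; USDC has 6. For USDC the exchange rate therefore
// carries 1e16 scale. Values here are made up but realistically scaled.
const seizeTokensRaw = 100_000_000_000n         // 1,000 cUSDC in raw 8-decimal units
const exchangeRateStored = 230_000_000_000_000n // ~0.023 USDC per cUSDC at 1e16 scale

// Same one-step formula as the adapter: raw underlying units come out,
// already in the underlying token's own decimals.
const underlyingRaw = (seizeTokensRaw * exchangeRateStored) / 10n ** 18n
console.log(underlyingRaw.toString())    // 23000000 -> 23 USDC in raw 6-decimal units
console.log(Number(underlyingRaw) / 1e6) // 23
```

The result matches the hand calculation (1,000 cUSDC × 0.023 USDC/cUSDC = 23 USDC), which supports the claim that the one-step formula handles 6-decimal tokens correctly.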
adapters/types.ts (outdated)

    // LIQUIDATIONS
    export type FetchResultLiquidations = FetchResultBase & {
      dailyLiquidations?: FetchResponseValue
      dailyLiquidatedDebt?: FetchResponseValue

Personally, I would prefer dailyLiquidationRepaidDebt, because the debt is repaid, not liquidated.

Our HL indexer is tracking liquidation volume on Hyperliquid too.
The aave-v3.ts adapter exports:
Actionable comments posted: 11
Caution: Some comments are outside the diff and can't be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
fees/aave-v3.ts (1)

Lines 317-319: ⚠️ Potential issue | 🔴 Critical — wrong market mapping for the Mantle config.

Line 318 maps CHAIN.MANTLE to AaveMarkets[CHAIN.MEGAETH], which will fetch the wrong pool set for Mantle.

🐛 Suggested fix

    [CHAIN.MANTLE]: {
    -  pools: AaveMarkets[CHAIN.MEGAETH],
    +  pools: AaveMarkets[CHAIN.MANTLE],
       start: '2026-01-16',
    },

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `@fees/aave-v3.ts` around lines 317-319, the mapping for CHAIN.MANTLE is wrong: it currently assigns pools: AaveMarkets[CHAIN.MEGAETH], which pulls MEGAETH pools; update the pools entry to use the correct Mantle key (e.g., AaveMarkets[CHAIN.MANTLE] or the actual Mantle market constant) so CHAIN.MANTLE points to its own AaveMarkets set (check the AaveMarkets object for the exact Mantle key and replace AaveMarkets[CHAIN.MEGAETH] in the CHAIN.MANTLE config).

adapters/types.ts (1)
Lines 224-232: 🧹 Nitpick | 🔵 Trivial — pre-existing syntax issue: double comma in array.

Line 230 has 'dailyNetIncome',, 'dailyPremiumVolume' with a double comma. This doesn't cause runtime errors but is a code smell.

🧹 Fix double comma

    - 'dailyOtherIncome', 'dailyOperatingIncome', 'dailyNetIncome',, 'dailyPremiumVolume', 'dailyNotionalVolume',
    + 'dailyOtherIncome', 'dailyOperatingIncome', 'dailyNetIncome', 'dailyPremiumVolume', 'dailyNotionalVolume',

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `@adapters/types.ts` around lines 224-232, the whitelistedDimensionKeys array contains a stray double comma between the strings 'dailyNetIncome' and 'dailyPremiumVolume', creating a syntax/formatting bug; fix it by editing the literal to remove the extra comma so the entries read 'dailyNetIncome', 'dailyPremiumVolume', then run the linter/TS compile to ensure no other trailing punctuation errors remain.
♻️ Duplicate comments (1)

liquidations/compound-v3/index.ts (1)

Lines 8-19: ⚠️ Potential issue | 🟡 Minor — startDates and CometAddresses are currently in sync, but the manual synchronization pattern should be improved.

Both the liquidations adapter and the fees adapter currently support the same 9 chains (Ethereum, Polygon, Arbitrum, Base, Scroll, Optimism, Mantle, Linea, Unichain). However, the design requires developers to manually update startDates whenever new chains are added to CometAddresses in fees/compound-v3.ts. Consider exporting start dates from the fees adapter or using a shared configuration to prevent future misalignment.

🤖 Prompt for AI Agents: Verify each finding against the current code and only fix it if needed. In `@liquidations/compound-v3/index.ts` around lines 8-19, the local startDates map is manually duplicated and can drift from the CometAddresses in fees/compound-v3.ts; instead export a shared startDates (or a combined config) from the fees adapter and import it here. Update this file to remove the hardcoded startDates, import the exported start date map (ensuring keys match the CometAddresses symbol in fees/compound-v3.ts) and use that imported object for backfill start dates; verify the imported object covers all chains present in CometAddresses and add/guard defaults if any chain is missing.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@factory/aaveLiquidations.ts`:
- Around line 35-51: Replace the separate handling of aaveV3ChainConfig and
sparkChainConfig by using the centralized aaveProtocolConfigs from helpers/aave:
import aaveProtocolConfigs, iterate its entries and for each protocol key (e.g.
"aave-v3", "spark") assign configs[protocol] = { pools: pools.map(p =>
p.lendingPoolProxy), start } (preserving the existing shape), and remove the
duplicate loops over aaveV3ChainConfig and sparkChainConfig; keep behavior
identical for pool extraction and start propagation to ensure backward
compatibility.
In `@factory/compoundV2.ts`:
- Around line 78-81: The options object in factory/compoundV2.ts contains a
typo: change the key options.protocolRevenueratio to
options.protocolRevenueRatio; update the object literal where options: {
protocolRevenueratio: 1 } to use the correct camelCase key and search for any
other occurrences of the misspelled symbol (protocolRevenueratio) in the
codebase to rename them to protocolRevenueRatio so callers reading options or
defaulting logic in the CompoundV2 factory use the correct property name.
In `@helpers/aave/liquidations.ts`:
- Around line 19-28: The loop over pools is sequential and waits for each
options.getLogs call; change it to parallel by mapping pools to promises and
using Promise.all to fetch all events concurrently (use pools.map(p =>
options.getLogs({ target: p, eventAbi: AaveAbis.LiquidationEvent }))). After
Promise.all resolves, iterate the array of events results and call
dailyLiquidations.add(...) and dailyLiquidationRepaidDebt.add(...) for each
event (preserve the existing aggregation logic), ensuring error handling if any
getLogs promise can fail.
In `@helpers/compoundV2.ts`:
- Around line 176-182: Add a JSDoc comment above the exported function
compoundV2LiquidationsExport describing its purpose (creates a SimpleAdapter
that exports Compound v2 liquidation data), document the parameters: "config"
(object mapping chain to comptroller and optional start string) and the options
object (with pullHourly?: boolean and otherRootOptions as a catch-all), and
document the return value as SimpleAdapter; use `@param` tags for each parameter
including types and optional fields and an `@returns` tag for the SimpleAdapter to
satisfy the helper-file documentation guideline.
- Around line 220-241: The loop calls options.getLogs per cToken which is slow;
replace the per-token calls with a single batched call using options.getLogs({
targets: cTokens, eventAbi: LiquidateBorrowEvent, flatten: false }) to fetch all
events while retaining which target they came from, then iterate the returned
per-target event groups and for each target use the same logic: lookup
debtUnderlying from underlyingMap[target.toLowerCase()], skip if missing,
iterate events to add repayAmount to dailyLiquidationRepaidDebt and compute
collateral underlying using event.cTokenCollateral.toLowerCase(),
exchangeRateMap and dailyLiquidations exactly as in the original inner loop so
behavior is unchanged.
In `@helpers/gmx.ts`:
- Around line 181-183: The code is recording collateral lost instead of
liquidation notional; replace usage of log.collateral with the liquidated
position's size (log.size) when updating dailyLiquidations so we track perp
liquidation volume (i.e., use Number(log.size) with the same scaling/divisor
currently applied to collateral), and update the comment to indicate we're using
liquidated position size (not collateral) for notional volume.
In `@liquidations/euler/index.ts`:
- Around line 5-6: The comment for SKIP_CHAINS is too vague—replace "stalls on
this chain" with a concise, specific explanation referencing why CHAIN.TAC is
skipped (e.g., frequent RPC timeouts, long block reorgs, indexer inconsistency)
and include any reproduction details or a link/ID to the issue/ticket and date;
update the comment immediately above const SKIP_CHAINS (and add a TODO if this
is temporary) so future maintainers see both the technical reason and any
tracking reference.
In `@liquidations/morpho/index.ts`:
- Around line 54-57: The adapter constant currently sets pullHourly: false which
prevents hourly pulls for this EVM log-based adapter; update the adapter object
(the const named adapter) to set pullHourly: true so the adapter uses hourly
pulls when processing on-chain events via getLogs(), and verify any downstream
code that assumes daily-only behavior still works with hourly granularity (check
the adapter property pullHourly and any pull scheduling logic that reads it).
In `@liquidations/silo-finance-v1/index.ts`:
- Around line 60-62: Current code spawns one RPC call per silo via Promise.all;
replace it with a single batched call to options.getLogs by passing targets:
siloAddresses and flatten: false so the RPC calls are batched. Update the
assignment where allEvents is computed (currently using siloAddresses.map +
Promise.all and eventAbi: Liquidate) to call options.getLogs({ targets:
siloAddresses, eventAbi: Liquidate, flatten: false }) and ensure downstream
logic still expects an array-of-arrays shape (or flatten explicitly if needed)
to preserve existing behavior.
- Around line 106-117: The seized-collateral is being recorded under the debt
asset key (event.asset) in the liquidation loop; instead, look up the
collateralToken from the previously fetched assetStorages for the corresponding
silo/asset pair and pass that collateralToken to dailyLiquidations.add when
event.seizedCollateral > 0. Locate the loop that iterates liquidationData (for
(const { silo, event } of liquidationData)) and replace the key used in
dailyLiquidations.add with the collateralToken resolved from assetStorages
(using the silo and event.asset as the lookup key), and if the lookup is missing
handle gracefully (skip or fallback) so you don’t record seizures under the
wrong asset.
In `@liquidations/xena-finance/index.ts`:
- Around line 14-17: The loop converts log.collateralValue to a JS Number which
can lose precision for large uint256 values; instead parse collateralValue as a
BigInt (e.g. BigInt(log.collateralValue)), perform the 1e30 scaling using BigInt
arithmetic or a bigint/decimal library, and then pass a safe high-precision
representation to dailyLiquidations.addUSDValue (update addUSDValue to accept a
string/BigInt/BigNumber if it currently requires a JS Number). Locate the
conversion in logs.forEach(...) and adjust either the division to use BigInt
math (or a BigNumber library) and the addUSDValue signature so no precision is
lost when handling very large collateralValue values.
---
Outside diff comments:
In `@adapters/types.ts`:
- Around line 224-232: The whitelistedDimensionKeys array in adapters/types.ts
contains a stray double comma between the strings 'dailyNetIncome' and
'dailyPremiumVolume', creating a syntax/formatting bug; fix it by editing the
whitelistedDimensionKeys Set literal to remove the extra comma so entries read
"'dailyNetIncome', 'dailyPremiumVolume'", then run the linter/TS compile to
ensure no other trailing punctuation errors remain.
In `@fees/aave-v3.ts`:
- Around line 317-319: Mapping for CHAIN.MANTLE is wrong: it currently assigns
pools: AaveMarkets[CHAIN.MEGAETH] which pulls MEGAETH pools; update the pools
entry to use the correct Mantle key (e.g., AaveMarkets[CHAIN.MANTLE] or the
actual Mantle market constant) so CHAIN.MANTLE points to its own AaveMarkets set
(check the AaveMarkets object for the exact Mantle key and replace
AaveMarkets[CHAIN.MEGAETH] in the CHAIN.MANTLE config).
---
Duplicate comments:
In `@liquidations/compound-v3/index.ts`:
- Around line 8-19: The local startDates map is manually duplicated and can
drift from the CometAddresses in fees/compound-v3.ts; instead export a shared
startDates (or a combined config) from the fees adapter and import it here.
Update this file to remove the hardcoded startDates, import the exported start
date map (ensuring keys match the CometAddresses symbol in fees/compound-v3.ts)
and use that imported object for backfill start dates; verify the imported
object covers all chains present in CometAddresses and add/guard defaults if any
chain is missing.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 503a5a61-9688-4c8c-85b5-4505ead3b7c7
📒 Files selected for processing (19)
adapters/types.ts, factory/aaveLiquidations.ts, factory/compoundV2.ts, factory/gmxV1.ts, factory/registry.ts, fees/aave-v3.ts, fees/compound-v3.ts, fees/morpho/index.ts, fees/spark/index.ts, helpers/aave/index.ts, helpers/aave/liquidations.ts, helpers/compoundV2.ts, helpers/gmx.ts, liquidations/compound-v3/index.ts, liquidations/euler/index.ts, liquidations/fraxlend/index.ts, liquidations/morpho/index.ts, liquidations/silo-finance-v1/index.ts, liquidations/xena-finance/index.ts
    // get aave-v3 fee config
    configs["aave-v3"] = {};
    for (const [chain, { pools, start }] of Object.entries(aaveV3ChainConfig)) {
      configs["aave-v3"][chain] = {
        pools: (pools as Array<{ lendingPoolProxy: string }>).map((p) => p.lendingPoolProxy),
        start,
      };
    }

    // get spark fee config
    configs["spark"] = {};
    for (const [chain, { pools, start }] of Object.entries(sparkChainConfig)) {
      configs["spark"][chain] = {
        pools: pools.map((p) => p.lendingPoolProxy),
        start,
      };
    }

🧹 Nitpick | 🔵 Trivial

Consider consolidating aave-v3 and spark into aaveProtocolConfigs.

These chain configs are handled separately from aaveProtocolConfigs. If they could be added to the centralized config in helpers/aave, future additions would automatically be picked up for both fees and liquidations.

Would you like me to investigate whether aave-v3 and spark can be integrated into aaveProtocolConfigs?
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@factory/aaveLiquidations.ts` around lines 35 - 51, Replace the separate
handling of aaveV3ChainConfig and sparkChainConfig by using the centralized
aaveProtocolConfigs from helpers/aave: import aaveProtocolConfigs, iterate its
entries and for each protocol key (e.g. "aave-v3", "spark") assign
configs[protocol] = { pools: pools.map(p => p.lendingPoolProxy), start }
(preserving the existing shape), and remove the duplicate loops over
aaveV3ChainConfig and sparkChainConfig; keep behavior identical for pool
extraction and start propagation to ensure backward compatibility.
    [CHAIN.HEMI]: { comptroller: "0xB2fF02eEF85DC4eaE95Ab32AA887E0cC69DF8d8E", start: '2025-03-06' },
    },
    options: { protocolRevenueratio: 1 },
    },

Typo in options key: protocolRevenueratio should be protocolRevenueRatio.

This typo may cause the option to be silently ignored.

🐛 Fix typo

    },
    - options: { protocolRevenueratio: 1 },
    + options: { protocolRevenueRatio: 1 },
    },

📝 Committable suggestion

‼️ IMPORTANT: Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

    [CHAIN.HEMI]: { comptroller: "0xB2fF02eEF85DC4eaE95Ab32AA887E0cC69DF8d8E", start: '2025-03-06' },
    },
    options: { protocolRevenueRatio: 1 },
    },
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@factory/compoundV2.ts` around lines 78 - 81, The options object in
factory/compoundV2.ts contains a typo: change the key
options.protocolRevenueratio to options.protocolRevenueRatio; update the object
literal where options: { protocolRevenueratio: 1 } to use the correct camelCase
key and search for any other occurrences of the misspelled symbol
(protocolRevenueratio) in the codebase to rename them to protocolRevenueRatio so
callers reading options or defaulting logic in the CompoundV2 factory use the
correct property name.
    for (const pool of pools) {
      const events: any[] = await options.getLogs({
        target: pool,
        eventAbi: AaveAbis.LiquidationEvent,
      })
      for (const e of events) {
        dailyLiquidations.add(e.collateralAsset, e.liquidatedCollateralAmount)
        dailyLiquidationRepaidDebt.add(e.debtAsset, e.debtToCover)
      }
    }
🧹 Nitpick | 🔵 Trivial
Consider parallelizing pool log fetches for better performance.
The sequential for...of loop with await inside fetches logs from each pool one at a time. For protocols with multiple pools, parallel fetching could improve performance.
♻️ Optional parallel fetch pattern
- for (const pool of pools) {
- const events: any[] = await options.getLogs({
- target: pool,
- eventAbi: AaveAbis.LiquidationEvent,
- })
- for (const e of events) {
- dailyLiquidations.add(e.collateralAsset, e.liquidatedCollateralAmount)
- dailyLiquidationRepaidDebt.add(e.debtAsset, e.debtToCover)
- }
- }
+ await Promise.all(pools.map(async (pool) => {
+ const events: any[] = await options.getLogs({
+ target: pool,
+ eventAbi: AaveAbis.LiquidationEvent,
+ })
+ for (const e of events) {
+ dailyLiquidations.add(e.collateralAsset, e.liquidatedCollateralAmount)
+ dailyLiquidationRepaidDebt.add(e.debtAsset, e.debtToCover)
+ }
+ }))

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@helpers/aave/liquidations.ts` around lines 19 - 28, The loop over pools is
sequential and waits for each options.getLogs call; change it to parallel by
mapping pools to promises and using Promise.all to fetch all events concurrently
(use pools.map(p => options.getLogs({ target: p, eventAbi:
AaveAbis.LiquidationEvent }))). After Promise.all resolves, iterate the array of
events results and call dailyLiquidations.add(...) and
dailyLiquidationRepaidDebt.add(...) for each event (preserve the existing
aggregation logic), ensuring error handling if any getLogs promise can fail.
    export function compoundV2LiquidationsExport(
      config: { [chain: string]: { comptroller: string; start?: string } },
      { pullHourly = true, ...otherRootOptions }: {
        pullHourly?: boolean
        [key: string]: any
      } = {},
    ): SimpleAdapter {
🧹 Nitpick | 🔵 Trivial
Add JSDoc documentation for this public helper function.
Per coding guidelines for helpers, public functions should include JSDoc comments describing the function purpose, parameters, and return value.
+/**
+ * Creates a liquidation tracking adapter for Compound V2-style protocols.
+ * `@param` config - Chain configurations mapping chain name to comptroller address and optional start date
+ * `@param` options - Adapter options including pullHourly (default true)
+ * `@returns` SimpleAdapter configured for liquidation tracking
+ */
export function compoundV2LiquidationsExport(
config: { [chain: string]: { comptroller: string; start?: string } },
{ pullHourly = true, ...otherRootOptions }: {As per coding guidelines: "Include JSDoc comments for public functions in helper files"
📝 Committable suggestion
    /**
     * Creates a liquidation tracking adapter for Compound V2-style protocols.
     * @param config - Chain configurations mapping chain name to comptroller address and optional start date
     * @param options - Adapter options including pullHourly (default true)
     * @returns SimpleAdapter configured for liquidation tracking
     */
    export function compoundV2LiquidationsExport(
      config: { [chain: string]: { comptroller: string; start?: string } },
      { pullHourly = true, ...otherRootOptions }: {
        pullHourly?: boolean
        [key: string]: any
      } = {},
    ): SimpleAdapter {
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@helpers/compoundV2.ts` around lines 176 - 182, Add a JSDoc comment above the
exported function compoundV2LiquidationsExport describing its purpose (creates a
SimpleAdapter that exports Compound v2 liquidation data), document the
parameters: "config" (object mapping chain to comptroller and optional start
string) and the options object (with pullHourly?: boolean and otherRootOptions
as a catch-all), and document the return value as SimpleAdapter; use `@param` tags
for each parameter including types and optional fields and an `@returns` tag for
the SimpleAdapter to satisfy the helper-file documentation guideline.
    for (const cToken of cTokens) {
      const events: any[] = await options.getLogs({
        target: cToken,
        eventAbi: LiquidateBorrowEvent,
      })
      if (events.length === 0) continue

      const debtUnderlying = underlyingMap[cToken.toLowerCase()]
      if (!debtUnderlying) continue

      for (const event of events) {
        dailyLiquidationRepaidDebt.add(debtUnderlying, event.repayAmount)

        const collateralCToken = event.cTokenCollateral.toLowerCase()
        const collateralUnderlying = underlyingMap[collateralCToken]
        const exchangeRate = exchangeRateMap[collateralCToken]
        if (collateralUnderlying && exchangeRate) {
          const underlyingAmount = (BigInt(event.seizeTokens) * exchangeRate) / BigInt(1e18)
          dailyLiquidations.add(collateralUnderlying, underlyingAmount)
        }
      }
    }
🧹 Nitpick | 🔵 Trivial
Sequential getLogs calls may cause performance issues for protocols with many markets.
Each cToken requires a separate getLogs call. For protocols with many markets, this could be slow. Consider batching by using getLogs({ targets: cTokens, flatten: false }) to fetch all events at once while preserving the cToken association.
♻️ Suggested optimization
- for (const cToken of cTokens) {
- const events: any[] = await options.getLogs({
- target: cToken,
- eventAbi: LiquidateBorrowEvent,
- })
- if (events.length === 0) continue
-
- const debtUnderlying = underlyingMap[cToken.toLowerCase()]
- if (!debtUnderlying) continue
-
- for (const event of events) {
+ const allLogs = await options.getLogs({
+ targets: cTokens,
+ eventAbi: LiquidateBorrowEvent,
+ flatten: false,
+ })
+
+ for (let idx = 0; idx < cTokens.length; idx++) {
+ const cToken = cTokens[idx]
+ const events = allLogs[idx] as any[]
+ if (!events || events.length === 0) continue
+
+ const debtUnderlying = underlyingMap[cToken.toLowerCase()]
+ if (!debtUnderlying) continue
+
+ for (const event of events) {

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@helpers/compoundV2.ts` around lines 220 - 241, The loop calls options.getLogs
per cToken which is slow; replace the per-token calls with a single batched call
using options.getLogs({ targets: cTokens, eventAbi: LiquidateBorrowEvent,
flatten: false }) to fetch all events while retaining which target they came
from, then iterate the returned per-target event groups and for each target use
the same logic: lookup debtUnderlying from underlyingMap[target.toLowerCase()],
skip if missing, iterate events to add repayAmount to dailyLiquidationRepaidDebt
and compute collateral underlying using event.cTokenCollateral.toLowerCase(),
exchangeRateMap and dailyLiquidations exactly as in the original inner loop so
behavior is unchanged.
    // stalls on this chain
    const SKIP_CHAINS = new Set<string>([CHAIN.TAC])
🧹 Nitpick | 🔵 Trivial
Document the reason for skipping TAC chain more explicitly.
The comment "stalls on this chain" is vague. Consider adding more context (e.g., RPC issues, indexer limitations) for future maintainers.
📝 Suggested documentation improvement
-// stalls on this chain
+// Skip TAC chain - RPC/indexer calls stall or timeout, possibly due to network reliability issues
const SKIP_CHAINS = new Set<string>([CHAIN.TAC])

📝 Committable suggestion
    // Skip TAC chain - RPC/indexer calls stall or timeout, possibly due to network reliability issues
    const SKIP_CHAINS = new Set<string>([CHAIN.TAC])
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@liquidations/euler/index.ts` around lines 5 - 6, The comment for SKIP_CHAINS
is too vague—replace "stalls on this chain" with a concise, specific explanation
referencing why CHAIN.TAC is skipped (e.g., frequent RPC timeouts, long block
reorgs, indexer inconsistency) and include any reproduction details or a link/ID
to the issue/ticket and date; update the comment immediately above const
SKIP_CHAINS (and add a TODO if this is temporary) so future maintainers see both
the technical reason and any tracking reference.
    const adapter: SimpleAdapter = {
      version: 2,
      pullHourly: false,
      adapter: Object.fromEntries(
🧹 Nitpick | 🔵 Trivial
Consider enabling pullHourly: true for EVM log-based adapters.
The adapter uses getLogs() for on-chain events. Setting pullHourly: true would avoid recomputing data for the same time period and provide more granular data, which aligns with repository guidelines for EVM log-based adapters.
const adapter: SimpleAdapter = {
version: 2,
- pullHourly: false,
+ pullHourly: true,
adapter: Object.fromEntries(

📝 Committable suggestion
    const adapter: SimpleAdapter = {
      version: 2,
      pullHourly: true,
      adapter: Object.fromEntries(
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@liquidations/morpho/index.ts` around lines 54 - 57, The adapter constant
currently sets pullHourly: false which prevents hourly pulls for this EVM
log-based adapter; update the adapter object (the const named adapter) to set
pullHourly: true so the adapter uses hourly pulls when processing on-chain
events via getLogs(), and verify any downstream code that assumes daily-only
behavior still works with hourly granularity (check the adapter property
pullHourly and any pull scheduling logic that reads it).
```typescript
const allEvents = await Promise.all(
  siloAddresses.map(silo => options.getLogs({ target: silo, eventAbi: Liquidate }))
)
```
🧹 Nitpick | 🔵 Trivial
Consider batching getLogs calls instead of using Promise.all.
The current approach spawns parallel HTTP requests for each silo, which could overwhelm RPC endpoints when there are many silos. Using getLogs({ targets: siloAddresses, flatten: false }) would batch these into fewer requests.
♻️ Suggested optimization
```diff
- const allEvents = await Promise.all(
-   siloAddresses.map(silo => options.getLogs({ target: silo, eventAbi: Liquidate }))
- )
+ const allEvents = await options.getLogs({
+   targets: siloAddresses,
+   eventAbi: Liquidate,
+   flatten: false,
+ })
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@liquidations/silo-finance-v1/index.ts` around lines 60 - 62, Current code
spawns one RPC call per silo via Promise.all; replace it with a single batched
call to options.getLogs by passing targets: siloAddresses and flatten: false so
the RPC calls are batched. Update the assignment where allEvents is computed
(currently using siloAddresses.map + Promise.all and eventAbi: Liquidate) to
call options.getLogs({ targets: siloAddresses, eventAbi: Liquidate, flatten:
false }) and ensure downstream logic still expects an array-of-arrays shape (or
flatten explicitly if needed) to preserve existing behavior.
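To make the grouped-results handling concrete, here is a small hypothetical sketch; the index alignment between `targets` and the per-target event arrays is an assumption based on the `flatten: false` description above, and the helper name is invented:

```typescript
// Assumed shape: with flatten: false, one decoded-event array is returned per
// target, index-aligned with the targets array that was passed in.
function pairTargetsWithEvents<T>(
  targets: string[],
  grouped: T[][],
): { target: string; event: T }[] {
  return targets.flatMap((target, i) =>
    (grouped[i] ?? []).map(event => ({ target, event })),
  )
}

// Two silos: the first emitted two Liquidate events, the second none.
const paired = pairTargetsWithEvents(
  ['0xSiloA', '0xSiloB'],
  [[{ repaid: 1n }, { repaid: 2n }], []],
)
console.log(paired.length) // 2, both attributed to 0xSiloA
```

Downstream logic that previously zipped `siloAddresses` with the `Promise.all` results can reuse this pairing unchanged, since the array-of-arrays shape is preserved.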
```typescript
for (const { silo, event } of liquidationData) {
  if (event.seizedCollateral > 0) {
    dailyLiquidations.add(event.asset, event.seizedCollateral)
  }
  if (event.shareAmountRepaid > 0) {
    const conv = debtConversionMap[`${silo}:${event.asset}`]
    if (conv && conv.totalSupply > 0n) {
      const underlyingRepaid = BigInt(event.shareAmountRepaid) * conv.totalBorrow / conv.totalSupply
      dailyLiquidationRepaidDebt.add(conv.asset, underlyingRepaid)
    }
  }
}
```
🧩 Analysis chain
🏁 Script executed:

```bash
#!/bin/bash
# Search for Silo V1 Liquidate event documentation or implementation
rg -n -A 10 "event Liquidate" --type ts
# Check for any existing Silo adapter that might clarify the event structure
fd -i silo --type f --extension ts --exec cat {}
```

Repository: DefiLlama/dimension-adapters
Length of output: 14866

🏁 Script executed:

```bash
# Read the full context of the silo-finance-v1 adapter
cat -n liquidations/silo-finance-v1/index.ts | head -120
```

Length of output: 5409
event.asset is the debt asset, not the collateral token. Line 108 should use the collateral token address instead.
In Silo's architecture, each debt asset has an associated collateral token. The assetStorage(address _asset) function returns collateralToken for the given debt asset, but at line 108, seized collateral is being recorded with event.asset (the debt asset) as the key instead of the actual collateralToken.
```typescript
// Line 79-83: collateral token is retrieved but not used for seizure recording
const assetStorages = await options.api.multiCall({
  calls: uniquePairs.map(p => ({ target: p.silo, params: [p.asset] })),
  abi: 'function assetStorage(address _asset) view returns (address collateralToken, ...)',
  ...
})
// Line 108: incorrectly keys seizure by debt asset instead of collateral token
dailyLiquidations.add(event.asset, event.seizedCollateral)
```
The assetStorages data (containing collateralToken for each silo:asset pair) should be used to look up the correct collateral token when recording seized amounts.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@liquidations/silo-finance-v1/index.ts` around lines 106 - 117, The
seized-collateral is being recorded under the debt asset key (event.asset) in
the liquidation loop; instead, look up the collateralToken from the previously
fetched assetStorages for the corresponding silo/asset pair and pass that
collateralToken to dailyLiquidations.add when event.seizedCollateral > 0. Locate
the loop that iterates liquidationData (for (const { silo, event } of
liquidationData)) and replace the key used in dailyLiquidations.add with the
collateralToken resolved from assetStorages (using the silo and event.asset as
the lookup key), and if the lookup is missing handle gracefully (skip or
fallback) so you don’t record seizures under the wrong asset.
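The suggested fix can be sketched as a small lookup builder (names are illustrative, and the index alignment between the multiCall inputs and its results is assumed):

```typescript
// Key each assetStorage() result by `${silo}:${asset}` so seized collateral can
// be booked under the resolved collateralToken instead of the debt asset.
type AssetStorage = { collateralToken: string }

function buildCollateralLookup(
  pairs: { silo: string; asset: string }[],
  storages: AssetStorage[], // assumed index-aligned with `pairs`
): Record<string, string> {
  const lookup: Record<string, string> = {}
  pairs.forEach((p, i) => {
    lookup[`${p.silo}:${p.asset}`] = storages[i].collateralToken
  })
  return lookup
}

const collateralBySiloAsset = buildCollateralLookup(
  [{ silo: '0xSilo', asset: '0xDebtAsset' }],
  [{ collateralToken: '0xCollateralToken' }],
)
// Seizures would then be recorded under collateralBySiloAsset['0xSilo:0xDebtAsset']
// ('0xCollateralToken') rather than under the debt asset address.
```

A missing lookup entry can be skipped (or logged) instead of falling back to `event.asset`, so amounts are never booked under the wrong token.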
```typescript
logs.forEach((log: any) => {
  // collateralValue is USD with 1e30 precision, same as GMX v1
  dailyLiquidations.addUSDValue(Number(log.collateralValue) / 1e30)
})
```
🧹 Nitpick | 🔵 Trivial
Potential precision loss with Number() for large USD values.
collateralValue is a uint256 that can exceed Number.MAX_SAFE_INTEGER (~9e15). Dividing by 1e30 mitigates this for most realistic values, but very large liquidations could still lose precision. Consider using BigInt arithmetic:
♻️ Suggested fix using BigInt
```diff
 logs.forEach((log: any) => {
   // collateralValue is USD with 1e30 precision, same as GMX v1
-  dailyLiquidations.addUSDValue(Number(log.collateralValue) / 1e30)
+  dailyLiquidations.addUSDValue(Number(BigInt(log.collateralValue) / BigInt(1e15)) / 1e15)
 })
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@liquidations/xena-finance/index.ts` around lines 14 - 17, The loop converts
log.collateralValue to a JS Number which can lose precision for large uint256
values; instead parse collateralValue as a BigInt (e.g.
BigInt(log.collateralValue)), perform the 1e30 scaling using BigInt arithmetic
or a bigint/decimal library, and then pass a safe high-precision representation
to dailyLiquidations.addUSDValue (update addUSDValue to accept a
string/BigInt/BigNumber if it currently requires a JS Number). Locate the
conversion in logs.forEach(...) and adjust either the division to use BigInt
math (or a BigNumber library) and the addUSDValue signature so no precision is
lost when handling very large collateralValue values.
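The precision argument can be checked in isolation; this sketch mirrors the two-step division from the suggested fix above (the helper name is illustrative, not from the adapter):

```typescript
// Dividing in BigInt space first keeps values above Number.MAX_SAFE_INTEGER
// (~9e15) from being truncated before the scale-down; only the final 1e15 step
// happens in floating point.
function scale1e30ToNumber(raw: bigint): number {
  return Number(raw / 10n ** 15n) / 1e15
}

// ~1.2346e32 raw units at 1e30 precision is ~123.46 USD.
const rawCollateralValue = 123456789012345678901234567890123n
console.log(scale1e30ToNumber(rawCollateralValue)) // ≈ 123.456789012345
```

By contrast, `Number(raw) / 1e30` converts the full uint256 to a double first, so raw values beyond ~2^53 lose low-order digits before the division ever runs.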
The aave-v3.ts adapter exports:
Actionable comments posted: 7
♻️ Duplicate comments (6)
liquidations/morpho/index.ts (1)
54-57: ⚠️ Potential issue | 🟡 Minor

Enable hourly pulls for this EVM log-based liquidation adapter.

`pullHourly` is currently `false`, which conflicts with the repo's EVM log pull guidance. Based on learnings: "Applies to `adapters/**/evm/**/*.ts`: Use
pullHourlyis currentlyfalse, which conflicts with the repo’s EVM log pull guidance.Based on learnings: "Applies to adapters//evm//*.ts : Use
pullHourly: truewherever evm logs and allium queries are used to avoid recomputing data for the same time period and provide more granular data".🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@liquidations/morpho/index.ts` around lines 54 - 57, The adapter's configuration currently sets pullHourly: false; update the SimpleAdapter object (the adapter constant named "adapter") to use pullHourly: true for this EVM log-based liquidation adapter so hourly pulls are enabled (i.e., change the pullHourly property on the adapter object to true), and run tests/checks to ensure any callers of the adapter handle hourly pulls correctly.helpers/compoundV2.ts (2)
220-241:⚠️ Potential issue | 🟠 MajorBatch
LiquidateBorrowlog fetching instead of sequential per-cToken calls.Awaiting
getLogsin a loop will amplify latency and RPC fragility; this path should fetch in batch and then iterate grouped results.As per coding guidelines: "Use batching with api.multiCall for batch operations in helpers instead of sequential calls" and "helpers/** ... Use api.multiCall for batching, not Promise.all".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@helpers/compoundV2.ts` around lines 220 - 241, The loop currently awaits options.getLogs per cToken (symbols: cTokens, options.getLogs, LiquidateBorrowEvent) causing sequential RPC calls; instead build an array of targets (cTokens), call the provider batching helper (options.api.multiCall or api.multiCall) to fetch LiquidateBorrowEvent logs in one batch, then iterate the returned grouped results and apply the existing logic that updates dailyLiquidationDebtRepaid.add(debtUnderlying, event.repayAmount) and dailyLiquidationCollateral.add(collateralUnderlying, underlyingAmount) using underlyingMap, exchangeRateMap and exchangeRate (calculate underlyingAmount from event.seizeTokens and exchangeRate); remove the per-iteration await and do not use Promise.all so all logs are fetched via api.multiCall and then processed per cToken.
176-182: 🛠️ Refactor suggestion | 🟠 MajorAdd JSDoc for this exported helper function.
This helper is public in
helpers/and should have function-level JSDoc.As per coding guidelines: "Include JSDoc comments for public functions in helper files".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@helpers/compoundV2.ts` around lines 176 - 182, Add a JSDoc block above the exported function compoundV2LiquidationsExport describing the function's purpose, its parameters and types (config: object mapping chain to { comptroller: string; start?: string }, and the options object with pullHourly?: boolean plus otherRootOptions), the return type (SimpleAdapter), and any important behavior (defaults, side effects). Reference the parameter names config, pullHourly, and otherRootOptions in the doc and include a brief example or usage note and `@returns` and `@param` tags to satisfy the public helper documentation guideline.liquidations/compound-v3/index.ts (1)
8-19: 🧹 Nitpick | 🔵 TrivialAvoid duplicating start/deployment metadata; source it from the existing fees config.
startDateshere can drift from the canonical Compound V3 config over time.As per coding guidelines: "Import chain configs, addresses, and deployment blocks from existing
fees/adapters to avoid duplication".🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@liquidations/compound-v3/index.ts` around lines 8 - 19, Replace the duplicated startDates record with the canonical deployment/start metadata imported from the existing fees adapter for Compound V3 (e.g., import the startDates or deploymentBlocks exported by fees/compound-v3.ts) and update any usages to reference that imported symbol instead of the local startDates constant; remove the local const startDates declaration and ensure the file uses the imported name (e.g., startDates or deploymentBlocks) so chain start dates stay in sync with the fees config.

liquidations/silo-finance-v1/index.ts (2)
60-62: ⚠️ Potential issue | 🟠 Major

Batch the
`Liquidate` scans per silo. This still does one
`getLogs` request per silo, and `Promise.all` fails the whole adapter on the first bad RPC response. Given the RPC instability already showing up in this PR, this pattern is likely to make backfills flaky. Please use the batched `options.getLogs({ targets: siloAddresses, eventAbi: Liquidate, flatten: false })` path here.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@liquidations/silo-finance-v1/index.ts` around lines 60 - 62, The current code calls options.getLogs per silo inside Promise.all (allEvents) which issues one RPC per silo and fails entirely on the first RPC error; change to the batched API by calling options.getLogs with targets: siloAddresses, eventAbi: Liquidate and flatten: false so the provider performs a single batched request and returns per-silo arrays; update the allEvents assignment to use options.getLogs({ targets: siloAddresses, eventAbi: Liquidate, flatten: false }) and handle the resulting nested array shape accordingly instead of assuming Promise.all results.
79-83: ⚠️ Potential issue | 🔴 Critical

Book seized collateral under
`collateralToken`, not `event.asset`. Line 108 records
`seizedCollateral` against `event.asset`, but the preceding `assetStorage()` lookup already resolves a distinct `collateralToken` for that same `silo:asset` pair. Whenever those differ, `dailyLiquidationCollateral` is attributed to the wrong token. Carry `collateralToken` through the lookup map and use that address for the collateral side.

🐛 Minimal fix
```diff
- const debtConversionMap: Record<string, { asset: string; totalBorrow: bigint; totalSupply: bigint }> = {}
+ const debtConversionMap: Record<string, { asset: string; collateralToken: string; totalBorrow: bigint; totalSupply: bigint }> = {}
@@
    debtConversionMap[uniqueKeys[i]] = {
      asset: uniquePairs[i].asset,
+     collateralToken: storage.collateralToken,
      totalBorrow: BigInt(storage.totalBorrowAmount),
      totalSupply: BigInt(debtTokenSupplies[supplyIdx]),
    }
  }

  for (const { silo, event } of liquidationData) {
-   if (event.seizedCollateral > 0) {
-     dailyLiquidationCollateral.add(event.asset, event.seizedCollateral)
-   }
+   const conv = debtConversionMap[`${silo}:${event.asset}`]
+   if (event.seizedCollateral > 0 && conv?.collateralToken) {
+     dailyLiquidationCollateral.add(conv.collateralToken, event.seizedCollateral)
+   }
    if (event.shareAmountRepaid > 0) {
-     const conv = debtConversionMap[`${silo}:${event.asset}`]
      if (conv && conv.totalSupply > 0n) {
        const underlyingRepaid = BigInt(event.shareAmountRepaid) * conv.totalBorrow / conv.totalSupply
        dailyLiquidationDebtRepaid.add(conv.asset, underlyingRepaid)
```

Also applies to: 93-115
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@liquidations/silo-finance-v1/index.ts` around lines 79 - 83, The code currently attributes seized collateral to event.asset instead of the collateralToken returned by the assetStorage lookup; update the assetStorages mapping (from the multiCall over uniquePairs) to retain collateralToken per pair (e.g., include collateralToken in the lookup results keyed by silo+asset), then replace uses of event.asset when incrementing seizedCollateral/dailyLiquidationCollateral with the resolved collateralToken address (use the stored collateralToken from assetStorages for the corresponding uniquePairs entry wherever seizedCollateral or dailyLiquidationCollateral is updated).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@helpers/aave/liquidations.ts`:
- Around line 4-10: Add TypeScript interfaces and a JSDoc block for the exported
helper aaveLiquidationsExport: define an interface for the config shape (mapping
chain => { pools: string[]; start?: string }), an interface for root options
instead of {[key: string]: any} (including pullHourly?: boolean and any known
options), and an interface for the decoded liquidation event with typed fields
liquidatorAddress, liquidatee, collateralAsset, debtAsset, debtToCover,
liquidatedCollateralAmount (and any other fields used). Replace occurrences of
untyped params and casts (e.g., the function signature’s root options and
.map((log: any) => ...)) with these interfaces and use them in the function body
and return types, and add a JSDoc block above aaveLiquidationsExport describing
purpose, params, and return value.
In `@helpers/gmx.ts`:
- Around line 165-183: The exported helper gmxV1LiquidationsExports lacks JSDoc
and uses implicit any for logs; add a JSDoc block above gmxV1LiquidationsExports
describing parameters/return (config:
IJSON<{vault:string,start:string,deadFrom?:string}> and returns BaseAdapter) and
document behavior; replace the untyped logs and log callback parameter by
defining a concrete interface for the LiquidatePosition event (fields: key,
account, collateralToken, indexToken, isLong, size, collateral, reserveAmount,
realisedPnl, markPrice with appropriate types) and type logs as
Array<LiquidatePositionLog> (or similar), then update the forEach signature to
use that type and keep use of
dailyLiquidationCollateral.addUSDValue(Number(log.collateral) / 1e30). Ensure to
import or declare FetchOptions/BaseAdapter types if needed.
In `@liquidations/compound-v3/index.ts`:
- Around line 39-54: The code currently adds raw token amounts from
AbsorbCollateralEvent and AbsorbDebtEvent to dailyLiquidationCollateral.add(...)
and dailyLiquidationDebtRepaid.add(...); update these to use the emitted
usdValue (8-decimal) instead: replace dailyLiquidationCollateral.add(e.asset,
e.collateralAbsorbed) with a call to
dailyLiquidationCollateral.addUSDValue(Number(e.usdValue) / 1e8) for
AbsorbCollateralEvent, and replace dailyLiquidationDebtRepaid.add(baseToken,
e.basePaidOut) with dailyLiquidationDebtRepaid.addUSDValue(Number(e.usdValue) /
1e8) for AbsorbDebtEvent so metrics use oracle-priced USD notional.
In `@liquidations/fraxlend/index.ts`:
- Around line 16-19: Replace the direct call that uses registries[options.chain]
and getAllPairAddresses (the code assigning `pairs` via `options.api.call`) with
logic that enumerates Fraxlend pair addresses by reading the factory `Create`
events (cached with `cacheInCloud: true`) for the Fraxlend factory;
specifically, call the factory event enumeration (filtering on the `Create`
event) to build the pair list, store/cache those addresses with `cacheInCloud:
true`, then iterate those instances to fetch `Liquidate` events from each pair
instance instead of calling `getAllPairAddresses()`. Ensure you keep references
to `registries[options.chain]`/factory address and replace the
`getAllPairAddresses` ABI usage with the event-based enumeration and caching.
- Around line 26-38: The current code uses an unbounded Promise.all over
pairs.map calling options.getLogs per pair (with eventAbi Liquidate) which can
spike RPC load and fail on a single rejection; change this to a batched call
pattern: either use api.multiCall (or options.getLogs with a single call
supplying targets array) to fetch logs for all pairs in one request and then
iterate grouped results to call dailyLiquidationCollateral.add and
dailyLiquidationDebtRepaid.add, or replace Promise.all with a controlled
concurrency pool (e.g., PromisePool) to limit parallel getLogs calls; ensure you
still skip missing borrowAsset/collateralAsset and preserve per-event processing
but remove the unbounded Promise.all over pairs.map and the per-pair direct
options.getLogs calls.
In `@liquidations/GUIDELINES.md`:
- Around line 20-24: Fix markdownlint MD022/MD031 by ensuring blank lines
surround headings and fenced code blocks: add a blank line above and below the
"Aave Forks and Compound V2 Forks" heading, add blank lines before and after the
fenced code block starting with ```typescript, and ensure the "Singleton
Contracts (Morpho Blue, Compound V3)" heading likewise has an empty line above
and below it so all headings and code fences follow linting rules.
In `@liquidations/xena-finance/index.ts`:
- Around line 4-11: The file hardcodes the Xena vault address via the VAULT
constant inside the fetch function; instead import the vault address (and any
related deployment block/config) from the existing fees adapter exports and use
those instead of the hardcoded VAULT. Replace the VAULT constant and any direct
references (e.g., the target passed to options.getLogs and any deployment-block
usage) with the imported symbol(s) from the fees xena-finance adapter so both
adapters share the single source-of-truth.
---
Duplicate comments:
In `@helpers/compoundV2.ts`:
- Around line 220-241: The loop currently awaits options.getLogs per cToken
(symbols: cTokens, options.getLogs, LiquidateBorrowEvent) causing sequential RPC
calls; instead build an array of targets (cTokens), call the provider batching
helper (options.api.multiCall or api.multiCall) to fetch LiquidateBorrowEvent
logs in one batch, then iterate the returned grouped results and apply the
existing logic that updates dailyLiquidationDebtRepaid.add(debtUnderlying,
event.repayAmount) and dailyLiquidationCollateral.add(collateralUnderlying,
underlyingAmount) using underlyingMap, exchangeRateMap and exchangeRate
(calculate underlyingAmount from event.seizeTokens and exchangeRate); remove the
per-iteration await and do not use Promise.all so all logs are fetched via
api.multiCall and then processed per cToken.
- Around line 176-182: Add a JSDoc block above the exported function
compoundV2LiquidationsExport describing the function's purpose, its parameters
and types (config: object mapping chain to { comptroller: string; start?: string
}, and the options object with pullHourly?: boolean plus otherRootOptions), the
return type (SimpleAdapter), and any important behavior (defaults, side
effects). Reference the parameter names config, pullHourly, and otherRootOptions
in the doc and include a brief example or usage note and `@returns` and `@param`
tags to satisfy the public helper documentation guideline.
In `@liquidations/compound-v3/index.ts`:
- Around line 8-19: Replace the duplicated startDates record with the canonical
deployment/start metadata imported from the existing fees adapter for Compound
V3 (e.g., import the startDates or deploymentBlocks exported by
fees/compound-v3.ts) and update any usages to reference that imported symbol
instead of the local startDates constant; remove the local const startDates
declaration and ensure the file uses the imported name (e.g., startDates or
deploymentBlocks) so chain start dates stay in sync with the fees config.
In `@liquidations/morpho/index.ts`:
- Around line 54-57: The adapter's configuration currently sets pullHourly:
false; update the SimpleAdapter object (the adapter constant named "adapter") to
use pullHourly: true for this EVM log-based liquidation adapter so hourly pulls
are enabled (i.e., change the pullHourly property on the adapter object to
true), and run tests/checks to ensure any callers of the adapter handle hourly
pulls correctly.
In `@liquidations/silo-finance-v1/index.ts`:
- Around line 60-62: The current code calls options.getLogs per silo inside
Promise.all (allEvents) which issues one RPC per silo and fails entirely on the
first RPC error; change to the batched API by calling options.getLogs with
targets: siloAddresses, eventAbi: Liquidate and flatten: false so the provider
performs a single batched request and returns per-silo arrays; update the
allEvents assignment to use options.getLogs({ targets: siloAddresses, eventAbi:
Liquidate, flatten: false }) and handle the resulting nested array shape
accordingly instead of assuming Promise.all results.
- Around line 79-83: The code currently attributes seized collateral to
event.asset instead of the collateralToken returned by the assetStorage lookup;
update the assetStorages mapping (from the multiCall over uniquePairs) to retain
collateralToken per pair (e.g., include collateralToken in the lookup results
keyed by silo+asset), then replace uses of event.asset when incrementing
seizedCollateral/dailyLiquidationCollateral with the resolved collateralToken
address (use the stored collateralToken from assetStorages for the corresponding
uniquePairs entry wherever seizedCollateral or dailyLiquidationCollateral is
updated).
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 7ef8afab-ec81-49f6-8dc8-8d39a0cb99fa
📒 Files selected for processing (11)
- adapters/types.ts
- helpers/aave/liquidations.ts
- helpers/compoundV2.ts
- helpers/gmx.ts
- liquidations/GUIDELINES.md
- liquidations/compound-v3/index.ts
- liquidations/euler/index.ts
- liquidations/fraxlend/index.ts
- liquidations/morpho/index.ts
- liquidations/silo-finance-v1/index.ts
- liquidations/xena-finance/index.ts
```typescript
export function aaveLiquidationsExport(
  config: { [chain: string]: { pools: string[]; start?: string } },
  { pullHourly = true, ...otherRootOptions }: {
    pullHourly?: boolean
    [key: string]: any
  } = {},
): SimpleAdapter {
```
🛠️ Refactor suggestion | 🟠 Major
Add TypeScript interfaces and JSDoc documentation to the exported helper function.
aaveLiquidationsExport is a public helper API exported without JSDoc documentation, and it uses untyped parameters ([key: string]: any for root options) and untyped decoded logs (.map((log: any) =>). Field access on log for liquidatorAddress, liquidatee, collateralAsset, debtAsset, debtToCover, and liquidatedCollateralAmount will only fail at runtime if field names drift. Define interfaces for the config object shape, root options, and liquidation event structure, then add a JSDoc block describing the function's purpose and parameters.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@helpers/aave/liquidations.ts` around lines 4 - 10, Add TypeScript interfaces
and a JSDoc block for the exported helper aaveLiquidationsExport: define an
interface for the config shape (mapping chain => { pools: string[]; start?:
string }), an interface for root options instead of {[key: string]: any}
(including pullHourly?: boolean and any known options), and an interface for the
decoded liquidation event with typed fields liquidatorAddress, liquidatee,
collateralAsset, debtAsset, debtToCover, liquidatedCollateralAmount (and any
other fields used). Replace occurrences of untyped params and casts (e.g., the
function signature’s root options and .map((log: any) => ...)) with these
interfaces and use them in the function body and return types, and add a JSDoc
block above aaveLiquidationsExport describing purpose, params, and return value.
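One possible shape for those interfaces, sketched under the assumption that the decoded log fields match the names listed in the comment above (this is illustrative, not the adapter's actual types):

```typescript
// Illustrative typing only; the real decoded-event field names should be
// confirmed against the Aave LiquidationCall event ABI used by the helper.
interface AaveLiquidationChainConfig {
  pools: string[]
  start?: string
}
type AaveLiquidationsConfig = Record<string, AaveLiquidationChainConfig>

interface AaveLiquidationRootOptions {
  pullHourly?: boolean
}

interface AaveLiquidationLog {
  liquidatorAddress: string
  liquidatee: string
  collateralAsset: string
  debtAsset: string
  debtToCover: bigint
  liquidatedCollateralAmount: bigint
}

// With these, `.map((log: any) => ...)` becomes `.map((log: AaveLiquidationLog) => ...)`,
// so a drifting field name fails at compile time instead of silently at runtime.
const exampleConfig: AaveLiquidationsConfig = {
  ethereum: { pools: ['0xPoolAddress'] },
}
console.log(Object.keys(exampleConfig).length) // 1
```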
```typescript
export const gmxV1LiquidationsExports = (config: IJSON<{
  vault: string,
  start: string,
  deadFrom?: string,
}>) => {
  const exportObject: BaseAdapter = {}
  Object.entries(config).forEach(([chain, chainConfig]) => {
    exportObject[chain] = {
      fetch: async (options: FetchOptions) => {
        const dailyLiquidationCollateral = options.createBalances()
        const logs = await options.getLogs({
          target: chainConfig.vault,
          eventAbi: 'event LiquidatePosition(bytes32 key,address account,address collateralToken,address indexToken,bool isLong,uint256 size,uint256 collateral,uint256 reserveAmount,int256 realisedPnl,uint256 markPrice)',
        })

        logs.forEach((log: any) => {
          // Using collateral (not size) to reflect value lost by position owner
          dailyLiquidationCollateral.addUSDValue(Number(log.collateral) / 1e30)
        })
```
🛠️ Refactor suggestion | 🟠 Major
Add JSDoc and concrete log typing for the new public helper.
gmxV1LiquidationsExports is exported from helpers/ but has no JSDoc, and the liquidation event loop uses any for logs.
♻️ Proposed refactor
+type GmxLiquidatePositionLog = {
+ collateral: string | number | bigint
+}
+
+/**
+ * Creates liquidation adapters for GMX v1 vaults by chain.
+ * Tracks collateral seized from `LiquidatePosition` events.
+ */
export const gmxV1LiquidationsExports = (config: IJSON<{
@@
- logs.forEach((log: any) => {
+ logs.forEach((log: GmxLiquidatePositionLog) => {
// Using collateral (not size) to reflect value lost by position owner
dailyLiquidationCollateral.addUSDValue(Number(log.collateral) / 1e30)
 })

As per coding guidelines: "Include JSDoc comments for public functions in helper files" and "All helpers must have proper TypeScript types with no implicit any".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@helpers/gmx.ts` around lines 165 - 183, The exported helper
gmxV1LiquidationsExports lacks JSDoc and uses implicit any for logs; add a JSDoc
block above gmxV1LiquidationsExports describing parameters/return (config:
IJSON<{vault:string,start:string,deadFrom?:string}> and returns BaseAdapter) and
document behavior; replace the untyped logs and log callback parameter by
defining a concrete interface for the LiquidatePosition event (fields: key,
account, collateralToken, indexToken, isLong, size, collateral, reserveAmount,
realisedPnl, markPrice with appropriate types) and type logs as
Array<LiquidatePositionLog> (or similar), then update the forEach signature to
use that type and keep use of
dailyLiquidationCollateral.addUSDValue(Number(log.collateral) / 1e30). Ensure to
import or declare FetchOptions/BaseAdapter types if needed.
liquidations/compound-v3/index.ts
Outdated
```typescript
const absorbCollateralEvents: any[] = await options.getLogs({
  target: comet,
  eventAbi: AbsorbCollateralEvent,
})
for (const e of absorbCollateralEvents) {
  dailyLiquidationCollateral.add(e.asset, e.collateralAbsorbed)
}

if (baseToken) {
  const absorbDebtEvents: any[] = await options.getLogs({
    target: comet,
    eventAbi: AbsorbDebtEvent,
  })
  for (const e of absorbDebtEvents) {
    dailyLiquidationDebtRepaid.add(baseToken, e.basePaidOut)
  }
```
🧩 Analysis chain
🌐 Web query:
Compound V3 Comet events `AbsorbCollateral` and `AbsorbDebt`:
- What precision/decimals is
`usdValue` emitted in?
- Is it intended to represent USD notional suitable for direct aggregation in liquidation metrics?
💡 Result:
`usdValue` precision/decimals
usdValue is emitted in USD with 8 decimals (i.e., scaled by 1e8, same as the protocol’s Chainlink price-feed precision). This follows from:
- Comet requiring price feeds to have 8 decimals (`PRICE_FEED_DECIMALS = 8`). [1]
AbsorbDebt/AbsorbCollateralusdValuebeing computed viamulPrice(amount, price, scale)(so it inherits the price’s 1e8 scaling). [3]
- Is it intended as USD notional suitable for direct aggregation?
Yes—conceptually it is “USD notional at oracle price” and is suitable for direct aggregation into liquidation/absorption metrics (e.g., “total collateral absorbed (USD)” / “total debt absorbed (USD)”), as long as you interpret it correctly as mark-to-oracle notional, not realized proceeds. [4]
Practical caveats when aggregating:
- It’s oracle-priced at the time of absorption (Chainlink 8-decimal USD), so it won’t equal realized sale value/slippage; realized execution is better inferred from subsequent
BuyCollateralflows. [2][4] - It is integer math (truncation/rounding), so sums are directionally correct but may have small rounding error. [3]
Sources:
[1] CometCore.sol (PRICE_FEED_DECIMALS = 8). (github.com)
[2] Compound III docs: getPrice “returns the price … in USD with 8 decimal places”. (docs.compound.finance)
[3] OpenZeppelin audit excerpt showing valueOfDebtAbsorbed = mulPrice(debtAbsorbed, basePrice, baseScale) and emit AbsorbDebt(..., valueOfDebtAbsorbed). (blog.openzeppelin.com)
[4] Event definitions for AbsorbDebt / AbsorbCollateral include usdValue explicitly as part of those absorption events. (github.com)
Citations:
- 1: https://github.com/compound-finance/comet/blob/main/contracts/CometCore.sol
- 2: https://docs.compound.finance/helper-functions/?utm_source=openai
- 3: https://blog.openzeppelin.com/compound-iii-audit?utm_source=openai
- 4: https://github.com/compound-finance/comet/blob/main/contracts/CometMainInterface.sol
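Given that 1e8 scaling, converting and aggregating reduces to one division per event. A sketch with made-up values (`usdValueToUsd` is an illustrative name, not a Comet or repo helper):

```typescript
// Comet emits usdValue with Chainlink's 8-decimal USD scaling, so a single
// division yields a plain USD amount suitable for addUSDValue(...)-style use.
function usdValueToUsd(usdValue: bigint): number {
  return Number(usdValue) / 1e8
}

// Summing absorption events into a USD total (mark-to-oracle notional).
const absorbedUsd = [12345678900n, 100_00000000n].reduce(
  (acc, v) => acc + usdValueToUsd(v),
  0,
)
console.log(absorbedUsd) // 223.456789
```

Per the caveats above, this is oracle-priced notional, not realized proceeds, and `Number()` is adequate here only because post-division magnitudes stay far below 2^53.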
🏁 Script executed:
# Check if the file exists and look at the event definitions
find . -name "index.ts" -path "*/liquidations/compound-v3/*" | head -5

Repository: DefiLlama/dimension-adapters
Length of output: 107
🏁 Script executed:
# Search for AbsorbCollateralEvent and AbsorbDebtEvent definitions
rg "AbsorbCollateralEvent|AbsorbDebtEvent" --type ts -B 2 -A 5

Repository: DefiLlama/dimension-adapters
Length of output: 1988
🏁 Script executed:
# Check for addUSDValue usage in other liquidation adapters to understand the pattern
rg "addUSDValue" liquidations/ --type ts -B 2 -A 2

Repository: DefiLlama/dimension-adapters
Length of output: 428
🏁 Script executed:
# Look for addUSDValue signature and more usage examples
rg "addUSDValue\(" liquidations/ --type ts -B 1 -A 1 | head -40

Repository: DefiLlama/dimension-adapters
Length of output: 326
🏁 Script executed:
# Check the balances class/interface definition to understand addUSDValue signature
rg "addUSDValue" --type ts -A 3 | grep -E "(function|method|interface|class)" -A 5 | head -20

Repository: DefiLlama/dimension-adapters
Length of output: 1158
🏁 Script executed:
# Look at the actual file context around lines 39-54 to see full picture
cat -n liquidations/compound-v3/index.ts | sed -n '30,65p'

Repository: DefiLlama/dimension-adapters
Length of output: 1243
Switch to addUSDValue() for Compound V3 liquidation dimensions using the emitted usdValue field.
The AbsorbCollateral and AbsorbDebt events expose usdValue (in 8-decimal precision), which should replace token-amount accounting. Update lines 44 and 53:
dailyLiquidationCollateral.addUSDValue(Number(e.usdValue) / 1e8)
dailyLiquidationDebtRepaid.addUSDValue(Number(e.usdValue) / 1e8)
This aligns with coding guidelines and ensures liquidation metrics use oracle-priced USD notional rather than unpriced token amounts.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@liquidations/compound-v3/index.ts` around lines 39 - 54, The code currently
adds raw token amounts from AbsorbCollateralEvent and AbsorbDebtEvent to
dailyLiquidationCollateral.add(...) and dailyLiquidationDebtRepaid.add(...);
update these to use the emitted usdValue (8-decimal) instead: replace
dailyLiquidationCollateral.add(e.asset, e.collateralAbsorbed) with a call to
dailyLiquidationCollateral.addUSDValue(Number(e.usdValue) / 1e8) for
AbsorbCollateralEvent, and replace dailyLiquidationDebtRepaid.add(baseToken,
e.basePaidOut) with dailyLiquidationDebtRepaid.addUSDValue(Number(e.usdValue) /
1e8) for AbsorbDebtEvent so metrics use oracle-priced USD notional.
const pairs: string[] = await options.api.call({
  target: registries[options.chain],
  abi: 'function getAllPairAddresses() view returns (address[])',
})
Use factory Create event enumeration (cached) for Fraxlend pair discovery.
For Fraxlend, pair instance discovery should come from factory creation logs with cacheInCloud: true instead of direct address-list calls.
As per coding guidelines: "For Factory-deployed Contracts (Euler, Silo, Fraxlend), enumerate instances from factory Create events with cacheInCloud: true, then fetch Liquidate events from each instance".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@liquidations/fraxlend/index.ts` around lines 16 - 19, Replace the direct call
that uses registries[options.chain] and getAllPairAddresses (the code assigning
`pairs` via `options.api.call`) with logic that enumerates Fraxlend pair
addresses by reading the factory `Create` events (cached with `cacheInCloud:
true`) for the Fraxlend factory; specifically, call the factory event
enumeration (filtering on the `Create` event) to build the pair list,
store/cache those addresses with `cacheInCloud: true`, then iterate those
instances to fetch `Liquidate` events from each pair instance instead of calling
`getAllPairAddresses()`. Ensure you keep references to
`registries[options.chain]`/factory address and replace the
`getAllPairAddresses` ABI usage with the event-based enumeration and caching.
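A minimal sketch of the event-based discovery the comment asks for, assuming a `getLogs`-style function with a `cacheInCloud` flag modeled loosely on this repo's `FetchOptions` API; the `AddPair` event signature and `discoverPairs` name are illustrative assumptions, not confirmed Fraxlend or repo APIs:

```typescript
// Sketch only: discover Fraxlend pair instances from factory deployment logs
// instead of calling getAllPairAddresses(). Both the getLogs shape and the
// event name here are assumptions for illustration.
type PairLog = { pairAddress: string }
type GetLogs = (params: {
  target: string
  eventAbi: string
  cacheInCloud?: boolean
}) => Promise<PairLog[]>

async function discoverPairs(getLogs: GetLogs, factory: string): Promise<string[]> {
  const created = await getLogs({
    target: factory,
    eventAbi: 'event AddPair(address pairAddress)', // assumed event signature
    cacheInCloud: true, // cache discovery results across runs
  })
  return created.map((log) => log.pairAddress)
}

// Usage with a stubbed log source:
const stub: GetLogs = async () => [{ pairAddress: '0xabc' }, { pairAddress: '0xdef' }]
discoverPairs(stub, '0xfactory').then((pairs) => console.log(pairs)) // logs ['0xabc', '0xdef']
```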
const VAULT = '0x22787c26bb0ab0d331eb840ff010855a70a0dca6'

const fetch = async (options: FetchOptions) => {
  const dailyLiquidationCollateral = options.createBalances()

  const logs = await options.getLogs({
    target: VAULT,
    eventAbi: 'event LiquidatePosition(bytes32 indexed key, address account, address collateralToken, address indexToken, uint8 side, uint256 size, uint256 collateralValue, uint256 reserveAmount, uint256 indexPrice, int256 pnl, uint256 feeValue)',
🛠️ Refactor suggestion | 🟠 Major
Reuse Xena vault config from the fees adapter instead of hardcoding it here.
This duplicates core config and can drift from fees/xena-finance.ts.
As per coding guidelines: "Import chain configs, addresses, and deployment blocks from existing fees/ adapters to avoid duplication".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@liquidations/xena-finance/index.ts` around lines 4 - 11, The file hardcodes
the Xena vault address via the VAULT constant inside the fetch function; instead
import the vault address (and any related deployment block/config) from the
existing fees adapter exports and use those instead of the hardcoded VAULT.
Replace the VAULT constant and any direct references (e.g., the target passed to
options.getLogs and any deployment-block usage) with the imported symbol(s) from
the fees xena-finance adapter so both adapters share the single source-of-truth.
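The `LiquidatePosition` values quoted above are 1e30-scaled USD (the GMX v1 convention). A BigInt-safe way to bring such a value down to a float is to strip most of the scale in integer math first; a sketch, with `scale1e30ToUsd` as a hypothetical helper name:

```typescript
// Hypothetical helper: convert a 1e30-scaled USD value (GMX v1 / Xena style)
// to a float without first stuffing the full integer into a Number.
// Dividing by 1e15 as a BigInt keeps the intermediate within float precision.
function scale1e30ToUsd(raw: bigint): number {
  return Number(raw / 10n ** 15n) / 1e15
}

// A $2,500 position size encoded with 30 decimals:
const size = 2500n * 10n ** 30n
console.log(scale1e30ToUsd(size)) // 2500
```

The trade-off is that sub-1e-15-dollar resolution is discarded, which is negligible for liquidation metrics.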
♻️ Duplicate comments (15)
helpers/compoundV2.ts (2)
176-182: 🧹 Nitpick | 🔵 Trivial

Add JSDoc documentation for this public helper function.
Per coding guidelines, public helper functions should include JSDoc comments describing the function purpose, parameters, and return value.
♻️ Proposed JSDoc

+/**
+ * Creates a liquidation tracking adapter for Compound V2-style protocols.
+ * Tracks collateral seized from LiquidateBorrow events, converting cToken amounts
+ * to underlying using exchangeRateStored.
+ * @param config - Chain configurations mapping chain name to comptroller address and optional start date
+ * @param options - Adapter options including pullHourly (default true)
+ * @returns SimpleAdapter configured for liquidation tracking
+ */
 export function compoundV2LiquidationsExport(
   config: { [chain: string]: { comptroller: string; start?: string } },

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@helpers/compoundV2.ts` around lines 176 - 182, Add a JSDoc block above the exported public helper function compoundV2LiquidationsExport describing its purpose (exports liquidations adapters for Compound V2 across chains), document the config parameter shape (object mapping chain to { comptroller: string; start?: string }), document the options object including pullHourly?: boolean and that otherRootOptions captures additional root options, and state the return type is SimpleAdapter; include examples/usage and tag the function as exported/public if desired.
219-235: 🧹 Nitpick | 🔵 Trivial

Consider batching `getLogs` calls for better performance.

The sequential `for...of` loop fetches logs from each cToken one at a time. For protocols with many markets, parallel fetching using `getLogs({ targets: cTokens, flatten: false })` could improve performance.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@helpers/compoundV2.ts` around lines 219 - 235, The loop is calling options.getLogs for each cToken sequentially; refactor to call options.getLogs once with { targets: cTokens, eventAbi: LiquidateBorrowEvent, flatten: false } to fetch all logs in parallel, then iterate the returned per-target arrays (keeping association to each cToken) and process events the same way using collateralCToken, underlyingMap, exchangeRateMap and dailyCollateralLiquidated (compute underlyingAmount from event.seizeTokens and exchangeRate as before).

liquidations/compound-v3/index.ts (1)
33-35: ⚠️ Potential issue | 🟠 Major

Use `addUSDValue()` with the event's `usdValue` field for accuracy during volatile periods.

The `AbsorbCollateral` event includes a `usdValue` field (8-decimal USD precision) which provides oracle-priced accuracy at liquidation time. Per the liquidation guidelines, this is preferred over token-denominated amounts.

🐛 Proposed fix

 for (const e of absorbCollateralEvents) {
-  dailyCollateralLiquidated.add(e.asset, e.collateralAbsorbed)
+  dailyCollateralLiquidated.addUSDValue(Number(e.usdValue) / 1e8)
 }

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@liquidations/compound-v3/index.ts` around lines 33 - 35, The loop over absorbCollateralEvents currently calls dailyCollateralLiquidated.add(e.asset, e.collateralAbsorbed), which uses token amounts; change it to use the event's USD field by replacing that call with dailyCollateralLiquidated.addUSDValue(e.asset, e.usdValue) so liquidations are recorded in oracle-priced USD (8-decimal) per AbsorbCollateral event; update any related typings/imports if addUSDValue is not present on the dailyCollateralLiquidated helper.

helpers/gmx.ts (1)
165-184: 🧹 Nitpick | 🔵 Trivial

Add JSDoc and typed log interface for this public helper.

The exported `gmxV1LiquidationsExports` function lacks JSDoc documentation and uses `any` for the log type at line 181. Define an interface for the `LiquidatePosition` event fields and add JSDoc describing the function's purpose.

♻️ Proposed refactor

+interface GmxLiquidatePositionLog {
+  key: string
+  account: string
+  collateralToken: string
+  indexToken: string
+  isLong: boolean
+  size: bigint | string
+  collateral: bigint | string
+  reserveAmount: bigint | string
+  realisedPnl: bigint | string
+  markPrice: bigint | string
+}
+
+/**
+ * Creates liquidation adapters for GMX v1 vaults by chain.
+ * Tracks collateral seized and position notional from LiquidatePosition events.
+ * @param config - Chain configurations mapping chain name to vault address and start date
+ * @returns SimpleAdapter configured for liquidation tracking
+ */
 export const gmxV1LiquidationsExports = (config: IJSON<{ vault: string, start: string, deadFrom?: string, }>) => {
   const exportObject: BaseAdapter = {}
   Object.entries(config).forEach(([chain, chainConfig]) => {
     exportObject[chain] = {
       fetch: async (options: FetchOptions) => {
         const dailyCollateralLiquidated = options.createBalances()
         const dailyLiquidationVolume = options.createBalances()
         const logs = await options.getLogs({
           target: chainConfig.vault,
           eventAbi: 'event LiquidatePosition(bytes32 key,address account,address collateralToken,address indexToken,bool isLong,uint256 size,uint256 collateral,uint256 reserveAmount,int256 realisedPnl,uint256 markPrice)',
         })
-        logs.forEach((log: any) => {
+        logs.forEach((log: GmxLiquidatePositionLog) => {
           dailyCollateralLiquidated.addUSDValue(Number(log.collateral) / 1e30)
           dailyLiquidationVolume.addUSDValue(Number(log.size) / 1e30)
         })
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@helpers/gmx.ts` around lines 165 - 184, Add JSDoc for the exported gmxV1LiquidationsExports function describing its purpose, parameters (config) and returned BaseAdapter, and create a typed interface (e.g., LiquidatePositionEvent) that reflects the event fields (key, account, collateralToken, indexToken, isLong, size, collateral, reserveAmount, realisedPnl, markPrice) and use that interface instead of any in the logs.forEach and in the logs typing; update the fetch implementation to cast logs (or each log) to LiquidatePositionEvent and keep existing uses of dailyCollateralLiquidated and dailyLiquidationVolume but with properly typed numeric conversions.

liquidations/morpho/index.ts (1)
49-56: 🧹 Nitpick | 🔵 Trivial

Enable `pullHourly: true` for this EVM log-based adapter.

The adapter uses `getLogs()` for on-chain events but sets `pullHourly: false`. This contradicts the guideline to use `pullHourly: true` for EVM log-based adapters.

 const adapter: SimpleAdapter = {
   version: 2,
-  pullHourly: false,
+  pullHourly: true,
   adapter: Object.fromEntries(

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@liquidations/morpho/index.ts` around lines 49 - 56, The adapter configuration sets pullHourly: false but this is an EVM log-based adapter (uses getLogs in fetch for MorphoBlues); change pullHourly to true in the adapter object so hourly pulls are enabled (update the adapter constant where pullHourly is defined), and verify the fetch function used for each chain in Object.fromEntries([chain, { fetch, start }]) correctly handles hourly execution and deduplication of logs when running on an hourly schedule.

liquidations/GUIDELINES.md (1)
20-24: 🧹 Nitpick | 🔵 Trivial

Fix markdownlint spacing violations (MD022/MD031).
Add blank lines around headings and fenced code blocks to resolve the markdownlint warnings at lines 20, 23-24, 41, 44, and 46.
📝 Proposed fix
     ### Aave Forks and Compound V2 Forks
    +
     Add entries directly to `factory/aaveLiquidations.ts` or `factory/compoundV2.ts` instead of creating standalone adapter files...

     ### Singleton Contracts (Morpho Blue, Compound V3)
    +
     ```typescript
     const fetch = async (options: FetchOptions) => {

Apply similar blank line additions around other headings and code fences.
Also applies to: 41-49
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@liquidations/GUIDELINES.md` around lines 20 - 24, Fix markdownlint MD022/MD031 spacing by adding blank lines before and after headings and fenced code blocks in GUIDELINES.md: insert an empty line above and below the "Aave Forks and Compound V2 Forks" and "Singleton Contracts (Morpho Blue, Compound V3)" headings and ensure the fenced block that starts with the "typescript" snippet has a blank line before the opening ``` and after the closing ```, and apply the same blank-line adjustments to the other heading/code-fence groups around lines referenced (41-49) so all headings and code fences follow markdownlint spacing rules.

helpers/aave/liquidations.ts (2)
18-26: 🧹 Nitpick | 🔵 Trivial

Consider parallelizing pool log fetches for better performance.

The sequential `for...of` loop fetches logs from each pool one at a time. For protocols with multiple pools, parallel fetching could improve performance.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@helpers/aave/liquidations.ts` around lines 18 - 26, The loop over pools currently fetches logs sequentially causing slow performance; change it to fetch in parallel by mapping pools to promises that call options.getLogs({ target: pool, eventAbi: AaveAbis.LiquidationEvent }) and await Promise.all, then iterate the combined events to call dailyCollateralLiquidated.add(e.collateralAsset, e.liquidatedCollateralAmount); keep references to pools, options.getLogs, AaveAbis.LiquidationEvent, and dailyCollateralLiquidated when implementing the parallelization and aggregation.
4-10: 🧹 Nitpick | 🔵 Trivial

Add TypeScript interfaces and JSDoc documentation.

The function lacks JSDoc and uses `[key: string]: any` for root options. Define proper interfaces for better type safety.

♻️ Proposed improvements

+interface AaveLiquidationConfig {
+  pools: string[]
+  start?: string
+}
+
+interface AaveLiquidationsOptions {
+  pullHourly?: boolean
+  [key: string]: unknown
+}
+
+/**
+ * Creates liquidation tracking adapters for Aave-style lending protocols.
+ * Tracks collateral seized from LiquidationCall events.
+ * @param config - Chain configurations mapping chain name to pool addresses and optional start date
+ * @param options - Adapter options including pullHourly (default true)
+ * @returns SimpleAdapter configured for liquidation tracking
+ */
 export function aaveLiquidationsExport(
-  config: { [chain: string]: { pools: string[]; start?: string } },
+  config: { [chain: string]: AaveLiquidationConfig },
   {
     pullHourly = true,
     ...otherRootOptions
   }: {
-    pullHourly?: boolean
-    [key: string]: any
+    pullHourly?: boolean
+    [key: string]: unknown
   } = {},
 ): SimpleAdapter {
Verify each finding against the current code and only fix it if needed. In `@helpers/aave/liquidations.ts` around lines 4 - 10, The aaveLiquidationsExport function lacks JSDoc and uses a loose {[key: string]: any} type for root options; add proper TypeScript interfaces (e.g., define interface ChainPools { pools: string[]; start?: string }, type Config = Record<string, ChainPools>, and interface RootOptions { pullHourly?: boolean; [extraOption: string]: unknown } or enumerate known optional fields) and update the function signature to use Config and RootOptions instead of the index-any type, then add JSDoc above aaveLiquidationsExport describing the function, its params (config: Config, rootOptions: RootOptions) and return type (SimpleAdapter) so callers and tools get accurate type and documentation.

liquidations/xena-finance/index.ts (2)
15-19: 🧹 Nitpick | 🔵 Trivial

Potential precision loss with `Number()` for large USD values.

While dividing by `1e30` mitigates precision loss for most realistic values, very large liquidations could still exceed `Number.MAX_SAFE_INTEGER` before division. Consider using BigInt arithmetic for safety.

♻️ Safer BigInt approach

 logs.forEach((log: any) => {
   // collateralValue is USD with 1e30 precision, same as GMX v1
-  dailyCollateralLiquidated.addUSDValue(Number(log.collateralValue) / 1e30)
-  dailyLiquidationVolume.addUSDValue(Number(log.size) / 1e30)
+  dailyCollateralLiquidated.addUSDValue(Number(BigInt(log.collateralValue) / BigInt(1e15)) / 1e15)
+  dailyLiquidationVolume.addUSDValue(Number(BigInt(log.size) / BigInt(1e15)) / 1e15)
 })
Verify each finding against the current code and only fix it if needed. In `@liquidations/xena-finance/index.ts` around lines 15 - 19, The loop uses Number(log.collateralValue) and Number(log.size) which can overflow before dividing by 1e30; parse those values as BigInt instead (e.g., const collateralBn = BigInt(log.collateralValue); const sizeBn = BigInt(log.size)), perform the scaling with BigInt to produce a safe decimal string (compute integer = collateralBn / 10n**30n and remainder = collateralBn % 10n**30n, build `${integer}.${pad(remainder)}`) or use a proper arbitrary-precision decimal library (e.g., decimal.js) to divide by 1e30, then pass the resulting safe Number/Decimal value into dailyCollateralLiquidated.addUSDValue and dailyLiquidationVolume.addUSDValue; update the logs.forEach handler accordingly and keep names (logs.forEach, dailyCollateralLiquidated.addUSDValue, dailyLiquidationVolume.addUSDValue, log.collateralValue, log.size) so the fix is easy to locate.
4-4: 🧹 Nitpick | 🔵 Trivial

Import vault config from fees adapter instead of hardcoding.

The hardcoded `VAULT` address can drift from `fees/xena-finance.ts`. Import the config from the existing fees adapter to maintain a single source of truth.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@liquidations/xena-finance/index.ts` at line 4, Replace the hardcoded VAULT constant with the vault address exported by the xena-finance fees adapter: remove the const VAULT = '0x...' line and import the vault value from the fees adapter module (either the exported VAULT symbol or the fees config object, e.g. feesConfig.vault) and use that imported symbol wherever VAULT is referenced in this file so the vault address remains a single source of truth.

liquidations/silo-finance-v1/index.ts (2)
59-61: ⚠️ Potential issue | 🟠 Major

Batch liquidation log reads instead of one request per silo.

Lines 59-61 currently issue unbounded parallel RPC calls; this is fragile under large silo counts. Use a single batched `options.getLogs({ targets: siloAddresses, ..., flatten: false })` or controlled concurrency.

As per coding guidelines: "Use `api.multiCall` where possible, avoid `Promise.all`. Use PromisePool for non-EVM calls".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@liquidations/silo-finance-v1/index.ts` around lines 59 - 61, The current code creates unbounded parallel RPC calls via Promise.all on siloAddresses (allEvents = Promise.all(siloAddresses.map(... options.getLogs({ target: silo, eventAbi: Liquidate })))), which is fragile; change to a single batched call by passing targets: siloAddresses to options.getLogs (e.g., options.getLogs({ targets: siloAddresses, eventAbi: Liquidate, flatten: false })) or, if batching is not available, replace the Promise.all approach with a controlled-concurrency executor such as PromisePool or use api.multiCall to limit parallelism and avoid flooding RPC providers.
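For the controlled-concurrency fallback the comment mentions, the silo list can be split into fixed-size batches before issuing requests; a generic sketch (the `chunk` helper is illustrative, not from this repo):

```typescript
// Illustrative helper: split a target list into fixed-size chunks so that
// batched log requests stay bounded instead of fanning out one call per silo.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = []
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size))
  }
  return out
}

// Six silo addresses, at most 4 targets per batched getLogs call:
console.log(chunk(['a', 'b', 'c', 'd', 'e', 'f'], 4)) // [['a','b','c','d'], ['e','f']]
```

Each chunk can then be passed as `targets` in one batched request, and a failed chunk can be retried or skipped without failing the whole fetch.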
64-66: ⚠️ Potential issue | 🔴 Critical

`dailyCollateralLiquidated` is keyed by the wrong token on Liquidate events.

Line 65 records `seizedCollateral` under `event.asset`. In Silo V1 Liquidate, `event.asset` is the debt-side asset; seized collateral must be keyed by the corresponding collateral token (resolved from silo asset storage), otherwise collateral liquidation totals are misattributed.

As per coding guidelines: "`dailyCollateralLiquidated` dimension (required) - Total USD value of collateral seized from borrowers during liquidation events".
Verify each finding against the current code and only fix it if needed. In `@liquidations/silo-finance-v1/index.ts` around lines 64 - 66, dailyCollateralLiquidated is being keyed by the debt-side asset (event.asset) instead of the collateral token; change the key to the collateral token resolved from the Silo V1 asset storage and record the seizedCollateral under that token. Specifically, replace the use of event.asset in the dailyCollateralLiquidated.add call with the collateral token obtained via the same resolver used elsewhere in this module (the function you use to map a silo asset to its collateral token), and keep passing event.seizedCollateral as the value to dailyCollateralLiquidated.add.

liquidations/fraxlend/index.ts (2)
15-18: ⚠️ Potential issue | 🟠 Major

Use factory `Create` logs (cached) for pair discovery, not `getAllPairAddresses()`.

Line 15 uses a direct address-list read, which is brittle and re-computes full discovery every run. For Fraxlend instances, enumerate pair addresses from factory creation events and set `cacheInCloud: true` on that discovery step.

Based on learnings: "For factory-deployed contracts (Euler, Silo, Fraxlend), fetch contract addresses from factory Create events with `cacheInCloud: true`, then fetch Liquidate events from each instance".

🤖 Prompt for AI Agents
25-36: ⚠️ Potential issue | 🟠 Major

Replace unbounded per-pair `Promise.all` log fetches with batched `getLogs`.

Lines 25-36 fan out one RPC request per pair and fail the entire fetch when any single request rejects. Batch log fetching via `targets` to reduce RPC pressure and improve resilience.

♻️ Proposed refactor

-  await Promise.all(
-    pairs.map(async (pair, i) => {
-      const borrowAsset = borrowAssets[i]
-      const collateralAsset = collateralAssets[i]
-      if (!borrowAsset || !collateralAsset) return
-
-      const events = await options.getLogs({ target: pair, eventAbi: Liquidate })
-      for (const event of events) {
-        dailyCollateralLiquidated.add(collateralAsset, event.collateralForLiquidator)
-      }
-    })
-  )
+  const allEvents = await options.getLogs({
+    targets: pairs,
+    eventAbi: Liquidate,
+    flatten: false,
+  })
+
+  for (let i = 0; i < pairs.length; i++) {
+    const borrowAsset = borrowAssets[i]
+    const collateralAsset = collateralAssets[i]
+    if (!borrowAsset || !collateralAsset) continue
+    for (const event of allEvents[i] ?? []) {
+      dailyCollateralLiquidated.add(collateralAsset, event.collateralForLiquidator)
+    }
+  }

As per coding guidelines: "Use `api.multiCall` where possible, avoid `Promise.all`. Use PromisePool for non-EVM calls".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@liquidations/fraxlend/index.ts` around lines 25 - 36, The current Promise.all over pairs that calls options.getLogs per-pair (pairs.map -> options.getLogs with eventAbi Liquidate) should be replaced with batched log fetches: chunk pairs into reasonable sizes and call options.getLogs once per chunk using the targets array to fetch many pair logs at once, then flatten results and iterate events to call dailyCollateralLiquidated.add(collateralAsset, event.collateralForLiquidator); ensure you map events back to their collateralAsset (e.g., keep an index-to-asset map for the chunk) and handle individual chunk failures without failing the entire operation (catch/log and continue).

liquidations/euler/index.ts (1)
5-6: 🧹 Nitpick | 🔵 Trivial

Clarify why TAC is skipped.
The current note “stalls on this chain” is too vague for maintenance. Add a concise operational reason and (if available) a tracking reference/TODO for re-enable criteria.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@liquidations/euler/index.ts` around lines 5 - 6, Update the vague comment above SKIP_CHAINS to state the concrete operational reason TAC is excluded (e.g., "Indexer stalls due to missing block timestamps", "RPC timeouts", or "incompatible contract ABI") and include a tracking reference or TODO with the issue/PR number or re-enable criteria; specifically edit the comment that precedes SKIP_CHAINS and reference CHAIN.TAC so future maintainers know why it was skipped and what must be resolved to remove it.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@helpers/aave/liquidations.ts`:
- Around line 18-26: The loop over pools currently fetches logs sequentially
causing slow performance; change it to fetch in parallel by mapping pools to
promises that call options.getLogs({ target: pool, eventAbi:
AaveAbis.LiquidationEvent }) and await Promise.all, then iterate the combined
events to call dailyCollateralLiquidated.add(e.collateralAsset,
e.liquidatedCollateralAmount); keep references to pools, options.getLogs,
AaveAbis.LiquidationEvent, and dailyCollateralLiquidated when implementing the
parallelization and aggregation.
- Around line 4-10: The aaveLiquidationsExport function lacks JSDoc and uses a
loose {[key: string]: any} type for root options; add proper TypeScript
interfaces (e.g., define interface ChainPools { pools: string[]; start?: string
}, type Config = Record<string, ChainPools>, and interface RootOptions {
pullHourly?: boolean; [extraOption: string]: unknown } or enumerate known
optional fields) and update the function signature to use Config and RootOptions
instead of the index-any type, then add JSDoc above aaveLiquidationsExport
describing the function, its params (config: Config, rootOptions: RootOptions)
and return type (SimpleAdapter) so callers and tools get accurate type and
documentation.
In `@helpers/compoundV2.ts`:
- Around line 176-182: Add a JSDoc block above the exported public helper
function compoundV2LiquidationsExport describing its purpose (exports
liquidations adapters for Compound V2 across chains), document the config
parameter shape (object mapping chain to { comptroller: string; start?: string
}), document the options object including pullHourly?: boolean and that
otherRootOptions captures additional root options, and state the return type is
SimpleAdapter; include examples/usage and tag the function as exported/public if
desired.
- Around line 219-235: The loop is calling options.getLogs for each cToken
sequentially; refactor to call options.getLogs once with { targets: cTokens,
eventAbi: LiquidateBorrowEvent, flatten: false } to fetch all logs in parallel,
then iterate the returned per-target arrays (keeping association to each cToken)
and process events the same way using collateralCToken, underlyingMap,
exchangeRateMap and dailyCollateralLiquidated (compute underlyingAmount from
event.seizeTokens and exchangeRate as before).
In `@helpers/gmx.ts`:
- Around line 165-184: Add JSDoc for the exported gmxV1LiquidationsExports
function describing its purpose, parameters (config) and returned BaseAdapter,
and create a typed interface (e.g., LiquidatePositionEvent) that reflects the
event fields (key, account, collateralToken, indexToken, isLong, size,
collateral, reserveAmount, realisedPnl, markPrice) and use that interface
instead of any in the logs.forEach and in the logs typing; update the fetch
implementation to cast logs (or each log) to LiquidatePositionEvent and keep
existing uses of dailyCollateralLiquidated and dailyLiquidationVolume but with
properly typed numeric conversions.
In `@liquidations/compound-v3/index.ts`:
- Around line 33-35: The loop over absorbCollateralEvents currently calls
dailyCollateralLiquidated.add(e.asset, e.collateralAbsorbed), which uses token
amounts; change it to use the event's USD field by replacing that call with
dailyCollateralLiquidated.addUSDValue(e.asset, e.usdValue) so liquidations are
recorded in oracle-priced USD (8-decimal) per AbsorbCollateral event; update any
related typings/imports if addUSDValue is not present on the
dailyCollateralLiquidated helper.
In `@liquidations/euler/index.ts`:
- Around line 5-6: Update the vague comment above SKIP_CHAINS to state the
concrete operational reason TAC is excluded (e.g., "Indexer stalls due to
missing block timestamps", "RPC timeouts", or "incompatible contract ABI") and
include a tracking reference or TODO with the issue/PR number or re-enable
criteria; specifically edit the comment that precedes SKIP_CHAINS and reference
CHAIN.TAC so future maintainers know why it was skipped and what must be
resolved to remove it.
In `@liquidations/fraxlend/index.ts`:
- Around line 15-18: Replace the brittle direct call that assigns pairs via
options.api.call to getAllPairAddresses() with a discovery that reads the
factory "Create" events for the Fraxlend registry (the same
registries[options.chain] factory target) and returns the created pair
addresses; ensure this discovery step is implemented as a cached discovery (set
cacheInCloud: true) so pair lists are cached across runs, and then iterate those
discovered pair addresses to fetch Liquidate events from each instance instead
of re-calling getAllPairAddresses().
- Around line 25-36: The current Promise.all over pairs that calls
options.getLogs per-pair (pairs.map -> options.getLogs with eventAbi Liquidate)
should be replaced with batched log fetches: chunk pairs into reasonable sizes
and call options.getLogs once per chunk using the targets array to fetch many
pair logs at once, then flatten results and iterate events to call
dailyCollateralLiquidated.add(collateralAsset, event.collateralForLiquidator);
ensure you map events back to their collateralAsset (e.g., keep an
index-to-asset map for the chunk) and handle individual chunk failures without
failing the entire operation (catch/log and continue).
In `@liquidations/GUIDELINES.md`:
- Around line 20-24: Fix markdownlint MD022/MD031 spacing by adding blank lines
before and after headings and fenced code blocks in GUIDELINES.md: insert an
empty line above and below the "Aave Forks and Compound V2 Forks" and "Singleton
Contracts (Morpho Blue, Compound V3)" headings and ensure the fenced block that
starts with the "typescript" snippet has a blank line before the opening ``` and
after the closing ```, and apply the same blank-line adjustments to the other
heading/code-fence groups (around the referenced lines 41-49) so that all
headings and code fences follow markdownlint spacing rules.
In `@liquidations/morpho/index.ts`:
- Around line 49-56: The adapter configuration sets pullHourly: false but this
is an EVM log-based adapter (uses getLogs in fetch for MorphoBlues); change
pullHourly to true in the adapter object so hourly pulls are enabled (update the
adapter constant where pullHourly is defined), and verify the fetch function
used for each chain in Object.fromEntries([chain, { fetch, start }]) correctly
handles hourly execution and deduplication of logs when running on an hourly
schedule.
In `@liquidations/silo-finance-v1/index.ts`:
- Around line 59-61: The current code creates unbounded parallel RPC calls via
Promise.all on siloAddresses (allEvents = Promise.all(siloAddresses.map(...
options.getLogs({ target: silo, eventAbi: Liquidate })))), which is fragile;
change to a single batched call by passing targets: siloAddresses to
options.getLogs (e.g., options.getLogs({ targets: siloAddresses, eventAbi:
Liquidate, flatten: false })) or, if batching is not available, replace the
Promise.all approach with a controlled-concurrency executor such as PromisePool
or use api.multiCall to limit parallelism and avoid flooding RPC providers.
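If batching is unavailable, the controlled-concurrency fallback can be sketched without any library; this stand-in for PromisePool caps the number of in-flight calls (names and the limit of 2 in the usage below are illustrative):

```typescript
// Map over items with at most `limit` promises in flight at once,
// preserving result order by index. Each worker claims the next index
// synchronously (single-threaded event loop, so no race) then awaits.
async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }
  const workers = Array.from({ length: Math.min(limit, items.length) }, worker);
  await Promise.all(workers);
  return results;
}
```

Used here, each `fn` call would wrap one `options.getLogs({ target: silo, eventAbi: Liquidate })`, keeping RPC pressure bounded.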
- Around line 64-66: dailyCollateralLiquidated is being keyed by the debt-side
asset (event.asset) instead of the collateral token; change the key to the
collateral token resolved from the Silo V1 asset storage and record the
seizedCollateral under that token. Specifically, replace the use of event.asset
in the dailyCollateralLiquidated.add call with the collateral token obtained via
the same resolver used elsewhere in this module (the function you use to map a
silo asset to its collateral token), and keep passing event.seizedCollateral as
the value to dailyCollateralLiquidated.add.
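The re-keying can be expressed as a small pure step before the balances are recorded; `resolveCollateralToken` below is a hypothetical stand-in for the module's existing silo-asset resolver:

```typescript
// A Liquidate event carries the debt-side asset plus the seized amount.
type LiquidateEvent = { asset: string; seizedCollateral: bigint };

// Sum seized collateral keyed by the resolved collateral token,
// not by the debt-side event.asset.
function sumByCollateral(
  events: LiquidateEvent[],
  resolveCollateralToken: (siloAsset: string) => string,
): Map<string, bigint> {
  const totals = new Map<string, bigint>();
  for (const e of events) {
    const token = resolveCollateralToken(e.asset); // collateral token, not debt asset
    totals.set(token, (totals.get(token) ?? 0n) + e.seizedCollateral);
  }
  return totals;
}
```

Each `[token, total]` entry would then be passed to `dailyCollateralLiquidated.add`.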
In `@liquidations/xena-finance/index.ts`:
- Around line 15-19: The loop uses Number(log.collateralValue) and
Number(log.size) which can overflow before dividing by 1e30; parse those values
as BigInt instead (e.g., const collateralBn = BigInt(log.collateralValue); const
sizeBn = BigInt(log.size)), perform the scaling with BigInt to produce a safe
decimal string (compute integer = collateralBn / 10n**30n and remainder =
collateralBn % 10n**30n, build `${integer}.${pad(remainder)}`) or use a proper
arbitrary-precision decimal library (e.g., decimal.js) to divide by 1e30, then
pass the resulting safe Number/Decimal value into
dailyCollateralLiquidated.addUSDValue and dailyLiquidationVolume.addUSDValue;
update the logs.forEach handler accordingly and keep names (logs.forEach,
dailyCollateralLiquidated.addUSDValue, dailyLiquidationVolume.addUSDValue,
log.collateralValue, log.size) so the fix is easy to locate.
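A self-contained sketch of the BigInt scaling; the 30-decimal fixed-point assumption comes from the review note above, and the helper name is illustrative:

```typescript
// Scale a raw bigint amount down by 10^decimals into a decimal string
// without intermediate Number overflow or precision loss.
function toDecimalString(raw: bigint, decimals: number): string {
  const base = 10n ** BigInt(decimals);
  const integer = raw / base;
  const fraction = raw % base;
  if (fraction === 0n) return integer.toString();
  // pad the fractional part to full width, then drop trailing zeros
  const padded = fraction.toString().padStart(decimals, '0').replace(/0+$/, '');
  return `${integer}.${padded}`;
}
```

The handler would then pass `Number(toDecimalString(BigInt(log.collateralValue), 30))` into `addUSDValue`; the final Number conversion is safe because the scaled USD value is far below `Number.MAX_SAFE_INTEGER`, whereas `Number(log.collateralValue)` loses precision before the division.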
- Line 4: Replace the hardcoded VAULT constant with the vault address exported
by the xena-finance fees adapter: remove the const VAULT = '0x...' line and
import the vault value from the fees adapter module (either the exported VAULT
symbol or the fees config object, e.g. feesConfig.vault) and use that imported
symbol wherever VAULT is referenced in this file so the vault address remains a
single source of truth.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 65fa2e60-7547-47e3-9fd8-b7f3ad84db81
📒 Files selected for processing (11)
- adapters/types.ts
- helpers/aave/liquidations.ts
- helpers/compoundV2.ts
- helpers/gmx.ts
- liquidations/GUIDELINES.md
- liquidations/compound-v3/index.ts
- liquidations/euler/index.ts
- liquidations/fraxlend/index.ts
- liquidations/morpho/index.ts
- liquidations/silo-finance-v1/index.ts
- liquidations/xena-finance/index.ts
Hi @RohanNero, can you try to track Nado? https://github.com/nadohq/nado-contracts