
feat(kasu): add XDC chain (AUDD + USDC deployments) with on-chain TVL reads #18754

Open
kivanov82 wants to merge 12 commits into DefiLlama:main from Kasu-Finance:feat/add-xdc-and-onchain-tvl

Conversation

@kivanov82
Contributor

@kivanov82 kivanov82 commented Apr 14, 2026

Summary

  • Adds XDC chain to the Kasu adapter with two separate deployments:
    • AUDD deployment (denominated in AUDD stablecoin)
    • USDC deployment (denominated in USDC)
  • Uses on-chain reads for pool TVL (totalSupply() per pool) plus an external TVL registry (externalTVLOfPool(address)) for off-balance-sheet exposure.
  • Refactors CONFIG to support multiple deployments per chain.
  • Pools are discovered via chain-specific subgraphs.
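The multi-deployment CONFIG shape described above could be sketched roughly as follows. Every address, subgraph URL, and the `externalTVL` field name here is a placeholder for illustration, not the value used by the actual adapter:

```javascript
// Hypothetical sketch of a CONFIG keyed by chain, where each chain maps
// to an array of deployment objects instead of a single object.
// Addresses and URLs are made-up placeholders.
const CONFIG = {
  base: [
    {
      key: '',  // single deployment: no suffix needed in the cache key
      graphURL: 'https://example.com/subgraphs/kasu-base',
      asset: '0xUSDC_on_base',                    // placeholder token address
      externalTVL: '0xExternalTVLRegistry_base',  // placeholder registry address
    },
  ],
  xdc: [
    {
      key: 'audd',  // AUDD-denominated deployment
      graphURL: 'https://example.com/subgraphs/kasu-xdc-audd',
      asset: '0xAUDD_on_xdc',
      externalTVL: '0xExternalTVLRegistry_xdc',
    },
    {
      key: 'usdc',  // USDC-denominated deployment
      graphURL: 'https://example.com/subgraphs/kasu-xdc-usdc',
      asset: '0xUSDC_on_xdc',
      externalTVL: '0xExternalTVLRegistry_xdc',
    },
  ],
}
```

With this shape the TVL function can simply loop over `CONFIG[chain]`, and adding a third deployment to a chain means appending one more object to its array.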

Test plan

  • node test.js projects/kasu/index.js completes successfully across base, xdc, and plume_mainnet.
  • XDC AUDD deployment reports pool TVL denominated in AUDD.
  • XDC USDC deployment config in place; will populate once USDC pools are deployed by the pool manager.

Summary by CodeRabbit

  • New Features

    • Added support for multiple subgraph deployments per chain, enabling diversified asset tracking per network
    • Extended XDC chain support with AUDD and USDC asset deployments
  • Refactor

    • Restructured deployment configuration and caching mechanism to handle multiple deployments independently

kivanov82 and others added 12 commits November 14, 2025 15:42
- Add XDC chain support with Goldsky subgraph and ExternalTVL
- Switch from subgraph balance reads to on-chain totalSupply()
  for accurate historic data when DefiLlama backfills
- Subgraph now used only for pool address discovery
- Add ExternalTVL support for XDC (same pattern as Base)
- Update Base subgraph to v1.0.13

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Adds the second XDC deployment (USDC) alongside the existing AUDD
deployment, and corrects the AUDD pool to be denominated in AUDD
(was mis-attributed to USDC.e). Refactors CONFIG to support multiple
deployments per chain.
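The flow these commits describe — subgraph for discovery only, on-chain `totalSupply()` for balances, SDK for summation — might be sketched like this. The `api` object, helper names, and parameters are all illustrative stand-ins, not the adapter's real code (the actual adapter uses the DefiLlama SDK's chain API):

```javascript
// Illustrative per-deployment TVL flow: the subgraph is queried only to
// discover pool addresses, totalSupply() is read on-chain per pool, and
// the raw amounts are handed to the SDK to sum. `discoverPools` and
// `readSupplies` are hypothetical stand-ins for the subgraph query and
// the on-chain multicall.
async function tvlForDeployment(api, { graphURL, asset }, discoverPools, readSupplies) {
  const poolAddresses = await discoverPools(graphURL)  // subgraph: discovery only
  const supplies = await readSupplies(poolAddresses)   // on-chain totalSupply() per pool
  api.add(asset, supplies)                             // SDK sums the raw amounts
}
```

Reading balances on-chain rather than from the subgraph matters for backfills: a historical block can be queried directly, whereas the subgraph only reflects its own indexed state.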
@coderabbitai
Contributor

coderabbitai bot commented Apr 14, 2026

📝 Walkthrough


The pull request refactors the Kasu project's configuration structure from single objects per chain to arrays of deployment objects, enabling multiple subgraph deployments under the same chain. It introduces per-deployment cache keys and updates the TVL computation flow to iterate through all configured deployments. The XDC chain configuration is expanded to support two distinct deployments for AUDD and USDC assets.

Changes

Cohort / File(s) — Summary

• Configuration & TVL Flow Refactoring — projects/kasu/index.js: Restructured CONFIG entries into deployment arrays with per-deployment key fields; updated the cache lookup to 'kasu/' + chain + (key ? '-' + key : '') so each deployment caches independently; refactored the tvl function to iterate over the CONFIG[chain] deployments, extracting and processing configuration per deployment; expanded the XDC chain to two deployments (AUDD and USDC) with distinct graphURLs, external contracts, and asset mappings.
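The per-deployment cache key quoted in the walkthrough is a plain string concatenation; as a standalone sketch (the helper name is illustrative):

```javascript
// Builds the cache key for a deployment, matching the expression quoted
// in the walkthrough: 'kasu/' + chain + (key ? '-' + key : '')
function cacheKey(chain, key) {
  return 'kasu/' + chain + (key ? '-' + key : '')
}
```

A deployment with an empty key falls back to the old chain-level key (`kasu/base`), so single-deployment chains keep their existing cache entries, while keyed deployments like XDC's get distinct entries (`kasu/xdc-audd`, `kasu/xdc-usdc`).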

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes


Poem

🐰 Deployments now in arrays we keep,
One chain, one cache, one deeper sweep,
AUDD and USDC side by side,
TVL flows computed with multiplied pride!

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
• Title check — ✅ Passed. The title clearly summarizes the main change: adding XDC chain support with AUDD and USDC deployments using on-chain TVL reads, which aligns with the primary objective of the changeset.
• Description check — ✅ Passed. The PR description provides a clear summary of changes, test plan, and implementation details; however, it does not follow the template structure required by the repository for new protocol listings.
• Docstring coverage — ✅ Passed. No functions were found in the changed files to evaluate, so the docstring coverage check was skipped.



@github-actions

The adapter at projects/kasu exports TVL:

base                      8.90 M
xdc                       4.00
plume_mainnet             0.00

total                    8.90 M 

Contributor

@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (1)
projects/kasu/index.js (1)

44-46: Potential precision loss when converting large token amounts to Number.

Number() can only safely represent integers up to 2^53-1. For tokens with 18 decimals, pools with >9M USD equivalent could lose precision. The DefiLlama SDK's api.add() handles arrays of BigInt strings natively without this issue.

♻️ Suggested simplification that avoids precision loss
-    const supplies = await api.multiCall({ abi: 'function totalSupply() view returns (uint256)', calls: poolAddresses, permitFailure: true });
-    const poolTvl = supplies.reduce((sum, s) => sum + Number(s || 0), 0);
-    api.addTokens(asset, poolTvl)
+    const supplies = await api.multiCall({ abi: 'function totalSupply() view returns (uint256)', calls: poolAddresses, permitFailure: true });
+    api.add(asset, supplies)

This lets the SDK handle summation internally, preserving BigInt precision and simplifying the code.
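The precision loss this comment describes is easy to reproduce in plain Node, independent of the SDK; the token amounts below are made up for illustration:

```javascript
// Two raw totalSupply() values for an 18-decimal token, as strings (the
// form an RPC call returns them in). Each is ~9M tokens, i.e. ~9e24 raw
// units — far above Number.MAX_SAFE_INTEGER (2^53 - 1 ≈ 9.007e15).
const supplies = ['9000000000000000000000001', '9000000000000000000000001']

// Number-based summation (the pattern flagged above): the trailing
// ...01 digits are silently rounded away.
const asNumber = supplies.reduce((sum, s) => sum + Number(s || 0), 0)

// BigInt-based summation: exact at any magnitude.
const asBigInt = supplies.reduce((sum, s) => sum + BigInt(s || 0), 0n)

// asBigInt is exactly 18000000000000000000000002n, while BigInt(asNumber)
// differs because each term was rounded before the addition.
```

This is why passing the raw string values straight to the SDK (which sums them as BigInts) is safer than reducing them as JavaScript Numbers.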

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@projects/kasu/index.js` around lines 44 - 46, The current code converts
multiCall results to JS Numbers (supplies -> Number(s || 0)) causing precision
loss for large token amounts; instead pass the raw string/BigInt values to the
SDK so it can sum precisely—replace the manual reduce that builds poolTvl with a
direct call to the SDK aggregation method (use api.add or api.addTokens with the
supplies array) and stop using Number() conversion; locate the multiCall result
variable supplies, the reduce that creates poolTvl, and the api.addTokens call
and change the flow so the SDK receives the supplies array (or strings) directly
for accurate big-int summation.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 12d4c953-a4f8-421f-bcc0-cd54f69a47dc

📥 Commits

Reviewing files that changed from the base of the PR and between 5eae613 and a410bdf.

📒 Files selected for processing (1)
  • projects/kasu/index.js

