Add Gifted Markets volume adapter #5788
Conversation
Fees tracking for groypfi.io.
⚠️ Warning: Rate limit exceeded
⌛ How to resolve this issue? After the wait time has elapsed, a review can be triggered. We recommend spacing out your commits to avoid hitting the rate limit.
🚦 How do rate limits work? CodeRabbit enforces hourly rate limits for each developer per organization. Paid plans have higher rate limits than the trial, open-source, and free plans; in all cases, further reviews are re-allowed after a brief timeout. Please see the FAQ for further information.
📒 Files selected for processing (1)
📝 Walkthrough
Adds four new volume adapter modules: three TON-based groypfi adapters (aggregator-derivatives, bridge-aggregators, dexs) and one Gifted Markets adapter, each fetching price data, retrieving/filtering transactions or API responses, aggregating daily volumes, and exporting SimpleAdapter or fetch interfaces.
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 6
🤖 Fix all issues with AI agents
In `@aggregator-derivatives/groypfi/index.ts`:
- Around line 14-24: The current single-page fetch of transactions (httpGet call
assigning txResponse) can miss transactions beyond limit=1000; replace with a
paginated loop that repeatedly calls httpGet for the PERPS_FEE_WALLET
transactions using the TonAPI logical-time cursor (after_lt or before_lt) until
all pages in the [startTimestamp, endTimestamp] window are processed. In
practice: initialize a cursor variable, call the same endpoint adding the cursor
query param, on each page iterate over txResponse.transactions (or []), apply
the existing timestamp and success filters (using startTimestamp/endTimestamp),
accumulate amounts into dailyVolumeNano as you do now, then set the cursor from
the last transaction’s lt/after_lt token returned (or use the last transaction’s
utime/lt) and continue; stop when no new transactions are returned or when the
earliest tx.utime on a page is < startTimestamp to avoid extra pages. Ensure
robust handling if txResponse.transactions is missing and convert values to
BigInt consistently.
- Around line 16-26: dailyVolumeNano is converted to Number which can lose
precision for large BigInt values; update the conversion used to compute
dailyVolumeUSD to avoid BigInt→Number precision loss by using a
decimal/arbitrary-precision library (e.g., decimal.js, big.js) or by safely
splitting the BigInt into integer and fractional parts: compute integerTons =
dailyVolumeNano / 1_000_000_000n and fracNano = dailyVolumeNano %
1_000_000_000n, then perform the arithmetic in a decimal/big-number type (or as
Number only after converting both parts through the big-number lib) to produce
dailyVolumeUSD; locate and change the logic around dailyVolumeNano and
dailyVolumeUSD in the loop and conversion to use the chosen safe decimal
approach.
In `@bridge-aggregators/groypfi/index.ts`:
- Around line 16-26: dailyVolumeNano → dailyVolumeUSD conversion currently uses
Number(dailyVolumeNano) which silently loses precision; instead perform the USD
calculation with integer (BigInt) fixed-point arithmetic or a decimal library.
For a minimal fix, convert tonPriceUSD into a scaled integer (e.g., scaledPrice
= BigInt(Math.round(tonPriceUSD * 1e6))) and compute the USD value as
(dailyVolumeNano * scaledPrice) / BigInt(1e9 * 1e6) (or return a string/BigInt
representing micro-USD) rather than calling Number(dailyVolumeNano); update the
code that sets dailyVolumeUSD and any consumers of that value to accept the
precise result (or use a decimal library like decimal.js / big.js to compute
dailyVolumeUSD from dailyVolumeNano and tonPriceUSD). Ensure changes reference
dailyVolumeNano, dailyVolumeUSD, tonPriceUSD and txResponse.transactions.
- Around line 14-24: The current loop fetching transactions with a single
httpGet using `limit=1000` (response assigned to `txResponse`) can miss older
transactions; implement LT-based pagination: repeatedly call `httpGet` for
`${TON_API}/blockchain/accounts/${FEE_WALLET}/transactions?limit=1000&before_lt=<minLt>`
where `<minLt>` is the smallest `lt` seen in the last `txResponse`, merge each
page's `transactions` into processing, stop when a page is empty or when all
returned `tx.utime` are < `startTimestamp` (or no new smaller `lt`), and update
the existing logic that increments `dailyVolumeNano` (using `tx.in_msg.value`)
to run across all pages. Ensure you preserve checks against `startTimestamp`,
`endTimestamp`, and `tx.success` and convert `tx.in_msg.value` to BigInt as
before.
In `@dexs/groypfi/index.ts`:
- Around line 14-24: The current single httpGet call fetching txResponse with
limit=1000 can miss transactions; modify the logic around txResponse/httpGet for
SWAP_REFERRER to paginate using TON API's before_lt/after_lt parameters:
repeatedly call httpGet with updated before_lt (using the smallest lt from the
last page) to fetch subsequent pages, accumulate transactions into the same loop
that updates dailyVolumeNano, and stop when a page returns no transactions or
when the oldest tx.utime is < startTimestamp (also break early if all tx.utime
values are > endTimestamp or < startTimestamp as appropriate). Ensure you
preserve existing filters (tx.success and tx.in_msg?.value) and use BigInt
conversion exactly as done now while merging results across pages.
- Around line 16-26: dailyVolumeNano is converted to Number before scaling,
risking precision loss; compute the TON integer and fractional parts using
BigInt first, validate the integer part against Number.MAX_SAFE_INTEGER, and
only then convert to Number for final USD math. Specifically: derive wholeTons =
dailyVolumeNano / 1_000_000_000n and remainderNano = dailyVolumeNano %
1_000_000_000n (use these names), check if wholeTons <=
BigInt(Number.MAX_SAFE_INTEGER); if yes compute dailyVolumeUSD =
(Number(wholeTons) + Number(remainderNano) / 1e9) * tonPriceUSD, otherwise
handle overflow explicitly (e.g., log/throw or use a string/BigDecimal path) so
txResponse.transactions -> dailyVolumeNano -> dailyVolumeUSD conversion is safe.
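The scaled fixed-point approach suggested in the bridge-aggregators prompt above can be sketched as follows. The helper name `toMicroUsd` and the example price are illustrative assumptions, not the adapter's actual code; only the 1e9 nanoton scale and the 1e6 price scale come from the prompt:

```javascript
// Sketch: BigInt fixed-point USD conversion that never calls Number(dailyVolumeNano).
// The float price is scaled to an integer (6 decimal places), all arithmetic
// stays in BigInt, and only the final micro-USD value is (optionally) narrowed.
const NANO_PER_TON = 1_000_000_000n; // 1 TON = 1e9 nanotons
const PRICE_SCALE = 1_000_000;       // price kept to 6 decimal places

function toMicroUsd(dailyVolumeNano, tonPriceUSD) {
  const scaledPrice = BigInt(Math.round(tonPriceUSD * PRICE_SCALE));
  // nanotons * scaledPrice is (nano-TON x micro-USD-per-TON);
  // dividing by 1e9 converts nanotons to TON, leaving micro-USD.
  return (dailyVolumeNano * scaledPrice) / NANO_PER_TON;
}

// Example: 2.5 TON at $3.20/TON -> 8,000,000 micro-USD ($8.00)
const microUsd = toMicroUsd(2_500_000_000n, 3.2);
console.log(microUsd);               // 8000000n
console.log(Number(microUsd) / 1e6); // 8
```

The result stays exact even when the nanoton total is far beyond `Number.MAX_SAFE_INTEGER`, which is the failure mode the review is guarding against.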
```ts
const txResponse = await httpGet(`${TON_API}/blockchain/accounts/${PERPS_FEE_WALLET}/transactions?limit=1000`);

let dailyVolumeNano = 0n;

for (const tx of txResponse.transactions || []) {
  if (tx.utime < startTimestamp || tx.utime > endTimestamp) continue;
  if (!tx.success) continue;
  if (tx.in_msg?.value > 0) {
    dailyVolumeNano += BigInt(tx.in_msg.value) * 100n;
  }
}
```
🧩 Analysis chain
🌐 Web query: TONAPI v2 /blockchain/accounts transactions pagination limit maximum
💡 Result:
For the TonAPI v2 endpoint GET /v2/blockchain/accounts/{account_id}/transactions:
- The `limit` query param has minimum 1, default 100, maximum 1000, so the max page size is 1000 transactions per request. [1]
- Pagination is done with `after_lt`/`before_lt` (logical-time cursors), and ordering is controlled with `sort_order` (`desc` by default, or `asc`). [1]

Source: TonAPI/OpenTonAPI OpenAPI spec (openapi.yml). [1]
Add pagination to TONAPI transactions query; limit=1000 may undercount daily volume.
The TonAPI v2 endpoint supports pagination via after_lt/before_lt logical time cursors. The current code fetches only limit=1000 without implementing pagination. If the fee wallet processes more than 1,000 transactions in a single day, the volume calculation will be incomplete. Implement cursor-based pagination, iterating through all transactions within the [startTimestamp, endTimestamp] range.
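To make the recommendation concrete, a cursor loop could look like the sketch below. `fetchPage` is a hypothetical stand-in for the adapter's `httpGet` call with `before_lt` appended (per the TonAPI spec result above); the field names (`utime`, `lt`, `success`, `in_msg.value`) mirror the excerpt, but this is an illustration, not the adapter's code:

```javascript
// Sketch: LT-cursor pagination over TonAPI account transactions.
// Pages arrive newest-first (sort_order=desc); we walk backwards with
// before_lt until a page is empty or entirely older than startTimestamp.
async function sumDailyVolumeNano(fetchPage, startTimestamp, endTimestamp) {
  let dailyVolumeNano = 0n;
  let beforeLt; // undefined on the first request -> latest transactions

  while (true) {
    // e.g. httpGet(`${TON_API}/...?limit=1000${beforeLt ? `&before_lt=${beforeLt}` : ""}`)
    const page = await fetchPage(beforeLt);
    const txs = page.transactions || [];
    if (txs.length === 0) break;

    for (const tx of txs) {
      if (tx.utime < startTimestamp || tx.utime > endTimestamp) continue;
      if (!tx.success) continue;
      // (the adapters additionally apply a fee->volume multiplier, e.g. * 100n)
      if (tx.in_msg?.value > 0) dailyVolumeNano += BigInt(tx.in_msg.value);
    }

    const oldest = txs[txs.length - 1];
    if (oldest.utime < startTimestamp) break; // past the window, stop paging
    beforeLt = oldest.lt;                      // cursor for the next page
  }
  return dailyVolumeNano;
}
```

A mock `fetchPage` returning two pages is enough to exercise the cursor handoff and the early-stop condition.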
```ts
let dailyVolumeNano = 0n;

for (const tx of txResponse.transactions || []) {
  if (tx.utime < startTimestamp || tx.utime > endTimestamp) continue;
  if (!tx.success) continue;
  if (tx.in_msg?.value > 0) {
    dailyVolumeNano += BigInt(tx.in_msg.value) * 100n;
  }
}

const dailyVolumeUSD = (Number(dailyVolumeNano) / 1e9) * tonPriceUSD;
```
🧩 Analysis chain
🌐 Web query: JavaScript BigInt to Number conversion precision loss Number.MAX_SAFE_INTEGER
💡 Result:
A JavaScript number is an IEEE-754 double, so it can represent integers exactly only up to ±(2^53 − 1), exposed as `Number.MAX_SAFE_INTEGER === 9007199254740991`. That's why converting a BigInt larger than `Number.MAX_SAFE_INTEGER` to a number can silently round (lose integer precision):

```js
const b = 9007199254740993n; // MAX_SAFE_INTEGER + 2
Number(b);                   // 9007199254740992 (rounded)
```

MDN explicitly warns that coercing BigInt ↔ Number can lose precision, and that mixed BigInt/Number arithmetic requires explicit conversion.
Practical rules:
- If `-Number.MAX_SAFE_INTEGER <= b <= Number.MAX_SAFE_INTEGER`, `Number(b)` is exact; otherwise precision may be lost.
- Prefer keeping values as BigInt (or strings) when values above 2^53 − 1 are possible.

🌐 Web query: JavaScript Number.MAX_SAFE_INTEGER limit 2^53 BigInt precision
💡 Result:
- Why the limit is ~2^53: JavaScript Number values are IEEE-754 double-precision (binary64) floats with 53 bits of integer precision (52 stored fraction bits + 1 implicit leading bit). All integers are exactly representable only up to 2^53; beyond that, the spacing between representable integers becomes 2, 4, 8, …, so integers start being skipped.
- `Number.MAX_SAFE_INTEGER`: the largest integer that can be represented exactly and compared correctly as a Number is 2^53 − 1 = 9,007,199,254,740,991. Past this, distinct integers can round to the same Number (e.g. `MAX_SAFE_INTEGER + 1 === MAX_SAFE_INTEGER + 2` is `true`).
- Use BigInt for exact large integers: BigInt is an arbitrary-precision integer type intended for values beyond the safe-integer range; create one with a trailing `n` (`123n`) or `BigInt(...)`.
- Note: you generally can't mix Number and BigInt in arithmetic without explicit conversion (e.g. `1n + 1` throws).

Example:

```js
Number.MAX_SAFE_INTEGER;                                     // 9007199254740991
Number.MAX_SAFE_INTEGER + 1 === Number.MAX_SAFE_INTEGER + 2; // true (precision loss)
const x = BigInt(Number.MAX_SAFE_INTEGER);
x + 1n === x + 2n;                                           // false (exact)
```

Citations:
- 1: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number/MAX_SAFE_INTEGER
- 2: https://developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/BigInt
- 3: https://en.wikipedia.org/wiki/Double-precision_floating-point_format
- 4: https://developer.mozilla.org/docs/Web/JavaScript/Guide/Numbers_and_dates
Guard against BigInt→Number precision loss in USD conversion.
JavaScript's Number type can represent integers exactly only up to 2^53−1 (~9 million TON in nanotons). Converting BigInt values beyond this limit silently loses precision. For a financial data aggregator, this creates a hidden risk: on sufficiently high-volume days, Number(dailyVolumeNano) will round incorrectly, producing inaccurate USD values with no error raised.
Apply the proposed fix to convert safely:
Safe conversion pattern

```diff
- const dailyVolumeUSD = (Number(dailyVolumeNano) / 1e9) * tonPriceUSD;
+ const NANO_PER_TON = 1_000_000_000n;
+ const wholeTons = dailyVolumeNano / NANO_PER_TON;
+ if (wholeTons > BigInt(Number.MAX_SAFE_INTEGER)) {
+   throw new Error(`dailyVolumeNano exceeds JS safe range: ${dailyVolumeNano}`);
+ }
+ const dailyVolumeTON =
+   Number(wholeTons) + Number(dailyVolumeNano % NANO_PER_TON) / 1e9;
+ const dailyVolumeUSD = dailyVolumeTON * tonPriceUSD;
```
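The same pattern as a self-contained, runnable sketch (the helper name `nanoToTon` is illustrative, not from the adapter):

```javascript
// Sketch: split a nanoton BigInt into whole TON + fractional nanotons before
// any Number conversion, guarding the whole-TON part against 2^53 - 1.
const NANO_PER_TON = 1_000_000_000n;

function nanoToTon(dailyVolumeNano) {
  const wholeTons = dailyVolumeNano / NANO_PER_TON;
  if (wholeTons > BigInt(Number.MAX_SAFE_INTEGER)) {
    throw new Error(`volume exceeds JS safe range: ${dailyVolumeNano}`);
  }
  // Both parts are now small enough to convert exactly.
  return Number(wholeTons) + Number(dailyVolumeNano % NANO_PER_TON) / 1e9;
}

console.log(nanoToTon(1_500_000_000n)); // 1.5
```

The USD figure then follows as `nanoToTon(dailyVolumeNano) * tonPriceUSD`, with the overflow case surfaced as an error instead of silent rounding.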
```ts
const txResponse = await httpGet(`${TON_API}/blockchain/accounts/${FEE_WALLET}/transactions?limit=1000`);

let dailyVolumeNano = 0n;

for (const tx of txResponse.transactions || []) {
  if (tx.utime < startTimestamp || tx.utime > endTimestamp) continue;
  if (!tx.success) continue;
  if (tx.in_msg?.value > 0) {
    dailyVolumeNano += BigInt(tx.in_msg.value) * 200n;
  }
}
```
Implement pagination for TONAPI transactions; limit=1000 can undercount daily volume.
If the address has more than 1,000 transactions in the time window, you'll miss volume. Loop through pages using TONAPI's LT-based pagination: retrieve the smallest lt from each response and pass it as before_lt in the next request until the page is empty or you pass startTimestamp.
```ts
let dailyVolumeNano = 0n;

for (const tx of txResponse.transactions || []) {
  if (tx.utime < startTimestamp || tx.utime > endTimestamp) continue;
  if (!tx.success) continue;
  if (tx.in_msg?.value > 0) {
    dailyVolumeNano += BigInt(tx.in_msg.value) * 200n;
  }
}

const dailyVolumeUSD = (Number(dailyVolumeNano) / 1e9) * tonPriceUSD;
```
Guard against silent BigInt→Number precision loss in USD conversion.
Number(dailyVolumeNano) silently loses precision once values exceed 2^53−1 (9,007,199,254,740,991), which can happen on high-activity days when accumulated nanoton volumes exceed ~9 million TON. This distorts the reported volume without throwing an error.
🔧 Proposed solution to ensure precision

```diff
- const dailyVolumeUSD = (Number(dailyVolumeNano) / 1e9) * tonPriceUSD;
+ const NANO_PER_TON = 1_000_000_000n;
+ const wholeTons = dailyVolumeNano / NANO_PER_TON;
+ if (wholeTons > BigInt(Number.MAX_SAFE_INTEGER)) {
+   throw new Error(`dailyVolumeNano exceeds JS safe range: ${dailyVolumeNano}`);
+ }
+ const dailyVolumeTON =
+   Number(wholeTons) + Number(dailyVolumeNano % NANO_PER_TON) / 1e9;
+ const dailyVolumeUSD = dailyVolumeTON * tonPriceUSD;
```
```ts
const txResponse = await httpGet(`${TON_API}/blockchain/accounts/${SWAP_REFERRER}/transactions?limit=1000`);

let dailyVolumeNano = 0n;

for (const tx of txResponse.transactions || []) {
  if (tx.utime < startTimestamp || tx.utime > endTimestamp) continue;
  if (!tx.success) continue;
  if (tx.in_msg?.value > 0) {
    dailyVolumeNano += BigInt(tx.in_msg.value) * 100n;
  }
}
```
Add pagination to TONAPI transactions; limit=1000 will miss high-volume days.
This will miss volume on days with >1,000 txs for the referrer wallet. Use the before_lt and after_lt parameters to paginate through results, stopping when transactions fall outside the [startTimestamp, endTimestamp] range.
```ts
let dailyVolumeNano = 0n;

for (const tx of txResponse.transactions || []) {
  if (tx.utime < startTimestamp || tx.utime > endTimestamp) continue;
  if (!tx.success) continue;
  if (tx.in_msg?.value > 0) {
    dailyVolumeNano += BigInt(tx.in_msg.value) * 100n;
  }
}

const dailyVolumeUSD = (Number(dailyVolumeNano) / 1e9) * tonPriceUSD;
```
Guard against BigInt→Number precision loss in USD conversion.
The code converts dailyVolumeNano to Number without protection, which silently loses precision for values exceeding Number.MAX_SAFE_INTEGER (2^53 − 1 ≈ 9.0 × 10^15). This threshold equals roughly 9 million TON (~$31.5M), a realistic scenario for active DEX volume. The precision loss will go undetected and produce inaccurate USD volumes.
Divide by the nanoTON scale before converting to Number, and validate the whole-TON amount is within safe range:
Proposed fix

```diff
- const dailyVolumeUSD = (Number(dailyVolumeNano) / 1e9) * tonPriceUSD;
+ const NANO_PER_TON = 1_000_000_000n;
+ const wholeTons = dailyVolumeNano / NANO_PER_TON;
+ if (wholeTons > BigInt(Number.MAX_SAFE_INTEGER)) {
+   throw new Error(`dailyVolumeNano exceeds JS safe range: ${dailyVolumeNano}`);
+ }
+ const dailyVolumeTON =
+   Number(wholeTons) + Number(dailyVolumeNano % NANO_PER_TON) / 1e9;
+ const dailyVolumeUSD = dailyVolumeTON * tonPriceUSD;
```
🤖 Prompt for AI Agents
In `@dexs/groypfi/index.ts` around lines 16 - 26, dailyVolumeNano is converted to
Number before scaling, risking precision loss; compute the TON integer and
fractional parts using BigInt first, validate the integer part against
Number.MAX_SAFE_INTEGER, and only then convert to Number for final USD math.
Specifically: derive wholeTons = dailyVolumeNano / 1_000_000_000n and
remainderNano = dailyVolumeNano % 1_000_000_000n (use these names), check if
wholeTons <= BigInt(Number.MAX_SAFE_INTEGER); if yes compute dailyVolumeUSD =
(Number(wholeTons) + Number(remainderNano) / 1e9) * tonPriceUSD, otherwise
handle overflow explicitly (e.g., log/throw or use a string/BigDecimal path) so
txResponse.transactions -> dailyVolumeNano -> dailyVolumeUSD conversion is safe.
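The suggested conversion can be checked in isolation. Below is a standalone sketch of that logic as a helper function; the names `dailyVolumeNano`, `tonPriceUSD`, and `NANO_PER_TON` are taken from the diff above, while `nanoToUSD` itself is just a wrapper introduced here for testing.

```javascript
// Sketch of the suggested nano -> USD conversion, split into BigInt
// integer/remainder parts before any Number() cast so precision is only
// lost in the sub-TON remainder (< 1e9 nanotons).
const NANO_PER_TON = 1_000_000_000n;

function nanoToUSD(dailyVolumeNano, tonPriceUSD) {
  const wholeTons = dailyVolumeNano / NANO_PER_TON;
  if (wholeTons > BigInt(Number.MAX_SAFE_INTEGER)) {
    throw new Error(`dailyVolumeNano exceeds JS safe range: ${dailyVolumeNano}`);
  }
  const dailyVolumeTON =
    Number(wholeTons) + Number(dailyVolumeNano % NANO_PER_TON) / 1e9;
  return dailyVolumeTON * tonPriceUSD;
}

console.log(nanoToUSD(2_500_000_000n, 3.5)); // 2.5 TON * $3.5 -> 8.75
```

The overflow guard only fires above ~9e15 TON, far beyond any realistic daily volume, so in practice it documents the assumption rather than changing results.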
Wrong category
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@dexs/giftedMarkets.js`:
- Around line 5-7: The url line in async function fetch uses ${API_BASE} outside
of a template string, causing a syntax error; update the url assignment in fetch
to build the string correctly (e.g., use a template literal with backticks
around the whole URL or concatenate API_BASE with the path) so that the call to
axios.get(url, {...}) receives a valid string; check the variable API_BASE and
the const url declaration to ensure they reference the correct symbol names.
- Around line 25-27: The object being exported (module.exports) contains an
invalid inline token "(UTC midnight as unix)" next to the start property which
breaks JS parsing; remove that inline text from the exported object and instead
add it as a proper comment above or beside the start field (e.g., convert to //
UTC midnight as unix) so the start: 1769904000 remains a valid property; locate
the start property in the exported object in giftedMarkets.js and replace the
inline note with a comment.
dexs/giftedMarkets.js
Outdated

```js
async function fetch(timestamp) {
  const url = ${API_BASE}/defillama/volume;
  const r = await axios.get(url, {
```
Fix template literal syntax for url.
Line 6 won’t parse; ${API_BASE} must be inside a template string (or concatenated).
🛠️ Proposed fix

```diff
- const url = ${API_BASE}/defillama/volume;
+ const url = `${API_BASE}/defillama/volume`;
```

🧰 Tools
🪛 Biome (2.3.13)
[error] 6-6: Expected a semicolon or an implicit semicolon after a statement, but found none
An explicit or implicit semicolon is expected here...
...Which is required to end this statement
(parse)
[error] 6-6: Invalid regex flag
This is not a valid regex flag.
(parse)
[error] 6-6: Invalid regex flag
This is not a valid regex flag.
(parse)
[error] 6-6: Invalid regex flag.
The 'u' and 'v' regular expression flags cannot be enabled at the same time.
(parse)
[error] 6-6: Invalid regex flag
This is not a valid regex flag.
(parse)
🤖 Prompt for AI Agents
In `@dexs/giftedMarkets.js` around lines 5 - 7, The url line in async function
fetch uses ${API_BASE} outside of a template string, causing a syntax error;
update the url assignment in fetch to build the string correctly (e.g., use a
template literal with backticks around the whole URL or concatenate API_BASE
with the path) so that the call to axios.get(url, {...}) receives a valid
string; check the variable API_BASE and the const url declaration to ensure they
reference the correct symbol names.
dexs/giftedMarkets.js
Outdated
| start: 1769904000, | ||
| (UTC midnight as unix) | ||
| methodology: |
Remove invalid token inside module.exports.
The inline (UTC midnight as unix) text is not valid JS and breaks parsing. Use a comment instead.
🛠️ Proposed fix

```diff
- start: 1769904000,
-(UTC midnight as unix)
+ start: 1769904000, // UTC midnight as unix
```

📝 Committable suggestion

```js
  start: 1769904000, // UTC midnight as unix
  methodology:
```
🧰 Tools
🪛 Biome (2.3.13)
[error] 25-26: Expected a property, a shorthand property, a getter, a setter, or a method but instead found '('.
Expected a property, a shorthand property, a getter, a setter, or a method here.
(parse)
[error] 26-26: expected , but instead found midnight
Remove midnight
(parse)
[error] 26-26: expected , but instead found as
Remove as
(parse)
[error] 26-26: expected , but instead found unix
Remove unix
(parse)
[error] 27-27: Expected a function body but instead found 'methodology'.
Expected a function body here.
(parse)
🤖 Prompt for AI Agents
In `@dexs/giftedMarkets.js` around lines 25 - 27, The object being exported
(module.exports) contains an invalid inline token "(UTC midnight as unix)" next
to the start property which breaks JS parsing; remove that inline text from the
exported object and instead add it as a proper comment above or beside the start
field (e.g., convert to // UTC midnight as unix) so the start: 1769904000
remains a valid property; locate the start property in the exported object in
giftedMarkets.js and replace the inline note with a comment.
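The comment claims the constant is UTC midnight; that is easy to sanity-check with plain Date math, using only the `start` value from the diff above.

```javascript
// Sanity-check that the adapter's start constant is a UTC day boundary.
const start = 1769904000; // `start` field from the suggestion above

console.log(start % 86400 === 0);                  // true -> aligned to UTC midnight
console.log(new Date(start * 1000).toISOString()); // 2026-02-01T00:00:00.000Z
```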
```ts
import { CHAIN } from "../helpers/chains";
import fetchURL from "../utils/fetchURL";

const API_BASE = "https://api.gifted.markets";
```
```ts
const ratesResponse = await httpGet(`${TON_API}/rates?tokens=ton&currencies=usd`);
const tonPriceUSD = ratesResponse.rates?.TON?.prices?.USD || 3.50;

const txResponse = await httpGet(`${TON_API}/blockchain/accounts/${SWAP_REFERRER}/transactions?limit=1000`);
```
Not an optimal approach: if we want to backfill the data in the future, fetching all transactions in one call is not a good way to calculate this.
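One way to address this is cursor pagination. TonAPI's account-transactions endpoint accepts a `before_lt` cursor, so a backfill can walk older pages only until it leaves the requested time window. The sketch below separates the page-walking logic from the HTTP call so it can be tested without the network; `fetchPage(beforeLt)` stands in for the real request, e.g. `httpGet(`${TON_API}/blockchain/accounts/${wallet}/transactions?limit=100&before_lt=${beforeLt}`)` (endpoint shape assumed from the adapter; verify against the TonAPI docs).

```javascript
// Collect all transactions whose utime falls in [startTimestamp, endTimestamp],
// walking newest -> oldest pages via a logical-time (lt) cursor.
async function collectWindow(fetchPage, startTimestamp, endTimestamp) {
  const hits = [];
  let beforeLt; // undefined -> start from the newest page
  for (;;) {
    const { transactions } = await fetchPage(beforeLt);
    if (!transactions || transactions.length === 0) break;
    for (const tx of transactions) {
      if (tx.utime >= startTimestamp && tx.utime <= endTimestamp) hits.push(tx);
    }
    const oldest = transactions[transactions.length - 1];
    if (oldest.utime < startTimestamp) break; // walked past the window
    beforeLt = oldest.lt; // cursor for the next (older) page
  }
  return hits;
}
```

Because the cursor is the oldest `lt` seen so far, each request returns strictly older transactions and the loop terminates once a page dips below the window start or the account history is exhausted.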
No need to hardcode or fetch TON prices; you can simply use the balances object and add the TON address and amount, or use the addCGToken helper on the balances object.
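For illustration, the pattern this suggestion points at looks roughly like the sketch below: accumulate raw token amounts keyed by a CoinGecko id and let the pricing layer convert to USD, instead of multiplying by a hardcoded or self-fetched TON price. In a real adapter the balances object would come from the dimension-adapters SDK (e.g. `options.createBalances()`); the minimal `BalancesSketch` class here is a stand-in that only mirrors the `addCGToken(geckoId, amount)` shape, so treat the exact SDK surface as an assumption to verify.

```javascript
// Minimal stand-in for the SDK balances object, to show the accumulation
// pattern: amounts are stored per CoinGecko id, pricing happens downstream.
class BalancesSketch {
  constructor() { this.byCoinGeckoId = {}; }
  addCGToken(geckoId, amount) {
    this.byCoinGeckoId[geckoId] = (this.byCoinGeckoId[geckoId] || 0) + amount;
  }
}

const dailyVolume = new BalancesSketch();
for (const tx of [{ value: 1_500_000_000 }, { value: 500_000_000 }]) {
  dailyVolume.addCGToken("the-open-network", tx.value / 1e9); // nanotons -> TON
}
console.log(dailyVolume.byCoinGeckoId["the-open-network"]); // 2
```

This removes both the hardcoded `3.50` fallback and the extra rates request from the adapter's hot path.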
Summary
Adds a new adapter for Gifted Markets (gifted.markets) to report daily volume to DefiLlama.
What is tracked
Methodology
Data source
Start date (`start` field)
Notes