Backfill historical sales into LinkJolt or reconcile LinkJolt with your system of record without creating duplicates.
The orderId field is a unique key. Re-submitting the same orderId returns a 409 duplicate response instead of creating a second row. Use the same ID your source system uses (a Stripe payment_intent ID, a Shopify order ID, etc.).
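One way to make that contract explicit is a small helper that maps a response status to an action for the sync loop. This is a sketch, not part of the LinkJolt API: the source only documents 409 for duplicates, so treating 429 and 5xx as retryable is an assumption about the API's semantics.

```javascript
// Hypothetical helper: classify a LinkJolt response for the sync loop.
// 2xx and 409 both mean the conversion now exists server-side, so the
// local row can be marked as synced either way.
// Retrying on 429/5xx is an assumption, not documented LinkJolt behavior.
function syncAction(status) {
  if ((status >= 200 && status < 300) || status === 409) return 'mark-synced';
  if (status === 429 || status >= 500) return 'retry';
  return 'fail'; // other 4xx: likely a bad payload, don't retry blindly
}
```

With this in place, the loop below collapses to a single switch on `syncAction(res.status)` instead of separate status checks.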
import postgres from 'postgres';

const sql = postgres(process.env.DATABASE_URL);
const KEY = process.env.LINKJOLT_API_KEY;

// Pull sales from your DB. Each row must already carry affiliate tracking info.
const sales = await sql`
  SELECT order_id, campaign_id, affiliate_id, amount
  FROM sales
  WHERE affiliate_id IS NOT NULL AND synced_at IS NULL
  ORDER BY created_at ASC
  LIMIT 1000
`;
for (const s of sales) {
  const res = await fetch('https://linkjolt.io/api/v1/conversions', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      campaignId: s.campaign_id,
      affiliateId: s.affiliate_id,
      amount: Number(s.amount),
      orderId: s.order_id, // idempotency key
    }),
  });

  // 409 means already synced: treat it like success and mark the row.
  if (!res.ok && res.status !== 409) {
    console.error('failed', s.order_id, await res.text());
    continue;
  }

  await sql`UPDATE sales SET synced_at = NOW() WHERE order_id = ${s.order_id}`;

  // Stay under 300 req/min (Ultimate): a 250 ms gap is ~4 per second
  await new Promise(r => setTimeout(r, 250));
}

Ultimate allows 300 req/min. A tight loop at 5 req/sec hits 300/min exactly, so leave headroom: aim for 4 req/sec (a 250 ms gap). Watch the X-RateLimit-Remaining response header and back off if it reaches 0.
async function postWithBackoff(url, opts) {
  for (let attempt = 0; attempt < 5; attempt++) {
    const res = await fetch(url, opts);
    if (res.status !== 429) return res;
    const retryAfter = parseInt(res.headers.get('retry-after') || '5', 10);
    await new Promise(r => setTimeout(r, retryAfter * 1000));
  }
  throw new Error('rate limit — gave up');
}

Full reference: POST /api/v1/conversions
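The X-RateLimit-Remaining check mentioned above can be folded into the delay calculation. A minimal sketch, assuming the header carries an integer count of requests left in the current window; the 30-request threshold and the 5-second pause are illustrative choices, not documented LinkJolt values:

```javascript
// Sketch: derive the inter-request delay from X-RateLimit-Remaining.
// remaining: parsed header value, or null when the header is absent.
function nextDelayMs(remaining, baseMs = 250) {
  if (remaining === null || Number.isNaN(remaining)) return baseMs; // no signal: keep ~4 req/sec
  if (remaining <= 0) return 5000;        // window exhausted: pause well past it
  if (remaining < 30) return baseMs * 4;  // running low: slow to ~1 req/sec
  return baseMs;
}

// In the loop, after each fetch:
// const raw = parseInt(res.headers.get('x-ratelimit-remaining') ?? '', 10);
// const remaining = Number.isNaN(raw) ? null : raw;
// await new Promise(r => setTimeout(r, nextDelayMs(remaining)));
```

This keeps the happy path at the 250 ms pace from the main script while degrading gracefully as the window empties, instead of relying on 429s alone.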