Polling for Loan Changes

The CRF Exchange API provides an endpoint which allows you to poll for changes to loans in your portfolio. This is useful for keeping your systems in sync with the latest loan information, events, and documents. This guide will walk you through how to use the polling endpoint effectively.

Before you begin, ensure you have the following:

  • Your API key
  • Your Partner ID

If you don’t have an API key or Partner ID, please refer to the Getting Started guide.

The Exchange API uses a cursor-based pagination system that enables efficient polling for loan changes. The algorithm follows a continuous polling pattern with built-in rate limiting and cursor persistence.

  1. cursor: A token representing your position in the loan dataset. Store this value between polling cycles. On your first poll, omit the cursor (or leave it empty) to start from the beginning. Note that each viewType needs its own stored cursor. You can track as many cursors as you need for different views, but the other parameters you pass must always match the stored cursor.
  2. hasMoreResults: A boolean flag indicating whether more results are immediately available.
  3. Rate Limiting: The API enforces rate limits. When exceeded (HTTP 429), the response includes a Retry-After header.
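
These concepts can be sketched in TypeScript as follows. The field names (`loans`, `cursor`, `hasMoreResults`) are taken from this guide, but the exact response schema may differ; `CursorStore` and its helpers are illustrative, not part of the API.

```typescript
// Hypothetical model of the /loans polling response, based on the
// fields this guide describes (not an official schema).
interface Loan {
  id: string;
  // ...other loan fields (events, documents, etc.)
}

interface LoansResponse {
  loans: Loan[];
  cursor: string;          // opaque token: persist it, keyed by viewType
  hasMoreResults: boolean; // true => fetch again immediately
}

// Cursors are stored per viewType; the other query parameters
// must always match the stored cursor.
type CursorStore = Map<string, string>;

function saveCursor(store: CursorStore, viewType: string, cursor: string): void {
  store.set(viewType, cursor);
}

function loadCursor(store: CursorStore, viewType: string): string | undefined {
  return store.get(viewType); // undefined on first run: omit the parameter
}
```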
flowchart TD
    A[1. Load cursor from database or use from last iteration] --> B[2. Fetch loans with cursor parameter<br/>GET /loans?cursor=cursor]
    B --> C[3. Process returned loans<br/>- Download new documents<br/>- Update local data]
    C --> D[4. Store new cursor to database]
    D --> E{hasMoreResults?}
    E -->|YES| B
    E -->|NO| F[5. Wait before next poll<br/>minutes to hours]
    F --> A
cursor = getCursorFromDatabase(viewType) // May be null/empty on first run, exclude the parameter in that case

When: At application startup or when beginning a new polling cycle.
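
The "exclude the parameter" detail can be handled when building the request URL. A minimal sketch, assuming the endpoint path and parameter names shown in this guide (the base URL and function name are hypothetical):

```typescript
// Build the polling URL, omitting `cursor` entirely on the first run.
function buildLoansUrl(
  baseUrl: string,
  partnerId: string,
  viewType: string,
  cursor?: string,
  maxResults = 100
): string {
  const params = new URLSearchParams({
    partnerId,
    viewType,
    maxResults: String(maxResults),
  });
  if (cursor) params.set("cursor", cursor); // exclude when null/empty (first poll)
  return `${baseUrl}/loans?${params.toString()}`;
}
```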

do:
    response = GET /loans?partnerId={id}&viewType={type}&cursor={cursor}&maxResults=100

    if response.status == 429:
        // Rate limited - wait as instructed
        retryAfter = response.headers["Retry-After"] ?? 60
        sleep(retryAfter seconds)
        continue // Retry same request

    loans = response.loans
    newCursor = response.cursor
    hasMoreResults = response.hasMoreResults

    // Process loans (download documents, update data)
    for each loan in loans:
        processLoan(loan)

    // IMPORTANT: Store cursor after processing
    storeCursorInDatabase(viewType, newCursor)
    cursor = newCursor
while hasMoreResults == true

When to Continue Immediately: When hasMoreResults is true, continue fetching without delay. This indicates the API has more data ready for you right now.

When to Store the Cursor: After successfully processing each batch of loans. This ensures you can resume from the correct position if your process crashes or restarts.
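
The "process first, then store" ordering can be captured in a small helper. This is a sketch; `processLoan` and `persistCursor` stand in for your own processing and storage logic:

```typescript
// Process the whole batch, then persist the cursor, so a crash
// mid-batch resumes from the previous cursor and re-fetches the batch.
async function handleBatch<T>(
  loans: T[],
  newCursor: string,
  processLoan: (loan: T) => Promise<void>,
  persistCursor: (cursor: string) => Promise<void>
): Promise<string> {
  for (const loan of loans) {
    await processLoan(loan); // process the whole batch first...
  }
  await persistCursor(newCursor); // ...then record progress
  return newCursor;
}
```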

// Exited fetch loop because hasMoreResults == false
// All current changes have been retrieved
sleep(pollingInterval)
// Resume polling from stored cursor
cursor = getCursorFromDatabase(viewType)
goto step 2

When to Take a Break:

  • Minutes (5-15 min): If you need near-immediate updates
  • Hours (1-6 hours): For regular business hour synchronization
  • Days (24 hours): For batch processing or reporting systems

The appropriate interval depends on your business requirements and how quickly you need to react to loan changes. You can also poll on a scheduled or nightly basis; the schedule is up to you.
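
One way to encode the choice of interval is a small lookup keyed by your sync profile. The profile names and durations below are illustrative examples, not API requirements:

```typescript
// Map a (hypothetical) sync requirement to a polling interval in milliseconds.
type SyncProfile = "near-real-time" | "business-hours" | "batch";

function pollingIntervalMs(profile: SyncProfile): number {
  switch (profile) {
    case "near-real-time": return 5 * 60 * 1000;       // 5 minutes
    case "business-hours": return 60 * 60 * 1000;      // 1 hour
    case "batch":          return 24 * 60 * 60 * 1000; // 24 hours
  }
}
```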

The API returns HTTP 429 when rate limits are exceeded:

if response.status == 429:
    retryAfter = response.headers["Retry-After"] ?? 60
    log("Rate limited, waiting {retryAfter} seconds")
    sleep(retryAfter seconds)
    // Retry the same request with the same cursor

Key Points:

  • Always respect the Retry-After header value
  • Do not update the cursor when rate limited
  • Retry the exact same request after waiting
  • Default to 60 seconds if header is missing
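
A sketch of reading Retry-After with the 60-second fallback described above; the header-access shape is illustrative, so adapt it to your HTTP client:

```typescript
// Parse Retry-After (delay-in-seconds form), defaulting to 60 when the
// header is missing or unparseable.
function retryAfterSeconds(headers: Record<string, string | undefined>): number {
  const raw = headers["Retry-After"] ?? headers["retry-after"];
  const parsed = raw ? Number(raw) : NaN;
  return Number.isFinite(parsed) && parsed >= 0 ? parsed : 60;
}
```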

DO store the cursor:

  • After successfully processing each batch of loans
  • Before your application shuts down gracefully
  • After handling each batch, even if errors occur during processing

DON’T store the cursor:

  • During rate limiting (before retry)
  • Before processing the returned loans
  • If the API request fails
try:
    response = fetchLoans(cursor)

    for loan in response.loans:
        try:
            processLoan(loan) // May partially fail
        catch error:
            logError(loan.id, error)
            // Continue processing other loans

    // Store cursor even if some loans failed processing
    // You can reprocess failed loans separately
    storeCursor(response.cursor)
catch apiError:
    // Don't store cursor if API call failed
    logError("API fetch failed", apiError)
    return
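
The same pattern in TypeScript, collecting failed loan IDs so they can be reprocessed separately. All names here are illustrative stand-ins for your own code:

```typescript
interface BatchResult {
  storedCursor: string;
  failedLoanIds: string[];
}

// Keep processing on per-loan failures, record them for later
// reprocessing, and store the cursor once the batch is handled.
async function processBatch(
  loans: { id: string }[],
  cursor: string,
  processLoan: (loan: { id: string }) => Promise<void>,
  storeCursor: (cursor: string) => Promise<void>
): Promise<BatchResult> {
  const failedLoanIds: string[] = [];
  for (const loan of loans) {
    try {
      await processLoan(loan); // may partially fail
    } catch {
      failedLoanIds.push(loan.id); // continue with the rest
    }
  }
  await storeCursor(cursor); // store even if some loans failed
  return { storedCursor: cursor, failedLoanIds };
}
```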

Here’s a simplified but complete polling implementation:

async function pollForChanges(viewType: string, intervalMinutes: number) {
  while (true) {
    let cursor = await getCursor(viewType);
    let hasMoreResults = true;

    // Fetch all available changes
    while (hasMoreResults) {
      const response = await fetchLoans(viewType, cursor);

      // Process the batch
      for (const loan of response.loans) {
        await processLoan(loan);
      }

      // Store progress immediately
      await storeCursor(viewType, response.cursor);
      cursor = response.cursor;
      hasMoreResults = response.hasMoreResults;
    }

    // All caught up - wait before next poll
    console.log(`Caught up. Waiting ${intervalMinutes} minutes...`);
    await sleep(intervalMinutes * 60 * 1000);
  }
}
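
This example assumes `getCursor`, `fetchLoans`, `processLoan`, `storeCursor`, and `sleep` are defined elsewhere in your code. A minimal `sleep` might look like this (the `pollForChanges` invocation shown in the comment is hypothetical):

```typescript
// Promise-based sleep for use with await.
function sleep(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

// Hypothetical startup: one long-lived poller per viewType.
// pollForChanges("servicing", 15).catch((err) => console.error("poller died", err));
```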

Track these metrics for healthy polling operations:

  • Cursor advancement rate: How frequently the cursor changes
  • Batch sizes: Number of loans per response
  • Processing duration: Time to process each batch
  • Rate limit hits: Frequency of 429 responses
  • Last successful poll: Timestamp of last completed cycle
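
An illustrative in-memory tracker for the metrics above; in production you would likely feed these into your existing metrics system instead:

```typescript
class PollingMetrics {
  cursorAdvances = 0;
  rateLimitHits = 0;
  batchSizes: number[] = [];
  lastSuccessfulPoll?: Date;

  recordBatch(size: number, cursorChanged: boolean): void {
    this.batchSizes.push(size);
    if (cursorChanged) this.cursorAdvances++;
  }

  recordRateLimit(): void {
    this.rateLimitHits++;
  }

  recordCycleComplete(): void {
    this.lastSuccessfulPoll = new Date();
  }

  averageBatchSize(): number {
    if (this.batchSizes.length === 0) return 0;
    return this.batchSizes.reduce((a, b) => a + b, 0) / this.batchSizes.length;
  }
}
```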

The polling algorithm follows this pattern:

  1. Load cursor from persistent storage
  2. Fetch continuously while hasMoreResults is true
  3. Store cursor after each successful batch
  4. Handle rate limits by respecting Retry-After headers
  5. Wait between cycles based on business requirements (minutes/hours/days)
  6. Resume from cursor when starting next cycle

This approach ensures:

  • No duplicate processing (cursor tracks position)
  • Resilience to failures (cursor stored frequently)
  • Efficient use of API resources (batch fetching)
  • Compliance with rate limits (Retry-After honored on 429)