How to calculate ADC delay time?

Hello,
I can see the two definitions below in rsl10_calibrate_power.h.
How do I calculate these values if my device’s SYSCLK is 8 MHz and it uses ADC_PRESCALE_200, as shown below?
Sys_ADC_Set_Config(ADC_VBAT_DIV2_NORMAL | ADC_NORMAL | ADC_PRESCALE_200);

/* ----------------------------------------------------------------------------
 * Constant Definitions
 * ----------------------------------------------------------------------------
 * Assumptions: SYSCLK = 16 MHz, ADC = low frequency mode (SLOWCLK = 1 MHz)
 * ------------------------------------------------------------------------- */
/* 3 times the length of time corresponding to the minimum sample rate,
 * which is deemed sufficient to allow the ADC to stabilize */
#define ADC_STABILIZATION_DELAY         0x7530

/* Corresponds to sample rate of the ADC as configured (100 Hz) */
#define ADC_MEASUREMENT_DELAY           0x2710

Thanks,
Calvin

@Calvin

I wanted to let you know that we are still looking into this one on our end; I will share the relevant information here.

@Calvin

Let’s have a look at the function

Calibrate_Power_MeasureSupply(uint32_t *adc_ptr)
/* ----------------------------------------------------------------------------
 * Function      : static uint32_t Calibrate_Power_MeasureSupply
 *                                                 (uint32_t *adc_ptr)
 * ----------------------------------------------------------------------------
 * Description   : Measure a supply voltage; returns the median measurement of
 *                 3 measurements to ensure that we're rejecting sampling noise
 * Inputs        : *adc_ptr            - Pointer to ADC data register
 * Outputs       : return value        - Median measurement from the ADC
 *                                       data register
 * Assumptions   : Calibrate_Power_Initialize() has been called.
 * ------------------------------------------------------------------------- */

So we can see the purpose is to reject sampling noise.
We collect 3 measurement values at different points in time.

    uint32_t supply1 = 0;
    uint32_t supply2 = 0;
    uint32_t supply3 = 0;
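
As a side note, the median selection itself is straightforward. Below is a minimal sketch of a median-of-three helper in plain C (median_of_three is just an illustrative name, not the library's own function); it simply drops the minimum and maximum of the three samples, so a single noisy reading is rejected.

#include <stdint.h>

static uint32_t median_of_three(uint32_t a, uint32_t b, uint32_t c)
{
    uint32_t min = (a < b) ? ((a < c) ? a : c) : ((b < c) ? b : c);
    uint32_t max = (a > b) ? ((a > c) ? a : c) : ((b > c) ? b : c);

    /* The median is what remains after removing the minimum and the maximum;
     * unsigned wraparound keeps this exact even if the sum overflows */
    return (a + b + c) - min - max;
}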

There are two delay time periods:
a. ADC_STABILIZATION_DELAY
b. ADC_MEASUREMENT_DELAY

You can select your own values, as long as each delay is at least the period corresponding to the minimum sample rate.
For example, delay T1 before supply1, T2 before supply2, and T3 before supply3; the purpose is to reject sampling noise.

Why do we have ADC_STABILIZATION_DELAY?
For example, at power-on we call this function, so we need a longer delay to make sure everything (VBAT, etc.) is ready before the first measurement.
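
Putting it together, the measurement sequence looks roughly like the sketch below. This is not the library source, just an illustration under a few assumptions: Delay_Cycles() is a hypothetical stand-in for whatever cycle-based delay routine your SDK provides, adc_ptr points at the ADC data register as in the function header above, and median_of_three() is the helper sketched earlier.

static void Delay_Cycles(uint32_t cycles)
{
    /* Crude busy-wait stand-in for the SDK's delay routine */
    for (volatile uint32_t i = 0; i < cycles; i++)
    {
    }
}

static uint32_t Measure_Supply_Sketch(volatile uint32_t *adc_ptr)
{
    uint32_t supply1 = 0;
    uint32_t supply2 = 0;
    uint32_t supply3 = 0;

    /* Longer first delay (T1) so the supply (e.g. VBAT) and the ADC have
     * time to stabilize before the first reading */
    Delay_Cycles(ADC_STABILIZATION_DELAY);
    supply1 = *adc_ptr;

    /* One sample period (T2, T3) between readings so each value comes
     * from a fresh conversion */
    Delay_Cycles(ADC_MEASUREMENT_DELAY);
    supply2 = *adc_ptr;

    Delay_Cycles(ADC_MEASUREMENT_DELAY);
    supply3 = *adc_ptr;

    /* Reject sampling noise by returning the median of the three samples */
    return median_of_three(supply1, supply2, supply3);
}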


When using LSAD->CFG = LSAD_NORMAL | LSAD_PRESCALE_200; the delays are calculated with the following formulas.

/* Three times the length of time corresponding to the minimum sample rate,
 * which is deemed sufficient to allow the LSAD to stabilize */

#define STABILIZATION_DELAY (SystemCoreClock * 3 / 625)
ADC Frequency = 625 Hz
/* Corresponds to sample rate of the LSAD as configured (625 Hz) */

#define MEASUREMENT_DELAY (SystemCoreClock / 625)
ADC Frequency = 625 Hz
Therefore, with SYSCLK = 8 MHz you can use:

ADC_STABILIZATION_DELAY = 8000000 * 3 / 625 = 38400
and
ADC_MEASUREMENT_DELAY = 8000000 / 625 = 12800
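
In C, assuming the 625 Hz sample rate from the formulas above also applies to your prescaler configuration (please verify this against your own settings), the constants could be written along these lines; ADC_SAMPLE_RATE_HZ is just an illustrative name:

#define ADC_SAMPLE_RATE_HZ          625

/* With SystemCoreClock = 8000000: 8000000 * 3 / 625 = 38400 */
#define ADC_STABILIZATION_DELAY     (SystemCoreClock * 3 / ADC_SAMPLE_RATE_HZ)

/* With SystemCoreClock = 8000000: 8000000 / 625 = 12800 */
#define ADC_MEASUREMENT_DELAY       (SystemCoreClock / ADC_SAMPLE_RATE_HZ)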

Thank you for using our community forum!