When using the ADC, the measurement is always relative to the analog reference voltage, which most of the time will be the same as Vdd. On many Nucleo, Discovery, and custom boards this comes from a 3.3 V LDO regulator. However, 3.3 V is the target voltage, not the actual one: regulator tolerance and load-dependent voltage drops can easily shift the real Vdd enough to cause errors of several percent if you simply assume 3.3 V.
Many (all?) STM32 MCUs include an internal bandgap-based reference and a factory calibration value: the raw ADC reading of that bandgap, taken at a known supply voltage and temperature (see your MCU's datasheet for the exact conditions). On the F303RE, for example, the calibration was taken at Vdda = 3.3 V and 30 °C, and the value is stored at an address exposed as VREFINT_CAL_ADDR in the ST HAL headers, which mbed uses.
The supply voltage Vdd can then be calculated as follows:
AnalogIn vref(ADC_VREF); //internal bandgap reference channel (mbed STM32 targets)
const float BANDGAP_VDD_REF = 3.3; //Vdd at which VREFINT_CAL was measured

float readRefVoltage()
{
    //factory calibration: raw 12-bit reading of the bandgap at BANDGAP_VDD_REF
    uint16_t vref_cal = *((uint16_t*)VREFINT_CAL_ADDR);
    //current bandgap reading as a fraction of the actual Vdd
    float vref_now = vref.read();
    //Vdd = calibration Vdd * (calibration fraction / current fraction)
    return BANDGAP_VDD_REF * ((float)vref_cal / 4095.0f) / vref_now;
}
To make a calibrated reading of an ADC pin:
//Reads the voltage divider on A0 and uses the internal calibrated
//reference voltage to calculate the real voltage
//Not 100% accurate, but much better than assuming 3.3 V
AnalogIn adc0(A0);

float readADC_Calibrated()
{
    //actual supply voltage, calculated from the factory bandgap calibration
    //(see readRefVoltage() above)
    float vdd = readRefVoltage();
    //adc0.read() returns a float between 0 and 1 (a fraction of Vdd)
    float reading = adc0.read();
    //scale by the real Vdd to get the calibrated voltage at the pin
    return reading * vdd;
}
In my personal experience, the Vdd calculated from the bandgap reference was within 20-30 mV of the true value, which makes the first two digits of any ADC measurement accurate and the third digit pretty close.