The code I'm reading defines the DCO frequency as approximately 12.9 MHz (12902400L). I'm not sure that is exactly the frequency used, because there is a software PLL that tries to keep the DCO within +/- 0.5% of nominal.

The software PLL works with two timers: a 1 ms timer (Timer B, clocked from SMCLK, which is sourced from the DCO) and a 1 second timer (Timer A, clocked from ACLK, which runs from the 32.768 kHz crystal). The one second timer counts the number of 1 ms ticks, which should be approximately 1000. If the count reaches (1000 + 5) (nominal plus the 0.5% tolerance), it slows the DCO down one step. From what I can work out, the Timer B settings are derived from the DCO definition (12902400L), which leads me to believe the system frequency should be ~12.9 MHz (let's say 13 MHz). A sketch of how I read this logic is below.

If I am reading the data sheet correctly, 13 MHz implies a minimum supply voltage of 2.85 V, so I should set the SVS to 2.9 V. Is that correct?

Now, when the unit is turned off it goes into low power mode (LPM3). I believe the DCO is disabled in this mode, so I presume the minimum supply voltage would be 1.8 V and I can set the SVS to 1.9 V (its lowest threshold). Is that correct?

I presume it's valid to change the SVS thresholds when the system frequency changes; I don't even know if I have to. What would happen if I left it at 2.9 V in LPM3? Would the SVS still generate a POR reset? If so, then I guess it makes sense to reduce the threshold to 1.9 V for LPM3 mode, to prevent an unnecessary POR (see the second sketch below). Is that what is normally done or recommended?
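For what it's worth, here is a minimal sketch of the software PLL mechanism as I understand it. The init code, ISR names, tick counter, and the single-step adjustment via DCOCTL are my own reconstruction (the actual code may well differ), assuming a part with the basic clock module (DCOCTL/BCSCTL1) such as the F1xx/F2xx:

```c
#include <msp430.h>

#define DCO_HZ        12902400L   /* nominal DCO frequency from the definition */
#define TICKS_PER_SEC 1000        /* expected 1 ms ticks in the 1 s gate */
#define FLL_TOL       5           /* 0.5% of 1000 ticks */

static volatile unsigned int ms_ticks;

static void fll_timers_init(void)
{
    /* Timer_B: 1 ms period from SMCLK (DCO-sourced):
     * 12902400 / 1000 = 12902 cycles per tick. */
    TBCCR0  = (unsigned int)(DCO_HZ / TICKS_PER_SEC) - 1;
    TBCCTL0 = CCIE;
    TBCTL   = TBSSEL_2 | MC_1 | TBCLR;   /* SMCLK, up mode */

    /* Timer_A: 1 s period from ACLK (32.768 kHz crystal). */
    TACCR0  = 32768 - 1;
    TACCTL0 = CCIE;
    TACTL   = TASSEL_1 | MC_1 | TACLR;   /* ACLK, up mode */
}

/* 1 ms tick, timed by the DCO. */
#pragma vector = TIMERB0_VECTOR
__interrupt void timer_b0_isr(void)
{
    ms_ticks++;
}

/* 1 s gate, timed by the crystal: compare the DCO-derived tick count
 * against the crystal timebase and nudge the DCO one step. */
#pragma vector = TIMERA0_VECTOR
__interrupt void timer_a0_isr(void)
{
    unsigned int count = ms_ticks;
    ms_ticks = 0;

    if (count > TICKS_PER_SEC + FLL_TOL)
        DCOCTL--;   /* too many DCO ticks per crystal second: slow down */
    else if (count < TICKS_PER_SEC - FLL_TOL)
        DCOCTL++;   /* too few: speed up (real code must handle RSEL bounds) */
}
```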
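And this is roughly how I would switch the SVS threshold around LPM3, if that turns out to be the right thing to do. The VLDx-to-voltage mapping (VLD = 0001 for 1.9 V, VLD = 1001 for 2.9 V) is my reading of the family user's guide, and the settling delay is a guess at td(SVSon), so both need checking against the data sheet for the actual part:

```c
#include <msp430.h>

/* Assumed VLDx encodings -- verify against the device data sheet:
 *   VLD = 0001 -> ~1.9 V (lowest), VLD = 1001 -> ~2.9 V. */
#define SVS_VLD_1V9 (VLD0)
#define SVS_VLD_2V9 (VLD3 | VLD0)

static void svs_set_threshold(unsigned char vld_bits)
{
    SVSCTL = vld_bits | PORON;   /* new threshold, POR if Vcc drops below it */
    __delay_cycles(700);         /* ~50 us settling at ~13 MHz (guess at td(SVSon)) */
    SVSCTL &= ~SVSFG;            /* clear any flag raised while settling */
}

void power_down(void)
{
    /* Lower the threshold first, so a supply between 1.9 V and 2.9 V
     * does not trigger a POR while we sleep with the DCO off. */
    svs_set_threshold(SVS_VLD_1V9);

    __bis_SR_register(LPM3_bits | GIE);   /* sleep in LPM3 */

    /* Awake again: restore the 2.9 V threshold before running the CPU
     * back up at ~13 MHz. */
    svs_set_threshold(SVS_VLD_2V9);
}
```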
Thanks, Brendan.