I have an MSP430F2618 executing at 16 MHz. As part of my initialization sequence, I would like to debounce a signal that arrives on a noisy line and is susceptible to transient value changes. To help with this, I've put a delay between reads, with code like the following:

    unsigned int retry = 0;   /* declared and zeroed before the loop */

    do {
        oldCommandVal = currCommandVal;

        /* Decode the command from the low nibble of P5 */
        if ((P5IN & 0xF) == BIT0) {
            currCommandVal = 0;
        } else if ((P5IN & 0xF) == BIT1) {
            currCommandVal = 1;
        } else {
            currCommandVal = 2;   /* invalid / still transitioning */
        }

        if (retry < 10) {
            __delay_cycles(64000);   /* 64000 cycles at 16 MHz = 4 ms */
            ++retry;
        } else {
            WDTCTL = 0;   /* bad watchdog key -> force a PUC reset */
        }
    } while ((currCommandVal == 2) || (oldCommandVal != currCommandVal));

When --opt_for_speed=0 (the default) is used, the code above works as expected; however, if I set --opt_for_speed=5 (maximum speed), it does not. Instead of a 4 ms delay between GPIO reads, I measure closer to 2 ms between reads. Comparing the listing files, I do not notice any immediate differences between the generated instructions. Any thoughts on what might cause this, given that the intrinsic is documented to delay for an exact number of cycles?
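For reference, here is a minimal standalone sketch that isolates the intrinsic so the delay can be checked on a scope. The clock setup (factory-calibrated 16 MHz DCO constants) and the choice of P1.0 as a free debug pin are my assumptions, not code from the project above:

    #include <msp430.h>

    int main(void)
    {
        WDTCTL = WDTPW | WDTHOLD;     /* stop the watchdog */

        /* Assumed clock setup: factory-calibrated 16 MHz DCO */
        BCSCTL1 = CALBC1_16MHZ;
        DCOCTL  = CALDCO_16MHZ;

        P1DIR |= BIT0;                /* P1.0 as a debug output */

        for (;;) {
            P1OUT ^= BIT0;            /* each edge marks the start of a delay */
            __delay_cycles(64000);    /* expect a 4 ms half-period at 16 MHz */
        }
    }

With this loop on the target, a scope on P1.0 shows whether the half-period really stays at 4 ms when the build is switched between the two optimization levels.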