Calculating 1ms with timers
Posted: Fri Aug 22, 2008 6:18 am
by chukmunnlee
Hi,
I've set up an interrupt routine for the timers, but I'm not sure how to make it fire every 1ms. Here is what I have:
TIMER0_CR   = 0;                                           // stop the timer before configuring it
TIMER0_DATA = TIMER_FREQ(1);                               // reload value (this is the part I'm unsure about)
TIMER0_CR   = TIMER_ENABLE | TIMER_DIV_1 | TIMER_IRQ_REQ;
irqSet(IRQ_TIMER0, timerIRQ);                              // timerIRQ is my interrupt handler
It's working, but I'm not sure whether it is actually firing every 1ms. Is there a formula for working this out?
Thanks
Regards
Re: Calculating 1ms with timers
Posted: Sun Aug 24, 2008 8:04 pm
by cheesethulhu
The argument to TIMER_FREQ() is frequency in Hz (events per second). You gave it the period in milliseconds. In order to calculate frequency in Hz, you need to take the reciprocal of the period in seconds.
Code:
                             1 second
period = 1 millisecond * ----------------- = 0.001 seconds
                         1000 milliseconds

      1
------------- = 1000 Hz
0.001 seconds
So you need TIMER_FREQ(1000) for a 1ms timer; your setup only needs that one change (see the snippet below). If you ever decide to use a slower timer in the future, keep reading past it.
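In other words, your setup stays the same apart from the TIMER0_DATA line; something like this (assuming, as in your current code, that irqInit() and irqEnable(IRQ_TIMER0) are handled elsewhere, since you said the interrupt is already firing):
Code:
TIMER0_CR   = 0;                                           // stop the timer while reconfiguring
TIMER0_DATA = TIMER_FREQ(1000);                            // 1000 Hz = one overflow every 1ms
TIMER0_CR   = TIMER_ENABLE | TIMER_DIV_1 | TIMER_IRQ_REQ;
irqSet(IRQ_TIMER0, timerIRQ);                              // timerIRQ now runs every 1ms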
Using TIMER_FREQ() and TIMER_DIV_1 will only work for frequencies between 512 Hz and 33554432 Hz, so for a period of 1ms (1000 Hz), you're just fine. However, TIMER_FREQ() and TIMER_DIV_1 have a maximum period of about 1.9ms. If, after reading the first sentence of this post, you were wondering why your interrupt wasn't firing once per second, now you know.
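The reason for the 512 Hz floor is that TIMER0_DATA is a 16-bit reload register, and with TIMER_DIV_1 the timer ticks at the full bus clock (roughly 33.5 MHz), so the longest countdown you can express is 65536 ticks, which is about 1.95ms. Here is a rough sketch of the arithmetic, assuming the usual libnds definition of TIMER_FREQ(n) as -BUS_CLOCK/(n); check timers.h for the exact constants:
Code:
// TIMER_FREQ(n) works out to roughly -BUS_CLOCK / n, and that value gets
// written into a 16-bit reload register, so its magnitude must stay below 65536.
//
//   TIMER_FREQ(1000)  roughly -33500       fits: ~1000 overflows per second
//   TIMER_FREQ(512)   roughly -65500       barely fits: ~512 overflows per second
//   TIMER_FREQ(1)     roughly -33500000    does NOT fit: it is truncated to
//                                          16 bits and the timer fires at an
//                                          unrelated, much faster rate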
In order to have your interrupt fire less often than 512 times per second, you need to use a divider other than TIMER_DIV_1 together with its corresponding TIMER_FREQ_*() macro. Take a look at the comments in libnds/include/nds/timers.h (inside your devkitPro installation directory) to see which frequencies each TIMER_FREQ_*() and TIMER_DIV_* pair supports. Just make sure the numbers on the two match.
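For example, to get the once-per-second timer mentioned above, you could pair TIMER_DIV_1024 with TIMER_FREQ_1024(). This is just a sketch on timer 1 with a made-up handler name (oneSecondIRQ), assuming irqInit() has already been called:
Code:
// 1 Hz timer: TIMER_DIV_1024 clocks the timer at BUS_CLOCK/1024 (~32.7 kHz),
// and TIMER_FREQ_1024(1) gives the matching 16-bit reload value.
TIMER1_CR   = 0;
TIMER1_DATA = TIMER_FREQ_1024(1);
TIMER1_CR   = TIMER_ENABLE | TIMER_DIV_1024 | TIMER_IRQ_REQ;
irqSet(IRQ_TIMER1, oneSecondIRQ);
irqEnable(IRQ_TIMER1);
The important bit is that the divider and the TIMER_FREQ macro carry the same number (1024 here); mixing, say, TIMER_DIV_256 with TIMER_FREQ_1024() would give you the wrong rate.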