This application note describes a single-pin measurement of frequencies up to 500 kHz using the Timer 2 counter input pin. The accompanying code implements a frequency acquisition system that can be combined with a voltage measurement via the ADC to track both the frequency and voltage of the input signal. The result is displayed on an HD44780-compatible LCD screen; the accompanying display.c implements the routines needed to drive the display.
Frequency is the number of waveform cycles completed per unit time, so two quantities must be measured: the number of cycles and the time over which they are counted. Counting the cycles that occur over a fixed 1-second interval gives the frequency in hertz directly.
To count cycles, the number of 1-to-0 transitions on the waveform is recorded. This is done using the Timer 2 counter input pin, which increments the Timer 2 registers on each 1-to-0 transition.
There are two ways to measure a second. The first is to set up Timer 0 with reload values such that it overflows every 10 ms. Counting 100 of these overflows yields a 1-second interval. See freq.c for an example.
The second method is to use the time interval counter, which can be set up to interrupt the core after 1 second has elapsed. The advantage here is that the core is interrupted far less often than with the segmented Timer 0 count, leaving it free to carry out other tasks between interrupts. See tic.c for an example.
Since the Timer 2 registers hold only a 16-bit result, the maximum directly measurable frequency is 65535 Hz. To extend this range, a record is kept of the number of times Timer 2 overflows during the 1-second interval; the overflow count is then factored into the calculation of the frequency over that interval.
Both methods implement a calibration function. If a 100 kHz square wave is applied to P1.0 and the INT0 button is pressed, the software calibrates the frequency measurement to compensate for the error that interrupt latency introduces at higher frequencies. Because this error is significant only above 10 kHz, the calibration routine applies no gain correction below that frequency.