DSO firmware version 3.64

BenF: “Fast-mode is likely your best choice for short cycle time and high waveform detail at a T/Div of 5ms. …”

I understand the problem with the CPU. That's very bad news; actually, it's worse than I expected. If I understand the algorithm correctly, we not only have a problem with high frequencies at large T/Div settings, but there is an additional one: the scope does not work fully in real time, because the trigger cannot detect a regular event if the signal arrives during the screen refresh, or while the capture is still at the end/beginning of the buffer.

I had problems with the trigger before, but I thought they were caused by a high trigger level. Now I've figured out that trigger events can simply be missed.

I think you are focusing too much on the high-frequency cases. Actually, high-frequency signals work much better than expected (I tested RS232).
But I think most people do not expect to see a 300 kHz waveform with a 1 MHz sampling rate; what we definitely expect is a good, clean view of signals below 50 kHz:

  • Industrial/Automotive signals,
  • Sound,
  • Car/Auto signals,
  • Power supply signals,
  • Switches (inverters, …)
  • Very slow signals: temperature, …
    I think this is the real and very big target audience of this scope.

For example, at T/Div = 5 ms we have ONLY a 5 kHz sampling rate (or 50 kHz, but then with a virtually small buffer), with no LP filter, with slow auto refresh, and with the possibility of missing some signals or trigger events.

Don’t get me wrong. I’m very, very thankful for your enormous effort and the time you've put into this, and you have made fantastic improvements compared to the original software, but I think you are not focusing on the middle/low frequencies, which are the most important for this kind of scope.

  1. Let’s keep the DMA approach for low T/Div (high frequencies). Also, maybe you could consider splitting the buffer into two parts: while one part is being filled by DMA, you can examine the other one, and then switch DMA to the second part of the buffer. I used this approach in the past for an MS-DOS sound card oscilloscope and FFT, and it worked great in real time. Of course, I’m not sure the CPU can handle this. This part is not so important.

  2. Most important: for middle and low frequencies, let’s skip the DMA approach. I hope you can sample at a rate of at least 100 kHz without DMA, and keep average, min and max values in the buffer as I suggested, using your existing approach and buffer size (but reduced to hold the min/max values). Also, the buffer must be circular, so you never lose a trigger and can make the auto refresh fast, because you always have the last n values in the buffer.

I think that calculating avg, min and max could be very fast (I believe this CPU has conditional-execution instructions). Per sample, you need just three additional instructions in the interrupt handler:
  0. Take the sample (skip calling a utility function, to save time)

  1. avg_sum += sample_value
  2. if sample_value > max then max := sample_value
  3. if sample_value < min then min := sample_value

After calculating avg, min and max for a series of samples, store these three values in the circular buffer. This is also the time to run trigger detection, which avoids that latency. When you detect a trigger, you start counting the time position, so you can find the beginning of the signal in the buffer. The buffer is used only for displaying. All (or most) other calculations can be done in real time. If the measurements make this complicated, skip that functionality; the main purpose of an oscilloscope is to see the signal waveform.

I think that servicing the interrupt, taking the sample and executing these three additional instructions for avg, min and max can be done at a sampling rate of at least 100 kHz, because they require just a “few” CPU clocks. I would not be surprised if 250 kHz or even more were possible.

Thanks,
Dejan