With the advent of powerful computers, it became possible to perform real-time data analysis. Users could not only see the signal being acquired and recorded in real time but also watch how fiducial points were positioned on the traces, allowing them to check that the analysis was proceeding correctly. Real-time analysis also makes it possible, when required, to adjust analysis settings and immediately see their effect on the newly analyzed traces.
Now that real-time operation is taken for granted and the novelty of watching traces and analysis appear on screen has worn off, many users realize that, although they run recordings of many hours on multiple subjects, the last thing they want or need to do is watch the signal and its analysis as they are being performed.
On the other hand, most, if not all, users will tell you that they never trust their system so completely that they would send data directly to post-processing, or straight into study databases, without looking at them first.
This is not so much because the telemetry and software tools perform poorly. It is because, very often, the data points of greatest interest are precisely those where the signal took an unexpected shape, or where the derived data began drifting away from standard values.
These difficult data zones are where human expertise remains indispensable for validating or editing the results proposed by the software. Keep in mind that currently, and probably for many years to come, when even a human expert has difficulty distinguishing, for example, an ECG arrhythmia from a normal complex, software tools can, at best, flag an abnormality; they cannot classify it with complete reliability.
In view of this, well-designed software will establish easily accessible lists of all "difficult" data zones, allowing each of these potentially interesting events to be examined and qualified quickly and efficiently.
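To make the idea concrete, the following is a minimal Python sketch of how such a review list might be assembled; every name, threshold, and input in it (the ReviewEvent structure, the RR-interval bounds, the confidence scores) is a hypothetical illustration, not the interface of any particular telemetry product. Beats whose derived RR interval leaves a plausible range, or whose beat classification confidence falls below a threshold, are flagged and merged into contiguous zones for human review.

    from dataclasses import dataclass

    @dataclass
    class ReviewEvent:
        start_s: float   # zone start time, in seconds
        end_s: float     # zone end time, in seconds
        reason: str      # why the zone was flagged for review

    def flag_difficult_zones(times_s, rr_ms, confidences,
                             rr_low=300.0, rr_high=1500.0,
                             min_conf=0.80, merge_gap_s=2.0):
        """Build a review list of 'difficult' zones: beats whose derived RR
        interval is implausible, or whose classification confidence is low.
        Flagged beats closer together than merge_gap_s merge into one zone."""
        events = []
        for t, rr, conf in zip(times_s, rr_ms, confidences):
            if not rr_low <= rr <= rr_high:
                reason = f"RR {rr:.0f} ms outside [{rr_low:.0f}, {rr_high:.0f}] ms"
            elif conf < min_conf:
                reason = f"classification confidence {conf:.2f} below {min_conf:.2f}"
            else:
                continue   # beat looks normal; nothing to review
            if events and t - events[-1].end_s <= merge_gap_s:
                events[-1].end_s = t           # extend the previous zone
            else:
                events.append(ReviewEvent(t, t, reason))
        return events

    if __name__ == "__main__":
        times = [0.0, 0.8, 1.6, 2.4, 3.2, 4.0]
        rr    = [800, 810, 250, 805, 795, 1900]    # two implausible intervals
        conf  = [0.95, 0.97, 0.60, 0.96, 0.94, 0.50]
        for ev in flag_difficult_zones(times, rr, conf):
            print(f"{ev.start_s:5.1f}-{ev.end_s:5.1f} s: {ev.reason}")

With a list like this, the reviewer steps directly through the flagged events, confirming or editing each one, instead of scrolling through hours of recording.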