Sustainable High-speed Data Acquisition

Introduction to High-Speed Data Acquisition: What You Should Know

The importance of data acquisition (DAQ) systems in laboratory, industrial, and field applications cannot be overstated: they provide the standard mechanism for transferring analog data into a computer for further processing. These DAQ systems, however, are anything but monolithic.

They range from systems that gather data at a relatively slow rate without demanding a high degree of accuracy, to those that must gather very accurate data at a very high rate. While DAQ systems differ in characteristics such as sample rate, number of channels, resolution, memory, and platform, they share the common goal of making sense of real-world phenomena by leveraging digital technology.

The latter are called high-speed data acquisition (high-speed DAQ) systems, and they operate at levels of precision and data volume several orders of magnitude beyond a typical DAQ.

DAQ Systems Design

A not-so-typical example of a high-speed DAQ system is a mission-critical application such as a missile guidance system, which requires highly accurate information to be gathered, stored, and analyzed billions of times per second.

Needless to say, this is not a feat for a fainthearted system. Implementing a high-speed DAQ system will test the integrity and mettle of your computer like no other application.

Depending on the application and use case, most systems will subsequently compress the large volume of accumulated data into a format that can be suitably handled in a PC environment.

Interconnected Ecosystem of Dependence

The missile-guidance example above is obviously of the cutting-edge variety; most systems don’t require such a high degree of data-acquisition velocity.

The difference between a typical high-speed DAQ system and an ordinary one lies mostly in its capability to acquire and transfer data at the same time, with no gaps in the analog record, at rates of up to, say, 2 GB/s streamed continuously to storage systems with capacities as large as 192 TB.

The system and hardware sustaining data rates at these levels must be finely tuned for the task at hand. This optimization matters because of the plethora of interconnected subsystems interacting and multitasking with each other at the same time.
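Gapless simultaneous acquire-and-transfer is usually achieved by decoupling the reader from the writer with a bounded buffer, so a momentary storage stall does not interrupt sampling. A minimal sketch of that producer/consumer pattern (the function names and the simulated hardware reader are illustrative, not any vendor's API):

```python
import queue
import threading

def acquire_blocks(n_blocks, block_size):
    """Stand-in for hardware reads: yields fixed-size sample blocks."""
    for i in range(n_blocks):
        yield bytes([i % 256]) * block_size

def stream_to_storage(n_blocks=100, block_size=4096, depth=16):
    """Acquire and write concurrently via a bounded queue (no gaps)."""
    buf = queue.Queue(maxsize=depth)   # bounded buffer absorbs write jitter
    written = []

    def writer():
        while True:
            chunk = buf.get()
            if chunk is None:          # sentinel: acquisition finished
                break
            written.append(chunk)      # a real system would do file.write(chunk)

    t = threading.Thread(target=writer)
    t.start()
    for chunk in acquire_blocks(n_blocks, block_size):
        buf.put(chunk)                 # blocks only if the writer falls behind
    buf.put(None)
    t.join()
    return sum(len(c) for c in written)
```

The queue depth is the tuning knob: deep enough to ride out disk latency spikes, shallow enough to bound memory use.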

High-speed DAQ systems are therefore designed as rich ecosystems of dependency. Their constituent parts typically comprise the data acquisition and recording itself, graphical monitoring of system performance and associated signals, and seamless output and synchronization to external devices, all working together within the real-time framework of the application.

The most recent iterations of these systems have powerful architectures comprising a host of subsystems that perform intelligent signal processing while also handling signal conditioning for the most commonly used sensors and simultaneous multichannel sampling of numerous signals, all monitored and recorded in real time.

Challenges of Maintaining a High-Speed DAQ System

Computers have evolved in sophistication to the extent that they can perform tasks that were unimaginable just a couple of years ago. CPU power has increased in leaps and bounds with the help of Moore’s law, and processors are now used in a myriad of applications such as high-resolution gaming, flight simulation, querying and searching large databases, and high-speed data acquisition.

For the conventional data acquisition system, however, a real challenge is identifying when your high-speed acquisition is running into, or is about to run into, trouble. Why is this so?


Well, nothing puts more strain on a computer than operating a high-speed, real-time data acquisition system. If you really want to test the limits of a computer, or expose its weaknesses and deficiencies, there is no better way than to subject it to the full weight of multi-channel data acquisition software.

A high-speed DAQ system has to be on top of its game if it is to handle the outputs of virtually any device used in unstable or unreliable environments, such as those encountered in airborne or mobile geophysical exploration and related monitoring systems.

The system should also permit real-time graphical monitoring of signals as waveforms, with high-resolution displays or chart recorders, analog interfaces of high resolution (at least 16 bits), and a variety of channel configurations (32 single-ended channels, 16 differential channels, and so on).
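To make the resolution figure concrete: with a 16-bit converter, each code step corresponds to the input span divided by 2^16. A minimal conversion sketch, assuming a bipolar ±10 V input range and two's-complement codes (the actual scaling and code format are device-specific):

```python
def adc_to_volts(code, v_ref=10.0, bits=16):
    """Convert a signed two's-complement ADC code to volts.

    Assumes a bipolar input range of ±v_ref, so one LSB equals
    2 * v_ref / 2**bits. For 16 bits and ±10 V, one LSB ≈ 305 µV.
    """
    lsb = 2.0 * v_ref / (1 << bits)
    return code * lsb
```

Halving the range or adding bits shrinks the LSB proportionally, which is why 16-bit interfaces are the practical floor for precise waveform work.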

Nothing drains the power of a conventional PC faster than streaming high-speed data to a hard disk at sampling rates above 1 kHz per channel, especially while also maintaining a real-time display.
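A common way to keep the display responsive without throttling disk throughput is to decimate only the display path: the full-rate record still goes to storage, while the plot shows a per-bucket min/max envelope. A small illustrative sketch (the function name and bucketing scheme are assumptions, not any product's API):

```python
def minmax_decimate(samples, bucket):
    """Reduce a high-rate record to per-bucket (min, max) pairs.

    A display only needs roughly one envelope pair per screen pixel,
    so a real-time plot can redraw cheaply while the full-rate data
    streams to disk untouched.
    """
    out = []
    for i in range(0, len(samples), bucket):
        chunk = samples[i:i + bucket]
        out.append((min(chunk), max(chunk)))
    return out
```

Min/max (rather than plain subsampling) is preferred because it preserves short spikes that naive decimation would silently drop.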

Expert Advice to Guide You through the High-Speed DAQ Quagmire

At DAQiFi we understand that the challenge of selecting the ideal data acquisition components or solution depends on your application needs, and we are positioned to help you with our expertise and knowledge.

Whether your application requirements are fuzzy or crystal clear, and whatever stage of your high-speed DAQ implementation you find yourself at, DAQiFi can help. DAQiFi systems and devices are carefully selected to ensure that they yield optimum performance in terms of speed and endurance.

Our systems are designed to be self-calibrating, with careful attention paid to noise immunity, anti-aliasing filters, and crosstalk minimization, all with an emphasis on performance and reliability.

Our data collection systems do not buckle under pressure, even when used outside of typical mainstream applications, because we impose very demanding requirements for reliability under extreme environmental conditions. DAQiFi specialists are there to help you define the best solution for your applications.

User-Friendliness Should Not Obscure Underlying Sustainability

Most high-speed DAQ systems are designed to balance flexibility and ease of use. From the end user’s perspective, configuration and setup are almost turnkey, with most systems employing a simple, intuitive user interface that requires only a few clicks to get things up and running.


Since these systems work at high levels of abstraction that isolate the average user from the underlying complexity, it is easy to take for granted the heavy-duty hauling that goes on behind the scenes. For example, users are afforded the flexibility of defining sampling rates for the different inputs, as well as specifying recording formats.
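Behind those few clicks, the front end ultimately produces a per-channel setup that the hardware must honor. A hypothetical configuration sketch (field names and values are purely illustrative, not any vendor's actual format):

```python
# Illustrative per-channel configuration a DAQ front end might generate.
config = {
    "recording_format": "binary",   # e.g. raw binary vs. CSV
    "channels": [
        {"id": 0, "mode": "differential",  "sample_rate_hz": 50_000},
        {"id": 1, "mode": "single-ended",  "sample_rate_hz": 1_000},
    ],
}

def aggregate_rate_hz(cfg):
    """Total samples/second the hardware must sustain across all channels."""
    return sum(ch["sample_rate_hz"] for ch in cfg["channels"])
```

The aggregate rate is what actually stresses the bus and the disk, which is why per-input rate choices matter more than they appear to from the UI.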

These upfront conveniences shouldn’t blind you to the drain that high-speed DAQ systems place on your PC’s resources. Understanding this reality enables system owners and administrators to anticipate possible bottlenecks and ameliorate them accordingly.

Since applications featuring high-speed data acquisition require fast sampling rates, it is imperative for both the underlying software and hardware to be as efficient and responsive as possible. If you want data transfer to remain unconstrained and your analysis to be fast and unencumbered, you have to make sure that the key parameters guaranteeing effectiveness aren’t compromised in any way.
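The arithmetic behind those key parameters is simple but unforgiving: sustained data rate is the product of channel count, per-channel sample rate, and bytes per sample. A back-of-envelope helper (the function name is illustrative):

```python
def daq_throughput_bytes_per_s(channels, sample_rate_hz, bytes_per_sample=2):
    """Sustained data rate a multi-channel DAQ must move end to end.

    Every link in the chain (bus, driver, disk) must keep up with this
    figure continuously, not just on average.
    """
    return channels * sample_rate_hz * bytes_per_sample

# 32 channels at 1 MS/s with 16-bit (2-byte) samples → 64 MB/s sustained.
rate = daq_throughput_bytes_per_s(32, 1_000_000, 2)
```

Running this calculation before buying hardware is the cheapest bottleneck analysis available: any component rated below the result will be the weak link.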

At DAQiFi, we prioritize designing these types of robust applications, offering our customers a flexible means to synchronize external devices while spotting trouble before it ruins a critical test.

Useful High-Speed DAQ Applications, Implementations, and Configurations

There is already an abundance of applications where digital data from analog-to-digital converters is gathered, stored, and analyzed, such as medical equipment, industrial measurement and control, radar systems, and avionics. Many more applications are now finding uses for high-speed DAQ systems.

Data acquisition systems with high throughput are usually required to continuously and rapidly acquire and process large volumes of data. To do so uninterruptedly, such systems need reliable connectivity. A high-speed wireless data acquisition system (WDAS) is an obvious remedy and one has been developed for novel applications such as geotechnical centrifuge model testing.

Oscillating Between High-Speed DAQ and DAQ

This particular use-case designed for geotechnical modeling applications illustrates the interplay between “ordinary” DAQ and the high-speed variety. 

Such geotechnical systems maintain continuous low-speed data capture, then trigger a burst of high-speed acquisition when an external event occurs. Once the triggering event has passed, the system simply reverts to low-speed acquisition to monitor the aftermath.
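The low-speed/high-speed switching described above amounts to a small state machine: monitor slowly, burst at high rate for a fixed window when a trigger fires, then fall back. A minimal sketch (rates, threshold trigger, and function name are illustrative assumptions, not the actual centrifuge system's logic):

```python
LOW_RATE_HZ = 10          # continuous monitoring rate
HIGH_RATE_HZ = 100_000    # burst rate during an event

def plan_rates(signal, threshold, burst_len):
    """Return the per-sample acquisition rate for a recorded signal.

    A threshold crossing triggers `burst_len` samples at the high rate;
    afterwards the system reverts to the low monitoring rate.
    """
    rates, burst_remaining = [], 0
    for x in signal:
        if burst_remaining == 0 and abs(x) > threshold:
            burst_remaining = burst_len      # trigger: enter burst mode
        if burst_remaining > 0:
            rates.append(HIGH_RATE_HZ)
            burst_remaining -= 1
        else:
            rates.append(LOW_RATE_HZ)        # quiescent monitoring
    return rates
```

Keeping the trigger decision this simple is deliberate: the burst window must open within one sample period, so anything expensive belongs in post-processing, not in the trigger path.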


The data in this particular setup is stored persistently within logging units built from solid-state memory. However, it can also be streamed directly to the centrifuge control room in real time, at frequencies as low as 10 Hz, using wireless transmission.

The solid-state memory employed in this configuration typically features flash memory, which is extremely reliable in demanding, high-vibration environments.

Minimizing Development and Maintenance Overhead

Other high-speed DAQ systems have used configurations that eliminate local operating systems and CPUs altogether to minimize the attendant software development and maintenance costs.

Messia-III (Modularized Extensible SyStem for Image Acquisition) is a high-speed data acquisition system designed for Subaru, the Japanese 8.2 m telescope. A VMEbus-based system, Messia is linked to a host UNIX workstation over a direct 1-Gbps link, removing or mitigating the need for a local OS.

This overall configuration minimizes the time required to set up the CCD controller system used for advanced instruments. However, one peculiarity that might not transfer well to other systems is that it isn’t event-driven and does not require real-time controls.