
IoT Data is an Untapped Goldmine


The Internet of Things (IoT) has exploded over the last several years as the technology for connecting devices, appliances, and homes has advanced like never before. All of these devices are producing enormous amounts of data, at volumes many companies appear ill-equipped to handle.

Managing the volume of data that comes from connected, smart devices is a challenge for companies, but they’re finding it is worth the effort because the value of the data might even exceed that of the very products that generate it. Supporting companies trying to effectively sort IoT data and extract the valuable analytics it yields was a focus at a recent tech conference.

Mobile World Congress

The annual Mobile World Congress held in Barcelona is historically a gathering of the world’s largest telecom carriers and their suppliers. Recent industry trends, and the impact of IoT specifically, have led many at this conference to focus heavily on connected devices, wireless networks, and all of the data being generated by the IoT. How to support consumer, industrial, and automotive data was a major theme at this year’s conference.


Participants at the Mobile World Congress also addressed the revolutionary potential of this data and how well companies can currently handle it. Many organizations, even tech leaders, are ill-equipped to deal with the sheer volume of this data and the analytics it demands.

Considering there are now billions of connected devices communicating and sending data all of the time, companies need up-to-date cloud network architecture, ultrafast connections, and powerful processors to succeed. 5G networks that could support this traffic were also a focus. The networks and their suppliers have recognized the value and impact of this transformative data, not only to their own businesses but to the end users of the data as well.

IIoT Challenges

Industrial IoT (IIoT) technology providers attending the Mobile World Congress were open and honest about the challenges they are facing. What’s become apparent is that, despite the technology providers’ commitment to grow and support the IoT through connected devices, even the largest and most technologically advanced of the group were not prepared for the influx of data. Humera Malik, founder and CEO of Da-Uh (an IIoT analytics solution provider), was quoted as saying, “One of the biggest surprises, when you work with Fortune 500 companies, is how far behind they are in their data readiness. There’s a huge opportunity in putting data together in a way to make the data useful.”

IIoT Data Value

Helena Schwenk, manager at IDC European Big Data and Analytics, proposes that

“Successful IoT solutions are those that will be able to convert data flows from sensors, devices, and endpoints into valuable business information, enabling organizations to automate key processes, create new products and services, and become more intelligent and connected entities.”

The data could serve to drive corporate directives and initiatives, and even help with product innovation. Few at the Mobile World Congress challenged the inherent value of all of this data to any organization. The question was simply how to get their arms around it, sort through the noise, and extract valuable data in a meaningful way.

The potential to share data across industries and organizations adds additional value. The data may belong to a customer, but the aggregate analytics are what really matter. Keywell posits,

“You don’t need to own the data to create insights across industries.”

A tire manufacturer, for example, could provide invaluable insights with its data for transportation businesses or construction companies. The possibilities for how the data is used become boundless.

Takeaways for Corporate Leaders and IIoT Professionals


  • Companies are hard at work gathering experts to manage IoT data and take responsibility for the information across all areas of business.
  • Many companies – even large, technologically savvy corporations – lag behind in terms of readiness, meaning there is still significant opportunity to put all of this data together and make it meaningful and useful.
  • The value of the data generated by IoT devices and sensors could end up surpassing the value of the underlying products themselves.
  • Infrastructure to support the transfer of IoT data will be a key contributor to any organization’s success in managing and extracting value from it.
  • There is as much opportunity to provide data to additional parties or impacted industries as there is to utilize the data in-house.

Machine Learning


Machine learning is a branch of artificial intelligence that takes advantage of advanced statistical techniques to give computers the ability to learn from data without being explicitly programmed to do so. It lets a system spot patterns and quickly make decisions with little to no human intervention.

How Machine Learning Has Evolved Over Time

Machine learning isn’t a completely new science, but it has gained a lot of momentum lately due to the availability of more powerful computing technologies. Machine learning started out from simple pattern recognition and the theory that computers can be made to learn on their own by being shown data, even if they weren’t programmed to carry out specific tasks with it. Thanks to machine learning, systems can continuously adapt when presented with new data.

The algorithms behind machine learning have been around for decades. What has changed lately is that we now have the ability to apply even more complex calculations to big data faster than ever thought possible. Examples of machine learning applications many people are familiar with include driverless cars, the fraud detection systems used by credit card companies, social media sites showing trending topics, and online shopping sites recommending items to their users.

Machine Learning Growing in Popularity

During the last few years, interest in machine learning from organizations in various industries has grown rapidly. This is mainly due to the availability of big data, more powerful and cheaper computers to process it, and affordable information storage solutions.

Businesses can now work with larger and more complex sets of data, all while expecting fast and accurate results. By building the right model, an organization can take advantage of the power of machine learning to spot profitable opportunities while reducing exposure to risk.

Who Uses Machine Learning?

Industries that routinely deal with large quantities of data have quickly realized just how valuable machine learning technology can be. It can provide them with valuable real-time insights that let them work more efficiently and be more competitive. Here is an overview of who uses machine learning and how they’re applying it to their industry:


  • Sales and Marketing

If you’ve ever seen a website recommend an item based on purchases you’ve made before or similar items you’ve looked at during a past visit, it means they were using machine learning to promote items you’re more likely to have an interest in. Marketers are able to use machine learning to acquire data, analyze it and use it in a way that results in more customized promotional campaigns. The result is advertising and marketing materials that are more relevant to shoppers, making them more likely to make a purchase.
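As a minimal sketch of that kind of recommendation, here is simple co-occurrence counting over invented purchase histories; real recommender systems are far more sophisticated, and all the item names below are hypothetical:

```python
from collections import Counter

# Hypothetical purchase histories: each inner list is one customer's past orders.
purchases = [
    ["laptop", "mouse", "keyboard"],
    ["laptop", "mouse", "monitor"],
    ["phone", "case"],
    ["laptop", "keyboard"],
]

def recommend(item, histories, top_n=2):
    """Recommend the items most often bought alongside `item`."""
    co_counts = Counter()
    for basket in histories:
        if item in basket:
            co_counts.update(other for other in basket if other != item)
    return [name for name, _ in co_counts.most_common(top_n)]

print(recommend("laptop", purchases))  # mouse and keyboard co-occur most often
```

The same counting idea, fed by far more data and richer features, underlies the "customers who bought this also bought" promotions described above.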

  • Health Care

The health care industry is adopting machine learning at a rapid pace, mainly as a result of the development of wearable medical devices and sensors providing real time information on a patient’s health. Data collected by these sensors can be very useful for medical professionals, as they can use it to identify trends or find ways to improve treatments.

  • Financial Services

Machine learning lets businesses in the financial industry make sense of their data by rapidly identifying the most important insights. This can lead to finding previously unknown investment opportunities or help investors by showing them the most profitable trades at crucial moments. Machine learning can also analyze patterns in data to identify high risk clients or transactions, which may be used to prevent fraud.
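One very simple sketch of pattern-based risk flagging is a z-score rule over transaction amounts; the figures are invented, and production fraud systems use far richer models than this:

```python
import statistics

# Hypothetical transaction amounts for one account.
amounts = [25.0, 40.0, 32.0, 28.0, 35.0, 30.0, 900.0]

def flag_outliers(values, threshold=2.0):
    """Flag values whose z-score (distance from the mean, in standard
    deviations) exceeds the threshold."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    return [v for v in values if stdev and abs(v - mean) / stdev > threshold]

print(flag_outliers(amounts))  # the 900.0 charge stands out from the pattern
```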

  • Government Agencies

Government agencies, like public utilities, are able to benefit from machine learning as they often have large amounts of data coming from a variety of sources. This data can be analyzed in various ways. For example, information coming from sensors can help utilities find ways to prevent waste, boost efficiency and save money.

Popular Machine Learning Methods

There are various machine learning methods out there, but the two most popular are supervised learning and unsupervised learning. Here’s how they work:


  • Supervised Learning

Supervised learning uses labeled examples to train a system. It’s used in cases where a desired output is already known in advance. As it learns, the algorithm receives many inputs together with the right outputs. It perfects its model by comparing its own output with the correct one to spot errors.

Supervised learning is often used in applications where past events are likely to predict future ones. For example, it can help an insurance company determine which customers are likely to file a claim in the near future.
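The train-and-compare loop can be sketched with a deliberately tiny model: a single learned threshold. The insurance-style data set and the model itself are invented for illustration, not a real insurer's method:

```python
# Hypothetical labeled examples: (claims filed recently, filed again next year).
labeled = [(0, 0), (1, 0), (2, 1), (3, 1), (4, 1), (0, 0), (2, 1)]

def train_threshold(examples):
    """Pick the threshold that best separates the two labels, by comparing
    each candidate model's output with the known correct answers."""
    best_t, best_errors = 0, len(examples)
    for t in range(6):
        errors = sum((x >= t) != bool(y) for x, y in examples)
        if errors < best_errors:
            best_t, best_errors = t, errors
    return best_t

model = train_threshold(labeled)
print(model)  # a threshold of 2 classifies every labeled example correctly
```

The key supervised-learning ingredients are all present in miniature: labeled inputs, candidate models, and error counting against the known outputs.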

  • Unsupervised Learning

In unsupervised learning, the system doesn’t know which answer is the right one and must figure it out on its own. It has to explore the data and find some kind of structure within it. A popular example of unsupervised learning is in marketing campaigns: systems identify customers with similar attributes and segment them so each group receives the right kind of marketing materials.
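Finding structure without labels can be sketched with a two-cluster k-means over a single attribute; the monthly-spend figures below are hypothetical:

```python
# Hypothetical customer attribute: monthly spend. No labels are given;
# the algorithm must discover the two segments on its own.
spend = [10, 12, 11, 95, 102, 98, 9, 100]

def kmeans_1d(values, iters=10):
    """Minimal two-cluster k-means on one numeric attribute."""
    c1, c2 = min(values), max(values)  # initial centroid guesses
    for _ in range(iters):
        # Assign each value to its nearest centroid...
        a = [v for v in values if abs(v - c1) <= abs(v - c2)]
        b = [v for v in values if abs(v - c1) > abs(v - c2)]
        # ...then move each centroid to the mean of its cluster.
        c1, c2 = sum(a) / len(a), sum(b) / len(b)
    return sorted(a), sorted(b)

low, high = kmeans_1d(spend)
print(low, high)  # low spenders vs. high spenders, discovered without labels
```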


Big Data Use Cases Explained


Big data isn’t just a catchphrase. The reality of using high-end computers to crunch unimaginable volumes of data in pursuit of insights that can mean greater profits has developed to a point where it is hard to fathom an industry that wouldn’t benefit from the process. And if “greater profits” is too general a term, let’s get more specific. Big data applications can reduce processing flaws, increase efficiency, improve production quality, as well as save time and money. To get a sense of the possibilities, let’s take a look at a few specific big data use cases.

Before We Start

While the ways in which big data can be useful to an industry are essentially unlimited, unless you approach the process with a specific business challenge in mind, there’s a good chance you will waste time and money. The power of data acquisition and analysis is a mighty tool, but without a tightly focused query, useful insight will be hard to find.

Use cases, like those we are about to reveal, provide real-world scenarios that illustrate the value of spending time beforehand to come up with a targeted question. You need to learn to think tightly and specifically when formulating a question for big data to chew on.

For example, asking where the next big market for your product will be is not likely to yield as useful a response as asking who in the US is most likely to buy more of that product. Ultimately, big data excels at finding patterns and examples, so coming up with questions that play to this strength is critical.

Product Design Customization

Without naming names, one example of using big data comes from a $2 billion company that engages in product manufacturing. With big data analysis in place, this company decided to focus on the behavior of repeat customers. At the heart of the attention lies the long-held 80/20 rule of business, which holds that approximately 80% of a company’s sales come from 20% of its customers. By discovering the habits of this critical minority, it stands to reason that more profits could be generated overall.

One of the gold nuggets uncovered was that the company lost productive manufacturing time while waiting for contracts to be signed. By ensuring that the necessary paperwork was always complete ahead of time, the company could expect more uptime, productivity, and profits. Another result of the big data process in this example was a shift to lean manufacturing, which advises ceasing production of what the customer doesn’t want and concentrating primarily on what they do want.

Improved Manufacturing Process

The following is an example of successful big data use from the pharmaceutical industry. This company manufactures vaccines and various other blood components. In order to ensure purity in the end result, it tracked 200 different variables. On the surface, that sounds like a pretty impressive effort to maintain high quality in the final product. The problem, as big data analysis pointed out, was that this intensive quality assurance process still allowed for a yield variation of 50 to 100 percent. This level of inconsistency is enough to draw attention from federal regulators, which is bad news.

The analysis was able to separate and identify nine parameters that directly impacted the quality of the final vaccine yield. The rest of the 200 variables were essentially eliminated. The end result was that the company was able to save time in the testing process, increase production by 50 percent, end up with a higher quality yield, and save $5 to $10 million per year.
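That kind of variable selection can be sketched as ranking tracked variables by how strongly each correlates with yield. Everything below is synthetic: the variable names, the data, and the single driving parameter are invented for illustration, not the company's actual analysis:

```python
import random
import statistics

random.seed(0)
# Synthetic process data: five tracked variables, one of which (var_2)
# actually drives the final yield; the rest are pure noise.
n = 50
data = {f"var_{i}": [random.random() for _ in range(n)] for i in range(5)}
yield_pct = [50 + 50 * v + random.gauss(0, 2) for v in data["var_2"]]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# Rank every tracked variable by the strength of its relationship to yield.
ranking = sorted(data, key=lambda name: abs(pearson(data[name], yield_pct)),
                 reverse=True)
print(ranking[0])  # the variable that actually drives yield rises to the top
```

With real process data, a ranking like this is how a few truly influential parameters can be separated from the many that merely add measurement overhead.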

Fortifying the Supply Chain

Here’s a nifty way one manufacturer found to use big data to make sure they got their raw materials no matter what, even in the face of tornadoes, hurricanes, earthquakes, etc. Through various predictive applications, the company was able to calculate the probabilities of delays at various points in the supply chain, then make arrangements beforehand to identify backup suppliers. This is how you use big data to guard against downtime from unexpected natural disasters.
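A toy sketch of that kind of calculation: assuming independent, invented delay probabilities per supply-chain stage, the chance of at least one disruption can be computed directly:

```python
# Hypothetical, illustrative delay probabilities for each supply-chain stage.
stage_delay_prob = {"supplier": 0.05, "port": 0.10, "rail": 0.02, "truck": 0.03}

def on_time_probability(stages):
    """Probability the whole chain runs on time, assuming each stage's
    delay risk is independent of the others."""
    p = 1.0
    for prob_delay in stages.values():
        p *= 1 - prob_delay
    return p

p = on_time_probability(stage_delay_prob)
print(f"chance of at least one delay: {1 - p:.1%}")  # roughly 18.7% here
```

Even this crude arithmetic shows why backup suppliers matter: small per-stage risks compound into a sizable chance of disruption somewhere along the chain.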

Better Testing = Higher Quality

A normal Intel computer chip used to go through 19,000 tests before it was cleared for sale. As you might imagine, this was a huge but necessary chunk of time and money dedicated to maintaining high quality standards. But you can bet that any company CEO would give his or her firstborn child (an exaggeration!) to be able to cut down on testing time without quality taking a nosedive. Big data to the rescue.

Through exhaustive analysis that began at the wafer level, Intel was able to throw out a large number of its standard tests and focus on those that yielded the most value. The savings on a single line of processors was $3 million in the first year, a figure expected to grow to $30 million once the approach is fully implemented.

The Bottom Line

The preceding examples are just a few of the dozens of use cases that could be cited. Once you narrowly define the problem and turn loose powerful computers on a big pile of data, you might be surprised at the creative solutions uncovered to problems you didn’t even know you had.


Exploring the Data Acquisition Industry

For such an innocuous term, data acquisition (DAQ) is well on its way to being a billion-dollar industry. DAQ is the process by which a company measures sound, temperature, pressure, voltage, current, or other physical and/or electrical phenomena. Though it wasn’t so long ago that these measurements were taken with simple mechanical devices and a chart recorder, the Computer Age has changed all that. You would expect a modern-day DAQ system to consist of sensors, measurement hardware, and programmable software on a PC. Who are the top 10 vendors in the DAQ market? Grab a seat and we’ll give you our opinion.

Campbell Scientific

As a major vendor in the global DAQ market, Campbell Scientific produces systems intended for survival in rugged conditions associated with long-term, unattended monitoring. Typical applications would include collecting data from machines, soil, weather, water, and energy. As an industry leader, expect Campbell Scientific to stay on the forefront of developments in the PC-based world of data acquisition.

Rockwell Automation

If you’ve ever eaten food, there’s a good chance an integrated system from Rockwell Automation might have had a hand in the process. Though the company specializes in process manufacturing (recipes and formulas) related to large-scale food production, you can also find them at work in other industries like oil and gas, mining, metals, and life sciences.

Dewetron

Dewetron’s niche in the DAQ field is all areas of research and development, as well as specialized data acquisition instruments and custom built solutions related to the automotive, energy and power, transportation, and aerospace industries. With a focus on always moving the DAQ field forward with state-of-the-art technology advances, Dewetron is part of the larger TKH group and is headquartered in Austria.

Yokogawa Electric

Based in Japan, Yokogawa Electric has earned a solid reputation in the design, manufacture, and sales of information technologies, control systems, and measurement solutions. With a century of experience behind it, this company has managed to successfully reinvent itself to stay in step with changing industrial demands. For today’s world, Yokogawa Electric continues to satisfy clients with top-notch products and service, while emphasizing environmental sustainability.

Honeywell International

Make no mistake, Honeywell International is a multinational conglomerate with fingers in a whole bunch of different industrial pies. One of those pies happens to be DAQ. As well as the usual measurement and control products, this company enjoys a high demand for its circular chart and paperless recorders. Honeywell International presently focuses in particular on energy, safety, security, productivity, and global urbanization.

Pentek

As an ISO 9001:2015 certified company, Pentek prides itself on the manufacture and sales of cutting edge DAQ solutions, with a specialty in digital signal processing and software radio applications. You can expect to find Pentek embedded chips in rugged environments associated with military and defense applications. The ISO 9001 certification assures clients that Pentek products will meet the most rigorous standards for performance.

Ametek

With 15,000 employees and locations in 30 countries, one could say Ametek has created a rather large footprint in the DAQ industry. Equipment in high demand from this manufacturer includes programmable power equipment, industrial battery chargers, analytical instruments, electromagnetic compatibility test equipment, and gas turbine generator sensors. Ametek is a leading provider of systems to the aerospace and defense industries.

Spectris

This UK company might have a name that reminds us of a secretive James Bond organization, but Spectris is anything but fiction. Specialty equipment includes a focus on improving productivity, streamlining processes, and delivering higher quality for laboratory and industrial applications. A corporate culture that encourages employee entrepreneurship keeps cutting edge products always in the pipeline.

Keysight Technologies

As another company that provides measuring equipment to the aerospace and defense industries, Keysight Technologies prides itself on not selling one-size-fits-all solutions off the shelf but rather providing a process of consulting, customization, and optimization that fits directly into the client’s product lifecycle. The result of almost eight decades of refinement and innovation, Keysight Technologies enjoys high name recognition.

National Instruments Corp.

With more than forty years under its belt, National Instruments Corp. has created a solid spot in the niche dedicated to manufacturing virtual instruments and automated test equipment. A sampling of industries served includes academic and research, wireless, aerospace and defense, automotive, energy, and heavy equipment. National Instruments prides itself on creating not just singular products but entire ecosystems for clients.

These are just a few of the big players in the DAQ field. As the Industrial Internet of Things (IIoT) grows, expect this already important industry to become even more so.


How Data Acquisition Enhances Business Development


Though the collection and analysis of data has been with us as long as there have been businesses interested in improving and understanding their processes, the Computer Age has seen a definite quickening in the sheer volume of data that can be gathered and processed. So is there really a point to being able to acquire data at a mind-boggling pace? In short, yes. A business’s essence is defined by the data it collects, but if you demand to know exactly how data acquisition can make a bottom-line difference in an industrial business or application, keep reading.

Knowing the Unknown

Gathering data about something unknown is an excellent way to learn more about such disparate entities as scientific phenomena or new product designs. Often the relevant patterns, especially in the former, are too subtle or unfold over too long a period to support intelligent suppositions. For example, it’s easy to reflect on weather patterns over a decade on the basis of personal experience alone, but the picture becomes less accurate when we try to assess long-term data that may go back a century or more.

The second example, new product design, allows engineers and designers to make tweaks based on data they collect without having to send a rough prototype into the marketplace to be tested the hard way. This not only decreases time to market but avoids foisting something dangerous or poorly functioning on innocent beta testers.

Manufacturing / Quality Testing

Do the final products rolling off the end of the assembly line match up to the original design specifications for performance and safety? Unless your company collects this data, you’re left to make a biased assessment on the basis of what seems to be reality. Humans are notoriously poor judges in such matters. Data acquisition allows you to amass accurate records of how well the end result matches design expectations. It’s critical to know whether there is a gap that tells you the specs are unreasonable or the product needs improving. All of this points right to the heart of quality control. Doing this poorly can doom a business quickly.

Repair and Diagnostics

In the manufacturing and industrial world, it’s not always a simple matter to send a technician in to figure out why a piece of machinery is malfunctioning. Maybe the issue would require a complete dismantling of millions of dollars worth of equipment that would shut down the line for days or even weeks. This is the perfect scenario to illustrate how data acquisition techniques allow for the design of diagnostic systems that can figure out the problem in a fraction of the time it would take a human brain.

Monitoring

To expand on the previous section, wouldn’t it be an even better idea to use data acquisition to create and install monitoring systems that were designed to identify maintenance and repair issues or even system failures before they occur? Based on data collected during the times when machinery is running perfectly, the preventative monitoring of manufacturing or industrial systems could eliminate or at least greatly reduce outages and the accompanying downtime and optimize machine performance.
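One way to sketch such preventative monitoring, assuming hypothetical vibration readings captured while the machinery ran normally, is to alert whenever a new reading drifts outside the normal operating band:

```python
import statistics

# Hypothetical vibration readings logged while the machine ran perfectly.
baseline = [0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 0.51, 0.49]
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def needs_maintenance(reading, sigmas=3.0):
    """Alert when a new reading drifts beyond the normal operating band,
    defined here as `sigmas` standard deviations from the baseline mean."""
    return abs(reading - mean) > sigmas * stdev

print(needs_maintenance(0.50), needs_maintenance(0.71))  # normal vs. drifting
```

A real system would track many sensors and trend over time, but the principle is the same: the baseline collected during healthy operation defines what "abnormal" means.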

Automation

Often humans aren’t the best choice when it comes to operating complex or dangerous machinery over the course of an eight hour or longer work shift. That’s where data acquisition hardware and/or software can take the human element out of the loop, saving thousands of man hours annually for large companies. The Industrial Revolution has come and gone and it’s no longer a necessity for a human hand to be on the wheel of everything that happens during the manufacturing process.

Final Thoughts

To ignore the inherent possibilities of data acquisition is like trying to stop a speeding freight train by standing on the track with a stop sign. It’s time for entrepreneurs and CEOs to drop any lingering antagonism and take a hard look at how a mass of information can help their business massively.


Understanding the Value of Data Acquisition

Those in the data science business realize that taking on important big data projects requires a structured process, or life-cycle to use a catchphrase, that includes five major stages. It’s the second, Data Acquisition and Understanding, that we’re concerned with here. Within this stage there are three primary steps, or tasks, to accomplish in order to meet two goals: 1) produce high-quality data that clearly relates to the target variables, and 2) develop a data pipeline solution that refreshes the data regularly and supports machine learning.

Ingest the Data

The first step in our Data Acquisition and Understanding process involves establishing a procedure for moving the data from where it is (the source location) to where you want it to be (the target location). Before analysis, training, or any sort of predictive activity can take place, you need to work out how you’ll effectively select and move the data sets you need.
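A minimal, illustrative sketch of such a source-to-target move follows; the CSV source, the table name, and the in-memory SQLite staging target are all assumptions made for the example, not a prescribed architecture:

```python
import csv
import io
import sqlite3

# Hypothetical source: a CSV export from a line-of-business system.
source = io.StringIO("id,temp_c\n1,21.5\n2,22.1\n3,20.9\n")

# Hypothetical target: a local staging database the modeling team can query.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (id INTEGER, temp_c REAL)")

# Select and move the records, converting text fields to typed values.
rows = [(int(r["id"]), float(r["temp_c"])) for r in csv.DictReader(source)]
db.executemany("INSERT INTO readings VALUES (?, ?)", rows)

count = db.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)  # rows now available at the target location
```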

Explore the Data

Step two in the process involves developing a solid understanding of the data. Data collected in the real world is far from perfect: it may be incomplete, noisy, or riddled with any number of other problems. This is the auditing process, and it may take several iterations before what you’re working with is clearly understood and ready to introduce to the modeling process.

The reality is the data will likely need to be cleaned before it’s of much use; the phrase “Garbage In, Garbage Out” definitely applies here, and the cleaning process is an article (or more) unto itself. Once you’re working with clean data, it’s time to step back and look for existing patterns. The goal is to note any natural connections between the data set and the target model you want to apply it to. Is there enough data to accomplish the goal and move forward?

The iterative nature of this step should be evident as well. You might have perfectly clean data but it just doesn’t match well with the modeling that is intended. There’s a distinct possibility you’ll have to go back and look for new or better data sources that will either augment or replace the first identified set.
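The audit-and-clean loop described above can be sketched as follows; the records, field names, and the single drop-incomplete-rows rule are all hypothetical, and real cleaning involves many more rules than this:

```python
# Hypothetical raw records; None marks a missing sensor value.
raw = [
    {"machine": "A", "temp": 71.0},
    {"machine": "B", "temp": None},
    {"machine": "C", "temp": 69.5},
    {"machine": "A", "temp": 70.5},
]

def clean(records):
    """One simple cleaning rule: drop records with any missing value."""
    return [r for r in records if all(v is not None for v in r.values())]

def summarize(records):
    """A first look at structure: row count and the range of a numeric field."""
    temps = [r["temp"] for r in records]
    return {"rows": len(records), "temp_min": min(temps), "temp_max": max(temps)}

cleaned = clean(raw)
print(summarize(cleaned))  # audit the cleaned set before modeling
```

Running a summary like this after every cleaning pass is one concrete way to drive the iterations: when the summary no longer surprises you, the data is understood well enough to model.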

Creating a Data Pipeline

After your data is ingested and cleaned, the next step is to create the process by which new incoming data is integrated into the working model through regular scoring and refreshing. In this sense, the data pipeline is simply an organized workflow that all team members are familiar with. It should be an automated strategy that takes new data from various sources and prepares it for use in the ongoing learning process. There are various designs this pipeline might take. The three most common are:

  • batch-based
  • streaming or real-time
  • a hybrid of the two

Constraints of the present system, as well as the specific needs of your business, obviously play a large role in determining the ultimate architecture of the pipeline.
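The batch and streaming designs can be sketched with plain functions; the records and the unit-conversion preparation step below are invented for illustration:

```python
def transform(reading):
    # Hypothetical preparation step: normalize units before scoring.
    return {"sensor": reading["sensor"], "temp_c": (reading["temp_f"] - 32) * 5 / 9}

def batch_pipeline(records):
    """Batch: collect everything first, then process in one pass."""
    return [transform(r) for r in records]

def streaming_pipeline(records):
    """Streaming: process each record as it arrives, yielding results
    one at a time instead of waiting for the full batch."""
    for r in records:
        yield transform(r)

incoming = [{"sensor": "s1", "temp_f": 212.0}, {"sensor": "s2", "temp_f": 32.0}]
print(batch_pipeline(incoming))
print(list(streaming_pipeline(incoming)))  # same results, different timing
```

A hybrid design simply mixes the two: a streaming path for time-sensitive scoring and a batch path for periodic retraining over the accumulated history.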


Deliverables

As we come to the end of this stage, there are three deliverables that should be complete before proceeding to the next major stage, Modeling. They are:

Data Quality Report: This report should include attribute and target relationships, variable ranking, and data summaries at the least, but can cover much more ground if you need it.

Solution Architecture: This should contain a description of the data pipeline you will use to build predictive solutions from new data after the model is complete. The pipeline used to retrain your model on new data should be included as well.

The Big Decision: Some call this the checkpoint decision. The bottom line is that it serves as a place to stop and evaluate what you’ve done and what you expect to accomplish in the future. Ultimately, now is the time to cut short the project if the returns don’t justify the cost and labor time involved. Your basic choices are to proceed, collect more data, or give up the project.


The Importance of Data for Your Business

Although it may seem that the concept of data collection belongs to the Information Age, effective business leaders were collecting data long before the advent of modern technology, and they used that data to make decisions about their businesses. In the past, data collection was done manually.

A business might have questioned customers or clients in person or by taking surveys through the mail or over the phone. Businesses would use this data to create informed marketing strategies to reach more customers. However, because manual data collection was costly and time consuming, businesses had to rely on a small sample to make their decisions.

Now, with the technology available, it is easy to collect an ample amount of data to inform your marketing. In fact, you may find it challenging to sift through the data to understand what is relevant.

Bernard Marr, a data and analytics expert, said, “I firmly believe that Big Data and its implications will affect every single business – from Fortune 500 enterprises to mom and pop companies – and change how we do business, inside and out. Basically, no matter how small your business is, you do have a use for data.”

While data is important, you cannot rely solely on it to make your business successful. You must still work hard and make good decisions. Data is just one of the many resources you can use to grow your business.

Data and Decision Making

Even the smallest of businesses can create data. If your business has any sort of online presence like a website or social media account or if you accept online payments, then you have data that you can use.


When making decisions about your company, you likely have a lot of factors that influence your decisions. You may rely on intuition or things you witness within your company to help you make decisions. Data can be a powerful resource because it gives you facts and figures to drive your decisions.

For example, when ordering inventory, you may make purchases based on what you have noticed has been selling well. Or, by using data, you could have exact numbers on which items are selling and how many you have in stock. Data can allow you to make the most accurate decision.

You may worry that your business has yet to generate enough data, but Merit Solutions states that any business that has been going for at least a year has “a ton of data” to use when making decisions. People just need to know how to use it.

Any size business can use data for a variety of purposes, including:

  • gaining and keeping customers
  • improving customer service
  • marketing and social media
  • making predictions about sales

Data and Problem-Solving

When your business is having issues, such as slow sales or an unsuccessful marketing campaign, you will want to use data to help you figure out exactly what went wrong.

Analyzing data can help you learn exactly what your business is doing right and what needs to be improved.

Data and Performance

If you want to know how your company, or even a particular part of your business, is performing, collecting and analyzing data can show you.

For example, when marketing, it is important that your campaigns earn more money than you spend on them. Data can help you see if your marketing is worth the money.
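As a tiny worked example of that comparison, return on investment is just the profit a campaign generates divided by its cost; the figures below are invented:

```python
# Hypothetical campaign figures.
campaign_cost = 5_000.0
revenue_attributed = 12_500.0

# A campaign is worth the money when the return exceeds the spend.
roi = (revenue_attributed - campaign_cost) / campaign_cost
print(f"ROI: {roi:.0%}")  # every dollar spent returned $1.50 of profit
```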

Or, you may have a salesperson who you think is the best in your business, so you give him or her the best leads. Reviewing data may show you that another salesperson has a higher performance, but gets fewer leads. Knowing what is happening allows you to make informed decisions.
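Put as numbers, that comparison might look like the following; both salespeople and their figures are hypothetical:

```python
def conversion_rate(sales_closed, leads_given):
    """Share of leads a salesperson turns into sales."""
    return sales_closed / leads_given

star = conversion_rate(30, 100)   # the "best" salesperson, given the most leads
quiet = conversion_rate(12, 30)   # fewer leads, but a higher close rate

print(f"star: {star:.0%}, quiet: {quiet:.0%}")
```

Measured per lead rather than in total, the quieter performer comes out ahead.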

Data and Processes

Data can make a difference in your processes and let you minimize waste and lost time. Marina Martin says in her book Business Efficiency For Dummies, “Inefficiencies cost many companies anywhere from 20-30% of their revenue each year.”

Business Insider states that businesses waste the most money on bad advertising decisions. Data can help you to focus your advertising and maximize your ROI.

Data and Consumers and the Market

Data allows you to better know your customers and what they want. PayPal co-founder Max Levchin said, “The world is now awash in data and we can see consumers in a lot clearer ways.”

Collecting data is essential to any business looking to use real-world information to make smarter and more effective decisions. It is not a luxury but a necessity in the 21st century.

Posted on

Introduction to DAQ Software

gathering data

Software for DAQ systems has advanced rapidly over the past few years. Gone are the days of having to write software from scratch for every sensor in your DAQ system. And whilst some of the skill of writing bespoke DAQ software might have faded into the past, the data tools available now offer far more powerful data manipulation and analysis than ever before. This, and the ever-decreasing cost of DAQ software, means that it is becoming ever more common to see DAQ systems implemented even on amateur projects.

Whilst the software you use to manipulate and analyse your DAQ data will depend on your individual requirements, and it is therefore hard to recommend the right approach for you, in general there are three approaches to DAQ software – the old-school, bespoke approach, off-the-shelf proprietary DAQ software, and software incorporated into your input device. Let’s take a look at all three.

The Traditional Way

The old-school way of collecting and analysing DAQ data was to have an engineer write you bespoke software for each and every sensor you had implemented. If that sounds like it was a slow, laborious process, you don’t know the half of it.

Today, this approach is not really suitable for the vast majority of users. Some highly technical industries, employing exotic sensors and requiring very low latency rates, still benefit from having bespoke software written for them. For most of us, however, the hassle and expense of doing this rules it out.

Proprietary DAQ Software

Once DAQ cards became standard across many industries, it was possible for software companies to build software that could aggregate and analyse data from most of these cards. Today, there are a huge number of software solutions available for DAQ systems, ranging from simple data loggers to fully-featured data visualization software.

Among the most popular packages today are: WinDaq, a pretty basic but reliable solution; IceCube, a little more expensive but offering a huge scope for modification and customization; and Chameleon DAQ, a newcomer to the market but quickly making a name for itself.

These software packages have many advantages, of course. The ability to analyse data from within common desktop environments, and output this in widely-recognized formats, is a huge advantage over older systems.

Incorporated DAQ Software

That said, proprietary DAQ software still commonly requires the user to have some knowledge of a range of proprietary computer programming languages. The outcome of this is that companies have to spend resources on getting specialists in to program their DAQ systems, and amateur users have to waste many hours learning coding just to get the data they need.

Accordingly, in recent years some DAQ devices have incorporated software into the hardware DAQ device itself. There are many advantages to this approach – not only are these systems easier to use, but the fact that this software is running on a dedicated piece of hardware greatly reduces latency time.

Overall, for most small businesses and amateur users, this kind of DAQ software is recommended, if only because it can be used “straight out of the box”.

Posted on

Input Devices for DAQ – What is the Best Way to Gather Data?

Today we will take a look at input devices for Data Acquisition (DAQ) systems. If you are new to DAQ, or are coming to it afresh after some time away, it’s worth reminding yourself of the basic parts of DAQ systems.

Essentially, most DAQ systems incorporate three components – the sensors that take a real-world phenomenon and turn it into an electrical signal, a card or other device that aggregates and sometimes amplifies these inputs, and then the computer terminal which is used to analyse the data produced.

Input devices are therefore at the heart of DAQ systems, taking the input from many sensors, aggregating them, and then passing them to software for analysis. Nowadays, many input devices are able to perform quite sophisticated manipulation of signals before passing them to software, and these devices range widely in terms of performance and extra features.
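One small but concrete example of the signal manipulation mentioned above is the scaling an input device (or its software) applies to raw converter readings. The sketch below assumes a unipolar 12-bit ADC with a 5 V reference and a hypothetical linear temperature sensor outputting 10 mV per degree Celsius; real devices document their own ranges and scaling:

```python
def counts_to_volts(counts, bits=12, v_ref=5.0):
    """Map a raw ADC reading to a voltage (unipolar 0..v_ref range assumed)."""
    return counts / ((1 << bits) - 1) * v_ref

def volts_to_celsius(volts, mv_per_degree=10.0):
    """Hypothetical linear temperature sensor: 10 mV per degree Celsius."""
    return volts * 1000.0 / mv_per_degree

raw = 205                        # example reading from the 12-bit converter
volts = counts_to_volts(raw)     # roughly 0.25 V
print(f"{volts_to_celsius(volts):.1f} degrees C")
```

Modern input devices often perform this conversion, and more elaborate filtering, before the data ever reaches analysis software.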

It is impossible, of course, to recommend the perfect DAQ input device for your purposes, because the sheer range of systems that now have DAQ systems incorporated in them means that each system is unique.

Nevertheless, there are three broad types of data input device, and it is worth knowing the differences between them:

Direct Output

This is the way it used to be done, in the bad old days before modern systems. Typically, in a factory 20 years ago, each sensor would be hard-linked to a dedicated computer terminal. There were many problems with this approach, not least the expense of having individual terminals for each sensor, and replacing these every time the factory’s environment killed them.

Today, this is not a serious consideration for most DAQ users, unless you have very specific requirements that necessitate a direct hardware link.

DAQ Cards

When DAQ cards were invented a few decades ago, they were hailed as a revolution in DAQ systems. The advantage over older systems was certainly pronounced – one card inside a computer could take and aggregate inputs from multiple sensors, and this significantly cut down the cost of DAQ systems.

As DAQ cards developed, the number of inputs they could receive increased year on year, and multi-channel DAQ systems became commonly used. The low initial investment also meant that many companies who had never used DAQ systems before started to implement them.

In addition, as DAQ cards developed, more and more sensors and software systems were made compatible with them, which helped them to become the industry standard DAQ input device for many years.

Portable DAQ Units

Today, however, DAQ cards are themselves being replaced with portable, discrete DAQ units. These devices incorporate all of the advantages of DAQ cards, being massively multi-channel and able to accept a huge range of input types, but also have a few features that give them the edge.

Many of these new devices are able to output data via Wi-Fi, for instance, using already existing infrastructure as a medium to collect DAQ data, and further cutting costs. This also makes them portable, obviating the need to disassemble machinery to access DAQ data.

All in all, it is expected that these portable DAQ input devices will eventually replace DAQ cards in the vast majority of situations.

Posted on

The End of DAQ Cards

why daq cards are dead

DAQ cards were once the go-to solution for data acquisition systems. In the days when data acquisition was something of a niche activity, limited to high-end manufacturing and academic applications, often these cards were the only way of collecting and storing incoming sensor data.

If you’ve used DAQ cards, however, you are probably already aware that they suffer from some pretty major drawbacks. First and foremost among these are the price of dedicated DAQ cards, and the fact that in order to use them the user often needs extensive knowledge of proprietary programming languages.

In recent years, a new breed of data acquisition devices has appeared on the market, offering flexible data acquisition solutions that make implementing DAQ systems much easier. DAQifi devices are at the forefront of these innovative devices, and offer several huge advantages over traditional DAQ card implementations.

The Advantages of DAQifi Devices

DAQ cards typically output data using a dedicated hard link, and in years past this often meant having a separate PC workstation for every data acquisition process. Not only did this mean extra expense in terms of hardware, it often meant that bringing data from several processes together was a manual, painful business. DAQifi devices send the collected data over a Wi-Fi network – either an existing one, or one generated by the device itself – to custom software.

What this means in practice is that a single PC, tablet, or even smart phone can be used to aggregate all the data being collected, bringing it all together for easy analysis and manipulation. This also means that the computer being used to collect and manipulate data does not need any additional hardware to be used for this purpose.
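A hedged sketch of what that aggregation side could look like: one program receiving small JSON messages from several wireless units and grouping them by device for analysis. The message format here is invented for illustration and is not the actual DAQifi protocol:

```python
import json

def aggregate(messages):
    """Group incoming readings by the device that sent them."""
    by_device = {}
    for raw in messages:
        msg = json.loads(raw)
        by_device.setdefault(msg["device"], []).append(msg["value"])
    return by_device

# In practice these would arrive over the network; here they are hard-coded.
packets = [
    '{"device": "unit-1", "value": 2.50}',
    '{"device": "unit-2", "value": 0.75}',
    '{"device": "unit-1", "value": 2.48}',
]
print(aggregate(packets))
```

The key design point is that a single ordinary machine does the collecting; no per-process workstation or add-in hardware is involved.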

In addition, DAQifi devices represent better value than many DAQ card solutions. This is because DAQ cards are often made to be used to collect one type of data only, and in many cases this means that a bank of cards must be used in order to collect even quite basic data. The flexibility of DAQifi devices makes them cheaper to implement in many situations.

This is especially true in situations where portability is paramount. The fact that DAQifi devices run on their own power makes them ideal for situations where having a dedicated PC workstation is simply impossible. This is the case in many industrial processes, where the environment is not conducive to the health of computer hardware, and also in situations where the system under study is inherently mobile, such as in automotive engineering.

Lastly, the user interface which comes as standard on DAQifi devices means that using them is incredibly simple in comparison to many DAQ card solutions. Often, even in high-end scientific applications, all that is needed from a data acquisition system is for it to feed data to a centralized device, in a format which is easy to work with, for later analysis.

This is exactly what DAQifi devices achieve, and it is therefore not surprising that they are eclipsing DAQ card solutions in many situations.

To learn more about our products please click here.