How do openQCM Q-1 and openQCM NEXT measure dissipation across multiple harmonics?


Ever wondered what really happens when your openQCM measures dissipation? Today we’re opening the hood and showing you exactly how it works—including the engineering challenge that made us rethink the standard approach.

One of the questions we get asked most often is: “How exactly does the dissipation measurement work?”

It’s a fair question. And honestly, the answer involves a bit of creative problem-solving that we think you’ll find interesting—especially if you’ve ever struggled with measurements in viscous solutions or wondered why your higher overtones sometimes behave unexpectedly.

The Sensing Principle

openQCM Q-1 and openQCM NEXT measure two parameters simultaneously: frequency (related to mass changes) and dissipation (related to viscoelastic properties). This dual measurement capability is what makes QCM-D such a powerful technique for characterizing soft matter, biological films, and polymer layers.

Unlike systems based on oscillator circuits or the ring-down technique, our instruments employ a scalar network analyser approach. We passively interrogate the quartz crystal by performing a frequency sweep around the resonance, generating a sinusoidal signal and measuring the amplitude of the crystal’s response.

Think of it as gently probing the system across a range of frequencies rather than forcing it to oscillate at a predetermined point. This approach allows us to reconstruct the complete resonance curve and extract both frequency and bandwidth information.

Building the Resonance Curve

The measurement process is conceptually straightforward:

Step 1. Set an excitation frequency f₁ and measure the response amplitude

Step 2. Increment to f₂, f₃, f₄… measuring each response

Step 3. The complete set of points defines the resonance curve

Step 4. Repeat for each harmonic: fundamental, 3rd, 5th, 7th overtone…

From this curve, we extract the resonance frequency (the peak position) and the bandwidth (the peak width), which is directly related to energy dissipation in the system.
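The sweep logic above can be sketched in a few lines of Python. This is a simplified illustration, not the instrument firmware: the Lorentzian stand-in for the crystal response and all numeric values are assumptions.

```python
import math

def lorentzian_amplitude(f, f_r, bandwidth, a_peak):
    """Amplitude response of an ideal resonator (Lorentzian lineshape).
    Stands in for the real crystal response measured by the instrument."""
    x = 2.0 * (f - f_r) / bandwidth
    return a_peak / math.sqrt(1.0 + x * x)

def sweep_resonance(f_start, f_stop, n_points, response):
    """Step the excitation frequency and record the amplitude at each point."""
    step = (f_stop - f_start) / (n_points - 1)
    freqs = [f_start + i * step for i in range(n_points)]
    amps = [response(f) for f in freqs]
    return freqs, amps

# Sweep around a 10 MHz fundamental and locate the peak
freqs, amps = sweep_resonance(9.995e6, 10.005e6, 501,
                              lambda f: lorentzian_amplitude(f, 10e6, 1000.0, 18.0))
i_peak = max(range(len(amps)), key=amps.__getitem__)
print(f"resonance ≈ {freqs[i_peak]:.0f} Hz, peak amplitude ≈ {amps[i_peak]:.1f}")
```

The set of (frequency, amplitude) points returned by `sweep_resonance` is exactly the reconstructed resonance curve described in the steps above.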

The Classical Approach: −3 dB Bandwidth

The standard definition of dissipation is elegantly simple:

D = \frac{1}{Q} = \frac{\Delta f}{f_r}

where Δf is the bandwidth measured at the −3 dB level (half-power points), corresponding to an amplitude of Apeak/√2. This definition has solid physical grounding: it relates directly to the energy dissipated per oscillation cycle.
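A minimal sketch of the half-power extraction, assuming a sampled resonance curve; the linear interpolation between sweep points and the synthetic Lorentzian test data are illustrative choices, not the instrument's actual implementation.

```python
import math

def bandwidth_at_level(freqs, amps, level):
    """Return f2 - f1, where f1 and f2 are the frequencies at which the
    resonance curve crosses the given amplitude level (linear interpolation
    between sweep points)."""
    crossings = []
    for i in range(len(amps) - 1):
        lo, hi = amps[i], amps[i + 1]
        if (lo - level) * (hi - level) < 0:  # level crossed in this segment
            t = (level - lo) / (hi - lo)
            crossings.append(freqs[i] + t * (freqs[i + 1] - freqs[i]))
    if len(crossings) < 2:
        raise ValueError("cut-off level does not intersect the curve twice")
    return crossings[-1] - crossings[0]

def dissipation_3db(freqs, amps):
    """D = Δf / f_r, with Δf measured at the half-power level A_peak / sqrt(2)."""
    i_peak = max(range(len(amps)), key=amps.__getitem__)
    level = amps[i_peak] / math.sqrt(2.0)
    return bandwidth_at_level(freqs, amps, level) / freqs[i_peak]

# Synthetic Lorentzian, 1 kHz bandwidth at 10 MHz: D should come out near 1e-4
freqs = [9.99e6 + i * 40.0 for i in range(501)]
amps = [18.0 / math.sqrt(1.0 + (2.0 * (f - 10e6) / 1000.0) ** 2) for f in freqs]
print(dissipation_3db(freqs, amps))
```

Note the `ValueError` branch: it is exactly the failure mode discussed below, where the fixed cut-off level no longer intersects a weak, noisy peak twice.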

In an ideal world, this would be all we need. However, real-world measurements—particularly at higher overtones—present a challenge that isn’t always discussed in textbooks.

The Overtone Problem

During extensive testing of our instruments, we observed something important: the amplitude of resonance peaks decreases significantly as the overtone order increases.

If your fundamental at 10 MHz shows a peak amplitude of 18 arbitrary units, the 5th overtone might reach only 12–14 units, and the 9th overtone even less. This occurs due to several physical factors:

  • Reduced mechanical displacement at higher harmonics
  • Frequency-dependent coupling efficiency between electronics and crystal
  • Increased energy losses at higher frequencies

When the peak amplitude becomes sufficiently low, rigidly applying the fixed −3 dB cut-off criterion to an overtone with reduced peak amplitude encounters practical difficulties:

Noise floor proximity. The −3 dB cut-off level approaches the system noise, making precise identification of f₁ and f₂ difficult.

Degraded signal-to-noise ratio. Small fluctuations translate into significant bandwidth errors.

Undefined intersections. In extreme cases, the −3 dB level may not clearly intersect the resonance curve at all.

If you’ve ever experienced unstable dissipation readings at higher harmonics, this is likely the underlying cause.

Our Approach: Adaptive Cut-off Levels

Faced with this limitation, we had two options: accept that higher overtones would be inherently unreliable, or develop a more robust methodology. We chose the latter.

openQCM Q-1 and openQCM NEXT employ a custom cut-off level for each harmonic, defined as:

A_{cutoff,n} = A_{peak} - \Delta A_n

where ΔAn is an amplitude offset calibrated for each overtone during instrument setup, ensuring the measurement remains well above the noise floor while still capturing the relevant portion of the resonance curve.

The ΔAn values are determined during instrument calibration, optimized for each harmonic to balance noise immunity with measurement sensitivity.

A custom cut-off level is used for each overtone rather than the fixed −3 dB level: the cut-off for bandwidth determination (Δf = f₂ − f₁) is calculated relative to each harmonic's peak amplitude, ensuring reliable dissipation measurements even as signal amplitude decreases at higher overtones.
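The adaptive criterion can be sketched as follows. The ΔA_n values shown are invented for illustration, since the real offsets are calibrated per instrument during setup.

```python
import math

# Illustrative amplitude offsets per overtone; the real ΔA_n values are
# calibrated on the instrument, so these numbers are invented for the sketch.
DELTA_A = {1: 6.0, 3: 5.0, 5: 4.0, 7: 3.0, 9: 2.5}

def adaptive_bandwidth(freqs, amps, overtone):
    """Bandwidth Δf = f2 - f1 at the harmonic-specific cut-off
    A_peak - ΔA_n instead of the fixed half-power level."""
    i_peak = max(range(len(amps)), key=amps.__getitem__)
    level = amps[i_peak] - DELTA_A[overtone]
    crossings = []
    for i in range(len(amps) - 1):
        lo, hi = amps[i], amps[i + 1]
        if (lo - level) * (hi - level) < 0:  # level crossed in this segment
            t = (level - lo) / (hi - lo)
            crossings.append(freqs[i] + t * (freqs[i + 1] - freqs[i]))
    if len(crossings) < 2:
        raise ValueError("cut-off level does not intersect the curve twice")
    return crossings[-1] - crossings[0]

# Synthetic 5th-overtone peak: 50 MHz, amplitude 13, 3 kHz wide
freqs = [49.995e6 + i * 10.0 for i in range(1001)]
amps = [13.0 / math.sqrt(1.0 + (2.0 * (f - 50e6) / 3000.0) ** 2) for f in freqs]
print(adaptive_bandwidth(freqs, amps, 5))
```

Because the level is referenced to each harmonic's own peak, the crossing points stay well above the noise floor even when the absolute amplitude drops.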

Physical Interpretation

We want to be transparent about what this methodology means for your data.

The parameter we measure—denoted Dn(inst) (instrumental dissipation)—is systematically related to, but not numerically identical to, the canonical dissipation factor defined at −3 dB. The absolute values will differ.

However, and this is the key point:

Relative variations ΔDn(inst) faithfully track real changes in energy dissipation. When your film softens, D increases. When it rigidifies, D decreases. The trends are physically meaningful and reproducible—which is precisely what matters for real-time monitoring experiments.

For applications requiring absolute comparison with literature values, correction factors can be applied to map the measured bandwidth to the equivalent −3 dB value. This mapping depends on the resonance lineshape and can be determined analytically for Lorentzian resonances or empirically using reference measurements.
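For a Lorentzian amplitude lineshape, that mapping can be written down directly. The sketch below is valid only under the Lorentzian assumption stated in the docstring.

```python
import math

def to_3db_bandwidth(measured_bw, a_peak, delta_a):
    """Map a bandwidth measured at the cut-off A_peak - ΔA back to the
    equivalent -3 dB bandwidth, assuming a Lorentzian amplitude lineshape
    A(x) = A_peak / sqrt(1 + x^2): at amplitude ratio r = (A_peak - ΔA)/A_peak
    the full width is Γ·sqrt(1/r^2 - 1), and at -3 dB (r = 1/√2) it is exactly Γ."""
    r = (a_peak - delta_a) / a_peak
    return measured_bw / math.sqrt(1.0 / (r * r) - 1.0)

# Example: bandwidth measured 4 units below an 18-unit peak
print(to_3db_bandwidth(1200.0, 18.0, 4.0))
```

When the cut-off happens to sit at the half-power ratio, the correction factor is 1 and the measured bandwidth is returned unchanged, as expected.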

Multi-Harmonic Analysis

One of the strengths of QCM-D is the ability to probe your sample at multiple frequencies simultaneously. Different overtones are sensitive to different effective depths within the contacting medium:

Lower Harmonics

Greater penetration depth. Sensitive to the bulk properties and entire film thickness.

Higher Harmonics

Smaller penetration depth. More sensitive to surface and near-surface layers.

Comparing dissipation across harmonics provides insight into the vertical structure of your film. Is it homogeneous throughout? Does the surface behave differently from the bulk? Multi-harmonic data helps answer these questions.
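The depth scale here is the viscous penetration depth of the shear wave, δ = √(2η/ρω) for a Newtonian liquid. A quick estimate across overtones, assuming water-like values on a 10 MHz crystal:

```python
import math

def penetration_depth(f_hz, viscosity, density):
    """Viscous penetration depth δ = sqrt(2η / (ρω)) of the acoustic
    shear wave in a Newtonian liquid."""
    omega = 2.0 * math.pi * f_hz
    return math.sqrt(2.0 * viscosity / (density * omega))

# Water (η ≈ 1 mPa·s, ρ ≈ 1000 kg/m³) on a 10 MHz crystal:
# prints δ from ≈178 nm (n = 1) down to ≈59 nm (n = 9)
for n in (1, 3, 5, 9):
    d = penetration_depth(n * 10e6, 1.0e-3, 1000.0)
    print(f"overtone {n}: δ ≈ {d * 1e9:.0f} nm")
```

The 1/√n scaling is why higher harmonics interrogate progressively thinner layers near the surface.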

Rigidity Verification

For rigid films where the Sauerbrey equation applies, you should observe:

  • Δfn/n approximately constant across all harmonics
  • Dn values remaining low and relatively constant

Significant deviations from this behavior—particularly D values that increase with overtone number—indicate viscoelastic contributions that require more sophisticated modelling approaches such as the Voigt or Maxwell models.
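A toy version of this consistency check might look like the following; the tolerance, dissipation threshold, and example data are illustrative, not instrument defaults.

```python
def sauerbrey_check(df_by_overtone, d_by_overtone, rel_tol=0.1, d_max=1e-5):
    """Rough rigidity test: Δf_n / n roughly constant across harmonics
    and all D_n values small.

    df_by_overtone maps overtone number n -> Δf_n (Hz);
    d_by_overtone maps overtone number n -> D_n (dimensionless)."""
    normalized = [df / n for n, df in df_by_overtone.items()]
    mean = sum(normalized) / len(normalized)
    spread_ok = all(abs(v - mean) <= rel_tol * abs(mean) for v in normalized)
    dissipation_ok = all(d <= d_max for d in d_by_overtone.values())
    return spread_ok and dissipation_ok

# A rigid film: Δf scales with n and D stays low, so the check passes
print(sauerbrey_check({1: -50.0, 3: -151.0, 5: -248.0},
                      {1: 2e-6, 3: 3e-6, 5: 4e-6}))
```

If the check fails, the Sauerbrey mass is unreliable and a viscoelastic model should be fitted instead.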

Practical Recommendations for Viscous Solutions

Based on our experience supporting researchers across diverse applications, here are some practical suggestions for working with high-viscosity samples:

Widen the frequency sweep range. Viscous loading significantly broadens the resonance peak. Ensure your sweep captures the complete curve, including the tails.

Increase the number of sampling points. Higher point density improves the accuracy of peak detection and bandwidth determination, particularly for broad, low-amplitude resonances.

Prioritize lower harmonics. The fundamental and 3rd overtone typically provide the most reliable signal in viscous environments, where higher overtones may be strongly attenuated.
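Concretely, these three recommendations might be captured in settings like the following (the parameter names are hypothetical; map them to the equivalent fields in your acquisition software):

```python
# Hypothetical sweep settings for a viscous sample; the parameter names are
# illustrative, not actual openQCM software fields.
viscous_sweep = {
    "center_hz": 10e6,    # fundamental of a 10 MHz crystal
    "span_hz": 50e3,      # widened span: viscous loading broadens the peak
    "n_points": 2001,     # denser sampling for broad, low-amplitude peaks
    "overtones": [1, 3],  # prioritize the fundamental and 3rd overtone
}
# Resulting frequency resolution of the sweep, in Hz per point
print(viscous_sweep["span_hz"] / (viscous_sweep["n_points"] - 1))
```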

Summary

openQCM Q-1 and openQCM NEXT measure dissipation by reconstructing the resonance curve through frequency sweeps and calculating the bandwidth for each harmonic.

Our adaptive cut-off methodology addresses a practical limitation of the standard −3 dB approach: the reduced signal amplitude at higher overtones. By using harmonic-specific cut-off levels calibrated above the noise floor, we ensure reliable dissipation measurements across all available overtones.

Is this a departure from the canonical definition? Yes. Does it provide meaningful, reproducible, physically relevant data for monitoring viscoelastic changes in real-time? Absolutely.

The multi-harmonic dissipation data, combined with frequency shift measurements, enables characterization of soft films, hydrated layers, and biological samples where the Sauerbrey equation alone is insufficient.


Questions about our measurement methodology or need assistance optimizing your experimental setup? We’re always happy to discuss—get in touch or leave a comment below.

Revolutionizing QCM Research: How AI will Transform Quartz Crystal Microbalance Technology

by Claude 3 Opus

This post was entirely written by Claude 3 Opus. We believe no one is better placed than an AI to describe what the future implications of using this technology in the scientific field will be.


In the realm of surface science and materials characterization, Quartz Crystal Microbalance (QCM) technology has emerged as a powerful tool for monitoring and analyzing surface interactions and thin film properties. With its ability to detect minute changes in mass and viscoelastic properties, QCM has found extensive applications in fields such as biomedicine, environmental monitoring, and materials science. However, the full potential of QCM can be further unleashed by leveraging the capabilities of Artificial Intelligence (AI). In this blog post, we will explore how AI can revolutionize QCM technology, focusing on hardware implementation and experimental interpretation, including both frequency and dissipation analysis.

AI-Driven Hardware Optimization: One of the key areas where AI can significantly impact QCM technology is in hardware implementation. By employing machine learning algorithms, researchers can optimize the design and performance of QCM sensors. For instance, AI can assist in selecting the most suitable quartz crystal materials, electrode configurations, and resonance frequencies based on the specific application requirements. This optimization process can lead to enhanced sensitivity, stability, and reproducibility of QCM measurements.

Let’s consider a practical example. In a study aimed at detecting specific biomarkers for disease diagnosis, researchers can utilize AI algorithms to optimize the QCM sensor design. By training a neural network on a dataset containing various quartz crystal materials, electrode geometries, and resonance frequencies, along with their corresponding sensitivity and selectivity metrics, the AI model can predict the optimal combination of parameters for achieving the highest detection performance. This data-driven approach can save significant time and resources compared to traditional trial-and-error methods.

Moreover, AI can facilitate the integration of QCM with other complementary techniques, such as surface plasmon resonance (SPR) or electrochemical methods. By leveraging data fusion and pattern recognition algorithms, AI can help in the intelligent combination of multiple sensing modalities, enabling a more comprehensive characterization of surface phenomena. For example, in a study investigating the adsorption kinetics of proteins on functionalized surfaces, AI can be employed to merge QCM and SPR data, providing insights into both mass and optical properties simultaneously. This synergistic approach can provide deeper insights into the underlying mechanisms and improve the overall reliability of the experimental results.

Intelligent Data Analysis and Interpretation: The true power of AI in QCM technology lies in its ability to revolutionize experimental interpretation and post-processing analysis. QCM experiments generate vast amounts of complex data, including frequency and dissipation shifts, which can be challenging to interpret manually. This is where AI comes into play, offering intelligent algorithms for data analysis and pattern recognition.

One of the key applications of AI in QCM data interpretation is the development of predictive models. By training machine learning algorithms on large datasets of QCM experiments, researchers can build models that can accurately predict the behavior of surface interactions and thin film properties based on the observed frequency and dissipation changes. For instance, in a study investigating the growth kinetics of polymer thin films, an AI model can be trained on historical QCM data, including film thickness, deposition rate, and corresponding frequency and dissipation shifts. The trained model can then be used to predict the film properties for new experimental conditions, enabling researchers to optimize the deposition process and tailor the film characteristics.

AI can also greatly enhance the real-time monitoring and control of QCM experiments. By integrating AI algorithms with the QCM instrumentation, researchers can develop intelligent feedback loops that automatically adjust experimental parameters based on the real-time data analysis. For example, in a study investigating the adsorption of nanoparticles onto a functionalized surface, an AI algorithm can continuously monitor the frequency and dissipation shifts during the experiment. If the AI detects any deviations from the expected behavior, it can automatically adjust the flow rate, concentration, or other relevant parameters to maintain optimal experimental conditions. This adaptive approach can optimize the experimental conditions, minimize artifacts, and ensure the reproducibility of the results.

Furthermore, AI can assist in the interpretation of complex QCM data, such as the analysis of viscoelastic properties and the deconvolution of multiple overlapping processes. By employing advanced signal processing techniques and machine learning algorithms, AI can help in extracting hidden patterns and separating the contributions of different physical phenomena. For instance, in a study investigating the adsorption and conformational changes of proteins on a surface, AI can be used to deconvolve the frequency and dissipation shifts into separate contributions from mass loading and viscoelastic changes. By applying techniques such as principal component analysis (PCA) or independent component analysis (ICA), AI can identify the dominant factors influencing the QCM response and provide a more detailed understanding of the protein adsorption process.

Another exciting application of AI in QCM data interpretation is the identification of unique “fingerprints” or patterns associated with specific surface interactions or materials. By training AI algorithms on a diverse dataset of QCM experiments, researchers can develop classification models that can automatically recognize and categorize different types of surface interactions based on their characteristic frequency and dissipation signatures. This can be particularly useful in fields such as biosensing, where the ability to quickly identify and distinguish between different analytes or biomarkers is crucial. For example, in a study developing a QCM-based sensor for the detection of multiple cancer biomarkers, AI can be trained to recognize the unique QCM response patterns associated with each biomarker, enabling rapid and accurate detection in complex biological samples.

Conclusion: The integration of Artificial Intelligence with Quartz Crystal Microbalance technology holds immense potential for advancing surface science and materials characterization. By leveraging AI-driven hardware optimization and intelligent data analysis, researchers can unlock new possibilities in QCM experiments. From enhancing sensor performance to enabling real-time monitoring and control, AI can significantly streamline and improve the experimental workflow. Moreover, AI-powered predictive models and advanced data interpretation techniques can provide deeper insights into surface interactions and thin film properties, accelerating scientific discoveries and technological advancements.

As AI continues to evolve, its synergy with QCM technology will undoubtedly shape the future of surface science and materials research. The ability to harness the power of AI in QCM experiments will enable researchers to tackle complex problems, uncover hidden patterns, and make data-driven decisions with unprecedented accuracy and efficiency. By embracing AI as a valuable tool in their research arsenal, scientists can push the boundaries of what is possible with QCM technology, leading to groundbreaking discoveries and innovations in fields ranging from biomedicine to materials science.

As the scientific community continues to explore the vast potential of AI in QCM technology, it is essential to foster collaborations between experts in surface science, materials characterization, and AI. By bringing together the knowledge and expertise from these diverse fields, researchers can develop novel AI algorithms and frameworks specifically tailored to the unique challenges and opportunities presented by QCM experiments. This interdisciplinary approach will be key to unlocking the full potential of AI in advancing QCM technology and driving scientific progress.

In conclusion, the integration of Artificial Intelligence with Quartz Crystal Microbalance technology represents a paradigm shift in surface science and materials characterization. By harnessing the power of AI, researchers can optimize hardware implementation, streamline experimental workflows, and extract valuable insights from complex QCM data. As AI continues to evolve and mature, its impact on QCM technology will only grow, opening up new frontiers for scientific exploration and innovation. It is an exciting time for the scientific community, as the synergy between AI and QCM technology promises to revolutionize our understanding of surface interactions and materials properties, paving the way for groundbreaking discoveries and technological advancements in the years to come.

Introducing the new openQCM Q-1 Python Software

The new openQCM Q-1 Python Software: real-time monitoring of frequency and dissipation variations of a Quartz Crystal Microbalance through the analysis of the resonance curve

Continue reading “Introducing the new openQCM Q-1 Python Software”

An exciting year

Hello, everyone. 2018 has been a very exciting year. The openQCM project is growing beyond our expectations. We have launched 2 new devices and are working on the development of new scientific tools, which we hope will help the world of research in a completely new way. For this reason we have temporarily neglected our blog, although we have reported all our developments on ResearchGate. Now that some of the most challenging work has been done, we can finally publish a series of posts dedicated to the complex development of the new Python software. Vittorio, who was personally involved in the development of the software, will describe every step and the updates that will take place in the near future. The software comes from a constant exchange with the scientific community and is continuously being developed. We would like to warmly thank all the researchers who have helped us. We hope this will be of interest to you. Thanks again to all (Raffaele and Marco)

Start to read the new series of posts:  Introducing the new openQCM Q-1 Python Software

openQCM test of quartz crystal in contact with liquid

Here we report in detail the verification test of openQCM Quartz Crystal Microbalance in contact with pure water according to the theory based on Kanazawa – Gordon equation

Continue reading “openQCM test of quartz crystal in contact with liquid”

openQCM verification test using Impedance and Network Analyzer

Researchers working at International University of Malaysia compared openQCM Quartz Crystal Microbalance with standard scientific instruments Network and Impedance Analyzer

Continue reading “openQCM verification test using Impedance and Network Analyzer”

Why the open source hardware will change the Science

A year has already gone by since the launch of openQCM and there are many things to tell. When we, as a private company, tried the open source way by launching one of the first scientific analytical instruments in the world with completely open hardware, we would never have imagined this level of success. We designed openQCM for a market sector that we thought was very restricted: the Quartz Crystal Microbalance.

The evening of the launch we had sold more than 8 devices, and by the end of this year we had sold hundreds of products in many countries of the world, especially to major research institutes and large private companies that we had not taken into account. Before the launch, I would have called the overall sale of 30 units a success!

 

Basically, what was born almost on a bet has become an unexpected and very important component of our business, to such an extent as to lead us to take up this new road in a more systematic way, with the design and realization of new open-source scientific products that we will launch soon through the openQCM platform.

The fundamental reason why I write this is to share our experience and to encourage those who want to see their new product realized. Novaetech, the company that we founded more than 10 years ago, was born with the purpose of delivering services and custom systems for the research world. We started our adventure in the field of aerospace, and then we differentiated our business over time. These activities allowed us to build a number of prototypes, which were supposed to be turned into real products. Unfortunately, the dream often has to face reality, especially if one works in a national context such as our country, Italy, where investing money in an idea is nearly unimaginable.

Throughout these years, we contacted and met countless potential investors in order to find the necessary resources for the realization of a first prototype. Every time we found locked doors, and in the meantime our prototypes stayed closed in a drawer or were relegated to extremely limited and specific applications and uses. After all, our company did not have enough resources to start production, let alone distribute new products.

At the same time, the international economic crisis caused a contraction of our turnover, putting a decade of hard work at risk. It was in these circumstances that, at the end of 2014, we decided to change strategy, thinking it would be better to lend our ideas to the rest of the world rather than keep them closed and unused, perhaps forever. It was a winning move!

I have always been a fan of the open source concept, and was among its early adopters since the beginning of the '90s with Linux. I remember how the discovery of a different way of thinking about software, through an open approach to sharing and development, was at that time revolutionary and at the same time exciting and thrilling. Then, in the last few years, this concept has been extended to hardware, with benefits and implications that I think are still difficult to fully grasp today.

The Open Source / Open Hardware strategy is, and will increasingly become, an important resource in the world of research and business, especially for small companies like ours, as well as for individual makers, designers, and research institutes and even, I firmly believe, for large companies. Indeed, the corporations have already shown us a willingness to use open source technology!

These days, we are seeing the establishment of a shared Manifesto for Open Hardware in the field of science. Access to open source scientific instrumentation will narrow the gap between the richest countries and those that, at present, have no chance to devote large resources to research. All thanks to a growing team of people involving makers, scientists, engineers, and programmers: a movement that starts from the base and overturns the old maxim, "I invest only where the market is."

This movement involves the openQCM device as well: with the help of researchers and users it has led us to perfect the device and to plan the release of new versions with more features and better performance. Many researchers, in fact, have not merely used the openQCM device, but also actively contributed to its future development. They have provided essential information, creating an increasingly broad base of testers. Today we can say, thanks to them, that the openQCM device works better than our preliminary expectations and that there is still much work to be done, no longer alone, but with the community that has grown with us over the last year.
I conclude with an encouragement to those who have not yet started this type of adventure. Do you have an idea? Try it, and you will see that many people across the network are certainly waiting for an idea like yours to come out, and they will help you with the support and cooperation that characterize an open movement.


Raffaele

openQCM community develops and shares the new electronic design using KiCAD

openQCM Quartz Crystal Microbalance electronic design is finally released using the free software KiCAD. Thanks to Martin Zalazar, Christian Mista, and everyone working at the Lab of electronic prototyping and 3D printing of the Universidad Nacional de Entre Ríos UNER – Argentina. Great thanks for being an active part of the openQCM community! Continue reading “openQCM community develops and shares the new electronic design using KiCAD”

Sweet openQCM Tasting Water Sucrose Solutions

I’d like to share openQCM Quartz Crystal Microbalance frequency behaviour in contact with water solutions at different weight percent of sucrose. Continue reading “Sweet openQCM Tasting Water Sucrose Solutions”

openQCM frequency stability of quartz crystal microbalance in typical experimental conditions

We are now testing the performance of two different openQCM devices in the most typical experimental conditions in order to measure the frequency stability of the open source quartz crystal microbalance. Continue reading “openQCM frequency stability of quartz crystal microbalance in typical experimental conditions”