A newly developed electronic tongue can distinguish between similar liquids, such as milk with varying water content, different types of soda and blends of coffee, and can flag signs of spoilage in fruit juices and other potential food safety concerns. The research team, led by scientists at Penn State, found that the accuracy of the results improved significantly when artificial intelligence (AI) was allowed to define its own criteria for interpreting the data produced by the electronic tongue.
The team released their findings on October 9 in the journal Nature.
According to the findings, the electronic tongue holds promise for enhancing food safety and production, as well as in the field of medical diagnostics. The sensor, combined with AI, can effectively detect and categorize various substances while evaluating their quality, authenticity, and freshness. This process has also helped the researchers gain insights into the decision-making mechanisms of AI, which could contribute to the advancement of AI technologies.
“We’re striving to create an artificial tongue, but the way we experience different foods is more complex than just the tongue,” explained Saptarshi Das, the lead author and Ackley Professor of Engineering. “The tongue comprises taste receptors that interact with food and convey information to the gustatory cortex, which acts as a biological neural network.”
The gustatory cortex is the region of the brain that perceives and interprets tastes beyond what the taste receptors themselves can detect; the receptors mainly register the five broad categories of taste, sweet, sour, bitter, salty and savory, while the brain learns to recognize subtler differences between flavors as it becomes familiar with them. To artificially replicate the function of the gustatory cortex, the researchers built a neural network, which uses machine learning algorithms to mimic how humans process and interpret data.
“In previous studies, we explored how the brain responds to different tastes and attempted to replicate this by incorporating various 2D materials to create a framework for AI to process information in a more human-like manner,” stated co-author Harikrishnan Ravichandran, a PhD student in engineering science and mechanics under Das’s guidance. “In this study, we’re examining a variety of chemicals to assess if the sensors can accurately identify them and if they can detect subtle differences among similar foods, including safety concerns.”
The electronic tongue consists of a graphene-based ion-sensitive field-effect transistor, a device capable of detecting chemical ions, linked to an artificial neural network trained on diverse datasets. Importantly, Das noted that the sensors are non-functionalized, meaning a single sensor can respond to many different chemicals rather than being designed for one specific chemical. The researchers initially gave the neural network 20 key parameters to evaluate, all related to how a liquid sample changes the sensor’s electrical characteristics. Using these human-defined parameters, the AI accurately identified samples, including diluted milk, various sodas, coffee blends and fruit juices at several stages of freshness, with better than 80% accuracy in about a minute.
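The paper’s exact architecture and feature set are not reproduced here, but the workflow described is a standard supervised-learning setup: a table of hand-defined electrical features per liquid sample, fed to a small classifier. The Python sketch below illustrates that idea under stated assumptions; the data, class labels and network size are hypothetical placeholders, not the authors’ implementation.

```python
# Minimal sketch (not the authors' code): classifying liquid samples from
# ~20 hand-defined electrical features extracted from a sensor measurement.
# The feature values and labels below are random placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder data: 300 samples x 20 human-defined parameters
# (e.g. on/off currents, threshold-voltage shift, hysteresis, slopes, ...).
X = rng.normal(size=(300, 20))
y = rng.integers(0, 4, size=300)          # e.g. 4 classes such as milk dilutions

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

scaler = StandardScaler().fit(X_train)    # normalize each feature
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000,
                    random_state=0)       # small feed-forward network
clf.fit(scaler.transform(X_train), y_train)

preds = clf.predict(scaler.transform(X_test))
print("test accuracy:", accuracy_score(y_test, preds))
```

With real sensor features in place of the random placeholders, the same pipeline would report a per-class accuracy comparable to the figures quoted in the study.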
“After achieving satisfactory accuracy with human-defined parameters, we allowed the neural network to establish its own metrics using raw sensor data. We discovered that the neural network achieved nearly perfect inference accuracy of over 95% when using machine-derived metrics, as opposed to those provided by humans,” explained co-author Andrew Pannone, a PhD student in engineering science and mechanics under Das’s mentorship. “We employed a technique called Shapley additive explanations, which enables us to inquire about the neural network’s reasoning after making a decision.”
This method draws on game theory, specifically the Shapley value from cooperative games, which distributes credit for a joint outcome among the participants who contributed to it, and applies that idea to assign importance values to the data being analyzed. With these explanations, the researchers could work out how the neural network weighted different aspects of a sample to reach a conclusion, shedding light on a decision-making process that has remained largely opaque in artificial intelligence research. They found that rather than judging the individual human-assigned parameters in isolation, the network weighed the data it deemed most relevant as a whole, with Shapley additive explanations quantifying how much each input contributed.
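The study’s analysis pipeline is not reproduced here, but Shapley additive explanations can be applied to essentially any trained classifier with the open-source `shap` Python package. A minimal sketch, continuing the hypothetical classifier above (`clf`, `scaler`, `X_train` and `X_test` are assumptions carried over from that sketch, not the authors’ code):

```python
# Sketch only: attributing a trained classifier's predictions to its input
# features with Shapley additive explanations via the `shap` package.
# `clf`, `scaler`, `X_train` and `X_test` come from the hypothetical sketch above.
import numpy as np
import shap

X_train_s = scaler.transform(X_train)
X_test_s = scaler.transform(X_test)

def predict_class0(x):
    # Probability of one class (class 0), a single scalar output per sample.
    return clf.predict_proba(x)[:, 0]

# KernelExplainer is model-agnostic: it treats the classifier as a black box
# and estimates each feature's Shapley value against a background sample.
background = X_train_s[:50]
explainer = shap.KernelExplainer(predict_class0, background)
shap_values = explainer.shap_values(X_test_s[:20])   # shape: (20 samples, 20 features)

# Rank features by mean absolute Shapley value, i.e. average contribution size.
importance = np.abs(shap_values).mean(axis=0)
for idx in np.argsort(importance)[::-1][:5]:
    print(f"feature {idx}: mean |SHAP| = {importance[idx]:.3f}")
```

In the study, analogous attributions over the machine-derived metrics, rather than hand-picked features, are what let the researchers see which parts of the raw sensor response the network actually relied on.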
The researchers offered an analogy: two people tasting milk. Both might recognize it as milk, but one could judge it to be spoiled skim milk while the other thinks it is fresh 2% milk. Even the people doing the tasting often find it hard to articulate why their assessments differ.
“Our findings indicate that the neural network observes more nuanced characteristics in the data—elements that we, as humans, find challenging to express,” Das noted. “By evaluating sensor characteristics as a whole, the neural network reduces variations that may occur from day to day. In the context of milk, it can identify the varying water content and determine whether signals of degradation are significant enough to warrant a food safety concern.”
Das pointed out that the capabilities of the electronic tongue depend on the data used to train it. While this study focused on food assessment, the approach could also be applied to medical diagnostics. Because the sensors are robust and sensitivity matters in so many fields, the researchers see opportunities for broad application across industries.
Das further explained that the sensors do not need to be fabricated identically, because the machine learning algorithms analyze all of the information holistically and can still arrive at the correct result despite device-to-device variation. This tolerance for imperfection allows for a more efficient and cost-effective manufacturing process.
“We realized that we could accept imperfections,” Das remarked. “Nature thrives on imperfections but still makes sound decisions, similar to our electronic tongue.”
Das is affiliated with the Materials Research Institute and holds appointments in the Department of Electrical Engineering and the Department of Materials Science and Engineering. Other contributors from the Penn State Department of Engineering Science and Mechanics include Aditya Raj, a research technologist at the time of the study; Sarbashis Das, who recently earned his doctorate in electrical engineering; Ziheng Chen, a graduate student in engineering science and mechanics; and Collin A. Price, who received his bachelor’s degree in engineering science and mechanics. Mahmooda Sultana of NASA’s Goddard Space Flight Center also contributed to the work.
This research was supported by a Space Technology Graduate Research Opportunities grant from NASA.