Google plans to extend its AI ventures into health monitoring. The company claims its machine learning technology may soon be able to analyze voice and coughing patterns to identify potential health conditions in users.
Every sound our bodies make, from coughing and speaking to breathing, carries subtle clues about our health. Google recently claimed its AI may be able to analyze these sound cues to help detect early signs of disease, make health monitoring more accessible, and even alert users when to consult a doctor.
How Would This Work?
Earlier this year, Google introduced HeAR (Health Acoustic Representations), an acoustic foundation model trained on a vast dataset of around 300 million audio clips. The company claims the model can analyze human sounds and identify early signs of disease.
HeAR learns general patterns from these sounds, creating a foundation for medical audio analysis. According to Google, it outperforms other currently available models at understanding different kinds of sounds while requiring less training data.
The Potential of HeAR in Healthcare
HeAR may stand out for its efficiency: models built on top of it need less data and fewer resources than traditional approaches to early illness detection. According to Google, this can help researchers develop accurate models even when working with limited information, as sketched below.
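To make the data-efficiency idea concrete, here is a minimal sketch of the general embedding-plus-small-classifier workflow such foundation models enable. This is not Google's actual HeAR API: the encoder function, embedding size, clip lengths, and labels below are all hypothetical stand-ins.

```python
# Sketch of an embedding-based workflow, assuming a pretrained acoustic
# foundation model that maps short audio clips to fixed-size embeddings.
# encode_clip() is a placeholder; HeAR's real interface may differ.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

def encode_clip(audio: np.ndarray) -> np.ndarray:
    """Placeholder for a foundation-model encoder (e.g., HeAR).
    A real encoder would return a learned embedding of the clip."""
    rng = np.random.default_rng(abs(hash(audio.tobytes())) % (2**32))
    return rng.standard_normal(512)  # hypothetical 512-dim embedding

# Hypothetical labeled cough clips: 1 = condition present, 0 = absent.
rng = np.random.default_rng(0)
clips = [rng.standard_normal(16000 * 2) for _ in range(200)]  # 2 s @ 16 kHz
labels = rng.integers(0, 2, size=200)

# Because the encoder already captures general acoustic structure,
# only a small, cheap classifier is trained on the labeled examples.
X = np.stack([encode_clip(c) for c in clips])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```

The point of this pattern is that the expensive learning happens once, in the foundation model; a downstream team with only a few hundred labeled recordings can still fit a useful classifier on top of the embeddings.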
As a diagnostic aid, HeAR may eventually support remote areas with limited infrastructure and access to medical staff. While not a replacement for professional healthcare, such tools can assist in the early stages of diagnosis for diseases like tuberculosis (TB) and chronic obstructive pulmonary disease (COPD).
Is HeAR The Future of Acoustic Health Research?
Salcit Technologies, an Indian respiratory healthcare company, built a product called Swaasa that uses AI to analyze cough sounds and assess lung health.
Salcit is now exploring how HeAR could help expand its acoustic AI models, and is currently working with Google to apply HeAR in its research on the early detection of TB.