Revolutionary AI Tool Unveils Hidden Indicators for Stillbirth and Newborn Risks

A new AI-based analysis of close to 10,000 pregnancies has uncovered previously unrecognized combinations of risk factors associated with severe adverse outcomes, such as stillbirth.

The research revealed that pregnancies currently managed in much the same way can differ in risk by as much as tenfold.

According to Dr. Nathan Blue, the study’s senior author, the AI model developed by the researchers identified an “unexpected” mix of factors that raise risk. He said the model marks an important step toward more tailored risk assessment and pregnancy care.

The findings are published in BMC Pregnancy and Childbirth.

Unforeseen Risks

The researchers started with a comprehensive dataset of 9,558 pregnancies collected from across the country. It included a wide range of social and physical characteristics – from the pregnant person’s level of social support to measurements such as blood pressure, medical history, and fetal weight – along with each pregnancy’s outcome. Using AI, they identified new combinations of maternal and fetal traits linked to adverse outcomes such as stillbirth.
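
To make the setup concrete, here is a minimal sketch of the kind of tabular risk model such a dataset could feed. The file name, column names, outcome label, and the choice of a gradient-boosted classifier are illustrative assumptions, not details from the study.

```python
# Minimal sketch (not the study's actual pipeline): fitting a risk model
# on a tabular pregnancy dataset with mixed maternal and fetal features.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

# Hypothetical file: one row per pregnancy, columns as assumed below.
df = pd.read_csv("pregnancy_cohort.csv")

features = [
    "maternal_age", "blood_pressure_systolic", "pre_existing_diabetes",
    "social_support_score", "fetal_sex_female", "fetal_weight_percentile",
    "fetal_anomaly",
]
X = df[features]
y = df["adverse_outcome"]  # binary label, e.g. stillbirth or other severe outcome

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# A gradient-boosted ensemble captures non-linear interactions between
# features, which is how *combinations* of risk factors (rather than
# single factors in isolation) can surface in the predictions.
model = GradientBoostingClassifier().fit(X_train, y_train)
probs = model.predict_proba(X_test)[:, 1]
print("held-out AUC:", roc_auc_score(y_test, probs))
```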

Typically, female fetuses carry a slightly lower risk of complications compared to male fetuses – a well-established phenomenon. However, the research team found that when pregnant individuals have pre-existing diabetes, female fetuses are at a higher risk than their male counterparts.

This newly uncovered pattern highlights the capability of the AI model to provide fresh insights into pregnancy health, as stated by Blue, an assistant professor in obstetrics and gynecology at the Spencer Fox Eccles School of Medicine at the University of Utah. “It recognized something that even experienced clinicians might overlook,” he added.

The researchers particularly aimed to improve risk prediction for fetuses whose estimated weight falls below the 10th percentile but not below the 3rd, as these babies are small enough to raise concern but typically remain healthy. Deciding how to manage such pregnancies can be complex: should they be monitored closely and delivered early, or allowed to progress normally? Current clinical guidelines advocate extensive medical supervision for all such cases, which can impose significant emotional and financial burdens.

However, the study found that among this group of low-weight fetuses, the risk of poor pregnancy outcomes varied considerably, ranging from average risk to nearly ten times the norm. This risk was influenced by factors such as fetal gender, the presence of pre-existing diabetes, and any fetal anomalies, such as heart defects.
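
As a purely illustrative exercise (not the study’s method), the same hypothetical table from the earlier sketch could be used to stratify this small-for-dates group by the factors the article names and compare each stratum’s outcome rate with the cohort average:

```python
# Illustrative only: stratify the "small but not severely small" group
# (above the 3rd and up to the 10th weight percentile) by fetal sex,
# pre-existing diabetes, and fetal anomaly, then compare outcome rates
# to the whole-cohort baseline. Continues the hypothetical DataFrame df.
small_for_dates = df[df["fetal_weight_percentile"].between(3, 10, inclusive="right")]

baseline_rate = df["adverse_outcome"].mean()
strata = small_for_dates.groupby(
    ["fetal_sex_female", "pre_existing_diabetes", "fetal_anomaly"]
)["adverse_outcome"].agg(["mean", "size"])

# Relative risk of each stratum versus the cohort baseline; per the
# article, this spread ran from roughly average up to nearly tenfold.
strata["relative_risk"] = strata["mean"] / baseline_rate
print(strata.sort_values("relative_risk", ascending=False))
```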

Blue emphasizes that the analysis pinpointed correlations among variables without establishing causal relationships behind negative outcomes.

The broad spectrum of risk aligns with the intuitive understanding of experienced doctors, who recognize that many low-weight fetuses are healthy. They usually consider numerous additional factors to make individualized evaluations regarding risk and treatment. However, an AI-driven risk assessment tool could provide significant benefits over such “gut feeling” judgments, enabling informed, consistent, and equitable decisions about care.

Why AI?

To tackle these challenges, the researchers implemented a type of model known as “explainable AI,” which not only estimates risk based on a set of pregnancy factors but also clarifies which variables influenced that risk and to what extent. Unlike traditional “black box” AI models that are often obscure even to experts, explainable AI “shows its work,” allowing biases to be identified and addressed.
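
Continuing the earlier hypothetical sketch, one common way to make a tree-based risk model “show its work” is to attach per-feature attributions, for example with the SHAP library; the study’s actual model and explanation method may differ.

```python
# Minimal sketch of an "explainable" risk estimate, reusing the model and
# held-out data from the earlier sketch. Feature names are illustrative.
import shap

explainer = shap.TreeExplainer(model)        # supports sklearn tree ensembles
shap_values = explainer.shap_values(X_test)  # one attribution per feature per case

# For a single pregnancy, the attributions say how much each variable
# pushed the predicted risk up or down relative to the baseline.
case = X_test.iloc[[0]]
case_attr = explainer.shap_values(case)[0]
for name, contribution in sorted(
    zip(X_test.columns, case_attr), key=lambda t: -abs(t[1])
):
    print(f"{name:28s} {contribution:+.3f}")
```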

In essence, explainable AI combines the adaptability of expert clinical judgment with the precision of data-driven analysis. This model is adept at assessing risk for uncommon pregnancy scenarios, offering reliable estimates for patients with unique combinations of risk factors. Ultimately, such tools could enhance personalized care by guiding informed choices for individuals in distinctive circumstances.

While the researchers still need to conduct further tests and validations with new populations to confirm their model’s predictions in real-world situations, Blue expresses optimism that an explainable AI model could lead to a transformation in risk assessment and treatment during pregnancy. “AI can provide risk estimates tailored to a person’s specific context,” he remarked, “and do so transparently and consistently, something that our human brains often cannot achieve.”

“Having this ability would be revolutionary for our field,” he concluded.