
Latest news with #Inria

10-Minute Test Exposes Antibiotic Resistance Threat

Medscape

17-07-2025

  • Health
  • Medscape


Current methods for assessing antibiotic resistance typically rely on bacterial cultures, a process that can take up to 2 days. Such delays can be critical in urgent clinical settings. To address this, researchers at France's National Institute for Research in Digital Science and Technology (Inria) are developing a rapid method that can deliver results within a few hours or even minutes. This new approach is based on the principle that antibiotic resistance results from mutations in the bacterial genome. In theory, sequencing can detect these changes; however, conventional sequencing methods are time-consuming.

Karel Břinda, PhD, is a permanent Principal Investigator with the GenScale team at Inria Rennes, one of Inria's regional research centers. He specializes in computational genomics and rapid diagnostic methods. The technique he developed compares bacterial DNA from patient samples with a reference database of genomes from bacteria known to be resistant or sensitive to antibiotics. Bacteria with similar DNA sequences are likely to exhibit similar resistance profiles. This method, known as Genomic Neighbor Typing, offers two key advantages: It does not require prior knowledge of the pathogen's complex biology, and it can generate predictions from minimal sequencing data, making it exceptionally fast.

'We've shown that once sequencing begins, we can predict resistance or susceptibility in about 10 minutes,' Břinda explained in an interview published by Inria. The technique uses a compact, portable nanopore sequencing device that is approximately the same size as a smartphone. 'With nanopores, you receive a continuous stream of data as soon as the device starts sequencing. Our method allows you to make a prediction almost immediately from this initial data, so the diagnosis is very fast,' he said.
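The nearest-neighbor idea behind Genomic Neighbor Typing can be illustrated with a toy sketch: compare a sample's k-mer content against reference genomes with known phenotypes and adopt the label of the closest match. This is an illustrative simplification, not Břinda's actual implementation; the k-mer size, similarity measure, function names, and toy sequences are all assumptions.

```python
from collections import Counter

def kmer_profile(seq, k=4):
    """Return a bag-of-k-mers profile for a DNA sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def similarity(p, q):
    """Fraction of k-mers shared between two profiles (0 to 1)."""
    shared = sum(min(p[m], q[m]) for m in p)
    total = max(sum(p.values()), sum(q.values()))
    return shared / total if total else 0.0

def predict_resistance(sample_seq, reference_db, k=4):
    """Label the sample with the phenotype of its nearest genomic neighbor."""
    sample = kmer_profile(sample_seq, k)
    best_label, best_score = None, -1.0
    for genome, label in reference_db:
        score = similarity(sample, kmer_profile(genome, k))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Hypothetical reference database: (genome fragment, known phenotype).
db = [
    ("ATCGATCGATCGTTAGC", "resistant"),
    ("GGCATGCATGCATGCAA", "susceptible"),
]

# The sample differs from the resistant reference by one base, so that
# neighbor wins.
print(predict_resistance("ATCGATCGATCGTTAGA", db))  # resistant
```

Real systems sketch genomes rather than comparing raw sequences, which is what lets a prediction emerge from the first minutes of a nanopore read stream.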
'These devices also produce very long reads, which help identify the nearest genomic neighbors of the pathogen present in the patient's biological samples,' Břinda said.

Although promising, the method has so far been validated only for Streptococcus pneumoniae and Neisseria gonorrhoeae. It is not yet applicable to all bacterial species or antibiotics, mainly because of gaps in existing genomic reference databases. To address this, Inria partnered with Rennes University Hospital in Rennes, France, to combine its computational tools with biological and clinical expertise. The hospital also houses a national collection of Enterococcus strains maintained by France's Reference Center for Antibiotic Resistance.

'In the long run, the key question revolves around building large and truly representative databases of bacterial strains,' said Břinda. 'Today, we work with databases of up to thousands of genomes. In the future, sequencing will become increasingly cheaper and more common. We will therefore have much larger databases. But we will then need new computational methods and new software.'

AI's turbulent history, from the calculator to ChatGPT

LeMonde

13-07-2025

  • Science
  • LeMonde


On an internet forum, a slightly confused student asked a not-so-silly question: "Did the definition of AI [artificial intelligence] change?" Another replied, sounding disillusioned: "I just don't really regard this as an academic term [that I'd use] anymore (…) it may not be clear to a person reading it."

The use of the term "artificial intelligence" has indeed never been fully consistent. Experts generally agree on a preliminary definition: AI is a machine capable of carrying out tasks previously thought to require human intelligence. "But what is human intelligence?" asked Anne Laurent, a computer science researcher and director of the Inria branch at the University of Montpellier. "Philosophers, engineers and cognitive scientists do not agree. Its definition changes over time."

If we look at the broad history of AI, it may have been in the 19th century that the first human intelligence skill fell, conquered by calculating machines capable of multiplying and dividing. But a narrower view often prevails, placing the birth of AI in the 1940s with the emergence of the computer.

1940s: The foundation

One of the very first electronic computers, the Colossus, was used by the British military to break the encrypted communications of the German high command. It was far more powerful than the electromechanical machines that came before it. The machines that followed would be easier to program and capable of running complex and varied algorithms, finally making it possible to experiment with theories about "thinking" machines, which began to circulate at the time. As early as 1943, Walter Pitts and Warren McCulloch imagined the concept of artificial neurons, a simplified model of how human neurons function.
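The 1943 McCulloch-Pitts model reduces a neuron to a threshold unit: it fires (outputs 1) exactly when the weighted sum of its binary inputs reaches a threshold. A minimal sketch of that idea follows; the specific weights and thresholds are illustrative choices, not values from the original paper.

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fire (1) iff the weighted input sum
    reaches the threshold, else stay silent (0)."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With unit weights and threshold 2, the unit computes logical AND.
print(mp_neuron([1, 1], [1, 1], 2))  # 1
print(mp_neuron([1, 0], [1, 1], 2))  # 0

# Lowering the threshold to 1 turns the same unit into logical OR.
print(mp_neuron([0, 1], [1, 1], 1))  # 1
```

Simple as it is, chaining such threshold units is enough to compute any Boolean function, which is why the model mattered to early theories of "thinking" machines.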
