The Future of Computing: Harnessing Molecules for Sustainable Data Management

In this contributed article, Erfane Arwani, founder and CEO of Biomemory, discusses how molecular computing (using molecules rather than traditional silicon chips for computational tasks) could be a critical component in revolutionizing data storage amid the exponential growth of AI.

Life is Fleeting, But Data is Forever – Meet Your Digital Twin

[SPONSORED POST] With the transformation of medicine from analog to digital, plus the rise of new data-generating devices for health tracking and genomic information, we can look forward to a new world in which virtually every aspect of a patient’s medical history can be communicated, stored, and manipulated. For each patient, this huge body of data represents a sort of digital twin, a treasure trove of useful medical information and insights that could become invaluable in developing patient treatments in the future.

Revolutionizing Bioscience Research: Creating an Atlas of the Human Body

Making healthcare and life science (HCLS) discoveries is time-consuming and requires considerable amounts of data. Creating an atlas of the human body calls for enterprise HPC infrastructure with AI and edge-to-cloud capabilities. The collaboration between HPE, NVIDIA, and Flywheel, using the latest technologies designed for HCLS, promises to transform biomedical research.

Analysis of 145 Generative AI Startups IDs Opportunities to Remedy Pain Points in Healthcare and Life Sciences

Generative AI technology could deliver industry-changing improvements in healthcare delivery and in life sciences productivity, efficiency, and patient outcomes, and it presents a massive untapped opportunity for entrepreneurs and investors, according to a new market analysis by Justin Norden of GSR Ventures, Jon Wang, and Ambar Bhattacharyya of Maverick Ventures.

The Problem with ‘Dirty Data’ — How Data Quality Can Impact Life Science AI Adoption

Jason Smith, Chief Technology Officer, AI & Analytics at Within3, highlights how many life science data sets contain unclean, unstructured, or highly regulated data that reduces the effectiveness of AI models. Life science companies must first clean and harmonize their data for AI adoption to be effective.
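
As a rough illustration of the kind of cleaning and harmonization step described above, the sketch below normalizes site names, dates, and units in a small tabular data set before modeling. The column names, alias map, and unit conventions are illustrative assumptions, not a description of any specific Within3 pipeline.

# Minimal cleaning/harmonization pass over a small tabular life-science data set.
# Column names, the site-alias map, and unit conventions are illustrative assumptions.
import pandas as pd

SITE_ALIASES = {"st. mary's": "St Marys Hospital", "st marys": "St Marys Hospital"}

def harmonize(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Normalize free-text site names so the same site is not counted twice.
    out["site"] = (
        out["site"].str.strip().str.lower().map(SITE_ALIASES).fillna(out["site"])
    )
    # Parse date strings into datetimes; unparseable values become NaT.
    out["visit_date"] = pd.to_datetime(out["visit_date"], errors="coerce")
    # Convert weights reported in pounds to kilograms so units are consistent.
    lbs = out["weight_unit"].str.lower().eq("lb")
    out.loc[lbs, "weight"] = out.loc[lbs, "weight"] * 0.453592
    out["weight_unit"] = "kg"
    # Drop exact duplicates and rows missing fields a model would need.
    return out.drop_duplicates().dropna(subset=["site", "visit_date", "weight"])

if __name__ == "__main__":
    raw = pd.DataFrame({
        "site": ["St. Mary's", "st marys", "Oak Clinic"],
        "visit_date": ["2023-01-05", "2023-01-05", "not recorded"],
        "weight": [70.0, 154.0, 68.0],
        "weight_unit": ["kg", "lb", "kg"],
    })
    print(harmonize(raw))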

AI from a Psychologist’s Point of View

Researchers at the Max Planck Institute for Biological Cybernetics in Tübingen have examined the general intelligence of the language model GPT-3, a powerful AI tool. Using psychological tests, they studied competencies such as causal reasoning and deliberation, and compared the results with the abilities of humans. Their findings, published in the paper “Using cognitive psychology to understand GPT-3,” paint a heterogeneous picture: while GPT-3 can keep up with humans in some areas, it falls behind in others, probably due to a lack of interaction with the real world.
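
For readers curious what such a test looks like in practice, here is a toy sketch of probing a language model with a single cognitive-psychology-style item and scoring its answer. The vignette wording, answer key, and query_model stub are illustrative assumptions, not materials from the Max Planck study, which ran GPT-3 on established tasks from the cognitive psychology literature.

# Toy probe of a language model with one cognitive-psychology-style item.
# The vignette, answer key, and query_model stub are illustrative assumptions.

CAUSAL_ITEM = {
    "prompt": (
        "A gardener waters a plant every day and the plant grows. One week "
        "nobody waters the plant and it wilts. "
        "Did watering cause the plant to grow? Answer yes or no."
    ),
    "answer": "yes",
}

def query_model(prompt: str) -> str:
    # Stand-in for a real model call; returns a canned reply so the sketch
    # runs offline. Swap in an actual LLM completion call to use it.
    return "yes"

def is_correct(item: dict) -> bool:
    reply = query_model(item["prompt"]).strip().lower()
    return reply.startswith(item["answer"])

if __name__ == "__main__":
    print("model answered correctly:", is_correct(CAUSAL_ITEM))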

How AI Can be Used to Help People See

From reducing the reliance on trial and error to providing ultra-precise drug effectiveness data, AI has a major role to play in making clinical trials more efficient and robust. In this contributed article, ophthalmologist and biopharma CEO Dr George Magrath explains how AI is being harnessed in the development of eye care medicines.

CATALOG Achieves Historic DNA Computing Milestone

Catalog Technologies, Inc., a leader in DNA-based digital data storage and computation, has made a historic breakthrough in DNA computation by demonstrating the ability to search data stored in DNA in a massively parallel and scalable manner, with resource usage nearly independent of the data size.

The Move Toward Green Machine Learning

A new study suggests tactics for machine learning engineers to cut their carbon emissions. Led by David Patterson, researchers at Google and UC Berkeley found that AI developers can shrink a model’s carbon footprint a thousand-fold by streamlining architecture, upgrading hardware, and using efficient data centers. 
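
The study frames operational emissions as roughly the energy a training run draws, multiplied by data-center overhead (PUE) and the carbon intensity of the local grid, and those are the levers the reported savings pull on. The back-of-the-envelope sketch below applies that formula with placeholder numbers; none of the figures are taken from the paper.

# Back-of-the-envelope training-emissions estimate:
# emissions ~ accelerator energy * data-center overhead (PUE) * grid carbon intensity.
# All numbers below are placeholder values, not figures from the study.

def training_co2e_kg(accelerator_hours: float,
                     avg_power_watts: float,
                     pue: float,
                     grid_kgco2e_per_kwh: float) -> float:
    # Estimated kilograms of CO2-equivalent emitted by a training run.
    energy_kwh = accelerator_hours * avg_power_watts / 1000.0
    return energy_kwh * pue * grid_kgco2e_per_kwh

if __name__ == "__main__":
    # Hypothetical run: 10,000 GPU-hours at 300 W average draw.
    baseline = training_co2e_kg(10_000, 300, pue=1.6, grid_kgco2e_per_kwh=0.45)
    # Same workload on more efficient hardware, in a better data center, on a
    # cleaner grid: the levers the study identifies.
    improved = training_co2e_kg(2_500, 300, pue=1.1, grid_kgco2e_per_kwh=0.05)
    print(f"baseline:  {baseline:,.0f} kg CO2e")
    print(f"improved:  {improved:,.0f} kg CO2e")
    print(f"reduction: {baseline / improved:,.1f}x")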

NVIDIA Launches Large Language Model Cloud Services

NVIDIA today announced two new large language model cloud AI services — the NVIDIA NeMo Large Language Model Service and the NVIDIA BioNeMo LLM Service — that enable developers to easily adapt LLMs and deploy customized AI applications for content generation, text summarization, chatbots, and code development, as well as protein structure and biomolecular property predictions.
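
As a rough idea of how a hosted LLM service of this kind is consumed from application code, the sketch below posts a summarization prompt to a generic inference endpoint over HTTP. The URL, payload fields, response shape, and token handling are hypothetical placeholders, not NVIDIA's documented NeMo or BioNeMo API, which should be consulted directly.

# Generic sketch of calling a hosted LLM inference endpoint over HTTP.
# The URL, payload fields, and response shape are hypothetical placeholders,
# not NVIDIA's documented NeMo/BioNeMo API.
import os
import requests

API_URL = "https://example-llm-service.invalid/v1/completions"  # placeholder URL

def summarize(text: str) -> str:
    payload = {
        "prompt": f"Summarize the following text in two sentences:\n\n{text}",
        "max_tokens": 128,   # hypothetical parameter names
        "temperature": 0.2,
    }
    headers = {"Authorization": f"Bearer {os.environ.get('LLM_API_TOKEN', '')}"}
    resp = requests.post(API_URL, json=payload, headers=headers, timeout=30)
    resp.raise_for_status()
    return resp.json()["text"]  # response field assumed for illustration

if __name__ == "__main__":
    print(summarize("NVIDIA announced two new large language model cloud services."))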