Where Tech Meets Bio #1: Convergence of technologies
Intro, my analytical framework, what to expect in future newsletters
Welcome to Where Tech Meets Bio, where I review paradigm-changing discoveries in science and technology, analyze the impact of novel technologies on the pharma and biotech industries, and review leading companies, products, and technologies shaping our lives in the “century of biotech.”
Prologue
Back in 2016, I wrote my first article about artificial intelligence in the pharmaceutical industry, ‘The Coming Boom In Drug Discovery.’ It featured several companies that impressed me at the time: Toronto-based Atomwise (later relocated to San Francisco), TwoXAR (later rebranded as Aria Pharmaceuticals), and Berg Health (acquired by BPGbio in 2023).
The three companies approached drug discovery in quite different ways:
Atomwise developed the AtomNet engine, based on a convolutional neural network (CNN), to run ultra-large virtual screening across chemical spaces of up to billions of synthesizable molecules against a known target of interest. Essentially, their algorithm keeps filling the binding pocket of a digitally encoded protein structure, atom by atom, until it arrives at a ligand optimal for modulating the protein -- like searching for the perfect key to a lock. Atomwise is probably one of the most successful examples of a structure-based drug discovery strategy powered by deep learning, and the company has always kept it tightly focused.

The company has scored numerous high-profile R&D deals with big pharma and become notable in the field, having raised $177 million (as per Crunchbase). I expect Atomwise to go public this year or next.
The second company, Palo Alto-based TwoXAR (now Aria Pharmaceuticals), took quite a different approach to AI and developed its Symphony Platform. The idea is to use AI (I suppose, a multimodal architecture with natural language models and deep learning at the core) to mine multiple sources of different data types simultaneously -- biological and chemical data -- in order to select molecules with novel mechanisms of action within a defined space of constraints. Humans still make the final call on candidate prioritization. I suppose the company mostly relies on public or commercial external data sources, but I may be wrong about that.
TwoXAR (Aria Pharmaceuticals) now has a pipeline of a dozen preclinical candidates, around $15 million raised in total venture capital, and recent positive in vivo results for one of its candidates for idiopathic pulmonary fibrosis (IPF). To be fair, that seems a bit modest for a nine-year-old AI-driven drug discovery company, in my opinion (relative to some other industry players with similar R&D approaches). But who knows -- things are definitely moving forward, and I am curious what they will come up with this year. After all, Andreessen Horowitz and the folks at SoftBank, among other investors, took a bet on the company in 2018.
Finally, my third pick in the 2016 article was Boston-based Berg Health. Founded in 2006, the company came a long way in building an impressive clinical portfolio. In 2009, Berg bought cancer tissue samples from more than 1,000 cancer patients at medical schools across the US. Those included over 40 types of cancer cells -- multiple types of breast cancer, prostate cancer, and others -- along with healthy tissue extracted from the same patients. Fast-forward to the present day: the company has built one of the world’s largest biobanks of human specimens, with a large repository of molecular and clinically annotated phenotype data. Owning unique proprietary data is gold in the era of artificial intelligence, so that was a strategic move.
Later on, the company developed a sophisticated AI platform called Interrogative Biology. A central component of the platform, the bAIcis module, is a “state of the art Bayesian artificial intelligence learning software used for patient stratification, precision medicine, target identification, and lead optimization of novel drug candidates.”
According to Dr. Niven Narain, co-founder and CEO of Berg Health (and now, after the acquisition, CEO of BPGbio, the new owner company):
“Our focus, and competitive edge, lies in a blended hypothesis-driven and patient-centric approach where we leverage real patient samples (via a comprehensive patient biobank) and AI to uncover and match the right drugs for the right patients in the right doses” (source).
In 2016, the usual reaction of pharma/biotech folks to the topic of artificial intelligence was an eye roll -- or, in the best-case scenario, curiosity. Of course, some companies did invest in pilots and internal programs at the time, but those were cautious steps.
As a lifelong chess player, however, I was deeply impressed by what AI did to the game -- computer engines became unbeatable by humans. Literally: the world chess champion, Magnus Carlsen, would not stand a chance against AI chess engines like Stockfish today, nor would he have in 2016. For a number of years now, humans have competed in human-only tournaments, while AI engines play each other in separate events (producing some otherworldly games, I must say -- check out this “thriller” on the chessboard if you are into chess).

But chess is just a game with clear, well-annotated data, they say, while biology is complex and biological data is a mess -- so AI’s successes in chess cannot translate to biology research. They are right insofar as the adoption of AI is indeed a much slower process in healthcare research, and the results will probably never be as definitive as in the chess world. No matter how good an AI algorithm is at predicting the perfect molecule, you still need to run lab experiments, preclinical studies, and clinical studies. But the last several years have proved the translatability argument not entirely true. AI is revolutionizing drug discovery right now -- although it is a slow revolution; to me, it is more of an “evolution.”
We now see an exponential rise in AI adoption by pharma and biotech companies, research institutions, and discovery centers. There is a heated race to build a more efficient AI platform for drug research.
But where am I going with the above examples and this somewhat too-long intro? I merely want to let you know that at BiopharmaTrend, we have been into artificial intelligence for drug discovery since 2016, closely watching the rise and fall of hundreds of leading companies in this space. Back then, we launched (and still maintain) a dedicated analytical report that profiles the companies on our radar -- now more than 400 of them and counting. We even went as far as creating the first AI Productivity Index for drug discovery.

While this newsletter is dedicated to a broad spectrum of technologies and scientific areas, it will be heavily biased toward AI companies and case studies. It just historically happened to be our key topic of interest; later, we expanded into many other areas.
Now, here is what I have realized during years of market research and consulting -- for venture capital funds, drug discovery and biotech startups, and some big pharma clients -- via my consulting company, BiopharmaTrend:
First, artificial intelligence alone is not enough to make a revolution in science and the pharma business. It thrives on other data-enabling technologies -- for instance, next-generation sequencing, combinatorial chemistry (a synthetically accessible chemical space to search through), high-throughput proteomics, DNA-encoded libraries, cryo-EM, EHR databases, biobanks, and so on. True innovation happens where a really good source of quality biological/chemical/clinical data is available. In this newsletter, we will mostly focus on multidisciplinary topics, exploring the convergence of various technologies and scientific ideas.
Second, the exponential rise of artificial intelligence (mainly deep learning and NLP) has led to renewed interest in some quite old technologies, which can now be re-imagined using previously unavailable modeling capabilities.
Finally, for a breakthrough AI application to become a scalable business, the whole thing needs to be compatible with what I call “scalability technologies” or “scalability infrastructure.” For instance: can something technically be done in a high-throughput manner? Can it be produced rapidly?
The FFS framework
To understand a specific technological trend or a particular company’s growth potential, I often use traditional techniques common in the consulting space: SWOT analysis, PEST analysis, Porter’s Five Forces, the 3 Cs, formal benchmarking, the 4 Ps, the BCG growth-share matrix, and some others.
But as an aspiring consultant, I naturally came up with my own analytical framework (at least, I have never come across it elsewhere). I use it for three main purposes:
Identify and reflect on growth/innovation trend drivers in certain scientific niches
Reason about the growth potential of tech-driven drug discovery and biotech companies and explain it to clients (and to myself)
Guide my writing and analytical work (who is going to be the next industry disruptor and why?)
So, what’s the framework? I call it the FFS framework.
It is based on the idea that technologies can roughly be divided into foundational and functional. Foundational technologies are general-purpose and applicable to various aspects of an industry, or even to different industries. A good example is the family of deep neural networks (DNNs), or, more specifically, Transformers -- the architecture many large language models (LLMs) are based on.
DNNs are applicable to a multitude of tasks in drug discovery and biotech, so they are foundational tech. Other examples of foundational technologies include quantum computing, virtual reality, blockchain, etc.
In contrast to foundational technologies (which are mostly digital, data-, or hardware-related), there are functional technologies, peculiar to a specific field of science or a particular use case. For example, cryo-EM is a typical functional technology (in my view), focused on determining the 3D structures of molecules. There is not much you can do with it beyond structural biology or structural chemistry, but it is powerful when it comes to its function.
The combination of functional and foundational technologies often leads to disruptions in the relevant fields and to new applications of old technologies. For instance, combining good old next-generation sequencing (NGS) with deep learning algorithms may take target identification, gene editing, diagnostics, and a myriad of other use cases and novel areas of research to the next level.
Finally, there is a third crucial component for successful growth: scalability infrastructure. Both functional and foundational technologies must be compatible with some technologies that allow running things in high throughput mode, or on a large production scale.
For example, suppose there is a novel assay for preclinical research (a functional technology) whose output can be rapidly analyzed using deep learning techniques to yield unique insights. Is the new assay compatible with existing lab equipment and hardware/lab software? Can it be run at scale? Those questions might not matter much to a scientist running a proof-of-concept study, but they will be crucial for a company scaling up to commercially meaningful results.
Below is a simplified representation of the FFS framework for company/industry analysis:
Let’s review the recent success story of Moderna and see the FFS framework in action (in a very simplified manner here, just to illustrate the idea). Below is a simplified chart of a possible FFS analysis, where I, II, and III represent technologies, and I-II, I-III, and II-III represent tech compatibility considerations (e.g., whether a given pair of technologies can work together at scale):
Moderna rose to prominence after it came up with an mRNA vaccine in response to the rapidly escalating COVID-19 pandemic -- and it did so remarkably fast. The company also applies its platform to creating therapeutic vaccines. From a technological point of view, its success can be explained by an efficient match of functional, foundational, and scalability technologies.
Their core functional technologies are mRNA and the corresponding delivery system based on lipid nanoparticles (LNPs). Let’s roughly simplify it to just mRNA technology for the purpose of this example.
Next, they apply an organization-wide digital platform that integrates all R&D into a data-centric workflow, including analysis of target proteins, in silico design of the corresponding vaccines, and automation of various experiments down the line. They use cloud-based data infrastructure to connect the dots between all stages of the R&D pipeline, and they are known as a “digital-first” company (read their case study ‘How Building a Digital Biotech is Mission-Critical to Moderna’).
Finally, the critical element is the ability to synthesize RNA. This technology is not unique to Moderna, but it is available, scalable in principle, and integrated into their business model.
In summary, the timely and efficient combination of digital approaches (foundational tech) with state-of-the-art mRNA discovery (functional tech) and scalability infrastructure (a high-throughput mRNA preclinical production facility) -- combined, of course, with urgent market need -- led to the emergence of one of the most successful stories in the history of the biotech industry.
On a more general note, the FFS framework allows me to quickly spot areas of disruptive potential, where we can expect a lot of research and a wave of novel startups. If a growing new functional technology (or an improved variation of a legacy one) can be enabled by one of the foundational technologies (e.g., deep learning, NLP, computer vision, quantum computing), we can certainly expect a lot of exciting companies in that space.
Take metabolomics + machine learning, and we have room for disruptive drug discovery and diagnostics startups. Take organ-on-a-chip + machine learning, and we have next-generation preclinical research models, etc.
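To make the pairing logic concrete, here is a minimal, purely illustrative sketch of the FFS checklist in Python. The class names, fields, and the "all pairs compatible" rule are my own assumptions invented for illustration -- this is not a formal method from the framework, just a toy encoding of the idea that disruption requires all three technology types to be present and mutually compatible (the I-II, I-III, and II-III edges in the chart above):

```python
from dataclasses import dataclass, field

@dataclass
class Technology:
    name: str
    kind: str  # "foundational", "functional", or "scalability"

@dataclass
class FFSAssessment:
    foundational: Technology
    functional: Technology
    scalability: Technology
    # Pairwise compatibility flags, mirroring the I-II, I-III, II-III
    # edges of the chart (hypothetical encoding for this sketch).
    compatible: dict = field(default_factory=dict)

    def disruption_potential(self) -> bool:
        """True only when all three pairwise compatibilities hold."""
        pairs = [
            ("foundational", "functional"),
            ("foundational", "scalability"),
            ("functional", "scalability"),
        ]
        return all(self.compatible.get(p, False) for p in pairs)

# Roughly encoding the Moderna example from the article:
moderna = FFSAssessment(
    foundational=Technology("digital R&D platform", "foundational"),
    functional=Technology("mRNA + LNP delivery", "functional"),
    scalability=Technology("high-throughput mRNA synthesis", "scalability"),
    compatible={
        ("foundational", "functional"): True,
        ("foundational", "scalability"): True,
        ("functional", "scalability"): True,
    },
)
print(moderna.disruption_potential())  # → True
```

If any pairwise flag is missing or False -- say, a brilliant assay that cannot run on existing high-throughput hardware -- the checklist returns False, which is exactly the "great science, no scalable business" failure mode discussed above.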
Ok, that was a long intro -- thanks for getting through it. Time to discuss what to expect from the upcoming newsletters (they will be structured differently).
Future newsletters
I am planning to write for a broader audience of life science professionals and everyone interested in cutting-edge science, touching on a much wider scope of topics than just drug discovery. I will cover all aspects of life where biotech is changing our existence for the better -- from medicine to synbio, longevity, the environment, the food industry, and agriculture. We will also discuss progress in aging research and the frontiers of human augmentation.
This is a weekly newsletter, where you can expect the following types of insights:
discussions of notable trends in drug discovery, biotech, and clinical research
interviews with industry key opinion leaders in science and business
reviews of innovative startups, company listicles, and “company of the week” picks
reviews of cutting-edge technologies and scientific breakthroughs
occasional conference announcements for our media partners (we partner only with high-quality events and often include good ticket discounts)
Finally, each month we are planning to release a PDF version of our recurring report Where Tech Meets Bio, reviewing the ongoing developments and companies shaping the industry. The PDF report will be available for premium subscribers at BiopharmaTrend.com. Stay tuned.
Thanks for reading -- feel free to send your feedback to editor@biopharmatrend.com. If you don’t want to miss the next post, consider subscribing to have future issues delivered to your inbox:







