[Image courtesy of Sora/OpenAI]
If navigating your organization’s lab software landscape feels like untangling chaos, you’re not alone. While data and software maturity varies widely, many research-focused organizations contain walled gardens, with distinct labs operating as technological islands. Sometimes there are miniature islands within a single lab, or even within individual workstations. Connecting the proverbial data dots is imperative in increasingly cross-disciplinary research, but according to Thermo Fisher Scientific experts, the reality is often far more complex than anticipated. James Pena, Senior Product Manager, recalls one customer admitting to using “15 LIMS… half of them homegrown” within just one division.
This pervasive fragmentation can also extend to semantics. “Even within an organization, one laboratory might refer to a ‘sample’ or a ‘run,’ whereas another laboratory might refer to it as an ‘aliquot’ or a ‘sequence,’” Pena said.
Meeting diverse labs where they are
Addressing this level of deeply rooted fragmentation requires a methodical approach, rather than a disruptive, all-at-once overhaul. That’s where Thermo Fisher’s “land and expand” philosophy comes in. “We usually focus on one area to start transforming… then expand from there,” Pena said.
It’s a bit of crawl-walk-run, or land-and-expand within the organization
—Pena
The ambition to make data Findable, Accessible, Interoperable, and Reusable (FAIR) is well-established. Yet the journey toward FAIR data, a framework introduced nearly a decade ago, often stalls.
James Pena
Thermo Fisher notes that its Connect Platform embodies FAIR data principles through its secure, scalable data collection and storage features. That is, it offers a centralized repository that transforms discrete instrument data into findable, accessible, interoperable, and reusable formats that enable future reanalysis, analytics, and AI/ML applications.
Despite the promise of advanced data projects, the inherent complexity of modern research helps explain why grand, top-down integration efforts frequently lose steam, according to David Hardy, Sr. Manager of Data, Analytics and AI Enablement at Thermo Fisher Scientific. “Despite the good intentions and the known benefits, it’s a long road ahead – a multi-year journey,” Hardy explains. “And faced with those challenges on a multi-year journey, inevitably things like changes in leadership or shifts in priorities occur. I’ve heard a few times how organizations then start to question if the effort is really worth it, and they get set back a couple of years. It really requires visionary leaders who will stick with it, given the vast nature of the challenges in the data landscape.”
David Hardy
Key to tackling that landscape, especially for a major instrument vendor navigating labs filled with competitors’ equipment, is avoiding the creation of yet another proprietary walled garden. This requires a deliberate commitment to openness, explains Pena. “The first step is taking this agnostic approach to how we ingest, consume, and map data,” he stated. “Despite Thermo Fisher being a vendor of many different laboratory solutions, we acknowledge that laboratories are incredibly diverse.”
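To make the idea of agnostic ingestion and mapping concrete, here is a minimal, purely illustrative sketch of how an ingestion layer might normalize each lab’s local vocabulary (“aliquot,” “sequence”) into a shared canonical one before data lands in a central store. The terms and function names below are hypothetical, not Thermo Fisher’s actual schema or API:

```python
# Illustrative sketch of semantic normalization during ingestion.
# The vocabulary below is hypothetical, not Thermo Fisher's schema.
CANONICAL_TERMS = {
    "sample": "sample",
    "aliquot": "sample",    # one lab's "aliquot" is another's "sample"
    "run": "run",
    "sequence": "run",      # likewise for "sequence" vs. "run"
}

def normalize_record(record: dict) -> dict:
    """Map a lab-local record type onto the shared vocabulary."""
    local_type = record["type"].lower()
    if local_type not in CANONICAL_TERMS:
        raise ValueError(f"Unmapped term: {local_type!r}")
    # Return a copy with only the type field rewritten.
    return {**record, "type": CANONICAL_TERMS[local_type]}

print(normalize_record({"type": "Aliquot", "id": "A-42"}))
# → {'type': 'sample', 'id': 'A-42'}
```

Centralizing the mapping in one table, rather than hard-coding translations between each pair of systems, is what keeps such a layer vendor-agnostic: adding a new lab’s vocabulary means extending the table, not writing another point-to-point integration.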
Without this foundational integration work and a commitment to interoperability, Hardy warns, researchers remain bogged down, often inadvertently creating new problems while trying to solve old ones. “[Without accessible data] they’ll face the same issues of data access and wrangling, spending lots of time there,” Hardy noted. “Effectively, you’d just end up with a lot of home-brew type applications… just making more data silos effectively.”
One size rarely fits all…
Thermo Fisher underpins its strategy with a tiered offering in its Connect Platform, acknowledging that labs rarely exist on the same rung of the digital transformation ladder. This internal disparity can be stark; as Pena observed, some labs might champion cutting-edge IT while another team just down the hall might be more entrenched.
Recognizing this reality, the company offers multiple editions of Connect designed to meet labs where they are: a free Individual tier focused on connectivity for select Thermo Fisher instruments; a Team edition that introduces broader data management and vendor-agnostic connectivity for instruments capable of producing data files or connecting online; and, at the other end of the spectrum, a full Enterprise tier for system-wide orchestration. The platform’s capabilities expand significantly at the Team and Enterprise levels to embrace the multi-vendor reality found in most labs.
Part of the goal of Connect is to move away from the hard-coded, point-to-point integrations that grow brittle over time and prevent broader data sharing. As Hardy warns, researchers without a solid integration plan often wind up building home-brew applications that unintentionally reinforce the very fragmentation they set out to fix.
Committing to the journey to integrate today, enable tomorrow
But realizing a grand vision requires a careful, thoughtful approach. Sweeping, simultaneous changes across an organization often “grind to a halt,” as Hardy put it. As the proverb goes, slow and steady wins the race. Thermo Fisher’s incremental land-and-expand strategy, transforming one area and then spreading outward, also mitigates the risk of large-scale projects stalling. Sustaining a multi-year journey demands long-term leadership and commitment; as Hardy notes, organizations that waver partway through can set themselves back by years.
It really requires visionary leaders who will stick with it
—Hardy
But sticking with it is required to tap into emerging technologies and more forward-looking applications of AI, such as AI agents, which can potentially automate tasks ranging from ordering consumables (“a very low-risk activity,” according to Pena) to complex workflows. The potential is “enormous,” as Hardy notes. Yet the path isn’t purely technological. Hardy cautions that there are also cultural and regulatory dimensions to navigate. And then there is the “black box” nature of some AI (i.e., complex deep learning models used for predicting compound toxicity or analyzing medical images). The black box problem is “kind of antithesis to today’s validation structure,” Pena said. Consequently, it is generally a good idea to start with lower-risk applications and build from there, ensuring comfort and refining governance along the way.
By deliberately building an open, integrated data ecosystem first, organizations move away from chasing disconnected fixes and toward strategically enabling the next generation of scientific discovery. “I’d say it’s a really exciting time with the technology and helping customers with platforms like Connect Enterprise and using all these agentic AI tools,” Hardy said. “I think it’s going to be a fun journey.”