
Bringing NGS Testing In House



The text below is a transcript of the webinar. Because it is a transcript, there may be oddities arising from the process of converting speech into text. We recommend accessing the recording, above, to gain full context.

Overview and Introductions

At this time I'd like to introduce our speaker, Dr. Eric Loo. He is an assistant professor of pathology and laboratory medicine at Dartmouth-Hitchcock Medical Center, specializing in hematopathology and molecular genetic pathology. He also provides medical directorship for the laboratory at Mt. Ascutney Hospital, a Dartmouth-Hitchcock-affiliated critical access hospital in Vermont.

He serves as the New Hampshire state pathology representative to National Government Services, the Medicare Administrative Contractor for their jurisdiction, which is very important, as we all know, especially in molecular. And our other speaker is Dr. Rakesh Nagarajan. I think I screwed it up. I messed it up, I'm sorry. He is the founder and CEO of PierianDx. He trained as a physician-scientist with a deep interest in molecular biology and molecular laboratory workflows and techniques. His multiple informatics subspecialties include bioinformatics, biospecimen informatics, clinical trials informatics, and medical informatics. He currently serves on CAP's Molecular Oncology Committee and the Next Generation Sequencing Project Team. So we will turn your attention to our two speakers and the presentation they are making this morning. Welcome everyone.

Rakesh: Thanks so much for the kind introductions, and thank you all for being here at 7:00 in the morning. I'd like to jump right in and talk about the topics we'll be discussing today. First, we'll go through industry trends showing that complex molecular diagnostics and next generation sequencing are really taking off already, and are expected to be a very strong growth area. Second, we'll talk about the business case for why you should in-source your own next generation sequencing testing, and the value propositions that are available there.

Third, we'll talk about the challenges you can expect to face as you bring in next generation sequencing as the bedrock of the precision medicine program at your own organization. Fourth, we'll describe the blueprint for success: how do you get over these challenges and establish a precision medicine program very successfully at your organization? And finally, Eric will talk about the success story that is Dartmouth, including how they've navigated reimbursement and how to do that generally.

Next Generation Sequencing (NGS) Landscape

Rakesh: So let's jump right in and talk about the industry trends. There's already strong market adoption today in next generation sequencing. There were just north of about 360 laboratories conducting next generation sequencing back in 2016, averaging 30% to 40% year-over-year growth. There were over five million genetic tests performed, and just north of one million tests that used next generation sequencing. When you look at the entire molecular diagnostics market, that market size is expected to almost double over a five-year period, and a big portion of that is through next generation sequencing; sequencing is the fastest growing modality, with a CAGR that's estimated at almost 20%.

We're at a true inflection point when it comes to next generation sequencing because of a confluence of multiple events. One is that, as we all know, the cost of sequencing continues to drop; second, knowledge is really starting to grow and explode, especially as applied to targeted therapy and immunotherapy; and third, reimbursement is really starting to clarify. So here we show examples of all those trends, where immunotherapy approvals, such as the site-agnostic approval of Keytruda, really allowed folks to assess a biomarker like TMB, or tumor mutational burden, using next generation sequencing, and to apply that very broadly for all solid tumor types.

In March of this year, the FDA actually approved next generation sequencing tests, four different tests, and also approved payment for those tests in advanced stage cancer, so stage three, stage four cancers, and that was a landmark event: to have reimbursement for solid tumor assays that are greater than 50 genes. Let's get into the business case as to why you should bring NGS in-house. We'll start by talking through this AMP study that did a micro-costing analysis and a health economic analysis using a set of five different genomic sequencing procedure codes. These GSPs were set up in areas where there ought to be clear reimbursement, so this included targeted germline panels in hearing loss and X-linked intellectual disability, whole exome for rare undiagnosed genetic disorders, as well as somatic cancer panels, both what we call less-than-50-gene panels in solid tumor and greater-than-50-gene panels in solid tumor. [inaudible 00:06:39]

The first part of the analysis, it's a little hard to read here, took a survey across 13 laboratories that did one of these tests, and essentially determined the total cost of running the assay, which is summarized on the right-hand side there. As you can see, smaller targeted panels, both in somatic cancer and in germline, are somewhere between $700 and $1,500, and larger assays, like the greater-than-50-gene tumor panels and whole exomes, range somewhere between $2,000 to $2,500, and these costs are significantly lower than pure send-out-based testing. They then followed that up by doing a health economic analysis across a set of three conditions.

I'm only showing the example here for non-small cell lung cancer, where they compared the current care with the care that would have been gotten through the genomic sequencing procedure process, and that care in general is to really navigate a patient to a non-targeted therapy, to navigate them to a targeted therapy, to a trial, or to hospice. And each of the costs is shown on the right-hand side. As you can imagine, the cost of next generation sequencing is about double that of traditional testing, typically PCR-based testing for EGFR in this case. But the benefits are great. Namely, you've got almost half the adverse events through testing with the next generation sequencing test. Obviously greater eligibility for a clinical trial, which also increases cost, and a greater probability of entering hospice, but those costs are a lot lower than the use of non-targeted therapy, which occurs at a much greater level when using the standard of care and very limited PCR-based testing.

Overall, almost three million dollars in anticipated savings for a health plan that's covering a million lives. In a nutshell: cheaper, safer, more effective to do next generation sequencing. This slide really summarizes two additional studies that were cited in that same article: first, cost effectiveness of a panel in melanoma, and second, Intermountain's retrospective study, where they had matched controls of patients that underwent next generation sequencing versus those that didn't. In both cases there was not only an improvement in outcome, whether that's measured by quality-adjusted life years or progression-free survival, but the costs were also lower. So again: both higher quality of care and lower costs.

Does In Sourcing NGS Testing Make Sense?

Now I'd like to move into talking about why folks are choosing to in-source, from a survey of adopters that jumped into this process early. First and foremost, there is demand from clinicians. They're being bombarded through multiple marketing modalities with the message that next generation sequencing, looking at numerous genes, is the most effective way to manage their patients. So that's the topmost on the list. The second is really anticipated efficiencies: rather than have a block that needs to be sent out and tracked, and a report received that then needs to be put in the medical record, you can actually run that in-house, and that then also improves turnaround time, which is also called out as a reason to in-source.

Very importantly, organizations really want to establish a personalized medicine, or precision medicine, initiative at their own organization. There's great value for research: you're receiving all of that data in discrete fashion, to do research yourself. And I've already shown you that it reduces the overall cost of care; these same benefits we've heard time and again from our own customers. Before we jump into the anticipated challenges, I'd like to talk about a success case.

The Moffitt Cancer Center Story

Moffitt was our first client when we became a company in May of 2014, and chose to partner with us to establish their precision medicine, or NGS, program from their molecular diagnostic laboratory. They immediately launched with a solid tumor [inaudible 00:11:20] assay, so very targeted panels, back in 2014. They went live in October of 2015 with both of those assays. That included assay validation, which we'll get into in a minute, as well as EMR integration of that workflow: receiving an order through the LIS and pushing that signed-out report back up to their medical record.

They continued to use our products and services over time as they've run into the challenges that we'll describe. That includes utilizing what we call our laboratory services, which leverages a distributive model that I'll describe in a few minutes. They also moved to using our interpretation services, which really address the interpretation bottleneck that I'll cover in the middle of my presentation. And finally, they've matriculated to a much higher gene-number assay in solid tumors, called the TruSight Tumor 170, and again we'll cover that as the progression, or the evolution, of assays that most organizations will move through as they establish a precision medicine program at their own organization.

The top barriers that were cited by early adopters: first and foremost is a scarcity of informatics expertise. Next generation sequencing requires a huge lift in IT and informatics, both in storage, computation, annotation, interpretation, provenance tracking, things of that nature, and you've got to have a really good handle on all of those concepts. The second is that the technology is very rapidly evolving, so the minute you establish an assay after validation, there are 10 new assays that are bigger, better, cheaper. How do you really navigate that? How often should you update your panel? Why should you update a panel? Things of that nature I think you really need to think through. Validation itself is an enormous lift, and we'll talk much more about that. That has become less of a problem, in my opinion, since 2011, when at Washington University we validated one of the first next generation sequencing assays; seven years later there are now published articles on how to validate an NGS assay, and we'll talk a little bit more about that.

There is an expense to implementation that can't be disregarded. This includes capital equipment costs, variable costs for consumables, technologists trained in molecular and next generation sequencing techniques, and obviously professionals, folks who review and sign out cases. All are needed in order to launch one or more NGS assays, and really the first application is the most difficult to get launched. As an example, again back in my days at Washington University, the first assay took about nine months to launch, the second assay about five, five and a half months, the third about 90 days, and so we learned very quickly how to do that over time, but that first assay is a huge lift. And finally, the amount of data that you have to curate yourself, if you're going to take on that activity, is enormous, and the waves of data are monumental to try to interpret cases effectively as you keep up with the literature, keep up with the latest practice guidelines, things of that nature.

This slide really summarizes that interpretation bottleneck that I was describing, that difficulty in curating data. As we go from the top to the bottom of the funnel, essentially there are raw data in the structure of what we call reads; that's going from terabytes to gigabytes of data. There's processing and detection of variants that happens through alignment and variant calling. There is then filtering and validation, i.e., identifying variants that are real in the actual sample, and that happens through a validation process where appropriate cutoffs are set previously. You then move into the area of annotation and interpretation. The annotation can include biological annotation, functional annotation, in silico predictions of the effect of a mutation on protein function, things of that nature, followed by interpretation: is the variant really clinically relevant in my patient's context, and if so, why? Is it therapeutic? Is it diagnostic? Is it prognostic? Is there a risk of disease that should trigger genetic counseling?

Things of that nature, and then finally there's the clinical application: how is the treating physician going to manage the patient, by talking with them, by coming up with a treatment plan or a management plan in concert with the patient and his or her family? Let's jump right into the blueprint for success, and in order to do that I really want to talk about the clinical workflow in a typical molecular diagnostic laboratory that's running next generation sequencing. You start essentially with tissue, or a tube of blood. That tissue or tube of blood is purified into DNA, or RNA. It's then prepped through library prep and put onto the sequencer.
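Returning to the filtering and validation step in the interpretation funnel just described, here is a minimal sketch of how validated cutoffs separate real variants from noise. The thresholds and field names are hypothetical, chosen only to make the idea concrete; real cutoffs come from each lab's own validation:

```python
# Illustrative sketch of the filtering/validation step in the interpretation
# funnel. Cutoff values are assumptions, not recommended clinical thresholds;
# real cutoffs are established during assay validation.

MIN_DEPTH = 250        # minimum total reads covering the position
MIN_ALT_READS = 8      # minimum reads supporting the variant
MIN_VAF = 0.05         # minimum variant allele fraction

def passes_filters(variant):
    """Return True if a called variant clears the validated cutoffs."""
    depth = variant["depth"]
    alt = variant["alt_reads"]
    vaf = alt / depth if depth else 0.0
    return depth >= MIN_DEPTH and alt >= MIN_ALT_READS and vaf >= MIN_VAF

calls = [
    {"gene": "EGFR", "depth": 900, "alt_reads": 120},  # likely real
    {"gene": "TP53", "depth": 300, "alt_reads": 4},    # too few supporting reads
    {"gene": "KRAS", "depth": 60,  "alt_reads": 12},   # coverage too low
]

real_variants = [v for v in calls if passes_filters(v)]
print([v["gene"] for v in real_variants])  # ['EGFR']
```

Only variants surviving this gate move on to annotation and interpretation, which is what shrinks terabytes of reads down to a handful of reportable findings.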

The sequencer generates those terabytes of data that I talked about, which then need to be processed through what's called secondary analysis, alignment and variant calling, including then annotation and what we call classification. To classify variants, the different regulatory bodies in both germline and somatic cancer have now established classification systems for how to classify variants through a methodological, evidence-based approach, ACMG in germline and AMP/ASCO/CAP in somatic cancer, respectively, so variants are classified for their clinical relevance. There are data visualization and QC analyses that need to be done: did the sample successfully go through the entire process, both the wet and the dry that I described previously? Should the sample be rerun? Should you backfill certain components using Sanger sequencing or other approaches? So a variety of QC analyses need to be done before you get into the interpretation process.
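One way to picture those QC gates: after sequencing, summary metrics for each sample are checked against acceptance criteria before the case moves on to interpretation. The metric names and thresholds below are hypothetical illustrations, not standard values:

```python
# Hedged sketch of a sample-level QC gate; metric names and thresholds are
# illustrative assumptions set during validation, not universal values.

QC_CRITERIA = {
    "mean_coverage": lambda v: v >= 500,           # average depth across targets
    "pct_targets_over_100x": lambda v: v >= 95.0,  # coverage uniformity
    "contamination_estimate": lambda v: v <= 0.02, # cross-sample contamination
}

def qc_verdict(metrics):
    """Return (passed, failures): which acceptance criteria the sample missed."""
    failures = [name for name, ok in QC_CRITERIA.items() if not ok(metrics[name])]
    return (len(failures) == 0, failures)

sample = {"mean_coverage": 742, "pct_targets_over_100x": 97.3,
          "contamination_estimate": 0.01}
passed, failures = qc_verdict(sample)
print(passed, failures)  # True []
```

A sample that fails one or more gates would be flagged for rerun, or for backfill of the failed regions by an orthogonal method such as Sanger sequencing.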

That interpretation process then uses clinically relevant databases, so in somatic cancer that includes FDA approval labels, NCCN and ASCO practice guidelines, and the latest literature for convincing evidence where that evidence has not yet made it into practice guidelines, but it's right on the cusp. All of those elements need to be reviewed and finalized by a medical director and signed out, and that report then needs to be pushed to the medical record, or integrated with third parties. Now, this workflow really can be distributed. In a distributed model of testing, roughly, this is color-coded as wet, dry, and professional. There are organizations today that are CLIA-certified that will do the technical sequencing. So before you commit to the capital cost of establishing a laboratory in your own organization, you can actually get the sequencing done at another site. You yourself can still practice that medicine by receiving the discrete data back, processing it through informatics solutions, and doing the professional reporting yourself.

But there are ways to get going through this distributive model. There are other organizations that will actually support the professional component of this workflow. So if you're lacking professionals, whether those are geneticists, genetic counselors, molecular pathologists, or what we call variant scientists, those functions can be supported by organizations that provide them. In a nutshell, you can in-source this type of testing, still start bringing the data internally, and leverage the expertise that you have, whether it's a very strong laboratory function, a very strong informatics function, or a very strong professional function. I'd now like to jump into how to start thinking about establishing a precision medicine program at your own organization.

It all starts with identifying the clinical need and examining your own institutional strengths. You may have strengths in particular institutes or centers that are strong in somatic cancer, or neurodegenerative disorders, or pediatrics, things of that nature. You should also examine strengths in research, as well as the existence, or lack, of informatics, and whether that's consolidated or fragmented. You then get into reviewing your own institution's priorities, and again these are typically largely driven by those same centers and institutes. Typically there are champions that wish to establish precision medicine programs. Those champions come from different clinical specialties, and/or they have research or educational agendas, and those then tie into the competitive market you're in and the stature that you have as an organization.

You want to keep establishing that leadership position for yourself. You then get into establishing a business plan: how are you going to resource this institutionally? Who's going to pay for it? Is it centrally paid? Is it paid individually by multiple departments as well as through a central payment? Are there philanthropic funds? Things of that nature. What reimbursement strategies will you employ? Eric will talk a lot about that. And what resources are needed on the research side in order to effectively warehouse these data and potentially mine them in the future? You then get into prioritizing initial clinical applications. You can't do everything at once; it's very hard to establish a PGx, somatic cancer, targeted germline, pediatric, and whole exome program simultaneously. Essentially, you then get into developing a project plan, getting institutional approval, and implementing. There is a monitoring process to assess outcome measures as you go through the establishment of that precision medicine program; as new resources are needed, push those back into the business plan, reassess, and turn that crank over and over again as you grow the precision [inaudible 00:21:58] program. This slide really talks about, logistically, how to get into NGS, and this is really the path many of our customers have followed: namely, first in somatic cancer, especially a conversion of PCR-based testing to NGS-based testing for that same purpose. So perhaps you're looking at only targeted sites by PCR; now you're going to expand to look at whole genes, but a smaller number of genes.

The next phase is to really think about starting with commercially validated panels, in that the vendor has already done a good job of having adequate coverage of the areas they target, and you're not starting from scratch by designing your own probe sets or amplicons in order to do that. You can then matriculate to what we call custom NGS panels, in order to design your own assays, and then move from more targeted NGS panels to larger panels, all the way to exome. And as I already described, the other pathway to really consider is to start by outsourcing the sequencing through laboratory services as you establish your financial needs and a full laboratory; that is a very safe way to start practicing that precision medicine program before you bring the entire component in-house, and that would include the wet, dry, and professional. So as you think about getting into the NGS space, I described this challenge of validation. Assay validation is a challenge in the next generation sequencing space. What you are actually sequencing is quite large, so you need appropriate validation, and that includes assessing the minimum number of samples needed for each variant type. So if you're going to sequence single nucleotide variants and indels and copy number variants, there are a minimum number of samples needed to assess those different variant types. If you're going to accept different sample types, again, there are a minimum number of samples needed for, let's say, bone marrow, blood, FFPE, things of that nature. You have to assess the limitations of your assays; perhaps you're only sequencing SNVs and indels.

So what is the largest size of indel you can detect? You have to expose those limitations to the treating physician. You obviously have to calculate things like analytical specificity, sensitivity, positive predictive value, accuracy, things of that nature, and it is during that validation process that you establish acceptance and rejection criteria. That allows you to process the sample properly in production and determine when you need to rerun that sample, starting from the library prep or all the way back from DNA isolation. And finally, in NGS the lower limit of detection is a unique concept, in that it's actually a function of both coverage and [inaudible 00:25:11] fraction; because you have such depth, and range of depth, of sequencing in NGS, you have to calculate the lower limit of detection as a function of both of those elements.
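The point that the lower limit of detection depends jointly on depth and variant allele fraction can be made concrete with a simple binomial model: the chance of observing at least k variant-supporting reads when a variant is present at fraction f and the position is covered to depth D. The parameters below are illustrative, and the model deliberately ignores sequencing error:

```python
import math

def detection_probability(depth, vaf, min_alt_reads):
    """P(seeing >= min_alt_reads variant-supporting reads), modeling reads as
    independent binomial draws -- a simplification that ignores sequencing error."""
    p_miss = sum(
        math.comb(depth, i) * (vaf ** i) * ((1 - vaf) ** (depth - i))
        for i in range(min_alt_reads)
    )
    return 1 - p_miss

# A 5% variant at 500x depth, requiring 8 supporting reads, is almost always seen...
print(round(detection_probability(500, 0.05, 8), 4))
# ...but the same 5% variant at 100x depth is missed most of the time.
print(round(detection_probability(100, 0.05, 8), 4))
```

This is why a validated LOD claim has to state both the allele fraction and the depth at which it holds: dropping the coverage drops the effective sensitivity even though the assay chemistry is unchanged.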

I'd like to conclude my part of the presentation by talking about the complexity of the workflow and the data flow within the system. Once data are generated, which is shown there on the top left, those data typically are, as I described, processed through secondary analysis, annotation, QC, and interpretation. That could include both DNA and RNA, or DNA or RNA only, depending on the types of samples that you're receiving. Those then need to be classified and interpreted in order to generate a final report; that's at the bottom right. These workflows are now typically well handled by informatics companies, so rather than be daunted by this complexity, embrace it. And with that I'd like to hand this over to Eric to talk about the Dartmouth case study and reimbursement. Thank you.

How Dartmouth-Hitchcock Brought NGS Testing In House

Eric Loo: The case study at Dartmouth. My father always told me that average people learn from their own mistakes; if you want to be smart, you've got to learn from other people's mistakes. I'm hoping that by sharing some of the trials and tribulations that we went through, it will alleviate some of the burdens you might experience bringing this in-house yourself. This is Dartmouth-Hitchcock. We're not really the biggest organization ever, but we are the largest health care provider for our state. If you can see here, this is the flagship hospital at the main campus in Lebanon, New Hampshire.

You can see we're surrounded by forests, and that's the [inaudible 00:27:03] thing for Dartmouth. But anyway, we're a 417-bed hospital. We had just under 20,000 admissions last year. So really not the largest hospital center, but we do have a lot of affiliate sites that we draw samples from. If you look at the state of New Hampshire, you can see most of the population is actually pretty densely located down in the southern part of the state, so we actually do lose a lot of business down towards the Boston area. But we reclaim some of that from central Vermont, from the northern part of the state, and a little bit from Maine. Dartmouth itself is right there. Why would you want to bring NGS into your own institution? We are an academic medical center; we train about 400 residents and fellows per year.

If we really want to support the academic mission, if you're going to be producing pathologists who are going to be managing molecular labs in the future, you want them to be exposed to this type of thing. So that's one of the key things that led us to bring testing in-house, but we also wanted to support the clinical mission. You know for a fact that your clinicians are going to be ordering these tests regardless of whether you provide them in-house or not. And so you really want to ask yourself: can you make it cheaper yourself than if you were to send it out? In our case, it turned out to be the case. So these are our in-house volumes. The state of New Hampshire has about 1.3 million people, and by area it's about twice the size of Los Angeles County. Los Angeles County has about eight times as many people as New Hampshire, though.

If you look at our own patient population, that amount of people gives us about this many anatomic pathology specimens, and the CGAT lab, the Clinical Genomics and Advanced Technology lab, which is our molecular lab at Dartmouth, really gets maybe about 25,000 samples per year. The goal for the lab in general, though, is to maintain the send-out volumes at about less than 5% or so and keep the send-out costs at about 7% to 8% of the total operating costs for the lab. We try to do aggressive test utilization and contracting with our reference labs to maintain things. Now, in terms of the molecular lab itself, Greg Tsongalis has compared this to running a pizza joint, in that you have to have X number of cheese pizzas to maintain your financial viability, whereas other specialty pizzas don't earn you that much, but you know it's important to have them.

If you look at the specimen types, or the assay types that we provide: we have solid tumor testing, genetic testing, infectious disease testing, and hematologic malignancy testing. The ID volume, the infectious disease volume, really is the driver for our lab in terms of the financials. The molecular lab is never going to be an economic engine for your pathology department in general; that's going to be your chemistry and whatnot, but you can still produce some good volume there.

This is a complex slide, but it's the one to focus on. It shows that it really is less costly to make stuff in-house. If you look at this column here, we're trying to compare apples to apples: CPT codes that we're charging in-house for various assays, versus the CPT codes that would have been charged if we sent them to a reference lab. The financial numbers have been fudged a little bit on purpose to protect pricing confidentiality, but the year-to-date volumes and the savings are accurate based on the actual numbers. This is a comparison of some of the various NGS assays that we provide in-house, the myeloid sequencing panel, lung cancer panel, and a melanoma panel. This is the direct variable cost for the lab per unit assay, this is the fully loaded cost, and this is what it would have cost to send out that assay. So if we look at the totals year to date, this is only looking at maybe about 400 samples for the first quarter of this year, but we've saved the institution about 40% of total costs overall. So it does make some financial sense to do this: you lose less money by bringing it in-house. Even though the reimbursement that we get is getting better, it still doesn't really cover the cost of the assay, but you're still losing less money overall by bringing these testing modalities in-house.
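The in-house versus send-out comparison described above boils down to simple per-assay arithmetic. Every figure in this sketch is invented for illustration (the talk's own numbers were deliberately fudged for confidentiality), and the panel names are just placeholders:

```python
# Hypothetical per-assay cost comparison; all dollar figures and volumes are
# invented for illustration, not Dartmouth's actual (confidential) pricing.

assays = {
    # name: (fully loaded in-house cost, send-out cost, year-to-date volume)
    "myeloid panel":  (1200, 2100, 180),
    "lung panel":     (1100, 1900, 140),
    "melanoma panel": (1000, 1700, 80),
}

total_in_house = sum(cost * vol for cost, _, vol in assays.values())
total_send_out = sum(send * vol for _, send, vol in assays.values())
savings = total_send_out - total_in_house
pct = 100 * savings / total_send_out
print(f"saved ${savings:,} ({pct:.0f}% of send-out cost)")
# saved $330,000 (42% of send-out cost)
```

The same arithmetic also shows why volume matters: the fixed costs baked into the fully loaded in-house figure are spread across every sample, so savings grow as test volume grows.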

Now, it is a complex workflow, and the slide really doesn't do it justice, but when they talk about a two-week turnaround time, it really does take a minimum of two weeks. In the first week, just the first couple of days are required to extract the DNA and to prepare the DNA sequencing library and all of that, and then the sequencing portion really is a time sink as well: from the moment you put it on the sequencer, it takes about thirty hours before you can take it off the sequencer, and it's only at the very end of the fourth day that you know the quality of your sequencing. So if a sample failed, you won't know it until the end of the first week, and that could increase the turnaround time by a week right there.

The second week really is all of this dry lab stuff that Rakesh was talking about, and that is nothing to sneeze at. The second week here, this graph doesn't really do it justice; the workflow is really complex in itself. There's a lot of quality control analysis that has to go into the assay. If you've never run one of these assays before: the polymerase that you have is functioning in vitro, right? You don't have all of the mechanisms that you have in your body to correct errors, so your polymerase is going to introduce errors into the sequence that you're generating every hundred base pairs or so. Every hundred base pairs doesn't sound like a lot, but when you're reading hundreds of thousands of reads per patient, those little errors that get introduced really can add up.
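To see why those per-base errors matter at scale, and why depth-based filtering still catches them, here is a back-of-the-envelope calculation using the roughly one-error-per-hundred-bases figure from the talk. The run parameters are hypothetical:

```python
import math

ERROR_RATE = 0.01      # ~1 error per 100 sequenced bases, per the talk
READS = 500_000        # hypothetical read count for one sample
READ_LENGTH = 150      # hypothetical bases per read

expected_errors = READS * READ_LENGTH * ERROR_RATE
print(f"{expected_errors:,.0f} raw errors expected in the run")  # 750,000

# But random errors rarely mimic a real variant. At 500x depth, errors at any
# one position average ~5 reads (1% of 500), while a real 5% variant averages
# ~25 supporting reads. The chance that noise alone reaches that level:
depth, real_variant_reads = 500, 25
p_noise = 1 - sum(
    math.comb(depth, i) * ERROR_RATE**i * (1 - ERROR_RATE)**(depth - i)
    for i in range(real_variant_reads)
)
print(f"P(noise reaches 25 supporting reads) = {p_noise:.1e}")
```

Hundreds of thousands of raw errors per run, yet almost none of them stack up at a single position the way a genuine variant does, which is exactly the separation the dry-lab QC pipeline is built to exploit.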

All of this is to make sure that we're not reporting out an error as an actual pathogenic variant in a patient. So all of this is really important.

We really hit a pain point at the dry bench; there was a big bottleneck there, and we were running on about 1.5 FTEs. There are some molecular labs that have six, seven different bioinformaticians [inaudible 00:35:34] that are able to help you process this data, and even then some of the reports that they turn out are not really all that beautiful looking. [inaudible 00:35:44] diagnostics really allowed us to widen that bottleneck and make things flow a little bit better. One of the other benefits of doing this was uniformity of reporting. There's going to be a bottleneck in the professional component as well sometimes, unless you have a lot of molecular pathologists in-house. And given the complexity of the reporting, there's going to be a lot of inconsistency in the way things are interpreted as well. You don't want your institution reporting out the same variant all different ways. Going with one of these services, like PierianDx for example, allows you to have more uniformity of reporting regardless of the pathologist who is eventually going to sign out the case.

Now, you're generating all of this data, and if you were here yesterday you know that data and information are power. You want to do something with that, so the long-term vision at Dartmouth is to use this in conjunction with the biorepository. Our institution actually created a shared resource for the whole institution, which is a biobank, and it's a CAP-certified one, and we're partnering with BC Platforms, which is a company that provides software to help network and regulate your biobank. As you'll probably hear later today if you attend the session with Dr. Maurice from the Hill Clinic, the pharmaceutical industry is really, really interested in obtaining this data, so all of the tumor specimens that we have, have been de-identified.

They're going to be put into this biorepository, and the clincher here is that we have all of the genetic information from the sequencing associated with the tumor specimens. This is all going to be put into our biobank. BC Platforms allows pharmaceutical companies and all the other researchers that might be interested in collaborating to actually query this network of biobanks, to find whether or not an institution has a patient or tumor sample with a particular mutation or variant that they may be interested in. So this will allow us to partner outside of our institution, to do research, and possibly to get our patient population more clinical trial opportunities and whatnot.

I think this is a very powerful thing that may not be talked about too much, but you can also use all of the data for other internal quality control purposes. Now, it is an investment. I don't know if a for-profit hospital institution would want to take the plunge, but if you do, you really need to invest in people. In an ideal world you want to have at least maybe three people that are able to run the assay. This is the clinical lab at Dartmouth, and that's Greg, he's the leading visionary for the lab. But anyway, you want to have three clinical lab scientists that are able to run the assay, and depending on where you are, it might be hard to find one. Not everyone can run a molecular assay; you have to have the ability to run very complex assays and pay a lot of attention to detail, and that's something not everybody may be able to do. And if you do decide to train somebody in-house, you have to take into account the length of training. You won't be able to have somebody run an assay in just a week. It would probably take two to three months for that purpose.

And three clinical lab scientists are ideal, because not only do you need somebody to run the assay, you'll need people to run revalidations when they occur, and there are always going to be emergency situations where you need backup coverage. I say that with a caveat: we've been running on 1.5 FTEs and those folks have been really stressed out. But if you find somebody, and you're in a remote location like us, you want to keep them happy. You also want at least two bioinformaticists to help run the dry lab portion and send stuff out there, and they can be a little bit tough to find as well, if you're out in a more remote area. They don't grow on trees.

You need to invest in equipment as well. The sequencers that we have in-house are actually on reagent rental, but there is some hard capital that you would need to invest in the DNA extraction devices. You might want to invest in some robotics to help automate the process a little bit, especially if you're short-staffed. And then, one thing to not ignore is the IT infrastructure ... This is a huge component that you'll need to consider. You're going to be generating terabytes of data every week, and that's not something to sneeze at. We originally had considered farming things out to the cloud and using Microsoft Azure, but that got a little bit too expensive when we really costed it out.

Right now we're just buying up more and more IT space with our hospital systems, but this is something to consider. It'll take at least six months to get your pipelines set up, because all of the information sending and sharing has to be HIPAA-compliant, so that's something to factor in when you're building your assays. The key learning experience from our lab, at least, is that if you're an academic institution you definitely want to bring this in-house, but even if you're not, there are some good financial reasons to bring the testing in-house. It is an investment of time, money, and personnel, so consider that. And even though it's a very complex workflow, you don't have to do everything in-house. You are able to farm out things to make it more cost effective for your own institution, and there are multiple ways to do that.

Reimbursement for NGS Tests

Now, the elephant in the room, however, is going to be the reimbursement issue. There are both private insurers and the government, and this is not something to sneeze at, because all of the policies that you have to follow to get reimbursed are very complex. Now, I'm not going to talk too much about Medicare, but it is worthwhile to talk about the MolDX program a little bit. If you're not familiar with molecular testing, the MolDX program is a lab benefits management program that was developed by the MAC Palmetto in this region here. And all of these areas, or these jurisdictions, that have dotted lines around them: if you operate in one of those states, you fall under the jurisdiction of a MAC that uses the local coverage determination that was determined by MAC Palmetto, and so that would be the MolDX program.

Getting reimbursed if you live in one of these states is a little bit more onerous, because you have to register any test that you perform with these CPT codes with the MolDX program, and that in itself is a large, burdensome task. There are multiple forms that you need to fill out. You need to submit all of the verification data that you used to validate your assay in-house, and after all of this is done, which can take months, you would be assigned a Z-code. And the Z-code is what they use to compare against their master list, to determine your reimbursement. Even if you are assigned a Z-code, it doesn't guarantee your reimbursement; they may reimburse you the full amount, reimburse partially, or not at all, so it can be a little bit onerous.

Now the private insurers, on the other hand, are like the Wild West. We didn't do a good job of regulating ourselves, so external forces have been coming in to regulate us, and the first shoe dropped for us at Dartmouth Hitchcock in November of 2017. That's when UnitedHealthcare contacted the contracting department at our institution to let them know that something was happening. I guess technically they made the announcement on their web page a few months before that, but putting something on their web page and not informing people isn't the best way to get information out. In any event, prior authorization requirements started coming. We learned about them in November of 2017, and we have been really scrambling to comply with all these things. And once UnitedHealthcare notified us and we learned about this, we did additional digging and discovered that multiple private insurers in our area were starting to develop these laboratory benefits management programs that require prior authorization for molecular testing.

And so to deal with this (it's kind of like a bad joke: how many hospital administrators does it take to deal with prior authorization?), it really does take a coordinated team effort. The people that we had to get together to manage this included the people involved in professional coding, the people involved in the authorization process, the billing systems, hospital lab IT, and the molecular genetics lab. And the reason for this was that there are multiple parts to billing ... Our lab manager really did a fantastic job in getting the lab registered for all of the different components, since each of the private insurers has their own different pathways and requirements for you to follow to get reimbursed. That in itself is a huge, onerous task.

The clinical colleagues really didn't have to do very much on their end, because the prior auth process for them would remain about the same. But for the laboratory, a lot of the molecular diagnostic testing associated with cancers and tumors is ordered by pathologists, and that in itself is a little bit of a task, because pathology hasn't traditionally been involved in obtaining prior authorization. We don't see the patients, we don't know their insurance information and whatnot, so we had to work with our revenue cycle management partner to obtain, um, the prior authorization. The fourteen-day rule came up a little bit, but it really didn't affect us too much.

If you recall, the 2018 final rule that went into effect on January first eliminated all the issues that we were facing. The big problem that we ran into with all pathologist-ordered molecular testing was that in order to obtain prior authorization for the test, it has to be associated with an ICD-10 code, and oftentimes that hadn't been generated yet. If you look at the workflow for the evaluation of a bone marrow biopsy: if the biopsy is taken on day zero, I know that this is going to be an acute leukemia. I need to have the molecular testing for diagnostic purposes, so I'll order it at this time to prevent any delays in patient care. However, the coding people might need it on this date for prior authorization, but I can't send out the report until a day or two afterwards, so the diagnosis hasn't been made yet.

It's a little bit problematic for the coders and the people that are trying to obtain prior authorization at that time. It's something we're still working through, but it's something to be aware of. With that, I'm running out of time, so I'd like to thank you, and thank all these folks in the molecular lab at Dartmouth Hitchcock for helping to prepare this presentation. And I'll turn it back to Rakesh to wrap up.

Rakesh: Great, thanks so much, Eric. I'd like to thank the PierianDx folks as well; it takes a village to support the informatics and the associated services that we provide to our partner labs. I'd really like to recap by asking: are you ready to establish an NGS program at your own organization? I'll summarize by telling you that the industry trends are positive; there's great momentum with complex molecular testing, especially using next generation sequencing. We've shown you that you can advance knowledge, improve patient care, and even create a positive economic impact, through both publications and Dartmouth's specific example. We've shown you that you have to expect the challenges, you have to roll with the punches as Eric described, and make sure you're adequately resourced.

There are blueprints for success that you can adopt as appropriate for your own organization. And of course there are many forces that are ready to aid you on your journey: both your peers, peer organizations that are there to help you, as well as industry partners. So with that, we'll stop and take any questions from the audience.

Audience member: Eric, in your lab, on the wet side part, are you saying that you combine the molecular for infectious diseases and cancer, and non-cancer, all in one wet lab, or do you have ... And then is it spread out for the reporting, the [inaudible 00:49:28] then [inaudible 00:49:29]? How do you do that on the front end, is it multiple wet labs, or ...

Eric Loo: Dartmouth is an interesting case. A few years ago, we reconsolidated and restructured the lab a little bit, to make things a little bit more lean. The molecular lab at Dartmouth is actually unified, in one location. We have an area for DNA extraction that's very wide and mobile, that we can restructure as needed. The infectious disease testing, the tumor sequencing, all of that stuff happens in the same physical location. For the dry lab portion, which is a complexity in itself, we just didn't have enough hands to do the work, and we at that time had a little bit of a financial crunch, so we weren't given permission to hire more people, but we still needed to get the work done. And because of that, we decided to farm out that portion of the workflow to PierianDx, and it worked out pretty well.

Audience member: Did you have any challenges in handling tissues, potentially infectious tissues, versus [inaudible 00:50:38] blocks or blood samples? I mean, those are quite a heterogeneous group.

Eric Loo: Right. Greg actually put together a really good team of people, and we didn't bring on NGS until 2014, so we had a bunch of molecular assays that we were doing prior to 2014, like the B- and T-cell clonality studies, some of the infectious disease studies and stuff. All of that was in place at the time, so it was easy for us to just grow the wet lab portion. When we decided to bring in NGS testing in 2014, the wet lab portion was there. We had people that were trained in molecular techniques and were able to do the testing portion, but the data that was being generated at that time was just a huge explosion that we really weren't able to cope with, and we weren't able to hire hands to process it, so yeah.


Download the slides used in the webinar.