
Interview with Deep Lens Co-Founders and Pathologist User

[Image: the VIPER slide viewer]

Deep Lens Inc. is a digital pathology start-up from Ohio that exited stealth mode last week and announced $3.2 million in funding from Sierra Ventures, Rev1 Ventures, and Tamarind-Hill Fund. The company is focused on making pathology more accessible by creating a user-friendly, informative, AI-powered platform and database of digital pathology slides that streamlines the workflow of pathologists. We spoke with Dave Billiter, Deep Lens CEO; Simon Arkell, Deep Lens President; and Dr. Jason Jarzembowski, Chief of Pediatric Pathology at the Medical College of Wisconsin and Deep Lens consultant.

Ben Ouyang, Medgadget: Tell me about Deep Lens.

Dave Billiter, Deep Lens CEO: Deep Lens is a cloud-based platform that supports pathologists and translational scientists, drives new discoveries in research, and gives pathologists and researchers significant advantages in quality and workflow optimization. It's really focused on benefiting patients: supporting the medical professional while providing access to the biopharma companies producing the new discoveries, clinical trials, and drugs that lead to better treatment.

 

Medgadget: How does it work? What’s the input and what’s the output?

Billiter: For any group that's trying to go digital, we're breaking down their barriers. We provide a laboratory service that digitizes the glass slides they produce and then dynamically loads them into our cloud servers. It mimics what they see in their upright microscopes, but in a digital fashion. From there they get into the VIPER workflow. VIPER has been developed with expert pathologists over the last 10 years.

The inputs are the images that represent the glass slides, along with any documentation that supports the case: molecular probes, genetics, you name it. The advantage of our platform is that pathologists face a tsunami of information when making a diagnosis to get a patient on a trial. VIPER takes in many different types of information and presents that data based on the specific tumour type.

From an output perspective, we generate a report that can be integrated with other systems via an API. This is where artificial intelligence comes in. A big part of what we're doing is generating analytics on top of those images, as well as associations with the cases themselves. We can segment, quantify, and obtain specific attributes in the image associated with a case. This provides more value to the pathologist and can enable more precise clinical trials.

VIPER is a tool that can also aid and educate pathologists who may not have a specialty in, for example, adenocarcinoma. As we progress with our artificial intelligence and deep learning, as soon as an image is produced it appears in the case view for that pathologist. Based on machine learning that works across our network and can recognize patterns purely from our images, Deep Lens Assist will say: "This image is 89% similar to other slides of this tumour type." That's the first level of assisting the pathologist, saying: "Hey, this looks like adenocarcinoma."
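The interview doesn't specify how the "89% similar" score is computed. One common approach for this kind of slide-similarity assist is nearest-neighbour search over image embeddings using cosine similarity; the sketch below is a hypothetical illustration of that idea (the function names and the embedding dictionary are assumptions, not Deep Lens APIs):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def most_similar_tumour_type(query_embedding, reference_embeddings):
    """Compare a new slide's embedding against reference slides grouped by
    tumour type; return the best-matching type and its similarity score."""
    best_type, best_score = None, -1.0
    for tumour_type, embeddings in reference_embeddings.items():
        score = max(cosine_similarity(query_embedding, e) for e in embeddings)
        if score > best_score:
            best_type, best_score = tumour_type, score
    return best_type, best_score
```

Under this reading, a score of 0.89 against the adenocarcinoma reference set would correspond to the "89% similar" message described above.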

The second level says: "You may want to pay specific attention to this aspect of the slide image." Features like percent mitosis, mitotic index, and anaplasia are ones we identified through years of working with pathologists.

The third level gets to the forms and supporting information themselves, to help a pathologist move through the report. Not only does this help confirm the diagnosis; we also look at the quality of the specimens throughout those forms, which supports the pathologist from a quality assurance standpoint. It also cues up our model for what we're doing with the biopharma industry: the logic-based workflow gives us the opportunity to present clinical trial information, new biomarker information, and new stains that pathologists may not know about.
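A logic-based workflow that surfaces trial information can be pictured as simple criteria matching over case attributes. This is a minimal sketch under assumed data shapes; the field names and trial records are hypothetical illustrations, not Deep Lens data structures:

```python
def matching_trials(case, trials):
    """Return the trials whose eligibility criteria are all satisfied
    by the case's attributes (exact-match criteria only)."""
    return [
        trial for trial in trials
        if all(case.get(field) == value
               for field, value in trial["criteria"].items())
    ]

# Hypothetical example data
trials = [
    {"name": "Trial A", "criteria": {"tumour_type": "adenocarcinoma", "biomarker": "EGFR+"}},
    {"name": "Trial B", "criteria": {"tumour_type": "sarcoma"}},
]
case = {"tumour_type": "adenocarcinoma", "biomarker": "EGFR+"}
```

A real matching engine would need range criteria, exclusion rules, and missing-data handling, but the principle is the same: trials are presented only when the case's attributes satisfy their criteria.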

 

Medgadget: What’s the need for this technology?

Billiter: In pathology, we're really faced with a supply-and-demand problem. The supply of pathologists is going down, while demand for the discipline continues to increase. The biology associated with disease is growing at an increasing rate.

We also want to focus on precision, predictive medicine that relates to the clinical trials being designed. The need for pathologists is high because they're on the critical path: the pathologist is the first step in diagnosing a disease like cancer.

It’s important to note that Deep Lens was not created to displace pathologists. It’s the opposite: We’re supporting the pathology discipline.

 

Medgadget: How does this plug into the pathologist’s lab technology right now?

Dr. Jason Jarzembowski, Chief of Pediatric Pathology at Medical College of Wisconsin and Deep Lens consultant: When I look at it from a laboratory medicine point of view, machines do what they do best and people do what they do best. In the lab we've already had autopipetters and machines that can do all that work without making the mistakes humans make because we're tired or distracted. As we've moved into the digital age, we have more and more computer involvement, from computer-assisted analysis to, now, artificial intelligence. We already run clinical tests on computerized instruments that report the results. For example, here at Children's we do our blood counts on a computer. It gives us a collage of all the neutrophils it counted, and you can tell it whether it's right or wrong. If it's wrong, you tell it why, and it learns. VIPER and Deep Lens are taking this technology into the anatomic pathology realm: we look at tissue and biopsy resection specimens, and computers are starting to do that for us now.

 

Medgadget: How does it change your workflow as a pathologist?

Dr. Jarzembowski: Some of it is as simple as having a nice computer interface. Everything is in one spot – I don’t have to go into the EMR with a tab for the patient history, a tab for the first diagnosis, and so on.

It also shows me the image and can count cells for me, which is important for a lot of what we do. It can tell me how many cells and how many mitotic cells there are, and what the ratios are. As an example, one of the tumours we commonly look at requires counting 5,000 cells, and that takes me about 20 minutes. If a computer can do that for me in 20 milliseconds, that's a major improvement in my lifestyle and frees me up to do other things. The computer can also count things more reliably. Once you train the computer appropriately, it won't make mistakes and it won't be swayed to think a cell is closer to a certain diagnosis; it's agnostic. Furthermore, when we do special stains, we can have the computer enumerate the positive and negative fractions and tell us whether the staining is nuclear or cytoplasmic. So it takes that busy work away from us, helping us and doing it better.
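The ratio Dr. Jarzembowski describes (mitotic cells out of total cells counted) is straightforward arithmetic once the counts exist; the hard part is the counting itself. A minimal sketch, with hypothetical counts not taken from the interview:

```python
def mitotic_index(mitotic_cells: int, total_cells: int) -> float:
    """Fraction of counted cells that are in mitosis."""
    if total_cells <= 0:
        raise ValueError("total_cells must be positive")
    return mitotic_cells / total_cells

# e.g. 150 mitotic figures in a 5,000-cell count gives an index of 0.03 (3%)
```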

 

Medgadget: How do pathologists feel about this?

Dr. Jarzembowski: I’m personally very excited about it – others are excited, and some are skeptical. As with the adoption of any new technology, it takes time and people having experience with it. I’d like to think that most of us in the medical specialty are lifelong learners and want to do things better, so we’re hopefully receptive to this sort of thing. Whether the computer is good or bad at what it does ultimately depends on how we’ve programmed it and populated it with information. If it’s not doing a good job, we need to train it on more cases and give it better rules.

Because we’re using it right now in the diagnostic realm, we have time to explore the potential goose chase that the computer, or a human, might send you on. It’s no different from me going down the hall, asking a colleague what they think it is, and having them tell me something. I might not agree, but I’ll still explore those possibilities anyway, and sometimes they turn out to be right. That’s why we ask somebody else.

At the end of the day, asking a computer isn’t really any different than asking a colleague.

 

Medgadget: How accurate is the AI?

Billiter: We’ve taken our methods and compared them to openly published information for 8 models. Depending on the tumour type and the method we’re testing, the range is from 100 to 1,000 cases. I’m excited to tell you that our methods have outperformed those models, with double-digit percentage improvements in accuracy.

 

Medgadget: What makes Deep Lens and VIPER stand out?

Simon Arkell, Deep Lens President: We have access to great images, annotations, and diagnoses that provide the initial modeling. Our model is not like some in the industry that try to get exclusivity on millions of slides they scan and annotate. The beauty of VIPER is that we have hundreds of thousands of pathologists using this platform around the world who will generate the content.

We’re also looking to work with developers of other deep learning models. If there’s a specific machine learning model for identifying a certain type of cancer, we can work with its developers to tie it into our platform and make it available to all of our users. The platform then becomes something like an app store.

 

Medgadget: How much does it cost?

Arkell: Our goal from a business perspective is to make this platform available to all pathologists free of charge, regardless of whether they’re individuals or part of a large institution. If we can help pathologists identify the correct biomarkers, we’re helping the pharmaceutical industry by giving them a direct channel to pathologists, allowing them to identify and recruit patients for clinical trials.

 

Medgadget: How will you address the privacy concerns of this model?

Billiter: We’re always extremely conscious of privacy, both technically and from a process standpoint. We are not transferring or giving data to biopharma companies. The mechanism takes information from the biopharma company and embeds it into our platform, which our pathologists and health systems then use to create the match. So it’s actually a very secure model, structured around HIPAA and other regulatory requirements.

 

Medgadget: What motivates you to pursue this?

Billiter: We’ve all been touched by cancer in some form. Our mission and vision are focused on providing this type of solution to help patients and to help biopharma companies enhance drug discovery. We think we can have an impact on that, and we’re excited to get this out to the market.


This blog first appeared on Medgadget - October 23rd, 2018.