Introduction [00:00:01] RTI International's justice practice area presents Just Science. Welcome to Just Science, a podcast for justice professionals and anyone interested in learning more about forensic science, innovative technology, current research, and actionable strategies to improve the criminal justice system. In episode four of our Roadmap to Improving Technology Transition season, Just Science sat down with Doctor Catherine Grgicak, associate professor of chemistry at Rutgers University, Camden, and Stephanie Stoiloff, chief of the Forensic Services Division at the Miami-Dade Police Department, to discuss their real-world experiences with introducing new forensic technology into practice. While new forensic research products are often innovative and interesting, they may not always be practical in operational forensic laboratories. Researchers should lean on practitioner input to develop technologies that fill a necessary gap, are unbiased, and offer benefits over existing technologies. Listen along as Doctor Grgicak and Stephanie describe lessons learned from their own experiences with technology transition, strategies for improving product validation, and advice from both an academic and a practitioner perspective. This episode is funded by the National Institute of Justice's Forensic Technology Center of Excellence. Some content in this podcast may be considered sensitive and may evoke emotional responses, or may not be appropriate for younger audiences. Here's your host, Rebecca Shute.
Rebecca Shute [00:01:21] Hello and welcome to Just Science. I'm your host, Rebecca Shute, with the Forensic Technology Center of Excellence, a program of the National Institute of Justice. On today's episode, we will discuss the challenges and approaches to transitioning forensic research into an operational laboratory. This discussion builds on a recent Forensic Laboratory Needs Technology Working Group effort to develop a roadmap to improve research and technology transition in forensic science. Here to guide us in the discussion are Doctor Catherine Grgicak, associate professor of chemistry at Rutgers University, Camden, and Stephanie Stoiloff, chief of the Forensic Services Division at the Miami-Dade Police Department. Welcome, Catherine and Stephanie.
Catherine Grgicak [00:01:59] Thanks for having us.
Rebecca Shute [00:02:00] In our roadmap, we focus on the important task of transitioning technology from research to operation in a forensic application. We should also note that the language of tech transition versus tech transfer is important here. Catherine and Stephanie, could you help us understand your perspective on what successful technology transition could look like?
Catherine Grgicak [00:02:18] Sure. I think I'll take this from an academic perspective first. For me, successful technology transition is multifaceted. So when I think about transitioning knowledge, scientific principles, ideas, concepts, and developments to an operational crime lab, I think about it from the perspective of not only the student, the professor, and the research product, but also from the perspective of the operational labs, i.e. the stakeholder, the person who's actually going to be using that item that's been transitioned. From my perspective, transition is a very broad term that really can encompass all aspects of learning. If we can embrace that and understand the needs of each of the stakeholders that fall within the forensic domain, that is, the researchers, the academicians, the students, the crime laboratory analysts, the directors, and then ultimately the courts and society, I think we would have a successful tech transition, with scientific ideas that are easily translated between, you know, the individual who begins thinking about the idea and the society who will absorb those ideas. So that's how I think about transition. It's very broad, it's very universal, and it's inclusionary, to put it that way. It includes all aspects of the forensic domain, each component part, and how they interface with each other.
Stephanie Stoiloff [00:03:59] I think that with technology transition, the most important thing for us to evaluate is that it's a reliable and reproducible technology at a minimum, and then to look at how it could benefit our operation, whether that's a new test that we don't already have, that we don't have the capability of doing, or where the technology doesn't exist yet, or something that makes the current process easier. It could be a new test used in the laboratory to make the analysis process easier, or it could be something used for triaging samples, for example, outside the laboratory, in order to help us expedite analysis once the evidence is submitted. So it's multifaceted on our end as well.
Rebecca Shute [00:04:45] Thanks for your perspectives here. It's also a helpful point to note that not all research can or should transition. Catherine, how would you describe the research products that are right and ready for transition?
Catherine Grgicak [00:04:55] If there's a technology that's ripe for transitioning, presumably it would be reproducible and robust. I think in general, there are three main attributes of technology that's ready to be transitioned, and it's really the responsibility of the researcher to show the decision maker whether the new idea or technology has relevance, that is, whether it's able to answer the questions that the operational laboratories have. So it needs to fill a gap that exists, right, a question that current technology is unable to answer. A good example of that is the recent application of forensic genetic genealogy. It's a relatively new idea, but it uses existing technology, that is, sequencing, in a different way and with a different database than what we're used to. So it was really putting together the component parts in a clever and effective way in order to answer a question that had yet to be resolved. In any case, the idea is that the new technology really needs to be able to fill a gap and answer the relevant questions. The other thing it needs to do is be legitimate; it has to be perceived as fair, which is extremely important in our domain. If a technology, for whatever reason, is even perceived to be biased, whether it's some machine learning algorithm or artificial neural network that's perceived to be biased in its decision-making procedures, or the way something is triaged such that, unintentionally, there's a bias toward only choosing a specific kind of sample, those are the things that have to be tested, as well as just showing that the technology or the idea works better. And of course, the last thing is that the technology has to be credible. That is, it not only has to answer the relevant questions and be salient to the community, but it has to work better than technology that already exists. So it either has to be faster or it has to be more reproducible. Of course, the stakeholders will decide, or will have to have a discussion about, what better means, but ultimately there have to be some metrics that are met, such that the researcher can demonstrate that credibility has been established. And I think if the researcher, the person that has the new ideas, is able to demonstrate those three things to the operational labs or to those that will absorb or use the technology, that will create a better line of communication, and it would probably lead to faster adoption. And to the point of relevance, I think that is really where the crime labs and the researchers need to get together, because if the researchers don't have a good sense of what questions still need to be answered, then they might develop the most wonderful and advanced technology, but if it's not able to answer the needs of the crime laboratories, then there's a missing part there. So I think it would be really important for researchers to think about their research in the context of those three principles. And I think if one adheres to those three principles, then they'll be open to conversations and hopefully there will be faster adoption.
Stephanie Stoiloff [00:08:29] I agree, and I'll just follow up. I think a lot of research that can be done certainly is interesting, but it may not fall into the applicable or relevant box, so to speak. So we want to make sure that the research being done can be used in the laboratory to answer questions we can't currently answer, make the process faster, or provide some other benefit to the laboratory.
Catherine Grgicak [00:08:52] So as a researcher, I think it's invaluable to start talking to the crime laboratory directors or analysts very, very early on, even before design. Basically, during the conceptualization phase, it would be best for someone like me, the academic researcher, to go to the crime lab, let's say to someone who has a lot of experience with the day-to-day needs of running a crime lab, and get their perspective early on, because that perspective could very well impact any future design going forward. So instead of a lab director like Stephanie getting a technology and doing beta testing, what I personally would love to see is the crime labs getting involved very, very early on. They may not actually be doing the design and research, but they would list out the requirements or help the researcher understand what the current gaps really are and which gaps are most important. Because I think sometimes, again, as Stephanie said, there's really interesting research, but how it fits within the forensic pipeline is another question altogether.
Stephanie Stoiloff [00:10:11] And just to give a simple example: I have partnered with some other universities in the past, and there have been some very creative ways of identifying different drugs. However, at least one in particular still involves having to handle the drugs. And on our side, from our perspective, we also have to consider officer safety. So even though a test can technically do what it's being asked to do as far as analyzing drugs or detecting fentanyl, if it involves having to open a package or touch the material in any way, we would not recommend or support it at all. We also have to consider those factors as well.
Rebecca Shute [00:10:53] There's a really helpful theme about communication and alignment between researchers and practitioners, as well as other community members. And this is a theme that we certainly speak to in the roadmap document. How might we improve alignment between these forensic community members? You mentioned communicating early and often, but what other mechanisms could work?
Stephanie Stoiloff [00:11:10] I think having an open door and being open to new things. A lot of times with laboratories, certainly crime laboratories, we are accredited, we have various procedures, and we have standards we have to adhere to. But we need to understand that in science there's always something better coming, right? It involves a lot of validation and work on our part, but if it's worthwhile and it does something different or adds some benefit, then we need to be open to those ideas. And I think that open communication with researchers is a win-win, because if you can take something from idea to technology and actually implement it as a product, that's the best scenario, right? But it's also important to learn from the failures, to learn what doesn't work and why it doesn't work, and then go back and redefine and reassess. That's all part of the scientific process. So I think it's important that engagement happens early on.
Catherine Grgicak [00:12:02] I agree with that completely. And I also think that there has to be some study, let's say, of what the innovation ecosystem is and who the main players are. Oftentimes we'll think about academic programs and the crime labs, and sometimes we'll forget about the industry partners. And really, the industry partners are an important aspect of technology transition, because ultimately any new technology that becomes popular has to be licensed, supported, and validated, and there has to be tech support and training and all of the other aspects associated with it. So I do think that industry partners really need to be acknowledged as part of this ecosystem, as well as the investigators and attorneys. This FLNTWG that we were part of was a wonderful group, but we didn't have, for example, an attorney as part of it, and that would have been something really interesting, because at the end of the day, we would have had maybe a fuller understanding of what their needs would be as part of this ecosystem. Again, they may not be the ones actually transitioning the technology into their labs, but they most certainly are stakeholders, in the sense that they are affected by whatever technology is or is not implemented into the laboratory. So there is an important, let's say, tension associated with making sure that all of the diverse stakeholders are considered when we're thinking about the forensic ecosystem and translation and transition. The other thing I want to point out or emphasize is that some of the challenges in transitioning tech into a laboratory have to do with the resources of the laboratories themselves, right? We spent a lot of time within this group talking about whether or not all of the labs have to adopt all of the new technology within a given year, and of course that's not possible. So one of the things that I think is really important for everybody is just to understand what their priorities are, what's important for their laboratory. And then from a research perspective, it's our job to express to the community what it is our projects will help solve, what gaps or grand problems our projects will help address. To me, the ecosystem always felt very, very broad, and I think it's much broader than we've given it credit for in the past. One of the wonderful things about this roadmap is that I think it does a very good job of touching all of the aspects of technology transition, rather than just focusing on the very act of transitioning a product to market, because that's only one stage of a very long path, all the way from concept to market and licensing. That's a very long pathway that has to be acknowledged and respected and, dare I say, supported. And it has to be supported from the crime laboratory perspective, from the academics, from the administrators at both the crime labs and the academic institutions, but again, also from the industry partners.
Rebecca Shute [00:15:33] Your comments point to the complexities and interrelationships of the forensic innovation ecosystem. In this roadmap, we focused on developing action items that different community members could realistically do to improve their ability to successfully transition tech. Stephanie, you mentioned the importance of learning from successes and failures of implementation, and you have first hand experience transitioning tech into laboratory operation. Can you talk through some examples of transition successes and failures that you've experienced?
Stephanie Stoiloff [00:15:59] Sure. So I mentioned earlier the example of working with a university on different ways to test and identify drugs, and why that particular example would have worked in a different environment but not for law enforcement. Another example is a system that we evaluated that was already ready for market, which identified different casings and calibers for firearms, and that expedites the analysis that we might have to do. So it wasn't something that we would use in the laboratory, but it was something that would be helpful for use outside the laboratory, to help when the evidence comes in for analysis. As Catherine said, it's the whole process. It's the instruments that were developed, right, and then you had vendors selling to law enforcement and laboratories saying, that's a really cool thing. But hold up a second: there are a lot of things that we need to evaluate before we can implement that in an actual crime laboratory, and then we have to explain why it can't do everything that the vendors are saying it can do. A lot of times that ends up pitting scientists against vendors and industry, and we don't want to do that. We want to work in conjunction with industry so that we are supporting something that works. Now you see the transition to rapid DNA; it's been through the courts, it's been through the Rapid DNA Act, all of the different things that had to happen in order to use it. There's a process that SWGDAM, the Scientific Working Group on DNA Analysis Methods, has created; they have to validate and provide standards for use in forensic laboratories. So there are things that we have to adhere to in an actual operating laboratory that others may not be aware of. If you look at how it came about, you have law enforcement agencies saying, we're going to use rapid DNA, but we were just trying to explain that, knowing how DNA technology came about and was implemented in the 80s for the analysis of evidence samples, it was a long process to gain acceptance. So it's understanding that process; we don't want to damage the reputation of DNA by trying to implement something before it's ready to be used. I think now, having gone through that process, there are standards available, people are implementing it in different ways, and it's become much more accepted than it was when it was first introduced.
Catherine Grgicak [00:18:15] And I think rapid DNA is a really good example of a new technology taking some time to develop, right from conception all the way to implementation. As you said, Stephanie, it took about 15 years. So if one thinks about the research timeline from conception all the way to implementation, I would estimate the average time is probably between 15 and 20 years to see things come to fruition. Rapid DNA is a really good example of that, and it's also a really good example of filling a need. The need was rapidity, right? DNA testing could take a long time. Of course, with very complicated samples and these very complicated mixtures, that can perhaps be justified a little bit, because you need an expert to evaluate such complex signal. But remember, rapid DNA started with the idea that it would be run on single-source standards. So why wait a long time to run single-source standards? Can we develop a technology, a box, right, basically a lab in a box, that could do this very, very quickly, without the need to enlist the expertise of someone with 20 years of education? That's a very interesting example of filling a need, and of the community having the tenacity to support the development of a novel technology to fill that need.
Stephanie Stoiloff [00:19:46] So it also has to be accepted by the courts. Everybody has to be comfortable with it. There's a lot to do on that end of it as well.
Catherine Grgicak [00:19:52] Yeah, that's why it takes so long.
Rebecca Shute [00:19:54] You both have mentioned great points about the barriers and realities of tech transition in forensics, and this was something that we had to be acutely aware of as we developed this roadmap. Within the transition from research to laboratory testing, evaluation and validation are incredibly important. We know, for example, that validation is critical to assessing and implementing forensic technology, but the process can be challenging or even ill defined for practitioners who have not gone through it before. Could you speak a little bit more to the challenges of validation that practitioners may encounter, and how to potentially lower these barriers?
Stephanie Stoiloff [00:20:23] So we had a recent collaborative called the National Technology Validation and Implementation Collaborative. One of the important things about this collaborative is that it has different groups within it. There is a steering committee, and we have one group for forensic investigative genealogy and one for rapid DNA. The idea is that we develop standards for validation and try to make that path easier with documentation and by standardizing the validation process. The issue with validation, I will say, is that it's a very involved process in a lot of ways, and it utilizes the same resources that are used for casework. You would be taking personnel off of casework to be able to do the validations, but you can't do the casework without doing the validation. So it's about managing those resources so that you can get the most out of the technology. Sometimes it's planned and sometimes it's not. Sometimes an instrument is no longer supported, or a version is no longer supported, or something else happens, and then you have to validate the changed or new instrumentation. There's always some testing to go along with that before you can implement it in casework.
Catherine Grgicak [00:21:30] I view validation as a type of research; I consider it late-stage research, and I do think it's an important type of research. Basically, when a crime lab has transitioned or is considering transitioning a technology into their lab, they have to make sure that it's working in their hands. And so I think one of the challenges is first trying to figure out how broad the experiments need to be in the first place. If the technology has already been published and there's already been a lot of phase one and phase two research, which forensic scientists sometimes call developmental validation research, then how much of that has to be repeated in the crime lab? With this new group, it would be interesting for them to speak on just how broad and how deep the internal or on-site validations really need to be, versus how much of the information can be gained from the developer without having to repeat the work. So I wonder if there's some room for efficiency building there in terms of validations. One thing that has really helped me hone in on what I believe is important is that before I even begin validating or testing a new technology, I write out with my team the conditions that would elicit a failure to launch. Maybe in Stephanie's case, it would be a failure to implement, right? I find that exercise really important, because before doing the experiments, you can figure out what those failure-to-launch criteria are, and then that discussion allows one, or at least it allowed me, to figure out what's important to me, or to us, or to the team. Again, I think it might help ease the burden of validation if we can think about it as a failure-to-implement or failure-to-launch type of model. I always think it's really good to write these things out and put numbers to them, because if you force your analysts, or in my case your PhD students, or if you force yourself to put the number down, then you can start asking yourself, okay, well, what is this number exactly? It's a really great tool to elicit team building and open dialog, but it also pushes you to think deeply about the technology you're considering.
Rebecca Shute [00:24:03] I appreciate your consideration of mindset and frameworks for approaching validation and testing. Catherine, I know you have successfully transitioned technology as a researcher, and I'd love to hear more about your experiences and learnings transitioning this technology. What sort of advice would you give researchers trying to accomplish the same thing?
Catherine Grgicak [00:24:19] Well, I think we've already touched on it: interface with the crime labs early and get some input on what's important to them. And I will say, in the reverse, researchers, particularly those at full-time academic institutions where their job is research, are going to have access to technology that is really at the forefront of biology, chemistry, physics, and mathematics in a way that crime labs won't. So if the crime laboratories use the researchers at academic institutions to their advantage, they would learn much more quickly about what's happening in other fields. Of course, it doesn't mean that they would adopt everything that other fields are doing, but at least they would have some idea of what the potential might be. And conversely, the researcher would most certainly benefit from talking to the crime laboratory analysts: get a list of their requirements, what's important to them, what would help them ease this technology into operations. A personal lesson I had to learn: sometimes you have to be sensitive to the process requirements. When you're building something, you want to speak to your partners as if they are trusted colleagues and friends, and hear them out. So that's the advice I would give.
Stephanie Stoiloff [00:25:47] She is absolutely correct. I think we don't always take into account what the actual implementation looks like, and some of the simple things in software are the biggest headaches, in that trying to introduce something, even though it may be more beneficial, may not be helpful in the process; it may add more time. As an example, we implemented a new laboratory information management system. For some people, like the new adopters who had never used the previous system, it was a much easier transition than for the ones who had learned a different way and had to transition to something else, and they complained about every possible aspect of it because it was so different from what they knew. So when you introduce things that are that different, it just takes time to adjust, and that hinders the adoption process and the transition process. That time adds up, and then some frustration along with it.
Rebecca Shute [00:26:36] Stephanie, do you have any advice for forensic lab technical leaders who are new to assessing and implementing new technologies?
Stephanie Stoiloff [00:26:42] I think first, you have to be open to the idea that things can be done another way, right? Part of it is that it takes so long to validate a process, and you have a process that you know and that's familiar, so a lot of times people are resistant to that change. But there are times when, if you have really evaluated and tested something, and we've utilized interns to do this at times so it doesn't impact the manpower, or womanpower, used for casework, then you can evaluate the possibility of other technologies. I don't have a problem with evaluating something if I can use an intern to evaluate the technology for one of our universities; we've done that even on things we knew we weren't going to adopt. But it's so that the researchers also understand the process, not just the idea, not just how to do the research, but whether it can be implemented and why. A lot of times I think it's just training researchers to understand that it may be a nice idea, but it may not be implementable or worth changing what already exists. So those are things to consider. It's also not looking only at technology that affects what you do every day in the laboratory, but at things that could potentially help screen information that comes in. If you could triage at the front end, before evidence is submitted, it would eliminate some of the samples submitted for analysis. There are just different scenarios where that could come into play, certainly for expediency, not necessarily downstream for court or anything like that, but when you're trying to get analysis turned around as quickly as possible. So I think it's understanding that there are so many different aspects and so many different questions about where a technology could fit in, and being open to that. And I think Catherine said this in the very beginning: this is a full-spectrum process, and each project isn't going to start at the same place. So where it fits in, and where the idea is and how mature that idea is, is going to determine how you assess it. For example, at one time we supported an effort to look at different methods for gunshot residue detection. Was it worth me changing my entire process that was already validated and in the current system? No, but it's something that could help somebody else downstream or in a different environment. It doesn't have to be something meant for use in a crime laboratory; it might be meant for use in theater, like in a wartime situation, right? So there are different ways to look at it, and you just need to be able to test samples out. In that case, the gunshot residue testing, the researchers didn't have that partnership and that ability; they didn't have access to firearms and a firearms range. But I do. So they would bring the samples here, and we would do the test firing and do the swabbing for them here. So it's a good partnership of resources as well.
Rebecca Shute [00:29:20] How do you hope the forensic community uses and implements this FLNTWG roadmap document?
Stephanie Stoiloff [00:29:24] Honestly, the hope is that this document gets people to start talking, and to understand that everybody's got a different piece of the pie. They don't have to stay in their lanes. You see what other people's responsibilities are, but you can change lanes and merge lanes for different steps along the way in order to make things better and...
Catherine Grgicak [00:29:44] Complement each other. Yeah, exactly.
Stephanie Stoiloff [00:29:47] I'm really interested to see what happens when people read the document.
Rebecca Shute [00:29:50] Stephanie and Catherine, thank you so much for your time discussing your experiences. It's really been a pleasure talking with you today.
Stephanie Stoiloff [00:29:56] Well thank you.
Catherine Grgicak [00:29:56] Yeah, thanks.
Rebecca Shute [00:29:57] If you enjoyed today's episode, be sure to like and follow Just Science on your platform of choice. For more information on today's topic and the FLNTWG roadmap, visit ForensicCOE.org. I'm Rebecca Shute, and this has been another episode of Just Science.
Introduction [00:30:14] This episode concludes our Roadmap to Improving Technology Transition season. Tune in next time for a new season discussing domestic radicalization research. Opinions or points of view expressed in this podcast represent a consensus of the authors and do not necessarily represent the official position or policies of its funders.
Disclaimer:
Opinions or points of view expressed in these recordings represent those of the speakers and do not necessarily represent the official position or policies of the U.S. Department of Justice. Any commercial products and manufacturers discussed in these recordings are presented for informational purposes only and do not constitute product approval or endorsement by the U.S. Department of Justice.