Game Change: How Researcher-Practitioner Partnerships Are Redefining How We Study Crime
Opening Plenary Panel
When researchers and practitioners work side by side, they can maximize their problem-solving abilities. The research partner can focus on the data and the science; the practitioner can focus on interpreting the findings and applying them in the field. In this plenary panel, panelists described the benefits, challenges and pitfalls of researcher-practitioner partnerships from both the researcher's and the practitioner's perspective.
Moderator: John H. Laub, Director, National Institute of Justice
- Jeff Rojek, Assistant Professor, University of South Carolina
- Tami Sullivan, Assistant Professor, Yale University School of Medicine
- Vivian Tseng, Vice President, Program, William T. Grant Foundation
John Laub I'm very excited about this first plenary session. It's a topic that I believe is absolutely essential in this day and age, researcher-practitioner partnerships. As you know, partnerships are important to NIJ. Like many other federal agencies, state and local government, non-profits, NIJ is coping with limited budgets and other critical resources, and as a result, we need to find creative ways to fulfill our mission without sacrificing quality.
One way we have coped is through emphasizing the creation, extension, and strengthening of partnerships. As I mentioned in my opening remarks, we've recently created a new Office of Research Partnerships at NIJ. We also though have placed renewed emphasis on strengthening partnerships within and outside of the Department of Justice. In line with our commitment to both partnership building and translational criminology, we've brought back our researcher-practitioner partnership solicitation. Through this solicitation we'll provide funding to support criminal justice research and evaluation projects that will ultimately provide criminal justice practitioners with the practice and policy relevant information while affording researchers the opportunity to contribute to the current body of knowledge.
Today's panel will focus on researcher-practitioner partnerships and how they redefine the way we're studying crime and criminal justice. Two of our panelists, Jeff Rojek and Tami Sullivan, were the first two awardees under the researcher-practitioner partnership solicitation that we reintroduced in 2010. Our third panelist, Dr. Vivian Tseng from the W.T. Grant Foundation, will tell us about the work her organization is doing in this area.
Before we begin, I would like to take this moment to introduce our panelists. Dr. Jeff Rojek is an Assistant Professor in the Department of Criminology and Criminal Justice at the University of South Carolina. He's a former police officer and has conducted many police-related research and evaluation projects. For the past four years he's conducted analysis of traffic stop patterns in relationship to racial profiling for the Missouri Attorney General's Office. He holds a Ph.D. in criminology and criminal justice from the University of Missouri, St. Louis.
Dr. Tami Sullivan is an Assistant Professor of Psychiatry and the Director of the Family Violence Research and Programs at the Yale School of Medicine. Dr. Sullivan's program of research focuses on understanding the relationships amongst intimate partner violence, post-traumatic stress and substance abuse. She is a licensed psychologist who has extensive clinical experience with both victims and offenders of IPV, providing services in a wide range of settings. She holds a Ph.D. from Northeastern University.
Dr. Vivian Tseng is Vice President of the W.T. Grant Foundation. She leads the Foundation's grant making and spearheads their initiatives on increasing the understanding of the use of research in policy and practice and improving researcher practice connections. Previously Dr. Tseng was an Assistant Professor in Psychology in Asian American Studies at California State University in Northridge. She holds a doctorate in community psychology from New York University.
[End of video clip]
Building Bridges Between Researchers and Practitioners
Jeff Rojek Thank you for having me here and for the opportunity to speak about our research. I do want to say that what I'm going to talk about is not a solo project. My fellow investigator is Geoff Alpert, who is not with us here today, as well as Andrew Hansen, our graduate student, who was instrumental in getting this project done.
I guess I could start with how we got involved in this. As Dr. Laub talked about, my background is in law enforcement. I switched over and went the academic route, but I've always had an orientation toward doing applied research and working with law enforcement agencies, as has my partner in this, Geoff Alpert, who has a long history of doing this work. We saw the solicitation come out, and as we were looking at it we had essentially an exchange: we've done this. We've done these kinds of partnerships. We've had good experiences. We've had things that haven't worked as well. We've talked to others, and I'm sure some in this room have been through this same experience. So our thought was: what makes these partnerships work? What can we learn from these, the positives as well as the barriers, and from that maybe provide information that will expand the utilization of these partnerships?
Now, the idea of partnerships is not new. We can go back to the 1967 President's Commission and the idea of merging science and social science into the practice of criminal justice agencies, but I think in the policing realm it really starts with community-oriented policing, in the sense that agencies, as they started adopting this, wanted community surveys, and they wanted insight from researchers on the topic to help develop training, and as a result you start to see the emergence of these relationships. Support comes from organizations like NIJ, the COPS Office, BJA, and projects such as Ceasefire, the Locally Initiated Research Partnership, PSN, etc., that really helped foster and build relationships, and in turn, as we talked about, there are the practitioner-researcher initiatives that are now coming back out of NIJ.
It's not only in the research crowd but in the law enforcement crowd as well, where you have law enforcement executives — chiefs, sheriffs, and so forth — who have done this kind of work and have now become strong advocates, publishing articles in academic journals as well as outlets like Police Chief Magazine, advocating for these relationships. Also, I think a very strong advocate in the law enforcement community has been the International Association of Chiefs of Police. In 2004 they held a symposium, came up with some working documents to help law enforcement practitioners and researchers develop working relationships, and developed the Research Advisory Committee, which has a mission of supporting these relationships. But more interestingly, they made the argument that all law enforcement agencies should have some type of partnership with researchers. This was the goal of their efforts, and that was our jumping-off point.
So, that was the goal of our effort. The first thing we asked was what is the prevalence of it? Where do we stand right now? How many agencies engage in these types of partnerships? The real core of our project was the idea of what are the barriers and facilitators to developing and sustaining these types of partnerships?
We did this, and I'll go over real quickly the points of what we did. We did a national survey of about 2,000 law enforcement agencies and asked them whether they are engaged in partnerships, the characteristics of those partnerships, and a more general question about their utilization of research. But the core of the project was the interviews. Let me step back: we used the surveys to identify agencies who engage in these partnerships. From that, we identified 100 — the way it ended up working out, we ended up picking 106 — to interview, first the practitioner and then the research partner on each project, to get their working perspectives. The idea was to understand that key question of what are the barriers and facilitators to developing and sustaining these partnerships.
The last component of this research was that we went out to agencies who had sustained a partnership with a researcher over time, and we used them as models. They varied in degree of formality, being formal or informal, and in whether a single research partner was involved or maybe a whole department of sociology or criminology, whatever it may be. What I'm going to focus on today is largely that second point: I want to talk about the barriers and facilitators. I'll also talk a little — I'll give you just a graph — about that prevalence question. It may or may not be a surprise to you. I'll also utilize some of the video clips we did as part of that fourth component, where we videotaped both the practitioner and the researcher at the same time to get their ideas; the idea was to spread the message from those who were actually doing it.
What I really want to talk about is, first, from the words of those involved, what are the benefits from the researcher and practitioner perspectives? And then, what are the barriers and facilitators? I divided those into two categories: one is organizational/structural, and one is interpersonal. We'll talk about those as we go through.
Real quickly, on the nature of partnership: we looked at three types when we surveyed. We took this from IACP — we asked agencies, in the last five years have you engaged in a partnership, and then we asked them specifically what type. I have only two here. The first one was called cooperation. A cooperation partnership means that the practitioner informally asked a local or other researcher from a university for some advice — informally, via a telephone conversation — or maybe filled out a survey or supplied data, but there is no real engagement in trying to build a project and a relationship between the two. The second group was called coordination. Again, we took these from IACP's work. In this context, the researcher and the practitioner join together to jointly develop a project to investigate maybe the quality of training, maybe an impact on a certain type of crime problem in their communities, whatever it may be, and it has some type of formality — maybe they pursued and received a grant, or maybe they developed a contract between the two. The key thing about it is that once the project is over, so is the relationship.
The last component would be the collaboration relationships, which is shown in the grey, and the coordination I just talked about is shown in the red in that chart. The collaboration was essentially the same thing but something continued on beyond the initial project. They've worked on multiple projects. That's what we would consider a more sustainable partnership.
When you look at that graph, you see that it's very prevalent. Both forms, particularly the coordination-type forms, are very prevalent in our large agencies. As you go down that chart, right where you hit that below-400-officer marker, you see a rapid decline. That is, most of those who are engaged in these types of partnerships are among our large agencies. We'll talk about the reasons why, and I can give a couple of reasons now. We asked agencies, do you engage in these types of partnerships? I think over half said, we don't because we don't have the resources to do it. The second most common group said, we have never been approached by a researcher; we never thought about it. So there is a considerable gap. When we get to the end, there are many agencies who are not engaged in these, and I think when we think about growth, this is the area that we should be applying ourselves to.
First I want to talk about the benefits, in the words of those who are involved. To start, I'll use a clip from Commissioner Davis of the Boston Police Department and his response to why he gets involved in these types of partnerships. As you can see, the interview was jointly done, about a relationship he has with Dr. Anthony Braga, who's at Rutgers and also works at Harvard's Kennedy School. So, listen to Commissioner Davis' comments. I think he highlights some of the key points that we can jump into.
Ed Davis To be quite honest with you, I don't think any one person has all the answers, and engaging universities and researchers in a conversation about what works is a logical thing to do as a police chief or commissioner or wherever you might be working. I think that if you are open to criticism and open to best practices, you can learn a lot from working with someone that's taking that long-term look at things.
Rojek I think what Commissioner Davis highlights are the first two of the three most common things we heard, at least from the practitioner side. One was access to expertise and skill. This could mean a methodological skill set to develop an evaluation, maybe an experimental design, maybe a statistical skill set — the practitioners said, "I want to know if what I'm doing works, if it's valid." It was also substantive knowledge: maybe they're moving into an area — hot spot policing, maybe trying to address issues regarding domestic violence or gangs, whatever it may be — and they want someone who has more substantive knowledge and experience with the issue or has done research on it. That's one of the reasons.
Second, getting a different perspective. As many of the folks we talked to on the practitioner side said, we've got people inside our agency who have dealt with these problems, but they dealt with them in our agency. What are people doing outside? Who can I bring in from outside who has maybe looked at this in different agencies or different communities and comes from a different perspective and discipline? A large majority of the researchers involved were criminologists, but we also had sociologists, psychologists, folks from epidemiology, medicine, etc. The idea was that in many cases these agencies have researchers in-house, but they're interested in folks from outside who can bring in a different perspective, particularly ones who have looked at the issue in other communities and agencies.
The last one would be third-party credibility, from the practitioner's perspective. The idea here is: bring in somebody from outside and get some standing for what I'm doing. It might be a controversial topic such as racial profiling. If I do the analysis in-house, from the practitioner's perspective, will people see it as credible, as opposed to if I bring in someone from outside? I've done such a project — the one mentioned at the opening of our presentation, in Missouri — for 11 years now. It's not a complex analysis, but one of the things we add is that third-party credibility to the State of Missouri when we do that analysis.
It can also be about being seen as progressive by the law enforcement agency engaging that third-party credibility. If I bring in a university researcher, I'm showing that I'm doing something. I'm being progressive. I'm integrating science into the work I'm doing, and that's seen as giving credibility to the work — at least that's what was told to us — as well as to creating change. If they're creating change and bring in an outside person who's an expert in the area, it also brings some type of third-party credibility to the change I'm trying to make.
Again, those are the points advocated by the practitioner. For the researcher, on the other hand — talking obviously from the researcher's perspective — I get access to things that are important for producing research, producing publications. That could be related to an individual project or it could be related to a project down the road. A good example of this: I recently sat in the office of a sheriff in South Carolina — where I'm from. I'm helping him with a couple of things — a small project; I've got a student working on a citizen satisfaction survey he wants — but I'm also working on a project that looks at officer-involved vehicle accidents: fatalities, injuries and so forth. We have a large amount of data — all the data from the past 13 years in California. Then I realized that South Carolina has a very similar dataset, a statewide data repository. I've talked to the folks at the Department of Public Safety and they gave me some of it, but I need more. I'm sitting there talking to the sheriff and he says, "You need the DMV data too." "Well, exactly, but I don't know if I'm going to get it." "Not a problem. I know those people there. It's a simple phone call. I'll make that phone call. We'll get this to happen. I can tell them that I trust you, and we can send them an agreement." That comes from the benefit of doing the other project. It's not directly related to what he and I are doing, but that relationship builds that.
Second, research opportunities for students — particularly in a program like ours, with a Ph.D. program. I can't have all my students on grant-funded projects. I need other opportunities to help train and develop them.
The last — and it is probably the most common one we heard from the researchers — was the opportunity to do practical work, what they call applied work, work that impacts the communities in which they live, which very often is not happening when they're in the classroom or doing their own research: actually doing things that address crime, disorder or victimization problems in their communities.
I talk about two sides here. One of the key things, which we'll talk about in a second, is the interpersonal relationship between the researcher and the practitioner — whether a chief, a captain, whoever they may be — which is very key. The other side was the structural support. Among the things that could be barriers or facilitators was funding. About 40 percent of our agencies started off with some degree of funding, through a grant project or whatever it may be. That pays for labor. That pays for equipment. It also creates status for the researcher, and even for the practitioner.
We had some practitioners tell us, "Well, when we get the grant money, it shows it's a real project; people actually want to fund it." A lot of folks said they could use more funding, but the reality is that for a number of our projects, some were done for free by the researchers, and some — a lot of them — the agencies paid for. If you think about sustainability, you have to find alternative ways to fund these. There's not grant funding for every single agency to continually fund a relationship. So you find that the agencies that sustained partnerships had a hodgepodge, if you will, of grant funding, agency funding, researchers working for free, etc. as the projects continued.
Agency size. Does the agency have the capacity and resources — but in particular, from the researcher's side, does it have something of interest to do some work or analysis with? Very few of our interviews were with small agencies, but one, for example, looked at a gun violence program in a small community, and they ended up not being able to do the analysis because there were simply not enough events to analyze.
Stability of personnel. Obviously if one side leaves — whether the chief or the captain supporting the project, or the researcher moves on — the project was very vulnerable to that. Geographic proximity was another key factor. Is the researcher close or not? Some of our practitioners complained: "I've got a researcher who comes in from out of town, takes off, I see them two years later, and they've got a product for me. I want somebody I can engage with and somebody who's close."
Organizational/institutional norms. There are a number of them here, and I'll keep it quick. One of the things we noticed was the difference in goals between the two organizations. The police department's goal is about providing community safety and responding to stakeholders. On the academic side — and none of this is to say the researchers don't have that interest — particularly in an academic setting at a research institute or university, there's a lot of pressure around research, publications, grants, etc.; that becomes what is very important for those engaged in these partnerships, and they sometimes get caught in conflict, because the practitioners say, "Why are you so concerned about that?" Well, for a lot of folks, whether it be promotion or tenure at a research institution, it's "I'm going to lose my job if I don't produce — but I want to help the community." So they're trying to find a way to negotiate those goals. We recognized this in talking to the researchers, and we asked a number of folks: how do you balance this? A lot of the comments were, "Well, for one, I do it after tenure." One individual said to me — and it was echoed by others — "I'm a terminal associate. I'm never going to make full professor. I don't care. I like doing this work. This is what I'm going to do, and I'll publish when I have something interesting, but this is what I truly like to do." The universities have an impact as well. Some universities are more research-oriented. Others — particularly some of the smaller universities and colleges that were involved — had more of a service orientation, and the university was very supportive of those types of efforts.
Next I'll quickly start the interpersonal discussion by looking at a comment by Dr. Robyn Diehl, who I think really hits this idea that you have to build the relationship.
Robyn Diehl If we don't build the relationship from the front end — a lot of times we get challenged in academia. We get a grant. We get the money. We show up at the agency and say, here's what we need and this is how we need it, and we need it tomorrow. That will build a wall. Instead, think about: how do I develop those relationships while I'm writing the grant? How do I bring the department in to ask, are there other aspects of my research agenda that might fit with your needs that we can put together, so that as we're moving through our research project we're doing it together — not simply saying, I'm the one who knows what we need to ask and how we need to ask it, and you simply need to provide me the data. That takes time. You need to build those relationships and gain that trust on the front end. It'll be worthwhile if you do it on the front end, but when you don't do that, you've got those barriers.
Rojek I think what Dr. Diehl was talking about is, again, trying to build the relationship. There's value in building the relationship before you get a project — having lunch with the practitioners in your local communities, having ongoing relationships — so that when a project comes, the relationship is established. You don't have to worry about establishing the whole working relationship from scratch. These factors were key as we talked to the practitioners and the researchers: the idea of building trust. What is this researcher's motive? What are they going to do with the data? Are they going to go to the press with something and not tell me about it? Then from the researcher's side: is the practitioner committed? Are they going to stick with this project, stay with the fidelity of the design, or is this really for political gain?
Then there's researcher and practitioner orientation. Is the practitioner open to working with researchers? And on the other side, do the researchers have an orientation to work with police? Do they like being with the police? Do they have the social relationships, the flexibility? As one of the folks told us, "I bring researcher colleagues along, and they never come back. They don't have that orientation. They don't like being around the police." They have to have that comfort factor.
Negotiating common goals — particularly, as we talked about, the competing interests of the researcher, serving the community while meeting the demands of their institution, versus the practitioner's perspective. Respecting each party and what they have to offer: even though the researcher isn't a cop, they can provide expertise and knowledge; and practitioners are very thoughtful and reflective and can inform your research project, so get them involved. Then a commitment to the partnership — that is, does the other party care about the relationship? One conversation that is very vivid to me: they talked about a relationship where "we did this work for free; we really wanted to build this relationship, but then the agency got a grant and went to another university. They didn't ask us to be involved." And from the practitioner's perspective: does the researcher just want my money, or do they care about the agency and the community? Do they just want my data?
These are all things that can be worked out. They're not insurmountable, but they're two sides of one coin. If you have trust, if you have that open orientation, it's a facilitator. If you don't, it's a barrier, according to the folks we talked to.
I'll get to the last couple of points here — some concluding points. One is that what we see in these relationships, interpersonal and structural, is not unique to us. The education field finds this as well, and public health, so there are also solutions to look to outside our field as we try to build these types of partnerships. Small agencies are clearly an issue we've got to work on, because that's where partnerships are not happening. That group between, say, 25 or 50 officers and 400 officers is where we can find growth.
The importance of interpersonal considerations. It's kind of a "duh!" idea, but I have a police background — I should know better, and I've known those things about trust. I've violated them. I haven't always been known as the best partner. It was an eye-opener, when I reflected and listened to the words of the folks we interviewed, that you have to constantly maintain and build those relationships.
The last couple of points look forward. How do we motivate folks to be involved? What different models or structures can we use for these types of partnerships, particularly for small agencies — maybe getting multiple agencies to work together? And how do we identify the willing pool of researchers, and find mechanisms to grow that pool? Those, I think, are the keys to moving forward and building these types of relationships. Thank you.
[End of video clip]
Researcher-Practitioner Partnerships: Highlights, Lowlights, and State-Level Support
Tami Sullivan Good morning. I'm going to dive right in. I just want to acknowledge that my colleague and co-investigator on this project, Bonnie Fisher, wasn't able to be here today and regrets that she doesn't have the exciting opportunity to share the findings with you.
This was a two-part study. In the first part, we conducted a Web survey at the government systems level — at the state level — of agencies across the country. We wanted to learn more about the infrastructure that exists that can support or advance researcher-practitioner partnerships. Obviously criminal justice systems don't work in the clichéd vacuum, and we wanted a better understanding of what goes on at the state level to support or advance these partnerships. This component of the study actually developed because, as we were preparing our grant application, we were searching the Web and couldn't find any one source that could help us understand even whom to contact within each state — who is responsible for the conduct of research. This component really grew out of a gap that was identified as we were developing the second component of the grant application.
The Web-based survey was conducted to learn about the infrastructure as well as specific experiences with violence against women issues — that's what VAW stands for. The second part of the project is where we interviewed researchers and practitioners about their experiences in successful partnerships. What I'm going to report to you in the second part of this talk is about the barriers and facilitators within successful partnerships.
We interviewed practitioners, and those were defined as those folks who were employed by the criminal justice system as well as those folks who worked on the front lines, providing services to those who interfaced with the criminal justice system — victims, offenders, obviously. We also interviewed administrators at the government system level to get a sense of their experience and their take on researcher-practitioner partnerships and successful collaborations.
Last, of course, we interviewed researchers. The practitioners wanted this category created for them, so we interviewed those folks as well.
I'll begin with the Web-based survey results. First of all, how many people participated in this study and how many have collaborated? We were very thrilled to have representation from 47 states. I promise not to single out the three states that did not respond — but if you did not respond, we would still love your information. Of the respondent organizations, 55 percent had engaged in collaborations in the last five years — just the last five years. Among the agencies that had collaborated, there was an average of 5.5 collaborations per agency.
I apologize — this is a dense slide. I know all of the rules about doing this, but I couldn't help it; it's the only dense slide there is. I'm a visual person. I'm going to cover three points about the surveys: the facilitators to developing a collaboration, the barriers to developing a collaboration, and then, importantly, what products result. In terms of the facilitators, the orange bar shows what was most helpful when we asked what factors helped in developing a collaboration. What you can see is that funding was available. This is kind of a no-brainer: you need funds to have a collaboration and develop a project.
Second was the availability of the researcher. That was a facilitator to developing a collaboration. Now, a lot of us take this as a given, but as Jeff identified in his talk, there's not always someone who's in close proximity to the agency that wants to collaborate. Perhaps there's not a researcher who's interested.
There's also this other piece, which has actually been my personal experience, but I didn't realize how many people it was an issue for. I couldn't figure out a term for this presentation — it's like a financial restriction. At a lot of academic universities, the overhead costs, the indirect rate, are astronomical. Yale has the second-highest indirect rate in the country. That indirect rate precludes me from being able to partner with people in the community when applying for some of our state research project collaborations. That's a problem, and it turned out to be a factor that a number of folks in the government systems identified as a problem as well. Of course, having agency and department staff with time allocated to collaborate — protected time — was critical as well.
In terms of barriers — see, I told you the slides get cleaner — in terms of barriers, the top three are not surprises to any of us, I don't think. Barriers are lack of financial resources, challenges with time and difficulty with red tape. So as we move forward to enhancing researcher-practitioner partnerships, we need to understand better what that red tape is and what we can do, what NIJ can do to help with that red tape.
Last, about the Web-based survey, what products resulted? Again, we have a ton of findings from this component, but three to highlight here. This was surprising to me, and I thought it was important to share here. What you can see is that we are wonderful at documenting things in written reports. We have to do that. It's an expectation. It's a requirement. Of course if we're paid to do something, we've spent all of this time and all of this energy and this hard work, of course we have to communicate the knowledge that we've gained. We need to share it, so we write reports. We're great at that. I would imagine that Director Laub as well as others are very disheartened to see that improved practice procedures, improved administrative procedures, changes in public policy, and improved or new services are endorsed much less frequently. So we are not doing as good a job as we can in terms of translating those findings into improvements or changes in policy and practice.
Now, it's hard to say if the reason for that is because the study wasn't designed well to begin with, or if it's because there were non-significant findings or negative findings — though many of us would argue those still should lead to changes in practice, policies and services — or if perhaps those findings never made it into the hands of the right people who could help translate them.
I'm going to shift gears just to tell you the quick take-home points. We did learn that research is valued at the state level and that a fair number of folks are collaborating. We learned that time and funding and red tape were the biggest barriers, and that in the planning stages of the project, we as researchers who are collaborating really need to work with the practitioners to think at the outset about how to translate; that should not be something we think about after we've written everything up. That should happen in the grant application stage, when you're developing the project.
To shift to the researcher-practitioner partnerships, the first clip I'm going to show is a clip that I just really like. We videoed all of our interviews. This is from a victim advocate who's been working on the front line for 25 years. She's an amazing person. This is not relevant but I find it necessary to share: She and her four staff work in a 12-foot by 10-foot room in a basement, no windows, and she's worked there for 25 years. Talk about dedication. I mean, they share a room that's smaller than this stage. They are so dedicated to the work they do, and I think her statement here just speaks to the critical need for researchers to join with practitioners to do better work.
Female Voice I just think it would be refreshing if the people who are in positions of making the decisions that we live with every day spent more time talking to those of us who — I mean this is just like an age-old issue for people, but — really spent more time talking to those of us who work on the lines, so to speak. So, the same thing in terms of research. As I said to you earlier, we literally sat down this morning. It was two family relations counselors and myself taking a look at "How effective is this, really?"
No one's looking at these numbers. No one's sitting down and talking to the people who are doing this work on a day-to-day basis. We really need — we could come up with a huge list of things that we'd like to see analyzed. I think that most of the things that the state and the judicial system analyzes are more…they have their own agenda, and it's usually funding. It's usually money. Where are they going to put their money? And I get that and I know that that's important, but that versus what's making a difference, we're spinning our wheels. Let's look at the big picture. You may want to put money in a particular area of funding for this particular program because it meets the needs of this particular population, but in the meantime we're processing — and that's what it feels like — processing cases. That we're processing so many family violence cases where our energy and our time and our resources and our expertise can be better spent if we could just sort of focus things a little bit better. We need the data. We need the research to support what we are feeling every single day, and no one's listening to that.
Sullivan So, I think she just speaks very clearly about the need for this. This is followed up by a clip from a researcher, Carol Jordan, for those of you who do violence against women work, a very well-respected researcher. What precedes the clip I'm going to show is her talking about how violence against women work is about working in the trenches, working directly with victims. I really love the analogy she has here.
Carol Jordan It is, if you will, the riverbank work. It is being at the side of the river pulling out drowning people. A graduate school professor of mine once said, "It's always very important to pull drowning people out of a river, but at some point you have to go upriver and see who's throwing them in." To my way of thinking, academia is what is upriver, because upriver we need to start thinking not just about how to best help a survivor, but we've also got to understand better why violence happens, to whom, who's most at risk, what kinds of interventions are going to be effective? Ultimately how can we prevent violence? We've got to start coming with answers. So, those answers will not be able to be asked effectively nor answered effectively if we do not have connection on that riverbank, if you will, between the advocates and practitioners who are providing services and the researchers who have the capacity to answer the questions upstream so that we can change the way we're providing services for survivors, and ultimately we can end violence against women.
I think what we need to remember is that this idea of translational research, this idea of collaborations between people in practice and people in research is not just catchy. It's not just kind of the next trend that will be gone eventually. I think it is ultimately the way to do research in the violence against women area. It is absolutely the way to do research and it's absolutely the way to ensure that the services that you provide are effective if you can do evaluations, if you can have these kinds of partnerships.
So, ultimately, if we go back to the metaphor about the riverbank, if there are a lot of footprints along that riverbank between those people who are providing direct services and pulling drowning people out of the river and those people who are upriver able to start putting the processes in place to answer questions, if there are a lot of footprints between them, I think we have the best shot at ending violence against women around the world.
Sullivan Again, I couldn't quite put into words that wonderful analogy to the riverbank, but I think it communicates a lot. I debated whether or not to start with the lowlights or highlights of the successful researcher-practitioner partnerships, but I figured it would be better for you all if I ended on a happy note. So, I'll start with the lowlights first. Do remember that these are from successful collaborations, and not surprisingly a lot of the issues that'll be highlighted here are issues that were highlighted in Geoff Alpert and Jeff Rojek's project as well, even though their work was on police partnerships and ours was on violence against women. A lot of these issues are cross-cutting. It's not necessarily about topic or discipline, though certainly there are unique issues, but the bigger picture is much broader than that.
In terms of lowlights — I put this in quotes. I don't want people to get offended by some of the terminology; it's what people told us — but this notion of the "drive-by researcher." That's a lowlight for folks, the researcher who kind of swoops in, grabs his or her data, swoops out, and is never heard from again. The biggest issue here is of course distrust: distrust in the researcher and the research agenda. And it's not even necessarily unique to the researcher who's approaching the practitioner system. It's distrust in research in general, because of stories they've heard from other folks, that it's not been done in collaboration and that at best it might have been cooperation, where they say "Sure you can access our files," someone accesses the files, leaves and is never heard from again.
These two quotes I really like, as they speak to this point. "She just never did that end of the project. So, didn't write the paper and didn't complete the research and sort of fell off the face of the earth." That actually came from the victim advocate of 25 years with five people in an office, letting someone go into their files. They never heard from this person again. This second quote is actually from Sarah Buel who was from Suffolk County in Massachusetts who did the documentary "Defending Our Lives," which is a documentary about domestic violence victims. "I had a very negative first impression of researchers. My sense was that they don't listen to you, they're arrogant, and any time they disagree with you, they chalked it up to your ignorance as the practitioner." Certainly a lowlight.
Next, sorry about the play on the Rolling Stones tune, but time is not on your side. First of all, everything takes longer in collaboration. Folks have alluded to that. Everything takes longer, because if you collaborate, you're having discussions. You're making decisions together. Forget about research projects. Think about just in your offices or your departments or wherever. When you can make the decision by yourself, maybe it takes you 20 minutes, maybe two days if you have to find resources. When you're doing it in collaboration with other people in your department, or where there's a shared decision-making process, it takes much longer. Time is a factor.
Second is research takes longer than practitioners could ever imagine. There's a great clip that speaks to this as well. For folks who aren't trained in research, of course they have no idea how time-consuming it is, but this can be a project killer if you don't think about this in the development of your grant applications and also in terms of funding. These two quotes: "This project went on for ages, I don't know, seven years, something like that, a long time. And we only ended up with a sample of like 145 people." As a researcher, that made my heart sink. 145 people over seven years is not a lot, given the design they had. The second one: "…And from the practitioner's point of view, this is a project that needs to be done in the next month and can be done in the next month, and in my view, nothing gets done in a month. Nothing gets done really in three months or sometimes in a year."
Another lowlight is this staff revolving door or high turnover with staff. When there's high turnover, of course that contributes to decreased investment in the research project, and if there's not buy-in from higher-ups, so to speak, it can be challenging to get people on board, and it's incredibly time-consuming. "Well, that lead prosecutor who was a supervisor, that person turned over seven times. There are just multiple ways that we couldn't get everybody to coalesce, and then getting momentum going. It just kept restarting and restarting and restarting."
So, some of the highlights, to move on to the happier information to share. So, the first highlight is that the relationship was established or that they spent time developing their relationship. As Jeff talked about, this is critical. This was critical across every single interview we had. Critical is an understatement. For folks who do work on offender programming, Ed Gondolf in his interview called it "courting." You need to spend time courting. The next clip is about dating, and I think it's just a wonderful clip.
Female Voice I asked her if she told you this story or not. I said did you tell her about the dating story, and she said no, I left that for you. But this was kind of a joke in our relationship about when we first started to meet and kind of talk about this project, and at one point we looked at each other and it's like okay, we've agreed to date for a little while and we're just going to see where this goes, and that was the pilot process of this. And we just kind of continued to talk about how is this going. Are we going in the direction that's going to meet your needs? But then I felt like she really appreciated what I bring as the researcher in terms of kind of understanding the method and what you can and cannot look at. And as the relationship and the pilot work continued to progress and we got to the point of writing a grant to NIJ, I remember calling her saying, "Okay, this is the next step in our relationship, we're moving from dating into a longer-term commitment here, and are you ready for that?"
Sullivan I just think that's funny, because it's true! So, another highlight, the researcher was knowledgeable about the context and the system. Think of whatever clichéd term you'd like: living in the trenches, being a fly on the wall. Those things led to a greater understanding of the system and so much more. It served to build trust. When the researcher understood the context, spent time on the ground level in the 10-by-12 office, out in the field, at the prosecutor's office, whatever it might be, spending time in that context made a world of difference. Certainly it contributed to a stronger project and a stronger design for the study, but also it contributed to really a strong relationship and building that sense of trust.
This is from a practitioner in the state system: "That's what the researcher said to us. 'I need to see your context. I need to understand this.' And so we sent him to an urban site, a rural site, to a mid-urban site, and when he realized the pressure that family relations counselors faced to process those cases, it became clear that we were not going to ask 37 questions on an assessment that they were trying to develop, a risk assessment. That's just not practical."
Another highlight is that practitioners were willing to learn about the conduct of research. Now, for folks who don't understand research, sometimes they don't like it. Sometimes they don't care for it. Sometimes it's intimidating, even if it's not the statistics and math. It's just an unknown process. Well, when the practitioner was willing to learn about research, it made a world of difference to the project. It also made a world of difference to the relationship because it meant that both people felt like the other person cared about what they needed and what they wanted to do.
Another great quote, I think: "Instead of having graduate students do the data collection, have my staff help you with the data collection. Have my staff help you learn how to clean the data, learn how to code things, learn how to conduct certain analyses so that we leave with some tools." I really like this quote as well because it also speaks to the component of capacity building when you're collaborating and empowering the folks that you're working with, because as Jeff said, there's not always going to be a researcher there to conduct that research.
This is another great clip that I think speaks to this issue as well as a number of the issues before, and this person that you're going to see is actually the partner — practitioner partner of the researcher who just talked about dating.
Female Voice What's gone well is that I think Carolyn was truly interested in learning about us before sort of jumping in. So, the ability to have that pilot project and that time for her to get to know us has been amazing, and really that time allowed me to get to understand a lot more about the research process and something that we thought we were just going to jump into and maybe create a little questionnaire has grown tremendously, but that process has also meant that it's slowed things down tremendously. I think that will be to the good in the end, but it's been a long process for sure. But the good part again about that pilot project is that we really had the opportunity to grow the idea together, to take something that was really amorphous and learn about one another's respective sort of focuses and then build it together in a way that made sense for both of us.
Sullivan Another highlight — again, I don't like this term "higher-ups" but it's what people used — higher-ups were invested and helped move the project forward. Now, this investment was critical to deal with the barriers that we talked about: resources, red tape and high staff turnover. When you had investment from administration, it made a world of difference in navigating all of those challenges. Why was the collaboration successful? "We've had buy-in from our executive director. He's — we're all very interested in trying something new and stepping outside of the traditional model to see if we can make something more effective." I really like her shift between "he's" and "we're" all very interested, because it speaks to the culture that gets built when the higher-ups are invested.
The last highlight is my favorite, that the findings had direct relevance to services, policy for practitioners and clients. We asked people, the way we started our qualitative interviews is that we said after they described the project to us, "Tell us about the highlights and lowlights." That's how unstructured it was, and this particular researcher said, "Highlights is being able to say that part of the knowledge that you produced had an impact, and it's about knowing that it is contributing to making changes."
I want to leave with something I've always wanted to do in a presentation but have never been able to: a top 10 list. This came very easily for this project. Here's a top 10 list for successful collaborations. One, connect with higher-ups and staff on the front lines. Two, as a researcher, spend time in the trenches; spend time understanding the idiosyncrasies, learning about the minutiae. That makes a world of difference. It's time-consuming, but it's critical. Three, as a practitioner, learn about the conduct of research. It's not so scary, I swear. I actually have a book that I give to my new students about statistics called "Statistics for People Who (Think They) Hate Statistics." It's a really user-friendly book, and it's really informative and very helpful to them. The point is to find ways like that to share information about research with practitioners.
Four, slow down to invest time in relationship building. Date, court, all of those kinds of words that were put on the relationship-building process. Five, develop the project with input from both sides. This is kind of one of those "duh" statements, but it's a really critical "duh." Six, budget for extra time and funds. Please don't assume that because practitioners already get paid you shouldn't consider compensating them, or that you should not integrate that into your budget; think about that at the outset. It is difficult in these challenging times, but it is important. Seven, discuss potential findings and dissemination. I put the word "negative" in here because one of the biggest challenges that people faced was when they had negative findings. The practitioner agencies hadn't thought about that in advance. So some of the researchers said, when you work with a practitioner, the advice is to think about: what if this does not shed the best light on what you're doing? How do you want to handle this? Let's think about this now before it even happens.
Eight, plan in advance to translate findings at the outset, not just dissemination. Go beyond that written report. Think about the various findings you might have as a result of your study and how those can be translated into changes in policy and practice. That will make you think more about who to bring on the team, not just you and the practitioner, but is there someone at the state level who can consult? Is there someone at your local domestic violence service organization? We have a police department that has a dedicated domestic violence officer. Does it make sense to bring that person in? Thinking through to the translation piece can help you think about key players to bring on.
Last, "If at first you don't succeed…" I'm a psychologist, and when I refer people to therapists, or even a family member who's looking to see a clinician for the first time, I tell them if it doesn't work, if you don't like the first person you meet with, please don't give up. It doesn't mean that your situation is hopeless or that all psychologists or all therapists are like the person you met with. Keep at it until you find someone who clicks for you, and I would encourage you to think about the same thing. If you've had a bad experience in the past, chalk that up to somebody's ignorance, but don't let it keep you from moving forward, because you're missing critical opportunities to enhance the work that you do and the work that other people do to improve folks' well-being. I thank you very much for the opportunity to present this morning.
[End of video clip]
Vivian Tseng Good morning. So, I want to begin by thanking John Laub and also Kristina Rose for inviting me here. I am not a criminal justice person, but I'm very excited to be part of this panel, because as the title of this session said, I really do think these partnerships are game-changing. They are game-changing moments for us to rethink the ways to connect research and practice. I think there is enormous potential here. I'm excited to share a couple of comments and reactions to the wonderful work we've just heard about, and to connect it with some of the work we've been doing at the William T. Grant Foundation.
For those of you who don't know, the William T. Grant Foundation — which is where I work — is a private research foundation, and our mission is to support research to improve the lives of youth, and more specifically we think about our mission as supporting research to inform practice and policy to improve the lives of youth. I think as an agency we are similar to NIJ. Figuring out how to connect research with practice is a really, really difficult problem, but like NIJ, figuring out how to do this is a core part of our ability to meet our mission.
I mentioned before that I really do believe that thinking about these partnerships is a game-changing moment because it fundamentally causes us to rethink the ways we've been trying to understand research and practice. I think the dominant framework that we've been using is one of moving research to practice. It's all about pushing research out to practice. When we've been operating in this paradigm, we've been doing a couple of things. One is that researchers have thought about the problem a lot in terms of needing to improve the quality of research. As researchers, we fall back on the things that we care about and we think what we need to do is improve the quality and the rigor in particular of research evidence so that it's more suitable for use. We've spent a lot of time focusing on improving the rigor of what works evidence so we can better test the effectiveness of programs and practices.
The other thing we've done is we've thought about this problem a lot in terms of needing to improve our communication, our dissemination and our marketing of research to get it out there into practice. Communication, we need to communicate it in clearer ways without all the jargon that researchers often use, to put it into briefer formats, clear-written formats so that it can be easily read by busy readers, and then we need to do a better job disseminating research, distributing it in more effective ways. Then you see in the last couple of years more and more folks building websites as a way of centrally storing research so that people can draw it down in easier ways.
The other thing you see more recently is this focus on better marketing research. This is the notion that practitioners are flooded with new products and information, and that we need to repackage research so that it can compete in this busy marketplace of information and products that are sold to practitioners.
The last thing you see is you see a big push to increase the adoption of evidence-based practices that researchers are testing and they have a larger and larger body of practices that have been tested and found effective in randomized control trials. Now the idea is how do we push it out and get it adopted and implemented well in practice settings.
Now, I think in the work that we do — and a lot of the work that we've done has been in education, but because we're interested in youth, it's also in child welfare and mental health — we have found this paradigm, this way of thinking about research and practice, to be pretty limited. First of all, it's very unidirectional. It feels very much like a one-way street of just pushing research out to practitioners.
I think that as we start to think about partnerships, it causes us to rethink this relationship. We're kind of re-engineering the relationship between research on the one hand and practice on the other. It's no longer just about how research needs to inform practice; it is just as much about how practice needs to inform research.
A lot of what we've heard earlier today from Jeff and Tami is about how can the priorities, the concerns, the dilemma of practitioners shape research agendas and the ways in which researchers engage with practitioners. It's putting these two communities on equal footing as they move forward.
As we focus on partnerships, it causes us to rethink some of the things that we're working on. We're focusing on developing some shared commitments. In education they talk a lot about core problems of practice. What are the core problems of practice that researchers and practitioners can identify and really work together on, going forward? It also causes us to refocus on building relationships and building trust, and that was a huge part of what you heard from Jeff and Tami. Then lastly, it causes us to focus both on improving research and improving its use, both sides of this equation. In terms of improving research, it's not just with the criteria that researchers would use, right? "Let's improve rigor." It's also about let's improve the relevance of research so that it really addresses the priorities and the concerns that practitioners are grappling with. Then let's also think about what it means to use research, and what the myriad ways that research can be used to help improve practice.
So, let me draw a little bit from some work that we have been doing to try to understand how and when practitioners and policy makers use research evidence. What I want to do is present for you a typology of the different ways in which research can be used. We can think about this going forward as part of the partnership work.
The first is instrumental uses. This is when research can be directly applied to a particular decision. When I hear people talking about the use of research, this is the kind of use I'm often hearing them talking about, that research can help people decide a practice that they want to adopt, a policy that they support, or how to implement a practice. It's research applied to a very discrete decision.
But research can also have conceptual uses. It can help people better understand the nature of a problem, orient themselves to the issues at hand and orient themselves to potential solutions. This way of thinking about the use of research can be just as important. Carol Weiss who did some classic studies of knowledge utilization said that probably research is most likely to be used in these conceptual ways.
A third way of using research is political use. This is when research is used to bolster, to add weight and heft to, a position that's already been staked out. Say a legislator or an agency head already knows that they support a reform effort or they don't support a reform effort. Research adds to the body of knowledge that's used to justify a particular stance.
Research can also be imposed. We've heard a lot about this in the last decade or two. An example of this is when government funding is tied to the adoption of a particular practice or a particular program. It's not a completely voluntary choice on the part of practitioners. It's rather an imposed use.
Lastly, there are process uses of research. This is not using research findings per se; it's the learning that can come out of taking part in the production of research. So there is a myriad of ways research can be used, and I think it's important to keep them in mind as we think about the ways in which research might be useful and utilized.
I also want to shift gears a little bit and highlight some of the key challenges. I think this work has enormous promise, but there are also some very tricky challenges, and I don't think there are any easy answers to them. If there are, I'll turn it over to Tami and Jeff to try to figure out what those are. Let me highlight them for you.
One is long-term commitments. This is something we haven't yet talked about this morning. I think this is the difference between dating and the long-term relationships that the woman in Tami's video was talking about. I think if we're really going to realize the promise of these partnerships, they have to involve some long-term commitments. Otherwise, I think we go from drive-by research to drive-by partnerships, right? You're kind of quick in and quick out. It's through these long-term relationships that practitioners and researchers can really grapple with some of the very challenging problems that practitioners face, and there can be an iterative back and forth as they work together on those problems.
Two, we really need to understand how to do this work well. Identifying successful strategies for partnership work is very important, and I don't think we can only look to what we consider successful partnerships. If you're only studying the things that successful partnerships do, you don't know the key ways in which they differ from unsuccessful partnerships. You have to identify the things that differ between the successful partnerships and the unsuccessful ones. That, I think, will give you some keys to strategies for doing it well.
Let me suggest, from education, some types of strategies that we might dig deeper into and understand. We have begun convening a learning community of research-practice partnerships in education between researchers and school districts that are working together in long-term institutional collaborations. One of the dilemmas that all partnerships face is how they develop a joint research agenda that everyone agrees to work on together. There's no easy way to develop a joint research agenda that meets all these various stakeholders' concerns.
SERP, which is the Strategic Education Research Partnership, has developed a set of ground rules which I think are very interesting to consider. In SERP, they try to develop these research agendas in collaboration, but they say that there are certain non-negotiables. The non-negotiables are that the school district leaders have the final say in the questions that are going to be explored, and the researchers have the final say in the research designs and methods that will be used to address those questions. SERP, as a coordinating body for the work that goes on in different sites, has the last say in the approach to the problems and solutions, so that whatever they work on together has relevance for different sites or different districts. It's about coming up with some of these very practical strategies for doing this work well.
Another strategy comes from the Chicago Consortium for School Research, which has a no-surprises policy. The Chicago Consortium has a 22-year relationship between researchers and the Chicago public schools. As you can imagine, a lot of what the researchers find doesn't put the Chicago public schools in the best light. In order to nurture this relationship and maintain trust, they developed a no-surprises rule. Before the Chicago Consortium issues any report, it first shares it with the Chicago public schools. This is not so that the school district can edit the report or offer changes. It is merely so that the school district can prepare its response, so that the first time it sees the findings is not on the cover of a newspaper, and it has some time to figure out what its response is going to be. I think that kind of strategy and policy has helped the Chicago Consortium maintain this relationship and maintain trust over time.
Another challenge is related to funding, but it's a little bit different from what we've heard earlier today. In education, at least, a lot of funding goes project by project, to one research project or another. But a lot of partnership work is about the dating, and I love this analogy, this dating phase at the beginning where people figure out what their shared commitments are, what their research agenda is going to be, and how they will come to trust each other in the relationship. There has to be some support that enables people to do that work on the front end. If all the funding goes project by project, my worry is that it draws research partners toward the questions and agendas that matter to research funders, and away from the key problems of practice that their practitioner partners are interested in.
Incentives and capacity. I think Jeff and Tami have talked about this at length so I'm not going to go into it, but just acknowledging that you need incentives and you need capacity on both sides of the equation so that they can be effective partners to one another.
Then lastly, and this is something we see in education and I understand you see in criminal justice as well, there is the issue of stability. In urban school districts, superintendents are turning over pretty rapidly, and in many of those districts, as the leaders turn over, so does the executive staff. That makes it very hard for a district to maintain stability in its work, but it also makes it very hard for a district to be a partner to researchers. That is a tough challenge for public agencies in general, and it is hard to know how to address it.
Next steps: John had asked me to suggest what I thought might be some next steps for the field, so let me offer a few thoughts. One is that I think we need to continue to build knowledge, specifically about how to do this work well. I've talked a little bit about identifying and testing some of these strategies, but not everyone is going to be cut out for partnerships, and certainly we're not going to develop partnerships for every agency out there. I think this work also presents opportunities to build broader knowledge for the field, beyond partnership work itself.
In some ways I see these partnerships as little laboratories where researchers and practitioners are working together in intense collaboration and innovating to try to figure out how to really connect the two. I think in these close collaborations, researchers are going to have to figure out how to do work that's timely, actionable, rigorous and relevant. I think it's going to put a lot of pressure on us in the research community to figure out how to do that and to really innovate around our own practices of doing research. I think it's going to put a lot of pressure on practitioners to figure out what does it take to really make sense of research and to put it to use. I think that these laboratories, these partnerships are going to teach the rest of us some generalizable lessons about how to bridge these two worlds in effective ways.
Second, you can't just build knowledge. We have to build capacity for folks to do this work well. We've talked already about funding and incentives, capacity, time, and expertise to make this work happen.
Lastly, I think this is a fantastic opportunity for us to rethink our dissemination strategies. Once we accumulate some of these lessons learned about partnerships and how to do them well, I think it would be a mistake to keep falling back on our old conventional ideas about dissemination. This is a great time to rethink how we disseminate this work. One suggestion is that we not think of dissemination only as pushing out information in a broad-based way, but get more strategic about it. One way to do that is to really understand how people pull information in. What are the natural venues through which people pull in new ideas?
I have been very taken with this idea of capitalizing on existing social networks and intermediary organizations. Practitioners and policy makers often tell us that when it comes to getting new ideas, there are certain peers they really turn to: peer opinion leaders, or certain intermediary organizations they see as trusted sources. We should identify who those people are, and I think we should recruit them to help us with some of our dissemination efforts. That may be a more strategic way to get more bang for our buck than broad-based push approaches. That's it.
Opinions or points of view expressed in these videos represent those of the speakers and do not necessarily represent the official position or policies of the U.S. Department of Justice. Any commercial products and manufacturers discussed in these videos are presented for informational purposes only and do not constitute product approval or endorsement by the U.S. Department of Justice.