Podcast, 7 February 2025

Veterinary Evidence Student Awards winners 2024

Find out more about the winning Knowledge Summaries from the 2024 Veterinary Evidence Student Awards.

In this podcast Veterinary Evidence Editor-in-Chief Peter Cockcroft speaks to the first and second place winners of the Veterinary Evidence Student Awards, Amelia Cannadine and Oliver Wilkinson. Amelia and Oliver both discuss the process of writing award-winning knowledge summaries for Veterinary Evidence.

Podcast transcript

RCVS Knowledge Podcast: Veterinary Evidence Student Award winners 2024

RCVS Knowledge:

Welcome to this podcast from Veterinary Evidence, an online open-access peer-reviewed journal owned and published by RCVS Knowledge.

Peter Cockcroft:

Welcome to this podcast. My name is Peter Cockcroft. My role at Veterinary Evidence is as editor-in-chief. I’m very pleased today that we’ve got two of the award winners from our Veterinary Evidence Student Award Competition. It happens annually. Oliver and Amelia, who I’ve got with me today, are two of the winners from this year’s competition, the 2024 Veterinary Evidence Student Awards.

Just to start out with, I’m just going to read the titles, or the clinical questions if you’d like, that Amelia and Oliver used in their knowledge summaries. Amelia, the working title for your knowledge summary was ‘Nonsteroidal anti-inflammatory drug administration to periparturient cows to reduce postpartum pain-related behaviours after parturition’. Oliver’s working title was ‘A review into the therapeutic effectiveness of oral cannabidiol’ — or CBD as some people know it — ‘in cats with osteoarthritis’. Those were the working titles that we used in the knowledge summaries that Amelia and Oliver wrote.

I guess to start out with, and perhaps I’ll start with Amelia, just give us some sense of your background. Are you a fully qualified veterinary surgeon? If not, which veterinary school did you attend? What actually motivated you to submit a knowledge summary under the auspices of this particular award?

Amelia:

Hello, everyone. It’s good to be here. I’m actually from Australia, so I’ll be talking from there, which is a bit exciting. Excuse the Australian accent. I was from England seven years back. Well, no, 13 years back now. I was seven years old when I moved. Anyways.

I am a vet now. I was a student when I submitted this paper. I’m only in my first year out. I’m a mixed practice veterinarian in South Australia. I tend to all animals other than horses, I’ll put that out there. I’ve always had a keen interest in cattle and that’s where my paper came from. I actually went to the University of Sydney. Lucky enough to get in there and it was a great university, around for many years. Yeah, it was awesome. It was a good experience.

The reason that I made this paper or what motivated me to make this paper was actually I had to do a research project as part of my degree to get my doctorate. I wanted to work around welfare in some respect and related to cows in particular. The evidence out there was lacking in the use of nonsteroidals, but also there was a big gap that needed filling as well because there’s a lot of potential for their use. I ended up going down the pathway of pregnancy because dairy cows spend a lot of their time in that area. And also promoting their welfare. There was a big gap for quite some time in that area that’s being filled slowly but surely.

Yeah. That’s what inspired me, I suppose.

Peter Cockcroft:

Thanks, Amelia.

Oliver, if I can turn to you, similar question. Which vet school did you attend, are you now in practice? And what motivated you to spend the time and effort, because it’s not insubstantial, to enter this competition and write a knowledge summary?

Oliver:

I went to the University of Bristol in their graduate program, so it was a second degree. I’m a mixed vet now, currently up in Scotland. More heavy farm work with a bit of small animal on the side, but we do see everything.

For the paper, we had a piece of coursework where we had to write a group knowledge summary. During that process, we were told about the awards. They said they will support anyone who would like to enter. Obviously, with the piece of coursework, we’ve already done half the legwork. At that point, it’s then just a case of expanding beyond the coursework limitations to then meet the criteria of the journal. I got so far through it anyway, for me it was ‘I may as well go the rest of the way’.

I also like the topic because it’s quite, I think especially in recent years in human medicine, there’s obviously been more research going into marijuana and CBD for pain relief, and more herbal and therapeutic alternatives. I think this was quite a nice crossover into the veterinary world as well.

Peter Cockcroft:

That’s great. In a sense, you’ve answered my second question which is how did you choose your topic? But you both covered really quite well as to what motivated you and how you chose those particular topics.

I’ve got a very brief question because one thing that authors often struggle with is actually translating what starts out as a very high level clinical question into the PICO format where we define it in terms of patient intervention, comparison, and outcome. I’m just curious, how did you find moving from that fairly high level question into defining it into a PICO? Because that’s really quite critical in the process, isn’t it?

Amelia:

Yeah. Look, it definitely was. I definitely know what you’re saying, Peter, it was difficult. Because yeah, you do start with this massive question, you’ve got to really break it down so that it’s quite concise and it breaks up into your search terms, and that’s where you get those key papers relevant to what you’re actually looking for.

I was lucky enough to have quite some good lectures at uni about making a PICO specifically.

Peter Cockcroft:

Okay.

Amelia:

So going through quite a lot of examples. But by all means, it took me a while to get to my perfect PICO, and I had to alter it quite a bit to get to that small, specific search term that really provided the perfect papers, I suppose.

Peter Cockcroft:

That’s excellent. Oliver, do you mirror similar things? Did you find it difficult to translate that into a PICO that was specific and searchable? Because it does lead into the next step of the process, doesn’t it?

Oliver:

It does. I didn’t find it too bad, but I think the ease would be very topic and question dependent. I know, for example, on the original draft of my coursework for university, the PICO was definitely trickier to do than it was with this. Whereas this ultimately turned into quite a refined topic, so the difficulty wasn’t necessarily trying to work out what goes under each letter. It was more just getting that wording correct that wasn’t too vague or too focused, to actually make it searchable.

Peter Cockcroft:

Really good comments from both of you. You’re quite right, it’s context specific how easy or difficult it is to translate it into a PICO. Then with knowledge summaries, we’ve got quite clearly defined templates and we give a lot of assistance on the website as to how you do it. Of course, the library can provide some assistance if you’re having difficulty with that search.

In terms of the steps and the templates that we provide, how useful did you find that and how useful was the support from us? For instance, did you lean on our library to help you with the search functions? Did the templates really help you? Did it give you a really good structure in order to … It’s still challenging, but at least it gives you a pathway and a roadway to try and support you to do this in a very structured, systematic way. Just curious, how did you find those templates? Did you use our library to help you with those search functions or not?

Amelia:

I actually used your template before I even knew I was going to submit the paper to you guys.

Peter Cockcroft:

Okay.

Amelia:

That just shows how much I rated that template.

Peter Cockcroft:

Yeah.

Amelia:

I was actually recommended it by a professor at uni, who just said, “It’s a really good way of breaking down the topics and being concise, but covering what you need to cover,” without waffling too much, I suppose. I found the template awesome. You’ve got those key headings of what you really want in there and what you want expanded on, because I’m sure me as well as a lot of other students tend to waffle and put loads of words down that aren’t necessarily super important or needed, or vital in that summary, and you end up with lots of pages of stuff you’ve got to fix and cut out.

Peter Cockcroft:

Yeah.

Amelia:

Yeah, you definitely made that quite concise and helped me stick to what you really wanted. Then yeah, went down the route where I was like, “Well, I’ve used your knowledge summary template, now I might as well enter this competition, and alter it and make it as good as I can.” That’s how I got there.

Peter Cockcroft:

Yeah.

Amelia:

I was going to use your template regardless, in all honesty. I really enjoyed using the template.

Library, I didn’t actually use myself only because I was lucky enough to have the uni library.

Peter Cockcroft:

Yeah, sure.

Amelia:

That’s just me.

Peter Cockcroft:

Thanks, Amelia.

Oliver, I turn to you again, similar question, really. How helpful were the structured templates? I don’t know if you leaned on the library or not to give you some assistance, you may not have. But over to you.

Oliver:

The template I started with was one of the university ones, but it was very similar to yours.

Peter Cockcroft:

Yeah.

Oliver:

I’m pretty sure that they modelled it off your template. It wasn’t a huge jump to just tweak things to get it to match your template. I did find it quite a nice template to use though, because obviously it broke everything down into smaller, more manageable chunks, rather than writing paragraph after paragraph and letting it flow one into the other. It was generally quite definitive on where your divisions were.

I think, also, it still enabled you to have some flexibility. I know going through the editing with Will, there were sections where we were like, “This paragraph might be better in this next section,” and vice versa.

Peter Cockcroft:

Yeah, sure. Yeah.

Oliver:

I actually found it generally quite a nice template to use.

Peter Cockcroft:

That’s great. Oliver, I’ll start with you maybe. I guess quite a challenging part is actually the critical appraisal of the papers, once you’ve found the papers. I’m just curious about whether, by having to look critically at papers, by forcing yourself by virtue of doing a knowledge summary to do a critical appraisal in detail of each paper, you found that when you read an abstract — and most of us just read abstracts if we’re trying to find out about a topic — the abstract perhaps doesn’t always quite capture the evidence or the data that’s produced from the study? Just curious about your thoughts on that, really.

Oliver:

I don’t think the abstract always catches everything you want to know in definitives. I know that in my experience of going through this, there’s a lot of papers where I started with the abstract, but I discovered quite quickly that it didn’t always give me the full answer I wanted to know whether it was a paper I could use or not. Quite often, I ended up doing the abstract, and at the very least the introduction, if not throwing the conclusion in as well. Before I decided, “Right, I’m going through this paper in full.”

Unfortunately, in my topic, because there’s been so little research done in cats, it was quite apparent early on whether I could exclude certain papers. There were some I did have to dive a bit deeper into to assess the small amount of evidence that was there.

But yeah, this process, my topic was a bit simpler to pull the information out of. Whereas I know, in my previous degree, I did a big knowledge summary as part of that. I know that one was much trickier, and I had to quite often go through the entire paper before I could actually make a decision.

Peter Cockcroft:

Yeah, point taken. Good observations.

Amelia, for you? That abstract versus reading a full paper, does one necessarily reflect in totality? Or are we asking a subset question that you do need to drill down into the paper to answer?

Amelia:

I think it’s, yeah, multifaceted in that I definitely agree with what Oliver’s saying. To really dive into the depths of the paper and fully understand how they got to where they go to. Especially if it’s a bit of a contentious answer or not what you were expecting, I think reading the paper does have a lot of value. But it also does prove how what you said, Peter, is exactly right. People just read the abstract. I know I do it. I probably did before this knowledge summary more so than anything else. You don’t have time. As a clinician, as both Oliver and I are now, you don’t have time to sit there and read through the whole paper, so you need that real critical information in that area.

But yeah, I guess on the flip side, you don’t get all those integral details. For example, there were a couple of my papers, and I do think overall they were awesome papers, but the sample size changed for whatever reason. The sample size that was in the abstract was the original sample size. Well actually, it was reduced for X, Y, Z reason, whether the cattle weren’t well or they had to be excluded for whatever reason.

Peter Cockcroft:

Yeah, yeah.

Amelia:

But you didn’t know that. I think that’s quite important. The way that they do these studies is so detailed, no wonder they have to do an abstract. Because ultimately, there’s so much to them.

I think there’s good and bad things about abstracts. But I think in short, to answer your question, you got to read the whole thing to really understand and get to the crux of the issue, I would say.

Peter Cockcroft:

Oliver, I’ll turn to you. Obviously, our abstract in a sense is our clinical bottom line, which is where we’re trying to present the reader the key elements of what we’ve found. From my perspective, it can be quite difficult to write it and there’s quite an onerous responsibility about getting it right, or at least not misreading, misrepresenting the information that you’ve worked through. Give me some sense of your experiences if you don’t find papers, is that easy or difficult? At some point, you have to commit yourself to some sort of strength of evidence. Over to you, just some general comments really about how easy or difficult you found that particular part of writing a knowledge summary.

Oliver:

This aspect for this particular topic, I didn’t find writing a clinical bottom line too bad. I think in my case, it was quite easy to fall on one side of the fence simply because there was a distinct lack of evidence in the species. It was very easy to start off there. Then in other aspects of the paper, in parts of it, you could expand into a little deeper as to why there was no evidence or what little bits there were out there.

I didn’t find it too bad. I know that in other research that I’ve done, it would have been much trickier because of the different natures of the topic and how much information there was there. That’s where it gets hard to whittle it down to a specific clinical bottom line when there is a lot to work through.

Peter Cockcroft:

Amelia, if you can switch to you because you did have quite a large number of papers to deal with. In a sense, really just looking for your insights as to how easy or difficult it was ultimately to write the conclusion, or at least give you the strength of evidence that those collective papers provided across your particular topic?

Amelia:

Yeah. I would say I had more difficulty than it sounds like Oliver did. I did end up having four papers, but ultimately my search term gave me something like 18. I had to go through them to really get to the ones that were important and met my criteria. That in itself took a bit of time. But then those four papers measured, I’m just giving my example quite specifically here, but they had different behaviors within that paper which then had different findings. Well, I needed to give a summary of did these papers overall find a significant effect on the welfare of these cows? Breaking just the behaviors down to come up with, okay, was that a yes or a no, is quite complex.

Peter Cockcroft:

Yeah.

Amelia:

And also, just being able to say the time. I had to talk about when to give an anti-inflammatory in the prepartum and postpartum period. Some papers found giving it before was good, some found after was good. Then summarising that in a way that didn’t sound too silly by saying, “Give it this amount of time before and this amount of time after, and that was the most consistent between all the papers.” That was quite thought-provoking, I’ll say. I remember sitting down, and highlighting everything and going, “Okay, well, that was within six hours, that was within 48.”

Peter Cockcroft:

Yeah.

Amelia:

And finding a common ground between them all. Yeah, a bit of a long-winded answer in saying yeah, I found it quite difficult and probably took me quite a bit of time and brain power to get there.

Peter Cockcroft:

No, it’s really good insight because it is challenging. It sounds simple in principle, but ultimately there is a subjectivity about ultimately what we do tease out or distill out.

I guess for me, a question perhaps starting with Oliver. What did you learn from the process of writing and researching a knowledge summary? A very broad question, so don’t dwell too much on it. From your perspective, what did you learn from putting yourself through the process of writing and researching a knowledge summary?

Oliver:

I think the main thing for me was no evidence found is some form of evidence in itself. I actually quite liked what Amelia just said a minute ago. I know that one of the tricky bits that I found, especially when I was in the earlier stages, was obviously assessing the evidence but then being able to do some form of comparison with it, where you do have some investigations. They’ve all been done differently and in slightly different ways, and it’s how can you compare them to each other? And obviously, as part of that you then build almost a criteria in your head of what you’re looking for that makes something relevant evidence or not relevant to your case. Obviously, that criteria almost evolves as you go through it, as you’re going through the papers, because you’ll find the paper that you initially thought had a good standard actually really pales in comparison to others. I had papers where I initially thought I would have included them, and then later on it’s actually no, I can’t include these based on what I have since assessed on others.

Peter Cockcroft:

Amelia, any additional thoughts to what Oliver has said? You’ve put yourself through this process and it’s taken you down a certain road. On reflection, I guess this is my additional question, was it all worthwhile and did you get something out of it?

Amelia:

Yeah, absolutely. Ultimately, I have made a paper that I have been able to synthesise down from four papers and concisely give an explanation as to what those papers basically did. There’s no credit taken by any means, because they did the hard work. All I was able to do was get their key points, summarise it in a nice way and say this is how it can relate to clinicians and producers. How can we improve welfare, ultimately?

I think it just informs us that critical knowledge summaries are really important tools for people to be able to go, “Okay. Well, there’s actually four papers within this one paper in a way,” that you’ve been able to summarise and get key points from. That’s a lot easier than having to go manually and finding those four papers as someone externally.

Peter Cockcroft:

Yeah.

Amelia:

I think it just accentuates that point of how important they are and there should be more out there, basically.

Peter Cockcroft:

I’ll look to Oliver. Did you change the way you think about evidence-based veterinary medicine? We often use this soundbite, research into practice, and this is effectively what we’re trying to do. We’re trying to move things from research and have some understanding of where we can apply them into practice. I’m just wondering, having done a knowledge summary, whether it changes things so that you think a little bit more about evidence-based veterinary medicine?

Oliver:

I think it does because I know that I’ve found myself since, especially during my last year at university, I found myself digging more into research papers rather than textbooks. Even now, just say for example if I’ve got an animal in that’s say got some comorbidities, I’ve actually found myself, where I can access it, looking and assessing research that’s been done for different arthritic protocols rather than just reaching for a textbook which doesn’t always give you the answer you’re looking for. Obviously a textbook will only cover certain eventualities and certain advice, whereas research tends to be more broad. But it’s just being able to assess what you’re looking at in that paper, and that’s the hard part. That’s where I think actually doing the knowledge summary has been quite good because I’ve managed to put that into practice doing the paper and then since.

Peter Cockcroft:

Amelia, any additional comments to that observation, about how you think about evidence-based veterinary medicine now having done a knowledge summary, if you like?

Amelia:

Well, I already had respect for it before, and definitely even more so now. I’d say, adding to what Oliver said, keeping your knowledge base up-to-date is so important. There are key things that I’ve found throughout just by doing these studies that can be quite crucial to a clinician or producer, or anyone that this is relevant to. Keeping that up-to-date because things do change, and findings pop up, outliers pop up, and it’s really important that they’re in the literature. It just fills that, I suppose, blank area that textbooks don’t. Ultimately, they have a set year and stick to that one, and they only bring them out when they can. These, ultimately knowledge summaries or papers, just scientific literature, is brought out all the time and I think it’s super valuable.

Peter Cockcroft:

My question is would you do it all again, having gone through the process? Secondly, I think you’ve already told us that you consider it to be important. Again, if you have any parting thoughts about the importance of somebody doing knowledge summaries? We need not call it knowledge summaries, but actually drawing together fragmented information from a variety of different papers and asking a specific question that may have value to us as clinicians when we’re in practice. Any parting comments in that regard? Is it important that we do knowledge summaries? Does it prevent the workload falling on clinicians if we’ve asked a specific question and they’re also asking that specific question? If they can look in one place and find that somebody’s already looked at the literature, and collated that information, and come up with a clinical bottom line, then obviously it shortens the journey from research to practice and the application of that research into practice would be my personal view, but you may have a different view.

Amelia:

I would say, as a clinician now, I couldn’t do one. But it’s definitely something I would consider doing if I wasn’t going down that clinical route, because I found it super rewarding. We’re in the early days of publishing our papers so we haven’t heard a whole lot about how that’s going to help others. But it’ll be really rewarding to know if it does, and ultimately give me that boost to potentially do it again. It is a lengthy process. I don’t think that it couldn’t be done as a clinician; doing both at the same time might just take longer. But ultimately, if someone has that void that needs filling, it’s a really good way of doing it.

Look, it’s quicker than doing a blind clinical trial or something like that. If the work’s already out there and it just needs that summary and that concise take on it, I think it’s really valuable. Yeah, I think definitely worthwhile.

Peter Cockcroft:

Oliver, one last comment from you. We’ve obviously put you through a bit of a review process. How did you find that review process, the communication between yourself and associate editors and reviewers, and so on and so forth? Did you find it quite brutalising, or did you find it quite supportive and helpful?

Oliver:

I think it very much depended on the reviewer. The thing is the hard part was when you had different reviewers disagreeing with each other.

Peter Cockcroft:

Yeah, fair point.

Oliver:

If that makes sense?

Peter Cockcroft:

Yeah. Yeah, it does.

Oliver:

I know that in my process I had one person who didn’t like a certain section of the search parameters, where everyone else was happy with it. It’s trying to work out how do I approach this? Because I risk changing something to meet one person’s expectations, but then by doing that, I then don’t meet other people’s expectations. It’s finding that specific balance to either achieve that balance or be able to justify why you’re picking one side over the other.

I just think as well, there were some comments where everything lined up and they all said similar things, and it was very nice and easy to make those amendments. And others where it was just trickier to really nail down even things like phrasing, just so things couldn’t be misunderstood, because we all read things slightly differently and interpret things slightly differently. It’s just trying to work out not necessarily a right or a wrong but falling on the side of the fence that makes the most sense for the most people.

Peter Cockcroft:

I think that’s a really good final comment from you, Oliver. I think that’s a fair commentary of the review process. Science isn’t categorical, it is subjective. Reviewers have different insights and different viewpoints. Ultimately, it is about defending your work. Or at least acknowledging that, “Yes, there is a problem and I need to change it.”

I just want to thank you both for participating this morning. I think you’ve been very transparent about your experiences. Hopefully that’s given others insights, and perhaps some encouragement, to have a go at doing knowledge summaries, and in particular perhaps at entering the competition.

I’ll congratulate you both again. You’ve both produced publishable pieces of work, which is a fantastic achievement at your stage of development as a professional veterinary surgeon. My congratulations again. Again, thanks for participating in what I think has been a very insightful podcast.

RCVS Knowledge:

Thank you for listening to this podcast from Veterinary Evidence. We publish freely available content related to evidence-based veterinary medicine and its application to enhance the quality of patient care. Learn more at veterinaryevidence.org.

Our transcripts and closed captions are generated manually and automatically. Every effort has been made to transcribe accurately. The accuracy depends on the audio quality, topic, and speaker. If you require assistance, or something doesn’t seem quite right, please contact ebvm@rcvsknowledge.org

Further reading

Read Amelia’s Knowledge Summary on Veterinary Evidence

Read Oliver’s Knowledge Summary on Veterinary Evidence

Veterinary Evidence

Veterinary Evidence is an online, open access, peer-reviewed journal owned and published by RCVS Knowledge. It publishes content relating to evidence-based veterinary medicine (EBVM) and its application in veterinary practice to enhance the quality of care provided to patients.
