This week I encountered two very different training scenarios which once again reinforced the variety of skills and experience needed to do this job. Both training sessions occurred on the same day, which probably made the juxtaposition more evident in my mind but I thought it would be a good topic to reflect on in this blog.
The first session was a refresher for the Orthopaedics registrar who had come in to see me last week about his dissertation. His topic was treating Achilles tendon injuries with various non-surgical immobilisation strategies, such as bandaging or binding, and in the last session we had done a combined title-MeSH search using surgical and non-surgical terms. Now he wanted to focus on the non-surgical side and try searching all fields (.af) instead of just the title. I suspected this would increase the number of articles retrieved, but that they might not be as relevant. We built a good search though, and in this session I really took a back seat and let him get on with it, making suggestions and giving encouragement when he needed them. He was rather disappointed by the large number of results, especially when we re-ran the search in Embase, CINAHL and AMED, and he is going to try title and abstract next, rather than all fields. It was interesting that he never actually looked at his results, just at the number of results, and I suspect he felt a bit overwhelmed by it all. I don’t think anyone could fault his search strategy, however, and it really brought home to me just how individual every search is and the importance of trying different things. It was also reassuring that he had picked up what I had shown him last week and seemed relatively comfortable with using the databases and combining terms. To be honest I don’t think I really needed to be there, but I think he wanted some reassurance that he was doing it ‘right’. The session lasted just under an hour and a half.
My second session took place in the afternoon with a staff nurse who confessed that she was not comfortable with computers and instructed me to talk to her like she was ‘stupid’. She had the additional problem that she was suffering from vertigo and had difficulty concentrating, but she had an essay to write by the beginning of January and had not written anything academic for over eight years! Our search was on the treatment of acne in adolescents, which was fairly straightforward, but I found myself having to explain things a few times over, and I could tell by the middle of the session that she was starting to flag. The session lasted just under an hour.
I have been trying to get the trainees to do more themselves recently, rather than me showing them how to do things, and am finding it incredibly frustrating! I am so familiar with the HDAS interface and use it all the time, so watching someone try to use it for the first time and click all the wrong things is very difficult for me. It’s very tricky trying to strike the balance between keeping the search relevant to them (e.g. choosing a topic that they want to search) while making sure all the basics of searching effectively are covered (e.g. truncation, phrase searching, MeSH). If a user wants to find articles on nursing and mentoring, they naturally want to type “nursing and mentoring” in the search box (like Google!), not “(nurs*.ti OR exp NURSING) AND (mentor*.ti OR exp MENTORING)”. It’s not logical to them, even though it’s perfectly logical to me. This is something I must overcome, as it is only by doing that trainees truly learn. Perhaps I should start the session by getting them to put in the terms they would naturally use to search, then go from there, demonstrating through trial and error that truncation, phrase searching and MeSH terms yield more results, and more relevant results (hopefully!).
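One way to show trainees that gap between a “Google-style” phrase and a structured strategy might be to make the mapping explicit. Here is a minimal sketch of that mapping as a toy script; the concept-to-MeSH table and the `structured_query` helper are purely illustrative inventions of mine, not anything in HDAS, and the truncation is deliberately crude:

```python
# Toy illustration: turning a user's natural-language concepts into an
# Ovid/HDAS-style structured search line. The MeSH lookup table below is
# a hypothetical two-entry example, not a real thesaurus.
MESH_MAP = {
    "nursing": "exp NURSING",
    "mentoring": "exp MENTORING",
}

def structured_query(concepts):
    """Pair each concept's truncated title term with its MeSH heading
    using OR, then join the concept groups with AND."""
    groups = []
    for concept in concepts:
        # Crude stemming for this toy example: strip trailing i/n/g
        # characters, then add truncation and a title field tag.
        free_text = concept.rstrip("ing") + "*.ti"
        mesh = MESH_MAP.get(concept)
        group = f"({free_text} OR {mesh})" if mesh else free_text
        groups.append(group)
    return " AND ".join(groups)

print(structured_query(["nursing", "mentoring"]))
# -> (nurs*.ti OR exp NURSING) AND (mentor*.ti OR exp MENTORING)
```

Starting from the trainee’s own words and printing the structured equivalent side by side might make the “trial and error” demonstration more concrete.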
Both sessions were very interesting in very different ways, and I found myself using different skills in each one. The first was rigorous on an intellectual level, while the second tested my teaching skills and patience. In terms of what was learned, I think the first session was probably more successful, but I am hoping the nurse took something away which she will be able to put to use later. We did at least find some relevant articles for her essay, so that was a bonus!
On Friday I hosted what I am hoping will become a regular event in the library for the next six weeks or so: a Point-of-Care Tools “Face-off”, which gave participants the opportunity to test out three leading point-of-care tools currently on the market: Up-to-Date, Best Practice and Dynamed. We already subscribe to Up-to-Date through the Trust, but Best Practice (from the BMJ) and Dynamed (an Ebsco product) we have on a trial basis through the London Deanery until 31 December 2010. Basically, these tools present a medical condition and walk the clinician through it, from initial diagnosis through to treatment and follow-up, all backed up with the latest evidence. A clinician’s dream, you might think (!), but uptake of these tools has been fairly slow, and I really wanted to find out whether they have any practical use by getting those in the real world of medicine to try them out.
I advertised the Face-off to just the medical students and FY1s/FY2s this time around, liaising with the medical students’ coordinator and emailing all of them, following up with a reminder this week. There wasn’t much interest, unfortunately, and I’m not sure whether this is because they are too busy or just genuinely not interested. I tried to use hooks in the email to get their attention (e.g. “good medical exam resource”) and even offered food, but in the end only two medical students came along, plus my manager, the library assistant and a student radiologist. However, they were very enthusiastic and gave some very helpful feedback. I kept the session very simple and informal (more of a drop-in than an actual session): they chose one condition, tried out each tool by searching on that one topic, and filled out an evaluation for each. Unfortunately we lost access to Dynamed in the afternoon (it had been working fine in the morning), so the session only trialled Up-to-Date and Best Practice. I wrote up their votes and comments on the whiteboard, and so far Up-to-Date has 3 votes and Best Practice has 2. Interestingly, the clinicians both voted for Best Practice.
I am hoping to run these Face-offs every Friday now until mid-December and advertise them to the wider Trust. I now have all the materials set up, and it will just be a case of topping up the refreshments every other week or so. I will be sure to record what happens on this blog!
A blog reflecting on professional life as a medical librarian / information skills trainer. Topics include information literacy, training, medical/health librarianship, the role of libraries in the internet age, Web 2.0,... If your eyes are glazing over already, go no further gentle reader...
Sunday, 7 November 2010
Monday, 21 June 2010
"A medical degree is not required for this job...but it wouldn't hurt!"
I had a very interesting clinical query last week which took me completely out of my comfort zone and made me think afterwards, “I must blog about this!” We had a call from a registrar on the wards on Tuesday asking if we could track down the lowest ever recorded sodium level (in a human) in the literature. He had a patient with a sodium level of 100 and he wanted to see if any lower had been recorded. “Well, that shouldn’t be too difficult,” thought I, but having sat myself down in front of Medline, it took me ages to work out how to approach the question in a standard database-query sort of way. I started out very basically, entering “low sodium”, “lowest sodium” or “sodium requirements” as a title search, but just got loads of articles about dietary sodium levels and things to do with animals. I then branched out to a thesaurus search for SODIUM and HUMANS and “low*”, but again, too broad. I had to take a break then to do a one-to-one session with an OT, but I asked the library assistant and senior library assistant to have a think as well.
An hour or so later I was back on the case. The library assistant had found a wonderful site on medical world records, but unfortunately sodium levels were not exciting enough to merit an entry. The senior library assistant suggested I consult some books on fluids and electrolytes, which proved to be an excellent move, because from these books (“Fluids and Electrolytes Made Incredibly Easy” and “Fluids and Electrolytes: A 2-in-1 Reference for Nurses”) I discovered that a low sodium (or serum sodium) level is known medically as hyponatremia and is measured in mEq/L. A serum sodium level of 100 is dangerously low (low is generally considered to be between 120 and 135), which made me hope our registrar was treating this patient and not sitting around waiting for my answer!
I returned to my Medline search and searched hyponatremia in both the title and the thesaurus, combined these searches with OR, then searched “100 mEq/L” OR “less than 100 mEq/L” and combined these with the hyponatremia search. I got 6 results, which were not bad but still not quite what I was after. I worked down from 100 mEq/L to 90, then 80, 70 and 60, but that didn’t work very well, so I then tried combining the hyponatremia searches with “severe” and “serum sodium level” which, while not giving me the answer, helped me gain a better sense of what I was looking for. In the end I found that the combination of hyponatremia (title and thesaurus) AND “serum sodium”.ti,ab AND “mEq/L” AND severe.ti,ab gave me a good range of low sodium levels, both case studies and research on groups of patients, and I could be reasonably sure that the lowest serum sodium level ever recorded in the literature is 99 mEq/L.
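Written out as a line-by-line strategy, the final search looked something like this. This is a rough reconstruction in Ovid/HDAS-style syntax; the line numbering and exact field tags are my approximation of the session described, not a saved search:

```
1. hyponatremia.ti
2. exp HYPONATREMIA
3. 1 OR 2
4. "serum sodium".ti,ab
5. "mEq/L".af
6. severe.ti,ab
7. 3 AND 4 AND 5 AND 6
```

Lines 1-3 capture the condition (free text OR subject heading), while lines 4-6 pin the results to the specific measurement being reported.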
It was a very interesting search, not least because of the way it illustrated the different ways humans and computers “think”. Funnily enough, the library assistant told her friend about the search a few days later and he promptly typed “lowest sodium level recorded” into Google and came up with a relevant article mentioning 99 mEq/L as the lowest ever, but without having done the search myself on a database I couldn’t have been certain that this was the right answer. I certainly now know more about sodium levels than I ever wanted to know!
Labels:
electrolytes,
hyponatremia,
literature searches,
low sodium,
Medline,
serum sodium