Putting Primo to the Test (Against Google Scholar)

My university’s library recently acquired access to Primo Discovery Services and made it the default search on the library’s web page. So I decided to give it a spin and run a side-by-side test against Google Scholar (my usual research avenue, mostly because it spares me from searching a dozen different databases).

I tested the topic of play for literacy learning, using the following searches:

  • “play theory” and literacy
  • play-literacy

Both searches yielded relevant results in Google Scholar. Primo was a different story.

My top results in Primo for “play theory” and literacy (simple search, sorted by relevance, limited to peer-reviewed)


My top results in Google Scholar for “play theory” and literacy (using library links)


As you can probably tell, relevance was clearly better in Google Scholar, though only two of the Google Scholar results are available at my university’s library.

I had better luck with the play-literacy search.

My top results for play-literacy in Primo (simple search, sorted by relevance, limited to peer-reviewed)


My top results for play-literacy in Google Scholar (using library links)


The top results from the two tools for play-literacy shared only one article in common, although three of the articles from the Google Scholar results are available at my library. Interestingly, my Primo results did include an adequate number of relevant articles; I just had to dig deeper to find them. For example, drilling down into specific databases was more fruitful than cross-searching (which rather defeats the purpose of a discovery tool).

Overall, Google Scholar beat Primo for this particular search. And I’ve had similar frustrations with Primo for other research topics. It could be that my library simply lacks coverage of the areas of research I’m interested in (true to some extent). Or it could be due to vastly different ranking algorithms (Google Scholar weighs citation counts heavily). Whatever it is, I’m sticking with Google Scholar for the time being.

This also makes me wonder about the value of discovery search tools. Who are they good for? From a library skills development point of view, I think caution should be taken when promoting them to beginning researchers. More critically literate researchers may find them useful, but most likely only those who are unsure where to start or who work in multi-disciplinary fields. Seasoned researchers who know precisely what they need will likely feel the same frustration I do when results are not precise enough (and when other databases simply work better for them).

Why Discovery Tools Are a Bad Idea for Beginning Researchers

The other day, I came across a pre-print article for C&RL that examined the search effectiveness of different discovery tools, including EBSCO Discovery Service, Summon, and Google Scholar. Even though EBSCO came out on top, the authors noted that students were generally incapable of evaluating sources and relied heavily on default settings. That doesn’t surprise me.

I’m all for one-stop shopping in research. I do it myself. Google Scholar with library links is much less cumbersome than individually searching the dozen or so databases that cover ed tech and library literature. But I’m an information expert, a master of search strategies, and highly capable of analyzing my results. I also know when to go into an individual database for further research. And that’s the problem with discovery tools – they require a higher level of skill than novice researchers possess.

There are four roadblocks that make discovery tools a bad idea for beginning researchers:

  1. Poor search strategies. We all know how bad students are at developing strong search strategies. It takes practice, and they are accustomed to the ease of Google searching. Until discovery tools perfect semantic searching (and they haven’t), strong search strategies are key to finding the most appropriate information. Beginning researchers are still developing those skills.
  2. Information overload. I’ve talked about this before. Too many results lead to cognitive overload. This is particularly problematic for novice researchers (they’re overwhelmed). It’s one reason why students tend to select sources from the first page of results.
  3. Reflective judgment. I posted about this last week. There are seven stages of reflective judgment. Beginning researchers are typically in stage 3 when they enter college. This impacts both critical thinking and information literacy. The C&RL study found that students tended to place their trust in the authority of the search tools because of uncertainty in evaluating sources. That should be expected of a student who is still in stage 3 of reflective judgment.
  4. Reading comprehension. I don’t think this is talked about enough. Reading is the basis of all learning, and reading comprehension affects the entire research and writing process. According to a 2006 study by the American Institutes for Research, only 38% of 4-year college students graduate with proficient levels of literacy (23% for 2-year college students). Proficient literacy is defined as “reading lengthy, complex, abstract prose texts as well as synthesizing information and making complex inferences.” Proficient literacy is essential for information literacy. Read the full report here.

So what’s the best one-stop shopping alternative to discovery tools? For beginning researchers, I prefer Credo Reference or any of the Gale databases (e.g. Student Resources in Context, Opposing Viewpoints). And I’m betting that most librarians out there already know this. In my experience, these are also the databases that students tend to return to once they discover them. Why? Because they are well designed and just right for the developmental level of beginning researchers. They are also topic-driven, and therefore much more relevant.

With so many databases out there, knowing where to start is a challenge. Discovery tools are an attempt to remedy this. But they’re a bad idea for beginning researchers for all the reasons above. I think it’s a much better idea to encourage faculty to scaffold (limited) database suggestions into the design of their research assignments.