Brown University Library

Search relevancy tests

We are creating a set of relevancy tests for the library’s Blacklight implementation.  These tests use predetermined phrases to search Solr, Blacklight’s backend, mimicking the results a user would retrieve.  This provides useful data that can be systematically analyzed.  We use the results of these tests to verify that users will get the results we, as application managers and librarians, expect.  It will also help us protect against regressions — new, unexpected problems — when we make changes over time to the Solr indexing schema or term weighting.
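The core of such a test is simple: run a predetermined phrase against Solr and assert that an expected record appears near the top of the results.  The sketch below shows that check in isolation, with hypothetical record identifiers; in the real tests the result list comes from a live Solr query.

```python
# Sketch of a single relevancy check: given the ids returned by a Solr
# search, verify that an expected record appears in the top N results.
# The ids below are hypothetical placeholders.

def expected_in_top_n(expected_id, result_ids, n=3):
    """Return True if expected_id is among the first n result ids."""
    return expected_id in result_ids[:n]

# Simulated result ids (in practice, parsed from a Solr response).
results = ["b1234567", "b7654321", "b1111111"]

assert expected_in_top_n("b1234567", results)      # expected hit on top: pass
assert not expected_in_top_n("b9999999", results)  # expected hit missing: a regression
```

Running many such checks after each schema or weighting change is what makes regressions visible before users see them.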

This work is heavily influenced by colleagues at Stanford, who have both written about their (much more thorough, at this point) relevancy tests and developed a Ruby gem to assist others with doing similar work.

We are still working to identify common and troublesome searches, but we have already seen benefits of this approach and used it to identify (and resolve) deficiencies in title weighting and searching by common identifiers, among other issues.  Our test code and test searches are available on GitHub for others to use as an example or to fork and apply to their own projects.

Brown library staff who have examples of searches not producing expected results, please pass them on to Jeanette Norris or Ted Lawless.

— Jeanette Norris and Ted Lawless

Best bets for library search

The library has added “best bets” to the new easySearch tool.  Best bets are commonly searched-for library resources; examples include JSTOR, PubMed, and Web of Science.  Searches for these phrases (as well as known alternate names and misspellings) will return a best bet highlighted at the top of the search results.
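The matching idea can be sketched as a lookup from normalized query phrases — including alternate names and misspellings — to a resource.  The resources and variant spellings below are illustrative; in production the matching is done against a separate Solr core, not an in-memory table.

```python
# Sketch of best-bet phrase matching, assuming a simple in-memory lookup.
# The variant spellings here are illustrative examples only.

BEST_BETS = {
    "jstor": "JSTOR",
    "pubmed": "PubMed",
    "pub med": "PubMed",          # known alternate form
    "web of science": "Web of Science",
}

def best_bet_for(query):
    """Return the matching best-bet resource for a query, or None."""
    return BEST_BETS.get(query.strip().lower())

assert best_bet_for("PubMed") == "PubMed"
assert best_bet_for("pub med") == "PubMed"       # alternate form still matches
assert best_bet_for("quantum physics") is None   # no best bet for this query
```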

[Screenshot: a best bet highlighted at the top of easySearch results]

To get started, 64 resources have been selected as best bets and are available now via easySearch.  As we would like to know how useful this feature is, please leave us feedback.

Thanks to colleagues at North Carolina State University for leading the way in adding best bets to library search and writing about their efforts.

Technical details

Library staff analyzed search logs to find commonly used search terms and matched those terms to appropriate resources.  The name, URL, and description for each resource are entered into a shared Google Spreadsheet.  A script runs regularly to convert the spreadsheet data into Solr documents and posts the updates to a separate Solr core.  The Blacklight application searches for best bet matches when users enter a search into the default search box.

Since the library maintains a database of e-resources, in many cases only the identifier for a resource is needed to populate the best bets index.  The indexing script retrieves the resource from the database and uses that information to create the best bet.  This avoids maintaining data about the same resources in multiple places.
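The conversion step can be sketched as mapping each spreadsheet row to a Solr document.  The column and field names below are assumptions for illustration; the real script also fills in details from the e-resource database when only an identifier is given.

```python
# Sketch of converting spreadsheet rows into Solr documents.
# Column and Solr field names here are hypothetical.

def row_to_solr_doc(row):
    """Map one spreadsheet row (a dict) to a Solr document dict."""
    return {
        "id": row["id"],
        "name": row["name"],
        "url": row["url"],
        "description": row.get("description", ""),  # optional column
    }

row = {"id": "bb-001", "name": "JSTOR", "url": "http://www.jstor.org"}
doc = row_to_solr_doc(row)
assert doc["name"] == "JSTOR"
assert doc["description"] == ""  # missing column defaults to empty
```

The resulting documents would then be posted to the best-bets Solr core’s update handler on each scheduled run.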

Announcing a Researchers @ Brown data service

Campus developers might want to use data from Researchers@Brown (R@B) in other websites. The R@B team has developed a JSON web service that allows for this.  We think it will satisfy many uses on campus. Please give it a try and send feedback to researchers@brown.edu.

Main types/resources

  • faculty
  • organizational units (departments, centers, programs, institutes, etc)
  • research topics

Requesting data

To request data, begin with an identifier.  Let’s use Prof. Diane Lipscombe as an example:

/services/data/v1/faculty/dlipscom

Looking through the response you will notice affiliations and topics from Prof. Lipscombe’s profile.  You can make additional requests for information about those types by following the “more” link in the response.

/services/data/v1/ou/org-brown-univ-dept56/

Following the affiliations links from a faculty data profile will return information about the Department of Neuroscience, of which Prof. Lipscombe is a member.

/services/data/v1/topic/n49615/

Looking up this topic will return more information about the research topic “molecular biology”, including other faculty who have identified this as a research interest.
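The walkthrough above amounts to following “more” links from one response to the next.  The sketch below shows that traversal step in isolation, using the department URL from the example; the exact shape of the JSON response is an assumption, so check a live response before relying on it.

```python
# Sketch of walking the R@B data service by collecting "more" lookup
# links from a response.  The response structure below is an assumed
# example, not the documented schema.

def more_links(items):
    """Collect the 'more' lookup URLs from a list of related items."""
    return [item["more"] for item in items if "more" in item]

# Simulated fragment of a faculty response.
faculty = {
    "affiliations": [
        {"name": "Department of Neuroscience",
         "more": "/services/data/v1/ou/org-brown-univ-dept56/"},
    ],
}

assert more_links(faculty["affiliations"]) == [
    "/services/data/v1/ou/org-brown-univ-dept56/"
]
```

Each collected URL can then be requested in turn to retrieve the related organization or topic record.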

Responses

Faculty

  • first name
  • last name
  • middle
  • title
  • Brown email
  • url (R@B)
  • thumbnail
  • image – original image uploaded
  • affiliations – list with lookups
  • overview – this is HTML and may contain links or other formatting
  • topics – list with lookups

Organizations

  • name
  • image (if available)
  • url (to R@B)
  • affiliations – list with lookups

Topics

  • name
  • url (to R@B)
  • faculty – list with lookups

Technical Details

  • Requests are cached for 18 hours.
  • CORS support for embedding in other sites with JavaScript
  • JSONP for use in browsers that don’t support CORS.

Example implementation

As a starting point, we have prepared an example of using the R@B data service with JavaScript and the React framework.