The introduction of federated search engines (e.g. MetaLib) seems to open up an opportunity for some kind of ‘automatic researcher’. I’m thinking of a piece of software that would do sequential searches on a variety of sources, and put together a ‘reading list’ of relevant references.
Just to describe how this might work:
The researcher puts together a list of keywords, and defines a starting point (e.g. a list of databases).
The federated search engine does a search on the databases specified by the researcher.
From the results, the software could compile a number of different search ‘facets’ and then continue searching on those facets. These facets could be, for example, subject words not specified in the original list, or author names.
Alternatively, it could do something like find all the papers which cite, or are cited by, a paper retrieved by the original search.
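To make the loop concrete, here is a minimal sketch of that ‘automatic researcher’ cycle in Python. Everything here is hypothetical: `federated_search()` is a stand-in for a real federated search API (MetaLib’s actual interface would look quite different), and the records are canned examples, but the shape of the loop – search, harvest new facets, search again – is the idea described above.

```python
def federated_search(terms, databases):
    # Hypothetical stub standing in for a real federated search call;
    # a real version would query the engine and parse result records.
    sample = [
        {"title": "Paper A", "subjects": {"metadata", "searching"},
         "authors": {"Smith"}},
        {"title": "Paper B", "subjects": {"searching", "ranking"},
         "authors": {"Jones"}},
    ]
    # Return records matching any of the current search terms.
    return [r for r in sample if terms & (r["subjects"] | r["authors"])]

def automatic_researcher(keywords, databases, rounds=2):
    seen_terms = set(keywords)   # terms we have already searched on
    reading_list = {}            # title -> record, deduplicated
    terms = set(keywords)
    for _ in range(rounds):
        new_terms = set()
        for record in federated_search(terms, databases):
            reading_list[record["title"]] = record
            # Facets: subject words and author names not yet searched.
            for facet in record["subjects"] | record["authors"]:
                if facet not in seen_terms:
                    new_terms.add(facet)
        seen_terms |= new_terms
        if not new_terms:        # nothing new to chase; stop early
            break
        terms = new_terms
    return list(reading_list.values())

results = automatic_researcher({"searching"}, ["db1"])
```

Capping the number of rounds (or the number of facets followed per round) matters in practice, since each record can spawn several new searches and the process can fan out very quickly.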
The effectiveness of this kind of functionality would depend on the databases available for cross-searching, how effectively the results can be ‘relevance’ ranked, and how much structure there is in the retrieved records (the more context available for the retrieved records, the better, I guess).
In combination with a link-server (e.g. SFX) and the local library catalogue, you could even see this prioritising the material that is easily available to the researcher…
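That prioritisation step could be as simple as a sort on availability. The sketch below assumes a hypothetical `locally_available()` check; a real version would query the link-server (e.g. SFX via OpenURL) or the catalogue with each record’s metadata, rather than looking titles up in a set.

```python
def locally_available(record, holdings):
    # Hypothetical stand-in for a link-server / catalogue lookup:
    # here "available" just means the title is in a holdings set.
    return record["title"] in holdings

def prioritise(reading_list, holdings):
    # Stable sort: locally held items first, everything else after,
    # preserving the original (e.g. relevance-ranked) order within
    # each group.
    return sorted(reading_list,
                  key=lambda r: not locally_available(r, holdings))

papers = [{"title": "Paper A"}, {"title": "Paper B"}]
ordered = prioritise(papers, holdings={"Paper B"})
```

Because the sort is stable, this layers availability on top of whatever relevance ranking the federated search engine already produced, rather than replacing it.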
I think all the pieces are actually already in place for this, but the functionality isn’t quite there yet. I wonder if anyone would be interested in funding a bit of research in this area… – a couple of months’ work with a federated search engine supplier should really be enough to get this up and running.
(I’ve used the Ex Libris products as examples here, just because these are the ones that I am familiar with, so I can kind of see how it could work using them. I’m sure similar products from other vendors could do the same kind of thing.)