
Microsoft Academic Search

With scholarly communications on the brain, I was thrilled to learn about Microsoft Academic Search (MAS).  Building on the work of resources like Google Scholar, which gathers and indexes journal articles, and Scopus and Web of Science, which track how and where articles are cited by other researchers, Microsoft Academic Search brings these two functions together in a *free* search.  MAS also enhances the information with direct comparisons of the scholarly output of organizations and departments.  Some features, like the Call for Papers (CFP) Calendar, are very much in beta but could be an overwhelmingly useful tool for academics managing the many, many CFPs and conference submission deadlines.

To better understand how the resources compare, I took a sample author and compared the citation results in MAS and Scopus; images of my search results appear in the slideshow below.  The author pages in MAS and Scopus both listed 4 publications, suggesting that perhaps this information was correct.  Yet, when I looked at the 4 articles listed in each, I found that MAS had a duplicate entry for one article and, therefore, completely missed a publication.  Using the Help on MAS, I could see all of the publishers they are working with and identify that the missing article was from Wolters Kluwer/Lippincott Williams & Wilkins.  For the health sciences, nursing, and medicine, they are a pretty major publisher, so the absence of their content in MAS would be a significant hindrance.  At the same time, the openness with which Microsoft lists their sources made tracking this down easy.
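
As an aside for the data-minded: the duplicate MAS tripped over is exactly the kind of thing a quick script can flag.  Here's a minimal sketch, assuming you've exported a plain list of titles from each service; the sample titles and the 0.9 similarity threshold are my own hypothetical choices, not anything MAS or Scopus provides.

```python
# Minimal sketch: flag near-duplicate titles in an exported publication list.
# The titles below are hypothetical; a real list would come from a MAS or
# Scopus export.
from difflib import SequenceMatcher
import re

def normalize(title: str) -> str:
    """Lowercase and collapse punctuation/whitespace for comparison."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def find_duplicates(titles, threshold=0.9):
    """Return pairs of titles whose normalized similarity exceeds the threshold."""
    pairs = []
    for i in range(len(titles)):
        for j in range(i + 1, len(titles)):
            ratio = SequenceMatcher(None, normalize(titles[i]),
                                    normalize(titles[j])).ratio()
            if ratio >= threshold:
                pairs.append((titles[i], titles[j], round(ratio, 2)))
    return pairs

publications = [
    "Breast cancer disparities and decision-making among U.S. women",
    "Breast Cancer Disparities and Decision Making Among US Women",  # variant entry
    "Some unrelated nursing article",
    "Another unrelated public health article",
]

for a, b, score in find_duplicates(publications):
    print(f"Possible duplicate ({score}): {a!r} vs {b!r}")
```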

At the article level, Scopus shows the article “Breast cancer disparities and decision-making among U.S. women” cited 24 times since its publication in 2007 (although Scopus defaults to “Cited by since 1996”) and lists 84 references within the paper.  MAS lists only 60 references and 15 citing papers.  What happened to the rest?  This may be one of the drawbacks to the *free* service.  Looking a little closer at the details, it appears that Scopus is more complete, while MAS currently stops at about 2010 for its citations.  I’m guessing a similar limitation may explain the missing references.  While the resource appears to be in a very early beta phase, may need some data corrections, and still doesn’t seem to account for other measures of impact such as social media sharing, it looks like I have another new toy to share with faculty in the Fall….
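
If you wanted to test my hunch about the coverage cutoff, one rough approach is to tally the publication years of the citing papers from each service and look for the drop-off.  A sketch of that idea follows; every year count below is invented for illustration, since the real data would come from each service's cited-by list.

```python
# Rough sketch: tally the publication years of citing papers to spot where an
# index's coverage drops off. All year lists here are invented for
# illustration only.
from collections import Counter

scopus_citing_years = [2008, 2008, 2009, 2009, 2010, 2010, 2011, 2011, 2012, 2012]
mas_citing_years = [2008, 2008, 2009, 2009, 2010, 2010]  # nothing after 2010

def yearly_counts(years):
    """Count citing papers per year across the full observed range."""
    counts = Counter(years)
    return {y: counts.get(y, 0) for y in range(min(years), max(years) + 1)}

scopus = yearly_counts(scopus_citing_years)
mas = yearly_counts(mas_citing_years)

print("Year  Scopus  MAS")
for year in sorted(scopus):
    print(f"{year}  {scopus[year]:>6}  {mas.get(year, 0):>3}")
```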

[Slideshow: MAS and Scopus search results for the sample author]

For updates on MAS, you can stalk, er, follow them more directly at @MSFTAcademic.


Who’s running Wikipedia?

From a Twitter message on my TweetDeck to an eContent blog to CNET, I stumbled across an article debating the current state of Wikipedia.  At the community college level, Wikipedia has been the bane of many librarians’ existence.  While I think Wikipedia is a useful tool for students to get familiar with a topic or argument, develop a vocabulary for searching other resources like databases and library catalogs, and find links to credible sources, the general teaching philosophy remains to ignore and/or preach against Wikipedia as a research tool for students, since “everyone” can edit an entry.  Or so we think…

According to the Augmented Cognition Research Group at the Palo Alto Research Center, a hierarchy seems to be developing among Wikipedia editors, while the number of editors participating in the site appears to be plateauing.  For frequent contributors (the 1,000+/month group), new entries seem to be accepted willingly, while occasional users face significantly higher reversion rates for content they’ve provided.  While the findings thus far point to a stratified structure between the groups, I still have to wonder who these frequent contributors are and whether we should be trusting their judgment.  Unlike in other open source communities, Wikipedia’s filter appears to be based on frequency of postings, not on any educational or professional background or skills.
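
For clarity, the reversion rate being compared is just the share of a cohort's edits that get undone.  A toy calculation with invented counts (the real figures are in the group's study, not here):

```python
# Toy calculation of the metric the PARC group compares: the share of an
# editor cohort's contributions that get reverted. The counts are invented.
edit_stats = {
    "frequent (1,000+/month)": {"edits": 5000, "reverted": 100},
    "occasional": {"edits": 200, "reverted": 50},
}

for cohort, stats in edit_stats.items():
    rate = stats["reverted"] / stats["edits"]
    print(f"{cohort}: {rate:.1%} of edits reverted")
```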

Moving away from the open source concept, what future does Wikipedia have if an active group controls the participation of others?  Who are these people?  How are their edits affecting the quality of postings?  With more questions than answers at this point, I’ll be curious to see what other research the group can provide on the topic at their presentation at WikiSym2009.

Research impact factors – How can libraries get in on this?

While browsing my Twitter feed, I stumbled across an article in the Chronicle of Higher Education discussing the latest tool for analyzing the impact of a researcher’s publications in a discipline.  Instantaneous feedback from programs like Google Analytics has helped bloggers, website creators, and librarians better understand who is accessing and reading their sites, and where and when.  Among the journal publishers, Elsevier is now joining the impact-analysis party with its SciVal Spotlight tool.  In contrast to other analyzers like Google Scholar, Thomson Reuters’ citation indices, and Springer’s AuthorMapper (it’s free!), Elsevier attempts to review and categorize articles (instead of journals) into one of their 80,000 clusters, allowing for “a much more precise picture of influential work in emerging fields.”  In conjunction with the bibliometric analyzer, Elsevier also looks to be creating a SciVal Funding database to connect researchers with funding opportunities.

Now, as libraries face budget cuts and collection development demands, tools like these could be of great value in researching and ranking resources of interest to our institutions.  Our researchers would also benefit from access to such a resource for understanding other, related research in their fields.  However, this all assumes a level of accuracy in the cataloging of these articles.

Sadly, the early results leave me somewhat disappointed: the same problems that plague the keyword indexing of journals in databases, which remains anything but precise and consistent, continue in the citation indexers.  While I don’t have access to Elsevier’s edition, I did test some searches in Springer’s AuthorMapper.

[Image: AuthorMapper webpage, including search bar and Google Maps mashup with the locations of authors]

Although a variety of subjects are listed for browsing, I chose to conduct my own topic search on Dante.  What I found was that most of my results were not about Dante Alighieri, author of the Commedia, but about other Dantes, particularly authors in the sciences who had Dante somewhere in their names and were primarily based in South America.

[Image: map and list of keywords retrieved for "Dante" search results in AuthorMapper]

Like any good librarian, I decided perhaps the fault was my own for not refining my search phrase enough.  So, using his full name, Dante Alighieri, I was able to find many more results related to the specific Dante I was after (yes, the results under Mineralogy do relate to Dante Alighieri).  However, upon reviewing the various facets, I found that while 18 articles were tagged as Comparative Literature and Linguistics, 12 articles were still tagged as Medicine and Public Health.  While concepts of medieval science do arise throughout Dante’s writing, 12 seems a bit excessive.  Upon reviewing the journals list, Deutsche Zeitschrift für Chirurgie (Langenbeck’s Archives of Surgery) is the 4th most common journal on the list!

[Image: Google Maps mashup and keywords listing for AuthorMapper results for Dante Alighieri]
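
Those two facet counts make for a quick back-of-the-envelope precision check.  Treating the literature facet as relevant and the medicine facet as noise is my own rough labeling, not a judgment AuthorMapper makes:

```python
# Back-of-the-envelope precision check for the refined "Dante Alighieri"
# search, using only the two facet counts mentioned above.
comp_lit = 18   # tagged Comparative Literature and Linguistics
medicine = 12   # tagged Medicine and Public Health

precision = comp_lit / (comp_lit + medicine)
print(f"Relevant share across these two facets: {precision:.0%}")  # 60%
```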

These inaccuracies are evidence of just some of the problems bibliometric analyzers have in reviewing research, to say nothing of citation inflation by colleagues and friends.  Even misspelling Dante’s name as Dante Aligheri still returns a single search result that falls outside its proper retrieval space.
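
For what it's worth, the misspelled and correct surnames are only a single edit apart, which is how easily a typo in the underlying metadata can create a stray record.  A standard dynamic-programming Levenshtein distance shows this; nothing here is AuthorMapper-specific:

```python
# Standard dynamic-programming Levenshtein edit distance. It just shows how
# close the misspelled surname sits to the correct spelling.
def levenshtein(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

print(levenshtein("Alighieri", "Aligheri"))  # 1 -- a single dropped letter
```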

So, in short, while these tools may be helpful, information literacy will be key in deciphering their results because, as with the Internet at large, validity is not guaranteed.