Review of College Scholarship Databases – First Cut

February 3, 2012

I wondered what kinds of financial aid might be out there, for people interested in undergraduate and graduate education.  As I started into this inquiry, it became clear that I was dealing with a big, rangy issue.  The approach I have sometimes taken with big, rangy issues has been to go through them at length, first time around, and then just refer back to that first analysis in my later attempts.  That approach has resulted in more streamlined treatments when the issue has come back up, months or years later.  Sometimes the long writeups have proved to be useful, not only for people who want the information I am seeking, but also for webmasters and others who appreciate a user walk-through of their site.

Filling Out the Forms
First Comparison of Search Results
First Look at Individual Scholarships
Data Analysis

* * * * * * *


I started with a search for websites oriented toward financial aid.  This search produced a lot of results.  I ran a different search, looking this time for scholarship databases or search engines, and got a more targeted list.  That search, and subsequent digging, led to some impressive and diverse lists of scholarship databases, such as those offered by the UCLA Scholarship Resource Center, Financial Aid Finder, Central Maine Community College, Kern Valley High School, the SallieMae Fund, and the Philadelphia Foundation.  Collectively, these sites offered links to dozens of sources of information, leading to thousands of scholarships.

In this investigation, I did not try to come up with a list of the many other sites (e.g., Education Corner, Admission Hook) that did not provide a database that would yield results in response to a profile, but did provide writeups, sorted lists, advice, and other relevant information.  Nor did I try to examine all of the general and special-purpose scholarship databases, lists, and other sources named in all of the foregoing lists and databases — never mind the thousands of books and software packages that I could buy (or perhaps take out of the library) that might help me with a scholarship search.  Nor did I even begin to sample the possibilities that could arise from free-form searches on Google or other general-purpose or customized search engines.  Indeed, my own search (above) could easily be modified to focus on specific needs and/or areas of interest.

Like most people, I had limited time and patience for this sort of thing.  So I started by going for those scholarship databases that appeared to be the most comprehensive and/or best-known.  After the first couple of writeups (below), I started following my nose, developing my approach to this inquiry on a free-form basis, shaped by emerging information.  That is, as I worked through various lists, I came across other information that influenced both the list of lists provided in the first paragraph (above) and the individual databases that I decided to scrutinize (below).

By the time I was done, it appeared that most competently run scholarship funds that were handing out respectable amounts of money would tend to be aware of at least some of these sites, and would list their offerings accordingly.  In other words, it seemed unlikely that I would have overlooked many major scholarship sources.  That’s not to say the world of scholarships is organized and logical.  It’s still a sprawling mess, doubtless disappointing atypical students and funders alike.  It surely intimidates many thousands of students out of giving it a good shot — because they think they are not the right type or would otherwise not qualify, or because they don’t like or understand the process, or because they focus on the wrong sources.  This area of higher education funding, like others, could stand some improvement.

This post’s discussion proceeds in two steps.  First, I identify some major scholarship websites, and describe my experience as I went through the process of filling out the profiles and providing the other information necessary to produce lists of scholarships.  This part of the discussion has to do with procedural nuts and bolts — with how the site felt and, roughly, what it produced.  Procedure is important.  As just noted, it can have a significant effect upon a site’s usefulness and ability to achieve its mission.  That’s the first set of bullet points listed below.  Second, I return to those websites and describe the substance of what I found there.  In that second section of bullet points, I mostly ignore the mechanics and focus instead on the results.

Filling Out the Forms

Probably like most students, I did not begin with clear guidance on which scholarship websites might be most helpful.  It would have been useful to have, say, a federal report (or even a scholarly review) stating (for each site) the numbers of scholarships listed, the numbers of students who obtained scholarships through each of these sites, the amounts of scholarship monies obtained altogether and per person through such sites, the amounts of money or other commercial connections between these sites and other entities (e.g., student lenders, universities), and so forth.  Such information could have been very helpful in focusing my attention, like that of other potential students, on appropriate sites.

I could have done research into the scholarly literature.  I decided not to, for three reasons.  First, I had limited time, and good research can take a while.  Second, as this project developed, I decided that I wanted to provide (for the benefit of people who design and recommend scholarship search sites) some indication of what a student might experience as s/he goes through the process, as distinct from a comprehensive, definitive, and soon outdated statement of which sites are the most important.  In other words, this post is very much about the process as well as the outcome.  Third, what I have done here actually *is* research, albeit of an exploratory nature.  Focused, precise research is most feasible when the territory has already been mapped out with some specificity, whereas I was not encountering indications that this frontier had yet been entirely settled.

As I worked through the various databases, I did not attempt to enter data about myself in a rigorously consistent manner.  In part, it would have been impossible to do so, since different databases asked different questions, and also asked similar questions in different ways.  For instance, some just threw a list of organizations at a student, without providing much of an explanation of what should guide his/her selection among them, while others explained to varying degrees that what was desired was, say, an indication of whether the student or a member of his/her family (presently?  in the recent past?  ever?) belonged to any of them.  Such variations probably had some effect:  I might have gotten 10 or 20 more hits on one site, or 10 or 20 fewer on another, if I had answered the questions differently.  In a few cases, as largely captured in the following comments, the website’s design or questions were such as to inspire significantly more expansive or taciturn responses.  It also might have made a difference if I had approached the websites in a different order, applying my learning from one to the next, as I came to have a hopefully clearer sense of what they wanted, or about what I could say about myself.

At any rate, with these caveats and limitations, the major scholarship websites for which I filled out information, and my reactions as I did so, were as follows, taking the sites in the order in which I viewed them:

  • SchoolSoup.  The scholarship search on this site made me answer tons of questions, as I expected.  I decided to err on the liberal side, checking all sorts of things that might be related to what I wanted to do.  This resulted in a list of more than 500 scholarships.  (Greater restraint on my part was doubtless partly responsible for the far smaller numbers of hits found in other databases, below.)  There were no global options to eliminate numerous items (e.g., those whose deadlines had already passed) with a single click.  There were not even checkboxes to mark those that I wanted to remove or archive.  As a third-best, I went down the list and right-clicked on those with expired deadlines, so as to open separate webpages for each.  Then I closed those webpages.  This changed the expired ones from “Unread” to “Read,” but did not remove them from the list.  Even after I left the site and came back later, they still had that same list of 518 items.  Eventually I realized that I could click on Step 1 or Step 2 or Step 3, at the top of the webpage, and modify my search criteria.  But this wouldn’t show me how many items I would be adding to or removing from the list by making a change.  It would have been better if I had been able to see, somewhere, an indication that 187 of the items in the list were there because I had clicked “Administration & Leadership” as an area of career interest, or that 28 items would drop off the list if I changed my GPA from 3.9 to 3.8 — or whatever.  I did uncheck the Administration & Leadership item and then went back to the top of the page and clicked View Scholarships but, no, that hadn’t made any difference:  I still had 518 items.  But if I unchecked Administration & Leadership and then clicked Next … Next … Next to get through Steps 2 and 3, I could discover that I had now reduced the list to — you guessed it — 517 items.  Woo hoo!  I could spend all day at this.  
I liked having 517 items, but I didn’t like the very limited options to see what was going on in this list.  They displayed them 20 at a time, so I would have to go through 26 pages of listings.  For many of their entries, they did not display deadlines.  They also forced me to look at an awful lot of ads for their featured colleges, and almost always just for Grand Canyon University, which — pardon me, but when you say a word over and over again, soon it loses its meaning and becomes ridiculous.  The more problems those ads caused me — with my password manager, with my browser’s Back button — the more I came to hate this site.  They didn’t even offer a search box for scholarships that had already appeared on my results list!  If I wanted to take another look at one, I had to page down through all those other results.  It got worse when I returned to the website later.  I had made a list of the items I had encountered during my previous visit, but now some items on that list no longer appeared on the webpage, and others were added that had not been there previously.
  • FinAid.  As I recalled, this was the granddaddy on the list:  it had been in business back in the late 1990s, and maybe earlier.  They claimed that theirs was “the largest, most accurate and most frequently updated scholarship database.”  It developed that the super database being promoted by FinAid was not their own (or, at least, was not called the FinAid database).  It was, rather, the FastWeb database (below).  FinAid gave me a link to a list of other free scholarship search websites.  They also included links to a number of specialty lists, including especially prestigious undergraduate scholarships and graduate fellowships, and colleges that offered full-tuition scholarships.  Items on these lists did not necessarily turn up in my search results (below).
  • FastWeb.  In general, FastWeb was probably cited by more websites than any other scholarship database.  I was not sure, though, whether that popularity was due to performance or merely to longstanding habit.  Maybe a lot of people had just heard, or had long been telling others, that this was the go-to site.  Filling out their questionnaire had its stumbling blocks.  First, I wanted to help them pay for their free service by sharing my basic data with their advertisers, but I didn’t want to be getting junk calls on my cellphone.  But even if I unchecked the “share data” box, they still insisted on having a phone number, and they wouldn’t take 555-5555.  The number for the local Walmart worked, though, so I was eventually able to proceed past that, with apologies to the hapless clerk on the receiving end.  They also insisted that I enter a College of Interest, whereas I was thinking that it worked the other way around:  I wouldn’t know which college was most realistic until I saw how the dollars stacked up.  And would they give me different results if I entered Harvard instead of Oforget U, or a four-year or state college rather than a two-year or private one?  I couldn’t tell whether the thing was even suited for graduate students:  it asked for my ACT or SAT, but not my GRE.  I decided to try by entering several colleges of interest (of different kinds), and entering a combined SAT score that had the same percentile value as my GRE score.  They allowed a maximum of five colleges.  But now FastWeb figured out that I had already registered my email address on their site, long ago, and therefore they gave me search results without further ado.  When I went into my profile, though, I saw that what they were offering was related to the data that I had entered several years earlier.  Had they blown off everything that I had just entered?  
Now that I was logged in under my old ID, I tried using the browser’s Back button, to get back to the questionnaire, and then marched forward from there.  Now their website seemed to be defaulting to a username that wasn’t familiar to me.  Had they (or possibly LastPass) just invented that?  It didn’t matter — they were still giving me that same “Click here to login” option.  So, yes, I had just wasted all that time entering data because I had not thought to log in first, and because they hadn’t asked.  They weren’t recognizing the ID and password that I had saved in my records.  Either they had changed it, or I had.  Partway through this process, I had to walk away from it overnight, and when I got back to it, I saw that I was already getting spam from FastWeb.  I would have preferred to be asked what, if anything, I wanted to receive.  At least the unsubscribe option seemed to work.  But anyway, I did get logged in, and I went in to update my profile.  The “Possible Majors” question would offer choices, as before, if I just typed a word or two, but I didn’t recall seeing that question when I was filling out the questionnaire.  So it seemed that a newcomer would want to be sure to come back and edit the profile.  Elsewhere in the profile, I noticed that FastWeb had better Religion selections than SchoolSoup, but varied somewhat in the Personal Attributes section (e.g., no Homeless option, but did have a Trailer Park option).  I did like that FastWeb gave me a single display of my current profile, so I could save it in case I had to re-enter it or use that information for some other purpose (e.g., applications, résumé).  Now that I had entered all that stuff, I went to the Scholarships option > Scholarship Matches.  They showed me a list of 25 scholarships.  Not 525; just 25.  Definitely not the outcome I had expected.  Was I even in the same universe as SchoolSoup?
On the positive side, they did have checkboxes, so I could click and discard those whose deadlines had obviously passed.  Then again, that wasn’t even necessary.  A closer look revealed that they weren’t even bothering me with expired opportunities.  I had to admit, the longer I fooled with SchoolSoup, the more I wished it were like this one.
  • Cappex.  My early impressions of Cappex were positive.  They had a nice interface, though they also began spamming me immediately with offers from their ad supporters.  I opted out of that.  At first, I did not fill out the endless lists of sports, activities, and other items that were mostly geared toward high schoolers applying to colleges, not to graduate students like me.  So at this point, Cappex matched me with 26 scholarships.  Their list of scholarships had a “find more matches” option.  It also had an option for merit aid scholarships at specific schools I might be considering.  The way to find more matches was to fill out additional questions about my (or family members’) memberships in various organizations, employing companies, disabilities, sports, etc.  After filling out that additional information, I found that Cappex had unexpectedly reduced me from 26 to 19 scholarships.  I suppose they saved me some time that I would otherwise have spent digging through listings to find out that I wasn’t eligible.
  • FreschInfo.  I saw one or two references to this site.  It sounded different, so I was curious.  I found it had very limited data options (e.g., choose one religion, one ethnicity, one disability, one general-purpose major — with “social sciences” as close as I could get to, say, sociology).  Bizarrely, the form would allow only a single digit for GPA (I entered “3”).  The FAQs explained that the data options were dictated by the terms of the scholarships — implying that the Fresch database contained scholarships for people of Australian “ethnicity” but none for social work students.  My data, entered in less than one minute, produced 319 alleged matches, displayed according to the number of stars next to them.  The FAQs did not say what the stars indicated.  The FAQs page made clear that this database was one person’s labor of love (but see Scholarship Monkey, below).  She came across as a hardworking, well-meaning person.  I guessed that she must have identified a specific need that she could meet, one that the better-known databases would tend to overlook; but I was not sure what that need might be.  I felt bad for her, after reading her remarks about how people criticized her for not including other religions in her selection box, not realizing that this was because those were the only religions for which she had identified scholarships.  The general idea seemed to be that the user would enter just one fact combination (e.g., white, majoring in business), and s/he could run different searches if s/he wanted to try out other combinations.  There did not seem to be any way to save a search, to remove individual scholarships from the list, or otherwise fine-tune the potentially hundreds of hits that a search might produce.  I decided not to explore this database further.  One postscript:  somewhere along the line, I encountered an indication that many scholarship search websites draw upon the same core database, apparently paying for access to it.
It was not clear to me where that database was, or how I could find out which sites used it.  But it did seem that the Fresch site was not that kind of operation.
  • Princeton Review.  I could not find their scholarship database, if they had one.  There didn’t seem to be any such link on their main page or their graduate student page, and their search box produced 13 hits in response to a search for “scholarship” — and none of those 13 looked like a scholarship database.
  • Peterson’s.  Their scholarship search directed users toward the Cappex database.
  • Bill & Melinda Gates Foundation.  Focused on low-income and minority students.  That is great, but it is not within the scope of this look at general-purpose scholarship databases.
  •  Very limited data options (e.g., no questions about parents, organizations, activities; few religions in list).  Presumably their database had few such special-purpose scholarships.  Before I even entered any data about myself, they said they already had “57 results found.”  I thought maybe they had somehow detected me from a previous login that I had forgotten; but when I tried to log in, they said they did not have my email in their database.  Yet there didn’t seem to be a registration link so that I could register.  I eventually went back to the starting point and registered.  Once we got that sorted out, I saw that I was now up to “67 results found.”  Sadly, they did not have an option for graduate students, so my interest faded somewhat.  But I did go through the paces, listing myself as a returning adult student.  They had a good selection of majors to choose from, and the biggest list I’d seen yet of specific employers who apparently had some kind of scholarship deal for their employees.  They were also clearer than some as to exactly what they were asking:  instead of just shoving a list of veteran statuses (e.g., veteran) at me, they specified:  was I a dependent child of, say, a deceased veteran?  Oddly, to continue that example, they specified the Civil War (I didn’t know:  was I a descendant of a Civil War veteran?) and Grenada, but not Iraq or Afghanistan (except under the general rubric, “Middle East War,” perhaps bracing themselves against the possibility that we would soon be invading Iran).  After all this was done, I had bumped it up to a total of 71 results.  The list showed the name and amount of the award, and its closing date, but did not give me an option to archive or delete individual items on the list.
  •  The early data collection pages were organized sensibly and functioned well.  They offered checkboxes where checkboxes were appropriate, instead of requiring me to do a Ctrl-click to select multiple items (without being able to verify that my selections stuck).  Unfortunately, things went downhill as I continued onto later pages.  The college choices option was a little screwy, but it ultimately worked.  Although they seemed to be open for graduate students, there was (as with most others) no box for GRE scores, and they hadn’t asked whether my stated GPA was for graduate or undergrad work.  Their list of major field checkboxes was completely screwed up — more than half of the majors were hidden or missing (i.e., the alphabetical list jumped in several places, like from “Aviation” to “Environmental Science”).  I checked something, hoping that would clear up the mess.  It didn’t.  For some reason, they decided to build their ad for Le Cordon Bleu cooking schools right into the middle of the application, making me wonder if my struggles with the major field dialog had somehow persuaded them that I wanted to be a chef.  The way it was structured, I didn’t realize that their questions about my high school education were part of an ad for some technical institute until after I’d filled them out, making me a bit anxious that they or their partners might soon be spamming me.  I gave up on the majors dialog and just typed some majors into the box.  They didn’t have some of the majors that I had seen on other forms, which may have just meant that they were overwhelmingly oriented toward undergraduates.  Their list of affiliated corporations was quite short, with fast-food restaurant chains especially visible.
Their list of student organizations was absurd, with entries like “Farm Bureau” and “Screen Actors Guild.”  It appeared that they had munged their list of employers with their list of student organizations — and then that nobody caught it or, more likely, that a number of database users had pointed it out, but their part-time programmer had been in jail for the past several months, or something, and there was nobody else who could fix it.  In other words, I was starting to have some serious doubts about the quality of the information that I would get from their database.  There were so godawful many entries on their list of student organizations that I frankly got bored and gave up about halfway through.  Feeling obliged to put myself down for something, I checked the box next to the First Catholic Slovak Ladies Association and moved on.  They had the ads for the cooking school and the tech school again on that page as well.  The next page showed the colleges that I had listed, and then provided a list of “similar colleges.”  Some of these, such as Grand Valley State University, had been advertising themselves elsewhere, so I suspected this list was at least partially informed by advertising dollars rather than by any sincere calculation of what would interest me.  Finally, at long last, they did give me a well-formatted results page, with what appeared to be an option of making a given scholarship a favorite or removing it from the list.  There were about 36 items on the list (no precise number reported), and none were expired.  No doubt many scholarships in their database had expired; the point is just that they were not burdening me with irrelevant entries in that regard.
  • U.S. Department of Education.  They offered a Financial Aid and Scholarship Wizard, which turned out to actually be called their Scholarship Matching Assistant (as distinct from the nearly empty search box in which a user would apparently get best results by entering the name of a particular college or university that might or might not be offering a scholarship).  Getting started took a couple of unnecessary clicks before I found the spot, but once I was in, it felt pretty solid.  The only exception was that I got a Server Error, the first time I tried to fill out my National Merit status — which was itself more detail than other sites had requested.  (I didn’t actually have a National Merit status; I just wanted to see what the options were.)  The scroll boxes were too small to display the full text of the item being selected, forcing me to wait for the tooltip to pop up and tell me which item I was on the verge of selecting.  (On some lists, that became a time-consuming chore.)  Generally, the print on this website was unnecessarily tiny.  They asked me to enter all schools I had attended, as distinct from those I was going to attend, which seemed odd.  Their list of universities seemed to be missing a lot of schools:  the “University of ” section had only a couple dozen, leaving out tons of others — University of Arkansas, University of Colorado, University of Michigan, etc. — and when I did a search, I got “No results for University of Michigan.”  They had a hellacious list of college majors — probably every one you could imagine.  It could feel a little spooky to give so much detail to the federal government, but then I realized that an organization able to collect this much information on citizens’ private lives would surely be able to justify its continued existence to any future president, Democrat or Republican.  Consider it my service to my country, helping to keep the DoE on its feet.
I liked that I could clearly see what items were being saved in my profile as I went along, and could also get a detailed list of all those items from all categories on one page.  I noticed they didn’t have an entry for Lawyers, in their list of professions; did that mean there were no scholarships for people studying things related to law (other than Law Enforcement, of course)?  As with other websites, they offered the usual unbelievably long list of organizations with which one might be affiliated — and yet here, again, nothing for the New York or New Jersey bar associations.  And as I was writing these words, having just made an entry, the site logged me off due to inactivity.  Obviously a programming glitch there.  The webpage didn’t give me a re-login link, but the browser’s Back button worked, and my information was still there; at other times (it logged me off repeatedly), when I wasn’t thinking of the Back button, it took too many clicks through their menus to get back in.  It did these logoffs without warning.  Glitches aside, by the time I was done with this thing, I really felt that this was a step toward the answer:  that the U.S. DoE should handle, or contract out, the task of programming a user-friendly front end and maintaining an up-to-date national scholarship database, containing all the relevant information from students and funders.  Or at least there should be software or a website, compatible with numerous scholarship databases, in which the student could enter his/her data just once, in a comprehensive manner like the one afforded by the DoE, and then have it relayed to a user-selected set of such databases.  For better and for worse, there was nothing fly-by-night about this whole data entry ordeal, nor about my sense that this was the kind of site that was worth all this time and effort.
When I was done, it concluded that I was eligible for about 70 scholarships.  It didn’t provide a precise number, but it looked like there were about 25 per page, and almost three full pages of listed items.  They were laid out alphabetically, which was good.  Their page didn’t specify expiration dates and didn’t give me checkboxes or other options to archive, delete, or otherwise structure this list.
  • Sallie Mae.  Their scholarship questionnaire was well laid-out and organized.  They offered a more limited scope of possibilities (e.g., a maximum of five major fields), which was probably more appropriate for purposes of scholarship grantors, who would presumably want to give their money to someone who *really* loves rugby, as distinct from a dabbler.  After the DoE’s website, this felt less like an all-purpose personality inventory and more like a concise and limited-purpose informational tool.  Their list of major fields, to return to that particular example, seemed skimpy — for instance, nothing for educational research, educational psychology, organizational psychology, or research methods.  The conciseness seemed a bit incongruous in their “Special Circumstances” area, where I had to choose at most four conditions from their list, which included such possibilities as planning to attend a public college, experiencing domestic violence, or being tall.  The only real problem I had with their website was that, when I was trying to update my regular (non-scholarship) profile, they required me to select a college, yet there were no colleges to select from.  It wouldn’t save the rest of my data; I had to completely abandon the profile update.  This happened a couple of times, and it required some extra digging around to find a link that would let me get to the scholarship selector.  The process of filling out the data form was relatively quick.  They found 40 matches (by my count; no precise number stated).  Their results page put all the results in one list, which made it easier to view them.  They had checkboxes for selecting items that I wanted to discard or, alternately, label as Favorites.  They showed the deadlines, and none of the ones they showed me were past their deadlines.  They showed the amounts and the stars for user ratings (though it appeared nobody had actually rated any of these items — or perhaps the stars were blank until *I* rated them).
They also showed their estimate of Match Accuracy — that is, how close was I to being on target for this scholarship?  All told, this was the most user-friendly statement of results that I had seen from any of these scholarship databases.

At this point, I wondered whether I was getting close to having a good list of the leading scholarship databases.  I decided to run some simplistic Google searches for the numbers of hits associated with these databases.  I realized this would be a coarse indication — that, indeed, some such searches might be incorrectly designed.  Moreover, popularity would hardly equal usefulness, in this market where data on outcomes were scarce and most of the competitors seemed too new to have allowed guidance counselors to build up a store of experience.  But I had not encountered other clear guidance.  A rough estimate seemed better than nothing.  So in those simplistic searches, Google reported 546,000 hits for SchoolSoup; 1,280,000 for FastWeb; 826,000 for Cappex; 20,500 for FreschInfo; 1,830,000 for CollegeBoard; and 3,530,000 for the last of them.  (I was not sure how to do a comparable search for the U.S. Department of Education’s scholarship database.)

The two on that list that I had found least impressive (pending analysis of results, below) were SchoolSoup and FreschInfo, and those were also the two that had the smallest numbers of hits.  On that basis, it did preliminarily appear that I might fairly focus on databases for which a simplistic Google search would yield at least 500,000 to 800,000 hits.  It certainly seemed that I could have reasonable doubts about databases that had drawn only a fraction as many references.  It also seemed that, for the most part, the databases that I had stumbled into, so far, were among the most frequently cited of all scholarship databases.  In other words, word of mouth seemed at least to be self-confirming, so far.
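For what it’s worth, that rough screening rule can be sketched in a few lines of code.  The hit counts are the ones reported above (omitting the unnamed sixth site); the 800,000-hit cutoff is just one reading of the suggested 500,000-to-800,000 range, assumed here purely for illustration:

```python
# Rough popularity screen for scholarship databases, using the Google
# hit counts reported in the text.  The 800,000 cutoff is an assumed
# reading of the suggested 500,000-800,000 range, for illustration only.

hit_counts = {
    "SchoolSoup": 546000,
    "FastWeb": 1280000,
    "Cappex": 826000,
    "FreschInfo": 20500,
    "CollegeBoard": 1830000,
}

CUTOFF = 800000

def screen(counts, cutoff=CUTOFF):
    """Split database names into (keep, doubt) lists, each sorted by
    descending hit count."""
    ranked = sorted(counts, key=counts.get, reverse=True)
    keep = [name for name in ranked if counts[name] >= cutoff]
    doubt = [name for name in ranked if counts[name] < cutoff]
    return keep, doubt

keep, doubt = screen(hit_counts)
print("Focus on:", keep)
print("Doubtful:", doubt)
```

With the 800,000 cutoff, the two sites I had found least impressive (SchoolSoup and FreschInfo) are exactly the two screened out; a 500,000 cutoff would keep SchoolSoup, which is why the cutoff is a judgment call rather than a finding.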

As noted above, I was concerned about reports that multiple scholarship search websites drew upon the same underlying database, or at least upon very similar databases.  In other words, I might be wasting my time on redundant efforts, or (more constructively) the phenomenon I might be triangulating could be myself (with my own vagaries in answering the same questions posed in different ways by different people) rather than the universe of available scholarships.  It also seemed that the large majority of users would eventually reach a point of having insufficient time and patience for the process of entering the same data over and over, in one database after another.  Nevertheless, I did some more looking around, at this point, to see whether there were other databases, or lists of databases, that I should have been looking at.  Quick searches for reviews or lists of best scholarship databases didn’t yield much.

I did find another couple of lists of databases, and added them to the list of lists (of databases) provided in the first paragraph of this post.  I also accumulated the names of some other databases that had been recommended by one or more such lists.  I ran more Google searches, like those noted above, to see how many hits these other databases would get.  Google reported about 3,460 results for Fund My; 766,000 for 25; 90,000 for Broke; 160,000 for; 149,000 for; 5,440 for ScholarshipMonkey; 168,000 for Careers and; 1,050,000 for; 284,000 for Career InfoNet; 131,000 for; 19,000 for; 90,000 for; 566,000 for; 156,000 for; 88,500 for; 50,900 for; and 381,000 for (which advertised itself as being “powered by Scholarships101”).

By those numbers, it seemed that I probably should take a look at CollegeNet and CareerOneStop.  So I did.  But first, for reasons described in the following paragraphs, I started with ScholarshipExperts, ScholarshipMonkey, and FastAid:

  •  Among the websites that Forbes considered best for various aspects of college planning, as of some unspecified date, this was the one they named as “the best scholarship search engine we’ve found.”  Their brief review seemed to indicate that they might have based their decision on the number of scholarships that the site produced in response to the profile data they entered.  What was perhaps most remarkable about the Forbes endorsement was its uniqueness; my travels certainly did not uncover a plethora of magazine writeups of scholarship databases.  Anyway, I decided to comment on a few details that I had not mentioned, but could have mentioned, in connection with most if not all of the foregoing databases.  In particular, something that I had learned, in my research methods courses, was that if you want good information, and especially if you want people to like your website, you should ask questions that are appropriate to their circumstances.  I appreciate that many users would find it bizarre to face a question asking whether they consider themselves “male/female/other.”  But it’s a simple fact — an uncomfortable one, surely, for people who have not been educated in such matters — that numerous humans are born with physical, bodily reasons for ambivalence regarding their own place in the male/female binary.  I have never trumpeted LGBTQ issues, and am not doing so here.  I am only suggesting that, in matters related to education especially, intelligent people should be well past the age in which, euphemistically speaking, parents and society try to force lefties to become right-handers.  Anyway, as a separate and more trivial point (but still worth mentioning from the user’s perspective), this website could have benefited from a more user-friendly data-entry experience (e.g., when I try to type my year of birth, don’t force me to use the mouse instead).  
Ideally, the drop-down options would open up as soon as I tab to a box, so that I could just hit a key (either the number of the entry or a letter, like “H” for high school).  (This remark is relevant for purposes of both accessibility and speed.)  It would be helpful if data entry pages could automatically center themselves around the box in which the user is currently entering data, instead of leaving it at the bottom of the screen, so that the user could get a better sense of context without having to scroll manually.  For this website in particular, when I type “I” in a list of states, that means, “take me to the first state whose name begins with the letter I” (not D, not G).  More positively, while did not provide what I found the most usable option of checkboxes to designate such things as areas of career focus and intended major, at least they provided an Add button instead of requiring the Ctrl-click maneuver (but they did allow that too).  They allowed up to 50 of each (i.e., career choices and majors).  It was interesting that, like some (but not all) other sites purporting to capture enormous numbers of scholarships, they apparently had nothing for people who hoped to specialize in certain fields, including some that were pretty well-known (e.g., educational psychology).  My guess was that the user was supposed to conclude that, if your specific field is not shown, you should check the general field instead (e.g., education).  Whether that would make subsidiary options (e.g., physical education) redundant, I could not tell.  In other words, as with most other sites, the lists of majors and such were not as exhaustive, or exhausting, as those provided by the U.S. DoE (above).  Also, as with some others, I was not sure why they were distinguishing “the student’s current or intended career” from “the student’s current or intended major or field of study,” given that both lists contained substantially similar entries.  
At least it would have been convenient to have a “Same as above” checkbox.  Same thing for the Clubs/Organizations and Associations/Societies boxes.  There was also some seemingly unnecessary redundancy and data entry in the obligation to specify a city of residence after stating the city in my address; I had not encountered that elsewhere and, again, a “same” checkbox would have helped.  Likewise, it should be possible to automate (perhaps with an option to correct) the selection of county of residency, once the user has specified a city of residency.  I found the selection of colors surrounding the data entry boxes unpleasant, and did not like having to click on a box once just to get it to wake up, before I could actually start entering data in it.  As with most sites, for a graduate student (particularly one in his 50s), it was mildly irritating to have to enter the year of high school graduation (or, elsewhere, to encounter offers to send information to my high school guidance counselor, who by this time might have been in his grave).  And why was it necessary to specify the state of high school attendance, when the next box still required me to scroll (in a tiny, inconvenient box) among thousands of high schools, across the country, whose listed names already indicated the state?  Maybe the answer was that they did not actually have my own high school on that list, though I was not sure why not.  Similar questions and remarks applied to other aspects of this and other websites, especially if there was a hope that users would have a positive experience.  Regarding the endless lists of hobbies and activities, would it not be possible for at least one of these scholarship sites to narrow down the selections — to infer, perhaps, that no scholarships in the database would require an engineering student to indicate whether s/he enjoyed knitting?  
The contents of some of these lists could also be iffy, as in the references to the “accordian” as a musical instrument and to “logging” as a sport.  Some of the limits seemed arbitrary; for example, allowing the student to list no more than two ethnicities would not work well for a student with four diverse grandparents.  The list of organizations was wrongheadedly grouped, such that a person trying to navigate the tiny scroll box might not notice, after reviewing those categorized as “Career” items, that there were also some “Military” items that would seemingly be part of a career.  On the positive side, this website was one of few that explained that the hobbies and activities of interest were those in which the student participated “on a regular basis.”  It did not help that I was growing tired of this project, but my feeling at this point was that this site was one of the least pleasant of all those I had visited.  It appeared that thought that user friendliness meant having pretty colors and buttons to click — huge buttons, by the way, which increased the amount of scrolling required.  Some of the inconveniences identified in this post were so egregious as to raise the question of whether the site administrators considered students to be a subhuman species unworthy of basic considerate treatment.  All in all, it seemed advisable for webmasters to think in terms of user friendliness, and to conduct critical field tests of their creations.  But at any rate, we did come, at last, to the results:  56 scholarships, with “a potential award amount of $133,563.00!”  Well, I sure was excited.  Their results page gave me deadline dates, but did not state a dollar value for any of those 56 scholarships individually.
  • Scholarship Monkey.  Firefox gave me an error — “This connection is untrusted” — when I tried to enter this site.  It appeared to be another webmaster oversight.  The site seemed to use the same search engine and/or front end as Fresch (above):  very similar-looking list of results, similar data input oddities (e.g., ability to enter only a single digit for GPA).  Their university selector was somewhat unfriendly and did not have all universities.  They allowed only general-purpose majors (e.g., education, not ed psych).  Only one choice allowed for religion, disability, choice of major, and ethnicity.  It took five minutes to complete the application and get a list of 46 alleged matches.
  •  This site advertised itself as “The largest private sector scholarship database in the world. 30 years of Scholarship Research, Constantly updated.”  Its homepage said that it was “From Dan Cassidy, author of “The Scholarship Book“, “Worldwide Graduate Scholarship Directory” and “Worldwide College Scholarship Directory“.  I was not sure what to make of the claim of being a large “private sector” database.  Mr. Cassidy did not seem to be using “private sector” to mean for-profit, given the indication that the service was free.  The National Scholarship Research Service, whose name was linked with Mr. Cassidy in some of his publications, appeared to be just a business name for him personally; he was its president.  It was a for-profit enterprise in the sense that the books were for sale.  The need to make books publishable would arguably supply an incentive to make them good, and thus to have good data backing them up.  In that regard, the books seemed to have drawn average reviews.  On the other hand, they were published by Penguin, a major publisher.  The solo nature of his undertaking could have mixed implications:  even an excellent researcher can indulge funky ideas and eccentric priorities.  In such conditions, a user could develop additional uncertainty from little things, such as the odd punctuation and capitalization just quoted.  Two of those three books did not appear to have been updated in the past ten years.  The webpage describing him seemed to indicate that his last degree had been a B.S.  That webpage had apparently not been updated for years.  It claimed that he was “generally recognized as the nation’s leading expert on private sector financial aid programs.”  He may have been, at one point; but in my browsing so far, I did not recall seeing his name anywhere outside of his own website.  
Moreover, that statement seemed to indicate that, by “private sector,” he meant that his site was devoted to scholarships that did not come from nonprofit or governmental organizations.  It was not clear to me how that would be a plus.  But the results of the site could speak for themselves.  Having criticized his site, however, I was reluctant to place highly personal information at his disposal.  In that regard, the emphasis on the private nature of his enterprise generated (at least for me) a concern that I probably should have had for other sites as well, given that I had no idea who was behind them; but somehow the overt link with an individual made it a more palpable concern in this case.
  • CollegeToolkit.  The UCLA list (above) said that this site had been featured by CNN, the Christian Science Monitor, and Attache! magazine.  CollegeToolkit’s media coverage webpage named others (e.g., The New York Times) as well.  But that webpage did not appear to have been updated since 2007.  A focused search for the past year turned up only 723 hits.
  • CollegeNet.  Mach25 appeared to be the sexy-sounding name for’s scholarship search.  I was not impressed.  To the contrary, as I looked at their webpages, it occurred to me that, as with other sites I had examined, many of the hits I had seen in my Google searches may have had to do with other aspects of their operation.  For instance, CollegeNet also offered various college search and information pages.  I decided to redo the Google search, looking only for Mach25.  This time, the search produced only 12,100 hits.  I decided not to invest time in an exploration of their offerings.
  • CareerOneStop.  Like CollegeNet, this website — sponsored by the U.S. Department of Labor — served multiple purposes.  On the positive side, its data were said to come from the kinds of sources that one would hope to find on a governmental site.  But for the scholarship portion, the data came from the Gale Group and Reference Service Press.  These seemed to be solid sources too.  The database “includes over 5,000 scholarships, fellowships, loans and other opportunities.”  That sounded like a small number, given that several of the foregoing sources claimed to have hundreds of thousands of scholarships in their databases.  I decided not to pursue this.

By now, I felt that I was reaching the point of diminishing returns.  There may still have been other sites I should look at.  But it seemed that the most productive effort, at this point, might be to look at what I had already found.  That way, I might be able to eyeball a new scholarship source and recognize, rather quickly, whether it had much to add to the foregoing list of databases.  So I turned to the analysis of findings from those databases (above) that had seemed to provide promising leads to numerous relevant scholarships.

First Comparison of Search Results

I had been relieved to see that, of all these sources, only SchoolSoup gave me a really messy list of results.  Theirs was also at least seven times larger than any of the others.  I figured, in other words, that I could go ahead and tackle their list, get it out of the way, and in doing so might encounter most of the scholarships that would also be named by the others.

For all of these results, and for those of SchoolSoup in particular, it seemed that it might be most helpful to begin by constructing a spreadsheet.  After some fooling around, I approached this by saving (Ctrl-S) each of SchoolSoup’s 26 pages, listing various scholarships, as a sequentially named text file (01.txt, 02.txt …).  Then, in a Windows 7 command window (Start > Run > cmd), I typed “COPY *.txt Combined.txt” to merge them all into one text file.  I opened that text file in Word and ran some macros and global replaces to clean it up.  I put its data into Excel and separated out the fields provided by SchoolSoup:  scholarship name, amount, deadline, and participating college.  I saved the Combined.txt file for reference, and added a SchoolSoup Index Number column so that, during my various manipulations, I could return the Excel file to its original order for purposes of comparison against Combined.txt, if needed.  Some of the data (e.g., dates) came over in the wrong column, so I had some further cleanup to do.  I also divided out Minimum and Maximum amounts for clarity.  With some false starts and mistakes, this process took more than two hours.  Needless to say, those were two hours that I would have preferred to save by just downloading a spreadsheet with the results.
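
For readers who would rather not spend the two hours, the merge-and-cleanup steps above could also be scripted.  The following is only a sketch: it assumes a hypothetical pipe-delimited line format ("Name | Amount | Deadline") for the saved pages, which is not what SchoolSoup actually emitted, so the parsing rules would need adjusting to the real layout.

```python
import csv
from pathlib import Path

# Hypothetical line format (SchoolSoup's actual text layout differed):
#   "Scholarship Name | $500 - $2,000 | 03/15/2012"
def parse_line(line):
    parts = [p.strip() for p in line.split("|")]
    if len(parts) != 3:
        return None  # skip page headers, ads, and other debris
    name, amount, deadline = parts
    lo, _, hi = amount.partition("-")          # split "min - max" amounts
    return [name, lo.strip(), (hi or lo).strip(), deadline]

def merge_pages(folder, out_csv="combined.csv"):
    """Merge saved pages 01.txt, 02.txt, ... into one indexed CSV."""
    rows = []
    for path in sorted(Path(folder).glob("*.txt")):
        for line in path.read_text(encoding="utf-8").splitlines():
            row = parse_line(line)
            if row:
                rows.append(row)
    with open(out_csv, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        # The index column lets the sheet be re-sorted to original order
        writer.writerow(["Index", "Name", "Min", "Max", "Deadline"])
        writer.writerows([i + 1] + r for i, r in enumerate(rows))
    return len(rows)
```

The index column serves the same purpose as the SchoolSoup Index Number column described above: it preserves the original ordering for comparison against the raw files.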

That gave me a spreadsheet showing the information provided by SchoolSoup.  For the other databases, I did similar downloads or just typed in the data.  This difference in approach could affect the number of useful scholarships provided by a database:  downloading them all, as I did with SchoolSoup, would naturally bring along some irrelevant entries that I would simply ignore if I were manually entering only the ones of interest.  This is not to deny that, in all events, SchoolSoup gave me an inordinate number of duplicates.

Databases varied somewhat in the information they provided, and I did not bother to type in complete details for hundreds of scholarship opportunities that I might not be pursuing.  So I had extensive but not complete information on the various scholarships’ names, minimum and maximum dollar amounts, and expiration dates.  I added a column for the program URL, and filled that in especially for the scholarships that I expected to investigate further.

When I turned from SchoolSoup to FastWeb (i.e., the second database analyzed in detail, above), I was surprised to see that the list of 25 hits that I had obtained previously had changed.  Now FastWeb was saying I had 23 available scholarships with 26 upcoming deadlines.  I noticed that SchoolSoup and FastWeb listed some of the same scholarships, and that they disagreed on certain facts (e.g., due dates, amounts) for those items.  I decided that it might not be necessary, for my purposes, to investigate such factual discrepancies.  Hence, in this investigation I did not try to determine how accurate the various databases were.

I moved on to the next database analyzed above:  Cappex.  They still had me down for 19 matches.  Same thing happened here, though.  Not the redundancy, but the novelty:  14 of their 19 hits did not appear in either the SchoolSoup or the FastWeb lists, and among the five that did appear there, one or two differed substantially on dates and/or amounts.  I was hoping that this would not turn out to be a complete mess — that, somehow, a consensus would emerge, among these various databases, and I would maybe be able to identify some most and least accurate databases.
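
The bookkeeping behind these overlap counts is simple set arithmetic.  A sketch, with made-up scholarship names standing in for the real result lists:

```python
# Illustrative overlap check between result lists (names are made up)
schoolsoup = {"Acme Grant", "Beta Award", "Civic Prize"}
fastweb    = {"Beta Award", "Delta Fund"}
cappex     = {"Epsilon Scholarship", "Civic Prize", "Zeta Award"}

seen_elsewhere = schoolsoup | fastweb
shared = cappex & seen_elsewhere   # Cappex hits also listed elsewhere
novel  = cappex - seen_elsewhere   # Cappex hits unique to Cappex

print(sorted(shared))   # ['Civic Prize']
print(sorted(novel))    # ['Epsilon Scholarship', 'Zeta Award']
```

In practice the comparison is fuzzier than this, since (as noted above) the same scholarship could appear under slightly different names, dates, or amounts in different databases.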

Moving along.  Next database on the list:  FreschInfo.  The site didn’t remember my data, so I had to spend a few minutes re-entering it.  I must have changed something, because this time I had dropped slightly, to 309 matches.  I did enter a half-dozen potential target schools, and none of them was Curry College, Westminster College, or any of the many other colleges whose names I saw among those 309.  In other words, their database was showing me a lot of scholarships that obviously had nothing to do with me.  (I did like how the website allowed me to mouse over a scholarship and see a tooltip summary — nice feature).  The top item on the list, bearing six stars, was actually the Cappex $1,000 Health Careers & Nursing Scholarship whose deadline had been back on December 31.  I was able to scan the list and delete all 309 entries within 20 minutes or so.  Some entries were for scholarships whose deadlines expired several years earlier.  The FreschInfo database was a complete waste of my time.

Next:  CollegeBoard.  I didn’t like that the 71 hits resulting from my profile were spread out across nine pages, without an option to view them all on one page, and without even the courtesy of item numbers, for future reference.  I also didn’t like that so many of these entries had no apparent connection with my interests.  I was not sure what I, an aspiring white doctoral student in an academic field, with no connection to Alabama, would have done to deserve an entry for an internship in a repertory theater, or for a scholarship in Alabama, or with the United Negro College Fund, or from the Blinded Veterans Association.  They also threw in entries for federal student loan programs.  Ultimately, out of the 71 hits, only 28 were not obviously irrelevant.  Then again, of those 28, only five appeared on the three major lists just revisited:  three on Cappex, two on SchoolSoup, and none on FastWeb.  Many of the entries they showed me did not indicate a closing date, which meant I would have to look them up individually.  At this rate, I was afraid I might find that they were outdated or inapplicable.

I hoped for better from the next database on the foregoing review list.  Thanks to their checkboxes, I was able to remove some obviously irrelevant items.  That brought me down from the original 36 hits to 27.  Of those, two were also on Cappex, one was also on Fastweb, and one was on both Fastweb and Cappex.  In other words, I was still not at a point of saturation, where the scholarship databases were largely repeating one another.  Not even the hundreds of search results from SchoolSoup seemed to have exhausted the available scholarships.  As for the irrelevancies in my search results, it appeared that some could be due to my uncertainty about what they wanted in response to one of their survey questions, and my decision to take the expansive route if in doubt — though some of that could be traced, in turn, to poorly worded questions.

Next on the list, as noted above, the U.S. Department of Education (DoE) search resulted in 70 hits.  Of these, most seemed relevant.  Fastweb, Cappex, and each had two of these 70 items, and CollegeBoard had one.  But there was some overlap there:  together, those four databases contained only three of these 70 items.  SchoolSoup, by contrast, overlapped with DoE on seven items, and none of the others had any of those seven.  By this point, it had become increasingly clear that I was not going to be able to form conclusions about these databases yet.  The test would come later, when I got to the point of checking individual scholarship records.  I surely was not tapping into websites that were using the same underlying databases; or if I was, their questions and/or my responses were producing pretty widely divergent results.

Moving on, the ~40 results from Sallie Mae’s database — 35 of which seemed potentially useful on slightly closer examination — were, again, largely distinct from the results in other databases.  SchoolSoup overlapped on three scholarships, FastWeb and Cappex on two each, CollegeBoard and on one each, and there were no common entries at all in the results from the DoE database. (above) had given me a preliminary list of 56 potentially relevant scholarships.  I noticed, as I was going through these, that they seemed to do a better job of providing deadline dates than other sites did — that is, I found myself filling in a number of blank date fields left by overlapping entries from other databases.  In addition, this site seemed to be identifying state-level scholarships, of which I had seen little mention in most of the other databases thus far.  Of the 51 hits that survived my slightly closer look, 18 overlapped with entries from other databases — especially with SchoolSoup and the DoE (seven each), but also with (four scholarships listed on both databases) and FastWeb and Cappex (three each).  There was no overlap at all with the Sallie Mae list.

Similar appearances suggested (above) that Scholarship Monkey might draw from the same database as FreschInfo.  A look at the items on the SM list disabused me of that fear.  I didn’t know why they looked so much alike, but apparently it was due to a potentially unauthorized borrowing of webpage layout, not from a shared core set of information.  Of the 46 hits initially returned by Scholarship Monkey (above), 25 looked potentially relevant at a second glance.  Only two overlapped with any other database.  I was interested to see, in closer detail (below), how SM’s very quick and rough data-input approach would fare, in terms of ultimate results delivered to me.

It was time to wrap up this section of this post.  I had a couple of reactions, at the end of this first run through the lists of individual scholarships produced by these databases that had seemed most popular and/or capable.  Maybe the main thing was that the results were not what I had expected.  I thought I would be engaged in a largely repetitive process, with most databases listing most of the same scholarships.  I had seen that SchoolSoup had a far larger list than any of the others, so I supposed that the others would mostly be repeating what was on SchoolSoup.

I was pretty far from that.  Instead, as I continued to look at more databases, I just kept getting more and more new scholarships on the list, mostly unknown to more than one or two databases.

To quantify the situation, I now had a list of 730 scholarships.  Other students would have more, or fewer, depending on how they answered the questions.  This list would be trimmed, perhaps dramatically, as I proceeded to check eligibility criteria, deadlines, and other details on a case-by-case basis (below).  But as it stood at this moment, 688 (94%) of those scholarships were listed on just one database, and 35 (5%) were listed on just two.  To be sure, SchoolSoup made a big splash:  it contained an entry for 71% of these 730 scholarships.  But even with SchoolSoup out of the picture, 90% of the scholarships listed by any of these other databases would be listed on just one database, and 98% on no more than two.
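
Tallies like these fall out of counting, for each scholarship, how many databases listed it.  A sketch with a hypothetical (and much smaller) mapping of scholarships to databases:

```python
from collections import Counter

# Hypothetical mapping: scholarship -> databases that listed it
listings = {
    "Acme Grant":  ["SchoolSoup"],
    "Beta Award":  ["SchoolSoup", "FastWeb"],
    "Civic Prize": ["Cappex"],
    "Delta Fund":  ["SchoolSoup", "FastWeb", "Cappex"],
}

# Count scholarships by the number of databases that carry them
counts = Counter(len(dbs) for dbs in listings.values())
total = len(listings)
pct_one = 100 * counts[1] / total   # listed by exactly one database
pct_two = 100 * counts[2] / total   # listed by exactly two

print(f"{pct_one:.0f}% on one database, {pct_two:.0f}% on two")
```

Running the same count over the real 730-row spreadsheet is what produced the 94% and 5% figures above.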

The picture was not clear yet.  I needed to see which of these individual scholarships were current and relevant.  Trimming the list could change the story.  But so far, it seemed that looking at more databases would just keep on producing more and more scholarship leads.  The preliminary impression was that these database people were not just copying or repeating each other, except maybe with the bigger, more obvious scholarships.  Instead, it was starting to appear that they were all researching off in different directions, making unique contributions as they dug into data that their competitors weren’t digging into.

First Look at Individual Scholarships

It was time to look at those 730 scholarships.  Some deadlines were already past, but would be renewed in the next academic year.  Other deadlines would come later in 2012 and would apply to either the 2012-2013 or the 2013-2014 academic year.  I looked at them all.  Many would be renewed in the next year, but in any case they all provided a realistic representation of the sort of thing I could expect to encounter in a typical academic year.

I sorted the 730 scholarships into four groups:  Error; Unwanted/Irrelevant; Potentially Relevant But Not Applying; and Might Apply.  The first decision was whether a scholarship belonged in the Error category.  This category was for scholarships that should not have been listed at all.  Key examples included duplicates, blank items (i.e., having a title but no scholarship details or contact information), and links to programs on which I could get no information — programs that apparently never existed or no longer seemed to exist.

For purposes of the Error category, I identified most duplicates through a preliminary look at scholarship titles, arranged in alphabetical order, sometimes confirmed by a look at the summary writeup provided for a scholarship by the database.  Other duplicates came to light as I went through subsequent stages of analysis.  If the same scholarship was identified by two different databases, it was not an Error; I simply combined the data on a single row of my spreadsheet.  To be marked an Error for reasons of duplication, the same scholarship had to be shown to me twice by a single database.  Before marking a duplicative title as an Error, I preferred to confirm duplication by looking at the summary writeup provided by the scholarship database, to make sure there were not actually two separate scholarship offerings with the same name (involving different deadlines, perhaps, or being directed to somewhat different audiences).  Unfortunately, it could be difficult to look at individual scholarships.  Some databases (notably but not only SchoolSoup) made the process just about as frustrating as possible.  Their links to program websites would usually go to general-purpose webpages with no obvious link to the actual scholarship program.  In some cases (e.g., The Scholar Ship), I had to back out to Google and start over.  I was actually unable to use SchoolSoup at all on one computer:  it was constantly logging me out, showing me its Grand Canyon University ad pages, and not letting me go beyond the first page or two of search results, regardless of whether I tried in Firefox or Chrome.  It did the same in Firefox on another computer, but I was finally able to get Chrome to work on that other computer.  In short, as far as I could tell, I had marked duplicates correctly; it’s just that I decided not to invest the time to verify that duplicative titles (in SchoolSoup, especially) denoted duplicative content in every case.
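
The first pass of that duplicate check, grouping alphabetized titles that match after light normalization, can be sketched as follows.  The titles here are my own made-up examples; matches would still need the manual confirmation described above, since two distinct offerings can share a name.

```python
from collections import defaultdict

def find_duplicates(titles):
    """Group titles that match after case- and whitespace-normalization.

    Returns a list of groups with more than one member; these are the
    candidates to confirm (or clear) against the summary writeups.
    """
    groups = defaultdict(list)
    for title in titles:
        key = " ".join(title.lower().split())
        groups[key].append(title)
    return [g for g in groups.values() if len(g) > 1]

# Made-up titles for illustration
dupes = find_duplicates(["Acme Grant", "ACME  Grant", "Beta Award"])
print(dupes)   # [['Acme Grant', 'ACME  Grant']]
```

This only flags same-database repeats for review; as explained above, the same title appearing in two different databases is a merge, not an Error.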

After excluding the Error items, the Unwanted/Irrelevant group consisted of those scholarship opportunities that should not have cluttered up my list, because the stated criteria clearly excluded me; that exclusion could have been established through appropriately precise questions in the profile creation stage, or by paying attention to my answers to the questions that the databases did ask.  For example, I did not want to see scholarships that were available only to women, or only to people who were majoring in fields that had nothing to do with me.  In some cases, to cut through the hundreds of scholarships that had come up in my searches of these databases, I relied on the title of the scholarship to tell me that it was for engineers, or nurses, or some other kind of student that I was not.  After encountering one or two examples to the contrary (e.g., AES Engineering), I became more cautious in relying on the name of the scholarship as an indication of its focus.  A number of the items I deemed Unwanted and/or Irrelevant had appeared in my search results because I misunderstood the questions asked by the databases when I was setting up my profiles, or because I decided to err on the expansive side (e.g., including rather tangential fields of interest).  In other words, items could fall into the Unwanted/Irrelevant category because of my own imprecision and/or because the databases failed to ask precise questions and to apply my answers to the questions that they did ask.  Some of my own errors were reduced as I worked my way through this process and gained a better understanding of what this whole scholarship search process was about, and what I should or should not enter into a profile.  
Hence, I expected that the results I got from the databases that I examined later in the process (e.g., ScholarshipExperts, ScholarshipMonkey) would be at least slightly more accurate than the results I got from databases that I had examined earlier in the process (e.g., SchoolSoup, FastWeb) due to some improvements in what I told the databases.  The point of the Unwanted/Irrelevant category was that I was now having to look at individual scholarships that the databases hadn’t weeded out, largely because they asked vague questions and/or ignored my answers.
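
The title-based triage described above amounts to a keyword filter.  A sketch, with an illustrative keyword list of my own; as the AES Engineering example shows, any such filter needs manual spot-checks before a title is actually discarded.

```python
# Keywords signalling fields that did not apply to me (illustrative list)
EXCLUDE = ("engineering", "nursing", "culinary")

def triage(title):
    """First-pass label from the title alone; not a final verdict."""
    lowered = title.lower()
    if any(word in lowered for word in EXCLUDE):
        return "Unwanted/Irrelevant"
    return "Needs closer look"

print(triage("Future Nursing Leaders Award"))   # Unwanted/Irrelevant
print(triage("General Graduate Study Grant"))   # Needs closer look
```

Anything the filter does not exclude still has to go through the later categories by hand; the point is only to thin the list before the case-by-case review.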

After excluding the Unwanted/Irrelevant items, I came to a third category, Potentially Relevant But Not Applying.  The key concept in this category was that I did not want to penalize databases for showing me items that, as I could now see, were not really useful to me, as long as they were probably consistent with what I had entered in my profile.  My use of this category became more consistent as I went through the process — so here, again, my spreadsheet was probably more accurate for scholarship opportunities presented to me later in the game (e.g., by ScholarshipExperts) than by the first databases I examined.  One guiding question I developed along the way, for purposes of distinguishing the Potentially Relevant from the Irrelevant, was whether I could realistically imagine my situation evolving, within the next 12 months, to a point where I would consider applying for the scholarship in question.  Some scholarships fell into this category because, as I looked at the specifics, I decided I wasn’t really interested.  Others landed here because they seemed to call for more work than they were worth.  I would typically be willing to fill out a basic questionnaire (e.g., provide name and address, and answer a brief question or two) to qualify for a scholarship opportunity; but I tended to lose interest as the requirements expanded to include essays that could take several hours to write.  Roughly speaking, the question was whether I would enjoy writing the essay for its own sake — whether it might be something that I would polish enough to post on a blog, for instance.  If I wasn’t knowledgeable and/or passionate about the subject, it would probably end up in this Potentially Relevant group.  An item could also land in this category if my participation seemed very likely not to pay off for other reasons.  
As an example, while almost all scholarships gave the impression of wanting as many applicants as possible, the NACA Foundation indicated that they would consider only the first 75 applicants to their scholarships — so unless I was applying in the first day or two after such an opportunity became available, I might not bother.  For scholarship opportunities that looked more like sweepstakes, I took the approach I would take to an offer to enter any other sweepstakes or drawing:  I rejected it pretty quickly if it looked like it was going to be a hassle, or would open me up to spam, or for some other reason wouldn’t be worthwhile, and especially if it was just a marketing sham, designed to get me to stare at stuff without being honest about that objective.  It did seem, overall, that the scholarship search process would have been cleaner if the databases had  shunted sweepstakes-like opportunities into a separate Sweepstakes category, so that the hunt for scholarships relevant to an applicant’s actual education and career plans would not be trivialized by association with items that required little more than a name and an address.  Such a segregation would presumably facilitate faster selection among sweepstakes, and perhaps greater awareness of and/or access to them by people who might not be interested in an otherwise time-consuming scholarship search.

Scholarship opportunities that survived the foregoing stages went into my Might Apply category.  That’s not to say that I actually did apply to them.  For one thing, some had expired, and it was not yet clear whether they would be offered again for the next academic year.  As noted above, I kept these in the list anyway, for the time being, partly to retain a reminder in case they did come back to life, and partly to preserve a reasonable representation of a typical year’s scholarship opportunities.  Another reason why I could not yet go beyond saying that I “might” apply was that a fair number of scholarships were left at this final stage; it could take a while before I would have time to work through their nuts and bolts.  There might be some that I never would work through, depending on future developments.  For instance, a sensible decision on whether to apply for Scholarship X might depend on a prior decision as to whether I would attend University Y or major in Field Z.  The Might Apply group consisted of scholarships that looked good preliminarily, and that might or might not continue to look good as my situation evolved and as I looked into the terms of the scholarship opportunity more closely.  These were items that I had not been able to eliminate as Errors or as being Unwanted/Irrelevant, and on which I had not yet made a decision not to apply.  No scholarship got into this category until I moved past the summary page provided by the scholarship database and looked at the funder’s own webpage describing the scholarship.  That was typically as far as my analysis went, as of the time of this writeup.

As I worked through these stages, I encountered certain difficulties.  One sort of difficulty, which I did not attempt to resolve, had to do with incomplete and inconsistent information.  Many scholarships were misleadingly or indistinctly named, or were reported inconsistently by the databases, making it possible that I would incorrectly detect or overlook apparent duplicates or would misconstrue the nature of a scholarship.  Also as noted above, my spreadsheet’s entries for various scholarships did not report full information in every case, and I had generally not bothered to verify the information reported by the scholarship databases against funders’ webpages.  A better spreadsheet, containing definitive information on dollar amounts, expiration dates, and other factors, would surely have facilitated a faster and more accurate triage, as well as a comparison of the databases’ rates of accuracy.

Another type of difficulty arose from the fact that my activities in this project carried over from late January to early February.  A number of scholarships had January 31 or February 1 deadlines.  When I went back to these databases in early February, most of them no longer showed the scholarships that had expired on January 31 or February 1.   The databases did not typically offer an option to view archives.  Fortunately, I was able to use other databases or Google searches to get a sense of whether I would have considered these scholarships relevant and appealing.

Ordinarily, when trying to look at multiple links on a single webpage (such as these databases’ lists of scholarships), I found it convenient to use a tool like the Firefox add-ons called Multi Links or Snap Links Plus.  These would let me use the right button on my mouse to draw a box around selected links on a webpage.  Then, when I released the button, all those links would open.  Regrettably, a number of the databases’ search results webpages were unfriendly to that technology:  they would divide the database search results among a series of webpages, each containing just eight to ten of the search result items, and they would make the Multi Links option dysfunctional.  In those cases, it was necessary to open each listed scholarship by hand.  That could be quite a chore for someone who found himself reviewing hundreds of scholarships.

On the level of individual scholarships, as with the overall listings discussed above, each database had its own unique and sometimes eccentric characteristics.  That was especially the case for SchoolSoup.  For one thing, they included data on items that were not really scholarships, especially in my home state.  It looked like they had gone through the financial aid offerings for several of the colleges and universities in the state (while oddly neglecting most of the offerings in the state’s largest universities, as well as in a number of its smaller colleges) and had listed everything that looked like a financial benefit.  So I found myself incongruously picking through various entries regarding minor travel grants for students at a branch campus, while seeing few if any references to major scholarships at much larger institutions.  Their decision to limit this sort of information mostly to in-state colleges did not make sense, given that thousands of students attend out-of-state institutions.  SchoolSoup did another strange thing, not driven by the change from January to February during my completion of this project (above):  they changed the names of a number of scholarships in their list, while keeping the total unchanged.  In other words, while it was still January, my subsequent visits to their database found that they still showed me as having 517 matches, but some of those that had appeared during my previous visit were gone, and others had arrived to take their place.  The departures, as I say, did not have anything to do with deadlines.  I didn’t keep the exact number of scholarships that were replaced in this way, but there were probably about 50 — that is, about 10% of the 517.  It seemed very random.  I used Google searches to get a sense of those that were no longer on the list.  Meanwhile, among those that were newly added to the list, there was more randomness than before:  some were so far from my interests as to be amusing.  
But others were good finds — raising the question of why they had not come up in my original SchoolSoup search.

While viewing the scholarships listed on SchoolSoup, I changed some of the data recorded on my spreadsheet (e.g., deadline date) when I saw disagreements in duplicative spreadsheet entries from other databases (e.g., FastWeb).  I also made some such changes when I saw such discrepancies on the scholarship’s website.  In other cases, I did not bother trying to make a change — especially when the scholarship in question was not relevant to me.  I did not get the impression of chaotic inconsistency between SchoolSoup and these other sources.  I became more aware of this issue when I turned from SchoolSoup to FastWeb, the next database on my list (above).  It’s not that there were more discrepancies on FastWeb.  There didn’t seem to be.  It was, rather, that I now began to focus on the possibilities for exploring the levels of accuracy exhibited in various scholarship databases.  That sort of inquiry would have been far more time-consuming, and would also have benefited from familiarity with the scholarships available within a given category.  For example, if SchoolSoup contained records for three identically named scholarships, and FastWeb contained only one, was that because SchoolSoup was being redundant (a common occurrence) or, instead, because SchoolSoup was being thorough (also a possibility)?  Some funders did offer multiple scholarships that looked very similar until one got into their details.  For purposes of the present study, all I could realistically do was to offer impressions of consistency or inconsistency that arose as I went along.

SchoolSoup was by far the worst at being inconsistent, but it was not the only one.  Cappex was good in this regard.  At CollegeBoard, I had to be careful to sign in, if I wanted to see the 71 scholarships it had shown me upon my completion of the questionnaire.  Otherwise, without telling me that I wasn’t logged in, it would revert to its list of 57 scholarships (above), and if I wasn’t freshly informed on the matter, I would wonder why I was getting results that didn’t fit me, while not seeing others that did.  Once I did log in, CollegeBoard didn’t give me an option of seeing all of my 71 search results on a single page.  As if to increase my inconvenience, their site would also not allow me to use tools like Multi Links or Snap Links Plus, so as to automate the opening of all scholarships shown on one of their search result pages.  They wouldn’t even let me open a scholarship by right-clicking on it, so as to preserve the tab I was on, and they also didn’t give me an option of right-clicking on the link to the funder’s website:  I had to click through to it, though at least it would open in a separate tab.  In short, if I wanted to see what CollegeBoard had for me, I had to click on nearly 80 links (i.e., 71 scholarships plus eight “next page” links), one at a time; and for each of the 71 scholarships, I had to click through to those websites for which I wanted further information; and then I had to click on the browser’s Back button to get me back to their list.  The exception was if I wanted to look at every one of those scholarships in serial order.  In that case, once I had gone through the rigamarole of getting to the first item on the list, I could click Next to go on to the next one.

In writing up these reactions to the scholarship databases, it may be appropriate to spare a few words about the funders’ own sites.  Often, the databases provided only the most general links to such sites (e.g., to their homepages), probably for fear that the funders would rearrange their websites, causing a more specific link to fail.  (Even then, the provided links failed sometimes.)  So once I did click through to a funder’s website, I usually had to click around some more before I found the page pertaining specifically to the scholarship in question.  On a few occasions, I never did find it.  I wasn’t sure if that was because of a failure in my technique or if, instead, the database entry was outdated.  When I did find the relevant page, I might discover that basic information was not there.  Rotary’s Ambassadorial Scholarship would be an example.  When I found a reference to it in the CollegeBoard search results, I recalled that my spreadsheet also contained a reference to another Rotary scholarship.  I checked and saw that the other one was called the World Peace Fellowship.  I wondered why I hadn’t seen both of them when I had looked into the World Peace one.  It turned out that I hadn’t actually checked the Rotary website for details on the World Peace Scholarship because SchoolSoup had told me that it was for master’s programs, not PhDs.  So now I clicked the SchoolSoup link and found myself on Rotary’s main webpage, which was also where CollegeBoard’s link took me.  I typed “World Peace Fellowship” into the search box and went to the first item.  This understandably did not seem to lead immediately to a comparison of the Ambassadorial and World Peace options.  I decided to do a search for both of them instead.  I actually did several versions of that search, making a mistake or two and repeatedly reaching webpages that, contra Google, did not actually seem to contain references to both options (even in the cached version).  
I finally found information on a non-Rotary PDF.  It provided some information on each.  Apparently CollegeBoard had not gotten this far in their investigation of these scholarships; their listing did not even state the amounts or deadlines involved.  Amazingly, only one database mentioned either of these options:  CollegeBoard was the only one to tell me about the Ambassadorial, and SchoolSoup was the only one that disclosed the World Peace.  That was amazing, not only because Rotary was a major organization — in fact, according to Wikipedia, the Ambassadorial program was “the world’s largest privately funded international scholarships program” — but because these were opportunities involving large amounts of money, from the student’s perspective.  According to that non-Rotary PDF, the Ambassadorial Scholarship was worth $25,000, and SchoolSoup had told me that the World Peace Fellowship was worth $60,000.  And yet here I was, scrambling around for basic information to an extent that not even the databases were inclined to indulge.  I never did find a definitive Rotary page on the World Peace Fellowship, though perhaps I would have if I had dug further.  Their page on the Ambassadorial Scholarship did not tell me much about it, other than that I should contact my local Rotary club for more information.  The page did say that the Ambassadorial Scholarship program would be replaced by something else in 2013.  It wasn’t until I went to Wikipedia that I learned that, actually, there were six different types of Rotary scholarships.  Having completed this bit of investigation, a person might fairly wonder what else might be slipping through the fingers of my database sources — and might be nonplussed at the thought the databases were meanwhile capturing oodles of $400 grants for writing an essay on My Pet Goat.

I should mention, by the way, that I did not dig around for the Rotary example.  It fell right into my lap.  I was just jotting down these notes, as I went through the process of categorizing the various scholarships, and it occurred to me that a Rotary scholarship should not be among those that are rarely, sparsely, and inaccurately reported by scholarship databases.  The rest of the foregoing writeup unfolded from that.  I would have to guess that there are many other surprises lurking in these databases.

The preceding remarks imply a number of steps that funders could take to make their scholarship information more accessible.  By way of additional suggestions, it would be very helpful to encounter encouragement, disclosure, and other inducements to treat the application process as something other than a mechanical chore.  For instance, some sites provide a counter that indicates how many applications have actually been filed.  In a sane world, they might all do this voluntarily; but even if a substantial minority did it, many students might be appropriately encouraged or discouraged from adding their input to the mix.  Sites could also provide commentary (e.g., “We received some amazing applications” or “Don’t be intimidated from applying — we find that those who do well in our process tend to be ordinary people, not perfect, who just happened to write good essays for us”).  Scholarship webpages might link to a Reviews or Kudos page, where the applicant could get a better sense that this foundation (or whatever the funder is) has been recognized in the Washington Post or has identified students who tended to go on to do well, or whatever the case may be — as distinct from some flaky operation that exists just to generate traffic and make a living from ad dollars, or something of the sort.  Yes, the potential applicant could root around on the website of a reputable funder and probably find something like that sooner or later anyway; it’s just that, as hinted here, the student does encounter various hurdles even to get to the point of finding the scholarship page, never mind coming to perceive it as part of something special.

Much of the difference between SchoolSoup and the other databases, in terms of the total number of listings, seemed to be due to divergent cataloging strategies.  While SchoolSoup seemed to break out some funding sources (though not, obviously, those of Rotary) into discrete entries, some of the others were content to provide a single entry covering multiple distinct kinds of scholarships.   For instance, CollegeBoard cryptically indicated that the “ESA Foundation Scholarship Program” offered something worth $500 to $7,000.  What this meant, as it turned out, was that the Epsilon Sigma Alpha Foundation had an unknown but apparently substantial number of scholarships and grants for various purposes.  Here, again, only two databases — CB and ScholarshipExperts — even mentioned it.  ScholarshipExperts provided a better writeup than CB, but incorrectly indicated a flat amount of $1,500 rather than the variation that CB had correctly observed and, like CB, munged together those dozens of possibilities into one entry.

These remarks may already have demonstrated that the databases’ characterizations of funding sources could be substantially incorrect or misleading.  Having become so displeased with SchoolSoup, I was rather surprised to observe that, at this granular level, CollegeBoard was also not exactly a paragon of verity.  To the contrary, as just noted, I seemed to be finding inaccurate content in a number of its records.  To cite one other example, CollegeBoard indicated that the Lindbergh Grant could be used for “any undergraduate study.”  But the very first FAQ on the Lindbergh site clearly said, “Funds may not be used for tuition.  You MUST be working on a ‘research or educational project.’”  Overall, it seemed a student could reasonably conclude that the databases were primarily useful for (a) eliminating some (certainly not all) clearly irrelevant scholarships, (b) identifying some (certainly not all) clearly relevant scholarships, and (c) providing links to the latter, so that the student might thenceforth ignore whatever else the databases said and proceed to get the current facts.  This construal of the situation would seem to imply that even a diligent and well-informed student would face real limits in his/her ability even to become aware of all relevant funding opportunities, much less assemble timely and competent applications for them — which suggested, in turn, the commonsense observation that a student ought to be able, somehow, to submit his/her information, letters of reference, transcripts, and perhaps even some essays to a central data repository, making it easier for students to find and apply for scholarships, and easier for funders to figure out (at least on the basis of anonymized statistical reports) why they were not making contact with seemingly appropriate candidates.

I found less to dislike when I turned from the list of scholarships given to me by CollegeBoard to the one from  I thought it was odd, not to say unhelpful, that would introduce me to an entry that it called “NSF Graduate Research Fellowship Program.”  “NSF” turned out to be shorthand for the National Science Foundation — which, according to, was offering 1,000 different awards with a maximum value of $30,000 (no minimum stated).  In other words, they were telling me that the NSF existed; now it was up to me to do their job of going in there — starting at the website address that had provided — and finding my way around to scholarships or fellowships that might be relevant to me.  I was not sure what to make of the fact that, otherwise, my list of 700+ scholarships contained only two NSF entries.  Did that mean that someone else actually had figured out that I might qualify for, and be interested in, only two of those 1,000?  Or was this just dipping a toe in the pool, leaving 998 other opportunities unmentioned but for this casual remark from  A quick look informed me that NSF offered its own internal search engine.  I hadn’t actually become acquainted with it until this point.  So in the end, I had to chalk this one up as a plus for  At least they had notified me.  But the issue of laziness was still there, and it arose again in the Sigma Xi listing.  Like other databases, pointed me toward a general-purpose webpage.  It took some additional time, for me and probably for quite a few others, to go in, get past the nonworking links on the Sigma Xi website, and gather that the sciences for which they would consider applications apparently did include social sciences.

I was impressed when I looked at individual listings on the U.S. Department of Education (DoE) website, just as I had been impressed when I compared their overall data collection and presentation arrangements (above).  They had a standard data-collection form to answer FAQs, including one or two that I presented on my own wish list (above), notably the number of applications received for a given scholarship.  Then again, as I began to compare these entries, I saw that they were not necessarily reporting the same information for all scholarships.  For instance, the Requirements section of their form describing one scholarship might contain only one information item, for “Enrollment levels” (e.g., undergraduate), while the Requirements section for another scholarship might not have that, and might instead have information on whether the scholarship was available only to students in certain majors.  It took me a few minutes to get my head adjusted to their form, and particularly to train myself to look especially at the Requirements and Criteria sections down in the middle of the webpage for each scholarship; but once I did that, I found I was able to cruise through them at about the same speed as with the other databases (above).  The writeups in their so-called Criteria sections for each scholarship were actually mini-summaries, in many cases.  Some of those writeups could have been less cryptic; I still found that I had to click through to the program website, in a number of instances, to have a fair sense of what the scholarship was about, though probably less so than I had found necessary in using the other databases.  On the other hand, there was no fluff in those writeups; most of them did get right to the point in summarizing the offering.  As with the other databases, the click-through links almost invariably took me to the funder’s homepage, requiring additional searching to find actual scholarship information.
As with SchoolSoup, their list did include several school-specific offerings, though in this case those came from schools in which I had expressed no interest, outside of my home state — giving them, in other words, a random feeling that did not seem typical for this DoE database.  It did feel like I got through the DoE database’s listing faster than I had gotten through any of the others (above).

Sallie Mae’s individual listings were presented in a form somewhat like that of the DoE, but with less detail and in a more readable typeface.  As I worked through these entries, I missed DoE’s explanations of the individual scholarships.  I had to click through to the funder’s website in almost every case.  On the positive side, almost all of the links worked, so I was able to cruise through these offerings pretty quickly, and there were a number of appropriate results, as discussed below.

ScholarshipExperts provided a much more detailed writeup of the scholarships than DoE, and thereby probably slowed me down.  I had to page down through their writeups, and still had to open the funder’s website in most cases, to see whether I was eligible and interested.  Their Epsilon Sigma Alpha entries (above) confused me for a while, so I lost some time there.  It seemed that a person pretty much had to approach that site like the NSF site (above), insofar as both had their own internal search engines and did not seem conducive to attempts to list their contents in an external database.  So maybe had done the best they could with the NSF as well.  Taking that as a model, I altered my spreadsheet to include just one main Epsilon Sigma Alpha entry.  But I lost some time working that out.  Overall, I definitely did not have the same sense of momentum, here, that I’d had with DoE (above).

Last on the list of databases:  Scholarship Monkey.  The tooltips that popped up for individual items on their list of scholarships were as brief as those on the DoE list, and as useful for making preliminary decisions on which scholarships to tentatively include in my spreadsheet.  But without checkboxes, the most efficient way to open the webpages pertaining to those individual scholarships was to use Multi Links (above) to open them all, and then quickly close (Ctrl-W) the resulting webpages that were not of interest.  In other words, I could see that some proffered scholarship opportunities were not good matches, but I wound up opening the related Scholarship Monkey webpages anyway.  I was able to work through the Scholarship Monkey list quickly, because the writeups were so brief and also because so many items in the list were obviously ill-suited for me.

With Scholarship Monkey out of the way, I had completed my first pass through the individual scholarships that the selected databases had shown me, in response to the profile data I had entered.  My spreadsheet now contained my preliminary decisions on where each scholarship opportunity stood, with respect to the criteria stated at the beginning of the preceding section of this post.  Now it was time to analyze what I had found.

Data Analysis

By way of brief review and clarification, my process (as described in more detail in the early paragraphs of the preceding section) began by eliminating scholarship offerings that I should not have been shown (Errors) or that were Unwanted or Irrelevant in light of the information that I had entered into the database profiles (or would have entered, if I had been asked better questions, had had my own thoughts more clearly in order, and had been more experienced with scholarship databases).  Those steps gave me items that were at least Potentially Relevant.  The decision at that point was whether I would move toward actually applying for those scholarships.  I wasn’t going to file applications for all of them right away:  it would take a while to work through them in detail, and some might prove irrelevant as my interests and opportunities continued to evolve.
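That triage amounts to a short decision cascade.  Here is a minimal sketch of it in Python, with hypothetical flags standing in for the judgments described in the text (the category names are the ones used in this writeup; the flag names are my own inventions):

```python
def triage(entry):
    """Assign a scholarship entry to one of the four categories used in this writeup.
    The dictionary keys are hypothetical stand-ins for judgments made by hand."""
    if entry.get("contradicts_profile"):
        # The database ignored or mangled my profile answers.
        return "Error"
    if entry.get("unwanted"):
        # Would have been excluded, given better questions or better answers.
        return "Unwanted/Irrelevant"
    if not entry.get("worth_applying"):
        # Consistent with my profile, but not something I would pursue.
        return "Potentially Relevant But Not Applying"
    return "Might Apply"

# A scholarship that survives all three screens lands on the short list.
example = {"contradicts_profile": False, "unwanted": False, "worth_applying": True}
print(triage(example))
```

The point of the cascade ordering is that each category only makes sense after the earlier screens have been passed, which matches the sequence described above.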

So now I had a spreadsheet, and I wanted to see what it could tell me.  With various revisions and adjustments noted above, the spreadsheet contained 732 rows of data, each presenting information on a single scholarship.  (For these purposes, “scholarship” included sweepstakes, grants, and websites containing multiple scholarships and their own internal search engines.)  All of these 732 scholarships had come to my attention as the result of searches in the nine scholarship databases discussed in detail above:  SchoolSoup, FastWeb, Cappex, CollegeBoard,, U.S. Department of Education, SallieMae, ScholarshipExperts, and ScholarshipMonkey.

SchoolSoup had contributed 520 (71%) of those 732 entries.  Of those 520, 494 (95%) were cited by no other database.  That is, in this research, SchoolSoup was the sole means by which I became aware of 67% (i.e., 494 of 732) of the scholarships listed in my spreadsheet.  Of SchoolSoup’s 520 entries, 149 (29%) fell in the Error category.  Another 305 (59%) were Unwanted/Irrelevant.  So I had to wade through 454 items that should not have been on my list (i.e., almost 88% of all SchoolSoup entries), in order to find the 66 scholarships that I would consider at least potentially relevant.  (To reiterate some caveats noted above, SchoolSoup was the first database I used; I did make mistakes in using it; and I used a different procedure with it than I used with most of the others.  I felt that my results could be very different if I did return to this process for a second try at some point in the future.)  Of those 66 semifinalists, 52 went into the Potentially Relevant But Not Applying category, leaving 14 scholarships in the Might Apply category.  To summarize, I reviewed 520 SchoolSoup entries to find 14 that I liked, for a yield of about 3%.  Most of these statements are summarized in the first row of the following table, which contains similar calculations for the other databases.
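To make the arithmetic explicit, the SchoolSoup yield figures quoted above can be reproduced with a few lines of Python (all of the numbers come from this writeup; the script is only a convenience check):

```python
# SchoolSoup triage figures, as tallied above.
total_entries = 520    # SchoolSoup entries reviewed
errors = 149           # Error category
unwanted = 305         # Unwanted/Irrelevant category
not_applying = 52      # Potentially Relevant But Not Applying

noise = errors + unwanted                   # items that should not have been listed
semifinalists = total_entries - noise       # at least potentially relevant
might_apply = semifinalists - not_applying  # the short list

print(f"Noise: {noise} of {total_entries} ({noise / total_entries:.0%})")
print(f"Semifinalists: {semifinalists}; Might Apply: {might_apply}")
print(f"Yield: {might_apply / total_entries:.1%}")
```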

Results of Searches in Nine Scholarship Databases

As the table shows, the U.S. Department of Education (DoE) and ScholarshipExperts tied for second place, in terms of the number of scholarships cited, with 58 each.  There were some differences between those two in other regards, though.  DoE’s items were nearly one-third more likely than those of ScholarshipExperts to be unique contributions, cited by no other database.  Also, nearly half of the ScholarshipExperts items were Errors.  But in the final analysis, each yielded only five scholarships that seemed worth applying to.  In that last regard, they were very different from Cappex and Sallie Mae, both of which managed to float valuable suggestions in more than one-third of all attempts (rightmost column).  In other words, I was 12 times more likely to be interested in a suggestion from Cappex than in one from SchoolSoup.

One other aspect of that table that may be worth discussing:  the Unique columns.  As mentioned above, 95% of SchoolSoup’s scholarships were found in no other database’s output.  Granted, again, SchoolSoup was a special case, in terms of the volume of its output and the potentially poor quality of my input.  Yet it was not alone.  For all databases except Cappex, a majority of the scholarship suggestions were unique:  I would not have heard of them if I had not consulted one particular database.  Overall, as shown in the table’s bottom rows, 86% (i.e., 684/793) of all suggestions provided by all databases came from only one database.  Of the remaining 109 suggestions, 78 came from only two databases.  That is, only 39 (i.e., 78/2) scholarships, out of the total of 732, were mentioned in two databases rather than just one.  Only six scholarships (producing a total of 18 references) were mentioned in three databases, only two were mentioned in four, and only one was mentioned in five.  No scholarship was listed by more than five of these databases.  So the penultimate line of the table shows that, with these few redundant mentions, the 732 scholarships received (from all databases combined) a total of only 793 mentions.
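Those mention counts can be cross-checked with a short script.  The distribution below is the one reported in this paragraph (so many scholarships mentioned by one database, so many by two, and so on):

```python
# How many databases mentioned each scholarship: {mention_count: scholarships}.
distribution = {1: 684, 2: 39, 3: 6, 4: 2, 5: 1}

scholarships = sum(distribution.values())               # distinct scholarships
mentions = sum(k * n for k, n in distribution.items())  # total mentions
unique_share = distribution[1] / mentions               # single-database share

print(f"{scholarships} scholarships, {mentions} mentions")
print(f"Single-database mentions: {unique_share:.0%} of all mentions")
```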

It thus appeared that it might be necessary to draw upon a potentially large number of additional databases before one would reach the point of saturation, at which time the act of consulting yet another database would be unlikely to lead to many new scholarships.  Going down the rows of the table in the order in which I approached these databases, it turned out that I was still at a point of being able to get 25 unique referrals from Sallie Mae, after having already queried a half-dozen large and well-known scholarship databases.  Granted, there was not an endless supply of databases like the one at Sallie Mae.  Nonetheless, it appeared that nobody, including the U.S. Department of Education, had yet corralled even a respectable majority of scholarship opportunities within a single, smallish group of databases.

That reference to the DoE database raises another point.  For some entries in that database, as noted above, DoE stated the number of applications received for a given scholarship.  Why did some scholarships receive more applications than others?  Reasons probably included time, difficulty, money, and marketing.  But nondisclosure of relevant information was also a likely factor.  Some startling numbers emerged from those DoE records.  Consider the ratios of applications to awards at the AnyCollege Graduate Scholarship (12,000:2); the Campus Discovery $5,000 ‘Value of College’ Scholarship (100,000:“minimum 1”); the Katherine Anne Porter Prize for Fiction (900:2); the Outstanding Young Volunteer of the Year Award (1,297:1); the Shout It Out Scholarship (20,000:5); the SPENDonLIFE Scholarship (3,000:5); and the ‘I AM a Superhero’ Scholarship Award (4,000:1).  Disclosure of such numbers on application websites could help college students make informed decisions about investing time in scholarship applications.

I ran some calculations on the relationship between uniqueness and outcome for the scholarships that turned up in my search.  That is, was there a difference in outcome between those scholarships that were mentioned in only one database and those that were mentioned in more than one?  For this purpose, I disregarded those in the Error category, which seemed to reflect a problem with scholarship databases rather than with scholarships.  I found that, of the 492 non-Error scholarships that were mentioned in only one database, 398 (81%) fell into the Unwanted/Irrelevant category, 65 (13%) into the Potentially Relevant But Not Applying category, and 29 (6%) into the Might Apply category.  By comparison, the 33 non-Error scholarships mentioned in more than one database were coincidentally divided, 11 (i.e., 33%) each, into those three categories.  This outcome suggested that scholarships listed in more than one database were considerably more likely to offer me something of value than were scholarships listed in only one database.  One hunch from this finding was that a law of diminishing returns might be at work, not in the sense that the databases were exhausting the world’s supply of scholarships, but rather in the sense that databases seeking to add ever more unique scholarships to their lists might be picking shriveled fruit that would not appeal to many students.  Perversely, the scholarships that were more desirable, due to their relatively favorable terms, could become yet more sought-after due to greater coverage by databases.  I was not familiar with how databases choose which scholarships to cover, so I did not pursue that line of thought further.
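The percentages in this comparison follow directly from the stated counts.  A minimal sketch of that cross-tabulation, using the figures from the text (group labels are mine):

```python
# Counts taken from the text: non-Error scholarships, by uniqueness and outcome.
counts = {
    "one database": {"Unwanted/Irrelevant": 398,
                     "Potentially Relevant But Not Applying": 65,
                     "Might Apply": 29},
    "2+ databases": {"Unwanted/Irrelevant": 11,
                     "Potentially Relevant But Not Applying": 11,
                     "Might Apply": 11},
}

# Convert each group's counts to rounded percentage shares.
shares = {}
for group, outcomes in counts.items():
    total = sum(outcomes.values())
    shares[group] = {k: round(100 * v / total) for k, v in outcomes.items()}
    print(group, total, shares[group])
```

The contrast of interest is the Might Apply row:  6% of one-database scholarships versus 33% of multi-database scholarships.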

While my spreadsheet contained only limited data on scholarship amounts, I decided to see if there seemed to be any patterns of interest in those data.  I calculated that the 425 unique scholarships for which my spreadsheet contained dollar values had maximum amounts, on average, of $7,680 each, while the maxima of the 45 scholarships that were listed by more than one database averaged only $4,639.  I guessed that this difference might be due to the nature of higher-value scholarships.  They would probably tend to involve PhD-level work, or rare and hard-to-apply-for opportunities like those offered by the Rotary, and would therefore not attract much mass interest among students generally.  That is, there might not be so much incentive for scholarship databases to make sure they were included.  Also, my results were doubtless influenced by the fact that I did not list my schools of interest in all of these database searches; perhaps I would have found that higher-end scholarships were mentioned by more databases if I had done so.  In a separate calculation, I found that the 244 unique scholarships rated Unwanted/Irrelevant for which I had dollar data averaged a maximum of $8,289, while the 32 scholarships (unique or not) rated Might Apply averaged a maximum of $4,687.  It preliminarily appeared that I was not presently eligible for and/or interested in the sorts of opportunities connected with the higher-amount scholarships.

I was out of time for this study.  By this point, I had also become increasingly convinced that my errors in this first run, and my unfamiliarity with matters related to scholarships and their databases, had given me data of mixed value.  It seemed that I had made a good start at getting my arms around this topic, and that, if there was a next time, I might be better prepared to do a more streamlined writeup and analysis, perhaps with some attention to the scholarly literature on scholarships.

While working through these various steps, I had a number of reactions and insights.  I have largely presented those thoughts above, in connection with the experiences that provoked them.  Among other things, there seemed to be a variety of ways in which scholarship funders and databases could provide more helpful and complete disclosure.  I did wonder to what extent, if any, these matters would be addressed by the time of my next review, if any.

It appeared to me, so far, that there were a number of inefficiencies in the scholarship system, such as it was.  These inefficiencies could cost students inordinate amounts of wasted time.  They might also frustrate scholarship funders, who are perhaps not always reaching the kinds of people, drawing the kinds of attention, or otherwise achieving the kinds of outcomes that they might have hoped for.  As one possibility, in place of these variously dysfunctional and universally incomplete databases, it might be advisable for someone to set up a wiki or other online data center in which any grant or scholarship funder could obtain a free listing in exchange for providing certain required information (e.g., application/approval ratio) and consenting to certain standard arrangements (e.g., deriving some applicant information from a passworded central database rather than requiring applicants to submit it repeatedly to every scholarship of interest).
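To make that listing idea concrete, here is a hypothetical sketch of what a required-disclosure record in such a data center might contain.  All field names are my own invention, not drawn from any existing system; the disclosure fields correspond to the application/award numbers discussed above.

```python
from dataclasses import dataclass

# Hypothetical required-disclosure listing for a central scholarship registry.
# Field names are illustrative assumptions, not part of any real system.
@dataclass
class ScholarshipListing:
    name: str
    max_amount_usd: int
    applications_last_cycle: int   # required disclosure
    awards_last_cycle: int         # required disclosure
    uses_central_profile: bool     # consents to drawing applicant data centrally

    @property
    def applications_per_award(self) -> float:
        """The odds-of-winning figure that applicants rarely get to see."""
        return self.applications_last_cycle / self.awards_last_cycle

# Example with made-up numbers in the range of the DoE figures cited above.
listing = ScholarshipListing("Example Award", 5000, 20000, 5, True)
print(listing.applications_per_award)
```

With such a record on file, a student could see at a glance that this hypothetical award drew 4,000 applications per award, and could weigh the application effort accordingly.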


2 Responses to “Review of College Scholarship Databases – First Cut”

  1. I really was browsing for creative concepts for my blog site and uncovered your post, “Review of College Scholarship Databases – First Cut | Improving Higher Education”, will you care
    if I actually use a bit of your suggestions? With thanks -Lavonda

  2. Ray Woodcock Says:

    Sure, Lavonda, feel free. Just the usual courtesy of a reference back here, please. Good luck!
