Scholars often have a vested interest in journal quality and prestige, as publications in “high impact” journals can influence career decisions and outcomes. Yet two of the most commonly used bibliometrics, the Google Scholar (GS) h5 index and the Web of Science five-year impact factor (5Y-IF), sometimes produce vastly different rankings. In the field of criminology and criminal justice (CCJ), the GS h5 rankings actually correlate negatively with the 5Y-IF and other metrics of journal impact. We argue that part of the discrepancy stems from the fact that the maximum possible h5 value for any journal increases with the number of articles it publishes, potentially “punishing” journals that publish fewer articles. To address this concern, we introduce a bootstrap sampling method that calculates an adjusted h5 index by holding the number of articles constant across journals, allowing for a more direct comparison of journal impact. We apply this method to 10 journals in CCJ and find that adjusting for the number of articles published leads to very different h5 rankings. Further, whereas the unadjusted h5 correlated negatively with other metrics of journal impact and prestige, the adjusted h5 correlates strongly and positively with the rankings derived from those metrics. The GS h5 retains a number of valuable features, including accessibility and inclusivity, but until it adjusts for this denominator, scholars should place less stock in its journal rankings.
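The bootstrap adjustment described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors’ exact procedure: the abstract does not specify the number of resamples or whether sampling is with or without replacement, so the function names (`h_index`, `adjusted_h5`) and the choice of sampling without replacement here are assumptions.

```python
import random

def h_index(citations):
    """h = largest h such that h articles each have at least h citations."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def adjusted_h5(citations, n_articles, n_boot=10_000, seed=0):
    """Bootstrap-adjusted h5 (illustrative): repeatedly draw n_articles
    citation counts from the journal's five-year article pool (without
    replacement, an assumption) and average the resulting h-index, so
    every journal is scored on the same number of articles."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_boot):
        sample = rng.sample(citations, n_articles)
        total += h_index(sample)
    return total / n_boot
```

Fixing `n_articles` at, say, the smallest article count among the journals being compared removes the mechanical advantage that high-volume journals have in the raw h5, since a journal’s maximum possible h5 equals the number of articles it published in the window.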