The motivation behind LaxMetrics.com is to unearth and develop new ways of evaluating player performance. As is evident across the site, there is no shortage of methods for evaluating offensive players against each other. Whether it’s drawing from existing stats or tracking new categories, we have ample inputs available to which we can apply some arithmetical magic. The same is true of defense/transition players and goalies, though to a lesser extent. Truthfully, it’s especially difficult to measure the performances of defensive/transition players objectively.

This raises the question: how can we strip away positions and compare players out the front door to their teammates out the back door? Is it possible to use numbers to create an apples-to-apples comparison across all positions on the floor? In this LaxMetrics blog entry, we’re going to attempt to do exactly that.

First, we are going to introduce you to a new kind of metric—the LaxMetrics Weighted Average. Then once we’ve established the origins of the Weighted Average and how it works, we will group players into three tiers based on their score distributions. And lastly, we are going to highlight a few interesting cases that may support conventional wisdom or may come as a surprise.

Let’s get into it.

At its core, the LaxMetrics Weighted Average is simply a collection of individual statistics pooled together, rated by their importance, and then averaged out. Whereas a typical average considers each input equally, a weighted average allows us to assign a different level of importance (a weight) to each input. The advantage of employing a weighted average as opposed to a classic average is that we can curate which stats we think should count for more than others. After all, it would be silly to value short-handed goals and loose balls the same. One is far more common than the other, so weighting them equally would erase the scarcity value of short-handed goals. That would be a waste of time, right?
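To make the distinction concrete, here’s a minimal sketch in Python of a classic average versus a weighted one. The stat line and the weights are made up for illustration; they are not the real formula.

```python
# Hypothetical stat line: 2 short-handed goals, 80 loose balls.
stats = {"short_handed_goals": 2, "loose_balls": 80}

# Illustrative weights reflecting scarcity (not the real LaxMetrics values).
weights = {"short_handed_goals": 3.0, "loose_balls": 0.25}

# A classic average treats every input equally.
classic_avg = sum(stats.values()) / len(stats)  # (2 + 80) / 2 = 41.0

# A weighted average scales each input by its importance,
# then normalizes by the total weight.
weighted_avg = sum(stats[k] * weights[k] for k in stats) / sum(weights.values())
# (2*3.0 + 80*0.25) / (3.0 + 0.25) = 26 / 3.25 = 8.0
```

Notice how the weighted version stops the mountain of loose balls from drowning out the rare short-handed goals.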

While we’ll use the Weighted Average to compare players across positions, the formula we use actually depends on the player’s position. One formula is applied to forwards, while another is used for defense/transition guys. Despite the formulas being different, the outputs they produce should be comparable. Below are the lists of inputs we used. The first grouping, applied to forwards, is labeled “O Weighted Average” and the second grouping, applied to defense/transition players, is labeled “D Weighted Average”. The weight of each input is listed in parentheses next to its corresponding input.

**O Weighted Average:** Even Strength G (2), PPG (1.5), SHG (3), PD (1), PA (2), UA (1), FoA (2), SoA (1), GoE (1)

**D Weighted Average:** Even Strength G (1), PPG (0.75), SHG (3), PD (1), LB (0.25), CTO (2.5), UA (0.25), FoA (1.5), GoE (1)
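As a hedged sketch of how these two formulas might be applied, here is one plausible implementation that weights each input and normalizes by the total weight (the site may normalize differently). The stat lines below are entirely hypothetical, and the abbreviation keys mirror the lists above.

```python
# Weights transcribed from the lists above (abbreviations as in the text).
O_WEIGHTS = {"ESG": 2, "PPG": 1.5, "SHG": 3, "PD": 1, "PA": 2,
             "UA": 1, "FoA": 2, "SoA": 1, "GoE": 1}
D_WEIGHTS = {"ESG": 1, "PPG": 0.75, "SHG": 3, "PD": 1, "LB": 0.25,
             "CTO": 2.5, "UA": 0.25, "FoA": 1.5, "GoE": 1}

def weighted_average(stats, weights):
    # Missing categories count as zero; normalizing by the total weight
    # keeps the O and D outputs on a comparable scale.
    total = sum(stats.get(k, 0) * w for k, w in weights.items())
    return total / sum(weights.values())

# Hypothetical stat lines for one forward and one defense/transition player.
forward = {"ESG": 3, "PPG": 1, "PA": 1, "UA": 2, "FoA": 3, "SoA": 1, "GoE": 0.5}
defender = {"ESG": 1, "LB": 8, "CTO": 2, "FoA": 1, "GoE": 0.2}

o_score = weighted_average(forward, O_WEIGHTS)
d_score = weighted_average(defender, D_WEIGHTS)
```

Even though the two players pile up completely different raw stats, the two scores land on the same scale and can be compared directly.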

As you can see, seven of the nine inputs are the same across the two weighted averages. The differences are that the O Weighted Average includes Pick Assists and Second Order Assists, while the D Weighted Average includes Loose Balls and Caused Turnovers. Additionally, Even Strength Goals, PowerPlay Goals, Unrealized Assists, and First Order Assists are assigned different weights in each average. Because defense/transition players are not primarily goal scorers and shot creators, it isn’t appropriate to give scoring categories the same gravity as stats like Caused Turnovers when evaluating defensive performances. If we were to weight Even Strength Goals and PowerPlay Goals the same in each average, we would unintentionally suppress D Weighted Average scores, ruining our effort at an apples-to-apples comparison.

The only scoring categories that are weighted the same in each average are Short-Handed Goals and Goals Over/Under Expectations. The weights assigned are the same because both stats are roughly the same across positions. For example, league-wide, there isn’t a tremendous difference between the number of shorties scored by forwards as opposed to defense/transition players. The same can’t be said for other inputs like Even Strength Goals, PowerPlay Goals, Unrealized Assists, and First Order Assists. Those four categories are inherently offense-centric, meaning that the distribution is disproportionately tilted toward forwards. The inverse is true of Caused Turnovers and Loose Balls, which are generally slanted heavily toward defense/transition guys.

Because the Weighted Average is a cumulative statistic that grows as a player accumulates relevant stats, season totals are a viable tool for comparing players to each other. Weighted Average Per Game can also be useful, but it won’t be the primary method of comparison used by the LaxMetrics blog.
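The total-versus-per-game distinction is just accumulation versus normalization; a quick sketch, with hypothetical per-game scores:

```python
# Hypothetical per-game Weighted Average scores for one player.
game_scores = [0.41, 0.30, 0.52, 0.28]

# The season total is cumulative: it grows with every game played.
season_total = sum(game_scores)             # 1.51

# The per-game version normalizes for playing time.
per_game = season_total / len(game_scores)  # 0.3775
```

A player who misses games is penalized in the total but not in the per-game rate, which is why the two views can rank players differently.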

Now onto the players and their scores.

Below you can see a list of the top-30 NLL players ranked by their total Weighted Average scores through Week 16. Keep in mind that goalies are excluded from these rankings, so this can’t be considered a true top-30 list.

It doesn’t take much digging to notice that the top-30 is still heavily slanted toward forwards. Of the group, 21 are forwards, while only 9 would be classified as defense/transition players. While this might reflect imperfections in our attempt at creating an apples-to-apples comparison metric, the LaxMetrics blog might argue that there are simply more truly elite, game-changing players on offense around the league than there are out the back door. Looking at the list, virtually every team in the league has at least one star forward included in the Weighted Average top-30. Additionally, the breakdown begins to balance among players ranked 31-60, where nearly half (13) are defense/transition players.

Looking at the graph below, we can see a distribution breakdown of all of the Weighted Average scores league-wide.

There is a fairly clear demarcation between what we would consider Tier 1 and Tier 2. The drop-off from Tier 2 to Tier 3 is even steeper. Tier 1 comprises only 13 players, which is just 4.1% of the 317 total position players ranked by Weighted Average. This means that in order to qualify as a Tier 1 member, a player has to rank in the 96th percentile or better. These players are truly having elite seasons.

Tier 2, however, is far larger. The group comprises 48 players, which amounts to 15.1% of all players ranked. This means that in order to qualify as one of the 61 players in Tier 1 & Tier 2 combined, a player has to score in the top 20% of all Weighted Averages. If Tier 1 comprises the league’s elite, Tier 2 is made up of the very, very good.
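The tier percentages above can be reproduced with quick arithmetic, using the player counts quoted in the text:

```python
ranked = 317              # position players ranked by Weighted Average
tier1, tier2 = 13, 48     # tier sizes quoted above

tier1_share = 100 * tier1 / ranked         # ≈ 4.1% of the league
tier2_share = 100 * tier2 / ranked         # ≈ 15.1%
combined = 100 * (tier1 + tier2) / ranked  # ≈ 19.2%, i.e. the top 20%
```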

Tier 3, which includes all remaining players, spans a wide range, from the league’s least productive players up to a group of very good ones at the top. The reason we don’t break Tier 3 into additional tiers is visible in the graph: the steep drop-offs in the score distribution sit at the tier borders, and no comparably clear break exists within Tier 3 itself. As a result, Tier 3 holds very good players alongside very unproductive ones. Falling into Tier 3 isn’t necessarily an indictment of a player’s performance; it is just an illustration of where he stands compared to the league’s best.

Below is the full 317-player list, which you can sort by either total Weighted Average or Weighted Average Per Game:

To offer a little added context, the league-average Weighted Average Per Game is 0.345 and the median total Weighted Average is 3.27, the “middle” value of all 317 scores.

When we compare individual players to the median, it puts into perspective just how excellent both Tier 1 and Tier 2 players have been. Each of the 61 players in Tiers 1 & 2 scores significantly higher than the median, often more than double its value. For example, Toronto’s LaTrell Harris is the final player in Tier 2, posting a Weighted Average of 5.83, roughly 178% of the median. Following Harris, there are still 97 players who have scored above the median but are confined to Tier 3.
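That percent-of-median figure can be checked in one line, using the values from the text:

```python
median_total = 3.27  # league median total Weighted Average
harris_total = 5.83  # LaTrell Harris, the last player in Tier 2

pct_of_median = 100 * harris_total / median_total  # ≈ 178%
```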

But through all of this, who are the outliers whose positions in the rankings might come as a surprise? One example is Toronto’s Challen Rogers. While he is still a Tier 2 player, he doesn’t rank among the league’s elite transition players. His overall position of #44 on the Weighted Average list places him clearly behind the primary group of contenders for Transition Player of the Year, an award that Rogers has dominated recently. In fact, Rogers ranks behind both Mike Messenger from Saskatchewan and Panther City’s Matt Hossack, a pair of players making a relatively recent ascent into the “league best” transition conversation.

Perhaps even more surprising than Rogers’s position is that of Colorado’s Joey Cupido. A former Transition Player of the Year himself, Cupido ranks outside the top two tiers at #73 overall. Statistically, Cupido has taken a bit of a step back this year, despite still being an impact player. While Cupido’s Weighted Average is held down by his relative lack of loose balls and limited point total, the Mammoth are still 5-1 when he records a point.

In another mildly surprising realization, a pair of Saskatchewan forwards rank among the league’s 10 best players, despite the Rush having featured one of the league’s least impressive offenses. Between Mark Matthews (#5) and Robert Church (#7), the Rush are the only team with two forwards in the top-10. In fact, the only other team with two players in the top-10 is the Roughnecks, who boast Zach Currier at #3 and Jesse King at #10. It’s crazy to think that two of the league’s worst teams combine to feature four of the league’s top-10 players by Weighted Average. Saskatchewan and Calgary have plenty of problems, but star power certainly is not one of them.

Interestingly, the current state of the LaxMetrics Weighted Average rankings aligns quite well with much of the conversation about postseason awards. Much of the MVP conversation has centered around the top three forwards on the list: Albany’s Joe Resetarits, Buffalo’s Dhane Smith, and Georgia’s Lyle Thompson. For good measure, dark horse MVP candidate Zach Currier is visibly present at the top.

Similarly, the top four defense/transition guys in the Weighted Average rankings are Currier, Vancouver’s Reid Bowering, and Toronto’s Mitch de Snoo and Brad Kri in that order. If you were to ask most media people around the league who their finalists for the Transition Player of the Year would be at this point, most would return an answer including three of those four players.

The conversation around the Rookie of the Year is also reflected quite clearly in the Weighted Average rankings. The hierarchy seems to follow what most of the debate suggests: a top four of New York’s Jeff Teat, Bowering, Panther City’s Patrick Dodds, and Buffalo’s Tehoka Nanticoke, in that order. You’d have a hard time finding anyone around the league whose RoY finalists would include someone other than that top four.

While this is only our first true attempt at creating an apples-to-apples comparison metric, there is inevitably room for improvement in optimizing the inputs and their weights. Surely it can be improved upon in the future, but its current alignment with popular opinion bodes well for the accuracy of its rankings. Eventually the Weighted Average will give way to an even more meticulous positionless ranking of players, but for now we can use it to explore some of the NLL’s landscape. Numbers and the eye test don’t always agree, but it’s particularly interesting when they do. In this case, the duo is more in sync than not.

Make sure to subscribe to the LaxMetrics blog to get each new exploration sent directly to your email inbox!