MLB Doesn’t Have Much Appetite for Innings Eaters Anymore

Modern strategies for pitching usage have turned “innings eaters” into a dying breed. What does the term even mean at this point? And will starting pitchers ever get to feast again?

I can pinpoint precisely when I realized I no longer knew what an innings eater was: July 31, 2023, the day before the MLB trade deadline, when ESPN’s Jeff Passan bestowed the venerable label on Aaron Civale in a tweet.

Civale hasn’t totaled 125 innings in any of his five major league seasons—one of which, to be fair, was 2020, when he was among the league leaders in workload. All in all, he’s averaged roughly 5.5 frames per start. Innings nibbler, snacker, or sampler, I’d accept. But by historical starting-pitcher standards, Civale hasn’t done much more than push innings around on his plate.

I’m not trying to pick on Passan, who was named national sportswriter of the year despite his questionable classification of Civale. In fact, I’d suggest that the problem isn’t Passan’s innings-eater-dar, but rather the increasing difficulty of discerning who fits the description. When that tweet was sent, Civale ranked 20th in innings per start out of the 144 pitchers who’d started at least 10 games. He was eating innings, relative to his peers. Yet if we’re affixing an “innings eater” tag to a pitcher whose season high in innings would barely fill a first half for Liván Hernández, what does the term even mean anymore? And what does the innings eater’s shrinking appetite tell us about teams’ pitching priorities?

Although innings eaters have existed since baseball was born, the expression seems to be younger than Adam Wainwright. In his 1985 book Nine Innings, Daniel Okrent described Jerry Augustine as “an inning eater, an omnivore who could occupy the mound when it was hopeless to waste a truly valuable pitcher.” Neither I nor Wall Street Journal language columnist Ben Zimmer could find an earlier reference, which means that Okrent, who’s also credited with inventing rotisserie baseball, may deserve credit for coining a second catchy term that blended baseball and food. “Every time I see it used, I think, ‘Did I coin that?’” he says via email. “In fact, I think I did!”

Nine Innings chronicled a single game from 1982, when Augustine was a sub-replacement reliever. That’s not an innings eater as we know them (or think we know them) now. Nor was Mark Ross, the subject of a 1988 article that called him “the 10th man on the staff, the likely inning eater in lost causes.” (In 1988, teams made do with fewer than 13 pitchers apiece.) Ross “ate” all of 42 1/3 innings in his major league career, which can hardly have hit the spot.

Initially, “inning eater” was a versatile term that could apply either to sacrificial relievers who filled the garbage-time role now occupied by position-player pitchers, or starters whose main talent was taking the ball. In an early non-Okrent use in the summer of ’87, an ancient Phil Niekro was sold to skeptical Blue Jays fans as “an inning eater, a guy who can give five or six reasonable innings every fifth or sixth day and give them a chance to win without tuckering out the bullpen.” Niekro, who was having a Wainwright-esque last gasp, couldn’t quite live up to the title: His 48-year-old right arm had only four starts left, in which he went 15 innings and allowed 16 runs. In early ’89, the sobriquet found a more deserving recipient: Kevin Gross, who was then about to embark upon the last of five straight seasons in which he topped 200 innings, a span that saw him compile a cumulative 96 adjusted earned run average (ERA+). Ahh, that’s the stuff.

Over time, “inning eater” has morphed into “innings eater”—an example of the creeping pluralization of attributive nouns (the same process that turned “Yankee fan” into “Yankees fan”). And it’s also settled into the definition laid out in the Dickson Baseball Dictionary: “A starting pitcher who can be relied upon to pitch a lot of innings with respectable, but generally unspectacular results.” Dickson—whose third and most recent edition was published in 2011—specifies, “Such a pitcher will throw more than 200 innings a year.” In 2010, the year before that book came out, 43 pitchers threw more than 200 innings. In 2023, only four did, three of whom were unacceptably spectacular. By the Dickson definition, then, the Cardinals’ Miles Mikolas (201 1/3 IP, 91 ERA+) was the last surviving specimen—an innings-eater endling.

Speaking of the Cardinals: In a Midwestern way, they still seem to value unspectacular steadiness. Cardinals president of baseball operations John Mozeliak, who rebuilt the team’s rotation by signing Sonny Gray, Kyle Gibson, and Lance Lynn, told The Athletic’s Chad Jennings, “We aggressively targeted innings.” But much as several managers quoted in Jennings’s piece extol the importance of innings eating, most teams, in practice, are putting pitchers on a diet. St. Louis is “certainly excited about having guys who understand getting deep into a game is a good thing,” but other contenders’ pursuit of high-performance, low-volume options prompted FanGraphs’ Dan Szymborski to observe “the increasing tendency of some of the top teams in baseball to basically de-emphasize inning-eater types, instead accumulating scads of talented pitchers with ifs and feeling confident at cobbling together the right available guys when needed.” Hence the Dodgers and Braves acquiring and extending Tyler Glasnow and Chris Sale, respectively: Those guys are good on a per-inning basis, but availability isn’t their strong suit.

You can see why those two teams in particular would prioritize upside: Both suffered early exits from the playoffs last season after entering the NLDS with diminished rotations, and both are virtual locks to make it back to October this year. Thanks to their ridiculous lineups and depth, L.A. and Atlanta have the luxury of punting on proven durability in hopes of having high-upside arms available in a crucial short series, when teams rely a lot more heavily on high-performance pitchers. (Not that Glasnow and Sale are certain to be functional when those series start, but they’ll be free to take precautions and pace themselves with the playoffs in mind.) Even for less stacked teams, though, there’s not as much downside to this strategy as there once would’ve been. The lower the innings ceiling for starters, and the more October comes to dictate how a season is perceived, the more modest the opportunity cost of favoring efficiency over bulk.

Remember Dickson’s definition of an innings eater? “A starting pitcher who can be relied upon to pitch a lot of innings with respectable, but generally unspectacular results.” Erase the last six words, and you’d still have a hard time finding pitchers who fit the bill. It’s not just the workmanlike pitchers who are working less; the spectacular starters are on similarly severe innings rations. The graph below, which dates back to the beginning of the live ball era more than a century ago (and excludes shortened seasons), shows the average number of pitchers per team who’ve thrown at least 180 innings. The orange line is for better-than-average pitchers, as indicated by an ERA- below 100. (With ERA-, lower is better, in contrast to ERA+.) The blue line represents average-or-worse pitchers, as classified by ERA- marks of 100 or higher. Although good pitchers still surpass the 180-inning mark more regularly than mediocre arms do—getting outs more efficiently makes it easier to accumulate more—both cohorts have seen their membership plummet after several decades of relative stability.
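(If you’re curious, the bookkeeping behind a graph like that is simple. Here’s a minimal pandas sketch with invented pitcher-seasons standing in for the real data; the column names are my placeholders, not any official source’s schema.)

```python
import pandas as pd

# One invented row per pitcher-season; IP, ERA_minus, and teams are
# placeholder columns, not any real data source's schema.
df = pd.DataFrame({
    "season":    [1975, 1975, 1975, 2023, 2023, 2023],
    "pitcher":   ["A", "B", "C", "D", "E", "F"],
    "IP":        [280.0, 195.0, 182.0, 184.0, 140.0, 209.0],
    "ERA_minus": [85, 104, 99, 92, 110, 70],  # ERA-: 100 = league average; lower is better
    "teams":     [24, 24, 24, 30, 30, 30],    # teams in the league that season
})

workhorses = df[df["IP"] >= 180]
team_counts = df.groupby("season")["teams"].first()

# Split the 180-inning club into better-than-average (ERA- < 100) and
# average-or-worse (ERA- >= 100) cohorts, normalized per team.
for label, mask in [("good", workhorses["ERA_minus"] < 100),
                    ("meh",  workhorses["ERA_minus"] >= 100)]:
    per_team = workhorses[mask].groupby("season").size() / team_counts
    print(label, per_team.fillna(0).round(3).to_dict())
```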

It’s not news that individual workloads are down; starters’ inning allotments have been falling in fits and starts since the era of Old Hoss Radbourn, prompting previous generations of writers to wring their hands. (“LAME PITCHERS,” blared an 1884 headline on a St. Louis Post-Dispatch story that lamented how “high-priced twirlers shirk their duties”—which were “nothing more than nine innings of pitching once a day … during an average of four days a week.”) But along with lots of other analytically driven developments, this trend has accelerated in the past decade—to the point that even for elite arms, innings are almost off the menu. And how can anyone eat them when they’re barely being served?

If innings eating is over, it’s not just because innings totals are lower than they used to be, but also because they vary so little from pitcher to pitcher. As the chart below of the standard deviation in average start length suggests, there’s less variation in how deep non-opener starters go into games than there used to be. Whether starters are good or bad, they tend to be pulled at about the same point in the game—because bullpens are better and deeper, and teams are so wary of injuries and the times-through-the-order penalty.
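(The metric behind that chart is nothing fancy: one average-start-length figure per starter, then the standard deviation of those figures per season. A toy version, with made-up rows in place of the real league-wide data:)

```python
import pandas as pd

# One invented row per starter-season; avg_start_len is innings pitched
# as a starter divided by games started.
starters = pd.DataFrame({
    "season": [1998] * 4 + [2023] * 4,
    "avg_start_len": [7.4, 6.8, 5.6, 5.0,   # a wide spread across the rotation
                      6.1, 5.7, 5.5, 5.2],  # everyone bunched in the 5-to-6 band
})

# A shrinking standard deviation means good and bad starters alike
# are getting pulled at about the same point in the game.
print(starters.groupby("season")["avg_start_len"].std())
```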

Of the 117 starters who threw at least 100 innings last season, 13 averaged more than six innings per start (with a high of 6.6 from Marlins throwback Sandy Alcántara, who had Tommy John surgery in October), and 12 averaged fewer than five (with a low of 4.6 from Luke Weaver). Everyone else clustered into the narrow five-to-six range. In that sense, starter workloads are a little like game lengths post–pitch clock: Everyone knows that they’ve shrunk, but not everyone appreciates how homogeneous they’ve become.

We can see the same trend with per-game pitch counts, using data that dates back to 1988 (and excludes outings of fewer than 40 pitches to eliminate most openers). As the average number of pitches per start has fallen, especially in the past several seasons…

…and the highest pitch counts of the season have steadily decreased…

…so has the variation in the number of pitches per start.
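(The screen-and-summarize step behind those three charts looks something like this sketch, with invented pitch counts in place of the real game logs:)

```python
import pandas as pd

# One invented row per start.
starts = pd.DataFrame({
    "season":  [1998] * 5 + [2023] * 5,
    "pitches": [128, 112, 97, 85, 35,
                 98,  94, 90, 87, 30],
})

# Drop sub-40-pitch outings to screen out most openers, mirroring the
# filter described above, then summarize each season's distribution.
full_starts = starts[starts["pitches"] >= 40]
print(full_starts.groupby("season")["pitches"].agg(["mean", "max", "std"]))
# In the real data, all three columns trend down over time.
```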

Even when a starter is “cruising,” he’s usually not going to go into the ninth, eighth, or even seventh—both because past performance doesn’t predict clean innings to come, and because pitchers aren’t conditioned to push their pitch counts deep into triple digits. The next chart, which displays the percentage of starts of a given pitch count, broken down by decade, also illustrates the increasing conformity of pitch counts imposed by the dropping upper bound. The slope leading up to 90 pitches looks largely the same, but the descent beyond that point keeps getting steeper.

It would be one thing if the average pitch count were falling but exceptions were still made for certain aces or rubber-armers who abided by different rules. Instead, there’s a growing rigidity to single-game workloads, which is also reflected in teams’ unwillingness to give pitchers longer leashes when they’re chasing historic starts. In 2012, there was only one instance of a pitcher exiting a game with more than five innings pitched while he had a no-hitter going—and, in that case, the culprit was a groin pull. In each of the past two seasons, there were a record 19 such games, and the once-controversial mid-no-hitter hook has become so common that the rare example of deference to tradition is shocking.

Essentially the same thing has happened on a season-long level. A century ago, there was next to no correlation between how effective a regular starter was on a per-inning basis (as measured by wins above replacement [fWAR] per inning pitched) and how many innings he averaged per start. Even so-so pitchers were expected to “finish what they started.” Over time, as bullpens got established, the correlation increased: The quality of a starter dictated how much he pitched. (I know, novel concept.) Now, though, that correlation is on the wane. Pitching well doesn’t necessarily lead to pitching prolifically.
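(Concretely, the correlation in question pairs each starter’s WAR divided by his innings with his innings divided by his starts. A minimal pandas sketch, with invented numbers:)

```python
import pandas as pd

# Invented pitcher-seasons; the real exercise would use every regular
# starter in a given era.
df = pd.DataFrame({
    "WAR": [5.2, 2.1, 0.4, 4.8, 1.0],
    "IP":  [200.0, 170.0, 150.0, 160.0, 120.0],
    "GS":  [32, 30, 28, 28, 24],
})

df["war_per_ip"] = df["WAR"] / df["IP"]    # per-inning effectiveness
df["ip_per_start"] = df["IP"] / df["GS"]   # how long he's allowed to go

# Pearson r between quality and length; in the real data, this number
# has drifted back toward zero after decades in solidly positive territory.
print(df["war_per_ip"].corr(df["ip_per_start"]))
```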

Part of the reason for this decreased correlation is that the goals of going hard and lasting long (phrasing) pull in opposite directions. Moving the finish line closer to the starting block allows starters to air it out early: If the objective is five or six innings, they don’t have to hold back as much as they would if they wanted to go eight or nine. In other words, some of the starters with the most max-effort approaches are intentionally upping their WAR per inning at the expense of their innings per start—and teams typically encourage that trade-off.

In that sense, the line between starters and relievers (who’ve always been more max-effort) is blurrier than before—or, as Russell Carleton of Baseball Prospectus put it in 2022, “Everyone is a reliever. Even the starters.” The corollary, it seems, is that everyone is a starter, even the relievers. Last year, the Rays successfully moved habitual bullpen guy Zack Littell into the rotation (as they’d previously done with Jeffrey Springs and Drew Rasmussen); the Padres pulled the same trick with Seth Lugo, as the Yankees did with Michael King. This offseason, several other relievers have toyed with rebranding as starters, including Jordan Hicks, Reynaldo López, Brent Suter, and A.J. Puk. The innings bar for starting has sunk so low that even pitchers who previously washed out of a rotation can try tweaking their pitch mix and giving it a go.

In an environment where relievers like these are viewed as viable starters, the choice between a hurt-or-good guy and an available-but-blah guy isn’t that tough. It’s not as if Atlanta’s Sale slot could have gone to Gross, who’d be below average on a rate basis but could undoubtedly compile 230-plus innings of the 1,400 or so that a team has to hand out. The Grosses are long gone, along with their mustaches and mullets. (Never mind, maybe mullets are back.) The opportunity cost of spurning the era-adjusted innings eaters of today to chase the ceiling of a more talented but less dependable pitcher is pretty small.

Consider: The Reds and Mets took fliers on the oft-injured Frankie Montas and Luis Severino, respectively, for more guaranteed dollars than the Cardinals gave Lynn. Lynn is likely to throw more innings than either Montas or Severino in 2024, but how many more? He managed 183 2/3 last season despite a 77 ERA+, but that’s 43 fewer innings than Hernández accrued with a 76 ERA+ in 2001, 53 fewer than Vida Blue tallied with a 70 ERA+ in 1979, and 80 fewer than Jim Bibby amassed with a 75 ERA+ in 1974. Opting for the innings eater now is the setup for a pitching spin on the joke from Annie Hall: “These innings taste terrible,” someone says, to which someone else answers, “Yes, and such small portions!”

From a team point of view, practicing portion control makes a certain sense with aces and innings eaters alike. At first, teams mostly lopped off ineffective frames in which a starter might have stayed in the game and gotten into trouble because he was tired or facing a hitter for the third or fourth time. By reallocating replacement-level innings to fresh relievers, teams could curtail starters’ workloads without proportionately lowering their value. After all, most of today’s teams wouldn’t want 200 extra outs from a fatigued Lynn or Jordan Lyles; those innings could be in better hands (or stomachs).

At this point, though, even baseball’s best arms spend so much less time on the mound that they can’t make up in quality what they used to supply in quantity. On average, the top 10 pitchers today provide fewer WAR per season than ever before.

Similarly, the correlation between innings pitched and WAR among starters who qualify for the ERA title is weaker than ever. In earlier eras, there was a wide range in innings totals, even among those who cleared the “qualified” bar by pitching at least one inning per team game; some might have squeaked by with 165 innings, while others might have thrown 300. (Steve Carlton’s 1980 was the last of the 300-inning seasons; Justin Verlander’s 2011 was the last to top 250.) Knowing how many innings a starter threw went a long way toward telling you what that starter was worth. Now, the few pitchers who still qualify cluster close enough to the unchanged minimum that their innings totals alone don’t differentiate them.
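(What’s happening there is textbook restriction of range: Squeeze everyone’s innings toward the 162-inning floor, and innings lose their power to sort pitchers. A toy illustration, with two invented eras of qualified starters:)

```python
import pandas as pd

# Two invented "eras" of ERA-title qualifiers (>= 162 IP).
era_wide = pd.DataFrame({"IP":  [330, 290, 250, 210, 170],
                         "WAR": [8.0, 6.5, 5.0, 3.5, 2.0]})
era_tight = pd.DataFrame({"IP":  [178, 172, 168, 165, 163],
                          "WAR": [2.5, 6.0, 1.5, 4.5, 3.0]})

# Wide innings spread: workload alone sorts the aces from the ordinary.
print(era_wide["IP"].corr(era_wide["WAR"]))    # 1.0 with these toy numbers
# Everyone bunched near the 162-inning minimum: workload tells you little.
print(era_tight["IP"].corr(era_tight["WAR"]))  # roughly 0 with these toy numbers
```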

Last season, starters threw less than 58 percent of major league innings, though they still accounted for roughly 73 percent of pitching WAR. In the past few years, the former figure has cratered, while the latter has slipped more slowly, because sparing starters the third time through the order and dipping deeper on the bullpen depth chart strips away their least valuable innings first. The next graph reveals the recent sea change, following a few flat decades: The role-resistant pitching plans that emerged from the fevered dreams of sabermetric baseball bloggers in 2012—including Dave Cameron (now of the Seattle Mariners) and Sam Bankman-Fried (now of the Metropolitan Detention Center, Brooklyn)—aren’t that far from fruition. Pitching staffs have surrendered their regimented structures: They’re now amorphous masses of five-and-dive starters and reliever rat kings (many of them making the major league minimum).
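(Those two shares come straight from the same split. A sketch with invented league totals chosen to land near the 2023 figures cited above:)

```python
import pandas as pd

# Invented league-wide totals, split by role; the real 2023 numbers put
# starters under 58 percent of IP but around 73 percent of pitching WAR.
league = pd.DataFrame({
    "role": ["starter", "reliever"],
    "IP":   [24000.0, 18000.0],
    "WAR":  [420.0, 155.0],
}).set_index("role")

print(league["IP"] / league["IP"].sum())    # starters' share of innings (~0.57)
print(league["WAR"] / league["WAR"].sum())  # starters' share of value (~0.73)
```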

At some point, presumably, the pendulum will stop swinging this way. There may be some signs that it’s swung too far: the explosion in position-player pitching, and the struggles of the soft underbellies of bullpens (which still lack many multi-inning relievers); how quickly productive pitchers expire, as the percentage of hurlers who’ve had Tommy John surgery continues to climb; the fretting over fatigue and familiarity effects, stemming from relievers pitching too often or facing the same hitters too many times. MLB is big on changing rules to restore the sport’s old look, and a stricter limit on active arms—to lengthen starts, suppress strikeouts, and protect pitchers from themselves—remains on commissioner Rob Manfred’s to-do list.

That tweak would be wise. But even if it happens, we won’t return to a time of irresponsible pitch counts, managers who never wanted to know the odds, and starters who insisted on staying in games when they had little left. Nor should we want to; most pining for the past is just nostalgia speaking.

Then again, the old ways weren’t always worse, even if the caliber of competition was. Some pitchers are better than others. They should probably pitch more than others, too.

By definition, innings eaters weren’t riveting to watch, but at least you knew who they were—unlike some of the fungible starters now streaming through staffs. In the innings eaters’ absence, we might be missing out on a more positive familiarity effect: the fact that the more we see someone, the more we tend to like them. As it is, most players pass through rosters so fast we never get to know them.

You know what the end of innings eating reminds me of? The evolution of scripted TV. (Stay with me.) Before Billy Beane and Tony Soprano started winning in ’99, there were only so many channels, and only so many shows. The staples aired so often that, to avoid getting gassed, they had to take something off: They stretched and recycled story lines, filled weeks with junk, and sometimes resorted to clip shows. Some of them—maybe most of them!—were bad, but people watched them anyway because they were all that was on.

Then came the onset of prestige, peak TV. Seasons shrank rapidly, and on an inning-per-inning—er, episode-per-episode—basis, the shows were way better than before. The new model was such a success that the series kept multiplying. Soon, instead of a shortage, there was too much TV: Series came and went too quickly to keep up, characters cleared out just when we were getting attached, and some seasons were whittled down too far for them to fulfill us. The new norm created room for shows that weren’t built for the long haul, but threw heat in short bursts. But that cost us the chance to see what the standouts could do with long seasons, and endangered the comfort TV we could count on to air a lot of episodes with respectable, but generally unspectacular, results.

If 2023 taught TV execs anything, it’s that there’s still a place in our streaming rotations for old-school series with voluminous libraries—the TV equivalents of Kevin Gross. “The traditional type of sitcom and drama and procedural—people still love those,” one TV writer told The Ankler last month. The latest trend in TV buying, per a separate report, is “away from niche … and toward big and broad.”

Episode eaters are back, baby. Maybe one of these seasons, innings eaters will get to gorge again.

Thanks to Ryan Nelson for research assistance.