Monday, February 2, 2015

The 25% Rule: Where did it come from?

Have you heard of the 25% rule?

In case you are unaware or have been off-planet for a while let me explain.

The 25% rule is used primarily by pretreatment authorities to determine when a grease interceptor (or oil separator) is full. A fairly universal definition would be:

"The total depth of the floating grease layer plus the settle-able solids layer cannot exceed 25% of the total liquid depth of the interceptor."

Determination is made by taking a core sample with a device like a Sludge Judge or Dipstick Pro (I know of a few jurisdictions that use fluorescent light covers, the clear plastic tubes from Home Depot or Lowe's, fitted with a rubber stopper).  The device is lowered slowly through the fats, oils and grease (FOG) layer, down through the water, and into the solids layer at the bottom of the interceptor, then capped or plugged, slowly removed, and set aside to rest.  This allows the captured FOG to collect at the top of the device while the solids settle at the bottom.

A measurement is taken, typically in inches, from the top of the FOG layer to the bottom of the device, which represents the tank's total water column.  The FOG and solids layers are then each measured, also in inches, and added together.  If the combined FOG and solids depth is equal to or greater than 25% of the total water column, the interceptor is considered full.

For example, let's take a typical gravity interceptor in the field, like the one pictured here on the left.

The technician is using a Dipstick Pro that appears to be showing a 48" water column, and the spread of his fingers suggests a FOG depth of about 6" (okay, I'm guessing on that, but I have fairly large hands; not like Wilt Chamberlain's, but I wear a large golf glove, and when I spread my fingers like his against a ruler it's about 4.5".  Adding a bit for the extra FOG above and below his fingers puts it at about 6", give or take).  If the solids layer at the bottom of the device is also 6", that would be a combined 12" of FOG and solids.  Divide 12 by 48 and you get 25%.
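The check is simple arithmetic; here is a minimal sketch in Python (the function name and units are my own, not from any jurisdiction's procedure):

```python
def interceptor_is_full(fog_in, solids_in, water_column_in, limit=0.25):
    """Apply the 25% rule: floating FOG plus settled solids versus the
    total water column.  Depths are in inches, but any consistent unit
    works since only the ratio matters.  Returns the measured fraction
    and a full/not-full flag."""
    fraction = (fog_in + solids_in) / water_column_in
    return fraction, fraction >= limit

# The worked example from the post: 6" of FOG + 6" of solids in a 48" column.
fraction, full = interceptor_is_full(6, 6, 48)
print(f"{fraction:.0%} of the water column -> {'FULL' if full else 'OK'}")
# -> 25% of the water column -> FULL
```

The same two measurements and one division are all a technician's field check amounts to.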

This particular interceptor is full, hence the stunned demeanor from the restaurant owner/manager as the technician gives him the bad news.

But wait, there appears to be quite a bit of space left in the interceptor for collecting even more FOG and solids, so how do we know that this interceptor is actually full (a question this restaurant owner/manager probably asked the technician)?

Here's the thing: the rule is not a scientifically based determination of where efficiency breaks down, nor an EPA mandate.  It's more of a generally accepted rule of thumb that many jurisdictions have adopted.

Where did it come from?

Good question.

I've been looking for the answer for a while now and no one seems to know.


In 2011, while preparing for a presentation at the Pacific Northwest Grease Summit in Bellevue, Washington, I wondered if there was a correlation between the capacities of certified hydromechanical grease interceptors and the 25% rule.

I took all of the major manufacturers' certified units (JR Smith, Zurn, Mifab, Watts, Josam, and Wade) and did some very basic math.  For example, if a unit was certified at 20 gpm with a 40 lb grease capacity using lard, I converted the amount of lard in the full interceptor into gallons and divided that by the amount of water the unit could hold.

It didn't matter which manufacturer's unit I checked; the results were very similar and all within a narrow range.  The maximum capacity for storing grease before failure in each unit I checked was between 25% and 35%.

This is further supported by the Plumbing and Drainage Institute's 1998 (R-2010) paper, Guide to Grease Interceptors - Eliminating the Mystery, which states that PDI-G101 certified interceptors may need maintenance when as little as 25% of their rated capacity has been reached.

Fast forward to my more recent research, in which I have been emailing jurisdictions, googling, and searching every available forum for any clue to the origins of the 25% rule.

Honolulu appears to be one of the earliest users of the rule, if not the first.  A post on the Yahoo Pretreatment Coordinators forum said the jurisdiction was doing the same calculations as far back as 1995, while developing its FOG program.  The jurisdiction reportedly chose 25% to be conservative, and it became the rule for grease interceptor maintenance enforcement in the new FOG program.

Many of the jurisdictions in Orange County, California use the 25% rule based on a recommendation in the Orange County FOG Control Study.  That recommendation was not based on any science, but on a survey of FOG control programs around the US, many of which were already using the 25% rule or a similar standard, such as a maximum depth (in inches) of accumulated FOG or solids.

The 25% rule appears to be ubiquitous in FOG programs and ordinances not because of any scientific or technical merit, but because 'everyone else is doing it,' so it must be right.

The problem now is that the rule is well entrenched in these FOG programs and ordinances, making it difficult for jurisdictions to be flexible with newer technologies that hold more grease in footprints comparable to traditional designs, in some cases matching the storage capacities of much larger gravity-style interceptors.

Schier Products' Great Basin, Thermaco's Trapzilla, and other products looming on the horizon are capable of storing grease and solids to well over 50% of their liquid volume, but jurisdictions are challenged to find a way to let an owner actually benefit from these higher capacities, owing to limitations they have set on themselves through enacted policy.

Hopefully, by understanding how we got where we are, jurisdictions can gain insight into how to avoid the pitfalls of setting universal capacity limits, or perhaps correct problems already created by enacted policies that inadvertently punish owners who choose newer, more efficient, higher-capacity technologies.

If anyone out there has more information on the history of the 25% rule, please message me and I'll update this post.


  1. Maybe it would make the most sense to institute a policy that FSEs are subject to the 25% rule unless they provide a manufacturer's certification that the installed interceptor has a higher operational capacity. The 25% rule is so ubiquitous that many manufacturers probably design to that standard, so it makes sense to keep it as the default.

    The trouble with higher-capacity interceptors is that maybe they work fine past 25%, maybe they don't. If they don't, who pays for the corrective actions when FOG starts clogging pipes after the interceptor exceeds 25%? The municipality would likely only have authority to charge the user, but it would really be the manufacturer at fault. Wanting to avoid that kind of headache, I can't blame municipalities for standing by the 25% rule.

  2. Let's look at another factor. First, I have no idea how engineers test their designs, but someone has to control for time in the testing. We know they have innovative ideas that have helped the end user, and moving away from steel and concrete is a positive for end users. But time is my point. When a unit reaches the 25% capacity limit, how long has the blanket been, say, a foot thick? Time is an important factor. If the 25% blanket is one month old, it's very viscous but fine. If the grease is five months old it becomes hardened; I've seen a guy walk on it. Consider the nasty environment: high sulfides, low pH, stagnant areas, all species of bacteria. As this hardening blanket of grease matures, it requires maceration before evacuation, which can often plug the laterals with clumps. Cleaning companies may damage the structure just breaking up the grease bergs. You have flow, concentration, capacity; don't forget time. And that is more of a user variable than the design. gh, Ventura CA