How AI complements 4 common waste sensors

Alisa Pritchard

Nov 26, 2024

6 min read

As waste flows swell and the waste labour shortage persists, materials recovery facilities (MRFs) are running out of staff to keep track of material. Hiring more samplers is expensive, so many have turned to sensors across the electromagnetic spectrum to gain visibility into their waste streams.

Sensors like RGB cameras, X-ray units, and near-infrared (NIR) sensors make it possible to gather data on waste objects at scale, right down to their food grade or polymer type. They offer a more scalable alternative to manual sorting and sampling, with more in-depth recognition.

Now, operators face a new challenge. Sensors help them turn masses of waste objects into masses of data — but they don’t replicate a human's ability to interpret that information. Visibility is still a hurdle if they can’t turn the huge amount of sensor data into action.

AI waste analytics addresses that challenge. If sensors are the modern MRF’s eyes, AI is the brain that consolidates visual input and turns it into actionable insight.

How MRF sensors gather visual data 

Waste tracking sensors gather information at different points on the electromagnetic spectrum, making each suited to a specific recognition task.

Near-infrared (NIR) sensors

NIR sensors use a near-infrared light beam to identify materials, reading the reflected spectrum to distinguish specific polymers without chemical intervention.

That makes it a vital tool for plastics recovery facilities (PRFs) that need to produce pure, single-polymer bales for reprocessing. Operators can install an NIR sensor on a PET line, for example, to ensure they’re removing HDPE and PP before baling.
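To make that concrete, here’s a minimal sketch of how an NIR classification step might work: a measured reflectance spectrum is matched against reference polymer signatures by similarity. The wavelength samples, signature values, and function names are all invented for illustration, not taken from any real sensor.

```python
import numpy as np

# Hypothetical reference reflectance signatures, sampled at the same
# near-infrared wavelengths the sensor measures (values are illustrative).
REFERENCE_SPECTRA = {
    "PET":  np.array([0.82, 0.61, 0.35, 0.48, 0.70]),
    "HDPE": np.array([0.75, 0.80, 0.66, 0.30, 0.55]),
    "PP":   np.array([0.70, 0.77, 0.72, 0.41, 0.33]),
}

def classify_polymer(measured: np.ndarray) -> str:
    """Return the reference polymer whose signature best matches the
    measured spectrum, using cosine similarity as the match score."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(REFERENCE_SPECTRA, key=lambda p: cosine(measured, REFERENCE_SPECTRA[p]))

# A reading close to the PET signature is classified as PET, so a
# PET-line controller could flag anything else for ejection.
reading = np.array([0.80, 0.63, 0.37, 0.46, 0.69])
print(classify_polymer(reading))  # -> "PET"
```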

⚠️ NIR’s limitations

The biggest barrier for NIR is its cost. Individual infrared units can cost tens of thousands of pounds, making it impractical to install them across an entire facility. Often, they’re better suited to specific use cases on individual belts.

Besides cost limitations, materials like black plastics often escape NIR recognition by absorbing infrared light. Multi-material packaging also poses a challenge: if a PET bottle has a PP sleeve, the NIR sensor will register the object as PP without accounting for the material underneath.

NIR isn’t able to identify the function of waste objects, either — it can’t register whether a PET bottle is food-grade or not, which limits the data’s value for some recyclers.

Digital watermarking

Digital watermarking and chemical tracing combine packaging design with visual recognition to track products in the waste stream.

Systems like HolyGrail 2.0 embed QR codes that are invisible to the human eye in labels; scanners then track those codes during the waste management process. Chemical watermarking alters the makeup of the polymers used for packaging, which spectroscopic sensors can identify even if products are damaged.

Other digital watermarking systems allow designers to add QR codes to product labels that are only visible under UV light. It’s an effective method for tracking specific stock-keeping units (SKUs) as they pass through recovery facilities, giving brands a better understanding of their products’ fate.
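As a rough illustration of the data flow, here’s a hypothetical sketch of what happens once a watermark is decoded: the payload is looked up in a registry and turned into a tracking event a brand could query. The registry contents, IDs, and field names are invented, not drawn from HolyGrail 2.0’s actual specification.

```python
# Hypothetical registry mapping decoded watermark payloads to SKU records.
SKU_REGISTRY = {
    "WM-00417": {"brand": "ExampleCo", "sku": "shampoo-500ml", "polymer": "HDPE"},
    "WM-00982": {"brand": "ExampleCo", "sku": "water-1l", "polymer": "PET"},
}

def log_detection(decoded_payload: str, facility: str, line: str) -> dict:
    """Turn one watermark detection into a tracking event for the brand."""
    record = SKU_REGISTRY.get(decoded_payload)
    if record is None:
        return {"event": "unknown_watermark", "payload": decoded_payload}
    return {"event": "sku_detected", "facility": facility, "line": line, **record}

print(log_detection("WM-00417", facility="MRF-London", line="mixed-plastics"))
```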

⚠️ Digital watermarking’s limitations

Watermarks require packaging design changes that can be expensive, with long lead times. That's especially true in the case of chemical tracing, which requires changes at the polymer level. The sensors required to detect watermarks are equally costly, reaching well into the hundreds of thousands of pounds at scale.

Any watermarking scheme on a global scale is likely to need complex international regulatory alignment and global standards. Until regulation arrives, the technology faces a chicken-and-egg problem: MRFs won’t purchase the cameras needed to recognise the codes until enough brands use compatible labels, and brands won’t invest in packaging changes until enough facilities can recognise watermarks.

That makes it difficult to scale to packaging as a whole, which means watermarking works better as a targeted, country-specific solution than as a comprehensive way of tracking all post-consumption resources.

X-ray machinery

Like NIR, X-rays operate at an invisible point on the electromagnetic spectrum.

X-rays are useful in scenarios where density separates one valuable material from another — in the case of metals recycling, for example. In some cases, they are also able to identify batteries, which are often hidden within waste electronics.
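Here’s a minimal sketch of that density-based routing, assuming the X-ray unit reports a per-object density estimate. The thresholds, interface, and destination names are illustrative assumptions, not any vendor’s specification.

```python
# Illustrative density bands (g/cm3) for routing decisions.
ALUMINIUM_MIN, ALUMINIUM_MAX = 2.5, 2.9   # typical band for aluminium
FERROUS_MIN = 7.0                          # steel and other ferrous metals

def route_object(density: float) -> str:
    """Pick an eject destination from a single density estimate."""
    if density >= FERROUS_MIN:
        return "ferrous"
    if ALUMINIUM_MIN <= density <= ALUMINIUM_MAX:
        return "aluminium"
    return "residue"

for d in (2.7, 7.8, 0.9):
    print(d, "->", route_object(d))  # aluminium, ferrous, residue
```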

⚠️ X-ray’s limitations

Like NIR, X-ray can’t take an object’s function into account, and struggles to identify low-density materials like paper and cardboard.

It also shares the cost limitations of NIR — X-ray machinery is often too expensive to deploy across an entire facility.

The presence of radiation also introduces health and safety requirements that other sensors do not.

RGB cameras

RGB cameras are an order of magnitude less costly than NIR cameras of the same quality, and provide a visual overview of the materials passing over a belt.

AI recognition systems trained on millions of historical images can process those pictures, identifying the waste objects on the belt in the same way a human would. RGB data can account for the broadest range of material, and helps provide context on the function and food grade of objects. That context even makes it possible to estimate a material’s polymer type.
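As an illustration of that recognition step, the sketch below runs a general-purpose, COCO-pretrained detector from torchvision over a single belt frame. A production system would use a model trained on waste imagery instead; the file name and confidence threshold here are placeholders.

```python
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights,
)

# Load a general-purpose detector as a stand-in for a waste-trained model.
weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
preprocess = weights.transforms()

image = read_image("belt_frame.jpg")          # one RGB frame from the belt camera
with torch.no_grad():
    detections = model([preprocess(image)])[0]

labels = weights.meta["categories"]
for label, score, box in zip(detections["labels"], detections["scores"], detections["boxes"]):
    if score > 0.6:                            # keep confident detections only
        print(labels[label], f"{score:.2f}", box.tolist())
```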

⚠️ RGB’s limitations

Everything an RGB camera can see is identifiable, provided the camera is connected to an AI waste analytics system. Without one, RGB cameras just generate images of waste that still need to be manually interpreted.

How AI waste analytics turns data into action

AI waste analytics systems are not sensors. Instead, they use machine learning to consolidate and interpret the information gathered by sensors.

Greyparrot Analyzer units, for example, are fitted with RGB cameras that capture real-time images of waste. Before our AI processes them, they look like this:

Raw visual waste data from an RGB camera

Raw sensor data can guide smart machinery, but it doesn’t provide actionable insight on material composition or historical trends in sorting performance, and it’s certainly not easy for operators to interpret in real time. AI takes that task off our customers’ hands.

Once our AI has used its recognition library of over 89 waste types to identify each object, the image looks like this:

RGB waste data processed with AI waste analytics

Analyzer’s AI uses visual data to interpret much more information than the object’s material, though. It gathers seven layers of detail on the waste objects passing through recovery facilities, modelled as a simple record in the sketch after this list:

  • Material 
  • Mass 
  • Size
  • Financial value
  • Food grade and object function (e.g. 'Shampoo bottle')
  • Brand and stock-keeping unit
  • Potential emissions
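To show how those layers might hang together, here’s a hypothetical record type for a single detected object. The field names, types, and example values are illustrative, not Greyparrot’s actual schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WasteObjectObservation:
    """One detected object, with the seven layers of detail listed above.
    Fields and units are assumptions for illustration only."""
    material: str                 # e.g. "HDPE"
    mass_g: float                 # estimated mass in grams
    size_cm2: float               # visible surface area on the belt
    value_gbp: float              # estimated financial value
    food_grade: Optional[bool]    # None when it can't be determined
    function: str                 # e.g. "Shampoo bottle"
    brand_sku: Optional[str]      # brand / stock-keeping unit, if recognised
    emissions_kg_co2e: float      # potential emissions

obs = WasteObjectObservation(
    material="HDPE", mass_g=42.0, size_cm2=120.0, value_gbp=0.03,
    food_grade=False, function="Shampoo bottle", brand_sku=None,
    emissions_kg_co2e=0.11,
)
print(obs)
```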

Our customers access those insights on intuitive dashboards, empowering them to take specific action. When the process is complete, systems like Greyparrot Analyzer have transformed raw visual data into insights that operators can interpret and act on:  

Greyparrot Facility Dashboard at a PRF

Why we consolidate sensor data with AI

Alone, tools like NIR, X-ray and digital watermarking gather valuable data, but they’re specialised to solve specific visibility challenges.

AI waste analytics systems like Analyzer aren’t limited by material type, and they aren’t even limited to RGB. We’re piloting a project to incorporate NIR sensors into our hardware, which will add detailed polymer differentiation to our seven layers of visual recognition.

By interpreting data from multiple sensors via Greyparrot Sync integrations and hardware updates, Analyzer is helping us create a more complete picture of each waste object. Today, that coordinated visibility is empowering waste managers to run more profitable plants. In the near future, we’ll use it to guide fully-automated facilities.
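As a simplified illustration of that kind of fusion, the sketch below attaches an NIR polymer reading to an RGB detection when the two observations land close together in shared belt coordinates. The data structures, field names, and tolerance are assumptions, not Greyparrot’s implementation.

```python
def fuse(rgb_detection: dict, nir_reading: dict, max_offset_cm: float = 5.0) -> dict:
    """Attach an NIR polymer label to an RGB detection when the two
    observations land close enough together on the belt."""
    dx = abs(rgb_detection["x_cm"] - nir_reading["x_cm"])
    dy = abs(rgb_detection["y_cm"] - nir_reading["y_cm"])
    fused = dict(rgb_detection)
    if dx <= max_offset_cm and dy <= max_offset_cm:
        fused["polymer"] = nir_reading["polymer"]   # NIR refines the material
    return fused

rgb = {"x_cm": 41.0, "y_cm": 12.5, "function": "Bottle", "food_grade": True}
nir = {"x_cm": 42.2, "y_cm": 13.0, "polymer": "PET"}
print(fuse(rgb, nir))  # the RGB record gains a polymer field
```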

Learn how we connect Greyparrot Analyzer insights to third-party machinery and software here.