AI at the Edge: Why Power Consumption Becomes Mission Critical

Edge devices, from smartphones to IoT sensors, wearables and remote embedded systems, are increasingly expected to run AI inference locally. The benefits are many: reduced latency, lower bandwidth demands, improved privacy, and resilience without continuous cloud access. But these benefits come with a severe constraint: power. Edge devices are often battery-powered or otherwise tightly constrained in their power budget, and inefficiencies that might be tolerable in data centers become untenable at the edge. Atomic-level defect mitigation is a key enabler of lower leakage currents and lower power consumption.

Leakage currents, defects & why they matter

In modern semiconductor devices, power dissipation has two large components:

  • Dynamic (switching) power, consumed each time transistors charge and discharge their load capacitances during computation.

  • Static (leakage) power, drawn continuously, even when the circuit is idle, through sub-threshold conduction, gate-oxide tunneling and junction leakage.

As transistors scale down (5 nm, 3 nm and beyond), physical dimensions shrink, voltage margins compress, gate oxides become thinner and electric fields grow stronger. All of this exacerbates leakage. And beyond the usual scaling-driven leakage, atomic-level defects, contamination and surface/interface flaws introduce localized states, trap levels, unwanted charge paths and enhanced leakage at device edges and perimeters. These degrade switching thresholds, reduce carrier mobility and weaken control over off-state leakage, all of which increases static power.
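To put the two power components on a common footing, here is a simplified first-order CMOS power model; the relations are standard textbook ones, and every numeric constant (activity factor, switched capacitance, supply voltage, clock frequency, leakage current) is an assumption chosen purely for illustration, not data for any real chip.

```python
# First-order CMOS power model. The equations are textbook relations;
# the constants below are illustrative assumptions, not measured values.

ALPHA = 0.1        # assumed average switching-activity factor
C_EFF = 1e-9       # assumed total effective switched capacitance (F)
VDD = 0.8          # assumed supply voltage (V)
FREQ = 200e6       # assumed clock frequency (Hz)
I_LEAK = 2e-3      # assumed total off-state leakage current (A)

# Dynamic power: capacitance charged/discharged on every active cycle.
p_dynamic = ALPHA * C_EFF * VDD**2 * FREQ

# Static power: leakage drawn whenever the chip is powered, even if idle.
p_static = I_LEAK * VDD

print(f"Dynamic power: {p_dynamic * 1e3:.1f} mW")
print(f"Static power:  {p_static * 1e3:.1f} mW")
```

Dynamic power shrinks with lower activity, voltage and frequency, but the static term remains for as long as the chip is powered, which is why defect-driven increases in leakage current are so costly in standby-heavy edge workloads.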

Recent studies have explicitly shown the impact of such defects. For example, Semiconductor Engineering recently commented that atomic-level defects and contamination are “hidden but powerful driver(s)” of power consumption in data centers, via mechanisms like increased leakage currents, reduced switching efficiency, and degraded carrier mobility. Likewise, in GaN substrates, work on threading dislocations has shown that screw and mixed dislocations act as leakage paths under reverse bias.

In edge devices, where duty cycles may include long standby periods, static leakage becomes a larger fraction of total power. Even a few picoamps or nanoamps of extra leakage per transistor, multiplied across millions or billions of transistors, add up to microamps or milliamps of standby current and can drain batteries far faster than expected.
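As a hypothetical back-of-envelope illustration of that effect (battery capacity, transistor count and per-transistor leakage are all assumed values, not measurements of any device):

```python
# Back-of-envelope standby-drain comparison. Every number is an
# illustrative assumption, not a measurement of any specific device.

BATTERY_MAH = 220.0      # roughly a CR2032-class coin cell
TRANSISTORS = 50e6       # assumed always-on logic plus retention SRAM

def standby_hours(leakage_pa_per_transistor: float) -> float:
    """Hours until leakage alone empties the battery (no useful work done)."""
    total_ma = leakage_pa_per_transistor * 1e-12 * TRANSISTORS * 1e3
    return BATTERY_MAH / total_ma

for label, pa in [("clean process, assumed 10 pA/transistor", 10.0),
                  ("defect-elevated, assumed 100 pA/transistor", 100.0)]:
    hours = standby_hours(pa)
    print(f"{label}: battery exhausted in ~{hours:.0f} h (~{hours / 24:.1f} days)")
```

Under these assumptions, a tenfold increase in per-transistor leakage turns roughly 18 days of standby into less than two.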

AI inference & power efficiency

AI workloads, even inference-only ones, generally involve higher switching activity, more complex hardware (accelerators, specialized memory, vector units, etc.) and long always-on idle periods between bursts of computation. Several strategies help:

  • Quantization, pruning, model compression and hardware accelerators that lower switching energy per inference.

  • Better sleep / duty cycling to minimize active time. TinyVers, for example, shows very low power consumption in continuous operation and even lower in deep-sleep or wake-up modes.

  • Using more efficient microcontrollers or AI-optimized SoCs: in comparative studies, newer cores with better vector instruction sets and optimized cache architectures achieve significantly lower energy per inference than older ones.

But none of these approaches can fully compensate if the baseline static/leakage current is high due to defects or contamination; the sketch below illustrates why.
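Here is a minimal duty-cycling sketch (inference power, timing and sleep-power figures are all assumed for illustration): once the sleep-state floor is set by leakage, shortening active time stops paying off.

```python
# Average power of a duty-cycled edge-AI node. All values are assumed
# for illustration; they are not benchmarks of any real device.

P_ACTIVE_MW = 50.0   # assumed power while running one inference
T_ACTIVE_MS = 20.0   # assumed time per inference
PERIOD_S = 10.0      # assumed: one inference every 10 seconds

def avg_power_mw(p_sleep_mw: float) -> float:
    """Duty-cycled average power for a given sleep-state (leakage) floor."""
    active_s = T_ACTIVE_MS / 1e3
    sleep_s = PERIOD_S - active_s
    return (P_ACTIVE_MW * active_s + p_sleep_mw * sleep_s) / PERIOD_S

# Low vs. high sleep-state floor (e.g. defect-elevated standby leakage).
for label, p_sleep_mw in [("low-leakage sleep, assumed 5 µW", 0.005),
                          ("leaky sleep, assumed 500 µW", 0.5)]:
    print(f"{label}: average {avg_power_mw(p_sleep_mw) * 1e3:.0f} µW")
```

Even though inference runs only 0.2% of the time in this sketch, the leaky configuration burns roughly six times the average power of the low-leakage one, almost all of it in standby.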

Why surface cleanliness & atomic-level defect control become key

This is where technologies like those offered by SisuSemi matter. To achieve very low leakage and reliable behavior in ultra-low-power standby or always-on modes, a chip needs:

  • Clean surfaces/interfaces, with minimal contamination (metals, residues) that can act as trap/recombination centers or leakage paths (e.g. metal contamination in pMOS devices correlates with higher off-state leakage).

  • Minimal atomic-scale defects: dislocations, vacancies, interface states, surface roughness. Studies indicate, for example, that screw dislocations in GaN or surface damage at device perimeters significantly increase reverse leakage currents.

  • Process optimizations: annealing, oxidation and oxide-interface control, ultra-clean processing environments, gettering and careful edge/perimeter definition, all aimed at reducing trap densities, passivating defect states and avoiding unintended leakage (a first-order sense of how leakage scales with trap density is sketched after this list).
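As a rough illustration of that last point, the textbook Shockley–Read–Hall picture makes junction generation leakage scale linearly with trap density. The sketch below uses standard silicon constants; the depletion width, capture cross-section and trap densities are assumed values chosen only to show the trend.

```python
# First-order Shockley-Read-Hall estimate of junction generation leakage,
# highlighting its linear dependence on trap (defect) density.
# Textbook relations; parameter values are illustrative assumptions.

Q = 1.602e-19     # elementary charge (C)
N_I = 1.0e10      # intrinsic carrier density of Si at 300 K (cm^-3)
W_CM = 1.0e-5     # assumed depletion width, 0.1 um (cm)
SIGMA = 1.0e-15   # assumed capture cross-section (cm^2)
V_TH = 1.0e7      # thermal velocity of carriers (cm/s)

def generation_leakage(n_t_per_cm3: float) -> float:
    """Generation current density (A/cm^2) for a given trap density."""
    tau_g = 1.0 / (SIGMA * V_TH * n_t_per_cm3)   # generation lifetime (s)
    return Q * N_I * W_CM / (2.0 * tau_g)

for n_t in (1e11, 1e12, 1e13):   # assumed trap densities (cm^-3)
    print(f"N_t = {n_t:.0e} cm^-3 -> J_gen ~ {generation_leakage(n_t):.1e} A/cm^2")
```

The absolute numbers are only indicative, but the trend is the point: reducing defect and trap densities by an order of magnitude reduces this leakage component by the same factor.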

SisuSemi’s UHV surface cleaning, atomic-level defect mitigation and low-contamination process steps can help reduce leakage currents, enabling longer battery life, lower standby power and higher reliability, all essential for edge AI devices.

Implications for design, manufacturing & ROI

  • Design side: Edge AI chips will need to budget not just for dynamic/inference power but also for static leakage power. Designs that ignore leakage risk falling well short of their battery-life targets.

  • Manufacturing / process control: fabs will need tighter control of surface contamination and defect levels, especially at interfaces and perimeters; cleaning steps (before critical depositions, before overlay, etc.) become even more critical.

  • Metrology and QA: detecting and quantifying atomic-scale defects, surface residues, interface trap densities, etc., becomes necessary to ensure yield and low leakage.

  • ROI: While more stringent cleaning, UHV methods and defect mitigation may increase cost, they may more than pay off via longer battery life, fewer warranty returns, better device reliability and competitive differentiation in power-sensitive markets.


Conclusion

As AI pushes outward from the cloud into edge devices and battery-powered IoT nodes, every microwatt counts. Dynamic power optimizations are well known, but static or leakage power becomes disproportionately important at low activity and in ultra-constrained power settings. Atomic-level defects and contamination are not minor nuisances: they increase leakage currents and undermine threshold stability and device efficiency. Technologies that can clean, passivate and control surfaces and defects at the atomic scale, such as those SisuSemi offers, will be increasingly critical components of the semiconductor roadmap for edge AI.

Contact us to learn more