- A British startup wants to bring computing even closer by integrating micro-data centers into street lamps
- Self-destructing Nvidia chips would help keep sensitive information private
- Some challenges need to be addressed before this becomes a reality
British startup Conflow Power Group Limited (CPG) has proposed a major shake-up of global data centers by integrating micro-units directly into urban infrastructure such as streetlights, rather than concentrating computing in the hyperscale facilities we know today.
In addition to dispersing computing across cheaper, more manageable micro-sites, the plan also focuses on local solar power generation and battery backup systems to address one of the biggest criticisms data center campuses face: sustainability and environmental impacts.
Under the new proposals, CPG aims to bring AI computing closer to users and devices, which would reduce latency and ease pressure on the national telecommunications infrastructure.
Future data centers could be located right outside your front door
The BBC reports that $2,000 Nvidia AI accelerators could be used in place of high-end flagship GPUs like the H100 and B200, which cost tens of thousands of dollars per unit.
The use of self-destructing chips is also a noteworthy addition to the system, with Nvidia including firmware locking, encryption, and other anti-tampering protections that can effectively disable hardware if it is compromised, moved, or accessed by unauthorized methods or individuals.
Such anti-tampering technologies already exist for export compliance and are typically used when AI accelerators are sold in restricted or edge deployments to ensure maximum security.
Tens of thousands of micro-data centers could distribute computing across cities, each managing localized AI workloads that could span applications such as traffic monitoring, video surveillance, autonomous vehicle coordination, telecommunications, environmental sensing and much more.
While the idea of embedding data centers into streetlights may be new, bringing AI to the edge is a trend that is accelerating as AI workloads become more sensitive to bandwidth and latency. Bringing computing physically closer to where it is applied would also likely reduce costs associated with data transfers.
The energy implications are also compelling: large-scale data centers face constraints from grid connection delays and strained power supplies, with some of the largest campuses consuming as much electricity as small cities. Spreading energy consumption geographically could ease these bottlenecks.
Street lights in particular are attractive due to their existing electrical connections, their dense distribution across towns and cities, and the fact that many are already connected to fiber optic networks.
Many have already been repurposed as 5G small cells, traffic camera mounts, Wi-Fi hotspots and electric vehicle chargers.
Interestingly, positioning micro-data centers on existing road infrastructure networks also has significant geopolitical benefits. Growing sovereign computing concerns in Europe and the United Kingdom are pushing governments to implement programs that favor more local processing.
For these proposals to work at scale, however, CPG would need to address the many challenges of integrating computing into streetlights. Existing infrastructure may require upgrades to protect hardware against weather and vandalism. Thermal management is another concern: large-scale data centers are already criticized for the water they consume for cooling, and compact street-level units would need effective cooling solutions of their own.
Although upgrading existing infrastructure may not be economically viable, building multi-purpose networks in the future could allow a single item, such as a street light, to serve many more functions than we previously imagined.