Lattice’s virtual wall: the border becomes invisible

Lattice Anduril

Deployed on the US-Mexico border, Lattice combines sensors and AI. Promises of efficiency, fears of permanent invisible surveillance.

In summary

Anduril’s Lattice is often described as a virtual wall: automated surveillance that replaces constant human presence with sensors and artificial intelligence. The system aggregates feeds from towers, thermal cameras, radars, and, where the local architecture includes them, ground sensors to detect, track, and classify objects of interest. Supporters see it as a pragmatic tool for reducing false alarms, saving officers’ time, and covering a 3,145 km (1,954 mile) border. Critics, NGOs, and civil liberties researchers speak of persistent surveillance and dehumanization: individuals are tracked well beyond the border, sometimes inland, without immediate human interaction. The real debate is not between “technology” and “humanity.” It is between operational efficiency on one side and transparency, democratic control, and proportionality on the other.

The virtual wall that is transforming border surveillance

Lattice is not just another camera. It is orchestration software that centralizes dispersed sensors and produces a single operational picture. The idea is simple: instead of an agent watching screens, the system sorts, alerts, and prioritizes.

This logic has been taken to extremes on the U.S.-Mexico border. Autonomous towers, sensors, and communication networks form a digital layer over the terrain. This is exactly what critics call invisible surveillance: it does not physically block, but it observes and reports.

The promise is attractive to any administration. A 3,145 km (1,954 mile) land border cannot be “held” with patrols everywhere, all the time. It is held by prioritization. And a system like Lattice sells precisely that: reducing noise, isolating the signal.

The technical mechanics that make the difference

Sensor fusion at the heart of the system

The key point is sensor fusion. A tower can carry a radar, an electro-optical camera, and a thermal imager. The radar sees movement. The thermal imager sees heat signatures. The video provides context. Taken separately, each sensor has its limitations. Together, they reduce ambiguity.

Lattice serves as an integration layer. It receives the streams, aligns them, and correlates them. Movement detected by radar is associated with a thermal signature. A trail is generated and then tracked. The system can also aggregate complementary sensors when they exist in the local architecture, including ground sensors used in certain security devices (seismic or acoustic detection). This is where the line between “filming” and “tracking” begins to blur.

Automatic classification and the risk of error

Anduril and certain reference documents explain that the system is primarily aimed at the automatic classification of objects, not the identification of individuals. In concrete terms, the AI seeks to identify whether something is an animal, a human, or a vehicle. This sorting changes the operational economy. An officer is not called out for a coyote. They are alerted for a pickup truck.
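The triage logic can be sketched in a few lines: only classes of operational interest, above a confidence threshold, generate an alert. The labels, confidence scores, and threshold here are hypothetical placeholders, not documented system parameters.

```python
# Hypothetical triage rule: alert only on relevant, high-confidence classes.
ALERT_CLASSES = {"person", "vehicle"}

def should_alert(label: str, confidence: float,
                 threshold: float = 0.8) -> bool:
    """Alert only for high-confidence detections of classes of interest."""
    return label in ALERT_CLASSES and confidence >= threshold

tracks = [("coyote", 0.95), ("person", 0.91), ("vehicle", 0.62)]
alerts = [t for t in tracks if should_alert(*t)]
print(alerts)  # [('person', 0.91)]: the coyote is filtered out, and the
               # low-confidence vehicle waits for corroborating evidence
```

Where exactly that threshold sits is precisely the political question raised below: it encodes the system’s tolerance for false positives versus false negatives.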

But we must be clear: classifying is not the same as understanding. Under certain conditions, AI can confuse a group of people with an animal. Fog, heat, terrain, vegetation, backlighting. The consequences are asymmetrical. A false alarm costs time. Failure to detect something costs an operational opportunity. And misclassification can lead to more tense interactions with civilians. The public debate also plays out here: how much tolerance is there for false positives and false negatives, and who sets it?

Data retention and the issue of traceability

Another sensitive issue is data. A Q&A document related to Lattice mentions that images and content are retained for “no more than 30 days” as part of DHS policy, and emphasizes that the system is designed not to collect personal information. This nuance is important, but it is not enough to calm critics. A 30-day trace is already enough to reconstruct trajectories. And multi-sensor fusion facilitates reconstruction.

What NGOs fear is not just the eye. It is the memory. A surveillance zone becomes more powerful when it stores and cross-references data.

Massive deployment fuels controversy

The phrase “massively deployed” is not just a slogan. Publicly available figures show a ramp-up in deployment. Anduril announced that it had deployed its 300th autonomous tower in 2024 and claimed that these towers represented approximately 30% coverage of the southern land border. For their part, digital freedom organizations have mapped a larger set of towers from multiple providers, counting 581 towers in their dataset as of December 2, 2025.

It is important to understand the network effect. A single tower sees one sector. A mesh network allows for relaying. An object “leaves” one field and “enters” another. The border becomes a succession of detection bubbles. This is where the criticism of “invisible extension” comes from: even if a tower only “sees” 8 km (5 miles) as the crow flies, the relay between towers can track much further.

The budget is also an indicator. A local investigative article detailed contracts showing a plan to acquire approximately 277 new towers and upgrade 191 towers, at a cost of $67.8 million over several years, across the entire border. Other public documents describe much higher amounts over several years, depending on the exact scope (hardware, software, maintenance, integration).

This deployment is not neutral. It anchors infrastructure. And infrastructure always calls for new uses.


Criticism from NGOs and the charge of dehumanization

Surveillance replacing immediate judgment

NGOs talk about dehumanization for one simple reason: humans intervene later. Before, the agent saw, assessed, and decided. Here, the algorithm alerts. The flow is reversed. We no longer monitor to find. We find, then we intervene.

This reversal changes the psychology. It can reinforce a logic of permanent suspicion. It can also trivialize surveillance, because it becomes “automatic,” and therefore invisible in the budgetary routine.

The border that extends without physical barriers

The criticism of “border extension” is often misunderstood. It is not a legal shift of the line. It is a functional extension. A tower located a few hundred meters from the border can observe beyond that line, on both sides. And it can do so continuously.

A little physics helps frame the argument. A 10-meter-high (33 ft) tower has a visibility range limited by the horizon of approximately 11 km (7 miles) on flat terrain, even before considering sensor performance. If it is on a ridge, the range increases. If a network of towers relays the signal, tracking over several dozen kilometers becomes plausible without any single tower “seeing” everything. This is precisely the idea of a virtual wall: control is not a wall. It is a continuity of detection.
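The horizon figure above comes from a standard geometric approximation: for an observer at height h metres, the distance to the horizon is roughly d ≈ 3.57 √h kilometres (flat terrain, ignoring atmospheric refraction). A quick check:

```python
from math import sqrt

def horizon_km(height_m: float) -> float:
    """Geometric horizon distance for an observer height in metres:
    d ~ sqrt(2 * R * h) with Earth radius R ~ 6371 km, i.e. ~3.57 * sqrt(h) km."""
    return 3.57 * sqrt(height_m)

print(round(horizon_km(10), 1))   # ~11.3 km for a 10 m tower, matching the text
print(round(horizon_km(100), 1))  # a tower on a 100 m ridge sees roughly 3x further
```

The square root is the key intuition: tripling sensor range requires far more than tripling tower height, which is why meshing many modest towers beats building a few tall ones.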

The risk of abuse and the question of democratic control

The most difficult and political issue is that of democratic control. Algorithms are black boxes. Contracts are often technical. Complete maps are not always easy to access. Audits are rare. And the temptation to “plug in one more source” is constant: drones, tethered balloons, satellites, license plates, etc.

This is not science fiction. Actors such as the Electronic Frontier Foundation are already documenting sets of sensors and towers, beyond Anduril’s equipment alone. The border is becoming a system. And a system tends to outlive the debates that gave rise to it.

The question that remains after the controversy

One can defend the idea of a better-monitored border. One can also reject a society of continuous surveillance. Both positions exist. But the real question, the one that deserves a serious debate, is more specific: what concrete limits do we impose on a virtual wall?

If the goal is to detect a suspicious vehicle in the immediate vicinity of the border, the perimeter is defensible. If the objective shifts towards permanent surveillance of entire areas inland, without transparency, the nature of the project changes. And this is where Lattice, with its powerful integration capabilities, becomes a symbol. Not just of innovation. A symbol of the border that disappears from view, but not from sensors.

Sources

Anduril, “Anduril Deploys 300th Autonomous Surveillance Tower (AST),” September 25, 2024.
CBP, “Autonomous Surveillance Towers Declared a Program of Record,” July 2, 2020.
CBP, “Smart Wall Frequently Asked Questions,” January 20, 2025.
U.S. DOT BTS, “U.S.–Mexico Border: 1,954 miles / 3,145 km,” July 5, 2016.
Electronic Frontier Foundation, “CBP Is Expanding Its Surveillance Tower Program…,” updated December 2, 2025 (581 towers).
CalMatters, “Border patrol extends surveillance towers in California,” January 30, 2024.
Privacy International, “Dual-use tech: the Anduril example,” November 10, 2025.
Privacy International, document “Lattice Systems” (object classification, 30-day retention), March 2019.

War Wings Daily is an independent magazine.