For decades, the lineage that began with Barr & Stroud and continues today under Thales has built the optical eyes of Britain’s submarines and armoured vehicles. Now it is teaching those systems to think.
The initiative, known as Digital Crew, began as an experiment in sensor fusion and artificial intelligence. It has since become a core part of Thales’ vision for a fully connected “digital battlefield.”
The goal is not to replace human operators, but to create a virtual partner that helps them perceive and decide more quickly than ever before.
When I visited the company’s Optronics and Missile Electronics site, I was given a demonstration of exactly how this works. On one screen, a live feed streamed from a vehicle-mounted sensor suite; on another, a composite image formed from infrared, visible, and acoustic data. Within seconds, the system detected movement in the distance, zoomed in automatically, and tagged a shape.
During the visit, staff at the site described Digital Crew as a system that blends artificial intelligence with facial recognition principles, designed to attach to a range of platforms and analyse sensor data in real time.
That represents a clear evolution for Thales, which built its reputation on periscopes and sensors but is now teaching those systems to interpret what they see. Digital Crew combines visual, radar, acoustic and environmental data to create a single, simplified picture for the operator.
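To make the idea concrete, here is a rough sketch of what a fused, prioritised picture might look like in code. It is a toy example: the field names, sensor labels and the confidence-then-range ordering are my own assumptions, not Thales’ design.

```python
# Toy sketch of a fused operator picture: detections from several
# sensor types merged into one prioritised list. Field names, sensor
# labels and the ordering rule are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "visual", "radar", "acoustic", "environmental"
    bearing: float     # degrees from platform heading
    range_m: float     # estimated distance in metres
    confidence: float  # 0.0 to 1.0

def fuse(detections: list[Detection]) -> list[Detection]:
    """One simplified picture: most confident, closest contacts first,
    regardless of which sensor produced them."""
    return sorted(detections, key=lambda d: (-d.confidence, d.range_m))

picture = fuse([
    Detection("radar", 42.0, 1800.0, 0.91),
    Detection("acoustic", 40.5, 1750.0, 0.64),
    Detection("visual", 43.1, 1820.0, 0.88),
])
for contact in picture:
    print(f"{contact.sensor:>10}: bearing {contact.bearing:5.1f}, "
          f"range {contact.range_m:6.0f} m, confidence {contact.confidence:.2f}")
```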
Staff said the goal is to give operators more time to think and act. By flagging what has changed in a scene, the system is intended to help crews anticipate threats rather than simply react to them.
Digital Crew uses convolutional neural networks to learn and recognise objects much as a person might. The software begins with a reference library of known images and expands its knowledge by analysing every frame it observes, learning to identify objects from new angles or through obscured views such as smoke or dust.
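As a rough illustration of that approach, the sketch below classifies a single frame with a pre-trained convolutional network and defers to the operator when confidence is low. It is a minimal stand-in, assuming PyTorch and ImageNet weights in place of Thales’ reference library and models:

```python
# Minimal sketch of CNN-based frame classification. NOT Digital Crew's
# code: the model, labels and threshold are illustrative assumptions.
import torch
from torchvision import models
from PIL import Image

# A network pre-trained on a reference library of known images
# (ImageNet weights stand in for that library here).
weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights)
model.eval()

preprocess = weights.transforms()  # resize, crop and normalise frames

def classify_frame(frame: Image.Image, threshold: float = 0.5):
    """Return (label, confidence), or None when the network is unsure,
    as it might be for an object seen through smoke or from a new
    angle, so that the frame can be flagged for operator review."""
    batch = preprocess(frame).unsqueeze(0)  # add a batch dimension
    with torch.no_grad():
        probs = model(batch).softmax(dim=1)
    confidence, idx = probs.max(dim=1)
    if confidence.item() < threshold:
        return None
    return weights.meta["categories"][idx.item()], confidence.item()
```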
Developed at Thales’ Glasgow site with support from the Defence and Security Accelerator and local universities, the system has already been trialled with the Royal Navy and Army, and is being studied through research contracts in the UK, Canada and Australia. Its strength lies in its ability to monitor multiple feeds without fatigue, providing persistent, pattern-based awareness that enhances human judgement rather than replaces it.
Several of the stabilisation algorithms once developed to keep a submarine’s mast steady are now being adapted for other roles, from vehicle-mounted cameras to systems that merge multiple optical feeds into a single image. The technology has already been trialled in a wide range of settings, including military exercises on Salisbury Plain, crowd management scenarios and border surveillance trials. It has attracted international attention, with Canada among those evaluating its ability to track and classify multiple targets in different conditions.
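Merging feeds at the pixel level can be sketched very simply. The toy example below blends a visible and a thermal frame with OpenCV; the file names and the 60/40 weighting are assumptions, and real alignment between sensors is a much harder problem than a simple resize:

```python
# Toy sketch of merging two optical feeds into one composite image.
# File names and blend weights are assumptions for illustration only.
import cv2

visible = cv2.imread("visible.png")   # hypothetical captured frames
thermal = cv2.imread("thermal.png")

# Crudely align the frames by size, then blend them 60/40.
thermal = cv2.resize(thermal, (visible.shape[1], visible.shape[0]))
fused = cv2.addWeighted(visible, 0.6, thermal, 0.4, 0)

cv2.imwrite("fused.png", fused)
```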
Engineers at the site described how the same precision once devoted to glass and optics in Glasgow’s Barr & Stroud heritage is now applied to data alignment and image processing. The system has been tested aboard a Royal Navy submarine to support contact tracking during periscope use and on an Army test vehicle to identify thermal signatures at distance. In each case, the software automatically adapts to its surroundings.
As vehicles and sensors become increasingly networked, staff explained that this kind of technology allows smaller numbers of platforms to cover wider areas, multiplying the effectiveness of crews already in the field. The principle running through all of it is to expand human capability rather than displace it, maintaining the operator at the centre of decision-making while allowing machines to shoulder the cognitive burden of perception.
Staff were careful to describe Digital Crew as a tool designed to assist, not command. The intention is to keep humans firmly in control while reducing their cognitive load. Rather than making decisions, the system continuously scans multiple feeds for anomalies and presents information in a clear, intuitive format. In one urban reconnaissance demonstration, it combined data from cameras and microphones to show where noises originated and what each camera could see. When movement was detected, the system highlighted it for the operator, who could then decide whether to act or dismiss it.
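That flag-and-defer pattern is a familiar one in computer vision. Here is a minimal sketch, using OpenCV background subtraction as a crude stand-in for Digital Crew’s detection; the camera index, noise threshold and highlight colour are assumptions:

```python
# Minimal sketch of flagging movement for a human to assess, using
# background subtraction. Camera index and threshold are assumptions.
import cv2

cap = cv2.VideoCapture(0)  # hypothetical live camera feed
back_sub = cv2.createBackgroundSubtractorMOG2(detectShadows=False)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = back_sub.apply(frame)  # pixels that differ from background
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) < 500:  # ignore small sensor noise
            continue
        x, y, w, h = cv2.boundingRect(c)
        # Highlight the anomaly; the operator decides what to do.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
    cv2.imshow("operator view", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```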
Those involved in the project compared its function to that of a co-pilot, providing a constant watch so that the crew can focus on higher-level decisions. The project anticipates a future in which military platforms carry more sensors but not more crew, meaning that commanders will rely on systems capable of filtering and interpreting an overwhelming flow of visual, radar and acoustic data. In trials, Digital Crew has produced its analysis almost instantly, generating a prioritised picture in the time it takes a mast to retract.
Developers at the site described the project as part of a broader shift from the analogue to the digital battlefield. What once meant giving troops better optics now means giving them better situational understanding. Digital Crew is platform-agnostic and can be fitted to vehicles, aircraft or submarines, operating across the armed forces through shared data networks.
The team also acknowledged the ethical debate surrounding military AI. They made it very clear that humans remain responsible for all final decisions, and that every automated detection must be verified by an operator before any action is taken. By fusing sensor inputs and highlighting potential threats more quickly, they argued, the system enhances awareness and reduces the risk of error.
The philosophy underpinning the work is not to replace human intuition but to protect it, helping operators stay focused on what matters most.