The Royal Air Force says it is using advanced Virtual Reality (VR) technology to enhance the training of aircrew for its Puma and Chinook helicopters.
This initiative, involving VR headsets and dynamic training environments, has been developed through a collaboration between Defence Equipment & Support (DE&S), 28 (Army Cooperation) Squadron RAF, the Support Helicopter Simulation Delivery Team, and CAE, a global leader in training for the civil aviation, defence and security, and healthcare markets.
“CAE has delivered an Embedded Virtual Instructor Training (EVIT) capability for the RAF and has secured a 3-year contract to support the 360-degree video lessons for both Puma and Chinook training programs. This VR instruction facilitates more efficient airborne training and raises entry standards for new skills. EVIT notably enhances familiarity with cockpit and cabin environments and bolsters confidence in essential pre-flight tasks.
The EVIT system uses enhanced headset technology to deliver efficient ground-based and simulator lessons, consequently increasing trainee knowledge and readiness for flying sorties. Trainees receive a headset loaded with various EVIT sessions, ranging from airborne instruction on dust landing techniques to detailed procedures like hooking up underslung loads and preparing a casualty for a winch rescue from a jungle clearing.”
Flt Lt Brett Jones, based at RAF Benson, praised the technology, stating, “The introduction of the VR headsets on 28 (AC) Sqn has been excellent for both our trainees and instructors. The immersive environment has provided the trainees with a better understanding of what will be taught prior to the simulator/aircraft sorties and this has allowed more time to consolidate their skills.” He also noted the benefits of greater standardisation across the unit.
A trainee commented on the effectiveness of this approach, saying, “The addition of VR equipment to supplement our learning and consolidation of checks in the early stages of the course was invaluable. It allowed nuances, flow, and interaction with the crew to be understood prior to stepping into the aircraft and meant I got more out of my sortie in the aircraft itself.”
I don’t think ‘enhance’ is the word of choice when it comes to VR training. ‘Reduce costs’ might be more appropriate. No pilot wants their real flying time substituted for synthetic hours, which is where this inevitably leads. As somebody with first-hand experience flying expensive ‘state of the art’ simulators, I can say they are nowhere near as beneficial as real hours.
Synthetic works for simulated emergencies and weapons operations; for everything else it’s second tier.
A&D wrote:
“I don’t think ‘enhance’ is the word of choice when it comes to VR training.”
Perfectly summarised by Ripley in the film Aliens, where the dropship is descending to the planet and she comes out with:
“How many drops is this for you lieutenant?”
He replies:
“38, simulated”
Some positive news for our helicopter fleet.
Royal Navy to upgrade Merlin HM2 radar
26 January 2024
“The UK Royal Navy (RN) is to upgrade the radar of its Lockheed Martin AW101 Merlin HM2 maritime helicopters to see them out to their planned 2040 retirement date.
The UK Ministry of Defence (MoD) released a supply chain notice on 25 January, in which it said it is looking for “expressions of interest” from organisations wishing to be considered for qualification for the competitive procurement process for radar solutions for the RN Merlin HM2.
“The radar is to provide helicopter-based surface search for general maritime surveillance including small target detection/tracking, situational awareness and contribution to collision avoidance. Additional capabilities including air detection and over-land modes should be identified,” the MoD solicitation said, adding that responses should be submitted by 18 February. No further details or timelines were disclosed.”
@Janes
As someone with some experience with this particular project, I have to agree that “enhance” is the wrong word (indeed, the use of the term “VR” is also inappropriate; they’re 360 videos viewed in a headset).
The training is designed as a supplement to “prime” the student prior to engaging with actual sims so that they can progress through the sims and to sorties faster. The areas where the most drastic improvements can be seen are in familiarisation and procedures, e.g. start-up/shut-down. There is no expectation of this replacing sim/sortie hours due to how linear it is.
The idea behind EVITs (or rather VITs in this case, as they’re not actually “embedded”) is that we can enable students to be instructed on SOPs and famils that would otherwise require access to equipment (FFSs, PTTs, FTDs, aircraft, etc.) and instructors via a 360 VIT session. It supplements and speeds up the training program and reduces the demand for other, more scarce, training resources.
Imagine you want to teach a student to do a walkaround of a Chinook. What would be (we expect) more effective, quicker, and potentially cheaper:
– Trying to schedule a Chinook and instructor to be available (whenever that might be) and having the student spend a few hours learning the process; or
– Organising the Chinook and instructor to be available (whenever that might be) to train the students AND having a VIT headset-based course that they can take home with them and review and revise as many times as they’d like
Well explained. It seems people’s default setting is to complain about something when they don’t understand the subject matter.
From the trainee’s comment in the last paragraph, it seems that for those “new” to the aircraft and drills it allows them to practise their roles and responsibilities, and to interact with the other crew, in a benign environment, so that they don’t initially feel overwhelmed. We have all practised and trained during our time in the military to ensure we are slick and competent, so to follow on from A&D’s comment: yes, it will save money, but hopefully it will also save wear and tear on expensive equipment and not put anyone in danger.
Exactly.
If it is used pre-sortie to rehearse the sortie, then it is very valuable.
The value is also in simulated emergencies and sensory-overload experiences, where instructors will be looking for the point at which the pilot/crew becomes overwhelmed.
It also helps develop muscle memory. A prime example is the Typhoon and Hands On Throttle And Stick (HOTAS). HOTAS is the primary means by which the pilot interacts with the aircraft and employs its weapon systems, yet there are something like 20 buttons/switches on the two controls, used in combination to switch between various radar modes, for example. When a pilot is detached to a desk job etc., I have known a few make a mock-up of the two sticks, including the buttons, to maintain currency, basically to keep up to speed on the button combinations. If the cockpit interactions can be included via VR using realistic controls, pilots can then operate them as second nature rather than looking at the controls, thus speeding up the decision loop.
This is one of the end goals of the broader project; however, one of the biggest hurdles will be haptics. What you’re describing would still require physical peripherals to ensure that actions are performed correctly and that there is suitable physical feedback to ensure the correct kinaesthetic experience for the student. In short, pressing virtual buttons that may be in the wrong virtual space, even if off by a few mm, and then trying to transfer that to the actual platform is highly risky, and may do more harm than good with respect to muscle memory.
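To make that alignment concern concrete, here is a minimal, purely illustrative Python sketch. None of the control names, coordinates, or the 2 mm tolerance come from the actual project; they are assumptions for the example. It simply compares where each control sits in a virtual cockpit model against its measured position on the physical platform and flags anything outside tolerance before it would be used for procedural training.

    # Purely illustrative sketch: control names, coordinates and the tolerance
    # below are made up, not taken from the EVIT/VIT project.
    import math

    # Measured control positions on the physical platform, in metres (assumed values).
    physical_controls = {
        "fuel_pump_1": (0.412, 0.135, 0.880),
        "apu_start":   (0.398, 0.141, 0.876),
    }

    # Positions of the same controls in the virtual cockpit model, in metres (assumed values).
    virtual_controls = {
        "fuel_pump_1": (0.4135, 0.135, 0.880),  # ~1.5 mm out on one axis
        "apu_start":   (0.398, 0.146, 0.876),   # ~5 mm out on one axis
    }

    TOLERANCE_M = 0.002  # assumed 2 mm threshold for acceptable misalignment

    def offset_m(a, b):
        """Straight-line distance between two 3D points, in metres."""
        return math.dist(a, b)

    for name, phys in physical_controls.items():
        err = offset_m(phys, virtual_controls[name])
        status = "OK" if err <= TOLERANCE_M else "MISALIGNED"
        print(f"{name}: offset {err * 1000:.1f} mm -> {status}")

The point is only that even millimetre-level offsets are measurable and can be gated on before anyone builds muscle memory against them; the real work, as noted above, is in the haptics and peripherals rather than the maths.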