Case study into Incident report 12.10.14 Mike McGrath
Introduction
My name is Mike McGrath, I am (or was) a professional skydiver with over 12,000 jumps (7,000 tandem jumps) and was the Chief Instructor at NSPC (Newcastle Sport Parachute Club).
Skydiving has been my life’s work for over 20 years. I have jumped all over the world which has given me the opportunity to work with and learn from the best of the best. As a human being, my ego now and again tricks me into a false sense of confidence, into thinking that ‘I know what I know’. Luckily life is there to remind me that there is still so much to learn…
Had you asked me before the 12th October 2014 how I would deal with discovering an inverted 3 ring on a tandem deployment, I would have answered that IT COULD NOT HAPPEN TO ME!
This is not because I think I am special, or infallible in any way, but because I have consistently performed a physical, visual and verbal 10 point safety check prior to exit on every one of my 7,000+ tandems. The 10th and final point in that process is a visual and physical check of both 3 rings and my RSL status. Therefore, logically, this simply could not happen, right? Wrong!
Even as I write this addendum to an otherwise straightforward incident report (my gear checks failed to pick up an inverted 3 ring packing error on the left riser assembly; the load of the opening shock was transferred to the loop and bendix cable, which were damaged and could have broken at any second; I couldn't cut away and had to land it), I find it hard to reconcile the fact that I missed this equipment defect during both of my pre-jump checks (1. on the ground before donning the rig, and 2. in the aircraft prior to exit).
However, after reviewing the video evidence of the jump I can only conclude that I must have missed this because the misrouted 3 ring is clearly visible on the ground, in the aircraft, and in free-fall.
The bottom line therefore is that this has happened and as such I accept full responsibility.
What follows is not an attempt to transfer, mitigate, reduce or otherwise deny my responsibility. Rather it is a personal exploration into how I could have let this happen and how, if at all, I can prevent it from happening again.
For the purposes of understanding, I have broken this report into four parts:
Part 1- Objective circumstances: what appears to have actually happened, based on the video of the jump and incident investigation
Part 2- Subjective experience: what I thought was happening at the time and what I did about it in the heat of the moment
Part 3- What I have learned: the role of human factors in incidents
Part 4 – What I have learned: cognitive biases as contributing factors
WARNING: This report contains content which may be disturbing to some people.

Part 1- The objective circumstances
We all know that incidents occur as a result of several small factors or variables combining. On examining this incident after the fact I have created a list of variables that may or may not have contributed to its occurrence. This is not a list of excuses that mitigate the situation- again I want to state that I accept full responsibility for the situation and accept the consequences of any disciplinary action.
Much of this content is speculation and is written here for the purpose of discussion and exploration:
- It was a very busy day with 37 tandems, 6 AFF students that wanted to do as many jumps as possible, and many fun jumpers.
- We had 4 tandem masters (TM) and 2 dedicated AFF instructors. I recently took on a 5th TM to help free up more of my time to act as CI / DZSO. Alas, our 5th TM had a landing incident on an AFF jump the day before.
- I am a new CI and am learning to manage various aspects of supervising a busy operation while doing tandems, AFF and other jumps, and making a living.
- I am an active jumper and like to jump as much as possible. (Perhaps too much.)
- The incident occurred on load 3 of the day. I had done a tandem on load 1 and an AFF stage 4 on load 2.
- A qualified Packer B had packed the equipment.
Prior to donning my equipment I did my usual visual and physical check of the back of the equipment, and a visual check of the front of the equipment. My logic to date has been that any issue with the front of the equipment will be picked up during my final 10 point check. NOTE: I will now most certainly change this procedure to include a physical check of the front of the equipment prior to donning the equipment.
Another TM geared up my customer. This TM's procedure is to fully tighten the front harness adjuster on the main lift web on the ground and to tighten the adjusters on the back of the harness prior to connecting the customer. My procedure is the opposite: the last thing that I do before commencing my 10 point safety check is to tighten down the front of the harness and instruct the customer to hang on to their harness. On this jump I did not adjust the front of the harness. After the incident, in a moment of self-doubt, I thought that this change in procedure may have interrupted my usual chain of events and caused me to skip my 10 point check altogether. However, after checking with my customer 2 days after the incident, he confirmed that I did do my 10 point safety check.
As part of my 3 ring check I also check the RSL connection. On this jump, when I was under canopy, I had cause to check my RSL connection prior to trying to cut away. After the incident, in another moment of self-doubt, the fact that I had to double check this led me to suspect that I may not have checked it prior to exit, and therefore that I may not have checked my 3 rings or the other 9 points of my safety check. Again, this was not the case, because my customer confirmed that I did do my 10 point safety check.
Part 2- My subjective experience
I did not become aware of the problem until the opening sequence, when I heard an unusual sound in my left ear sometime during line stretch. After deployment I checked the left 3 ring and found that it was not as it should be. The canopy was open and flying in every other respect. Unsure of the problem or its potential consequences, and whether or not I could cut away, I proceeded to examine it in detail. I found that something was very wrong with the configuration of the riser loop and the yellow bendix cutaway cable.
I could not completely ascertain exactly what was wrong. (At this point it did not occur to me that it could be a result of an inverted 3 ring for reasons that will become clear later).
I decided that if it was possible to cut away (I was now at 2800ft) then I should try. My logic being that I would rather be under a perfect reserve than an imperfect main. I thought it unlikely that the 3 ring release system would work correctly in its current configuration, but I gave it a go, first with one hand on each handle, then with both hands on the cutaway. I didn't try again as it became obvious that this was not an option.
My next step was to further examine the 3 ring situation. It soon became clear that there was a catastrophic failure of some form. I could see that the 2 smaller rings, the yellow bendix cable and the riser loop were a long way from the riser. I suspected, but wasn’t sure, that the riser loop had gotten hitched around the small ring somehow, and I tried, gingerly, to dislodge it. It wouldn’t move. At this point I stopped pursuing the option of cutting away and began exploring the consequences of flying this through to landing.
I suspected, but again was not sure, that my customer and I were suspended by the strength of the riser loop and/or the bendix cable. I was hopeful, but not at all confident, that it would hold our weight all the way down to the ground.
I had little choice that I could see at that time but to ride it out and hope for the best.
NOTE: In retrospect I could have considered using my hook knife to physically cut away; that did not occur to me at the time. That decision, however, may have brought with it its own additional problems.
I decided to leave the customer's lower attachment points connected until we were below 1000ft, in case the riser gave way and I had to execute emergency procedures. I flew with minimal and very light toggle input so as not to increase wing loading, and I did not re-stow my cutaway and reserve handles, so that I could cut away and deploy the reserve at a split second's notice if the left riser assembly gave way. If the worst happened and the riser loop gave way below 1000ft, I wanted to be able to get the reserve out as quickly as humanly possible.
Once this plan was formed, I decided, just in case of the worst case scenario, to leave a short message for my family on my GoPro hand-cam, to ask the customer if he wanted to do the same for his family, and I explained to the customer our situation as best as I could.
NOTE: Several people have challenged whether or not it was a good idea to tell the customer what was happening. My reply is simple: Every situation and every customer is different. This was my call alone to make. If I thought it was likely that this customer would panic I might not have told him. As it was I trusted this customer to stay cool and he did.
The bottom line here is that this is a very personal thing, and you don't know what you will do until it happens. This was, and forever will be, between me, my customer and whatever god either of us chooses to believe in. When you know that you may be experiencing the last moments of your life, and you are sharing your fate with the person on the front of you, then it comes down to this: do you want the last moments of your life, the last words that you may speak, to the last person that you see, to be a lie? My answer was no.
Once below 1000ft I disconnected the lower right attachment point but could not disconnect the lower left attachment point. We did a practice legs up for landing, and then set up for a straight in approach.
After landing I continued to film the 3 ring system and invited other senior instructors and a Packer A to inspect it with me. On inspection I (incorrectly) concluded that the force of the opening had pulled the yellow bendix cutaway cable and the riser loop through the grommet on the cutaway housing and the riser.
That this could have been due to an inverted 3 ring, and to my not picking it up during my gear checks, just did not seem possible to me. When I say that I simply couldn't believe that this could happen to me, it is not because I believe I am infallible in any way, but because my 10 point safety check is so comprehensive: it checks for this specific issue, and I have performed it religiously on every single tandem up until this point.
Of all the things that can and do happen on a tandem skydive, I was completely convinced THIS could not happen to me, despite the evidence in front of me. So what did I do? I went up on another tandem to “get back on the horse”.
Now understand that at this point I believed with total conviction that I had just been dealt a random catastrophic equipment failure which I had dealt with relative calm, integrity and professionalism. As far as I was concerned something had “happened to me” and I had “dealt with it”. The fact that this could have been partly or totally my fault, simply didn’t, wouldn’t or couldn’t enter my head.
I know how this sounds, and some of you might be thinking "what a fucking egomaniac" or "this guy thinks he can't go wrong", but that is not how I think. It was simple logic for me that if the last thing I did (apart from checking my drogue in the door) was to check my 3 ring configuration, then it simply can't be possible that an inverted 3 ring was the cause of the problem. Ipso facto, eliminate that from the list of possible causes.
To say that this was a shock to my system is the biggest understatement of my life. My very reality was shaken to the core. It was like the scene in Fight Club when you realise that Brad Pitt's character Tyler Durden and Edward Norton's narrator are actually the same person. Or like in The Wizard of Oz, when the world goes from black and white to colour in the blink of an eye.
I have believed up until now that if I follow certain procedures, check my gear and have a safety first attitude, then I can jump out of a plane in relative safety. This belief had kept me alive and skydiving safely for over 20 years. I now had to reconcile that belief with the video evidence in front of me showing me making one of the biggest and most clear cut mistakes in the business: failing to properly check my gear. My confidence was shattered.

Part 3- What I have learned: the role of human factors in incidents
I have learned that psychologists have a term for what I experienced post-jump. The negative emotions I experienced when unable to reconcile my subjective belief of what happened with the objective facts in the video are called "cognitive dissonance".
Cognitive dissonance is the mental stress or discomfort experienced by an individual who holds two or more contradictory beliefs, ideas, or values at the same time, or is confronted by new information that conflicts with existing beliefs, ideas, or values.
What happened had left me with a number of difficult questions. How had this happened? Had I totally missed my gear checks? Or had I done my gear checks and not picked it up? Which was worse? Most importantly: should I skydive again? What about tandems? And if so, how could I trust my checks from this point on?
I was disoriented and really confused. I decided that I would not jump again until I could resolve these questions for myself. So I set out to find out what I could.
After several weeks of online research and multiple conversations with ATSB (Australian Transport Safety Bureau) Human Factors investigators, an SAS patrol commander, psychologists, an Air Traffic Controller and Aviation Safety Officer, several senior members of the APF and countless fellow skydiving instructors, I have learned about several "Human Factors" which may have been at play in my incident.
NOTE: Despite the research presented below, I don’t believe I will ever know for certain how I managed to miss the inverted 3 ring during my 10 point check. But I hope that my research will go some way toward understanding.
Human Factors
Anyone involved in aviation will have heard the term “human error” before. Human errors are defined as ‘the result of actions that fail to generate the intended outcomes’. Human errors have their own science, a branch of psychology known as Human Factors. Human Factors investigators are the guys called onto the scene of an aircraft or motor vehicle accident in order to determine what went wrong and what can be learned.
Human Factors categorises errors according to the cognitive processes involved in the goal of the action, and according to whether they relate to the planning or the execution of the activity. Actions by human operators can fail to achieve their goal in two different ways:
1) The actions can go as planned, but the plan can be inadequate
In the case of planning failures, the person did what he or she intended to do, but it did not work: the goal or plan itself was wrong. This type of error is referred to as a mistake. A person intends to carry out an action, does so correctly, but the action is inappropriate and the desired goal is not achieved. A planning failure has occurred.
Planning failures are Mistakes. “Mistakes may be defined as deficiencies or failures in the judgmental and/or inferential processes involved in the selection of an objective or in the specification of the means to achieve it.”
Diagram 1: Classifying Human factors in incidents
2) The plan can be satisfactory, but the performance can still be deficient
A person intends to carry out an action, the action is appropriate, but the person carries it out incorrectly, and the desired goal is not achieved. An execution failure has occurred. Execution errors are further broken down into Slips and Lapses. They result from failures in the execution and/or storage stage of an action sequence. Slips relate to observable actions and are commonly associated with attentional or perceptual failures. Lapses are more internal events and generally involve failures of memory.
In a familiar and anticipated situation people perform a skill-based behaviour. At this level, they can commit skill-based errors (slips or lapses). In the case of slips and lapses, the person's intentions were correct, but the execution of the action was flawed - done incorrectly, or not done at all. This distinction is important: when the appropriate action is carried out incorrectly, the error is classified as a slip; when the action is simply omitted or not carried out, the error is termed a lapse.
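The classification above can be written down as a simple decision procedure. The sketch below is my own illustration of the mistake / slip / lapse taxonomy just described (the function and argument names are mine, not from the Human Factors literature):

```python
def classify_error(plan_adequate, action_performed, performed_correctly):
    """Classify a human error using the mistake / slip / lapse taxonomy."""
    if not plan_adequate:
        # Planning failure: the action went as intended, but the plan itself was wrong.
        return "mistake"
    if not action_performed:
        # Execution failure: the intended action was omitted (a memory failure).
        return "lapse"
    if not performed_correctly:
        # Execution failure: the action was attempted but carried out incorrectly.
        return "slip"
    return "no error"

# The two errors identified in this incident:
# 1) Ground check: a visual-only check of the front of the rig was an inadequate plan.
print(classify_error(plan_adequate=False, action_performed=True, performed_correctly=True))   # mistake
# 2) 10 point check: performed as planned, but the inverted 3 ring was not seen.
print(classify_error(plan_adequate=True, action_performed=True, performed_correctly=False))   # slip
```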
If we analyse my incident using the diagram above we can see that both a planning error and an execution error contributed to my situation.
1) The planning error - My practice on the ground of doing a physical check of the back of my equipment but only a visual check of the front of my equipment can be classified as a mistake, or planning error. Performing only a visual check of the front of the equipment on the ground was an inappropriate action.
2) The execution error - When I performed my 10 point safety check prior to exit but failed to pick up the packing error of an inverted left 3 ring, this error can be classified as a slip (an execution error). The appropriate action was carried out, but was carried out incorrectly.
Had I forgotten to perform my 10 point safety check altogether, as I had at one time suspected, this would have been classified as a lapse (also an execution error). In many ways I personally believe that this would have been easier for me to accept than having to accept that I performed my checks incorrectly.
Failure to perform a safety check is a relatively easy fix, just DON’T DO IT AGAIN!
However, performing the check but still failing to see the problem has had me doubting my own senses for several weeks and until I can resolve this to my own satisfaction there is no question of my taking another person for a jump.
I have come to accept that it wasn't my gear that malfunctioned but my brain. But I still have to decide whether or not I should continue skydiving. As I write, I still haven't jumped since the incident. I have a 2 year old son, and that changes how you see the world, especially when it comes to managing personal risk. I will be undergoing some pretty comprehensive retraining as well as psychological counselling in the hope of jumping again in the near future, but how I actually feel when I get in the air remains to be seen.
Part 4 – What I have learned: Cognitive bias in skydiving
Cognitive Bias – or “We’re only human”
A cognitive bias is a pattern of deviation in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion. Individuals create their own "subjective social reality" from their perception of the input. An individual's construction of social reality, not the objective input, may dictate their behaviour in the social world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.
Some cognitive biases are presumably adaptive. Cognitive biases may lead to more effective actions in a given context. Furthermore, cognitive biases enable faster decisions when timeliness is more valuable than accuracy. Other cognitive biases are a “by-product” of human processing limitations, resulting from a lack of appropriate mental mechanisms (bounded rationality), or simply from a limited capacity for information processing.
The execution error I made when I checked for, but failed to pick up, the inverted 3 ring packing error was likely due to one or more of the following cognitive biases, which we will explore further in this section:
- Optimism Bias
- Confirmation Bias
- Expectation Bias (Inattentional Blindness)
- Selective attention
- Automaticity
Much is known about the optimism bias (also known as unrealistic or comparative optimism). This is a bias that causes a person to believe that they are less at risk of experiencing a negative event compared to others.
There are four factors that cause a person to be optimistically biased: their desired end state, their cognitive mechanisms, the information they have about themselves versus others, and overall mood.
The optimistic bias is seen in a number of situations. For example: people believing that they are less at risk of being a crime victim, smokers believing that they are less likely to contract lung cancer or disease than other smokers, first-time bungee jumpers believing that they are less at risk of an injury than other jumpers, or traders who think they are less exposed to losses in the markets.
Although the optimism bias occurs for both positive events, such as believing oneself to be more financially successful than others, and negative events, such as being less likely to have a drinking problem, there is more research and evidence suggesting that the bias is stronger for negative events. Different consequences result from these two types of events: positive events often lead to feelings of well being and self-esteem, while negative events lead to consequences involving more risk, such as engaging in risky behaviours and not taking precautionary measures for safety.
Almost every human being who is not in a state of depression could be said to be suffering from optimism bias in some form. If they weren't, it would be hard to get out of bed in the morning. The best example I found to illustrate how most people are prone to optimism bias is the case of the wedding. The divorce rate among married couples is 2 in 5 (or 40%). That is a statistical fact. But ask every couple getting married if they think they will get divorced, and 5 out of 5 will believe they won't. So where do the divorced couples come from?
Skydivers and optimism bias
As far as skydivers are concerned, I suspect we have A LOT of optimism bias baked into our very culture! So much so that it may be our optimism bias that actually sets us apart from a lot of people who don't skydive. From your very first jump, when you accept the risk of jumping out of a plane, regardless of whether your estimate of the risk involved is way under or way over, you are taking an optimistic jump. You need to be a pretty positive person to risk your life jumping out of a plane, no matter how much you know, or don't know, about the objective risks of skydiving.
As a qualified skydiver you take your life into your own hands on a regular basis and you learn to have fun while doing it. If you do that enough times then you actually increase your optimism bias making you more likely to perceive a more positive outcome when assessing a risk situation than is actually likely. This is called confirmation bias. More on that in the next chapter.
Here is an interesting point though. As skydivers we don't just underestimate the risk of something bad happening to us; we also overestimate our ability to deal with a problem if it arises. This bias alone has some serious implications in terms of the potential to cause a fatal incident. Consider, for example, the ultimate optimism-biased skydiver: the guy who jumps without an AAD.
Optimism bias and skydiving instructors.
As AFF instructors, we try to be as objective as possible when assessing an AFF first jump candidate, but even then we are prone to optimism bias. We assess AFF candidates on their ability to perform such tasks as their main deployment sequence. If they can do it on the ground under pressure, we assume that they will do it in the sky. Otherwise, how would we ever take them jumping? Nevertheless, every AFF instructor has had to dump out heaps of people who fail to execute their main deployment sequence (at a guess, close to at least 1 in 20, but I'd love to have hard data on this). Despite this knowledge, we continue to take AFF students into the air, overestimating the likelihood of their doing entirely the right thing, and our ability to deal with it if they don't. But what about their ability to perform their reserve procedures when there is no instructor around to assist them? We can simulate and practice emergency situations as much as we like, but you still never know how you will perform until the time actually comes. Ergo, a healthy dose of optimism bias is absolutely necessary in order to become a skydiver.
Optimism bias and Dropzone culture.
Dropzones, it could be said, actually cultivate optimism bias, through what is called the "underlying effect", which suggests that overall positive experiences and positive attitudes lead to more optimistic bias in events. That is ok, because optimism bias has been shown to be a good thing when applied to positive events (jumping out of a plane and surviving, even managing to learn something at the same time). But it is not such a good thing for negative events, such as estimating the probability of something bad happening to you. In that case, it is better for your subjective perspective to be as close to the objective facts of the matter as possible. By becoming aware of when optimism bias is affecting us, we have the chance to manage it. That could be the difference between jumping on a load that kills you and waiting it out on the ground.
Optimism bias is broken down into 4 areas, all of which are applicable to skydivers:
- Self-enhancement
- Self-presentation
- Personal control/perceived control
- Cognitive mechanisms
I recommend reading up on Wikipedia because I won't go into them all here. I will, however, pay closer attention to the one factor that I suspect accentuated my optimism bias in particular: personal control, or perceived control.
People tend to be more optimistically biased when they believe they have more control over events than others do. For example, experienced jumpers are less likely to be nervous about landing their canopy, but more likely to be nervous about landing in the plane. They don't like not having control.
Studies have suggested that the greater perceived control someone has, the greater their optimistic bias. It also turns out that control is a stronger factor when it comes to personal risk assessments than when assessing others. The optimistic bias is strongest in situations where an individual needs to rely heavily on their own direct action and responsibility. Hence, a chief instructor with 10,000 jumps is inherently predisposed to thinking that young jumpers with 200 jumps cannot get away with things that they themselves thought they could get away with when they had 200 jumps.
Perhaps this also explains why chief instructors' attitudes towards safety and risk management sometimes conflict with the attitudes of younger jumpers.
Think about it the next time your DZSO threatens to ground you for doing something just a little bit outside the box and you think he is acting like a control freak. Remember that when you finally have as much experience under your belt as your chief instructor does, you too will be just as prone to overestimating your own abilities in a given situation and underestimating those of the people around you.
Confirmation Bias.
If optimism bias is doing a tandem and thinking "it is going to be ok", then confirmation bias is having thousands of jumps and thinking "I was right".
Here is how it works. If you believe something is true, you notice information in the world that supports your world view more than you notice information that doesn’t.
I have known about confirmation bias for a while, thanks to a very cool TEDx talk that a connection of mine, Ash Donaldson, gave a couple of years ago.
One example that I actually use to explain confirmation bias involves driving errors and racial stereotyping.
Regardless of how much we dislike racial stereotyping, a lot of people believe that Asians are consistently worse than average drivers. They aren't. Confirmation bias is why this happens.
If we see a Caucasian make a blatant error on the road, we think "oh, that's odd, you don't see that very often". If we see an Asian make the exact same error, we think "typical Asian driver!" What happens is that you remember the Asian error, literally saving it in your mental equivalent of a database, but you don't remember the Caucasian error.
In terms of skydiving, the implications of confirmation bias should be obvious. If you underestimate the likelihood of negative events happening to you in the first place, every time you jump you prove yourself right. If you think that you're a better canopy pilot than your peers, then you will notice your peers making more mistakes than you. Perhaps most worryingly, YOU WILL BELIEVE THAT THIS IS ACTUALLY THE REALITY OF LIFE!
Expectation bias:
After performing a checklist hundreds of times, an individual can become predisposed to "see" the item in the correct position, even when it is not, i.e. "looking without seeing".
Expectation bias, or expectancy, is a factor that can influence the visual system, including how and where people look for information.
Among the factors identified by Wickens and McCarley as affecting the visual system are habit, salience, event rate (the more frequently an event happens in an area, the more individuals will look at that area) and contextual relevance (individuals look at something they believe has relevant information there).
You can get some personal experience of expectation bias by studying the sentence in the triangle below. (Read it several times just to make sure you've got it.)

PARIS
IN THE
THE SPRING
Did you read "Paris in the spring" or "Paris in the the spring"? Tests that I have performed myself show that 8 out of 10 people read it incorrectly. Moreover, when challenged to read it again and again, they continue to read "Paris in the spring", and get visibly frustrated when you ask them to read it a fourth time.
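A machine, of course, has no expectations, so it sees the duplicated word immediately. A throwaway sketch of my own to make the point (the sign text is the classic version of the illusion):

```python
# The classic sign, with the line break that helps hide the duplicated word.
sign = """PARIS
IN THE
THE SPRING"""

words = sign.split()        # splitting on whitespace flattens the line break
print(words)                # ['PARIS', 'IN', 'THE', 'THE', 'SPRING']
print(words.count("THE"))   # 2 -- the word most readers see only once
```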
In an aircraft cockpit a common technique to prevent “looking without seeing” is to “Point and Shoot”, or respond to each challenge by pointing or touching the item before verbalising the response. This technique encourages crosschecking and makes the checklist procedure more deliberate.
“Point and Shoot’ is something we teach our AFF students as part of their reserve procedure when locating their handles. It is also something that we continue to do throughout our skydiving career when we perform our gear checks in the plane.
Inattentional blindness or the ‘looked-but-failed-to-see-effect’, is a failure to perceive what would appear to others as an obvious visual stimulus. This occurs when an individual’s attention is engaged on another task and does not necessarily mean an individual was ‘not paying attention’ but that the individual’s attentional resources were occupied elsewhere. All individuals have limited attentional resources, so it is possible to miss vital visual stimuli if attention is allocated to another task.
Check out this 60 second selective attention test on YouTube. It is a lot of FUN!
Research on human information processing suggests that inattentional blindness can be influenced by workload, expectation, conspicuity and capacity.
Gear checks in the plane prior to exit are an essential part of our safety procedures; however, there can be a lot going on in the aircraft directly before exit. Things can get pretty busy up there at that time, especially for instructors. Inattentional blindness then reinforces the need for a comprehensive gear check on the ground, where it is quiet and we are not rushed. I for one will most certainly be beefing up the way I do my gear checks on the ground for each and every jump from now on.
A wonderful example of looking but not seeing, or perhaps more accurately, “seeing but not reading”, was shared with me by Sean Walsh, a highly experienced Instructor B, Air Traffic Controller and Aviation Safety Officer for the RAAF.
Sean’s hypothesis is that when people look at their watch they often don’t see what the time is but instead they see what the time isn’t.
Example: Imagine a person is asked the time shortly after having checked their watch. How many could answer straight away and how many would need to check their watch again? For those that need to check again, is it because they CHECK the time but do not actually READ the time?
It may be that this also occurs more frequently in cases where people have an event coming up in the future.
Imagine two groups of people. Group A is told that they will have to answer a very personal question at exactly 10am. Group B is not told this. Would group A check their watch more frequently in the period leading up to 10am? Probably yes. But because there is a stronger focus on doing something at a specific point in the future (10am), would a higher proportion of group A focus on what the time isn’t and just see, but not read, their watch?
As skydivers, Sean suspects that many of us actually do this when we CHECK our altimeter.
I would be interested to conduct an experiment where a tandem master asks ten tandem customers to read their altimeter at some point on the way down and read the result back to the instructor after opening. Call this group A. Then ask ten more to do the same thing, but also ask them to tap the altimeter at 5,000ft to acknowledge it is time to open the parachute. This is group B.
Would more of group A remember the altitude from their first altimeter check more consistently than group B? If so, would this be because group B were just checking that it wasn’t yet 5,000ft…?
Automaticity
While not a bias in the strictest sense, Automaticity refers to the fact that humans who perform tasks repeatedly will eventually learn to perform them automatically (Pascual, Mills and Henderson, 2001).
The implications for skydivers here are obvious. From an operational perspective, skydiving professionals methodically remove variables from the process of making a tandem jump. We do this because fewer variables mean fewer unknowns, which in turn reduces the probability of a knowledge based mistake. (see the Human Factors diagram on page 8)
That is advantageous; however, Automaticity teaches us that the flip side is that we also remove novelty and individual attention from the experience. This can lead to a lapse execution error, such as a person automatically performing a function (like a checklist item) without actually being cognisant of the task itself, and so assuming that the item is correctly configured even when it is not. It appears that is what happened in my case: having checked my 3 ring assembly 7,000 times before, I couldn’t see the problem that was staring me in the face.
It is ironic that the more often we do something, the more at risk we become of certain cognitive failures, but that is what the science seems to be telling us. This doesn’t mean that less experienced skydivers are safer than more experienced skydivers. It just means that more experienced skydivers may be at greater risk of cognitive biases, such as looking at but not seeing something during their gear checks.
So what can we learn from my mistake from an operational automaticity perspective?
In standardising every aspect of skydiving operations we need to balance the advantages of reducing knowledge based mistakes through automation and standardisation against the disadvantages of increasing lapse related execution failures brought on by automaticity.
Based on Automaticity, we could argue that Phil Onis, owner operator of Sydney Skydivers and Australia’s most experienced skydiver, is at higher risk than anyone of making an automaticity related error. When I spoke with Phil about my incident he emphasised how he deals with this on his DZ. Simply put, Phil is a big fan of everyone cross checking each other’s equipment in the boarding area and again in the plane.
As humans, we are all capable of cognitive bias and of missing something in our gear checks. That is why we keep an eye out for each other in the plane and on the ground. But how often do less experienced jumpers think they need to keep an eye out for their more experienced counterparts? How would they feel about double checking the gear of the local skydiving legends on their DZ?
If we now know that more experience makes us more fallible in certain areas, perhaps next time you jump at Picton DZ, you might want to cast an eye over Phil Onis’s gear in the boarding area. Having spoken with Phil, my guess is he would be thankful to you for doing it.
A spiritual perspective on cognitive bias
Many spiritual teachers point to how deeply cognitive biases affect us as human beings when they tell us that “perception is projection”. This really gets to the heart of the matter, because it argues that none of us actually sees reality objectively; instead, we each see our own subjective version of reality, filtered through a mental projection of thoughts and emotions derived from our past experience. According to these teachers, reality as we experience it is something we create in our brains, NOT something objective, true or real out there in the world. Yet most of us believe that the way we see the world is the way it actually is, and not just how we see it. Which is interesting, because in my world my 3 ring was hooked up correctly, or so I believed (and would have argued to the death), until I watched the video…
Conclusion
So what does it mean if the majority of skydivers are prone to such cognitive biases as optimism and confirmation bias?
Well, it makes the camp fire on a DZ an extraordinarily positive place to hang out on a Saturday night, for one thing. Which, from my life experience of skydiving, I believe to be a truly good thing.
Skydiving and hanging out with skydivers has given me a lot of confidence in the potential and competency of human beings: in myself, in those around me, and in our species as a whole.
I have been so confident in human abilities that, over 12,000 times, I have jumped out of a plane, been in free-fall for 60 seconds, survived, even enjoyed it, and managed to learn something on the way down.
To do this requires a degree of confidence both in my own abilities and in my fellow humans’ abilities to make and execute any number of critical decisions and actions. We need to trust other people to fly the plane safely, fuel the plane, manage the ground operations and pack our main and reserve parachutes. Not to mention the people who design and build these wonderful tools we skydivers use as toys.
Now I ask you: is the confidence in oneself and other people that allows us to do such a “crazy” thing just a question of “blind faith” in human beings, or is it the result of applying the scientific method to our experience of the world and intermixing that with a healthy dose of optimism bias?
I would like to argue the latter, because I still think human beings, especially ones that can fly their bodies, are pretty bloody awesome!
Having said that, as a result of this incident, my understanding of the frailties and shortcomings of the human mind has undergone a significant re-adjustment to bring it more into line with the objective evidence now presented. We are fallible. Very fallible. And it would appear that some of the time we can’t even trust our own senses.
In the light of that re-adjustment, I now need to re-evaluate on a personal level whether or not I want to, should, would, or could continue to jump out of a plane every day as a job. And furthermore, whether I can take responsibility for the life of another person as a tandem master.
My confidence as a skydiver and as a human being has been badly shaken by this incident. Especially with respect to my ability to manage risk and skydive in relative safety.
We say that “what doesn’t kill you only makes you stronger”. In my personal experience, what doesn’t kill me includes a canopy collapse from 100ft due to turbulence in 1994, and then hooking in and breaking my back with around 500 jumps in 1997, that one due strictly to youthful over confidence and cognitive bias.
I count these as two of the most enlightening experiences of my life, as they are burned so deeply into my memory that I am unlikely to have to learn from such mistakes again.
The learning involved in writing this report has been cathartic. So I hope that if I start jumping again in the time ahead that I will be able to say the same about this incident. Only time will tell.
If you see me on a DZ in the years to come, feel free to ask about the worst moment in my skydiving career. If I don’t jump again, then I hope that anyone who reads this, especially those with thousands of jumps and decades in the sport, can learn something from my mistakes and from what I have learned in the weeks that followed.
Blue Skies,
Mike McGrath 12.11.14
PS – Last weekend I made my first fun jump back after 5 weeks away from the sport.
It was fantastic to get back in the air and a real confidence boost to jump with the awesome people that came up with me. One jump over a whole weekend is not a lot for someone who loved to do 10-15 jumps a day, but this has been a great first step on the road back into professional skydiving.
Mike McGrath 19.11.14
CASA (Civil Aviation Safety Authority) recently published a cut down version of this article in the Close Calls section of their Flight Safety magazine. (My first paid writing gig 🙂) Hopefully others will get to learn from my experience.
http://www.flightsafetyaustralia.com/2016/03/three-ring-mistake/