
It's not rocket science: How effective communication is key to preventing future space calamities

Charlotte Tomlinson looks at the space travel disasters that came down to failures of communication


On January 28th 1986, 73 seconds into its flight, the Challenger space shuttle exploded, killing all seven crew members on board[1].

The Challenger Disaster, 28th January 1986

The explosion was caused by the failure of a joint in the right solid rocket booster at lift-off: the O-ring seals used in the joint had not been designed to function at extremely cold temperatures. The Rogers Commission revealed that NASA managers had known about the flaw in the O-rings since 1977, and that they had failed to pass on to their superiors the engineers' warnings about the dangers of launching on that cold morning[2]. Despite this, they had claimed that the probability of mission failure was 1 in 100,000. Among the seven who died was Christa McAuliffe, a schoolteacher who was meant to be the first ordinary citizen in space, a symbol of the shuttle's commitment to safety[3].


On April 24th 1967, after a day in orbit, Soyuz 1 crashed into the ground at 144 km/h, making its pilot Vladimir Komarov the first person to die during a spaceflight[4]. The immediate cause was a faulty parachute deployment system. But that unfortunate mission would never have gone ahead were it not for the political pressure to launch in time for the 50th anniversary of the Bolshevik Revolution and the May Day celebrations, and for an obstructive management culture[5]. The design organization in charge of the Soyuz program was the Central Design Bureau of Experimental Machine Building (TsKBEM)[6]. From 1966 it was headed by Vasilii Pavlovich Mishin, a brilliant engineer who nonetheless lacked the communication skills needed to be a great leader and manager. It was under Mishin's command that TsKBEM carried out three automated test flights of the Soyuz, all of which failed spectacularly. The two attempts that reached orbit also ran into trouble at re-entry, with failures in components such as the heat shield and the parachute system. Yet the crewed Soyuz 1 was launched regardless.


Returning to Challenger: Richard Feynman, a famous member of the Rogers Commission, conducted his own investigation and found that the chance of failure was closer to 1 in 100[7]. He concluded by recommending that NASA officials and management deal in the "world of reality": that they set realistic flight schedules and estimates of success based on the actual findings of their engineers[8]. That can only be achieved through effective communication between management and its rocket scientists.


Both disasters were preventable. The problems were known years in advance, yet bureaucracy and poor communication were the chief causes of these terrible incidents.

There is no “dark side” to science – only human misdirection and miscommunication, which threaten scientific achievement and societal progress.


More information:

[3] Feynman, R. P. (1993) What Do You Care What Other People Think?: Further Adventures of a Curious Character. London: HarperCollins, p. 236.

[7] Feynman, R. P. (1993) What Do You Care What Other People Think?: Further Adventures of a Curious Character. London: HarperCollins, p. 220.

[8] Feynman, R. P. (1993) What Do You Care What Other People Think?: Further Adventures of a Curious Character. London: HarperCollins, p. 237.


From Issue 22: the Dark Side of Science



