Reflections: Seminar 4
Reference: Software Disasters: Boeing 737 MAX
The Boeing 737 MAX disaster revolved around an upgraded Boeing 737 model that was released hurriedly in an attempt to remain competitive against Airbus in the aircraft market. Fitting a more powerful engine onto the old airframe changed the aircraft's aerodynamics, so a software system had to be built to compensate for the change. However, this software system was not disclosed to the prospective airlines, and as a result insufficient training (or rather a lack thereof) left pilots ill-prepared to handle the issues of the new aircraft.
I believe the two plane catastrophes were the fault of Boeing's management team, who placed time pressure on the engineers and sacrificed thoroughness in favour of a quick release. The lack of comprehensive testing, documentation and training for the new system ultimately places the blame on Boeing's management, as the aircraft would have been much safer if proper measures had been taken instead of corners being cut.
It can be said that there was a lack of a sustainable team dynamic within the project, as safety concerns were ignored and discarded. A "just culture" should be adopted, in which no one is afraid to raise issues and problems.
Ultimately, Boeing failed to correctly draw the line between a system upgrade and an entirely new feature that would require proper training and demonstrated pilot competency.
Reference: Software Disasters: Therac-25
The Therac-25 was the third iteration of a medical linear accelerator used for treating patients with radiation therapy. Compared to its predecessors, the Therac-25 placed an emphasis on purportedly sophisticated software control. However, as a result of a mistake in the software, patients' lives were put at risk.
A question that could be asked is: "How much is a human life worth?"
Of course, this question is rhetorical, and anyone who gives a numeric figure as an answer should not be working in the medical field.
The purpose of the Therac-25 was (and is) to save human lives, and as such the safety of the patient, as well as of the radiologist, should be the most important criterion and concern of the product. Any action that could possibly harm or jeopardise this safety should be seriously discussed and carefully documented.
However, as seen in this case study, several parties failed to put safety before other factors such as cost and speed of delivery.
As a result of hindsight bias (trusting in previously successful models), the manufacturer believed that this product was also safe; however, there was a lack of testing of the machine. The fault also partially lies with the regulatory body (the FDA), which failed to meticulously inspect and check the product.
Reference: Software Disasters: Uber Autonomous Vehicle
In the case study of the accident surrounding Uber's autonomous vehicle, the ethical issue of sacrificing safety in favour of cost arises again. By reducing the number of supervisors that had to be present in the vehicle during test drives, Uber could test more vehicles at the same time; however, this decision increased the risk for all pedestrians, other motorists and the drivers themselves.
Negligent driving behaviour (the operator being distracted by their phone), paired with the lack of a second pair of human eyes, meant that the fail-safe of a third means of control (the second supervisor) was removed, leaving control purely to the software being tested and a single operator.
Consequently, I believe the fault lies with the driver, who would have been aware of their responsibility as the sole supervisor. Questions can still be raised against Uber, however, as to its stance on cost-effectiveness and driver safety.
It was interesting to see a similarity between all of the different case studies presented this week, as well as the Killer Robot case study from week 2.
In almost all cases, there was an issue that in some form decreased the importance placed on safety, whether as a result of time constraints, financial constraints, or managerial overruling.
In an ideal world, with unlimited budget and time, many of the shortcomings of all of these systems could be mitigated and addressed. Whilst this is of course realistically impossible, it is important to learn from these mistakes to better equip ourselves for scenarios that we will experience in the workforce.