
Some Social Problems in Modern Navigation Systems

Published online by Cambridge University Press: 23 November 2009

V. David Hopkin
Affiliation: Royal Air Force Institute of Aviation Medicine

Extract

A navigation system is not normally changed for social reasons. When changes are made, they are intended to meet new or more stringent requirements, to utilize technological advances, to increase efficiency, to improve safety, or to extend the functions which can be fulfilled. Innovations are evaluated according to their technical feasibility and cost. Not only have social factors failed to influence the nature of changes, but the social implications of changes have also been ignored. Yet greater benefit would often accrue from changes if their social effects were predicted and allowed for. Such predictions are usually feasible.

Social problems in navigation systems are gradually becoming more severe. In the past, man could often compensate for their effects because his role remained sufficiently dominant for him to be innovative and flexible in resolving difficulties, and he retained a full understanding of the system and control over it. More recent aids tend to curtail human communication, restrict the nature of teamwork, and reduce the social aspects of such functions as consultation, supervision, and verification. Man-machine relationships are emphasized, instead of relationships between people. Each man tends to be more remote from his colleagues, and to know less about what they are doing. These effects are generally incidental rather than the reflection of a deliberate policy.

Type: Research Article

Copyright: © The Royal Institute of Navigation 1980

