Have you ever encountered the following problem? Suppose you have written a text document. There are some very obvious typos in it, but no matter how often you read and check your text, you just can't detect those errors. They somehow resemble blind spots. The same holds for writing source code, even though the compiler will at least point you to all syntax errors. These blind spots are even harder to find when they appear on the semantic or logical level. For example, you are absolutely sure that you should use a depth-first search, while in fact a breadth-first search is more appropriate, such as in a chess program.

What is the reason for these blind spots? I don't know whether there is a biological or psychological explanation. However, I assume that the brain, due to its filtering capabilities, simply lets you overlook the problem because it focuses on the whole picture rather than on the details.
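To make the depth-first-versus-breadth-first blind spot concrete, here is a minimal Python sketch (the knight-move scenario, board size, and function name are my own illustration, not from the post): a query for the minimum number of knight moves between two squares. A depth-first variant would compile and run just as happily, but would silently return the length of whatever path it happens to find first; only the level-by-level exploration of breadth-first search guarantees the minimal move count, and no compiler will flag the difference.

```python
from collections import deque

# All eight L-shaped knight moves on a chessboard.
MOVES = [(2, 1), (1, 2), (-1, 2), (-2, 1),
         (-2, -1), (-1, -2), (1, -2), (2, -1)]

def knight_moves_bfs(start, target):
    """Return the minimum number of knight moves from start to target
    on an 8x8 board. BFS visits positions in order of increasing depth,
    so the first time we reach the target the move count is minimal --
    exactly the property a naive DFS would not provide."""
    frontier = deque([(start, 0)])
    seen = {start}
    while frontier:
        (x, y), depth = frontier.popleft()
        if (x, y) == target:
            return depth
        for dx, dy in MOVES:
            nxt = (x + dx, y + dy)
            if 0 <= nxt[0] < 8 and 0 <= nxt[1] < 8 and nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return -1  # unreachable; cannot happen on a full 8x8 board

print(knight_moves_bfs((0, 0), (7, 7)))  # prints 6
```

The subtle part is that swapping the `deque` for a stack (i.e., turning the BFS into a DFS) still produces a program that runs without errors and returns *a* number, which is precisely the kind of semantic blind spot a second pair of eyes tends to catch.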
What does this imply for software engineering? From my viewpoint, it is a great example of why pair programming and the four-eyes principle are important, and also why code and architecture reviews are so essential. Whenever I write a document or create a design, I always ask someone else to cross-check it. Blind spots are a fundamental human problem; thus, it is not a good idea to pretend they do not exist.