CAT came across two concepts recently - aesthetics in software testing and Conway's law.
At first glance, these two are not directly linked with software testing. However, when I was listing the primary inputs that we (testers) use apart from the '(so-called) documented requirements', I realized that such inputs are seldom considered, and as a result we lose out on a few crucial aspects.
To elaborate further, let me list out the typical aspects that we consider apart from requirements -
- methodologies
- techniques
- frameworks
- statistics
- theories
- industry benchmarks / trends
CAT witnessed a presentation at a conference - the speaker (T. Ashok from STAG Software) spoke on the aesthetics of software testing. It truly is a nice concept; we generally do not dedicate ourselves to bringing beauty into software testing activities like test design, team composition, etc.
It is not that hard to bring in "beauty". For example, test artifacts can be beautified with good grammar, clarity and 'apt' diagrams rather than huge paragraphs. Testing processes can be beautified with clarity coupled with discipline. Testing tools can be beautified with naming standards, comments, sound scripting practice and so on (see the small sketch below)...
A point to note - whenever we add beauty, we are in fact adding value to testing! So beautifying is not merely a surplus-time activity.
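To make the 'testing tools' part a little concrete, here is a minimal, purely illustrative sketch (in Python; the scenario, names and stubbed login call are all made up for this post) of what a test script looks like once naming standards and comments are taken seriously:

```python
# Illustrative only: a hypothetical check whose names and comments tell
# the whole story - a reader-friendly test name, Arrange / Act / Assert
# structure, and a documented stub standing in for the real application.

import unittest


def attempt_login(username, password):
    """Stand-in for the real login call (hypothetical stub for this sketch)."""
    class Result:
        succeeded = False
        reason = "invalid credentials"
    return Result()


class LoginTests(unittest.TestCase):
    """Checks the login behaviour of a hypothetical application."""

    def test_login_fails_with_wrong_password(self):
        # Arrange: a known user with a deliberately incorrect password.
        username, password = "alice", "wrong-password"

        # Act: attempt the login via the stub above.
        result = attempt_login(username, password)

        # Assert: the attempt is rejected and the reason is explicit.
        self.assertFalse(result.succeeded)
        self.assertEqual(result.reason, "invalid credentials")


if __name__ == "__main__":
    unittest.main()
```

Nothing in the sketch is clever; the 'beauty' lies entirely in the fact that a new team member can read it once and know exactly what is being checked and why.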
Conway's law (an eponymous law) suggests that 'organizations that design systems are generally constrained to produce designs which are copies of their communication structures'. Wow! Yet another interesting thought! It is as if an organization's soft skills get reflected in its 'product'.
Yet we as testers rarely focus on an organization's soft skills...
Well, CAT is not suggesting that we translate an organization's communication structure into a test case and then dig out defects... nor is the suggestion to focus on beautification like 'colors, fonts, widths, huge coding-standard documents, etc.' beyond an acceptable limit.
CAT is just asking all testing professionals to raise questions (to oneself) like -
- Do we pay enough attention to the organization's soft skills?
- Has my team worked on beautifying its deliverables in the true sense?
- Are we ignoring an obvious loophole in an existing 'soft skill' and its associated impact on the product under test?
A few unconventional inputs to look for (in addition to those listed above) -
- the inter-organization communication structure (as suggested by Conway's law)
- existing QA / QC activities
- the structure of the BA / product owner team(s)
- the management structure
- infrastructure availability
These might well provide a useful link or association with one of the existing pain areas, loopholes, etc.
We find a number of models / approaches / methods in the industry that identify the common problems associated with existing testing activities. A closer look indicates that these 'unconventional inputs' provide exactly the vital information captured in an 'assessment model', 'compliance model' or 'audit checklist'. In CAT's opinion, an individual should first focus on and understand the unconventional inputs, and only then analyse them using one of the industry models (a small sketch of such a checklist follows below).
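As a purely hypothetical illustration (not taken from any particular industry model), here is how the unconventional inputs above could be written down as a lightweight checklist before being mapped onto a formal assessment:

```python
# Hypothetical sketch: the "unconventional inputs" captured as a simple
# checklist of questions to carry into an assessment / audit exercise.
# The categories and questions are illustrative, not an industry standard.

UNCONVENTIONAL_INPUTS_CHECKLIST = {
    "communication structure (Conway's law)": [
        "Which teams talk to each other directly, and which only via managers?",
        "Do module boundaries in the product mirror these team boundaries?",
    ],
    "existing QA / QC activities": [
        "What reviews, audits or gate checks already exist?",
        "Which of their findings never reach the test team?",
    ],
    "BA / product owner structure": [
        "How many owners sign off on a single requirement?",
        "Where do conflicting interpretations get resolved?",
    ],
    "management structure": [
        "Who can change scope late, and how does the test team learn of it?",
    ],
    "infrastructure availability": [
        "Which environments are shared, and when are they contended?",
    ],
}


def open_questions(checklist):
    """Flatten the checklist into one list of questions for the assessment."""
    return [q for questions in checklist.values() for q in questions]


if __name__ == "__main__":
    for question in open_questions(UNCONVENTIONAL_INPUTS_CHECKLIST):
        print("-", question)
```

Writing the questions down is the whole trick - once they are explicit, mapping the answers onto a formal assessment or audit model becomes a much smaller step.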
And once we understand the problem areas from this exercise? We can design a solution - it could be a transitioning approach, the formation of a test centre of excellence (CoE), automation, an improvement in the test strategy, a change in the existing toolkit... anything.
So let us think unconventionally a bit...