3. Describe some problems that you had with a test automation tool.
I had several problems working with test automation tools, such as:
a. Tool limitations in object detection / recognition.
b. Tool configuration / deployment in various environments.
c. Tool precision / default skeleton script issues, such as window
synchronization issues.
d. Tool bugs with respect to exception handling.
e. Abnormally polymorphic tool behavior: sometimes it works and sometimes
it does not, for the same application, same script, and same environment.
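Synchronization problems like those in (c) and (e) are commonly worked around with an explicit polling wait instead of fixed sleeps. A minimal Python sketch (the `wait_until` helper, its timings, and the lambda condition are illustrative, not part of any specific tool's API):

```python
import time

def wait_until(condition, timeout=10.0, interval=0.25):
    """Poll `condition` until it returns a truthy value, or raise
    TimeoutError once `timeout` seconds elapse. Polling replaces the
    fixed sleeps that usually cause window-synchronization flakiness."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(interval)
    raise TimeoutError("condition not met within %.1f s" % timeout)

# Usage: in a real script the condition would query the tool's object
# map (e.g. "is the dialog visible yet?"); here it is a plain lambda.
value = wait_until(lambda: "window-ready", timeout=1.0, interval=0.05)
```

The same script then fails fast with a clear timeout error instead of passing or failing unpredictably depending on machine speed.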
Here are some of the attributes of test automation that can be measured:
Maintainability
• Definition: The effort needed to update the test automation suites for each new release.
• Possible measurements: For example, the average work effort in hours to update a test
suite.
Reliability
• Definition: The accuracy and repeatability of the test automation.
• Possible measurements: The number of test failures caused by defects in the tests
themselves rather than in the software under test.
Flexibility
• Definition: The ease of working with all the different kinds of automation testware.
• Possible measurements: The time and effort needed to identify, locate, restore, combine
and execute the different test automation testware.
Efficiency
• Definition: The total cost related to the effort needed for the automation.
• Possible measurements: Monitoring over time the total cost of automated testing, i.e.
resources, material, etc.
Portability
• Definition: The ease of adapting the test automation to a new environment.
• Possible measurements: The time and effort needed to get the automated tests running
in a new environment.
Robustness
• Definition: The usefulness of the automation on an unstable or rapidly changing system.
• Possible measurements: The number of tests failing due to unexpected events rather
than real defects.
Usability
• Definition: The extent to which automation can be used by different types of users
(developers, non-technical people, and other users).
• Possible measurements: The time needed to train users to become confident and
productive with test automation.
10. How will you evaluate a tool for test automation?
Whenever a tool has to be evaluated, we need to go through a few important
verifications / validations of the tool, such as:
a. Platform support from the tool.
b. Protocols / technologies support.
c. Tool cost.
d. Tool type and its features vs. our requirements analysis.
e. Tool usage comparisons with other similar tools available in the market.
f. Tool's compatibility with our application architecture and development
technologies.
g. Tool configuration & deployment requirements.
h. Tool limitations analysis.
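Items d and e are often combined into a simple weighted scoring matrix. A minimal sketch in Python (the criteria, weights, tool names, and scores are made-up illustrations, not recommendations):

```python
# Weighted scoring matrix for comparing candidate automation tools.
# Criteria, weights, and scores below are hypothetical examples; the
# evaluation team would pick its own and score each tool 1-5.
criteria_weights = {
    "platform_support": 0.30,
    "technology_support": 0.30,
    "cost": 0.20,       # higher score = cheaper
    "usability": 0.20,
}

tool_scores = {
    "Tool A": {"platform_support": 5, "technology_support": 3,
               "cost": 2, "usability": 4},
    "Tool B": {"platform_support": 4, "technology_support": 4,
               "cost": 4, "usability": 3},
}

def weighted_score(scores, weights):
    """Sum of each criterion score multiplied by its weight."""
    return sum(scores[c] * w for c, w in weights.items())

# Rank candidates from best to worst total score.
ranking = sorted(tool_scores,
                 key=lambda t: weighted_score(tool_scores[t], criteria_weights),
                 reverse=True)
```

Writing the weights down forces the team to agree on what matters before looking at vendor feature lists.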
16. What types of scripting techniques for test automation do you know?
A scripting technique determines how to structure automated test scripts for maximum
benefit and minimum impact of software changes. It covers scripting issues, scripting
approaches (linear, shared, data-driven, and programmed), script pre-processing, and
minimizing the impact of software changes on test scripts. The major ones I have used
are:
a. Data-driven scripting.
b. Centralized application-specific / generic compiled modules / library
development.
c. Parent-child scripting.
d. Techniques to generalize the scripts.
e. Increasing the reusability of the scripts.
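Data-driven scripting (item a) separates the test logic from the test data, so the same script runs once per data row. A minimal self-contained Python sketch (the `login_is_valid` check and the data rows are invented for illustration; in practice the data would live in an external CSV or spreadsheet):

```python
import csv
import io

# Test data kept separate from the script; embedded as a string here
# only so the example is self-contained.
TEST_DATA = """username,password,expected
alice,secret123,pass
bob,,fail
,secret123,fail
"""

def login_is_valid(username, password):
    """Stand-in for the real step that drives the application under test."""
    return bool(username) and bool(password)

def run_data_driven(data):
    """Run the same test logic once per data row; return per-row results."""
    results = []
    for row in csv.DictReader(io.StringIO(data)):
        actual = "pass" if login_is_valid(row["username"], row["password"]) else "fail"
        results.append((row["username"], actual == row["expected"]))
    return results

results = run_data_driven(TEST_DATA)
```

Adding a new test case is then a one-line data change, with no script edit at all.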
18. What tools are available to support testing during the software development life
cycle?
TestDirector for test management and Bugzilla for bug tracking and notification are
examples of tools that support testing.
22. How do you find out whether a tool works well with your existing system?
While evaluating any tool, we should ensure a few things to make sure that it fits
well with the existing system:
a. The tool should support our system development and deployment
technologies.
b. The tool should be compatible with all the third-party tools used
by our application.
c. The tool should support all platforms that our application supports for
deployment.
d. There should be no major environmental settings required by the tool to
work with the application, as these might cause problems for the existing
system.
e. The tool should not create any conflict with the other tools existing in
the current system. (e.g. there is a Java-CORBA-SSL environmental conflict
if we have WinRunner 7.5 and LoadRunner 7.5, even though both tools
support Java-CORBA.)
f. The tool should not create any memory conflict issues for the application.
23. Describe some problems that you had with a test automation tool.
This question is the same as Question No. 15.
24. What are the main attributes of test automation?
These are the same measurable attributes listed under Question 3 above:
maintainability, reliability, flexibility, efficiency, portability, robustness,
and usability.
25. What testing activities might you want to automate in a project?
This is a repeated question; refer to Q 14.
26. How do you find out whether a tool works well with your existing system?
This is a repeated question; refer to Q 22.
Load Testing:
1. What criteria would you use to select Web transactions for
load testing?
In the case of load testing for web-based applications, the transaction
selection criteria should include:
a. Web transactions for load testing should follow the business
flow.
b. Web transactions for load testing should concentrate more on
functionality than on test data.
c. Web transactions should concentrate on the functionalities
that will carry the most load in the real world.
7. How can data caching have a negative effect on load testing results?
Under load, the application will respond faster if responses are cached,
since requests are not actually sent to the server; the measured timings
therefore will not reflect what an actual un-cached response would give.
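The skew can be demonstrated with a toy in-memory cache; the `fetch_page` functions and the replayed `/checkout` transaction below are simulated for illustration:

```python
import functools

server_hits = {"count": 0}

def fetch_page_uncached(url):
    """Simulated server round trip; every call really reaches the 'server'."""
    server_hits["count"] += 1
    return "<html>%s</html>" % url

@functools.lru_cache(maxsize=None)
def fetch_page_cached(url):
    """Same request, but repeated calls are served from the local cache."""
    return fetch_page_uncached(url)

# A load test that replays the same transaction 100 times:
for _ in range(100):
    fetch_page_cached("/checkout")

# Only one request ever reached the server, so 99 of the 100 measured
# response times say nothing about real server performance.
hits_with_cache = server_hits["count"]
```

This is why load test tools typically offer options to disable or clear caches between iterations.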
8. What usually indicates that your virtual user script has
dynamic data that is dependent on your parameterized fields?
In VuGen scripts, the indication is syntax that creates HTML (local)
parameters and uses them, rather than parameterizing using data
files (e.g. web_create_html_param()).
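Handling such dynamic data (correlation) boils down to extracting a server-generated value from one response and reusing it in the next request. A generic Python sketch of the idea (the session-id markup and the URLs are invented for illustration; VuGen does the equivalent with its HTML-parameter functions):

```python
import re

# First response contains a server-generated value (here, a session id).
response_1 = '<html><input name="session_id" value="A1B2C3"></html>'

# Extract it -- the equivalent of creating an HTML (local) parameter.
match = re.search(r'name="session_id" value="([^"]+)"', response_1)
session_id = match.group(1)

# Reuse the captured value in the next request of the transaction,
# instead of replaying the value recorded at script-creation time.
next_request = "/checkout?session_id=" + session_id
```

Replaying the recorded value instead of the captured one is exactly what makes such scripts fail on the second run.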
General questions:
1. What types of documents would you need for QA, QC, and Testing?
2. What did you include in a test plan?
3. Describe any bug you remember.
4. What is the purpose of the testing?
5. What do you like (not like) in this job?
6. What is quality assurance?
7. What is the difference between QA and testing?
8. How do you scope, organize, and execute a test project?
9. What is the role of QA in a development project?
10. What is the role of QA in a company that produces software?
11. Define quality for me as you understand it
12. Describe to me the difference between validation and verification.
13. Describe to me what you see as a process. Not a particular process, just the basics of
having a process.
14. Describe to me when you would consider employing a failure mode and effect
analysis.
15. Describe to me the Software Development Life Cycle as you would define it.
16. What are the properties of a good requirement?
17. How do you differentiate the roles of Quality Assurance Manager and Project
Manager?
18. Tell me about any quality efforts you have overseen or implemented. Describe some
of the challenges you faced and how you overcame them.
19. How do you deal with environments that are hostile to quality change efforts?
20. In general, how do you see automation fitting into the overall process of testing?
21. How do you promote the concept of phase containment and defect prevention?
22. If you come onboard, give me a general idea of what your first overall tasks will be as
far as starting a quality effort.
23. What kinds of testing have you done?
24. Have you ever created a test plan?
25. Have you ever written test cases or did you just execute those written by others?
26. What did you base your test cases on?
27. How do you determine what to test?
28. How do you decide when you have 'tested enough?'
29. How do you test if you have minimal or no documentation about the product?
30. Describe to me the basic elements you put in a defect report.
31. How do you perform regression testing?
32. At what stage of the life cycle does testing begin in your opinion?
33. How do you analyze your test results? What metrics do you try to provide?
34. Realising you won't be able to test everything - how do you decide what to test first?
35. Where do you get your expected results?
36. If automating - what is your process for determining what to automate and in what
order?
37. In the past, I have been asked to verbally start mapping out a test plan for a common
situation, such as an ATM. The interviewer might say, "Just thinking out loud, if you were
tasked to test an ATM, what items might your test plan include?" These types of questions
are not meant to be answered conclusively, but they are a good way for the interviewer to
see how you approach the task.
38. If you're given a program that will average student grades, what kinds of inputs would
you use?
39. Tell me about the best bug you ever found.
40. What made you pick testing over another career?
41. What is the exact difference between Integration & System testing, give me examples
with your project.
42. How did you go about testing a project?
43. When should testing start in a project? Why?
44. How do you go about testing a web application?
45. Difference between Black & White box testing
46. What is Configuration management? Tools used?
47. What do you plan to become after say 2-5yrs (Ex: QA Manager, Why?)
48. Would you like to work in a team or alone, why?
49. Give me 5 strong & weak points of yours
50. Why do you want to join our company?
51. When should testing be stopped?
52. What sort of things would you put down in a bug report?
53. Who in the company is responsible for Quality?
54. Who defines quality?
55. What is an equivalence class?
56. Is a "A fast database retrieval rate" a testable requirement?
57. Should we test every possible combination/scenario for a program?
58. What criteria do you use when determining when to automate a test or leave it
manual?
59. When do you start developing your automation tests?
60. Discuss what test metrics you feel are important to publish in an organization.
62. Describe the role that QA plays in the software lifecycle.
63. What should Development require of QA?
64. What should QA require of Development?
65. How would you define a "bug?"
66. Give me an example of the best and worst experiences you've had with QA.
67. How does unit testing play a role in the development / software lifecycle?
68. Explain some techniques for developing software components with respect to
testability.
69. Describe a past experience with implementing a test harness in the development of
software.
70. Have you ever worked with QA in developing test tools? Explain the participation
Development should have with QA in leveraging such test tools for QA use.
71. Give me some examples of how you have participated in Integration Testing.
72. How would you describe the involvement you have had with the bug-fix cycle
between Development and QA?
73. What is unit testing?
74. Describe your personal software development process.
75. How do you know when your code has met specifications?
76. How do you know your code has met specifications when there are no specifications?
Here are some questions you might be asked on a job interview for a testing opening:
(from MU COSC 198 Software Testing by Dr. Corliss)
Interviewing Suggestions
1. If you do not recognize a term, ask for further definition. You may know the
methodology/term but have used a different name for it.
2. Always keep in mind that the employer wants to know what you are going to do
for them; with that in mind, you should always stay positive.
Preinterview Questions
1. What is the structure of the company?
2. Who is going to do the interview (possible background information about the
interviewer)?
3. What is the employer's environment (platforms, tools, etc.)?
4. What are the employer's methods and processes used in the software arena?
5. What is the employer's philosophy?
6. What is the project you are interviewing for all about? Gather as much
information as possible.
7. Any terminologies that the company may use.