
Test Automation:

1. What automated testing tools are you familiar with?


→ LoadRunner and WinRunner.

2. How did you use automated testing tools in your job?


→ I used automated testing tools for regression and performance testing.

3. Describe some problems that you had with automating testing tools.
→ I have had several problems working with test automation tools, such as:
a. Tool limitations in object detection and recognition.
b. Tool configuration and deployment across different environments.
c. Precision issues in the tool's default skeleton scripts, such as window synchronization failures.
d. Tool bugs in exception handling.
e. Non-deterministic tool behavior: the same script sometimes passes and sometimes fails against the same application in the same environment.

4. How do you plan test automation?

→ Planning is the most important task in test automation. A test automation plan should cover the following items:
a. Tool selection: the type of test automation expected (regression, performance, etc.).
b. Tool evaluation: tool availability, license availability, and license limitations.
c. Tool cost estimation versus the project's cost estimation for testing.
d. Resource requirements versus resource availability.
e. Time estimates versus the time actually available.
f. Analysis of production requirements with respect to factors such as load/performance, expected functionality, and scalability.
g. Definition of the test automation process, including the standards to be followed while automating.
h. Definition of the test automation scope.
i. Automation risk analysis, and planning for how to respond if the identified risks emerge during automation.
j. Reference documents required as prerequisites for test automation.

5. Can test automation improve test effectiveness?

→ Yes, test automation definitely plays a vital role in improving test effectiveness, in various ways:
a. Fewer slips caused by human error.
b. UI verification at the level of objects and object properties.
c. Virtual users and virtual load in load/performance testing, where it is not feasible to deploy that many physical resources and still get accurate results.
d. Precise time measurements.
e. And many more.
6. What is data-driven automation?
→ Data-driven automation is an important part of test automation where the requirement is to execute the same test cases with different sets of test input data, so that the test can be executed for a pre-defined number of iterations with a different set of input data in each iteration.
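
A minimal sketch of the idea in Python, assuming a hypothetical login() function under test and an inline data table (a real suite would usually read the rows from an external CSV file or spreadsheet):

    import csv
    import io

    # Inline CSV standing in for an external test-data file; each row is one test iteration.
    TEST_DATA = "\n".join([
        "username,password,expected",
        "alice,correct-pass,success",
        "alice,wrong-pass,failure",
        ",correct-pass,failure",
    ])

    def login(username, password):
        """Hypothetical system under test: accepts one known credential pair."""
        return "success" if (username, password) == ("alice", "correct-pass") else "failure"

    def run_data_driven_test():
        # The same test logic runs once per data row.
        for row in csv.DictReader(io.StringIO(TEST_DATA)):
            actual = login(row["username"], row["password"])
            status = "PASS" if actual == row["expected"] else "FAIL"
            print(f"{status}: login({row['username']!r}, {row['password']!r}) -> {actual}")

    run_data_driven_test()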

7. What are the main attributes of test automation?

→ Here are some attributes of test automation that can be measured:

Maintainability

• Definition: The effort needed to update the test automation suites for each new release.
• Possible measurements: For example, the average work effort in hours needed to update a test suite.

Reliability

• Definition: The accuracy and repeatability of your test automation.
• Possible measurements: The number of times a test failed due to defects in the tests or in the test scripts.

Flexibility

• Definition: The ease of working with all the different kinds of automation testware.
• Possible measurements: The time and effort needed to identify, locate, restore, combine, and execute the different test automation testware.

Efficiency

• Definition: The total cost related to the effort needed for the automation.
• Possible measurements: Monitoring over time the total cost of automated testing, i.e., resources, material, etc.

Portability

• Definition: The ability of the automated tests to run on different environments.
• Possible measurements: The effort and time needed to set up and run test automation in a new environment.

Robustness

• Definition: The effectiveness of automation on an unstable or rapidly changing system.
• Possible measurements: The number of tests failed due to unexpected events.

Usability

• Definition: The extent to which automation can be used by different types of users (developers, non-technical people, and other users).
• Possible measurements: The time needed to train users to become confident and productive with test automation.

8. Does automation replace manual testing?

→ Automation cannot actually replace manual testing 100%, but it can replace almost 90% of the manual test effort if the automation is done efficiently.

9. How will you choose a tool for test automation?

→ The factors to consider when choosing a test automation tool are:
a. The type of testing expected (e.g., regression, functional, or performance/load testing).
b. Tool cost versus the project's testing budget.
c. The protocols supported by the tool versus the protocols the application is designed around.
d. The tool's limitations versus the application's test requirements.
e. The hardware, software, and platform support of the tool versus the application's test scope for those attributes.
f. Tool license limitations and availability versus test requirements (tool scalability).

10. How will you evaluate the tool for test automation?
→ Whenever a tool has to be evaluated, we need to go through a few important verifications and validations, such as:
a. Platform support.
b. Protocol and technology support.
c. Tool cost.
d. The tool type and its features versus an analysis of our requirements.
e. Usage comparisons with similar tools available on the market.
f. The tool's compatibility with our application architecture and development technologies.
g. The tool's configuration and deployment requirements.
h. Analysis of the tool's limitations.

11. What are the main benefits of test automation?

→ The main benefits of test automation are:
a. It saves major testing time.
b. It saves resources (human, hardware, and software).
c. It reduces verification slips caused by human error.
d. It allows verification at the level of object properties, which is difficult to do manually.
e. It generates virtual users and virtual load for load testing, which is not worth doing manually because it needs many resources and may not give results as precise as an automation tool can achieve.
f. It supports regression testing.
g. It supports data-driven testing.
12. What could go wrong with test automation?
→ While using test automation, various factors can distort the testing process:
a. The tool's own limitations might be reported as application defects.
b. Abnormal tool behavior, such as scalability variation caused by the tool's own memory violations, might be counted as an application memory violation in heavy load tests.
c. Environment settings required by the tool can make the application appear buggy. For example, Java/CORBA support required a JDK to be present on the system, and that JDK installation itself caused apparent application bugs; I experienced this myself, and after uninstalling the JDK and the Java add-ins my application worked fine.

13. How would you describe testing activities?

→ The basic testing activities are as follows:
a. Test planning (prerequisite: adequate documentation for the project under test).
b. Test case preparation (same prerequisite).
c. Cursor test (a very basic test to make sure that all screens come up and the application is ready to test or to automate).
d. Manual testing.
e. Test automation (provided the product has become stable enough to be automated).
f. Bug tracking and bug reporting.
g. Analysis of the tests and creation of the test report.
h. If the bug-fixing cycle repeats, steps c through g are repeated.

14. What testing activities might you want to automate?

→ Anything that is repeated should be automated if possible. The following testing activities are good candidates:
a. Test case preparation.
b. Tests such as cursor, regression, functional, and load/performance tests.
c. Test report generation.
d. Test status/result notifications.
e. Bug tracking, etc.

15. Describe common problems of test automation.

→ In test automation we come across several problems; I would highlight a few:
a. Maintenance of automation scripts, which becomes difficult if the product goes through frequent changes.
b. The tool's limitations in recognizing objects.
c. The tool's limitations in third-party integration.
d. Abnormal tool behavior due to scalability issues.
e. Defects in the tool itself may be mistaken for application defects, so an issue gets logged as an application bug.
f. Environment settings and APIs or add-ins required to make the tool work with specialized environments such as Java/CORBA can themselves cause the application to malfunction (e.g., the environment variables for WinRunner 7.05 Java support can cause the application under test to malfunction).
g. Many more issues come up during actual automation.

16. What types of scripting techniques for test automation do you know?
→ A scripting technique is the way automated test scripts are structured for maximum benefit and minimum impact from software changes; the main approaches are linear, shared, data-driven, and programmed scripting, along with script pre-processing to minimize the impact of software changes on the scripts. The major ones I have used are:
a. Data-driven scripting.
b. Centralized, application-specific or generic compiled modules and library development.
c. Parent-child scripting.
d. Techniques to generalize the scripts.
e. Increasing the reusability of the scripts. (A small sketch contrasting linear and shared, data-driven scripting follows below.)
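
To illustrate the difference between a plain linear script and a shared, data-driven one, here is a small Python sketch; the search function and the data are made up:

    # Shared library function: one generalized, reusable step instead of
    # the same code copy-pasted linearly into every script.
    def search_and_verify(query, expected_count, search_fn):
        results = search_fn(query)
        assert len(results) == expected_count, (
            f"query {query!r}: expected {expected_count} results, got {len(results)}")

    # Hypothetical application under test.
    CATALOG = ["apple", "apricot", "banana"]
    def search(query):
        return [item for item in CATALOG if query in item]

    # Data-driven driver: the same shared step runs once per data row.
    for query, expected in [("ap", 2), ("ban", 1), ("kiwi", 0)]:
        search_and_verify(query, expected, search)
    print("all iterations passed")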

17. What are the principles of good testing scripts for automation?

→ The major principles of good automation test scripts are:
a. Scripts should be reusable.
b. Coding standards should be followed, which makes scripts easier to update, understand, and debug.
c. Scripts should be environment- and data-independent as far as possible, which can be achieved through parameterization.
d. Scripts should be generalized.
e. Scripts should be modular.
f. Repeated tasks should be kept in functions to avoid duplicated code, reduce complexity, and make debugging easier.
g. Scripts should be readable, with appropriate comments for each line or section.
h. The script header should contain the developer's name, the last-updated date, the script's environmental requirements, details of the environment it was scripted against, prerequisites on the application side, a brief description, the contents, the scope, etc. (A sample header appears below.)
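
As a sketch, such a header might look like this in a Python-based suite (all field values here are illustrative):

    # ------------------------------------------------------------------
    # Script       : login_regression.py   (illustrative name)
    # Developer    : <developer name>
    # Last updated : <date>
    # Environment  : e.g., Windows 10 / Chrome / staging server
    # Prerequisites: test user provisioned; application build deployed
    # Description  : Regression checks for the login screen
    # Contents     : positive login, negative login, lockout check
    # Scope        : UI layer only
    # ------------------------------------------------------------------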

18. What tools are available to support testing during the software development life cycle?
→ TestDirector for test management and Bugzilla for bug tracking and notification are examples of tools that support testing.

19. Can the activities of test case design be automated?

→ Yes. TestDirector is one such tool; it has features for test case design and execution.

20. What are the limitations of automating software testing?

→ Among the limitations of automating software testing, I would mention a few that I have come across:
a. Automation needs a lot of time in the initial stages.
b. Every tool has its own limitations with respect to protocol support, supported technologies, object recognition, supported platforms, etc., so 100% of an application can rarely be automated; there is always something beyond the tool's reach that has to be overcome with research and experimentation.
c. The tool's own memory utilization is also an important factor: it can tie up the application's memory resources and create problems for the application in some cases, such as Java applications.

21. What skills are needed to be a good test automator?

→ The basic skills a test automator should have are:
a. Programming skills.
b. The basics of a procedural language.
c. Generic skills in automation tool configuration and deployment.
d. Some basic knowledge of coding standards, which is good to have.
e. The skill to interpret the results given by the tool and analyze them far enough to meet the requirements of the test.

22. How do you find out whether tools work well with your existing system?
→ While evaluating any tool, we should ensure a few things to make sure it fits the existing system well:
a. The tool should support our system's development and deployment technologies.
b. The tool should be compatible with all third-party tools used by our application.
c. The tool should support all platforms that our application supports for deployment.
d. The tool should not require major environment changes that might cause problems for the existing system.
e. The tool should not conflict with other tools already in the system. (E.g., there is a Java-CORBA-SSL environment conflict when WinRunner 7.5 and LoadRunner 7.5 are installed together, even though both tools support Java-CORBA.)
f. The tool should not create memory conflicts for the application.

23. Describe some problems that you had with automating testing tools.
→ This is a repeated question; refer to Questions 3 and 15.
24. What are the main attributes of test automation?

→ This is a repeated question; refer to Question 7.
25. What testing activities might you want to automate in a project?
→ This is a repeated question; refer to Question 14.

26. How do you find out whether tools work well with your existing system?
→ This is a repeated question; refer to Question 22.

Load Testing:
1. What criteria would you use to select Web transactions for load testing?
→ For load testing of web-based applications, the transaction selection criteria should include:
a. Web transactions chosen for load testing should follow the business flow.
b. Web transactions should concentrate on functionality rather than on test data.
c. Web transactions should concentrate on the functionality that will carry the most load in the real world.

2. For what purpose are virtual users created?

→ The purpose of creating virtual users in load testing is to simulate real-world users, concurrently, while using minimal systems and resources.

3. Why is it recommended to add verification checks to all your scenarios?
→ The basic purposes of verification in any load test are:
1. To check whether the application's response is valid.
2. Verification checks are useful for deciding the execution flow.
3. Verification should be done intensively to expose the application's bottlenecks under peak load.
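
A verification check is essentially an assertion on the response content; a minimal Python sketch (the response text is made up):

    def verify_response(body, expected_text):
        # Fail fast if the server returned an error page instead of the real content.
        if expected_text not in body:
            raise AssertionError(f"verification failed: {expected_text!r} not found")

    body = "<html><title>Order Confirmation</title>...</html>"
    verify_response(body, "Order Confirmation")
    print("verification check passed")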

4. In what situation would you want to parameterize a text verification check?
→ Parameterizing a text check is required when the text displayed is dynamic, differing from one run to the next, but must still be validated as consistent with the application's functionality.

5. Why do you need to parameterize fields in your virtual user script?

→ To execute the same test cases with different sets of test input data.

6. What are the reasons why parameterization is necessary when load testing the web server and the database server?
→ To validate differences in server performance for different test inputs, and to handle cases where data uniqueness constraints come into play (a sketch of generating unique per-iteration data follows below).
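
For instance, when many virtual users insert records into a database with unique-key constraints, each iteration can generate unique input data. A Python sketch, with hypothetical field names:

    import uuid

    def unique_user_rows(vuser_id, iterations):
        # Yield per-iteration input data that cannot collide across virtual users.
        for i in range(iterations):
            yield {
                "username": f"loaduser_{vuser_id}_{i}",      # unique per vuser + iteration
                "email": f"{uuid.uuid4().hex}@example.com",  # globally unique
            }

    for row in unique_user_rows(vuser_id=7, iterations=3):
        print(row)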

7. How can data caching have a negative effect on load testing results?
→ If responses are cached, the application under load responds faster because requests never actually reach the server, so the measured timings will not reflect the real, un-cached response times.
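
Common mitigations are request headers that forbid caching and a unique "cache-busting" query parameter per request. A Python sketch of building such a request (the URL is hypothetical and nothing is actually sent):

    import time
    import urllib.parse
    import urllib.request

    url = "http://example.test/app/report"  # hypothetical endpoint
    # A unique query parameter forces a fresh, un-cached response.
    busted = url + "?" + urllib.parse.urlencode({"nocache": int(time.time() * 1000)})

    req = urllib.request.Request(busted, headers={
        "Cache-Control": "no-cache",  # ask caches not to serve stored copies
        "Pragma": "no-cache",         # HTTP/1.0 equivalent
    })
    print(req.full_url)
    print(req.headers)
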
8. What usually indicates that your virtual user script has dynamic data that is dependent on your parameterized fields?
→ Finding, in VuGen scripts, syntax that creates HTML (local) parameters and uses them, rather than parameterizing from data files (e.g., web_create_html_param();).

9. What are the benefits of creating multiple actions within any virtual user script?
→ The benefits of action-based scripts are the following (a sketch appears below):
1. Code reuse within the same Vuser at different points.
2. Conditional execution of particular blocks of code.
3. Iteration settings for specific blocks of code.
4. Importing and exporting actions from one script to another.
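
The action structure (vuser_init / Action / vuser_end, with per-action iteration counts) can be mimicked in plain Python; the function names follow the VuGen convention, and everything else is illustrative:

    def vuser_init():
        print("init: log in once per virtual user")

    def action_search():
        print("action: one search transaction")

    def action_checkout():
        print("action: one checkout transaction")

    def vuser_end():
        print("end: log out")

    # Driver: iteration counts are set per action, and actions can be
    # reordered or reused, which is the point of splitting the script up.
    vuser_init()
    for _ in range(3):  # e.g., 3 iterations of the search action
        action_search()
    action_checkout()   # 1 iteration of checkout
    vuser_end()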

General questions:

1. What types of documents would you need for QA, QC, and Testing?
2. What did you include in a test plan?
3. Describe any bug you remember.
4. What is the purpose of testing?
5. What do you like (not like) in this job?
6. What is quality assurance?
7. What is the difference between QA and testing?
8. How do you scope, organize, and execute a test project?
9. What is the role of QA in a development project?
10. What is the role of QA in a company that produces software?
11. Define quality for me as you understand it.
12. Describe to me the difference between validation and verification.
13. Describe to me what you see as a process. Not a particular process, just the basics of
having a process.
14. Describe to me when you would consider employing a failure mode and effect
analysis.
15. Describe to me the Software Development Life Cycle as you would define it.
16. What are the properties of a good requirement?
17. How do you differentiate the roles of Quality Assurance Manager and Project
Manager?
18. Tell me about any quality efforts you have overseen or implemented. Describe some
of the challenges you faced and how you overcame them.
19. How do you deal with environments that are hostile to quality change efforts?
20. In general, how do you see automation fitting into the overall process of testing?
21. How do you promote the concept of phase containment and defect prevention?
22. If you come onboard, give me a general idea of what your first overall tasks will be as
far as starting a quality effort.
23. What kinds of testing have you done?
24. Have you ever created a test plan?
25. Have you ever written test cases or did you just execute those written by others?
26. What did you base your test cases on?
27. How do you determine what to test?
28. How do you decide when you have 'tested enough?'
29. How do you test if you have minimal or no documentation about the product?
30. Describe the basic elements you put in a defect report.
31. How do you perform regression testing?
32. At what stage of the life cycle does testing begin in your opinion?
33. How do you analyze your test results? What metrics do you try to provide?
34. Realising you won't be able to test everything - how do you decide what to test first?
35. Where do you get your expected results?
36. If automating - what is your process for determining what to automate and in what
order?
37. In the past, I have been asked to verbally start mapping out a test plan for a common
situation, such as an ATM. The interviewer might say, "Just thinking out loud, if you were
tasked to test an ATM, what items might your test plan include?" These types of questions
are not meant to be answered conclusively, but they are a good way for the interviewer to
see how you approach the task.
38. If you're given a program that will average student grades, what kinds of inputs would
you use?
39. Tell me about the best bug you ever found.
40. What made you pick testing over another career?
41. What is the exact difference between Integration & System testing? Give me examples
from your project.
42. How did you go about testing a project?
43. When should testing start in a project? Why?
44. How do you go about testing a web application?
45. Difference between Black & White box testing
46. What is Configuration management? Tools used?
47. What do you plan to become after, say, 2-5 years? (E.g., QA Manager. Why?)
48. Would you like to work in a team or alone, why?
49. Give me 5 strong & weak points of yours
50. Why do you want to join our company?
51. When should testing be stopped?
52. What sort of things would you put down in a bug report?
53. Who in the company is responsible for Quality?
54. Who defines quality?
55. What is an equivalence class?
56. Is "a fast database retrieval rate" a testable requirement?
57. Should we test every possible combination/scenario for a program?
58. What criteria do you use when determining when to automate a test or leave it
manual?
59. When do you start developing your automation tests?
60. Discuss which test metrics you feel are important to publish in an organization.
61. Describe the role that QA plays in the software lifecycle.
62. What should Development require of QA?
63. What should QA require of Development?
64. How would you define a "bug?"
65. Give me an example of the best and worst experiences you've had with QA.
66. How does unit testing play a role in the development / software lifecycle?
67. Explain some techniques for developing software components with respect to
testability.
68. Describe a past experience with implementing a test harness in the development of
software.
69. Have you ever worked with QA in developing test tools? Explain the participation
Development should have with QA in leveraging such test tools for QA use.
70. Give me some examples of how you have participated in Integration Testing.
71. How would you describe the involvement you have had with the bug-fix cycle
between Development and QA?
72. What is unit testing?
73. Describe your personal software development process.
74. How do you know when your code has met specifications?
75. How do you know your code has met specifications when there are no specifications?

76. Describe your experiences with code analyzers.


77. How do you feel about cyclomatic complexity?
78. Who should test your code?
79. How do you survive chaos?
80. What processes/methodologies are you familiar with?
81. What type of documents would you need for QA/QC/Testing?
82. How can you use technology to solve problems?
83. What type of metrics would you use?
84. How do you find out whether tools work well with your existing system?
85. What automated tools are you familiar with?
86. How well do you work with a team?
87. How would you ensure 100% coverage of testing?
88. How would you build a test team?
89. What problems do you have right now or have you had in the past? How did you solve them?
90. What will you do during your first day on the job?
91. What would you like to do five years from now?
92. Tell me about the worst boss you've ever had.
93. What are your greatest weaknesses?
94. What are your strengths?
95. What is a successful product?
96. What do you like about Windows?
97. What is good code?
98. Who is Kent Beck, Dr Grace Hopper, Dennis Ritchie?
99. What are basic, core practices for a QA specialist?
100. What do you like about QA?
101. What has not worked well in your previous QA experience and what would you
change?
102. How will you begin to improve the QA process?
103. What is the difference between QA and QC?
104. What is UML and how to use it for testing?
105. What is CMM and CMMI? What is the difference?
106. What do you like about computers?
107. Do you have a favourite QA book? More than one? Which ones? And why.
108. What is the responsibility of programmers vs QA?
109. What are the properties of a good requirement?
110. How do you test if you have minimal or no documentation about the product?
111. What are all the basic elements in a defect report?
112. Is "a fast database retrieval rate" a testable requirement?

From Cem Kaner's article "Recruiting Testers", December 1999

1. What is software quality assurance?


2. What is the value of a testing group? How do you justify your work and budget?
3. What is the role of the test group vis-à-vis documentation, tech support, and so forth?
4. How much interaction with users should testers have, and why?
5. How should you learn about problems discovered in the field, and what should you
learn from those problems?
6. What are the roles of glass-box and black-box testing tools?
7. What issues come up in test automation, and how do you manage them?
8. What development model should programmers and the test group use?
9. How do you get programmers to build testability support into their code?
10. What is the role of a bug tracking system?
11. What are the key challenges of testing?
12. Have you ever completely tested any part of a product? How?
13. Have you done exploratory or specification-driven testing?
14. Should every business test its software the same way?
15. Discuss the economics of automation and the role of metrics in testing.
16. Describe components of a typical test plan, such as tools for interactive products and
for database products, as well as cause-and-effect graphs and data-flow diagrams.
17. When have you had to focus on data integrity?
18. What are some of the typical bugs you encountered in your last assignment?
19. How do you prioritize testing tasks within a project?
20. How do you develop a test plan and schedule? Describe bottom-up and top-down
approaches.
21. When should you begin test planning?
22. When should you begin testing?
23. Do you know of metrics that help you estimate the size of the testing effort?
24. How do you scope out the size of the testing effort?
25. How many hours a week should a tester work?
26. How should your staff be managed? How about your overtime?
27. How do you estimate staff requirements?
28. What do you do (with the project tasks) when the schedule fails?
29. How do you handle conflict with programmers?
30. How do you know when the product is tested well enough?
31. What characteristics would you seek in a candidate for test-group manager?
32. What do you think the role of test-group manager should be? Relative to senior
management?
Relative to other technical groups in the company? Relative to your staff?
33. How do your characteristics compare to the profile of the ideal manager that you just
described?
34. How does your preferred work style work with the ideal test-manager role that you
just described? What is different between the way you work and the role you described?
35. Who should you hire in a testing group and why?
36. What is the role of metrics in comparing staff performance in human resources
management?
37. How do you estimate staff requirements?
38. What do you do (with the project staff) when the schedule fails?
39. Describe some staff conflicts you've handled.

Here are some questions you might be asked on a job interview for a testing opening:
(from MU COSC 198 Software Testing by Dr. Corliss)

1. Why did you ever become involved in QA/testing?


2. What is the testing lifecycle and explain each of its phases?
3. What is the difference between testing and Quality Assurance?
4. What is Negative testing?
5. What was a problem you had in your previous assignment (testing if possible)?
How did you resolve it?
6. What are two of your strengths that you will bring to our QA/testing team?
7. How would you define Quality Assurance?
8. What do you like most about Quality Assurance/Testing?
9. What do you like least about Quality Assurance/Testing?
10. What is the Waterfall Development Method and do you agree with all the steps?
11. What is the V-Model Development Method and do you agree with this model?
12. What is the Capability Maturity Model (CMM)? At what CMM level were the last
few companies you worked?
13. What is a "Good Tester"?
14. Could you tell me two things you did in your previous assignment (QA/Testing
related hopefully) that you are proud of?
15. List 5 words that best describe your strengths.
16. What are two of your weaknesses?
17. What methodologies have you used to develop test cases?
18. In an application currently in production, one module of code is being modified.
Is it necessary to re-test the whole application, or is it enough to just test
functionality associated with that module?
19. Define each of the following and explain how each relates to the other: Unit,
System, and Integration testing.
20. Define Verification and Validation. Explain the differences between the two.
21. Explain the differences between White-box, Gray-box, and Black-box testing.
22. How do you go about going into a new organization? How do you assimilate?
23. Define the following and explain their usefulness: Change Management,
Configuration Management, Version Control, and Defect Tracking.
24. What is ISO 9000? Have you ever been in an ISO shop?
25. When are you done testing?
26. What is the difference between a test strategy and a test plan?
27. What is ISO 9003? Why is it important?
28. What are ISO standards? Why are they important?
29. What is IEEE 829? (This standard covers Software Test Documentation. Why is it
important?)
30. What is IEEE? Why is it important?
31. Do you support automated testing? Why?
32. We have a testing assignment that is time-driven. Do you think automated tests
are the best solution?
33. What is your experience with change control? Our development team has only 10
members. Do you think managing change is such a big deal for us?
34. Are reusable test cases a big plus of automated testing? Explain why.
35. Can you build a good audit trail using Compuware's QACenter products? Explain
why.
36. How important is Change Management in today's computing environments?
37. Do you think tools are required for managing change? Explain, and please list
some tools/practices that can help you manage change.
38. We believe in ad-hoc software processes for projects. Do you agree with this?
Please explain your answer.
39. When is a good time for system testing?
40. Are regression tests required or do you feel there is a better use for resources?
41. Our software designers use UML for modeling applications. Based on their use
cases, we would like to plan a test strategy. Do you agree with this approach, or
would this mean more effort for the testers?
42. Tell me about a difficult time you had at work and how you worked through it.
43. Give me an example of something you tried at work but did not work out so you
had to go at things another way.
44. How can one file-compare future-dated output files from a changed program
against a baseline run that used the current date for input, when the client does
not want to mask dates on the output files to allow compares? Answer: Re-run the
baseline with its input files future-dated by the same number of days as the
future-dated run of the changed program. Then run a file compare of the baseline's
future-dated output against the changed program's future-dated output.

Interviewing Suggestions
1. If you do not recognize a term, ask for further definition. You may know the
methodology or term but have used a different name for it.
2. Always keep in mind that the employer wants to know what you are going to do
for them; with that in mind, always stay positive.

Pre-interview Questions
1. What is the structure of the company?
2. Who is going to conduct the interview? Gather possible background information
on the interviewer.
3. What is the employer's environment (platforms, tools, etc.)?
4. What are the employer's methods and processes in the software arena?
5. What is the employer's philosophy?
6. What is the project you are interviewing for about? Gather as much information
as possible.
7. What terminology does the company use?
