QA engineer interviews probe your testing methodology, automation skills, and quality mindset. Companies want engineers who prevent bugs, not just find them.
Testing Fundamentals
1. "What's your approach to testing a new feature?"
Answer: "I start by understanding the requirements and acceptance criteria. I write test cases covering happy paths, edge cases, boundary conditions, and negative scenarios. I prioritize by risk — what breaks worst if it fails? I execute manual testing first for new features, then automate the regression-critical paths. After testing, I verify fixes and do a final regression pass before sign-off."
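The test-case categories above can be sketched as plain assertions. This is an illustrative example built around a hypothetical age-validation rule (accepts integer ages 18 to 120); the function and limits are assumptions, not from the source.

```python
# Hypothetical validation rule used only to illustrate test-case design:
# accepts integer ages from 18 to 120 inclusive.

def is_valid_age(age):
    """Return True if age is an integer within the allowed range."""
    return isinstance(age, int) and 18 <= age <= 120

# Happy path
assert is_valid_age(30) is True
# Boundary conditions: both edges, and one step past each edge
assert is_valid_age(18) is True
assert is_valid_age(120) is True
assert is_valid_age(17) is False
assert is_valid_age(121) is False
# Negative scenarios: nonsense values and wrong types
assert is_valid_age(-5) is False
assert is_valid_age("18") is False
```

Walking the boundaries (18, 120) and one value past them catches the classic off-by-one bugs that a single happy-path check misses.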
2. "What's the difference between black-box, white-box, and grey-box testing?"
Answer: "Black-box: testing without knowledge of internal code — you test inputs and outputs based on requirements. White-box: testing with full knowledge of the code — you test paths, branches, and logic. Grey-box: partial knowledge — you understand the architecture but test from the user's perspective. I use all three depending on context."
3. "How do you write a good bug report?"
Fundamental skill. Show precision.
Answer: "Title: clear one-liner. Steps to reproduce: exact, numbered, reproducible. Expected result vs. actual result. Environment: browser, OS, device. Severity and priority. Screenshots or video. Logs if applicable. A good bug report lets a developer reproduce the issue without asking me a single question."
4. "How do you decide what to automate vs. test manually?"
Answer: "Automate: regression tests, smoke tests, data-driven tests, anything that runs frequently. Keep manual: exploratory testing, UX evaluation, edge cases that are hard to script, and one-off tests. The rule: if you'll run it more than 3 times, automate it."
5. "What testing types are you familiar with?"
Cover: Unit, integration, system, regression, smoke, sanity, performance, load, stress, security, usability, accessibility, API testing, and end-to-end. Don't just list — show you know when to apply each.
Automation
6. "What automation frameworks and tools have you used?"
Be specific: Selenium, Cypress, Playwright, Appium (mobile), Jest, pytest, TestNG, Robot Framework, Postman/Newman (API), JMeter or k6 (performance). Mention CI integration — "My Cypress suite runs on every PR in GitHub Actions."
7. "How do you maintain test automation as the product evolves?"
The real challenge — not writing tests, but keeping them alive.
Answer: "I use the Page Object pattern to isolate selectors, so UI changes don't break every test. I keep tests independent and atomic. I review and prune flaky tests weekly rather than letting them accumulate. I treat test code like production code — reviewed, refactored, and documented."
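The Page Object idea can be shown in a few lines. This is a minimal sketch: the page class and selectors are invented for illustration, and the "driver" is a stand-in that records calls so the example runs without a real Selenium or Playwright browser.

```python
# Page Object sketch: selectors live in one class, and tests call
# intent-level methods instead of touching selectors directly.

class LoginPage:
    # Selectors centralized: a UI change means editing one place, not every test.
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "button[type=submit]"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.fill(self.USERNAME, user)
        self.driver.fill(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)

class FakeDriver:
    """Records interactions so the example runs without a browser."""
    def __init__(self):
        self.actions = []
    def fill(self, selector, value):
        self.actions.append(("fill", selector, value))
    def click(self, selector):
        self.actions.append(("click", selector))

driver = FakeDriver()
LoginPage(driver).login("alice", "s3cret")
print(driver.actions[-1])  # → ('click', 'button[type=submit]')
```

If the submit button's selector changes, only `LoginPage.SUBMIT` needs updating; every test that calls `login()` keeps working.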
8. "How do you handle flaky tests?"
Every QA engineer faces this.
Answer: "I investigate the root cause — is it timing (add explicit waits, not sleeps), test data dependency (isolate test data), or environment instability? I fix the root cause, not the symptom. If a test is flaky and I can't fix it quickly, I quarantine it so it doesn't block the pipeline, and I schedule time to resolve it."
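The "explicit waits, not sleeps" fix amounts to polling a condition with a deadline instead of pausing for a fixed time. A minimal sketch of such a helper is below; real frameworks provide this built in (Selenium's `WebDriverWait`, Playwright's auto-waiting), so this is illustrative only.

```python
import time

def wait_until(condition, timeout=5.0, interval=0.1):
    """Poll `condition` (a zero-argument callable) until it returns a
    truthy value or the timeout expires. Returns True on success."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval)
    return False

# Example: a "resource" that becomes ready after a short delay.
ready_at = time.monotonic() + 0.3
assert wait_until(lambda: time.monotonic() >= ready_at, timeout=2.0)
```

Unlike a fixed `time.sleep(2)`, this returns as soon as the condition holds, so the test is both faster on good days and more tolerant on slow ones.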
Process & Collaboration
9. "How do you integrate testing into CI/CD?"
Answer: "Unit tests run on every commit. Integration and API tests run on every PR. End-to-end smoke tests run on deploy to staging. Full regression runs nightly or before releases. The pipeline fails if critical tests fail — no deploying broken code. I work with DevOps to keep test environments stable and fast."
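The tiering described above can be captured as data: each pipeline stage maps to the suites that run there. The stage and suite names below are illustrative assumptions, not a standard.

```python
# Which test suites run at each CI/CD stage (names are illustrative).
PIPELINE_SUITES = {
    "commit": ["unit"],
    "pull_request": ["unit", "integration", "api"],
    "deploy_staging": ["smoke_e2e"],
    "nightly": ["full_regression"],
}

def suites_for(stage):
    """Return the test suites to run for a given pipeline stage."""
    return PIPELINE_SUITES.get(stage, [])

print(suites_for("pull_request"))  # → ['unit', 'integration', 'api']
```

Keeping the mapping in one place makes the escalation explicit: fast suites gate every change, slow suites gate releases.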
10. "How do you work with developers?"
Collaboration, not adversarial.
Answer: "I involve developers early — I share test plans before development starts so they understand what I'll test. When I find bugs, I provide detailed reports and discuss, not just assign tickets. I also review their unit tests and suggest additional coverage. The best quality comes from shared ownership, not QA vs. dev."
11. "How do you prioritize bugs?"
Answer: "I classify by severity (how bad is the impact) and priority (how urgently it needs fixing). A critical bug in production gets fixed now. A minor cosmetic issue in a rarely-used feature goes to the backlog. I communicate priorities clearly to the product team and push back if critical bugs get deprioritized."
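The severity/priority decision can be sketched as a tiny triage function. The buckets and rules below are illustrative assumptions for the two examples in the answer, not an industry standard.

```python
# Toy triage: combine severity (impact) with where the bug lives
# (production vs. pre-release) into an action bucket. Illustrative only.

def triage(severity, in_production):
    if severity == "critical" and in_production:
        return "fix now"
    if severity == "critical":
        return "fix before release"
    if severity == "major":
        return "current sprint"
    return "backlog"

print(triage("critical", in_production=True))   # → fix now
print(triage("minor", in_production=False))     # → backlog
```

Encoding the matrix, even informally, keeps triage consistent and makes it easy to explain to the product team why one bug jumps the queue and another waits.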
Behavioral
12. "Tell me about a bug that made it to production. What happened?"
Ownership and process improvement. Don't blame others.
Structure: What was the bug → why was it missed → what was the impact → what did you change in the process to prevent recurrence.
13. "How do you approach testing when requirements are unclear?"
Answer: "I ask questions — I don't guess. I create a list of assumptions and validate them with the product manager. If I still can't get clarity, I test based on user expectations and document my assumptions in the test cases."
14. "How do you handle pressure to skip testing before a release?"
Answer: "I communicate the risk clearly: 'If we skip regression, here's what could break and the cost of fixing it in production.' I offer compromises — a targeted smoke test of critical paths instead of full regression. But I never sign off on quality I'm not confident in."
15. "What questions do you have for us?"
Ask about: the QA team structure, automation maturity, testing tools and frameworks, the release process, and how quality is valued by engineering leadership.
Want questions tailored to your exact role? Paste the job description at PasteJob and get a personalized cheat sheet in 15 seconds.