Right. Hands up - everyone who has been told "Testing for this project is only to verify the requirements."
This is fine as far as it goes. Where we get into trouble is what counts as a "requirement."
Most often we are told this means the documented requirements have been analyzed and considered by experts and they are firmly set and we can work from them. Doing so is a classic example of Pandora's Box Testing.
The idea that the requirements are fixed and unchanging is a belief. It is not proof. Frankly, unless you work in a fairly small number of fields - e.g., medical, telephony, aeronautics, navigation - I might suggest the first task of a tester is to test the requirements themselves.
I have found it a reliable heuristic, if not a maxim, that if there is an opportunity for more than one interpretation of a requirement or a set of requirements, someone will take advantage of this and interpret them differently than anyone else.
I hear it now: "Pete, if they are communicating and discussing the requirements, then this doesn't happen." And I suggest that "communicating" and "discussing" are not necessarily the same thing. Sometimes they are not related at all.
When "communicating" means "repeating oft-used and oft-heard buzzwords" then does everyone mean the same thing? Are you all agreeing about the same thing? Are you certain?
Or are you hoping your plans are based on something more than buzzwords?
Working through the "documented requirements" is a good start. Test them. Do they make sense together? When you take them as a set, does the set seem to work? Do they describe what your understanding of the purpose is? Do they match what your understanding of the business need is?
Now then. If this is totally new development, that is pretty much where I start when evaluating documented requirements. Let's face it - most of our projects are not totally new, green-field work. They are updates, changes, and modifications to existing software. Existing systems.
Do the requirements you just went through describe how the changes interact with what is there currently? Do they describe what differences should be expected? Between them, can you discern what the customers (internal or external) would expect to see?
Do they clearly describe what you should be looking for in testing? Will your testing be able to present information in such a way that you can show the stakeholders whether this is correct behavior?
Will they know if the behavior is correct? Are they relying on the documented requirements or something else?
Perhaps they are relying on hope? Maybe the only testing they are familiar with is Pandora's Box Testing.
That would be sad.