Critical thinking and asking the right questions are fundamental skills for effective legal practice. Successful attorneys know that the quality of their questions impacts the quality of their results, whether interviewing a client, examining a witness, or analyzing a case.
When a client comes to you with a legal problem, you ask follow-up questions: “What specifically happened? When did this occur? Who was involved? What circumstances led up to this?” These facts provide the necessary context to develop an effective legal strategy.
The same principle applies when using generative artificial intelligence (AI). Just as you wouldn’t accept “my boss was mean” as sufficient for a discrimination claim, you shouldn’t simply ask AI to “find employment law cases” and expect useful results. The key to effective AI prompting lies in applying the same critical thinking and questioning skills you use throughout your legal practice. Context matters with AI just as much as it does with people. Providing relevant information and direction in your prompts can dramatically improve your results and make you a more effective user of generative AI.
The 7 Ps Framework
Prompting is simply the art of asking generative AI the right question in the right way. Using a systematic approach, such as the “7 Ps Framework,” can help you guide AI more effectively by providing seven key elements to consider when crafting prompts: persona, product, prompt, purpose, prime, privacy, and polish.1 You won’t always need all seven elements, but understanding each component helps you make deliberate choices about what to include in your prompt.
The 7 Ps Framework helps you guide conversations toward useful results from the beginning, instead of hoping for the best and then spending time clarifying what you want after things go off track. Taking 30 seconds to think through your prompt can save you 30 minutes of poor results and follow-up questions.
Persona + Product + Prompt: Setting the Stage
Persona gives AI a role to play, helping it understand the perspective and expertise level you need. Instead of getting generic responses, you’re asking AI to adopt the knowledge and approach of a specific type of professional.
For example, “act as an experienced employment attorney in Milwaukee, Wisconsin” helps AI draw on relevant legal knowledge, consider jurisdiction-specific issues, and respond with appropriate professional language. You can be even more specific as needed: “Act as a family law attorney with 15 years of experience handling high-conflict custody cases” or “Respond as a corporate lawyer advising a startup on compliance issues.”
Product tells AI exactly which format you want for the response. This prevents you from getting a lengthy analysis when you need a quick summary, or a brief answer when you need detailed documentation.
Here are some examples:
“A 2-page client memo”
“A timeline of key events”
“An analysis of strengths and weaknesses”
“A summary of five recent Wisconsin cases”
Prompt is the specific task you want AI to perform. Use clear, active verbs that leave no ambiguity about what you’re asking for, such as “analyze,” “compare,” “draft,” “summarize,” “identify,” or “evaluate.”
For example, instead of asking “tell me about Wisconsin employment law,” you might say “analyze whether this non-compete clause is enforceable under Wisconsin law.” The second prompt uses a specific action verb and tells AI exactly which legal issue to focus on, while the first prompt would likely result in a generic overview that doesn’t help your specific situation.
Purpose + Prime: Adding Essential Context
Purpose means explaining the “why” behind your request. Generative AI performs better when it understands your goal, so tell it what you’re trying to achieve and what outcome you’re seeking.
Compare these two approaches: “What are the elements of a discrimination claim?” versus “I’m evaluating whether my client has a viable discrimination claim and I need to understand the elements to assess the strength of their case.” The second version gives AI the context it needs to tailor its response to your specific need rather than providing a generic legal textbook answer.
Prime is the step at which you leverage your legal expertise, providing the legal and factual context that guides AI’s analysis. Although generative AI draws from an extensive collection of information, it lacks the human context that makes that information practical and relevant. When you prime effectively, you’re providing the context to keep AI focused on what matters for your specific situation.
Key elements to consider when priming include the following:
Jurisdiction and applicable law,
Procedural posture of the case,
Relevant factual background, and
Tone and audience considerations.
For instance: “This involves Wisconsin employment law. We’re in federal court, and the defendant filed a motion to dismiss last week. The facts involve a 45-year-old employee who was terminated three days after filing a harassment complaint.”
Privacy + Polish: Protecting and Refining
Privacy is essential when attorneys use generative AI. Never include client names, confidential details, or privileged information in your prompts. The Model Rules of Professional Conduct apply to the use of generative AI just as they do to any other area of your practice.
Instead of using real client information, create hypothetical scenarios that capture the essential legal issues. For example, rather than writing “My client John Smith was fired from ABC Corp after reporting harassment,” use “A 45-year-old employee was terminated three days after filing a harassment complaint.”
With this approach, you’re still providing the legal context AI needs – jurisdiction, procedural posture, relevant facts, and legal theories – without compromising attorney-client privilege.2
Polish recognizes that your first prompt rarely produces the perfect result. Continue refining your conversation until you get useful outcomes. This is when the iterative nature of generative AI becomes powerful – you can guide it toward better responses through follow-up questions.
Effective follow-up prompts might include the following:
“What recent Wisconsin cases support this analysis?”
“Can you strengthen the section on damages?”
“Focus more on the procedural requirements.”
“Make this more persuasive for a federal judge.”
And don’t forget the most crucial polishing step: verify the results. Generative AI can produce authoritative-sounding responses that are incorrect. Always check citations, confirm legal principles, and validate claims against reliable sources. Remember that you are the expert guiding the analysis. AI is not a replacement for your legal judgment and due diligence.
Advanced Prompting Tips for Better Results
Once you’re comfortable with the 7 Ps Framework, several advanced techniques can help you get even better results from AI.
Beware sycophancy. Generative AI desperately wants to please you and will tell you what it thinks you want to hear. Prompt it to be critical: “What are the weaknesses in this claim?” or “Play devil’s advocate.” This approach pushes AI to surface weaknesses and counterarguments rather than simply validate your perspective.
Ask for sources. AI sometimes “hallucinates” – creating convincing-sounding responses that are inaccurate and citations that don’t exist. Always ask for specific sources, then independently verify the information yourself. Asking AI “is this case real?” doesn’t work – it will cheerfully confirm citations that are completely made up.
Avoid drift. Long exchanges can cause AI to gradually lose track of your original question and context. Watch for responses that seem disconnected from your original question or that misapply your instructions. As a rule of thumb, don’t ask more than 15 follow-up questions; beyond that point, start a new conversation, summarizing where you left off in your new prompt.
Watch out for “pink elephants.” Avoid telling AI what not to do. Just like when someone says “don’t think of a pink elephant” and you immediately picture one, AI systems often struggle with negative instructions. Instead of “don’t cite other states,” say “focus only on Wisconsin cases.” This positive framing helps AI understand exactly what you want.
Final Thoughts: Mind Your Ps and Qs
Good prompts require the same skills that make good lawyers: critical thinking, preparation, attention to detail, and the ability to guide through good questions. Keep in mind that generative AI amplifies your legal expertise – it doesn’t replace your judgment. When you mind your Ps (prompts) and Qs (questions) with the 7 Ps Framework, you’re using the professional expertise you’ve developed over years of practice to direct a powerful new tool.
The 7 Ps Framework
Using a systematic approach, such as the 7 Ps Framework, can help you guide AI more effectively by providing seven key elements to consider when crafting prompts.
Persona – Give AI a role to play; help it understand the perspective and expertise level you need.
Product – Tell AI exactly which format you want for the response.
Prompt – Name the specific task you want AI to perform.
Purpose – Explain the “why” behind your request.
Prime – Provide the legal and factual context that guides AI’s analysis.
Privacy – Never include client names, confidential details, or privileged information in your prompts.
Polish – Refine your conversation until you get useful outcomes.
The 7 Ps in Action: Before and After Examples
Example 1: Legal Research
Before: “Find employment discrimination cases”
After: “As an experienced employment attorney in Wisconsin [Persona], create a 2-page research memo [Product] analyzing [Prompt] recent federal circuit court decisions on religious accommodation claims under Title VII for developing litigation strategy [Purpose]. Focus on cases from the past 3 years in which employers successfully argued undue hardship, particularly in manufacturing settings [Prime].”
Example 2: Contract Drafting
Before: “Write a non-compete clause”
After: “As a business attorney representing a software company [Persona], draft a non-compete provision [Product] that will be enforceable under Wisconsin law [Prime] for inclusion in an executive employment agreement [Purpose]. Include [Prompt] specific geographic and temporal limitations that courts have recently upheld in the technology sector [Prime].”
Example 3: Client Communication
Before: “Explain custody law”
After: “As a family law attorney [Persona], draft a client email [Product] explaining [Prompt] the difference between legal and physical custody under Wisconsin law to help anxious parents understand their options during divorce proceedings [Purpose]. Use reassuring, plain language appropriate for clients with no more than a high school education [Prime].”