Prompting in Generative AI: Key Techniques and Insights from Google

Yamish R. Bhalara
Thu Apr 17 2025

In the rapidly evolving world of Generative AI, the art and science of crafting effective prompts is becoming a game-changer. Here are some key prompting techniques and insights from Google's Generative AI learning program. Whether you're a beginner exploring the basics or someone looking to enhance the quality of AI-generated outputs, this blog breaks down practical methods, real-world applications, and tips to get the most out of your prompts. Dive in to discover how small changes can lead to big improvements in results.


What is Prompt Engineering?

Prompt engineering involves designing the input prompt in a way that maximizes the quality and relevance of the output generated by a language model like PaLM, Gemini, or GPT. These models don't "think" in a traditional sense; they predict the next word based on probabilities learned from vast datasets.

Key Prompting Techniques

1. Zero-Shot Prompting

Zero-shot prompting asks the model to perform a task without providing any examples. It relies purely on the model's pre-learned knowledge.

Example (try it in Google AI Studio):

  • Prompt:
    Which is the biggest island in pacific ocean?
  • Response:
    The biggest island in the Pacific Ocean is New Guinea.
    It is the second-largest island in the world, after Greenland.

Used effectively, this can be powerful for straightforward tasks, especially when the model has seen enough similar examples during training.
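If you prefer to run the same zero-shot prompt from code rather than the AI Studio UI, here is a minimal sketch using the google-generativeai Python SDK with the gemini-2.0-flash model. The GOOGLE_API_KEY environment variable is an assumption, so adapt it to however you store your key.

```python
import os
import google.generativeai as genai

# Configure the SDK (assumes your key is in the GOOGLE_API_KEY environment variable).
genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

# Zero-shot: no examples, just the question itself.
model = genai.GenerativeModel("gemini-2.0-flash")
response = model.generate_content("Which is the biggest island in pacific ocean?")
print(response.text)
```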

2. Few-Shot Prompting

Few-shot prompting provides the model with a few task examples before posing the final query. This helps guide the model toward the desired response format and logic.

Example (try it in Google AI Studio):

  • Prompt:
    i am creating a content for my documents i have services in following manner, 
    
    for Advance Designing :
    Description: This designing was created by my team to full fill the requirements of the project
    Team: Total 4 number of person was allocated for designing.
    Name:  1. Alex,  2.Max,  3. Dev,  4. Jenny
    Duration: It took 3 weeks to complete all the requirements.
    Date: 1st march to 22 march 2024
    Approval: Project manager,
    for Front-end Development :
    Description: Creating front-end using figma provided by the Designing team
    Team: Total 4 number of person was allocated for Front-end development.
    Name:  1. Rishi,  2.Crish,  3. Hanna,  4. Smith
    Duration: It took 5 weeks to complete all the requirements.
    Date: 24th March to  30th April 2024
    Approval: Project manager,
    for Back-end Development :
    Description: Creating back-end using instruction of clients requirements.
    Team: Total 2 number of person was allocated for Front-end development.
    Name:  1. Nancy,  2.Nova
    Duration: It took 4 weeks to complete all the requirements.
    Date: 24th March to  30th April 2024
    Approval: Project manager 
    
    now can you create similar for Domain Handling
  • Response:
    Okay, here's a similar structure you can use for "Domain Handling":
    for Domain Handling:
    Description: This service encompasses all activities related to the acquisition, configuration, and management of the project's domain name. This includes domain registration, DNS setup, SSL certificate installation (if applicable), and ensuring the domain is correctly pointed to the project's hosting environment.
    Team: Total 1 number of person was allocated for Domain Handling.
    Name: 1. Sam
    Duration: The domain handling process typically takes a shorter duration, often completed within a few days, depending on the complexity and any potential issues.
    Date: 23rd March 2024 to 25th March 2024 (Example - adjust based on your actual timeline)
    Approval: Project Manager,
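The same pattern can be scripted by assembling the worked examples and the new request into a single prompt string. The sketch below is only illustrative: the example data is abbreviated, and it assumes the SDK setup from the zero-shot snippet above.

```python
import google.generativeai as genai  # assumes genai.configure(...) was already called

model = genai.GenerativeModel("gemini-2.0-flash")

# A couple of completed entries (abbreviated) that show the desired format.
examples = [
    ("Advance Designing",
     "Description: Designing created by my team to fulfil the project requirements.\n"
     "Team: 4 people\nDuration: 3 weeks\nApproval: Project manager"),
    ("Front-end Development",
     "Description: Creating the front-end from the Figma provided by the designing team.\n"
     "Team: 4 people\nDuration: 5 weeks\nApproval: Project manager"),
]

# Build one few-shot prompt: examples first, then the new request.
parts = ["I am creating content for my documents. My existing services are:\n"]
for name, details in examples:
    parts.append(f"for {name}:\n{details}\n")
parts.append("Now create a similar entry for Domain Handling.")

response = model.generate_content("\n".join(parts))
print(response.text)
```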

3. Chain-of-Thought Prompting

Google research has highlighted chain-of-thought (CoT) prompting as especially useful for complex reasoning. This approach encourages the model to generate intermediate reasoning steps before the final answer. The phrase "step by step" is very important here; try the same prompt without it and you will see the difference in the responses.

Example (try it in Google AI Studio):

  • Prompt:
    If there are 3 red balls and 2 blue balls in a bag, 
    and you draw two balls at random without replacement, 
    what is the probability that both are red? Let's think step by step.
  • Response:
    Step 1: Probability of the first ball being red
    There are a total of 3 + 2 = 5 balls in the bag.
    There are 3 red balls.
    The probability of drawing a red ball on the first draw is the number of red balls divided by the total number of balls:
    P(1st ball is red) = 3/5
    Step 2: Probability of the second ball being red (given the first was red)
    After drawing one red ball, there are now only 2 red balls left in the bag.
    The total number of balls remaining in the bag is now 5 - 1 = 4.
    The probability of drawing another red ball on the second draw, given that the first ball was red, is the number of remaining red balls divided by the total number of remaining balls:
    P(2nd ball is red | 1st ball was red) = 2/4 = 1/2
    Step 3: Probability of both balls being red
    To find the probability of both events happening (the first ball is red AND the second ball is red), we multiply the probabilities from Step 1 and Step 2:
    P(both balls are red) = P(1st ball is red) * P(2nd ball is red | 1st ball was red)
    P(both balls are red) = (3/5) * (1/2)
    P(both balls are red) = 3/10
    Therefore, the probability that both balls drawn are red is 3/10.
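To see the effect of the cue for yourself, you can send the same question with and without "Let's think step by step" and compare the two answers. A minimal sketch, again assuming the SDK setup from the earlier snippets:

```python
import google.generativeai as genai  # assumes genai.configure(...) was already called

model = genai.GenerativeModel("gemini-2.0-flash")

question = (
    "If there are 3 red balls and 2 blue balls in a bag, and you draw two balls "
    "at random without replacement, what is the probability that both are red?"
)

# Plain prompt vs. chain-of-thought prompt with the "step by step" cue.
plain = model.generate_content(question)
cot = model.generate_content(question + " Let's think step by step.")

print("Without the cue:\n", plain.text)
print("\nWith the cue:\n", cot.text)
```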

These are some prompting techniques you can use to obtain more relevant, better-optimised results. The responses above were generated in Google AI Studio using the gemini-2.0-flash model.

The Role of Parameters: Temperature, Top-k, and Top-p

Beyond the prompt itself, the sampling parameters you use during generation significantly impact the response style and quality.

Temperature

Temperature controls randomness. A lower temperature (e.g., 0.2) leads to more focused and deterministic outputs, while higher values (e.g., 0.8 or 1.0) introduce creativity and variety.

  • Low temperature: Better for factual tasks.
  • High temperature: Useful for storytelling, brainstorming, or open-ended tasks.
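As a rough sketch, temperature can be set per request through the generation config; the values and prompts below are only illustrative, and the SDK setup from the earlier snippets is assumed.

```python
import google.generativeai as genai  # assumes genai.configure(...) was already called

model = genai.GenerativeModel("gemini-2.0-flash")

# Low temperature: focused, near-deterministic answer for a factual question.
factual = model.generate_content(
    "Which is the biggest island in the Pacific Ocean?",
    generation_config=genai.GenerationConfig(temperature=0.2),
)

# High temperature: more varied, creative output for an open-ended task.
creative = model.generate_content(
    "Write a two-line tagline for a web design studio.",
    generation_config=genai.GenerationConfig(temperature=0.9),
)

print(factual.text)
print(creative.text)
```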

Top-k Sampling

Top-k limits the model's choices to the top k most probable next tokens. For example, top-k = 40 restricts it to the 40 most likely options, adding a level of control over diversity.

  • Smaller k → more focused and conservative output.
  • Larger k → more diverse, possibly creative output.

Top-p Sampling (Nucleus Sampling)

Top-p chooses from the smallest set of tokens whose cumulative probability exceeds p (e.g., 0.9). This method adapts dynamically to ensure coherence and variability.

  • Top-p = 0.9: Considered a good balance between quality and diversity.
  • Often used in combination with temperature for fine-tuned control.

For most use cases, choosing either Top-k or Top-p (Nucleus Sampling) and tuning it appropriately will yield better results than trying to combine them. Combined, the parameters counteract each other and can have unintended consequences. If you do decide to experiment with both, proceed with caution and a clear understanding of your goals. Remember that simpler is often better.
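As a sketch of that advice, here is how you might set these parameters on a request, pairing temperature with top_p (or, alternatively, top_k); the values are illustrative and the setup from the earlier snippets is assumed.

```python
import google.generativeai as genai  # assumes genai.configure(...) was already called

model = genai.GenerativeModel("gemini-2.0-flash")

# Pair a moderate temperature with nucleus sampling; set top_k=40 instead if you
# prefer top-k, but avoid tuning both at once.
config = genai.GenerationConfig(temperature=0.7, top_p=0.9)

response = model.generate_content(
    "Brainstorm five names for a developer productivity blog.",
    generation_config=config,
)
print(response.text)
```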

Parameters are not limited to these three; there are many others as well. For a solid understanding and useful results, however, these are the common parameters that will help you the most.


Conclusion

Prompting is not just a technical skill—it's a creative one. As AI becomes more integrated into our tools and workflows, understanding how to design prompts and tune generation parameters will be essential for leveraging the full power of generative models.

Whether you're working with Google's models or other LLMs, mastering prompting techniques like zero-shot, few-shot, and chain-of-thought, alongside smart use of temperature, top-k, and top-p, can elevate your AI-powered projects to new levels.

Happy Prompting! 😉

For any queries or more information about our blog, don't hesitate to reach out to us.
