
Masterclass in Prompt Engineering: A Directory

5 min read
By Jose Nicholas Francisco
Published Jan 10, 2024
Updated Jun 27, 2024

Prompt engineering is the art and science of crafting instructions that maximize the effectiveness of natural language processing models. It plays a pivotal role in shaping the behavior and output of these models, allowing researchers and developers to tune a model's performance for specific tasks without having to change its weights or architecture.

This page is a directory containing various articles on different prompt engineering techniques, from beginner to advanced. If any of the hyperlinked article titles interests you, feel free to check it out and unlock a whole new world of LLM prompting techniques.

đŸ—žïž Our guides to prompt engineering

From DAN to Universal Prompts: LLM Jailbreaking - Sometimes prompts can be mischievous. A few “straight-edge” engineers may even go so far as to say malicious. Nevertheless, it’s important to know just how far the realm of prompt engineering extends. So here’s an article on prompts like DAN (“Do Anything Now”), which users have iterated upon to bypass safety protocols of various LLMs, especially ChatGPT.

Advanced Prompt Engineering Techniques: Tree-of-Thoughts Prompting - Tree-of-Thoughts Prompting is meant to mimic something like human brainstorming: generating diverse candidate ideas and weighing them against one another. There are four steps to “Crafting an LLM Tree,” including a BFS- or DFS-like search over those candidates; a rough sketch of that search loop appears just after this list. To learn more, and to see whether the efficiency cost is worth it, click the link above!

The Art of AI: Crafting Masterpieces with Prompt Engineering and LLMs - What about prompt-engineering for image generation? If you want some incredible images from models like Midjourney, Stable Diffusion, or DALL-E, then this article is for you! We walk through various prompt engineering techniques, and end with a way to combine them all for top-tier image generation.

Crafting AI Commands: The Art (and impact) of Prompt Engineering - The emergence of the prompt engineering discipline pulls on a speculative, albeit anxious, thread of thought for many people. How is AI innovation going to affect our jobs? A recent paper from Eloundou et al. found that “[...] approximately 80% of the U.S. workforce could have at least 10% of their work tasks affected by the introduction of [GPT models], while around 19% of workers may see at least 50% of their tasks impacted.”

Chain-of-Thought Prompting: Helping LLMs Learn by Example - Chain-of-Thought Prompting is an approach that encourages LLMs to break down a complex “thought” into intermediate steps by providing a few demonstrations to the LLM (aka few-shot learning). Learn by example above, or see the minimal prompt-building sketch just below!
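
To make the chain-of-thought idea concrete, here is one minimal way a few-shot CoT prompt might be assembled. The worked examples and the final print are ours, purely for illustration; in practice you would send the resulting string to whichever LLM you use.

```python
# Minimal chain-of-thought (few-shot) prompt construction.
# The demonstrations below are illustrative; swap in examples that match your task.

COT_EXAMPLES = [
    {
        "question": "A cafe sold 23 coffees in the morning and 18 in the afternoon. "
                    "Each coffee costs $4. How much money did the cafe make?",
        "reasoning": "The cafe sold 23 + 18 = 41 coffees in total. "
                     "At $4 each, that is 41 * 4 = $164.",
        "answer": "$164",
    },
    {
        "question": "A train leaves at 9:40 and the trip takes 85 minutes. When does it arrive?",
        "reasoning": "85 minutes is 1 hour and 25 minutes. 9:40 plus 1 hour is 10:40, "
                     "plus 25 minutes is 11:05.",
        "answer": "11:05",
    },
]

def build_cot_prompt(new_question: str) -> str:
    """Assemble a few-shot prompt whose demonstrations spell out intermediate steps."""
    parts = []
    for ex in COT_EXAMPLES:
        parts.append(f"Q: {ex['question']}\nA: Let's think step by step. "
                     f"{ex['reasoning']} The answer is {ex['answer']}.")
    # The model is nudged to imitate the step-by-step pattern for the new question.
    parts.append(f"Q: {new_question}\nA: Let's think step by step.")
    return "\n\n".join(parts)

if __name__ == "__main__":
    prompt = build_cot_prompt("I read 12 pages a day. How many pages do I read in 3 weeks?")
    print(prompt)  # Send this string to the LLM of your choice.
```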
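
And here is an equally minimal sketch of the BFS-flavored search behind Tree-of-Thoughts Prompting. The `propose_thoughts` and `score_thought` functions are hypothetical placeholders; in a real implementation each would typically be its own LLM call (one to brainstorm candidate next steps, one to grade them), which is where the efficiency cost mentioned above comes from.

```python
import random

# A toy breadth-first tree-of-thoughts loop. The two helpers below are
# placeholders standing in for LLM calls, so this script runs on its own.

def propose_thoughts(partial_solution: str, k: int = 3) -> list[str]:
    """Placeholder: ask an LLM for k candidate next steps."""
    return [f"{partial_solution} -> idea {i}" for i in range(k)]

def score_thought(partial_solution: str) -> float:
    """Placeholder: ask an LLM (or a heuristic) how promising this path looks."""
    return random.random()

def tree_of_thoughts_bfs(problem: str, depth: int = 3, beam_width: int = 2) -> str:
    frontier = [problem]  # the current "level" of the tree
    for _ in range(depth):
        # Expand every surviving path, then keep only the best few (the beam).
        candidates = [t for path in frontier for t in propose_thoughts(path)]
        candidates.sort(key=score_thought, reverse=True)
        frontier = candidates[:beam_width]
    return frontier[0]  # the highest-ranked line of reasoning

if __name__ == "__main__":
    print(tree_of_thoughts_bfs("Plan a 3-course dinner for six people"))
```

Widening the beam or deepening the tree explores more ideas but multiplies the number of LLM calls, which is exactly the cost trade-off the article weighs.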

We also have a video on prompt engineering below. Note that the content of the video is essentially the same as the “Crafting Masterpieces with Prompt Engineering and LLMs” article above.

đŸ”„ Rapid-fire bonus content:

For some quick resources (and some long research papers), check out the links below! Though this list is not exhaustive, we do believe these are some of the best resources available online on the depths and nuances of prompt engineering (and beyond!).

đŸ§ȘDistillation through chain-of-thought prompting - It turns out you can distill an LLM through Chain-of-Thought prompting. This paper breaks it all down.

💰FrugalGPT - Prompting LLMs isn’t free. This paper analyzes which prompts to send to which LLMs so that users can get good answers while keeping their API spend down; a toy sketch of its cascade strategy appears just after this list.

đŸ€ŻDAN 8.0 reddit thread - The “mischievous” (read: evil-genius) prompt is thoroughly discussed in this Reddit thread. It’s 517 words long. Check it out!

đŸ–„ïž Chip Huyen explains prompt engineering - From challenges to costs to optimization techniques, Chip Huyen’s guide to prompt engineering is thorough and immediately applicable to anyone using large language models.

Side note: In our video on Llama-2, we showcase just how easy it is to get a new LLM (Llama-2-chat) to hallucinate and output factually incorrect responses.

Why is prompt engineering important?

Prompt engineering is crucial in the context of using language models like GPT-3.5, as it directly influences the quality and relevance of the generated output. The prompt serves as the input that guides the model's responses, and the choice of words, structure, and context in the prompt can significantly impact the generated content. Effective prompt engineering enables users to extract desired information, insights, or creative outputs from the model, making it a powerful tool for a wide range of applications.

Furthermore, prompt engineering is essential for adapting a model to specific tasks or domains without retraining it. By carefully crafting prompts, users can steer the language model to address particular queries or generate content in specialized fields. This adaptability enhances the model's versatility and utility across diverse industries, such as healthcare, finance, programming, and creative writing. In essence, prompt engineering is not just about formulating requests; it's a strategic approach to optimizing communication with the language model to achieve desired outcomes efficiently and accurately.

The image above reveals the difference between a carefully engineered prompt and a simple one. The image on the left was the product of a mere sentence: “Draw a red bird flying through the sky.” The image on the right was the product of a much stronger, thoroughly designed prompt that the video above walks through. For convenience, here’s a link to an image containing the various iterations of prompts we went through to get from the first bird to the final bird.
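
The same gap shows up with plain text prompts. Below is a small, hypothetical pair of prompts for one task (the task and wording are ours, chosen purely for illustration): a bare request, and a version that spells out role, audience, constraints, and output format.

```python
# Two prompts for the same task, to make the point above concrete.
# Send each to whichever model you use and compare the replies.

# A bare prompt: the model has to guess the audience, scope, and format.
simple_prompt = "Explain what an API is."

# An engineered prompt: role, audience, constraints, and output format are explicit.
engineered_prompt = """You are a patient technical mentor.
Explain what an API is to a junior developer who knows Python but has never
called a web service.

Requirements:
- Use one concrete analogy.
- Keep it under 150 words.
- End with a one-sentence takeaway."""

for prompt in (simple_prompt, engineered_prompt):
    print(prompt)
    print("\n---\n")
```

Sending both to the same model and comparing the replies is a quick way to see how much of the output quality lives in the prompt itself.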

📝 Final Notes

Effective prompt engineering requires a nuanced understanding of the underlying model architecture, as well as a keen awareness of the intricacies of human language. It involves iteratively refining prompts based on experimentation and feedback, with the goal of tailoring the model's responses to meet specific user needs. In this dynamic field, researchers and practitioners constantly explore innovative ways to leverage prompt engineering, pushing the boundaries of what is achievable with state-of-the-art language models. Whether optimizing for creativity, accuracy, or domain-specific tasks, the art of prompt engineering is pivotal in unlocking the full potential of natural language processing technologies, ushering in a new era of intelligent and context-aware applications.

Because the field of AI evolves rapidly, this article will be continuously updated with further content as new techniques develop. If you feel we’ve missed anything, or if you’ve learned any new, cutting-edge techniques, let us know; we’d love to hear from you!

And finally, if you’d like to write about AI for Deepgram, DM us on Twitter or LinkedIn to discuss further opportunities!

Note: All the images in this article were created with some very specially engineered prompts 😉

Note: If you like this content and would like to learn more, click here! If you want to see a completely comprehensive AI Glossary, click here.
