News

'Prompt Engineering' Jokes Abound When GitHub Asks Devs for Tips

"Prompt engineering" became the hot new AI thing (salaries north of $300K), then went through kind of a "trough of disillusionment" but seems to have recovered its "no this is a real thing" reputation -- kind of.

Both sides of that divide were represented when GitHub recently took to social media to ask developers for their prompt engineering tips. While many users chimed in with legitimate tips and guidance, the jokes abounded.

GitHub on Feb. 7 published a post asking, "What are your top tips for prompt engineering?" which has garnered some 98.9K views and 41 comments.

[Image: Let the Fun Begin (source: X).]

The Unserious
On the unserious side of things were these:

  • Tip #1: Don't call yourself an engineer around a real engineer
  • no one cause that s___ doesn't exist.
  • Top tip: stop calling it that
  • I don't prompt write. It's not engineering. I'll die on this hill.
  • Be kind. When they become sentient they may show me mercy.
  • Always say "please", improves your chances when the machines rise up.
  • "I'll give you 20 dollars as tip" works flawlessly
  • Promise the LLM some money for good results

The Serious
Despite the ribbing, true believers among the audience provided legit tips (a few of the recurring themes are rolled into a short code sketch after the list), including:

  • Start with, Dedicate this chat to .... Provide all the background info and then begin with question. This technique works well with me.
  • When it comes to prompt engineering, here's my scoop: Keep it simple, get specific with examples, stay organized in your response, highlight key points, listen carefully before you jump in, give it a final proofread, and most importantly, trust yourself!
  • Be clear & concise
    Provide context
    Use examples
    Stay consistent
    Incorporate feedback
    Experiment with variety
    Keep prompts ethical
    Include domain knowledge
    Evaluate & refine
  • Just ask chatGPT what u want and ask for some specific prompts for this work and try those.
  • Experiential learning works wonders in prompt engineering! It's like baking a cake - the more you do, the better you get
  • Just think like a technical writing describing something, that's it.
  • I just gave a Lunch&Learn the other day, so I had a list:
    1: Stop Googling for answers
    2: Ask Who (is the expert), What (is the problem), How (do I want the answer)
    3: Show me step-by-step
    4: Conversion is chatGPT's sweet spot
    5: Drop the error message into chatGPT
  • Have a super clear notion of the results you want in the end. And be as specific as possible in your prompt. Language structure is also super important.
  • Be clear and spesific, Iterate and refine, Use examples.
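
Boiled down, the recurring advice in those replies -- provide background first, give an example or two, then make a specific ask -- is easy to turn into a reusable template. Here is a minimal sketch in Python; the helper, task and wording are invented for illustration, not taken from the thread:

    # Hypothetical helper that assembles a prompt the way the replies above
    # suggest: background first, then an example or two, then a specific ask.
    def build_prompt(background: str, examples: list[str], question: str) -> str:
        """Assemble a structured prompt: context, then examples, then the ask."""
        parts = [f"Dedicate this chat to the following task.\n\nBackground:\n{background}"]
        if examples:
            parts.append("Examples of the output I want:")
            parts.extend(f"- {example}" for example in examples)
        parts.append(f"Question:\n{question}")
        return "\n\n".join(parts)

    print(build_prompt(
        background="We maintain a REST API for invoicing, written in Python with FastAPI.",
        examples=["A pytest function that checks a 404 is returned for a missing invoice."],
        question="Write tests for the endpoint that creates a new invoice.",
    ))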

GitHub's Experts
GitHub replied to its own post with a reminder of tips provided by two developer advocates in June 2023 for prompting its GitHub Copilot tool (a rough sketch of the first few in action follows the list), including:

  • Set the stage with a high-level goal. This is most helpful if you have a blank file or empty codebase. In other words, if GitHub Copilot has zero context of what you want to build or accomplish, setting the stage for the AI pair programmer can be really useful. It helps to prime GitHub Copilot with a big picture description of what you want it to generate -- before you jump in with the details.
  • Make your ask simple and specific. Aim to receive a short output from GitHub Copilot. Once you communicate your main goal to the AI pair programmer, articulate the logic and steps it needs to follow for achieving that goal. GitHub Copilot better understands your goal when you break things down.
  • Give GitHub Copilot an example or two. Learning from examples is not only useful for humans, but also for your AI pair programmer.
  • Experiment with your prompts. Just how conversation is more of an art than a science, so is prompt crafting. So, if you don't receive what you want on the first try, recraft your prompt by following the best practices above.
  • Keep a couple of relevant tabs open. We don't have an exact number of tabs that you should keep open to help GitHub Copilot contextualize your code, but from our experience, we've found that one or two is helpful.
  • Stay smart. The LLMs behind generative AI coding tools are designed to find and extrapolate patterns from their training data, apply those patterns to existing language, and then produce code that follows those patterns.
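
For developers who want to see what the first few of those tips look like in practice, here's a rough sketch of a "stage-setting" comment at the top of an otherwise empty Python file -- a big-picture goal, numbered steps and a sample input -- followed by the kind of code Copilot might be prompted to complete. The task, file format and names are made up for illustration:

    # Goal: a small script that reads a CSV of sales records and prints total
    # revenue per region, highest first.
    #
    # Steps:
    # 1. Parse the CSV with the standard-library csv module.
    # 2. Sum the "amount" column grouped by the "region" column.
    # 3. Print one line per region, e.g. "EMEA: 1204330.50".
    #
    # Example input rows:
    #   region,amount
    #   EMEA,1999.99

    import csv
    import sys
    from collections import defaultdict

    def revenue_by_region(path: str) -> dict[str, float]:
        """Sum the amount column per region (step 2 of the comment above)."""
        totals: dict[str, float] = defaultdict(float)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                totals[row["region"]] += float(row["amount"])
        return totals

    if __name__ == "__main__":
        for region, total in sorted(revenue_by_region(sys.argv[1]).items(),
                                    key=lambda item: item[1], reverse=True):
            print(f"{region}: {total:.2f}")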

Our Own VP of AI
Becky Nagel provided her own list of prompt engineering tips, which comes free with a subscription to Bytes of AI, a "weekly guide to ChatGPT and generative AI technologies."

[Image: Business Prompts (source: Becky Nagel).]

The vice president of AI at 1105 Media Inc., the parent company of Visual Studio Magazine, listed these (a short sketch after the list shows how a few of them map onto an API call):

  • Use GPT-4 Whenever Possible: You've heard of the saying, "The right tool for the right job." This is as true with AI as it is with anything else. ChatGPT and other LLMs are tools, and you want to pick the right version of ChatGPT for what you want to accomplish. In the vast majority of cases, that choice will be the newest version of ChatGPT, version 4. [Things move fast in AI, so update this with latest/greatest LLM]
  • Always Open a New Conversation Window: This is a short tip but it's absolutely crucial: For every prompt you enter in ChatGPT, each time you want new output from it, you must start a new session.
  • Give ChatGPT a Role: The first step most ChatGPT prompt experts recommend you do when you open that new chat window is to give ChatGPT a role to play.
  • Ask ChatGPT To Tell You About the Topic You Want To Focus On: Like many of the recommendations in this PDF, I have no scientific proof that this specific hint will improve your eventual prompt output.
  • Be Polite: Yes, this is a real tip! There's a surprising amount of debate among ChatGPT power users about how polite one should be with their ChatGPT queries. Some users swear that using "please" and "thank you" will result in at least somewhat higher-quality output, while others dispute that.
  • Include Length and Tone Instructions in Your Prompt: There are a million ways to write something, and ChatGPT isn't going to know offhand that, for example, you prefer a business-casual (versus formal) tone or a more concise (versus longer) output.
  • Provide an Example: Often with ChatGPT, it can be better to show it what you want versus instructing it.
  • Don't Use Negative Prompts: ChatGPT (like almost all creative generative AI technologies) tends to have issues processing negative instructions because it struggles with the logic behind them.
  • Experiment with Methods of Querying: Some people believe that the exact wording you use when you ask ChatGPT for output can make a difference in the output quality you receive.
  • Ask It To Expand on Sections (vs. Whole): Running ChatGPT is expensive; every output costs it money and longer output costs more. Even if you're paying $20 a month to use ChatGPT Plus, you may not be covering the processing costs of your queries.
  • Try Again Later: When I'm training people on using ChatGPT, I often encourage them to think of ChatGPT like a teenager: It can be moody and cantankerous and sometimes will completely shut down on you. When that happens -- when you're not getting the results you want and the system just isn't working for you -- the best thing to do is not to continue to fight it, but just to walk away for a bit and try again later.
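
Several of those tips -- give the model a role, spell out tone and length, show it an example -- carry over directly if you move from the ChatGPT window to the API. The sketch below uses the official openai Python package to make that mapping concrete; applying the tips via the API is this example's own assumption (the list above targets the chat interface), and the model name is just a placeholder for whatever the latest and greatest happens to be:

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder; swap in the newest model available
        messages=[
            # "Give ChatGPT a role"
            {"role": "system",
             "content": "You are a senior technical editor for a developer news site."},
            # "Include length and tone instructions" plus "provide an example"
            {"role": "user",
             "content": (
                 "Write a two-sentence, business-casual summary of the release notes below.\n"
                 "Example of the tone I want: 'The new build trims startup time and "
                 "squashes a long-standing crash on save.'\n\n"
                 "Release notes:\n- Startup time reduced 40%\n- Fixed crash when saving large files"
             )},
        ],
    )

    print(response.choices[0].message.content)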

There are still a lot of prompt engineering jobs out there (maybe not reaching $335K like they used to), so stay tuned for more tips, debates and maybe even jokes.

About the Author

David Ramel is an editor and writer at Converge 360.
